Quickstart
This section provides a quickstart example for creating an AI Agent with Llama Stack.
Prerequisites
- Python 3.12 or higher (if not satisfied, refer to FAQ: How to prepare Python 3.12 in Notebook)
- Llama Stack Server installed and running via Operator (see Install Llama Stack)
- Access to a Notebook environment (e.g., Jupyter Notebook, JupyterLab)
- Python environment with `llama-stack-client` and required dependencies installed
- API key for the LLM provider (e.g., DeepSeek API key)
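The client library can typically be installed with pip. A minimal sketch (the exact dependency list depends on your notebook image):

```shell
# Install the Llama Stack client library into the notebook's Python environment.
pip install llama-stack-client
```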
Quickstart Example
A simple example of creating an AI Agent with Llama Stack is available in the following resources:
- Notebook: Llama Stack Quick Start Demo
Download the notebook and upload it to a Notebook environment to run.
The notebook demonstrates:
- Connecting to Llama Stack Server and client setup
- Tool definition using the `@client_tool` decorator (weather query tool example)
- Client connection to Llama Stack Server
- Model selection and Agent creation with tools and instructions
- Agent execution with session management and streaming responses
- Result handling and display
- Optional FastAPI deployment example
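The tool-definition step above can be sketched in plain Python. Everything below is a hypothetical illustration (function name, data), not the notebook's actual code; in the notebook, such a function is wrapped with the `@client_tool` decorator so the Agent can invoke it:

```python
# A minimal sketch of a weather-query tool like the one in the notebook.
# The function name and weather data are hypothetical placeholders; the
# notebook wraps the real function with @client_tool from llama-stack-client.

def get_weather(city: str) -> str:
    """Return a short weather summary for the given city.

    The type hints and docstring matter: Llama Stack derives the tool's
    schema (name, parameters, description) from them.
    """
    # Static placeholder data standing in for a real weather API call.
    fake_weather = {"Beijing": "sunny, 25°C", "London": "rainy, 15°C"}
    return fake_weather.get(city, f"no data for {city}")
```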
FAQ
How to prepare Python 3.12 in Notebook
1. Download the pre-compiled Python installation package.
2. Extract the installation package.
3. Install and register the kernel.
4. Switch kernel in the notebook page:
- Open your Notebook environment (e.g., Jupyter Notebook or JupyterLab) in the browser, then open an existing notebook or create a new one.
- In the notebook interface, find the current kernel name (usually shown in the top-right corner of the page, e.g., "Python 3" or "python3").
- Click that kernel name, or use the menu Kernel → Change Kernel.
- In the kernel list, select "Python 3.12" (the display name registered in step 3).
- After switching, new cells will run with Python 3.12.
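The download, extract, and register steps above might look like the following. The archive name and install prefix are hypothetical placeholders; substitute the ones for your environment:

```shell
# Hypothetical archive name and install prefix -- substitute your own.
mkdir -p /opt/python312
tar -xzf python-3.12.tar.gz -C /opt/python312 --strip-components=1

# Install ipykernel with the new interpreter and register it with Jupyter.
# The display name here must match what you select in the kernel list.
/opt/python312/bin/python3.12 -m pip install ipykernel
/opt/python312/bin/python3.12 -m ipykernel install --user \
    --name python312 --display-name "Python 3.12"
```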
Note: When running `python` and `pip` commands directly in notebook cells, the default system Python is still used. To run the Python 3.12 version of these commands, invoke the interpreter by its full path.
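For example, assuming Python 3.12 was installed under `/opt/python312` (a hypothetical path), a notebook cell would call it by full path:

```shell
# Use the full path so the 3.12 interpreter runs instead of the default python.
/opt/python312/bin/python3.12 -m pip install llama-stack-client
/opt/python312/bin/python3.12 --version
```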
Additional Resources
For more resources on developing AI Agents with Llama Stack, see:
- Llama Stack Documentation - The official Llama Stack documentation covering all usage-related topics, API providers, and core concepts.
- Llama Stack Core Concepts - Deep dive into Llama Stack architecture, API stability, and resource management.
- Llama Stack GitHub Repository - Source code, example applications, distribution configurations, and how to add new API providers.
- Llama Stack Example Apps - Official examples demonstrating how to use Llama Stack in various scenarios.