Unlocking the Future of AI: Deploying Deep Agents with Amazon Bedrock AgentCore Runtime
Artificial Intelligence (AI) is evolving rapidly, transitioning from basic single-task helpers into sophisticated systems capable of planning, critiquing, and collaborating with other agents to tackle complex problems. Enter Deep Agents, a groundbreaking framework built on LangGraph, which enables multi-agent workflows that emulate real-world team dynamics. However, deploying these advanced systems in a reliable and secure manner poses significant challenges. This is where Amazon Bedrock AgentCore Runtime comes into play, providing a secure, serverless environment specifically designed for AI agents and tools.
In this blog post, we’ll explore how to deploy Deep Agents on AgentCore Runtime, showcasing the simplicity and effectiveness of this powerful combination.
What is Amazon Bedrock AgentCore?
Amazon Bedrock AgentCore is a versatile and model-agnostic framework that allows developers to deploy and operate advanced AI agents securely and at scale. Whether working with Strands Agents, CrewAI, LangGraph, LlamaIndex, or other frameworks, AgentCore offers an optimized infrastructure for diverse model deployments. Its modular services are specifically engineered for dynamic agent workloads, offering tools to extend agent capabilities while simplifying the production process.
By removing the burden of infrastructure management, AgentCore enables developers to focus on crafting intelligent solutions rather than the complexities of deployment.
Core Capabilities of AgentCore Runtime
AgentCore Runtime provides a secure, serverless hosting environment perfect for agentic workloads. Some of its key features include:
- Session Isolation: Each user session runs in dedicated microVMs, ensuring security and preventing cross-session contamination.
- Extended Execution Times: It supports up to 8 hours of processing time for complex reasoning tasks.
- Consumption-Based Pricing: You are charged only during actual processing, not while the agent waits for responses from LLMs or tools.
- Compatibility: Works seamlessly with various frameworks and foundation model providers, while integrating built-in corporate authentication and observability.
With these capabilities, AgentCore Runtime transforms local agent prototypes into production-ready systems.
Real-World Example: Integrating Deep Agents
In this post, we’ll focus on deploying a Deep Agents implementation on AgentCore Runtime. This integration consists of:
- A Research Agent: Conducts deep internet searches using the Tavily API.
- A Critique Agent: Provides feedback on generated reports.
- A Main Orchestrator: Manages workflow and file operations.
With LangGraph’s state management, Deep Agents exhibit:
- Built-in task planning through a write_todos tool.
- A virtual file system for context maintenance.
- A sub-agent architecture for task specialization.
- High recursion limits for complex workflows.
This multi-agent environment enables Deep Agents to perform intricate research tasks through a streamlined collaborative process.
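As an illustration, the sub-agent layout above can be sketched with the deepagents package. This is a minimal sketch, not the post's exact code: the create_deep_agent signature and the name/description/prompt sub-agent format are assumptions based on the library's examples, and the search tool is a placeholder you would implement with the Tavily API.

```python
# Sub-agent configurations in the format deepagents expects
# (assumed format: dicts with name, description, and prompt keys)
research_subagent = {
    "name": "research-agent",
    "description": "Conducts deep internet searches using the Tavily API",
    "prompt": "You are a thorough researcher. Search, read, and summarize sources.",
}
critique_subagent = {
    "name": "critique-agent",
    "description": "Provides feedback on generated reports",
    "prompt": "You are a strict editor. Point out gaps, errors, and weak arguments.",
}
SUBAGENTS = [research_subagent, critique_subagent]


def build_deep_agent(internet_search_tool):
    """Assemble the main orchestrator with its research and critique sub-agents."""
    # Deferred import: deepagents is the Deep Agents package built on LangGraph
    from deepagents import create_deep_agent

    return create_deep_agent(
        tools=[internet_search_tool],  # e.g. a function wrapping Tavily search
        instructions="Write a well-researched report on the user's topic.",
        subagents=SUBAGENTS,
    )
```

The returned object is a LangGraph graph, which is what the AgentCore entrypoint shown below streams from.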
Here’s a quick code snippet showing how an agent is wrapped for AgentCore Runtime:
# Import the AgentCore runtime
from bedrock_agentcore.runtime import BedrockAgentCoreApp
from langchain_core.messages import HumanMessage

app = BedrockAgentCoreApp()

# Decorate your agent function; "agent" is the Deep Agents graph built elsewhere
@app.entrypoint
async def langgraph_bedrock(payload):
    user_input = payload.get("prompt")
    stream = agent.astream(
        {"messages": [HumanMessage(content=user_input)]},
        stream_mode="values"
    )
    async for chunk in stream:
        yield chunk

# Run the application
if __name__ == "__main__":
    app.run()
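For a quick local smoke test before deploying, you can run the decorated app directly and send it a request over HTTP. A hedged sketch: the port 8080 and /invocations path below are assumptions about the AgentCore runtime contract, and the filename hello.py is the one used later in this post.

```shell
# Start the agent locally (assumes the file above is saved as hello.py)
python hello.py &

# Send a test prompt to the local invocation endpoint
curl -s -X POST http://localhost:8080/invocations \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Summarize the latest quantum computing news"}'
```

If the agent streams correctly here, the same entrypoint will work unchanged once deployed.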
This integration pattern is largely framework-agnostic, allowing flexibility in how agents are constructed and deployed.
Deploying to AgentCore Runtime: Step-by-Step Guide
Prerequisites
Before you begin, ensure you have:
- Python 3.10+ installed.
- AWS credentials configured.
- Amazon Bedrock AgentCore SDK set up.
Step 1: IAM Permissions
Two IAM roles are essential for deploying an agent in AgentCore Runtime: one for the developer creating resources and a second execution role for the agent. The latter can be automated via the AgentCore Starter Toolkit.
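For reference, the execution role's trust policy is what lets the AgentCore service assume the role on your agent's behalf. This is a minimal sketch; the bedrock-agentcore.amazonaws.com service principal is an assumption to verify against the AgentCore documentation for your region.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "bedrock-agentcore.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

The Starter Toolkit can generate this role automatically, so you only need the policy above if you manage IAM resources yourself.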
Step 2: Wrap Your Agent
Include the AgentCore imports and decorator in your existing agent code.
Step 3: Deploy Using the Starter Toolkit
The starter toolkit handles configuration, launch, and invocation in a few calls:
from bedrock_agentcore_starter_toolkit import Runtime

# Step 1: Configure
agentcore_runtime = Runtime()
config_response = agentcore_runtime.configure(
    entrypoint="hello.py",
    execution_role=role_arn,
    auto_create_ecr=True,
    requirements_file="requirements.txt",
    region="us-west-2",
    agent_name="deepagents-research"
)

# Step 2: Launch
launch_result = agentcore_runtime.launch()
print(f"Agent deployed! ARN: {launch_result.agent_arn}")

# Step 3: Invoke
response = agentcore_runtime.invoke({
    "prompt": "Research the latest developments in quantum computing"
})
Step 4: Behind the Scenes
During deployment, the toolkit handles:
- Docker file generation and container creation.
- Amazon Elastic Container Registry (ECR) setup.
- Monitoring and observability integration with AWS services.
Invoking Your Deployed Agent
After deployment, you can invoke your agent either through the starter toolkit or directly with the boto3 SDK.
Example Invocation Code
Using the starter toolkit:
response = agentcore_runtime.invoke({
    "prompt": "Research the latest developments in quantum computing"
})
Or, directly through boto3:
import boto3
import json

agentcore_client = boto3.client('bedrock-agentcore', region_name="us-west-2")

response = agentcore_client.invoke_agent_runtime(
    agentRuntimeArn=agent_arn,
    qualifier="DEFAULT",
    payload=json.dumps({
        "prompt": "Analyze the impact of AI on healthcare in 2024"
    })
)
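The runtime's reply arrives as a streamed body rather than a plain string. A minimal sketch of decoding it, assuming the payload is exposed as a file-like object under a "response" key (simulated here with io.BytesIO so the helper can be exercised without an AWS call):

```python
import io
import json


def decode_runtime_response(response):
    """Read and JSON-decode the streamed body returned by invoke_agent_runtime.

    Assumption: the body is a file-like object under the "response" key.
    """
    raw = response["response"].read()
    if isinstance(raw, bytes):
        raw = raw.decode("utf-8")
    return json.loads(raw)


# Simulated response, standing in for the boto3 call above
simulated = {"response": io.BytesIO(b'{"result": "Healthcare AI analysis..."}')}
print(decode_runtime_response(simulated)["result"])
```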
Deep Agents in Action
When invoked, the main orchestrator lays out a plan, delegates web research to the research sub-agent, and passes the draft report to the critique sub-agent for feedback, iterating through these layered research and critique passes until the final report is refined.
Clean-Up After Deployment
To avoid ongoing charges, delete the provisioned AgentCore Runtime and its ECR repository once you are done:
region = "us-west-2"
agentcore_control_client = boto3.client('bedrock-agentcore-control', region_name=region)
ecr_client = boto3.client('ecr', region_name=region)

# Delete the agent runtime
runtime_delete_response = agentcore_control_client.delete_agent_runtime(
    agentRuntimeId=launch_result.agent_id
)

# Delete the ECR repository created during deployment
response = ecr_client.delete_repository(
    repositoryName=launch_result.ecr_uri.split('/')[1],
    force=True
)
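The repositoryName argument above is carved out of the ECR URI, which has the shape "registry-host/repository-name". A small sketch of that split, using a made-up account ID:

```python
def repo_name_from_ecr_uri(ecr_uri: str) -> str:
    """Extract the repository name from an ECR repository URI."""
    # "<account>.dkr.ecr.<region>.amazonaws.com/<repo>" -> "<repo>"
    return ecr_uri.split("/")[1]


# Example with a hypothetical account ID
uri = "123456789012.dkr.ecr.us-west-2.amazonaws.com/deepagents-research"
print(repo_name_from_ecr_uri(uri))  # deepagents-research
```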
Conclusion
Amazon Bedrock AgentCore is revolutionizing the deployment of AI agents. It abstracts away infrastructure complexities while maintaining flexibility for various frameworks and models, allowing developers to concentrate on building sophisticated solutions. Our Deep Agents deployment underscores how effortlessly complex agent systems with external API integrations can be launched with minimal code changes.
With enterprise-grade security, integrated observability, and serverless scalability, AgentCore stands as the optimal choice for production-ready AI agent deployments.
Are you ready to bring your agents to production? Here’s how to get started:
- Install the AgentCore starter kit:
pip install bedrock-agentcore-starter-toolkit
- Follow our step-by-step guide to deploy your code.
The era of production-ready AI agents is here. With AgentCore, the transition from prototype to production has never been smoother.
About the Authors
Vadim Omeltchenko is a Senior AI/ML Solutions Architect, passionate about cloud innovation.
Eashan Kaushik is a Specialist Solutions Architect at AWS, with a keen focus on generative AI solutions.
Shreyas Subramanian is a Principal Data Scientist, dedicated to solving business challenges with machine learning.
Mark Roy is a Principal Machine Learning Architect at AWS, specializing in generative AI solutions across numerous industries.