Orchestration models: From rule-based to AI-native
In event-driven serverless AI systems, orchestration is the connective logic that determines how events trigger and shape the behavior of the system. In AWS, orchestration can follow two primary models:
- Rule-based orchestration is defined by developers using workflows and state machines.
- AI-native orchestration is powered by agents and large language models (LLMs) that reason, plan, and act based on intent and context.
Each model plays a distinct role in building flexible, reactive, and intelligent systems. Together, they enable developers to transition from procedural automation to autonomous, goal-driven systems.
Rule-based orchestration with AWS Step Functions
Step Functions provides a visual workflow engine to orchestrate services like AWS Lambda, Amazon SageMaker, Amazon Bedrock, Amazon DynamoDB, and Amazon Simple Storage Service (Amazon S3). The logic is deterministic in that steps are explicitly defined, and transitions are condition-based.
Key benefits of rule-based orchestration with Step Functions include the following:
- Strong auditability and visibility through a visual workflow console
- Built-in error handling, retries, and parallelism
- Ideal for linear or branched control flows with well-defined paths
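The built-in error handling and retries are declared directly in the Amazon States Language (ASL) definition of the state machine. The following minimal sketch, expressed as a Python dictionary, shows a Task state that retries a hypothetical Lambda function with exponential backoff and falls back to a failure state; the state names and function ARN are illustrative.

```python
import json

# Minimal ASL sketch of a Task state with built-in retries and a fallback
# path. The state names and the Lambda function ARN are illustrative,
# not from a real deployment.
definition = {
    "StartAt": "ProcessDocument",
    "States": {
        "ProcessDocument": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:ProcessDocument",
            "Retry": [
                {
                    "ErrorEquals": ["States.TaskFailed"],
                    "IntervalSeconds": 2,
                    "MaxAttempts": 3,
                    "BackoffRate": 2.0,
                }
            ],
            "Catch": [
                {"ErrorEquals": ["States.ALL"], "Next": "HandleFailure"}
            ],
            "Next": "Done",
        },
        "HandleFailure": {"Type": "Fail", "Error": "DocumentProcessingFailed"},
        "Done": {"Type": "Succeed"},
    },
}

print(json.dumps(definition, indent=2))
```

Because retries and fallbacks live in the definition itself, every execution of this workflow handles failures identically, which is what makes the model deterministic and auditable.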
The following diagram shows the workflow of an example use case of document ingestion and processing.

In this example, a legal firm automates the analysis of uploaded contracts in the following steps:
1. Event trigger – Legal documents are uploaded to an Amazon S3 bucket. The upload emits an Amazon EventBridge event, which routes to a Step Functions workflow.
2. Workflow – Step Functions performs the following steps:
   - Document processing – A Lambda function cleans the document and performs initial optical character recognition (OCR).
   - Text extraction – Amazon Textract extracts key text and data from the document.
   - Analysis – Amazon Comprehend analyzes the text to classify risk levels and sentiment.
   - Summarization – Amazon Bedrock generates a concise summary of the contract.
   - Data storage – Results are written to Amazon OpenSearch Service for indexing.
3. Retrieval – The legal team can search, filter, and visualize contract analysis through dashboards.
This architecture uses the AWS SDK integration capabilities of Step Functions to interact directly with each AWS service in the workflow, which reduces complexity and eliminates the need for separate Lambda functions between processing steps. The final write to OpenSearch Service is also handled through an SDK integration, so Step Functions indexes the document analysis results, risk classifications, sentiment analysis, and AI-generated summaries directly into OpenSearch Service. The legal team can then search, filter, and visualize contract analysis through dashboards.
Each task is a defined state with built-in error handling. The AI services perform analysis, but they make no orchestration decisions; the control flow is explicit.
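In ASL terms, these direct SDK integrations appear as `arn:aws:states:::aws-sdk:...` (or service-optimized) task resources instead of Lambda ARNs. The following condensed sketch shows two of the workflow's steps as a Python dictionary; the parameters, request body, and model ID are abbreviated and illustrative.

```python
import json

# Condensed ASL sketch of two steps from the document workflow that call
# AWS services directly through Step Functions integrations, with no
# Lambda function in between. Parameters are abbreviated and illustrative.
definition = {
    "StartAt": "Analysis",
    "States": {
        "Analysis": {
            "Type": "Task",
            "Resource": "arn:aws:states:::aws-sdk:comprehend:detectSentiment",
            "Parameters": {"Text.$": "$.extractedText", "LanguageCode": "en"},
            "ResultPath": "$.sentiment",
            "Next": "Summarization",
        },
        "Summarization": {
            "Type": "Task",
            "Resource": "arn:aws:states:::bedrock:invokeModel",
            "Parameters": {
                "ModelId": "anthropic.claude-3-haiku-20240307-v1:0",
                "Body": {"prompt.$": "$.extractedText"},  # illustrative body shape
            },
            "End": True,
        },
    },
}

print(json.dumps(definition, indent=2))
```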
AI-native orchestration with Amazon Bedrock Agents
Where Step Functions manages how things happen, Amazon Bedrock agents decide what should happen based on user goals. An Amazon Bedrock agent combines the following:
- An LLM, such as Anthropic Claude or Amazon Nova
- A set of tool integrations, such as Lambda functions
- Optional knowledge bases for contextual grounding
- Built-in memory and goal tracking
Agents interpret natural language input, reason about it, and autonomously invoke tools to fulfill the user's intent, offloading orchestration logic to the model.
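At the API level, offloading orchestration to the agent amounts to a single runtime call. The following sketch invokes an agent through the `bedrock-agent-runtime` client and joins the streamed response into a string; the agent IDs are placeholders, and the chunk-handling helper assumes the streaming response shape of `invoke_agent`.

```python
def collect_completion(event_stream):
    """Join the streamed completion chunks from invoke_agent into one string."""
    parts = []
    for event in event_stream:
        chunk = event.get("chunk")
        if chunk and "bytes" in chunk:
            parts.append(chunk["bytes"].decode("utf-8"))
    return "".join(parts)


def ask_agent(agent_id, agent_alias_id, session_id, text):
    """Send one user utterance to an Amazon Bedrock agent and return its reply."""
    import boto3  # deferred so the pure helper above has no AWS dependency

    client = boto3.client("bedrock-agent-runtime")
    response = client.invoke_agent(
        agentId=agent_id,          # placeholder values; use your agent's IDs
        agentAliasId=agent_alias_id,
        sessionId=session_id,      # reuse the session ID to keep conversation memory
        inputText=text,
    )
    return collect_completion(response["completion"])
```

For example, `ask_agent("AGENT_ID", "ALIAS_ID", "session-1", "I need to return my shoes")` would return the agent's natural language reply; which tools the agent invoked along the way is decided by the model, not by this code.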
Key benefits of AI-native orchestration with Amazon Bedrock Agents include the following:
- Semantic flexibility – Interpret varied natural language inputs.
- Tool autonomy – Select the right tools at runtime.
- Contextual grounding – Cite knowledge base content accurately.
- Minimal developer maintenance – Define the tools, not the flow.
The following diagram shows the workflow of an example use case of customer support automation.

In this example, a user on a retail website types a message in the support chatbot. The following workflow occurs:
1. Event trigger:
   - The user sends a message: "I need to return the shoes I ordered last week. Can you help?"
   - The message is received and routed through EventBridge.
   - EventBridge triggers the Amazon Bedrock agent.
2. Agent reasoning:
   - Intent extraction – The agent identifies the intent as "return order".
   - Data retrieval – The agent queries the CRM system by using the `GetOrderHistory` Lambda function.
   - Eligibility check – The agent calls the `ProcessReturn` Lambda function to verify return eligibility.
   - Response generation – The agent formulates an appropriate response.
3. Customer communication – The agent responds: "Your return is being processed. Expect a confirmation email shortly."
The entire workflow demonstrates how Amazon Bedrock Agents orchestrates complex business logic through defined action groups. By connecting customer intent with backend systems and processes, the agent delivers an automated yet contextually appropriate customer service experience.
The orchestration is not hardcoded. The LLM determines the workflow dynamically, making the system more resilient to variation and ambiguity in inputs.
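Each action group in a flow like this is backed by a Lambda function that receives the agent's structured request and returns a structured result. The following is a minimal sketch of a handler for a hypothetical `GetOrderHistory` function, assuming the function-details event schema for action groups; the order data is hard-coded for illustration.

```python
import json

# Illustrative stand-in for a CRM lookup; a real handler would query a
# database or downstream service instead.
FAKE_ORDERS = {
    "customer-123": [
        {"orderId": "A1001", "item": "running shoes", "status": "delivered"}
    ]
}


def lambda_handler(event, context):
    """Handle an Amazon Bedrock agent action group invocation.

    Assumes the function-details schema, where the event carries the
    action group name, the function name, and a list of parameters.
    """
    params = {p["name"]: p["value"] for p in event.get("parameters", [])}
    orders = FAKE_ORDERS.get(params.get("customerId"), [])
    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": event["actionGroup"],
            "function": event["function"],
            "functionResponse": {
                "responseBody": {"TEXT": {"body": json.dumps(orders)}}
            },
        },
    }
```

The agent decides when to call this function and how to weave its result into the conversation; the handler only answers the question it was asked.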
Rule-based or AI-native: When to use which?
AWS Step Functions and Amazon Bedrock Agents each excel in different orchestration scenarios. As a best practice, use Step Functions for controlled processes and Amazon Bedrock Agents for natural language interaction and flexible goal fulfillment. The following table compares these services across various use case types.
| Use case type | Step Functions (Rule-based) | Amazon Bedrock Agents (AI-native) |
| --- | --- | --- |
| Deterministic workflow | Ideal | Not needed |
| Unstructured user input | Rigid | Interprets and adapts |
| Complex business rules | Modeled by using conditions | Can be inferred by using semantic reasoning |
| Fine-grained audit trail | Full state trace | Limited trace, depending on agent logs; observability tools such as Weights & Biases and model invocation logging can mitigate this limitation |
| Latency-sensitive automation | Real-time coordination | Near real time, with slightly higher latency because of LLM processing |
| Goal-directed user experiences | Requires explicit design | The agent can infer the goal and compose the flow |
Event-driven orchestration
Whether you use rule-based or AI-native orchestration, events are the mechanism that activates intelligence in a serverless system. In both orchestration models, the following sequence occurs:
1. An event is emitted through EventBridge. Examples of events include user inputs, document uploads, and transactions.
2. The event triggers the appropriate orchestrator:
   - Step Functions if the logic is deterministic
   - Amazon Bedrock Agents if the logic is dynamic or conversational
3. The orchestrator coordinates AI services and emits further events, such as completion, error, and downstream triggers.
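This sequence can be sketched with plain data structures. The following example builds an EventBridge-style event entry and a rule pattern, then checks routing with a tiny matcher that implements only exact-value matching, a small subset of real EventBridge pattern semantics; the bus name, source, and detail type are all illustrative.

```python
import json

# An EventBridge-style event entry, shaped like an item passed to PutEvents.
# The bus name, source, and detail type are illustrative.
entry = {
    "EventBusName": "ai-orchestration-bus",
    "Source": "app.documents",
    "DetailType": "DocumentUploaded",
    "Detail": json.dumps({"bucket": "legal-contracts", "key": "contract-42.pdf"}),
}

# A rule pattern that would route document uploads to a Step Functions target.
rule_pattern = {"source": ["app.documents"], "detail-type": ["DocumentUploaded"]}


def matches(entry, pattern):
    """Tiny subset of EventBridge matching: exact values for source and detail-type."""
    return (
        entry["Source"] in pattern.get("source", [entry["Source"]])
        and entry["DetailType"] in pattern.get("detail-type", [entry["DetailType"]])
    )


print(matches(entry, rule_pattern))  # whether this rule would route the event
```

In a real system the matching is done by EventBridge itself; the point of the sketch is that orchestrators subscribe to patterns, not to each other, which is what keeps the components decoupled.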
This reactive model ensures scalability, resilience, and modular design, allowing parts of the system to evolve independently.
Strategic perspective
Event-driven architecture (EDA) supports both rule-based and AI-native orchestration models, and it enables the two models to coexist. Step Functions provides reliable, repeatable automation, and Amazon Bedrock Agents introduces dynamic, context-aware intelligence.
Together, they provide organizations with the ability to do the following:
- Automate repetitive, high-volume processes
- Offer intelligent, adaptive user-facing assistants
- Scale AI without bottlenecks or architectural rigidity
Orchestration is no longer just about rules; it's about intent interpretation, tool selection, and autonomous execution. Serverless on AWS combines AWS Step Functions for structured workflows with Amazon Bedrock Agents for semantic orchestration. This unified framework enables you to build the next generation of agentic, serverless AI systems.