Multi-Tool Orchestration
Complex workflows often require coordinating multiple tools. This lesson covers patterns for orchestrating tools effectively.
Orchestration Patterns
Sequential Orchestration
Tools run one after another, each building on previous results:
async function sequentialOrchestration(userQuery) {
  // Tool 1: Search for relevant customer
  const customer = await tools.searchCustomer({ query: userQuery });

  // Tool 2: Get customer's order history (needs customer ID)
  const orders = await tools.getOrders({ customerId: customer.id });

  // Tool 3: Analyze recent orders (needs order data)
  const analysis = await tools.analyzeOrders({ orders: orders.recent });

  return { customer, orders, analysis };
}
Parallel Orchestration
Independent tools run simultaneously:
async function parallelOrchestration(customerId) {
  // All of these can run in parallel - no dependencies between them
  const [profile, orders, tickets, preferences] = await Promise.all([
    tools.getCustomerProfile({ id: customerId }),
    tools.getOrderHistory({ customerId }),
    tools.getSupportTickets({ customerId }),
    tools.getPreferences({ customerId })
  ]);

  return { profile, orders, tickets, preferences };
}
Hybrid Orchestration
Mix sequential and parallel based on dependencies:
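A minimal sketch, reusing the hypothetical customer-service tools from the examples above: one sequential lookup, a parallel fan-out, then a final sequential analysis step:

async function hybridOrchestration(userQuery) {
  // Sequential: everything else depends on the customer ID
  const customer = await tools.searchCustomer({ query: userQuery });

  // Parallel: these calls only need the customer ID and are independent of each other
  const [orders, tickets, preferences] = await Promise.all([
    tools.getOrders({ customerId: customer.id }),
    tools.getSupportTickets({ customerId: customer.id }),
    tools.getPreferences({ customerId: customer.id })
  ]);

  // Sequential: analysis depends on the data gathered above
  const analysis = await tools.analyzeOrders({ orders: orders.recent });

  return { customer, orders, tickets, preferences, analysis };
}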
Dynamic Tool Selection
LLM-Driven Selection
Let the LLM decide which tools to use:
async function dynamicToolSelection(userQuery, availableTools) {
  const selectionPrompt = `
Given the user query, select which tools are needed.

QUERY: ${userQuery}

AVAILABLE TOOLS:
${availableTools.map(t => `- ${t.name}: ${t.description}`).join('\n')}

Return a JSON array of tool names needed, in order of execution.
For parallel tools, nest them in an array.
Example: ["tool1", ["tool2", "tool3"], "tool4"]
(tool1 first, then tool2 and tool3 in parallel, then tool4)
`;

  const selection = await llm.chat({ content: selectionPrompt });

  // In practice, guard this parse: models sometimes wrap JSON in extra text
  return JSON.parse(selection);
}
Rule-Based Selection
Pre-defined rules determine tool usage:
const toolSelectionRules = {
  // If query mentions customer, always get profile
  customerMention: {
    condition: (query) => /customer|account|user/i.test(query),
    tools: ['getCustomerProfile']
  },
  // If asking about orders, get order history
  orderQuery: {
    condition: (query) => /order|purchase|bought/i.test(query),
    tools: ['getOrderHistory', 'getOrderDetails']
  },
  // If complaint detected, get support context
  complaint: {
    condition: (query, context) => context.sentiment < -0.5,
    tools: ['getSupportHistory', 'getEscalationPath']
  }
};

function selectToolsByRules(query, context) {
  const selectedTools = new Set();
  for (const rule of Object.values(toolSelectionRules)) {
    if (rule.condition(query, context)) {
      rule.tools.forEach(t => selectedTools.add(t));
    }
  }
  return Array.from(selectedTools);
}
Tool Execution Strategies
Batching Tool Calls
Group similar tool calls for efficiency:
// Split an array into fixed-size chunks (small helper assumed by this example)
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

async function batchedToolExecution(items, toolName) {
  // Rather than firing all N calls at once (or awaiting them one at a time),
  // limit concurrency by processing items in batches
  const batchSize = 10;
  const batches = chunk(items, batchSize);
  const results = [];

  for (const batch of batches) {
    const batchResults = await Promise.all(
      batch.map(item => tools[toolName](item))
    );
    results.push(...batchResults);
  }

  return results;
}
Caching Tool Results
Avoid redundant tool calls:
class ToolCache {
  constructor(ttl = 60000) { // 1 minute default
    this.cache = new Map();
    this.ttl = ttl;
  }

  getCacheKey(toolName, args) {
    return `${toolName}:${JSON.stringify(args)}`;
  }

  async executeWithCache(tool, args) {
    const key = this.getCacheKey(tool.name, args);

    // Check cache
    const cached = this.cache.get(key);
    if (cached && Date.now() - cached.timestamp < this.ttl) {
      return cached.result;
    }

    // Execute and cache
    const result = await tool.execute(args);
    this.cache.set(key, { result, timestamp: Date.now() });
    return result;
  }
}
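For example, assuming each tool is exposed as an object with a name and an execute(args) method (which is what executeWithCache expects):

const cache = new ToolCache(30000); // 30-second TTL

// Identical calls within the TTL return the cached result instead of re-executing
const profile = await cache.executeWithCache(
  { name: 'getCustomerProfile', execute: (args) => tools.getCustomerProfile(args) },
  { id: customerId }
);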
Error Handling in Orchestration
Partial Success Handling
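One common approach is Promise.allSettled, which never rejects as a whole: collect whatever succeeded, record what failed, and let the caller decide whether the partial result is usable. A sketch using the hypothetical customer-service tools from earlier:

async function gatherWithPartialSuccess(customerId) {
  const calls = {
    profile: tools.getCustomerProfile({ id: customerId }),
    orders: tools.getOrderHistory({ customerId }),
    tickets: tools.getSupportTickets({ customerId })
  };

  // Each entry reports 'fulfilled' with a value or 'rejected' with a reason
  const settled = await Promise.allSettled(Object.values(calls));
  const names = Object.keys(calls);

  const results = {};
  const failures = [];
  settled.forEach((outcome, index) => {
    if (outcome.status === 'fulfilled') {
      results[names[index]] = outcome.value;
    } else {
      failures.push({ tool: names[index], reason: outcome.reason });
    }
  });

  return { results, failures };
}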
Fallback Chains
async function orchestrateWithFallbacks(query) {
  try {
    // Try primary tools
    return await primaryOrchestration(query);
  } catch (error) {
    if (error.toolName === 'primaryDatabase') {
      // Fall back to secondary data source
      return await fallbackOrchestration(query);
    }
    if (error.code === 'RATE_LIMITED') {
      // Use cached/stale data
      return await cachedOrchestration(query);
    }
    throw error;
  }
}
Orchestration Patterns for Common Workflows
Research Pattern
Query → Search → [Multiple Sources] → Aggregate → Verify → Synthesize
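A rough sketch of this flow, assuming hypothetical searchWeb, searchDocs, verifyFacts, and synthesize tools:

async function researchOrchestration(query) {
  // Search multiple sources in parallel
  const [webResults, docResults] = await Promise.all([
    tools.searchWeb({ query }),
    tools.searchDocs({ query })
  ]);

  // Aggregate, then verify and synthesize sequentially
  const aggregated = [...webResults, ...docResults];
  const verified = await tools.verifyFacts({ findings: aggregated });
  return tools.synthesize({ findings: verified });
}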
Customer Service Pattern
Input → [Profile + History + Context] → Analyze → Route → Respond
Data Pipeline Pattern
Source → Extract → Transform → Validate → Load → Report
Exercise: Design a Multi-Tool Orchestration
Design an orchestration for a complex scenario of your own choosing: list the tools involved, decide which calls must run sequentially and which can run in parallel, and describe how you would handle partial failures and fallbacks.
Key Takeaways
- Use sequential orchestration when tools have dependencies
- Use parallel orchestration for independent tools
- Hybrid approaches optimize for both correctness and speed
- Let LLMs dynamically select tools for flexible workflows
- Batch similar tool calls to reduce overhead
- Cache results to avoid redundant calls
- Handle partial success gracefully
- Design fallback strategies for critical tools
In the next module, we'll build complete research workflows.

