
Defining & Executing Tools

🔧 Tool Use & Function Calling — 10 min · 70 BASE XP

Passing Tools to Models

To let an LLM use a tool, you define the tool's signature using a JSON Schema. The LLM never executes the code itself; it returns a structured request asking you to run the tool with specific arguments.

{
  "name": "get_weather",
  "description": "Get the current weather in a given location.",
  "input_schema": {
    "type": "object",
    "properties": {
      "location": { "type": "string", "description": "City name" }
    },
    "required": ["location"]
  }
}
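In code, the schema the model sees usually lives alongside the function your orchestrator actually runs. A minimal Python sketch of that pairing (the hard-coded weather values are purely illustrative):

```python
# The schema the model sees: a plain dict mirroring the JSON above.
GET_WEATHER_TOOL = {
    "name": "get_weather",
    "description": "Get the current weather in a given location.",
    "input_schema": {
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "City name"},
        },
        "required": ["location"],
    },
}

def get_weather(location: str) -> str:
    """The real implementation -- the model never calls this directly.

    A real tool would query a weather API; this one returns canned data.
    """
    fake_db = {"Tokyo": "22°C, clear", "London": "14°C, rain"}
    return fake_db.get(location, "unknown location")
```

Keeping the schema and the implementation side by side makes it harder for the two to drift apart as the tool evolves.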

The Handshake

  1. Send prompt + tools array.
  2. Model responds with tool_use intent (e.g., location="Tokyo").
  3. Your code executes get_weather("Tokyo").
  4. You send the result back as a tool_result message.
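The four steps above can be sketched end to end. The model call is mocked here (a real integration would use your provider's SDK), and the message shapes follow the Anthropic-style `tool_use` / `tool_result` convention; the `toolu_01` id and the canned response are illustrative:

```python
import json

def get_weather(location: str) -> str:
    # Stand-in implementation; a real tool would call a weather API.
    return json.dumps({"location": location, "temp_c": 22})

# Dispatch table mapping tool names to the functions your code runs.
TOOL_IMPLS = {"get_weather": get_weather}

def fake_model(messages):
    # Mocked model turn. The "model" decides it needs the weather tool.
    return {
        "stop_reason": "tool_use",
        "content": [{
            "type": "tool_use",
            "id": "toolu_01",
            "name": "get_weather",
            "input": {"location": "Tokyo"},
        }],
    }

# 1. Send prompt + tools array (the tools list is omitted from the mock).
messages = [{"role": "user", "content": "What's the weather in Tokyo?"}]
response = fake_model(messages)

# 2. Model responds with a tool_use intent.
for block in response["content"]:
    if block["type"] == "tool_use":
        # 3. Your code executes the matching function with the model's args.
        result = TOOL_IMPLS[block["name"]](**block["input"])
        # 4. Send the result back as a tool_result message.
        messages.append({
            "role": "user",
            "content": [{
                "type": "tool_result",
                "tool_use_id": block["id"],
                "content": result,
            }],
        })
```

In a real loop you would then call the model again with the updated `messages` so it can turn the tool output into a final answer.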

Synapse Verification

Query 1 of 1: Who actually executes the code when an agent calls a tool?

  - The LLM provider's servers
  - Your local application/orchestrator code (correct)
  - The user's browser
  - A third-party executing service