Function calling is the mechanism that transforms an LLM from a text generator into an agent. You define functions with JSON Schema parameters, and the model decides when and how to call them.
You pass function definitions in the `tools` array; when the model decides a call is needed, it responds with a tool call containing the function name and its arguments as JSON:

```javascript
const response = await openai.responses.create({
  model: "gpt-5.4",
  tools: [{
    type: "function",
    name: "get_stock_price",
    description: "Get the current stock price for a ticker symbol",
    parameters: {
      type: "object",
      properties: {
        symbol: { type: "string", description: "Stock ticker (e.g., AAPL)" },
        currency: { type: "string", enum: ["USD", "EUR", "GBP"] }
      },
      required: ["symbol"]
    }
  }],
  input: "What's Apple's stock price in euros?"
});

// Model returns a function_call item (note: arguments is a JSON string):
// { name: "get_stock_price", arguments: '{"symbol":"AAPL","currency":"EUR"}' }
```
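The model never runs anything itself; your code must execute the call and return the result. A minimal dispatch sketch (the local `get_stock_price` implementation and its price table are hypothetical placeholders):

```javascript
// Hypothetical local implementations, keyed by function name.
const functions = {
  get_stock_price: ({ symbol, currency = "USD" }) => {
    const prices = { AAPL: 210.0, MSFT: 425.5 }; // placeholder data
    return { symbol, currency, price: prices[symbol] };
  },
};

// Execute a function call returned by the model.
// The arguments field arrives as a JSON string and must be parsed.
function executeToolCall(call) {
  const fn = functions[call.name];
  if (!fn) throw new Error(`Unknown function: ${call.name}`);
  return fn(JSON.parse(call.arguments));
}

const result = executeToolCall({
  name: "get_stock_price",
  arguments: '{"symbol":"AAPL","currency":"EUR"}',
});
// result: { symbol: "AAPL", currency: "EUR", price: 210 }
```

In a full agent loop you would then send this result back to the model as a `function_call_output` item so it can compose the final answer.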
The model can also call multiple functions in parallel when the queries are independent:

```javascript
// User: "Compare AAPL and MSFT stock prices"
// Model returns TWO tool calls in one response:
// 1. get_stock_price({ symbol: "AAPL" })
// 2. get_stock_price({ symbol: "MSFT" })
```
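Handling parallel calls is just a loop over the returned items: execute each one and pair its result with the `call_id` so the model can match outputs to calls. A sketch using a simplified mock of the response shape (the two-item `output` array and price table below are illustrative, not a live API response):

```javascript
// Mocked response output containing two parallel function calls.
const output = [
  { type: "function_call", call_id: "call_1", name: "get_stock_price",
    arguments: '{"symbol":"AAPL"}' },
  { type: "function_call", call_id: "call_2", name: "get_stock_price",
    arguments: '{"symbol":"MSFT"}' },
];

const prices = { AAPL: 210.0, MSFT: 425.5 }; // placeholder data

// Execute every call and shape each result as a function_call_output
// item, ready to be sent back to the model in the next request.
const results = output
  .filter((item) => item.type === "function_call")
  .map((item) => {
    const args = JSON.parse(item.arguments);
    return {
      type: "function_call_output",
      call_id: item.call_id,
      output: JSON.stringify({ symbol: args.symbol, price: prices[args.symbol] }),
    };
  });
```

Because the calls are independent, real handlers can run concurrently (e.g. `Promise.all` over async implementations) before the combined results go back to the model.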