A key strength of an MCP Prompt is that it can assemble context before the conversation even starts. Inside your prompt handler, you can load external Resource data (or read files directly) and embed it in the returned messages.
```typescript
import { readFile } from "node:fs/promises";

server.prompt(
  "onboard_developer",
  {},
  async () => {
    // Dynamically assemble context at prompt-retrieval time.
    // Pass an encoding so we get a string, not a Buffer.
    const architecture = await readFile("architecture.md", "utf-8");
    return {
      messages: [{
        role: "user",
        content: {
          type: "text",
          text: `Here is the team architecture: ${architecture}\n\nPlease explain the build process.`
        }
      }]
    };
  }
);
```
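The handler above reads a single file. In practice you may want to pull in several documents at once and tolerate a missing one without failing the whole prompt. A minimal sketch of such a helper follows; `loadContextFiles` is a hypothetical name, not part of the MCP SDK, and the fallback message is an assumption about how you'd want to degrade.

```typescript
import { readFile } from "node:fs/promises";

// Hypothetical helper (not part of the MCP SDK): read several context
// files in parallel and substitute a placeholder for any that are
// missing, so one absent document does not break the prompt.
async function loadContextFiles(paths: string[]): Promise<string> {
  const parts = await Promise.all(
    paths.map(async (p) => {
      try {
        return await readFile(p, "utf-8");
      } catch {
        return `(${p} is unavailable)`;
      }
    })
  );
  return parts.join("\n\n");
}
```

Inside the prompt callback you would then write something like `const context = await loadContextFiles(["architecture.md", "conventions.md"])` and interpolate `context` into the message text.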
This pattern grounds the LLM in current, authoritative project context before the user asks their first question.