Dotprompt introduces the concept of "prompts as code."
Using YAML frontmatter and Handlebars variables, a product person can create a prompt file that defines a text prompt and sets the model and model parameters needed to execute it.
Then a developer puts that prompt file into the app's /prompts
folder and executes it as a regular function using Firebase Genkit. Cool, isn't it?
You can even store such prompt files separately from your main app code so that non-developers can iterate on them through something like the Genkit Developer UI, or just a text editor. Think of it as a designer creating a web page design in Figma, exporting it to React components, and handing those off to a developer for integration.
Let's look at a simple app that fetches the five largest cities in the world through one prompt and then uses that list to get the temperature in each city through another prompt. I'll use a simple (fake) temperature tool to keep the focus on the important parts.
I'll start by creating the cities.prompt
file:
---
model: googleai/gemini-2.0-flash
config:
  temperature: 0
input:
  schema:
    num: integer
output:
  format: json
  schema:
    cities(array): string
---
List top {{num}} largest cities in the world.
You can see the model and model params set there, as well as the num
input param for the number of cities.
That prompt should return a JSON object like:
{
  "cities": ["City1", "City2"]
}
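As a side note, here is a minimal sketch in plain TypeScript (no Genkit required) of how the app side could validate that JSON before using it; the interface and function names are my own, not anything Genkit generates:

```typescript
// Shape matching the cities prompt's declared output schema.
interface CitiesOutput {
  cities: string[]
}

// Parse the model's JSON text and verify it has the expected shape.
function parseCities(json: string): CitiesOutput {
  const data = JSON.parse(json)
  if (
    !Array.isArray(data.cities) ||
    !data.cities.every((c: unknown) => typeof c === 'string')
  ) {
    throw new Error('Unexpected cities output shape')
  }
  return data as CitiesOutput
}

console.log(parseCities('{"cities": ["City1", "City2"]}').cities)
```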
Now I'm going to use that array of cities as a parameter in another temperature.prompt
file:
---
model: googleai/gemini-2.5-flash
config:
  temperature: 0
tools:
  - temperature
input:
  schema:
    cities(array): string
output:
  format: json
  schema:
    cities(array):
      name: string
      temperature: string
---
Get temperature for the following cities:
{{#each cities}}
- {{this}}
{{/each}}
As you can see, I've used a more powerful model that can properly execute tools, since I need to call a tool to request the current temperature for each city.
Besides that, I used the #each
loop to list all the cities. The Dotprompt format supports conditions, loops, and other Handlebars helpers.
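To make the loop concrete, here's a rough sketch in plain TypeScript (not the actual Handlebars engine, which Dotprompt uses under the hood) of the text that the #each block renders to:

```typescript
// Hypothetical helper mimicking what the #each section produces at render
// time; the real rendering is done by Dotprompt's template engine.
function renderTemperaturePrompt(cities: string[]): string {
  const lines = cities.map((city) => `- ${city}`)
  return ['Get temperature for the following cities:', ...lines].join('\n')
}

console.log(renderTemperaturePrompt(['Tokyo', 'Delhi']))
// Get temperature for the following cities:
// - Tokyo
// - Delhi
```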
Notice that I plugged in a temperature
tool, which I'll implement in code shortly.
This prompt should return JSON like:
{
  "cities": [
    {"name": "City1", "temperature": "10°C"},
    {"name": "City2", "temperature": "20°C"}
  ]
}
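On the app side, that shape could be mirrored with plain TypeScript interfaces (the names here are my own assumptions based on the Picoschema above, not generated by Genkit):

```typescript
// Interfaces mirroring the temperature prompt's declared output schema.
interface CityTemperature {
  name: string
  temperature: string
}

interface TemperatureOutput {
  cities: CityTemperature[]
}

// Parsing a sample response into the typed shape.
const example: TemperatureOutput = JSON.parse(
  '{"cities": [{"name": "City1", "temperature": "10°C"}]}'
)
console.log(example.cities[0].name)
```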
Let's glue it all together with Genkit:
import { genkit, z } from 'genkit'
import { gemini, googleAI } from '@genkit-ai/googleai'

const ai = genkit({
  plugins: [googleAI({ apiKey: process.env.GOOGLE_API_KEY })],
  model: gemini('gemini-2.0-flash'),
  // Set the path to your prompt files.
  promptDir: 'src/prompts',
})

// Define a simple (fake) city temperature tool.
ai.defineTool(
  {
    name: 'temperature',
    description: 'Gets current temperature in the given city',
    inputSchema: z.object({
      city: z
        .string()
        .describe('The city to get the current temperature for'),
    }),
    outputSchema: z.string(),
  },
  async ({ city }) => {
    // Fake temperature derived from the city name; swap in a real
    // weather API call in production.
    const temperature = (Math.random() * city.length * 10 - 20).toFixed(0)
    return `${temperature}°C`
  }
)

// Get the list of the 5 largest cities in the world using the cities prompt.
const citiesPrompt = await ai.prompt('cities')
const { text: citiesText } = await citiesPrompt({ num: 5 })
const { cities } = JSON.parse(citiesText)

// Get the temperature for each city through the temperature prompt,
// which calls the temperature tool by its name.
const temperaturePrompt = await ai.prompt('temperature')
const { text: temperatureText } = await temperaturePrompt(
  { cities },
  // We need a few turns here so the model can call the tool multiple times.
  { maxTurns: 10 }
)
console.log(temperatureText)
As you can see, the code is pretty simple thanks to Dotprompt and Genkit, which turn prompts into executable functions.
Check out the entire project here: kometolabs/genkit-dotprompt-example.
With Dotprompt, product people can work on the prompts independently while you're building the UI and the rest of the app. Another nice thing is that prompt authors can version-control prompts in a separate repo, so you can pull their updates through Git submodules, for example.
I bet Dotprompt has the potential to become a standard, similar to the Model Context Protocol by Anthropic. What do you think?