AI agents are transforming intelligent automation, and Mastra is leading the way. In this post, I’ll show how I built the Findwork Agent — an AI agent that fetches real job listings and interacts naturally with users. We’ll briefly cover:
1. What Mastra is and how it works
2. Setting up your Google API key (via AI Studio)
3. Understanding the A2A Protocol and multi-agent interaction
4. Creating and deploying your agent to Mastra Cloud
5. Integrating your agent with Telex
🧠 What Is Mastra?
Mastra is an open-source framework for building intelligent AI agents that can reason, use external APIs, and communicate with one another via the A2A protocol. Check out the documentation to learn more about Mastra and how to install it. In simpler terms, Mastra gives structure to AI workflows. Each agent has:
- A name
- Instructions (its role and tone)
- Tools (functions or APIs it can call)
- And an execution context where it processes data
🔐 Setting Up the Google API Key (from AI Studio)
To use Gemini models (like gemini-1.5-pro or gemini-2.0-flash), you’ll need a Google API key from AI Studio. You can enter it during Mastra setup or add it later in your project configuration. Steps:
1. Visit Google AI Studio.
2. Click Create API Key.
3. Copy the key and store it safely.
4. Go to findwork.dev and generate your Findwork API key as well.
In your project’s root directory, create a .env file. This file is used to store environment variables such as API keys and secret tokens, which can be accessed in your code using process.env.
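For example, a minimal .env for this project might look like the sketch below. FINDWORK_API_KEY is the variable our Findwork tool reads later in this post; the Google key name follows the AI SDK Google provider’s convention, so double-check your provider’s docs if yours differs:
# .env — example values only; never commit real keys to version control
GOOGLE_GENERATIVE_AI_API_KEY=your-ai-studio-key
FINDWORK_API_KEY=your-findwork-api-key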
🤝 Understanding the A2A Protocol
Mastra implements the A2A (Agent-to-Agent) protocol, which allows multiple agents to collaborate intelligently. For example, in our setup: The Findwork Agent focuses on finding job listings via the Findwork API. Another agent (like a Career Assistant) could generate a cover letter based on those listings.
These agents can pass structured data to one another — e.g., one agent fetches data, and the other interprets it. This decoupled structure keeps your AI system modular, scalable, and reusable.
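As a rough sketch of that hand-off (findworkAgent is the agent we define later in this post, and careerAssistantAgent is purely hypothetical), chaining two agents might look like this:
// Sketch only: chain two agents over structured data.
// findworkAgent is defined later in this post; careerAssistantAgent is hypothetical.
const jobSearch = await findworkAgent.generate([
  { role: 'user', content: 'Find remote React jobs' },
]);
// Hand the first agent's output to a second agent for interpretation.
const coverLetter = await careerAssistantAgent.generate([
  { role: 'user', content: `Write a cover letter for this listing:\n${jobSearch.text}` },
]);
console.log(coverLetter.text);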
🧩 Creating the Findwork Tool
The Findwork API provides access to real-time job listings. Below is the Mastra tool that powers our agent:
import { createTool } from '@mastra/core/tools';
import { z } from 'zod';
interface JobResponse {
count: number;
results: {
role: string;
company_name: string;
company_num_employees: string | number | null;
employment_type: string | null;
location: string;
remote: boolean;
url: string;
text: string;
}[];
}
export const jobTool = createTool({
id: 'get-job',
description: 'Fetch relevant job postings from Findwork based on location, skills, and remote preference.',
inputSchema: z.object({
location: z.string().optional(),
skills: z.string(),
remote: z.boolean().optional()
}),
outputSchema: z.object({
count: z.number(),
results: z.array(z.object({
role: z.string(),
company_name: z.string(),
company_num_employees: z.union([z.string(), z.number(), z.null()]),
employment_type: z.string().nullable(),
location: z.string(),
remote: z.boolean(),
url: z.string(),
text: z.string(),
})),
}),
execute: async ({ input }) => {
const { location, skills, remote } = input;
const params = new URLSearchParams();
if (skills) params.append('search', skills);
if (location) params.append('location', location);
if (remote) params.append('remote', 'true');
const url = `https://findwork.dev/api/jobs/?${params.toString()}`;
const jobResponse = await fetch(url, {
headers: {
Authorization: `Token ${process.env.FINDWORK_API_KEY}`,
},
});
if (!jobResponse.ok) throw new Error('Error fetching jobs: ' + jobResponse.statusText);
return (await jobResponse.json()) as JobResponse;
},
});
🧑‍💻 Creating the Findwork Agent
Now that we have a tool, let’s create our agent that uses it.
import { Agent } from '@mastra/core/agent';
import { Memory } from '@mastra/memory';
import { LibSQLStore } from '@mastra/libsql';
import { jobTool } from '../tools/jobber-tool';
import { scorers } from '../scorers/weather-scorer';
export const findworkAgent = new Agent({
name: 'Findwork Agent',
instructions: `
You are a professional job search assistant who helps users find relevant and recent job opportunities.
**Core Responsibilities:**
- Use the jobTool to search for jobs based on the user's query.
- Understand the user's input to determine:
- The job title or key skills they are interested in.
- The preferred job location (if mentioned).
- Whether they are looking for remote positions.
**Behavior Guidelines:**
- Always call the jobTool with the correct parameters: location, skills, and remote.
- If the user does not specify a location, default to remote job searches.
- If neither skills nor location are provided, politely ask the user for more details.
- Present job results clearly — include role, company name, location, remote status, and URL.
- Keep responses concise and professional.
**Tone:**
- Be helpful, direct, and conversational.
- Avoid unnecessary filler text; focus on delivering useful job results quickly.
**Tools:**
- Use only the jobTool to fetch job listings from Findwork.
`,
model: 'google/gemini-2.5-pro',
tools: { jobTool },
scorers: {
toolCallAppropriateness: {
scorer: scorers.toolCallAppropriatenessScorer,
sampling: {
type: 'ratio',
rate: 1,
},
},
completeness: {
scorer: scorers.completenessScorer,
sampling: {
type: 'ratio',
rate: 1,
},
},
translation: {
scorer: scorers.translationScorer,
sampling: {
type: 'ratio',
rate: 1,
},
},
},
memory: new Memory({
storage: new LibSQLStore({
url: 'file:../mastra.db',
}),
}),
})
↘️ Creating the A2A Route Handler
The key component connecting Mastra agents to the A2A protocol is a custom route handler. It ensures agent responses are properly formatted in A2A structure, while also managing artifacts and maintaining conversation history.
import { registerApiRoute } from '@mastra/core/server';
import { randomUUID } from 'crypto';
type MessagePart =
| { kind: 'text'; text: string }
| { kind: 'data'; data: unknown };
type A2AMessage = {
role: string;
parts?: MessagePart[];
messageId?: string;
taskId?: string;
};
export const a2aAgentRoute = registerApiRoute('/a2a/agent/:agentId', {
method: 'POST',
handler: async (c) => {
try {
const mastra = c.get('mastra');
const agentId = c.req.param('agentId');
// Parse JSON-RPC 2.0 request
const body = await c.req.json();
const { jsonrpc, id: requestId, method, params } = body;
// Validate JSON-RPC 2.0 format
if (jsonrpc !== '2.0' || !requestId) {
return c.json({
jsonrpc: '2.0',
id: requestId || null,
error: {
code: -32600,
message: 'Invalid Request: jsonrpc must be "2.0" and id is required'
}
}, 400);
}
const agent = mastra.getAgent(agentId);
if (!agent) {
return c.json({
jsonrpc: '2.0',
id: requestId,
error: {
code: -32602,
message: `Agent '${agentId}' not found`
}
}, 404);
}
// Extract messages from params
const { message, messages, contextId, taskId, metadata } = params || {};
let messagesList: A2AMessage[] = [];
if (message) {
messagesList = [message];
} else if (messages && Array.isArray(messages)) {
messagesList = messages;
}
// Convert A2A messages to Mastra format
const mastraMessages = messagesList.map((msg) => ({
role: msg.role,
content: msg.parts?.map((part:MessagePart) => {
if (part.kind === 'text') return part.text;
if (part.kind === 'data') return JSON.stringify(part.data);
return '';
}).join('\n') || ''
}));
// Execute agent
const response = await agent.generate(mastraMessages);
const agentText = response.text || '';
// Build artifacts array
const artifacts:any = [
{
artifactId: randomUUID(),
name: `${agentId}Response`,
parts: [{ kind: 'text', text: agentText }]
}
];
// Add tool results as artifacts
if (response.toolResults && response.toolResults.length > 0) {
artifacts.push({
artifactId: randomUUID(),
name: 'ToolResults',
parts: response.toolResults.map((result) => ({
kind: 'text',
text: JSON.stringify(result)
}))
});
}
// Build conversation history
const history = [
...messagesList.map((msg) => ({
kind: 'message',
role: msg.role,
parts: msg.parts,
messageId: msg.messageId || randomUUID(),
taskId: msg.taskId || taskId || randomUUID(),
})),
{
kind: 'message',
role: 'agent',
parts: [{ kind: 'text', text: agentText }],
messageId: randomUUID(),
taskId: taskId || randomUUID(),
}
];
// Return A2A-compliant response
return c.json({
jsonrpc: '2.0',
id: requestId,
result: {
id: taskId || randomUUID(),
contextId: contextId || randomUUID(),
status: {
state: 'completed',
timestamp: new Date().toISOString(),
message: {
messageId: randomUUID(),
role: 'agent',
parts: [{ kind: 'text', text: agentText }],
kind: 'message'
}
},
artifacts,
history,
kind: 'task'
}
});
} catch (error:any) {
return c.json({
jsonrpc: '2.0',
id: null,
error: {
code: -32603,
message: 'Internal error',
data: { details: error.message || String(error) }
}
}, 500);
}
}
});
✍️ Registering Our Agent and Custom Route with Mastra
We need to let Mastra know about our agent and our custom route.
import { Mastra } from '@mastra/core/mastra';
import { PinoLogger } from '@mastra/loggers';
import { LibSQLStore } from '@mastra/libsql';
import { findworkAgent } from './agents/findwork-agent';
import { a2aAgentRoute } from './routes/a2a-route';
export const mastra = new Mastra({
agents: { findworkAgent },
server: {
apiRoutes: [a2aAgentRoute]
},
storage: new LibSQLStore({
// stores observability, scores, ... in memory; to persist, change the url to file:../mastra.db
url: ":memory:",
}),
logger: new PinoLogger({
name: 'Mastra',
level: 'info',
}),
});
🧪 Testing with Postman
Now that your agent is deployed, you can test it with Postman. Send the following JSON-RPC body as a POST request to your agent’s /a2a/agent/findworkAgent endpoint:
{
"jsonrpc": "2.0",
"id": "request-001",
"method": "message/send",
"params": {
"message": {
"kind": "message",
"role": "user",
"parts": [
{
"kind": "text",
"text": "i need a job in london with react skill in high demand?"
}
],
"messageId": "msg-001",
"taskId": "task-001"
},
"configuration": {
"blocking": true
}
}
}
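If you’d rather test from a script than from Postman, here is a minimal TypeScript sketch that sends the same body. It assumes the Mastra dev server’s default local address (http://localhost:4111); once deployed, swap in your Mastra Cloud URL, such as the one used in the Telex config below.
// test-a2a.ts — minimal sketch; assumes the local Mastra dev server is running.
const response = await fetch('http://localhost:4111/a2a/agent/findworkAgent', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    jsonrpc: '2.0',
    id: 'request-001',
    method: 'message/send',
    params: {
      message: {
        kind: 'message',
        role: 'user',
        parts: [{ kind: 'text', text: 'i need a job in london with react skill in high demand?' }],
        messageId: 'msg-001',
        taskId: 'task-001',
      },
      configuration: { blocking: true },
    },
  }),
});
// Pretty-print the A2A task returned by the route handler.
console.log(JSON.stringify(await response.json(), null, 2));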
The agent responds with a JSON-RPC result containing the completed task, the generated answer, and the tool results as artifacts.
🔗 Integration with Telex
Telex.im allows you to deploy your Mastra agent as a co-worker.
Step 1: Create an AI Co-Worker in Telex. In your Telex dashboard, navigate to the AI Co-Workers section and create a new co-worker.
Step 2: Define Your Workflow. In the workflow editor, paste the following workflow definition. The key component here is the node definition, which tells Telex how to communicate with your Mastra agent:
{
"active": false,
"category": "utilities",
"description": "A workflow that gets job posted findwork",
"id": "sGC3u7y4vBaZww0G",
"name": "findwork agent",
"long_description": "You are a professional job search assistant who helps users find relevant and recent job opportunities. Core Responsibilities: Use the jobTool to search for jobs based on the user's query. - Understand the user's input to determine: - The job title or key skills they are interested in - The preferred job location (if mentioned). - Whether they are looking for remote positions. Behavior Guidelines: - Always call the jobTool with the correct parameters: location, skills, and remote. - If the user does not specify a location, default to remote job searches. - If neither skills nor location are provided, politely ask the user for more details. - Present job results clearly — include role, company name, location, remote status, and URL. - Keep responses concise and professional. Tone: - Be helpful, direct, and conversational. - Avoid unnecessary filler text; focus on delivering useful job results quickly. Tools: - Use only the jobTool to fetch job listings from Findwork.",
"short_description": "Search for jobs by location, skill, and remote status using the Findwork API.",
"nodes": [
{
"id": "findwork_agent",
"name": "findwork agent",
"parameters": {},
"position": [816, -112],
"type": "a2a/mastra-a2a-node",
"typeVersion": 1,
"url": "https://findwork-agent.mastra.cloud/a2a/agent/findworkAgent"
}
],
"pinData": {},
"settings": {
"executionOrder": "v1"
}
}
Now you can start exploring jobs that match your skills and interests! 🚀
🤖 How It Works
When a user interacts with your Telex workflow, it sends an A2A request — just like the POST request body we tested earlier in Postman. Mastra then receives this request, extracts the user’s query, uses the appropriate tool to fetch the requested data, and generates a clear, user-friendly response. Isn’t that beautiful? ✨
🥳 Conclusion
By combining Mastra, Gemini, and the Findwork API, we built an intelligent job-finding agent that:
1. Understands natural language queries
2. Fetches live job listings
3. Can be extended with other AI agents (like cover letter or interview coaches)
This is the power of A2A architecture — modular, scalable, and intelligent.