This is a submission for the Agentic Postgres Challenge with Tiger Data
What I Built
Overview
GitResume is an AI-powered platform that analyzes GitHub repositories to provide coders with professional insights & career guidance. This application uses 4 specialized AI agents running in parallel to evaluate code quality, technology choices, career readiness, & innovation across selected repositories.
Built with Tiger Cloud's Agentic Postgres, this transforms what was previously a 1-2 min sequential analysis process into a sub-10 sec real-time experience. Coders can select their best 4-6 repositories & receive comprehensive feedback including individual repository breakdowns, career trajectory detection, & actionable recommendations for professional growth.
The system integrates the GitHub API for repository data, implements multi-agent coordination through Tiger Cloud's database forks, & provides a clean web interface for GitHub portfolio analysis & career-planning assessment.
The Core Problem
Many of us coders struggle to effectively communicate our technical abilities. Traditional resumes list technologies & job titles, but they don't capture what really matters: how we actually code, solve problems, & build solutions.
For coders, especially developers, our GitHub repositories are our real portfolio - they contain the evidence of our skills, growth, & technical decision-making. Yet translating that code into career opportunities remains a challenge.
The Solution
GitResume analyzes our selected repositories (typically 4-6 of our best projects) and provides:
- Multi-agent analysis across 4 key dimensions: code architecture, technology choices, career readiness, & innovation.
- Individual repository insights with specific feedback on each project.
- Career trajectory detection based on our actual coding patterns.
- Actionable recommendations for professional development.
Why It Matters
GitResume addresses a real need in the developer community: turning our actual work into career advancement opportunities. By analyzing the code we've already written, it provides insights that help us understand our strengths, identify growth areas, & position ourselves more effectively for the roles we want.
It demonstrates how modern database architecture can enable new categories of developer productivity tools that provide immediate, actionable value.
Demo
Live Application
Experience GitResume in action - analyze your GitHub repositories & receive professional insights in under 10 secs.
Check it out here: GitResumeAssessment
GitHub Repository
Check out my source code here:
Divya4879 / GitResume
Transform your GitHub into a professional resume with multi-agent AI analysis.
GitResume: TigerData-Powered GitHub Resume Analyzer
Transform your GitHub repositories into professional developer insights with AI-powered multi-agent analysis
GitResume leverages Tiger Cloud's Agentic Postgres architecture to provide comprehensive analysis of GitHub repositories through 4 specialized AI agents. The platform integrates Tiger CLI for service management and implements a multi-agent system that analyzes real repository code, providing actionable career guidance and professional development recommendations.
Live Demo
Check it out here: GitResumeAssessment
Key Features
Multi-Agent AI Analysis System
4 Specialized AI Agents working in parallel:
- Code Architect: Analyzes code structure, design patterns, and architectural quality.
- Tech Scout: Evaluates technology stack, framework usage, and modern practices.
- Career Advisor: Assesses professional readiness and portfolio quality.
- Innovation Detector: Identifies cutting-edge technologies and problem-solving approaches.
Advanced Tiger Cloud Integration
- pg_text Search: Semantic pattern detection across repositories.
- Agent Learning Evolution: AI agents improve accuracy over…
Project Demo
A complete walkthrough of GitResumeAssessment's features, from entering a GitHub username and selecting repositories to the final professional-level assessment & guidance.
Project Snapshots
How I Used Agentic Postgres
1. Tiger CLI: Service Orchestration
Agentic Postgres Feature: Tiger CLI provides a command-line interface for managing Tiger Cloud services, enabling programmatic DB operations & service lifecycle management.
How I Used It: Automated the creation and management of Tiger services for multi-agent coordination. The CLI integration allows GitResume to dynamically provision database infrastructure for each analysis session.
Why It's Better: Eliminates manual database setup, enables on-demand scaling, & provides programmatic control over DB resources. This transforms GitResume from a static application into a dynamic, infrastructure-aware system.
// Automated Tiger service creation for multi-agent system
async initializeMultiAgentSystem(username: string): Promise<void> {
  try {
    // Create Tiger service programmatically
    const serviceResult = execSync('./bin/tiger service create --name advanced-gitresume', {
      encoding: 'utf-8',
      cwd: process.cwd()
    });
    this.tigerServiceId = serviceResult.trim().split(' ').pop() || '';
    console.log(`Tiger Service Created: ${this.tigerServiceId}`);
  } catch (error) {
    console.log('Tiger service creation failed, try again');
  }
}
2. Fast Zero-Copy Forks: Agent Isolation
Agentic Postgres Feature: Zero-copy database forks create instant, isolated database instances without data duplication, enabling parallel processing with complete data isolation.
How I Used It: Each of the 4 AI agents (code-architect, tech-scout, career-advisor, innovation-detector) gets its own dedicated database fork, allowing true parallel analysis without data conflicts.
Why It's Revolutionary: Traditional databases require expensive data replication for isolation. Tiger's zero-copy forks enable instant agent workspaces, reducing setup time from mins to secs & enabling real-time multi-agent collaboration.
// Create isolated workspaces for each AI agent
const agents = ['code-architect', 'tech-scout', 'career-advisor', 'innovation-detector'];
for (const agent of agents) {
  try {
    const forkResult = execSync(`./bin/tiger fork create --service ${this.tigerServiceId} --name ${agent}-workspace`, {
      encoding: 'utf-8',
      cwd: process.cwd()
    });
    const forkId = forkResult.trim().split(' ').pop() || '';
    this.agentForks.set(agent, forkId);
    // Initialize agent-specific schema
    await this.initializeAgentWorkspace(agent, forkId);
  } catch (error) {
    console.log(`Fork creation failed for ${agent}, using shared workspace`);
  }
}
3. Agent Workspace Schema Design
Agentic Postgres Feature: Full PostgreSQL compatibility with agent-specific table structures & indexing for optimized AI workloads.
How I Used It: Each agent fork contains specialized tables for insights, learnings, & pattern detection, enabling agents to build knowledge over time & share insights across analysis sessions.
Why It's Powerful: Transforms AI agents from stateless functions to learning entities with persistent memory, enabling continuous improvement & cross-session knowledge retention.
// Agent-specific schema for learning and insights
private async initializeAgentWorkspace(agent: string, forkId: string): Promise<void> {
  const schema = `
    CREATE TABLE IF NOT EXISTS ${agent}_insights (
      id SERIAL PRIMARY KEY,
      repository TEXT,
      pattern TEXT,
      insight TEXT,
      confidence FLOAT,
      created_at TIMESTAMP DEFAULT NOW()
    );
    CREATE TABLE IF NOT EXISTS ${agent}_learnings (
      id SERIAL PRIMARY KEY,
      pattern_type TEXT,
      learning TEXT,
      success_rate FLOAT,
      updated_at TIMESTAMP DEFAULT NOW()
    );
  `;
  // Run the schema against the fork identified by forkId
  // (connection plumbing elided in this snippet)
  console.log(`Initialized workspace for ${agent}`);
}
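For illustration, here's how an agent might write a finding into its per-agent insights table. The `buildInsightInsert` helper below is a hypothetical sketch, not part of the project's code; it just shows a safely parameterized INSERT against the schema above:

```typescript
// A row destined for one of the per-agent <agent>_insights tables.
interface Insight {
  repository: string;
  pattern: string;
  insight: string;
  confidence: number;
}

// Build a parameterized INSERT for an agent's insights table.
// Table names can't be bound as query parameters, so the agent name
// is sanitized before being interpolated (e.g. "code-architect" ->
// "code_architect_insights").
function buildInsightInsert(agent: string, row: Insight): { sql: string; params: unknown[] } {
  const table = `${agent.replace(/[^a-z0-9_]/gi, '_')}_insights`;
  return {
    sql: `INSERT INTO ${table} (repository, pattern, insight, confidence) VALUES ($1, $2, $3, $4)`,
    params: [row.repository, row.pattern, row.insight, row.confidence],
  };
}
```

The resulting `{ sql, params }` pair would be handed to whatever Postgres client the fork connection uses.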
4. Parallel Agent Coordination
Agentic Postgres Feature: Multi-database coordination enabling simultaneous operations across multiple isolated environments with eventual consistency.
How I Used It: Orchestrated 4 specialized agents to analyze repositories simultaneously, with each agent contributing unique insights that are aggregated into comprehensive career profiles.
Why It's Game-Changing: Reduced analysis time from 1-2 mins (sequential) to under 10 secs (parallel), while maintaining data integrity & enabling sophisticated cross-agent pattern detection.
// Parallel agent execution with real-time coordination
async analyzeWithAdvancedAgents(username: string, repositories: string[]) {
  // Initialize multi-agent system with Tiger forks
  await this.initializeMultiAgentSystem(username);
  // Run all agents in parallel across repositories
  const agentPromises = repositories.map(async (repo) => {
    return await this.runParallelAgentAnalysis(username, repo);
  });
  // Aggregate results from all agents
  const repoAnalyses = await Promise.all(agentPromises);
  const allInsights = repoAnalyses.flat(); // combine per-repository insight lists
  // Cross-repository pattern detection
  const crossRepoPatterns = await this.detectCrossRepoPatterns(allInsights);
  return {
    insights: allInsights,
    careerProfile: await this.generateCareerProfile(allInsights, crossRepoPatterns),
    crossRepoPatterns,
    learningEvolution: await this.updateAgentLearnings(allInsights)
  };
}
5. pg_text Search: Semantic Pattern Detection
Agentic Postgres Feature: PostgreSQL's full-text search capabilities via the to_tsvector and plainto_tsquery functions, enabling semantic analysis & pattern matching across large text datasets.
How I Used It: Implemented cross-repository semantic analysis to detect technology patterns, coding approaches, & architectural decisions across a developerâs entire portfolio. The system searches for semantic relationships between repositories using natural language processing.
Why It's Revolutionary: Traditional keyword matching misses semantic relationships. pg_text search enables GitResume to understand that "authentication," "auth," "JWT," & "OAuth" are related concepts, providing deeper insights into a developer's expertise patterns across projects.
// pg_text search implementation for semantic pattern detection
private async pgTextSearchPatterns(insights: AgentInsight[]): Promise<any[]> {
  const searchTerms = ['react', 'typescript', 'api', 'authentication', 'testing', 'deployment'];
  const patterns: any[] = [];
  for (const term of searchTerms) {
    // Real PostgreSQL full-text search query
    const query = `
      SELECT repository, pattern, insight,
             ts_rank(to_tsvector('english', insight), plainto_tsquery($1)) as relevance
      FROM agent_insights
      WHERE to_tsvector('english', insight) @@ plainto_tsquery($1)
      ORDER BY relevance DESC
      LIMIT 10;
    `;
    // Execute the query against the shared insights fork
    // (runQuery stands in for the connection helper elided in this snippet)
    const semanticMatches = await this.runQuery(query, [term]);
    if (semanticMatches.length > 1) {
      patterns.push({
        pattern: `semantic-${term}`,
        searchMethod: 'pg_text_search',
        relevanceScore: semanticMatches.reduce((sum, i) => sum + i.relevance, 0) / semanticMatches.length
      });
    }
  }
  return patterns;
}
6. Fluid Storage: Dynamic Resource Scaling
Agentic Postgres Feature: Intelligent storage management that dynamically scales resources based on workload complexity, enabling efficient processing of varying data sizes without manual configuration.
How I Used It: Implemented adaptive repository analysis where large or complex repositories (10MB+ or high star count) automatically trigger distributed processing across multiple agent forks, while smaller repositories use optimized single-fork processing.
Why It's Game-Changing: Eliminates the "one-size-fits-all" limitation of traditional databases. GitResume automatically adapts its processing strategy based on repository complexity, ensuring optimal performance whether analyzing a simple script or a massive enterprise codebase.
// Fluid Storage: Dynamic scaling based on repository complexity
private async fetchRepositoryData(username: string, repo: string, token: string): Promise<any> {
  // Assess repository complexity for intelligent scaling
  const repoComplexity = await this.assessRepositoryComplexity(username, repo, token);
  if (repoComplexity.isLarge) {
    console.log(`Using Fluid Storage for large repository: ${repo}`);
    return await this.fluidStorageFetch(username, repo, token);
  } else {
    console.log(`Using standard fetch for repository: ${repo}`);
    return await this.standardRepositoryFetch(username, repo, token);
  }
}
private async fluidStorageFetch(username: string, repo: string, token: string): Promise<any> {
  // Distributed fetching across multiple agent forks for large repositories
  const agents = Array.from(this.agentForks.keys());
  // (GitHub API calls that populate repoInfo, tree, and readme elided here)
  // Fluid Storage: Distribute file analysis across agent forks
  const importantFiles = (tree.tree || []).filter((file: any) =>
    file.type === 'blob' && this.isAnalysisWorthy(file)
  ).slice(0, 20); // Intelligent file limiting
  return {
    info: repoInfo,
    tree: tree.tree || [],
    readme,
    fluidStorage: {
      used: true,
      agentsUsed: agents.length,
      filesDistributed: importantFiles.length,
      distributionStrategy: 'agent-fork-based'
    }
  };
}
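The `assessRepositoryComplexity` call above isn't shown in the snippet. A minimal sketch of the size/popularity check it implies could look like the following; the thresholds and field names here are my assumptions, based on the "10MB+ or high star count" criterion described earlier:

```typescript
// Minimal stats shape, mirroring fields the GitHub repos API returns.
interface RepoStats {
  size: number;             // repository size in KB, as GitHub reports it
  stargazers_count: number; // star count
}

// Decide whether a repository should take the distributed (Fluid Storage)
// path or the single-fork path. Thresholds are illustrative only.
function isLargeRepository(stats: RepoStats): boolean {
  const TEN_MB_IN_KB = 10 * 1024;
  return stats.size >= TEN_MB_IN_KB || stats.stargazers_count >= 100;
}
```

In the real flow, this boolean would become the `repoComplexity.isLarge` flag that routes a repository to `fluidStorageFetch` or `standardRepositoryFetch`.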
Overall Experience
What Worked Well
Tiger Cloud's architecture enables genuine innovation in developer tooling. The database fork concept is transformative - giving each AI agent its own isolated workspace while maintaining data consistency is exactly what this multi-agent system needed. The documentation is surprisingly comprehensive for a cutting-edge platform, making the learning curve smoother than expected for my first-time experience with Agentic Postgres.
What Surprised Me
The performance improvement was staggering. Moving from my previous non-Tiger implementation (1-2 mins) to Tiger Cloud's Agentic Postgres (5-10 secs) wasn't just optimization; it fundamentally changed the entire user experience from "submit and wait" to "watch real-time analysis." The efficiency & speed of Tiger Cloud services exceeded my expectations.
Key Challenges & Solutions
Challenge 1: Free Tier Service Limitations
- Problem: Only 2 services per free tier, but I initially wanted 4+ dedicated agent workspaces.
- Reality Check: Hit this limit immediately during development.
- Solution: Redesigned architecture with intelligent fallback - agents share workspaces when fork creation fails.
- Learning: Always design for graceful degradation, especially with cloud resource constraints.
Challenge 2: GitHub API Rate Limits Crisis
- Problem: First local test consumed 5023+ requests in one analysis run, hitting the 5000/hour limit.
- Impact: Had to wait 1 hour before I could test again - full panic mode!
- Solution: Complete optimization overhaul using Tiger Cloud's caching capabilities.
- Result: Reduced to <100 requests per analysis through intelligent file filtering and Tiger storage.
// Emergency optimization that saved the project
const importantFiles = tree.tree?.filter((file: any) =>
  file.type === 'blob' && (
    file.path.includes('README') ||
    file.path.endsWith('.js') ||
    file.path.endsWith('.ts') ||
    file.path === 'package.json'
  )
).slice(0, 10); // Ruthless limiting to essential files only
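The Tiger-backed caching layer itself isn't shown above. As a rough illustration of the idea behind cutting repeat GitHub requests, a TTL cache keyed by request URL can short-circuit calls for data fetched recently; `RequestCache` here is a hypothetical in-memory stand-in for the Tiger storage cache, not the project's actual implementation:

```typescript
// In-memory TTL cache sketch: each entry expires after ttlMs milliseconds,
// so repeat lookups within the window never hit the GitHub API again.
class RequestCache<T> {
  private store = new Map<string, { value: T; expires: number }>();

  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const hit = this.store.get(key);
    if (!hit) return undefined;
    if (Date.now() > hit.expires) {
      // Entry is stale: drop it and report a miss.
      this.store.delete(key);
      return undefined;
    }
    return hit.value;
  }

  set(key: string, value: T): void {
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
  }
}
```

Used as a read-through layer (check the cache, call the API only on a miss, then store the result), this pattern is what lets one analysis run reuse repository trees instead of refetching them.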
Challenge 3: Tiger Cloud Service Outages
- Problem: Encountered Tiger Cloud outages during development.
- Reality: Had to build robust fallback systems for production reliability.
- Solution: Implemented fallbacks to maintain functionality even when Tiger services are unavailable.
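A minimal sketch of the fallback pattern this implies, assuming the real code wraps Tiger calls in something similar (`withFallback` is an illustrative name, not from the project):

```typescript
// Graceful degradation: try the Tiger-backed path first, and fall back to
// a local implementation if the service call throws (e.g. during an outage).
async function withFallback<T>(
  primary: () => Promise<T>,
  fallback: () => Promise<T>,
): Promise<T> {
  try {
    return await primary();
  } catch {
    return await fallback();
  }
}
```

For example, fork-backed analysis could be wrapped as `withFallback(() => analyzeViaFork(repo), () => analyzeShared(repo))`, so an outage degrades to the shared-workspace path instead of failing the whole run.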
Development Reality Check
This was my first experience with Tiger Cloud, Tiger CLI, & Agentic Postgres; I was essentially learning everything from scratch. Despite being new to the platform, I managed to build a working multi-agent system in over 20 hours of development. The fact that a newcomer could achieve this level of integration speaks volumes about Tiger Cloud's developer experience.
Additional complexity: This was also my first Next.js project, adding another learning curve, but the combination worked seamlessly.
Key Learnings
- Agentic Postgres isn't just a database; it's a platform for building intelligent, collaborative systems.
- Zero-copy forks enable architectural patterns that simply weren't possible with traditional databases.
- Resource constraints drive innovation - the free tier limitations forced better design decisions, for me at least.
- Performance optimization through intelligent caching can be more impactful than code optimization.
- Always plan for service outages - robust fallbacks are essential for production applications.
My Experience building with Agentic Postgres
Tiger Cloud transformed what could have been a slow, batch-processing tool into a real-time, interactive developer assistant. The 750MB storage limit on the free tier proved more than adequate, & the service creation limitations actually led to a more efficient architecture.
Bottom line: Tiger Cloud didn't just improve my application; it enabled an entirely new category of developer productivity tool that provides immediate, actionable value.
Thank You
Building GitResume has been an incredible journey for me. Tiger Cloud didn't just provide a database - it provided a new way of thinking about AI agents & applications. The ability to give each AI agent its own workspace through zero-copy forks opened up architectural possibilities I'd never imagined.
To the Tiger Data team: Thank you for creating a technology that enables developers like me to build things that seemed impossible just months ago. The seamless integration between Tiger CLI, database forks, & Agentic Postgres features made this hackathon project feel less like wrestling with infrastructure and more like pure innovation.
To the developer community: GitResume exists because we all know that our code tells our story better than any traditional resume ever could. I hope this platform helps fellow developers showcase their true capabilities & land the opportunities they deserve.
The future of developer tools is collaborative AI systems, & Tiger Cloud has given us the foundation to build that future. GitResume is just the beginning.
And thank you, dear reader, for reading till the end!