
GitHub shows totals and a grid. You already know you wrote code this year; what you want is the story behind it.
CommitRecap turns a username into a guided recap that feels personal, visual, and easy to share.
Live demo: https://commit-recap.vercel.app
The landing screen sets the promise
One input, one action, no detours. The line "We only access public GitHub data" tells you the scope and removes privacy anxiety before you start.
I built it this way because a recap is a flow. If you slow people down at the start, they never reach the pages that make them smile. The start screen exists to reduce friction and make the next click inevitable.
The recap flow is built to move

The large number gives you the headline: commits in the year. Then the supporting stats follow, like PRs and reviews. It's the right order. Developers scan the big number, then confirm with the smaller ones.
The activity timeline underneath is the second act. It highlights the busiest day with a sharp spike. That single highlight does more work than a dense chart. It gives you a memory: you can look at that peak and think about what you shipped.
The ending needs to feel shareable

The recap ends with a compact share card and two actions: download or copy. This is the whole point. Recaps are for sharing, and the share artifact has to stand on its own. The card compresses your year into a few lines: top languages, commit count, PRs, reviews, and a streak. It is small, readable, and looks good on a timeline.
This screen also proves a broader lesson: the recap is not done until it can travel.
Architecture overview
CommitRecap runs on two separate systems: a Next.js client hosted on Vercel and a FastAPI backend running on AWS Lambda.
┌─────────────────────────────────────────────────────────┐
│                       Vercel Edge                        │
│                    Next.js App Router                    │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────────┐  │
│  │ React Query │──│   Zustand   │──│ Page Components │  │
│  │   (fetch)   │  │   (store)   │  │    (render)     │  │
│  └─────────────┘  └─────────────┘  └─────────────────┘  │
└─────────────────────────────────────────────────────────┘
                             │ HTTPS
                             ▼
┌─────────────────────────────────────────────────────────┐
│                        AWS Lambda                        │
│                   FastAPI Application                    │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────────┐  │
│  │   Router    │──│  Services   │──│  GitHub Client  │  │
│  │  /github/*  │  │ (aggregate) │  │  (REST + GQL)   │  │
│  └─────────────┘  └─────────────┘  └─────────────────┘  │
└─────────────────────────────────────────────────────────┘
                             │ GraphQL / REST
                             ▼
┌─────────────────────────────────────────────────────────┐
│                        GitHub API                        │
│    REST (profiles, repos) + GraphQL (contributions)     │
└─────────────────────────────────────────────────────────┘
Why this split works
Vercel for the client handles edge caching, automatic deployments from Git, and global CDN distribution. The Next.js App Router gives me file-based routing and server components where needed.
AWS Lambda for the backend runs FastAPI in a serverless function. Cold starts are minimal because the function stays warm during peak usage. Lambda scales automatically when multiple users hit the app simultaneously, and I only pay for actual compute time.
This separation keeps concerns clean. The client handles rendering and user flow. The backend handles GitHub API calls, rate limiting, and data aggregation.
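From the client's side, that boundary is just an HTTPS call. As a rough sketch, the lib/api.ts wrapper probably looks something like this; the NEXT_PUBLIC_API_URL variable, the function name, and the error handling are assumptions for illustration, while the endpoint path comes from the backend routes described below.

```typescript
// lib/api.ts (illustrative sketch, not the actual implementation)
// NEXT_PUBLIC_API_URL is an assumed env var pointing at the API Gateway endpoint.
const API_BASE = process.env.NEXT_PUBLIC_API_URL ?? "";

export async function getYearSummary(username: string): Promise<unknown> {
  const res = await fetch(
    `${API_BASE}/github/search/year-summary/${encodeURIComponent(username)}`
  );
  if (!res.ok) {
    throw new Error(`Backend responded with ${res.status} for ${username}`);
  }
  return res.json();
}
```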
Server architecture
The FastAPI backend runs on AWS Lambda through Mangum, an adapter that translates Lambda events into ASGI requests.
server/
├── lambda_handler.py              # AWS Lambda entry point (Mangum wrapper)
├── main.py                        # FastAPI app initialization
├── config/
│   ├── env.py                     # Environment variables
│   ├── cors.py                    # CORS configuration
│   └── exceptions.py              # Custom exception handlers
├── api/
│   ├── routers/
│   │   ├── health_router.py
│   │   └── github_search_router.py
│   └── controllers/
│       └── github_search_controller.py
├── telemetry/
│   └── logging.py                 # Structured logging
├── utils/
│   └── pathing.py
└── python-layer/                  # Lambda layer with dependencies
    └── python/
        ├── mangum/                # ASGI adapter for Lambda
        ├── fastapi/
        ├── pydantic/
        ├── requests/
        ├── starlette/
        └── anyio/
Request flow through the server
- Lambda receives the event. AWS triggers the function when a request hits the API Gateway endpoint.
- Mangum translates the event. The lambda_handler.py wraps the FastAPI app with Mangum, converting the Lambda event into a standard ASGI request.
- FastAPI routes to the appropriate handler. The github_search_router.py maps endpoints like /github/search/year-summary/{username} to controller methods.
- Controller handles business logic. The github_search_controller.py calls GitHub's REST and GraphQL APIs, aggregates the data, and computes narratives.
- Response flows back. Mangum converts the FastAPI response into a Lambda-compatible format, and API Gateway returns it to the client.
Why the router-controller pattern
Routers define HTTP endpoints and handle request validation. Controllers contain the business logic. This separation makes testing straightforward: you can unit test controllers without spinning up HTTP servers.
The GitHub search controller is the core of the backend. It handles year summary totals (commits, PRs, reviews, issues), monthly commit counts for the timeline, language breakdown by bytes written, and commit size distribution with narrative generation.
Each method returns a focused response. The API is organized around questions, not data types.
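On the client, those question-shaped responses map onto small interfaces in types/api.ts. The shapes below are assumptions inferred from the stats the recap displays, not the backend's actual schema.

```typescript
// types/api.ts (assumed shapes; field names are illustrative)
export interface YearSummary {
  commits: number;
  pullRequests: number;
  reviews: number;
  issues: number;
}

export interface MonthlyCommits {
  // Feeds the activity timeline and its busiest-day highlight.
  months: { month: string; commits: number }[];
}

export interface CommitSizeDistribution {
  small: number;
  medium: number;
  large: number;
  narrative: string; // e.g. "Mostly small, steady commits with occasional medium pushes"
}
```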
Lambda layer for dependencies
The python-layer/ directory contains pre-packaged dependencies. This layer gets deployed separately from the function code, which speeds up deployments and keeps the function package small.
Key dependencies in the layer: Mangum for Lambda-to-ASGI translation, FastAPI and Starlette for the web framework, Pydantic for request/response validation, Requests for GitHub API calls, and orjson for fast JSON serialization.
Client architecture
The Next.js client follows a clear data flow pattern.
client/src/
├── app/
│   ├── page.tsx                             # Landing page (username input)
│   ├── layout.tsx                           # Root layout with providers
│   ├── globals.css
│   └── recap/
│       └── [username]/
│           ├── page.tsx                     # Main recap orchestrator
│           └── loading.tsx                  # Loading skeleton
├── components/
│   ├── pages/                               # Full-screen recap pages
│   │   ├── welcome-page.tsx
│   │   ├── opening-page.tsx
│   │   ├── activity-timeline-page.tsx
│   │   ├── monthly-journey-page.tsx
│   │   ├── top-languages-page.tsx
│   │   ├── commit-size-distribution-page.tsx
│   │   └── battle-card-page.tsx             # Final share card
│   ├── charts/
│   │   ├── contribution-dots.tsx            # GitHub-style heatmap
│   │   └── activity-bars.tsx                # Monthly bar chart
│   ├── shared/
│   │   ├── animated-number.tsx              # Count-up animations
│   │   ├── typing-text.tsx                  # Typewriter effect
│   │   └── keyboard-hint.tsx                # Navigation hint
│   ├── ui/                                  # Design system primitives
│   │   ├── button.tsx
│   │   ├── card.tsx
│   │   ├── input.tsx
│   │   ├── avatar.tsx
│   │   ├── badge.tsx
│   │   ├── progress.tsx
│   │   ├── chart.tsx
│   │   └── skeleton.tsx
│   ├── layout/
│   │   └── page-container.tsx               # Consistent page wrapper
│   └── providers.tsx                        # React Query + theme providers
├── hooks/
│   ├── use-github-data.ts                   # React Query fetch logic
│   └── use-page-navigation.ts               # Keyboard + swipe navigation
├── stores/
│   └── recap-store.ts                       # Zustand state management
├── lib/
│   ├── api.ts                               # API client for Lambda backend
│   ├── utils.ts                             # Shared utilities
│   ├── ranks.ts                             # Gamification rank logic
│   └── achievements.ts                      # Badge calculations
└── types/
    └── api.ts                               # TypeScript interfaces
Data flow through the client
- User enters a username on the landing page. The form submits and navigates to /recap/[username].
- React Query fetches all data in parallel. The use-github-data.ts hook dispatches multiple requests to the Lambda backend simultaneously: year summary, monthly commits, languages, commit sizes, and heatmap data. A sketch of this hook follows the list.
- Zustand store normalizes the responses. Once data arrives, the recap-store.ts stores it in a normalized format. Pages read from the store, not from individual query results.
- Page components render the recap sequence. The user navigates through welcome-page.tsx, opening-page.tsx, activity-timeline-page.tsx, and so on. Each page is a self-contained screen.
- Navigation is keyboard and swipe enabled. The use-page-navigation.ts hook listens for arrow keys and touch gestures to move between pages.
- The battle card is the final output. The battle-card-page.tsx renders a compact share card with download and copy actions.
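Here is a minimal sketch of that fetching hook, assuming small helpers such as getYearSummary live in lib/api.ts; the query keys and helper names are illustrative, not the real ones.

```typescript
// hooks/use-github-data.ts (sketch; assumes per-endpoint helpers in lib/api.ts)
import { useQueries } from "@tanstack/react-query";
import {
  getYearSummary,
  getMonthlyCommits,
  getLanguages,
  getCommitSizes,
  getHeatmap,
} from "@/lib/api";

export function useGithubData(username: string) {
  // All five requests fire in parallel; the recap renders once every query settles.
  const results = useQueries({
    queries: [
      { queryKey: ["year-summary", username], queryFn: () => getYearSummary(username) },
      { queryKey: ["monthly-commits", username], queryFn: () => getMonthlyCommits(username) },
      { queryKey: ["languages", username], queryFn: () => getLanguages(username) },
      { queryKey: ["commit-sizes", username], queryFn: () => getCommitSizes(username) },
      { queryKey: ["heatmap", username], queryFn: () => getHeatmap(username) },
    ],
  });

  return {
    isLoading: results.some((r) => r.isLoading),
    data: results.map((r) => r.data),
  };
}
```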
Why Zustand over prop drilling
The recap has seven pages. Passing data through props would create a tangled hierarchy. Zustand gives each page direct access to the data it needs without intermediary components.
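A stripped-down recap-store.ts could look like the sketch below; the state shape is an assumption, and the real store holds more slices than this.

```typescript
// stores/recap-store.ts (sketch; the actual store is larger)
import { create } from "zustand";

interface RecapState {
  username: string | null;
  yearSummary: { commits: number; pullRequests: number; reviews: number } | null;
  setRecap: (username: string, yearSummary: RecapState["yearSummary"]) => void;
}

export const useRecapStore = create<RecapState>((set) => ({
  username: null,
  yearSummary: null,
  // Pages read slices directly, e.g. useRecapStore((s) => s.yearSummary), instead of receiving props.
  setRecap: (username, yearSummary) => set({ username, yearSummary }),
}));
```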
React Query handles caching. If a user navigates back to the landing page and enters the same username, the data loads instantly from cache.
The pages directory pattern
Each file in components/pages/ is a full-screen recap page. This pattern keeps the UI organized:
- welcome-page.tsx → Animated intro with the user's avatar
- opening-page.tsx → Total commits, PRs, reviews with animated numbers
- activity-timeline-page.tsx → Monthly bar chart with peak highlight
- monthly-journey-page.tsx → Contribution dots heatmap
- top-languages-page.tsx → Language breakdown by percentage
- commit-size-distribution-page.tsx → Small/medium/large commits with narrative
- battle-card-page.tsx → Final shareable card with download button
Each page has one job. When a page tried to show two metrics, I split it.
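The keyboard half of the navigation described earlier fits in a small hook. This is a sketch of what use-page-navigation.ts might do, with the swipe handling left out and the signature assumed.

```typescript
// hooks/use-page-navigation.ts (sketch; the real hook also handles touch gestures)
import { useEffect } from "react";

export function usePageNavigation(
  pageCount: number,
  page: number,
  setPage: (page: number) => void
) {
  useEffect(() => {
    const onKeyDown = (e: KeyboardEvent) => {
      if (e.key === "ArrowRight" && page < pageCount - 1) setPage(page + 1);
      if (e.key === "ArrowLeft" && page > 0) setPage(page - 1);
    };
    window.addEventListener("keydown", onKeyDown);
    return () => window.removeEventListener("keydown", onKeyDown);
  }, [page, pageCount, setPage]);
}
```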
Request flow: end to end
Here's what happens when someone enters a username:

1. User types "username" and clicks Generate
        │
        ▼
2. Next.js navigates to /recap/username
   loading.tsx shows skeleton UI
        │
        ▼
3. use-github-data.ts dispatches parallel requests
   to Lambda backend via lib/api.ts
        │
        ▼
4. Lambda cold starts if needed (~200ms)
   Mangum translates event to FastAPI request
        │
        ▼
5. github_search_router.py routes to controller
   github_search_controller.py calls GitHub API
        │
        ▼
6. Controller aggregates data, computes narratives
   Returns focused JSON response
        │
        ▼
7. React Query caches responses (5 min TTL)
   recap-store.ts normalizes and stores data
        │
        ▼
8. Page components render from Zustand store
   Navigation between pages is instant (no API calls)
        │
        ▼
9. User reaches battle-card-page.tsx
   Downloads or copies their year-end wrap
Performance decisions that matter
Parallel fetching on the client. All recap data loads at once through React Query's parallel queries. The user sees a loading skeleton, then the full recap appears. No progressive disclosure that feels slow.
5-minute client cache. React Query caches responses so navigating back and forth between recap pages doesn't trigger new Lambda calls. Re-entering the same username loads instantly.
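In React Query terms, that window is most likely a default staleTime on the QueryClient, probably configured in providers.tsx; the exact placement and value below are assumptions.

```typescript
// components/providers.tsx (sketch; assumes the 5-minute cache is a QueryClient default)
import { QueryClient } from "@tanstack/react-query";

export const queryClient = new QueryClient({
  defaultOptions: {
    queries: {
      staleTime: 5 * 60 * 1000, // treat recap data as fresh for five minutes
    },
  },
});
```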
Lambda warm starts. During peak usage, Lambda functions stay warm. Typical response times hit 100β200ms. Cold starts add ~500ms but only happen after periods of inactivity.
Aggregation on the backend. The API returns computed insights, not raw data. The commit size endpoint returns a narrative like "Mostly small, steady commits with occasional medium pushes" instead of raw percentiles. This keeps payloads small and moves computation off the client.
Lambda layer for dependencies. Pre-packaged dependencies in python-layer/ speed up cold starts because Lambda doesn't need to unzip the same libraries repeatedly.
The biggest learning
A recap isn't a report. It's a narrative powered by data.
That framing changes how you design pages, how you order them, and how you choose what to compute. GitHub gives you totals. Your job is to turn those totals into a story worth sharing.
If you want to see your year in code, open commit-recap.vercel.app and run a username. If you want to build your own recap, start with a single question and build one page that answers it cleanly. The rest will follow.