I Asked Claude to Do Archaeology on My Own Codebase
Claude dug through git history and Coolify deployment data to tell me the story of a 10-day project. The receipts were illuminating.
DevTrends is a tech intelligence pipeline I built over the last 10 days: 100+ data sources, proxy infrastructure for scraping, and an MCP endpoint so Claude can query it directly.
I wanted to see if Claude could reconstruct the journey from git history and Coolify deployment data alone.
It could.
Prompt
Ready for some archeology? You have access to my coolify mcp, and you have full access to this repo - READ ONLY - I want you to tell me the story of this application - when was it created? what happened to it? when did it hit coolify? what changes has it been through? Let's go with some bullet points to start with, give me the skinny!
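The archaeology itself is nothing exotic. Under the hood it amounts to read-only git queries like these (a minimal sketch; the exact commands Claude ran aren't logged, so treat this as an illustration, with a throwaway repo standing in for the real one):

```shell
# Build a throwaway repo so the queries below have something to read
# (in a real session these run against the actual project repo).
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
echo "DevTrends" > README.md
git add README.md
git -c user.name=dev -c user.email=dev@example.com commit -q -m "initial Laravel app"
echo "fetchers" >> README.md
git -c user.name=dev -c user.email=dev@example.com commit -qam "what have you even been doing?"

# Oldest-first timeline: when the project started and what happened next
git log --reverse --date=short --format='%ad %s'

# Commit counts per author (separates human commits from Dependabot)
git shortlog -s -n --all

# Total lines added/removed across the whole history
git log --numstat --format= | awk '{a+=$1; d+=$2} END {print a" added, "d" removed"}'
```

Give a model these primitives plus deployment metadata and the narrative below falls out of pattern-matching over the results.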
The Story
Day 1: The Firehose Era
31 commits. Fresh Laravel app at 4:26 AM. Built fetchers for HackerNews, Reddit, GitHub, security feeds. Hit Reddit's IP blocks immediately. Pivoted to RSS. That wasn't enough — built residential proxy rotation with Webshare.
The immortal commit message of the day: "what have you even been doing?" (it was fixing package-lock.json)
Day 2: The Culling
Killed the AI digest feature. It lived less than 24 hours. Pivoted to long-term trend collection instead.
Day 3: The MCP Awakening
Built an MCP server so Claude could explore the database directly. Spent 4 commits fighting MCP bugs — ->optional() calls, Request::get() vs input(), return types.
Day 4: DATALAKATION
18 commits. 34 new sources across all categories. Built a blog scraper for sites without RSS feeds. Shipped a REST API. HuggingFace papers replaced Papers With Code after it broke.
Day 6: The Third Dashboard
Dashboard rewritten again as a "modular intelligence hub". Third major iteration. Source seeder expanded to 100+ sources across 6 domains.
Day 10: Stabilisation
Public MCP endpoint shipped with rate limiting. Now Claude can pull trends data in any conversation.
The Numbers
- 66 commits (plus 15 from Dependabot)
- 57,700 lines added
- 23 data fetchers built
- 100+ data sources
- 3 Reddit fetcher iterations (JSON API → RSS → residential proxy)
- 3 dashboard rewrites
- 1 feature killed at age 24 hours
The Point
The interesting part isn't the app — it's that Claude told me my own story better than I remembered it.
I'd forgotten about the Reddit IP blocks. Forgot I'd killed the AI digest so quickly. Forgot there were three dashboard rewrites. The git log remembers everything; I just needed something to make sense of it.
This is what AI-native development actually looks like. Not Claude writing code for you — Claude helping you understand what you've built, finding patterns in your own work, turning 81 commits into a coherent narrative.
The app exists. It's running. But the archaeology session taught me more about my own process than building the thing did.
Check it out at trends.stumason.dev