- Get a personalized feed of content related to your interests.
- Subscribe to specific feeds or scour all of them.
- Works with RSS, Atom, Hacker News, Reddit, Bluesky, Substack, Medium, and more — including blogs that don't have RSS feeds.
- Identify hidden gems from noisy sources like Hacker News Newest.
- Get feed recommendations based on how well their content matches your interests.
- Import feeds from other readers via OPML.
- Extremely fast. Your personalized feed should load in under 100 milliseconds.
Scour noisy feeds for content related to your interests
We pull content from a wide range of sources and use an AI model to find hidden gems that match your interests.
Are user feeds exposed as RSS/Atom feeds?
Yes! You can append `/rss.xml` or `/atom.xml` to the end of any user feed URL to get an RSS or Atom feed of that user's posts.
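As a minimal sketch, building those URLs is just string concatenation. The base URL below is hypothetical, for illustration only:

```rust
// Build an RSS or Atom feed URL from a user feed URL by appending
// the format suffix. Handles a trailing slash on the base URL.
fn feed_url(user_feed_url: &str, suffix: &str) -> String {
    format!("{}/{}", user_feed_url.trim_end_matches('/'), suffix)
}

fn main() {
    // Hypothetical user feed URL, not a real address.
    let base = "https://example.com/users/alice";
    println!("{}", feed_url(base, "rss.xml")); // → https://example.com/users/alice/rss.xml
    println!("{}", feed_url(base, "atom.xml")); // → https://example.com/users/alice/atom.xml
}
```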
Is Scour free?
It is! I'm also planning to add paid features in the future.
Who is building this?
I'm Evan Schwartz, a software engineer and inventor.
Why are you building this?
I like keeping up with tech and global news. However, many great content feeds are too noisy to follow in full. Aggregators like Hacker News, Lobsters, and Reddit can be nice, but they often have plenty of posts that I'm not interested in. Social aggregators in particular can also miss great content that doesn't happen to go viral (ever tried submitting something to Hacker News, only for it to get buried before it escapes the Newest page?). Scour watches all of these feeds and surfaces only the content that matches your interests.
What tech stack are you using?
Scour is written in Rust. It is built with Maud for HTML templating, the Axum web framework, SQLx and SQLite for persistence, and Tokio for async I/O. For semantic search, Scour uses the mxbai-embed-large-v1 embedding model with binary vector embeddings. It is hosted on Fly.io.
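The binary-embedding search mentioned above can be sketched as follows. This is an illustrative assumption about the general technique (sign-based quantization plus Hamming-distance ranking), not Scour's actual implementation:

```rust
// Sketch of binary vector embeddings for semantic search:
// quantize each float dimension to one bit by its sign, then
// compare vectors by Hamming distance (lower = more similar).
fn binarize(embedding: &[f32]) -> Vec<u64> {
    // Pack 64 dimensions into each u64 word, one bit per dimension.
    embedding
        .chunks(64)
        .map(|chunk| {
            chunk.iter().enumerate().fold(0u64, |word, (i, &v)| {
                if v > 0.0 { word | (1 << i) } else { word }
            })
        })
        .collect()
}

fn hamming_distance(a: &[u64], b: &[u64]) -> u32 {
    // XOR finds differing bits; count_ones tallies them per word.
    a.iter().zip(b).map(|(x, y)| (x ^ y).count_ones()).sum()
}

fn main() {
    // Toy 4-dimensional embeddings for illustration.
    let query = binarize(&[0.3, -0.2, 0.9, -0.7]);
    let doc = binarize(&[0.1, -0.5, 0.4, 0.6]);
    println!("distance = {}", hamming_distance(&query, &doc));
}
```

Binary quantization shrinks each embedding to 1 bit per dimension, and Hamming distance reduces to XOR plus popcount, which is one reason a personalized feed can be ranked in well under 100 milliseconds.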