September 11, 2025: Issue #98
lawrenceweschler.substack.com·10h·
Discuss: Substack
🧲Magnetic Philosophy
Frankenstein Variant of the ToneShell Backdoor Targeting Myanmar
intezer.com·1d
🔓Hacking
What are the AI privacy concerns?
proton.me·12h
🔒Privacy Archives
Free sudoku game with evil and extreme killer difficulty levels
minisudoku.online·2d·
Discuss: Hacker News
🔲Cellular Automata
The 4p Developer: The Missing Layer in Platform Thinking
davidpoll.com·1d·
Discuss: Hacker News
🔌Interface Evolution
Why retention is so hard for new tech products
andrewchen.substack.com·12h·
Discuss: Substack
💾Persistence Strategies
How thousands of 'overworked, underpaid' humans train Google's AI to seem smart
theguardian.com·12h·
Discuss: Hacker News
🤖AI Curation
Under the Hood of Fuzzy Search: Building a Search Engine 15 times fuzzier than Lucene
andrewjsaid.com·5d
🎯Automata theory
Large Language Mode(ration)
ashley.rolfmore.com·14h·
Discuss: Hacker News
๐Ÿ—บ๏ธCompetency Maps
Blend Fan on blend.ad: Empowering DeFi and Fan Token Innovation with AI in 2025
dev.to·1d·
Discuss: DEV
๐ŸŽ™๏ธWhisper
Unfit
nplusonemag.com·1d
📡Information theory
How We Built Our Model-Agnostic Agent for Log Analysis
blog.runreveal.com·1d·
Discuss: Hacker News
🎯Threat Hunting
Real-Time Genome Sequencing Error Correction via Dynamic Bayesian Graph Refinement
dev.to·2d·
Discuss: DEV
🧬Bitstream Evolution
Adaptive Acoustic Metamaterial Composites for Broadband Noise Reduction via Topology Optimization
dev.to·1d·
Discuss: DEV
โš™๏ธTape Engineering
Day 92: Authentication, Insomnia, and Life Decisions
dev.to·10h·
Discuss: DEV
💾Persistence Strategies
Recently inherited this rack
reddit.com·21h·
Discuss: r/homelab
๐Ÿ HomeLab
Automated Radiotracer Distribution Quantification via Multi-Modal Graph Analysis in WBA
dev.to·2d·
Discuss: DEV
๐Ÿ”Vector Forensics
Reconstruction Alignment Improves Unified Multimodal Models
arxiv.org·1d
๐Ÿ“Projective Geometry
Flexible inference of learning rules from de novo learning data using neural networks
arxiv.org·3d
🤖Grammar Induction
Quantization Explained: A Concise Guide for LLMs
dev.to·1d·
Discuss: DEV
📊Quantization