Fewer versus Less
en.wikipedia.org·1h·
Discuss: Hacker News
🔗Concatenative Theory
Protovalidate Is Now v1.0
buf.build·6h·
Discuss: Hacker News
✅Configuration Validation
Affinda launches agentic AI platform for any document workflow
channellife.com.au·13h
🔄Incremental Lexing
Speedier elisp-refs-* with Dumb Grep
feyor.sh·1d
lisp
Charles Sanders Peirce, George Spencer Brown, and Me • 20
inquiryintoinquiry.com·4h
🔗Category Theory
Baking with Rails at scale: recipes in Ruby, cookware from Go, C, and Rust
evilmartians.com·1d
💬Smalltalk VMs
How Space Debris Cleanup Could Become the Next Trillion-Dollar Industry
hackernoon.com·12h
Zipper Structures
🧠 What is a Sequence in Programming?
dev.to·2d·
Discuss: DEV
🪢Rope Data Structures
From Legal Documents to Knowledge Graphs
neo4j.com·3d·
Discuss: Hacker News
📈Earley Parsing
Primed for Performance: Turbocharging Transformers for Time Series Analysis by Arvind Sundararajan
dev.to·5h·
Discuss: DEV
🌪️V8 TurboFan
LLM in the Middle: A Systematic Review of Threats and Mitigations to Real-World LLM-based Systems
arxiv.org·17h
ML Language
Your Unit Tests Suck
medium.com·4h·
Discuss: Hacker News
🧪Compiler Testing
Balance between refactoring and inheritance in your code
github.com·1d·
Discuss: Hacker News
🧪Compiler Testing
Top Dependency Scanners: A Comprehensive Guide
dev.to·6h·
Discuss: DEV
📦Dependency Analysis
I built an LLM from Scratch in Rust (Just ndarray and rand)
github.com·2d·
🌱Minimal ML
Do Large Language Models Favor Recent Content? A Study on Recency Bias in LLM-Based Reranking
arxiv.org·17h
🔤Language Tokenizers
Building a Unified Intent Recognition Engine
towardsdatascience.com·2h
🧠Semantic Parsing
Building a Legal Document Intelligence Platform with BigQuery AI: 99% Efficiency Implementation Guide
dev.to·12h·
Discuss: DEV
🔄Incremental Lexing
LLM's Functions, Use-cases & Architecture: Introduction
dev.to·1d·
Discuss: DEV
📊LR Parsing
Decoupling Search and Learning in Neural Net Training
arxiv.org·17h
🪜Recursive Descent