Writing Our Own Structure: Tries in Haskell & Rust
mmhaskell.com·8h
🌿Trie Structures
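The linked post builds tries in Haskell and Rust; as a language-neutral sketch of the same structure (illustrative names, not taken from the post), a trie stores one character per edge and marks word-ending nodes:

```python
# Minimal trie sketch: each node maps a character to a child node,
# and a flag marks nodes where a stored word ends.
class Trie:
    def __init__(self):
        self.children = {}    # char -> Trie
        self.is_word = False  # True if a word ends at this node

    def insert(self, word):
        node = self
        for ch in word:
            # setdefault creates the child node on first use
            node = node.children.setdefault(ch, Trie())
        node.is_word = True

    def contains(self, word):
        node = self
        for ch in word:
            node = node.children.get(ch)
            if node is None:
                return False
        return node.is_word

t = Trie()
t.insert("cat")
t.insert("car")
# "cat" and "car" share the "ca" prefix; "ca" itself is not a word.
```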
Re-FRAME the Meeting Summarization SCOPE: Fact-Based Summarization and Personalization via Questions
arxiv.org·12h
🔗Lexical Scoping
X Design Notes: Pattern Matching II
blog.polybdenum.com·17h
🎯Pattern Matching
LLM-JEPA: Large Language Models Meet Joint Embedding Predictive Architectures
arxiviq.substack.com·6h·
Discuss: Substack
🪜Recursive Descent
Replacing clojure-lsp with clj-kondo and Refactor-nREPL
andreyor.st·22h
🔮Clojure
Polymorphism for Beginners
roscidus.com·2d·
Discuss: Hacker News
🎭Polymorphic Variants
Python Tuples: The Ultimate Guide to Immutable Sequences
dev.to·10h·
Discuss: DEV
🧩Persistent Vectors
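The guide above covers Python's immutable sequence type; the key property (a sketch with our own example names, not drawn from the article) is that tuples cannot be mutated, which also makes them hashable and therefore usable as dictionary keys:

```python
# Tuples are immutable sequences: fixed length, fixed contents.
point = (3, 4)
x, y = point              # unpacking works like any sequence

try:
    point[0] = 5          # item assignment raises TypeError
    mutated = True
except TypeError:
    mutated = False

# Immutability makes tuples (of hashable elements) hashable,
# so they can serve as dict keys, unlike lists.
distances = {point: 5.0}
```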
Identity Types
bartoszmilewski.com·3h·
Discuss: Hacker News
🎯Type Theory
haskell/mtl
github.com·4d
🚂Error Monads
My experience with AI as a front end developer
frontendundefined.com·8h·
Discuss: Hacker News
🎮Language Ergonomics
第四: A Japanese and Forth inspired postfix language
gist.github.com·12h·
🔗Concatenative Languages
GSoC 2025: Improving Core Clang-Doc Functionality
blog.llvm.org·16h
📋Tablegen
Issue 490
haskellweekly.news·4d
⚡Functional Programming
HelixDB - An open-source graph-vector database built in Rust
reddit.com·18h·
Discuss: r/opensource
🌳Persistent Data
Comparative Analysis of Tokenization Algorithms for Low-Resource Language Dzongkha
arxiv.org·12h
⚡Tokenizer Benchmarks
Effect Systems vs. Print Debugging: A Pragmatic Solution
blog.flix.dev·1d·
Discuss: Hacker News
⚡Algebraic Effects
Transforming Recursion into Iteration for LLVM Loop Optimizations
dspace.mit.edu·22h·
Discuss: Hacker News
📚Stack Allocation
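The MIT work above concerns rewriting recursion into loops so LLVM's loop optimizations can apply. As a language-neutral sketch of the core idea (in Python, with illustrative names; the actual work operates on LLVM IR), the transformation replaces implicit call-stack frames with an explicit worklist:

```python
# Recursive tree sum: each call pushes an implicit stack frame.
# A tree node is (left, value, right); leaves are None.
def sum_tree_recursive(node):
    if node is None:
        return 0
    left, value, right = node
    return sum_tree_recursive(left) + value + sum_tree_recursive(right)

# Same computation as iteration: the call stack becomes an
# explicit list of pending subtrees, turning recursion into a loop.
def sum_tree_iterative(node):
    total, stack = 0, [node]
    while stack:
        n = stack.pop()
        if n is None:
            continue
        left, value, right = n
        total += value
        stack.append(left)
        stack.append(right)
    return total

tree = ((None, 1, None), 2, (None, 3, None))
```

Both functions compute the same sum; the iterative form exposes a loop that optimizers can then analyze and transform.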
From "Decentralized" to "Unified": SUPCON Uses SeaTunnel to Build an Efficient Data Collection Frame
hackernoon.com·4h
📝Rope Editors
Typename syntax and resolution in ClojureCLR
dmiller.github.io·1d
🔮Type Inference Visualization
Token Models as Statistical Simulations: A Different Take
medium.com·17h·
Discuss: Hacker News
๐Ÿ”Tokenizers