⚗️ Knowledge Distillation
model compression, distillation, teacher-student, efficient ML
Scoured 7195 posts in 18.3 ms
Pre-trained LLMs Meet Sequential Recommenders: Efficient User-Centric Knowledge Distillation
🎛️ Feed Filtering · arxiv.org · 6d

Three Cobblers, One Zhuge Liang: Making Cheaper Models Work Together
🪄 Prompt Engineering · markhuang.ai · 21h · Hacker News

House panels probe Airbnb, Anysphere over use of Chinese AI models
🇨🇳 Chinese AI · nextgov.com · 1d · Hacker News

A chemistry lab that runs itself to find the perfect reaction
🔬 AI Labs · nature.com · 3d · Hacker News

Improving Diversity in Black-box Few-shot Knowledge Distillation
📚 RAG · arxiv.org · 1d

U.S. Orders Global Diplomatic Warning on Chinese ‘Distillation’…
🇨🇳 China Tech Policy · underlines.news · 4d · Hacker News

PAINT: Partial-Solution Adaptive Interpolated Training for Self-Distilled Reasoners
⚙️ MLOps · arxiv.org · 17h

Turning the TIDE: Cross-Architecture Distillation for Diffusion Large Language Models
🤖 LLM · arxiv.org · 17h

On the Memorization of Consistency Distillation for Diffusion Models
🤖 LLM · arxiv.org · 2d

GaitKD: A Universal Decoupled Distillation Framework for Efficient Gait Recognition
✨ Gemini · arxiv.org · 17h

The Surprising Effectiveness of Canonical Knowledge Distillation for Semantic Segmentation
✨ Gemini · arxiv.org · 1d

Edge AI for Automotive Vulnerable Road User Safety: Deployable Detection via Knowledge Distillation
⚡ Edge AI · arxiv.org · 17h

S-SONDO: Self-Supervised Knowledge Distillation for General Audio Foundation Models
✨ Gemini · arxiv.org · 1d

On-the-fly LTLf Synthesis under Partial Observability
🪄 Prompt Engineering · arxiv.org · 17h

Knowledge Distillation Must Account for What It Loses
🔢 BitNet · arxiv.org · 1d

Efficient Diffusion Distillation via Embedding Loss
🔢 BitNet · arxiv.org · 3d

Select to Think: Unlocking SLM Potential with Local Sufficiency
✨ LLMs · arxiv.org · 17h

Diverse Image Priors for Black-box Data-free Knowledge Distillation
✨ Gemini · arxiv.org · 1d

AlphaJet: Automated Conceptual Aircraft Synthesis via Disentangled Generative Priors and Topology-Preserving Evolutionary Search
🔍 AI Interpretability · arxiv.org · 17h

Aligning Dense Retrievers with LLM Utility via Distillation
📚 RAG · arxiv.org · 3d