What Least Squares Assumes About Your Data
sebmestre.blogspot.com

In my college linear algebra course I learned about least squares regression, but the motivation never went much beyond "we square the error because it makes the error non-negative, heavily penalizes outliers, and is easy to compute."

Recently, I learned about maximum likelihood estimation, where we pick the model parameters that maximize the likelihood of the observed data. For example, suppose we have a random variable

Y = a X + b + E
where
X ~ U(0, 1)         (X uniformly sampled from [0, 1])
E ~ N(0, s^2)       (error sampled from a Gaussian distribution)

Given a set of samples (X1, Y1), (X2, Y2), ..., (Xn, Yn), and omitting constant factors for clarity, we can find the parameters (a, b) that maximize the likelihood via:

argmax[a,b] of product[i=1..n] exp( -(Yi - (a Xi + b))^2 / (2 s^2) )

Taking the log, which preserves the argmax, turns the product into a sum:

argmax[a,b] of sum[i=1..n] -(Yi - (a Xi + b))^2 / (2 s^2)

and flipping the sign and dropping the constant factor 1/(2 s^2) gives

argmin[a,b] of sum[i=1..n] (Yi - (a Xi + b))^2

which is exactly least squares. In other words, least squares is maximum likelihood estimation under the assumption that the errors are independent Gaussians with constant variance.
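As a quick numerical sanity check (a sketch of my own, not from the post; the model parameters and learning rate below are arbitrary choices), we can fit (a, b) two ways on simulated data from the model above: once by ordinary least squares, and once by minimizing the Gaussian negative log-likelihood with plain gradient descent. The two fits should agree:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the model from the post: Y = a*X + b + E, with E ~ N(0, s^2)
a_true, b_true, s = 2.0, 1.0, 0.5
n = 1000
X = rng.uniform(0.0, 1.0, n)
Y = a_true * X + b_true + rng.normal(0.0, s, n)

# Fit 1: ordinary least squares on the design matrix [X, 1]
A = np.column_stack([X, np.ones(n)])
a_ls, b_ls = np.linalg.lstsq(A, Y, rcond=None)[0]

# Fit 2: minimize the Gaussian negative log-likelihood, which (up to
# constants) is sum of squared residuals / (2 s^2), by gradient descent.
a_ml, b_ml = 0.0, 0.0
lr = s**2 / (2 * n)  # conservative step size; any small value works
for _ in range(5000):
    r = Y - (a_ml * X + b_ml)          # residuals
    grad_a = -np.sum(r * X) / s**2     # d(NLL)/da
    grad_b = -np.sum(r) / s**2         # d(NLL)/db
    a_ml -= lr * grad_a
    b_ml -= lr * grad_b

print(a_ls, b_ls)  # least-squares fit
print(a_ml, b_ml)  # maximum-likelihood fit; should match closely
```

Note that s cancels out of the argmin, so the maximum likelihood slope and intercept don't depend on knowing the noise level; it only rescales the gradients here.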
