🧵 1/ AI may break the internet because it reduces the effort to produce good content, and therefore EFFORT no longer acts as a reliable signal of quality. There's an interesting economics backstory below, for anyone who is interested.
wired.com/story/ai-slop-… **
2/ First, the Internet is an information "Market for Lemons."
When used car lots were new in the '50s and '60s, there was no way to tell whether you were buying a bad car (a lemon) or a good one (a peach). There was no Carfax or even standardized VINs.
3/ Because buyers didn't know a good car from a bad one, they would only pay the *average* price between them.
Since they paid the "same price" for a peach or a lemon, buyers underpaid for good cars and overpaid for bad.
Car dealers might not have started out bad, but... **
4/ Over time, since dealers couldn't get the true value of a good car and were overpaid for a bad one, they just made more money selling lemons.
This is why bad information persists - it's much easier to produce than good information... **
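(A rough sketch of the arithmetic, with made-up numbers: say half the cars are peaches worth $3,000 and half are lemons worth $1,000, and buyers can't tell which is which.)

\[ p_{\text{offer}} = \tfrac{1}{2}(3000) + \tfrac{1}{2}(1000) = 2000 \]

At $2,000, a peach owner loses $1,000 by selling and a lemon owner gains $1,000, so over time only lemons show up on the lot. That is Akerlof's "Market for Lemons" unraveling in one line.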
5/ And low quality has a Gresham's law effect. Bad money drives out good. Bad info can price good info out of business. People get tired of fact-checking and engaging and just quit.
But there is still good, real content out there despite these forces... **
6/ And that is in large part because the very effort that someone took to produce good content - a long-form essay, a video, a photograph, etc. - is PROOF of its quality.
Which brings us to Spence's work on Signaling. sfu.ca/~allen/Spence.… **
7/ Spence looked at the job market, but when a user interacts with online content, they are making an investment of their attention. They only have so much of it, so they want to judge its quality and move on... **
8/ If a market has a quality gradient, people can tell good from bad by looking for a signal that only a good player can send but a bad player can't. For employers, education worked well...
For info, a good argument has citations, proper grammar, etc., and a bad one might not. **
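(For the formally inclined, a rough sketch of Spence's separating condition; the cost labels are mine, for illustration: a signal works when it is cheap enough for the good type and too expensive for the bad type.)

\[ c_{\text{good}} < B < c_{\text{bad}} \]

where \(B\) is the payoff the audience gives to content it believes is good, and each \(c\) is that type's cost of producing the signal (citations, original photos, long-form effort). As long as the inequality holds, sending the signal credibly separates peaches from lemons.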
9/ With AI, the effort to produce "lemon" content that presents as a "peach" is much lower than in the past. AI can slop out articles or videos or images that look good but aren't.
So we are losing this Spencian "signal" of quality. **
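(Continuing the illustrative inequality above: generative AI collapses the bad type's signaling cost.)

\[ c_{\text{bad}} \;\longrightarrow\; c'_{\text{bad}} < B \]

Once the bad type can send the signal for less than the payoff it earns, the separating equilibrium breaks down into a pooling one: the signal no longer distinguishes anybody, and the audience falls back to paying the average, which is the lemons problem all over again.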
10/ What may happen, then, is that because it's harder for good content to signal its quality, its value gets dragged down by lemon content.
The cost of producing good content may exceed the value anyone expects from it, so people stop producing it.
And bad content is "overpaid." **
11/ Because of this dynamic, I think we'll see an absolute flood of garbage content. We have already seen people in developing countries arbitraging political discontent in rich countries with fake Twitter personas. This is going to get worse. **
12/ So a big question will be how purveyors of quality content figure out how to signal they are actually good so that the market can reward them for it.
Side note: the Blade Runner opening credits look like they were written by GPT - the em-dashes AND the dramatic dichotomy. **