Metadata
- Tags: AI cultura fav tecnología
> [!summary] Erik Hoel argues that modern culture has become overfitted, losing creativity and producing repetitive, in-distribution outputs. This overfitting comes from mass production, algorithmic feeds, and replacing conscious human judgment with efficiency-driven systems. AI accelerates these feedback loops, risking further mode and model collapse in culture.
Highlights
id973662052
I think overfitting is precisely the thing to be focused on here. While Kriss mentions overfitting when it comes to AI writing, I’ve thought for a long while now that AI is merely accelerating an overfitting process that started when culture began to be mass-produced in more efficient ways, from the late 20th through the 21st century.
Cultural stagnation. A possible post: cross this observation with the data on declining risk-taking behaviors and, eventually, with the viral spread of fake Tourette's on TikTok. The smartphone and the internet as flatteners of the probability distribution.
id973662620
Basically, during the day, your mammalian brain can’t stop learning. Your brain is just too plastic. But since you do repetitive boring daily stuff (hey, me too!), your brain starts to overfit to that stuff, as it can’t stop learning.
Enter dreams. Dreams shake your brain away from its overfitted state and allow you to generalize once more. And they’re needed because brains can’t stop learning.
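The overfitted-brain idea maps onto a standard ML toy problem. A minimal sketch (my own illustration, not Hoel's; the polynomial degree, noise levels, and "dream" jitter are arbitrary choices): a high-capacity model memorizes a few repetitive samples, and noisy replay of those same experiences, a crude stand-in for dreaming, trades memorization for generalization.

```python
import numpy as np

rng = np.random.default_rng(0)

# The "daily routine": 10 noisy samples of an underlying sine wave.
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.1, 10)

# Held-out reality the brain should generalize to.
x_test = np.linspace(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test)

def mse(coeffs, x, y):
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# A degree-9 polynomial has enough capacity to memorize all 10 points:
# near-zero training error, wild behavior between the points.
overfit = np.polyfit(x_train, y_train, deg=9)

# "Dreams": replay the same experiences many times with injected noise,
# which acts as a regularizer against memorization.
x_aug = np.concatenate([x_train + rng.normal(0, 0.05, 10) for _ in range(20)])
y_aug = np.concatenate([y_train] * 20)
dreamed = np.polyfit(x_aug, y_aug, deg=9)

print("memorizer  train/test:", mse(overfit, x_train, y_train), mse(overfit, x_test, y_test))
print("after dreams train/test:", mse(dreamed, x_train, y_train), mse(dreamed, x_test, y_test))
```

The memorizer's training error is essentially zero while its test error blows up; the noisy-replay fit gives up a little training accuracy in exchange for much better generalization.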
id973663863
I don’t think “Instagram face” is just an averaging of faces. It’s more like you’ve made a copy of a copy of a copy and arrived at something overfitted to people’s concept of beauty.
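The copy-of-a-copy dynamic is easy to simulate. A toy sketch (my own, under assumed parameters; "faces" are just random feature vectors): if each generation of copies is nudged toward the population's current ideal, variance collapses geometrically and everything converges on one face.

```python
import numpy as np

rng = np.random.default_rng(1)

# 500 "faces", each a 16-dimensional feature vector from a diverse population.
faces = rng.normal(0, 1, size=(500, 16))

def copy_of_a_copy(faces, pull=0.3):
    # Each generation, every face is re-created as a copy nudged toward
    # the population's current concept of the ideal (the mean face).
    ideal = faces.mean(axis=0)
    return (1 - pull) * faces + pull * ideal

# Track per-feature variance across 20 generations of copying.
variances = [float(faces.var(axis=0).mean())]
for _ in range(20):
    faces = copy_of_a_copy(faces)
    variances.append(float(faces.var(axis=0).mean()))

print(variances[0], variances[-1])
```

Each copying step multiplies every deviation from the mean by `1 - pull`, so variance shrinks by a constant factor per generation: the overfitted endpoint is not an average of the originals but the fixed point of repeated copying.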
id973666566
An example of cultural mode collapse might be superhero franchises, driven by a discriminatory algorithm (the entire system of Big Budget franchise production) overfitting to financial and box office signals, leading to dimensionally-reduced outputs. And so on. Basically, overfitting is a common root cause of a lot of learning and generative problems, and increased efficiency can easily lead to hyper-discriminatory capabilities that, in turn, lead to overfitting.
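The franchise example can be sketched as a selection loop (again my own illustration, with a deliberately crude one-number "box office" signal that is not from the essay): greenlight only what scores highest on a single financial metric, imitate the survivors, repeat, and the creative dimension the metric ignores is the one that collapses.

```python
import numpy as np

rng = np.random.default_rng(2)

# Each "film" is a 2-dimensional creative vector; the discriminator
# (the greenlighting system) rewards only the first dimension.
def box_office(films):
    return films[:, 0]

films = rng.normal(0, 1, size=(200, 2))
spread = [float(films.std(axis=0).min())]

for _ in range(15):
    # Greenlight only the top 25% by the single financial signal...
    survivors = films[np.argsort(box_office(films))[-50:]]
    # ...then produce next season's slate by imitating the survivors.
    mean, std = survivors.mean(axis=0), survivors.std(axis=0)
    films = rng.normal(mean, std + 1e-9, size=(200, 2))
    spread.append(float(films.std(axis=0).min()))

print("initial spread:", spread[0], "after 15 generations:", spread[-1])
```

Truncation selection cuts the selected dimension's variance by a roughly constant factor each generation, so after a handful of cycles the slate is dimensionally reduced: every film sits at nearly the same point on the axis the discriminator measures.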
id973666805
Overall, I think the switch from an editorial room with conscious human oversight to algorithmic feeds (which plenty of others pinpoint as a possible cause for cultural stagnation) likely was a major factor in the 21st century becoming overfitted. And also, again, the efficiency of financing, capital markets (and now prediction markets), and so on, all conspire toward this.
This is why it's important to keep up the practice of thinking in writing (and in the open).
id973673955
If I picture the last three hundred years as a montage it blinks by on fast-forward: first individual artisans sitting in their houses, their deft fingers flowing, and then an assembly line with many hands, hands young and old and missing fingers, and then later only adult intact hands as the machines get larger, safer, more efficient, with more blinking buttons and lights, and then the machines themselves join the line, at first primitive in their movements, but still the number of hands decreases further, until eventually there are no more hands and it is just a whirring robotic factory of appendages and shapes; and yet even here, if zoomed out, there is still a spark of human consciousness lingering as a bright bulb in the dark, for the office of the overseer is the only room kept lit. Then, one day, there’s no overseer at all. It all takes place in the dark. And the entire thing proceeds like Leibniz’s mill, without mind in sight.
Alienation, applied to outsourcing our thinking to machines (LLMs).
id973675449
An inability to create novelty is a sign of an inability to generalize to new situations. That seems potentially very dangerous to me.
id973675815
In the cognitive sciences, David Marr said that information-processing systems had to be understood at the computational, the algorithmic, and the physical levels. It's all describing the same thing, in the end, but you're explaining it at one level or another.
Levels of analysis
id973678624
People act like cultural fragmentation and walled gardens are bad, but if global culture is stagnant, then we need to be erecting our own walled gardens as much as possible (and this is mentioned by plenty of others, including David Marx and Noah Smith). We need to be encircling ourselves, so that we’re like some small barely-connected component of the brain, within which can bubble up some truly creative dream.
How could I implement this in my own life? Beyond cultivating the digital garden.