Fifty-Seven: Linden Dollars (12-13-2023)

A big-enough YouTuber quit recently. No need to push any publicity towards him; he's done and gone. He had ~4,250,000 subscribers. His reasoning for giving up was fatigue with being a creator. Eye-evaluation pins him at 30 to 35 years old. How lovely it must be to quit because you're sick and tired of being adored and known, showered with money for doing only what you love. 

This guy must have made a lot of money. Sure, he has myriad talents and enough recognition to parlay visible fame into professional acumen, but he must have made a lot of money. I don't know how much YouTube pays out, but attracting millions of people to the site and regularly creating multi-million-view viral hits has to result in some substantial compensation. Good for him, I guess. Everyone has their great interest, their singular fixation, and plenty of people have a passion that is possible to monetize. It's impossible not to be jealous in the face of someone whose talent was selected by the digital deity to be their salvation.


Substack seems to be the only website that has collectively accepted the end-of-digital-days. I don't understand the celebration in the face of the post-Twitter decentralization of the internet. There will not be a return to the GeoCities Wild West, just an endless torrent of AI slop, bot echoing, and confusing hypertrends pushed by algorithms that are impossible to rein in - per these people's own assertion. Some people see the writing on the wall and jump ship, but substackers think themselves immune and continue to party.

AI will, within 5 years, optimize my job out of existence, or ruin the industry entirely to the point of my work's total tedium. This isn't a great third or fourth (Info-)Industrial Revolution. Smart people do not see the value of AI, and some other "smart" people have whipped themselves into hysteria foreseeing our collective doom. Idiots run the show at all levels, gawking at a new but not novel confusing trinket to command. Someone who does not understand the workings of an LLM or machine learning has no right to guide its implementation, and someone who does understand it does not necessarily understand the sociological implications of its use. Often these two halves of the pair have no knowledge of the other's existence. Tech CEOs obviously know how their tech works... right? Someone does... right?


A big faltering of political culture-warring was the normalization (on one side) of trusting authority. It was a gambit made when trusting authority meant hedging bets on stability, but it can't be taken back now - politics moves in one direction. People in roles of leadership should have the knowledge and ability to wield power and to lead, and it is then, wrongly, assumed that they simply do. Obviously this illogic is not applied uniformly, otherwise Presidents would consistently have majority approval, but for business leaders, doctors, bosses, neighbors, it's applied. No one wants their doctor to be an idiot, and the best way to avoid such is to just operate as though they aren't. 

Plenty of people think Elon Musk is stupid or vapid or unfit to hold wealth, but not Oprah. Plenty of people think Bill Gates is a ********* who eats ****** on **** *******'* *****, but that Donald Trump is a genius-level calculating business leader and diplomat. Plenty of people think their boss is stupid, but their other boss is a good guy. 


The current state of affairs is that people who don't know what's going on, or how the goings-on work, are in charge, and a big flux is occurring outside of their control or understanding. People automatically, reflexively place their trust in these people; they have no choice to do otherwise, and no recourse if power is misused. It's also too mentally taxing to audit the intelligence of every person around you, or who may enter your newsfeed.

Being needlessly skeptical of everyone and everything is equally exhausting, and as early-2010s internet atheists proved, terminally cringe. But when change is already occurring everywhere around you, it's impossible not to become at least a little paranoid. Can you trust your boss to make the right choice when her boss tells her to implement AI to increase efficiency? Can you trust the AI company to produce a useful and controllable product? Can you trust your doctor to not get you hooked on something that might kill you?