The Future Is Weird

After much reading and discussion of post-TAI (transformative AI) futures from those leading the AGI labs and startups of SF, there are a few consistent disagreements I find myself having.

While I generally agree with narrow-yet-strong capability predictions, for example that AI will enable the creation of new software, pharmaceuticals, and companies at much greater speeds, I consistently disagree with the social and cultural predictions I see from my peers.

Humanity is often treated as a constant in such predictions, with technology as the mutating variable. This leads to wildly miscalibrated predictions about the future.

This post muses on this topic a bit. It contains four sections:

  • I. Culture is downstream of technology
  • II. Time spent consuming digital content (technology) is increasing
  • III. Successful digital content decoheres from reality
  • IV. As a result, people and culture will decohere from reality

I. Culture is downstream of technology

When discussing long-term futures (post-AGI/ASI) it’s common for technologies like genetic engineering and neural interfaces to be brought up, as most serious thinkers realize it is simply not possible for humanity to persist as-is. Competitive forces will destroy those who are not augmented with technology. While this may sound stark, it has been the status quo for thousands of years – the primary difference this time is it will happen more quickly. If I refused to use the Internet, cell phones, or even basic tools, our present society would severely punish me both economically and socially. I’d have trouble holding a job, maintaining friendships, or even finding food and shelter.

I don’t intend to opine about what will happen with BCI or gametogenesis or embryo selection or anything of the sort here; rather my point is about the shorter term: the things that will happen even if the hardware humanity runs on remains a constant.

Phrased as succinctly as possible: culture is downstream of technology.

Culture is an emergent phenomenon induced by the personalities of people; if personalities are downstream of technology, then it follows that culture will be as well. When thinking about uses of future technology one should not ask “what would the people of today do if given this?” but rather “what will the people of the future, after everything between now and then has modified them, do if given this?”. When electricity was invented no one was forecasting how much of it would be spent mining bitcoin, powering ad recommendation systems, or training large language models, yet these are currently among the most transformative use cases society is employing.

We can go a step further than this too, because future technology is built by future people, not by present people. The things people will want to build later this century are likely not things we think we want now, because those people will have different values and cultures (even if the hardware remains constant, which I also believe won’t be the case).

Societies used to consist of mere hundreds but now consist of millions in part due to the agricultural revolution, the effects of which are hard to overstate, allowing for hierarchical societies, specialized labor, and organized religion. The printing press caused book prices to fall by 98% within decades, helping to consolidate language and improve literacy. The effects of simple single inventions like the steam engine, the mechanical clock, or birth control have all caused massive first-order effects and even more massive second-order effects.

All of the cultural changes which seem most important have, when properly examined, been caused by technology. Some obvious examples include city-states (influenced by agriculture), the abolition of slavery (influenced by industrialization, transportation, communication), extension of the franchise to both sexes (influenced by birth control, household labor-saving devices from washing machines to vacuum cleaners to refrigerators and more), universal education (influenced by mass printing), and mutually-assured destruction (requires nuclear weapons).

While it’s true that many basic human values have remained constant, the methods we employ to satisfy them have drastically changed as technology has modified our environments and our cultures. When I first learned about the Internet while young I imagined how amazing it would be to speak with literally anyone anywhere in the world, and how that could enable optimized match-making of every type (friendships, employment, relationships, etc). I’m happy to now conclude that it is, in fact, very amazing! Yet I didn’t predict that loneliness would drastically increase among younger and more online groups. If I had done a much better job forecasting incentive structures perhaps I could have made such a prediction, but that kind of forecasting is unintuitive, intellectually unpopular, and further confounded by how unpredictable many of reality’s most important events and inventions end up being (as a side note, I still find it frustrating that we lost proper Internet dating with OKC and that it has not been brought back even a decade later. The market is a cruel force!).

If you get anything out of this section I’d like it to be that projecting your current desires onto future-humanity’s-desires may not be as straightforward as you think. I often hear people tell me they can’t wait for humanity to explore the stars, but the humanity of 100 years from now may have very different goals.

For this reason I find most “post-AGI futures” I read about woefully incoherent, with the most common mistake being the author projecting their own values onto the future rather than attempting to simulate that-which-decides-the-values-of-future-people and making projections from the results. Although there are many points on which I disagree with the standard methodologies of thought here, for the purpose of this post I’m only going to focus on a sub-point: the lived experiences of future humans will have almost nothing to do with base reality.

II. Time spent consuming technology is increasing

Source: Richard Ngo

I’d like to add to the above take: reality is already on its way out (and in some cases has been fully decimated) for millions of people.

Sometimes I hear someone in SF say something like “What if people spend hours a day talking to AIs rather than humans? Isn’t that concerning, like they’d be living in a fantasy world?” (for those of you reading from Home, yes, people are actually this out of touch here). I love my friends here, but most of them have a six-figure job they love and more real-life friends than they know what to do with. It’s difficult for them to know what life is like for the average American zoomer as a result.

Anyway, time for some charts.

First: Roblox has more monthly active users than the entire population of the United States

Source: Matthew Ball

Second: Over 7,000,000,000 hours of human time are spent inside of Roblox. Every month.

Source: Matthew Ball


Third: OnlyFans generated 6.5B USD of profit last year. This is more than double the combined revenues of OpenAI and Anthropic.

This graph is of revenue, but the margins are high enough such that the statement above remains correct.
Source: Breaking Down OnlyFans’ Stunning Economics

Fourth: Tiktok has over a billion MAU with engagement still rapidly growing.

Fifth: Meta’s daily userbase seems to be growing more slowly. Perhaps this is because over half of Earth’s population with Internet access are already daily users (3.29B / ~5.52B -> 60%).

While I can’t share up-to-date Character AI numbers, I can share what was posted in their series A blog post from the before-times, all the way back in March 2023:

American zoomers spend around seven hours on their phones each day, and there are many cohorts of desktop users and console gamers (generally males) who average 5-8+ hours a day. I myself averaged 14 hours a day for several years in a row when I was younger.

There are 50 additional charts I’d love to include here, but the point is already clear: much of humanity’s time is already spent inside virtual worlds. The proportion is rapidly increasing, especially among younger demographics.

III. Successful digital content decoheres from reality

Jean Baudrillard, Simulacra and Simulation

Facebook started as a way to talk with your friends, then turned into a feed of nearby activities and people you might be interested in, and has now morphed into a bottomless feeding trough of hyper-optimized, machine-crafted AI slop which, numerically speaking, is more addictive than most psychotropic drugs.

Source: @venturetwins
I have like five hundred of these images but I’ll spare you the rest of them. Some are funny, others are horrifying.

The same pattern repeats in other areas – YouTube started as a way to view short funny videos (after the dating app shtick at least), then transformed into long-form content with maximally-alluring thumbnails of things which don’t actually exist, and has now ascended into its final form of a superlative-maximization machine the likes of which you absolutely wouldn’t believe without first watching a thirty-minute video sponsored by NordVPN and AG1 (which are themselves strongly decohered from reality – almost all their marketing claims are lies). YouTube does have troves of beautiful music and enlightening education, but the latter represents a minority of time spent on the platform.

One of the best YouTube success strategies is to master the Art of the Superlative
Source: MrBeast

Short-form video is an even more pernicious example of the results of content hyper-optimization. When I open YouTube Shorts in a fresh session (no account or cookies) most of the videos fed to me seem to have nothing to do with reality whatsoever. There are so many layers of decoherence that if all I was given to learn about humanity was YouTube Shorts, I would understand virtually nothing of the human experience.

Some results when searching for short-form content with ‘AI’ in the title. Source

Anyone paying the least bit of attention to popular meme culture among the youth over the last decade has likely noticed the drastic increases in surreality, artificiality, meta-humor, anti-humor, and layers of abstraction, as well as the shortening attention spans both of the users themselves and of the cultures which so graciously permit such alien psychofauna to inhabit their collective minds.

I can’t believe Jean Baudrillard predicted Swedish Fish-flavored Oreos

There are currently two remaining categories of popular short-form video which retain coherence with reality. The first is music (primarily human-made as of this writing) and the second is what I refer to as “SFW pornography”.

A large fraction of short-form video content is the latter. For some reason this surprises people. To illustrate the point, below is a screenshot I took of a fresh YouTube Shorts session (no account, no cookies, no view history). Take note that the proportion of videos which feature young women in a sexually-suggestive manner is greater than half. Instagram and TikTok are similar.

I regularly open different social media apps with no account to see what they try suggesting to new users. Sometimes I play bingo with it. It’s easy to score a bingo.

What’s unique about the genres which have required less morphing to remain memetically competitive is that they best exhibit content which is the equivalent of anti-meditation.

Meditation is often associated with sustained attention, introspection, and heightened awareness. You’re able to turn inward, to figure out what you truly are, what you truly want, and how you should act in order to attain these things (or in many cases, yes, how to not want things to begin with). But all styles of meditation involve sustaining a type of mindset for extended periods.

By forcing the viewer’s brain to rapidly and continually context-switch while constantly redirecting their conscious awareness (think: arrows, sound effects, fast camera movements, pop-ups, etc.), short-form video is the opposite of meditation: it is anti-meditation.

Your gradients will rapidly update in ways you are barely aware of and your time will pass quickly with nothing of note improving about your life whatsoever. I don’t think addictive apps are inherently bad, but I’d at least prefer they were pleasurable or strengthening rather than neither.

IV. Consumption of decohered content results in decohered people and cultures

Source: Industrial Society and Its Future, Theodore Kaczynski. Obviously not an endorsement of other work the author is known for.

You become the average of the five people you spend the most time with. These five people don’t have to be next to you. They can be YouTubers, Twitch streamers, or Twitter users. They can also be AI chatbots, VTubers, anthropomorphic animals, or Chinese content farms.

Because the five “people” consumers spend the most time with become increasingly decohered from reality over time, the resulting person also becomes increasingly decohered from reality over time.

The goal of all digital content is to personality-capture you into becoming the type of person that wants to consume more of it. The combined ecosystems of millions of intelligent programmers and AI researchers paired with millions of content creators are becoming quite good at this!

Due to human genetic and environmental diversity, we should also expect that the optimal content for each person to consume is wildly different. So we should expect less of a global culture and fewer shared values over time, as users slowly get siloed into their own walled gardens of bespoke content.
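To make the siloing dynamic concrete, here is a deliberately minimal sketch of that feedback loop. To be clear, this is not how any real platform’s recommender works; the topic vectors, the greedy serving rule, and the boost parameter are all invented for illustration. The only point is that a feed which serves whatever it predicts you will engage with, and thereby strengthens that very preference, can pull two users who start with nearly identical tastes into disjoint content silos.

```python
# Toy sketch of the "personality capture" loop described above. Purely
# illustrative: the topics, serving rule, and boost value are made up.

def normalize(v):
    total = sum(v)
    return [x / total for x in v]

def serve_and_update(interests, boost=0.1):
    """Serve the topic the feed predicts is most engaging (the user's current
    strongest interest), then strengthen that interest via the engagement."""
    favorite = max(range(len(interests)), key=lambda i: interests[i])
    updated = list(interests)
    updated[favorite] += boost
    return normalize(updated)

def divergence(a, b):
    """Total variation distance between two interest profiles
    (0 = identical tastes, 1 = completely disjoint)."""
    return sum(abs(x - y) for x, y in zip(a, b)) / 2

# Two users whose starting tastes are almost identical, but whose single
# strongest interest differs (topic 3 for Alice, topic 2 for Bob).
alice = normalize([0.95, 1.00, 0.90, 1.10, 1.00, 0.90, 1.05, 0.95])
bob   = normalize([1.00, 0.90, 1.10, 0.95, 1.05, 0.80, 0.90, 1.00])

print("divergence before:", round(divergence(alice, bob), 3))
for _ in range(300):  # 300 feed sessions each
    alice, bob = serve_and_update(alice), serve_and_update(bob)
print("divergence after: ", round(divergence(alice, bob), 3))
```

Run it and the divergence starts near zero and ends near 1.0: two people who were practically interchangeable end up with no overlapping interests at all, purely because the loop rewards whatever it already sees.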

It’d be no fun if I posted a strong hypothesis without concrete predictions, so here’s a list:

  • The preferences of men and women will continue to diverge, as optimal digital content is wildly different for each
  • The percentage of those identifying as LGBTQ will increase
  • The percentage of people identifying as furries, weeaboos, otherkin, plural, etc, will increase
  • There will never be any such thing as a unifying culture or cause in the future, with the only three potential exceptions being war, x-risk, and wireheading
  • The distribution of wealth and agency will become even more top-heavy in free-market countries such as the United States
  • The fertility rate will continue dropping; most government interventions will not change this and neither will gametogenesis, although ectogenesis might, depending on the type of government at the time
  • The number of strange subcultures will increase
  • The percentage of US citizens who believe democracy works will decrease continually over time. Ultimately democracy will obviously not be a thing due to AI, but I’m not yet confident how this will play out at all.

Note: This post is not yet finished. If you stumbled upon this via RSS or JSON or pure chance, please keep this in mind!