To my children, the life of a social media influencer looks like the pinnacle of doing as little as possible: creative freedom, a direct audience, and unlimited money. They haven’t yet truly grasped the gap between perception and reality. To help them peel back the curtain, I try to point out a different reality: many creators are locked in a relentless dance with an invisible boss, the algorithm. There are plenty of examples where they post a video about needing a break, struggling with mental stress, and so on.
I am not sure where I heard it first, but it really clicked and stuck: they are, in effect, zero-hour contractors working for a non-human entity. The original phrasing was in Dutch and rather coarse, but the meaning is the same. Your success is dictated by the code of a platform designed to capture human attention. That attention is measured not only by how long you stay on the platform, but also by every thumbs up or down.
This isn’t just about influencers anymore. This dynamic is a preview of a much larger shift. As new, more powerful AI tools are pushed into our work and lives, we run the risk of extending this model to everything, turning us all into gig workers for an algorithm we don’t fully understand.
The Influencer in the Algorithmic Coal Mine
Creators on platforms like YouTube, TikTok, and Instagram live and die by their metrics. Their challenge isn’t just to create good content and point you to their OnlyFans, but to create content the algorithm chooses to promote. As Nathaniel Drew from the Better Ideas channel discusses, these platforms are designed to be as addictive as possible, mainly because “our attention is their currency.”
Historian Yuval Noah Harari explains the mechanics behind this pressure. These platforms gave their primitive AIs a simple, seemingly benign goal: increase user engagement. The algorithms, experimenting on billions of users, quickly learned that the most effective way to glue eyeballs to screens is by “pressing the greed or hate or fear button in our minds.”
This dynamic forces creators into a corner. To succeed, they must align their work with the algorithm’s goal. The signal I want to point out is that they are not paid for their time or effort, but purely for their ability to generate engagement, as defined by the algorithm of that specific platform.
The Ghost in the Machine
This isn’t magic, it’s math. The underlying principle for many of these systems is the Markov chain: a way of predicting the next step in a sequence based only on the current state. This “memoryless” property is what makes it so powerful.
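To make the idea concrete, here is a minimal sketch of that memoryless prediction, using a made-up watch history (the clip names and the `predict_next` helper are my own illustration, not any platform’s actual code):

```python
from collections import defaultdict

# Learn transition counts from a (hypothetical) watch history, then
# predict the next clip from the current one alone. "Memoryless" means
# nothing before the current state influences the prediction.
history = ["vlog", "gaming", "vlog", "gaming", "gaming", "vlog", "gaming"]

counts = defaultdict(lambda: defaultdict(int))
for current, nxt in zip(history, history[1:]):
    counts[current][nxt] += 1

def predict_next(state):
    """Return the most frequently observed follower of `state`."""
    followers = counts[state]
    return max(followers, key=followers.get)

print(predict_next("vlog"))  # "gaming": it followed "vlog" most often
```

A real recommender is vastly more sophisticated, but the core loop is the same: observe transitions, then bet on the most likely next state.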
Google’s PageRank used this to predict the most relevant webpage by modeling a user’s journey across the internet. Nowadays even Google’s AI-generated results appear first, above the paid advertisers. Large Language Models use a far more complex version to predict the next word in a sentence. Social media algorithms use it to predict which video or post you are most likely to click on next, creating a “chain of events” designed to keep you on the platform.
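The PageRank idea can be sketched in a few lines: model a random surfer hopping along links, and iterate until the visit probabilities settle. The three-page “web” below and the iteration count are my own toy assumptions; only the 0.85 damping factor comes from the original PageRank paper:

```python
# Tiny web of three pages and their outgoing links.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
pages = list(links)
damping = 0.85  # probability the surfer follows a link vs. jumping anywhere

# Start with equal rank, then repeatedly redistribute it along the links.
rank = {p: 1 / len(pages) for p in pages}
for _ in range(50):
    new = {}
    for p in pages:
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new[p] = (1 - damping) / len(pages) + damping * incoming
    rank = new

# "C" ends up highest: it is linked from both "A" and "B".
print({p: round(r, 3) for p, r in rank.items()})
```

The same machinery, scaled up to billions of pages, is what let Google predict which result a user’s journey was most likely to end on.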
The algorithm isn’t a conscious entity with preferences. It is a predictive machine optimizing for a single variable: your continued attention. The same can be said about today’s AI tools. I have never encountered a response like “That is a really stupid comparison, are you sure?”
From Content Creators to Global Workforce
The risk now is that this model is expanding from the creative sector to the entire economy. The new generation of AI tools is poised to redefine what it means to be a knowledge worker.
Geoffrey Hinton, the “Godfather of AI”, is blunt about the consequences. He warns of mass joblessness as AI replaces “mundane intellectual labor.” He argues that just as the industrial revolution made muscle power less valuable, this revolution is devaluing cognitive work. His stark advice for future-proofing a career? “Train to be a plumber.”
This creates a future where many white-collar workers could find themselves in a similar position to today’s influencers: competing with AI agents or using them to become hyper-productive, ultimately reducing the need for human labor. As Hinton’s niece experienced, an AI assistant turned her 25-minute task into a 5-minute one, making her five times more efficient. This is great for productivity, but it also means one person can do the work of five.
Yuval Noah Harari builds on this by describing the rise of AI bureaucracies. We are already outsourcing critical decisions about our lives like job applications, bank loans, even criminal sentences. In these scenarios, we are all subject to the opaque judgment of a machine. When the “computer says no,” there is often no human to appeal to, placing us in a powerless position, subject to a system we cannot understand or influence.
An Arms Race with No Brakes
What troubles me is that both former Google CEO Eric Schmidt and Yuval Noah Harari describe a global AI “arms race” where nations and corporations are locked in a battle for supremacy.
The prevailing logic, as Harari notes, is one of deep distrust: “If we slow down, how do we know that our competitors will also slow down?” Schmidt echoes this, framing it as a national security imperative between the US and China. The fear that an adversary will achieve a breakthrough first forces everyone to accelerate, putting safety and ethical considerations in the back seat.
This race forces us to adapt to the machine’s timeline. Harari calls it a “tug-of-war” between organic and inorganic systems. The AI is “always on,” functioning at a digital pace that humans cannot match. The danger is that it will force us to speed up until our human systems, our societies, our mental health, our democracies begin to collapse.
Reclaiming Human Agency
So, are we doomed to become cogs in a machine of our own making? Not necessarily, but the answer isn’t a rejection of technology either. As Nathaniel Drew suggests, it begins with personal accountability and intentionality (is that even a word?). We must actively design the life we want to live, rather than passively letting technology dictate how we spend our time. This means cultivating an “internal locus of control” and creating a reality we don’t feel the need to escape from.
Yuval Noah Harari advocates for a similar approach on a mental level, suggesting we all adopt an “information diet” and practices like meditation to find clarity amidst the noise.
Ultimately, the question is not whether AI will be a part of our future, but on whose terms. We must demand transparency and human oversight. We need to build the guardrails Schmidt mentions, ensuring systems have a “human in the loop.” If we fail to do so, we may find ourselves in a world where we are all just temporary contractors, hoping our performance is good enough to be renewed by the algorithm.