From Scroll to Chat: How AI Became Social Media’s New Moderator

Brands built reach through influence; with AI, they now need to build it through relevance.

Mark Sage - 11 min read - 21/10/2025

Since 2022 there’s been a curious inflection in social media usage. After nearly two decades of relentless growth, research from GWI/FT shows that global time spent scrolling peaked that year, and has been slipping ever since.

At first, it looked like progress: perhaps the world was finally recovering from its addiction to doomscrolling. It surely can’t be a bad thing if people are spending less time on social media. However, I’m not sure that’s really the case.

I’d argue the more plausible explanation is not that people stopped scrolling; it’s that they started talking — not to one another, but to machines.

The year 2022 was also a pivotal year for AI, with models such as OpenAI’s ChatGPT, Google’s LaMDA, and Meta’s OPT bringing large-scale language systems into the public eye.

While no panel yet tracks ‘minutes in AI assistants’ the way we track social feeds, there are still studies in the UK and US that show rapid, mainstream adoption of these tools — even as global social media time plateaued after 2022 — strongly implying a reallocation of attention toward AI-led interactions.

In fact, a 2024 Ofcom report from the UK highlighted that “over half (54%) of online 8–15 year olds and 41% of [..] 16+ said they had used a generative AI tool in the past year.”

More recently, OpenAI reported that it has 800m weekly users and that token usage has increased from 300m in 2023 to 6bn in 2025. That 20x increase suggests not just more users, but greater scale and depth of usage. That’s longer conversations, more complex prompts — in short, that’s more engagement.

Where social feeds once filled idle moments, conversation now does — not with people, but with machines.

The Ofcom research captured that 50–70% of generative AI usage was for non-search activities — so people are not simply using it to replace solutions like Google, they have new use cases, new ways to engage.

This was highlighted in an article from The Guardian about the phenomenon of ‘chatfishing’: dating app users leveraging tools like ChatGPT to create initial chat-up lines to hook a potential match. Some users even extend its use to essentially the entire conversational back and forth, meaning that increasingly, you’re as likely to be building a rapport with an AI as you are with a potential mate!

It’s unlikely this trend will slow either. Sam Altman, CEO of OpenAI, indicated via X that they plan to release a new version that “allows [..] your ChatGPT to respond in a very human-like way [..] act like a friend” with an aim to “treat adult users like adults [and] allow even more, like erotica for verified adults.”

So, it seems that chat-based AI may not only impact social media usage but could also impact online dating too — quietly becoming a new social and digital companion interface.

Younger audiences in particular have grown up in a world where it feels normal to type questions, confide feelings, or seek advice from an algorithm. These systems are increasingly not just answering queries — they are helping people think, reflect, and even construct who they are.

For the better part of two decades, social media defined identity as performance. We learned to express ourselves through a feed with filtered photos, clever captions, and carefully curated opinions.

We all curated our lives to some degree online, typically showing our best side. This evolved from the early days of social media, with friends’ feeds and cat pics, into a world of marketing influencers. No longer satisfied with seeing where our old school friends and acquaintances had been on holiday, we got hooked on the opinions, recommendations and general lifestyle of these key opinion leaders (KOLs).

In this shift from curation to consumption, brands managed to embed themselves as part of the conversation, promoting products and soft selling aspiration through trusted social content.

Unilever’s CEO Fernando Fernandez captured this perfectly when he said that “Creating marketing activity systems in which others can speak for your brand at scale is very important. Influencers, celebrities, TikTokers. These are the voices that matter,” adding that “messages from brands coming from corporations are suspicious messages”.

He didn’t just say it, he built it: in 2025 Unilever reportedly allocated 50% of its total marketing budget to influencers and increased the number of KOLs it works with twentyfold.

Whilst Fernandez is likely correct that consumer trust in institutions and media has declined, I’d argue that trust in social media itself is declining too.

With increasingly manipulative algorithmic targeting and the explosion in AI-generated content (or slop), we’re seeing fatigue set in. That same research from GWI showed that not only is social media time declining, but the reasons for usage are changing too. The social aspect — meeting new people, sharing opinions and keeping up with friends — is down 30–40% since 2014. More interesting, though, is the drop in ‘following celebrities’, down by around 10% from its peak in 2022.

This is further compounded by the ‘Dead Internet Theory’, which suggests that an increasing share of engagement, such as likes and comments, is bot-generated. And this isn’t just a conspiracy theory: 2025 was the year when automated traffic surpassed human activity on the internet for the first time.

As social media has scaled its influence, it has also scaled instability. In recent years, an engagement gold rush has pushed us into a world of post-truth and post-moderation.

Platforms have cloaked themselves in the mantra of free speech so that they can potentially mask their real intent — free flowing cash.

Pulling back on the efforts to fact check what we see, they have opened the gates to anyone with money and an opinion. As Mark Zuckerberg said in January 2025, “we’re going back to our roots and focus [..] restoring free expression on our platforms”.

You can pretty much guarantee that those roots aren’t the innocent days when we “poked” each other and liked each other’s family pics.

This free expression means an increase in ‘opinions’, and whatever your view on a given opinion, the end result is generally clickbait content, designed to create virality, not value.

In a sense, by chasing engagement, the social platforms stripped away the very things that kept information sane — the editors, the context, the consequence. Their algorithms rewarded reaction and not reflection, and so facts became fiction, and moderation became censorship.

We built a world where virality replaced verification and where noise replaced news.

It’s in this environment then — with declining social media usage, and declining trust and engagement (at least from humans) — that people have begun to look more inward than outward.

And AI fits this shift perfectly.

Large language models are reintroducing what we lost — a personal moderator.

They don’t ban or block, but instead they filter, contextualize, and question. Sure, they still have challenges — they still hallucinate, mis-reference, misunderstand. But they restore trust by forcing coherence, consistency, and causality back into the flow of ideas. They balance division, and they create a space for questioning.

Where the social feed gave us infinite noise and polarisation, the LLM gives us structured meaning and rationalisation. This change — with AI essentially acting as the adult in the room — has the potential to change how consumers interact and participate.

Rather than social media prompting the question “Who should I be like?”, AI supports more introspection around “Who am I really, and what do I need?”

It’s private, adaptive, and endlessly patient; quite the opposite of the noisy, performative world of social feeds. It doesn’t make us feel like we’re missing out and instead helps us feel like we’re growing.

However, as people spend more time in these private dialogues, they are outsourcing a growing portion of their decision-making to the systems that guide them.

This outsourcing of decision-making is something researchers at MIT were able to observe directly. When test subjects used ChatGPT for a writing task, they showed the lowest levels of brain engagement and recall compared with using search or no tools at all. The AI users literally outsourced their thinking, or, as the study puts it, showed “a diminished sense of cognitive agency”.

If the social feed shaped what we decide, the agent (and its control over our cognitive agency) is shaping how we decide. Moving us from desires to decisions; from influence to intelligence.

This evolution in consumer decision-making introduces a new dynamic for marketers: the modern consumer now has an intermediary, an AI model that acts as a digital proxy, filtering, comparing, and recommending on the consumer’s behalf.

Sometimes this proxy will be visible, like ChatGPT or Perplexity; often it will be invisible, embedded in the systems and platforms that surface choices for us every day. Either way, it will increasingly become the first point of contact between the brand and the buyer.

In a marketing sense, there is now a new channel to the buyer — and for marketers, this is now channel marketing.

This isn’t something new — channel marketing is a well-known discipline. Many companies don’t sell directly to their consumer. They sell through distributors, brokers, and retailers. They understand who is the buyer and who is the gatekeeper.

You seek to drive awareness through brand advertising with the end consumer, and drive activity through the channel to ensure physical availability. In channel marketing, this is known as ‘incentivising the chain’, which is all about managing the push and the pull.

This new consumer landscape is no different — only now that intermediary is algorithmic.

A new route to market that reads the data, weighs the reviews, interprets the claims, and makes or shapes the recommendation. It’s not swayed by an influencer’s flashy posts, personality or looks. Today its decisions aren’t based on incentives (although tomorrow that may not be the case!).

Brands must therefore treat this agent as a channel partner — something to be educated and supplied. Ensure the brand is visible to the machine, with clean data, consistent purpose, and credible promises. No longer can you simply outbid a competitor to reach the front of the queue. Instead, you need to be relevant to the conversation and to the need; and you need to sell that relevance to the agent.

The task then is not only to win the consumer’s affection, but also to win the agent’s confidence. Just as the retailer shapes shelf visibility, the algorithm now shapes semantic visibility.

We’re used to marketing being about the creative. Whether moving or still images, the visual aspect of creative has a unique part to play in drawing us in.

Even simple changes like whether the model is looking at us or looking away can determine the impact of that creative in lifting sales. Author Phil Barden, in his book Decoded, discussed how making small changes to the eye-gaze for a website landing page helped increase conversion by 39%. As humans we are visual and emotional animals.

But with AI, a picture isn’t worth a thousand words — you actually need words.

Models infer “authority” through semantic consistency: repeated, aligned explanations across multiple credible sources. So, when marketing to the agent, we’re moving from aesthetic influence to semantic influence, and when an LLM interprets content, words are the unit of meaning.

For brands, that means ensuring your name consistently appears alongside meaningful, explanatory language. It’s the same principle that builds mental availability in the human mind — repeated exposure deepens associations and strengthens neural pathways. The AI model is no different.

Every word and phrase is broken down into tokens, the atomic unit of language. Tokens are what models consume, interpret, and weight. In that sense, words are literally the energy source that drives the system.
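The idea can be sketched in a few lines of toy Python. This is an illustrative word-level tokenizer with an invented mini-corpus, not the subword byte-pair encoding that production LLMs actually use; the principle it shows is the same — text becomes integer token IDs before a model ever sees it.

```python
# Toy illustration of tokenization. Real LLMs use subword schemes
# such as byte-pair encoding (BPE), but the core idea is identical:
# text is mapped to integer token IDs, the units the model consumes.

def build_vocab(corpus):
    """Assign an integer ID to every unique lowercase word."""
    words = sorted({w for text in corpus for w in text.lower().split()})
    return {w: i for i, w in enumerate(words)}

def tokenize(text, vocab, unk=-1):
    """Map each word to its ID; unknown words fall back to `unk`."""
    return [vocab.get(w, unk) for w in text.lower().split()]

# Hypothetical snippets of brand copy, just for demonstration.
corpus = [
    "our brand delivers verified sustainable packaging",
    "the brand claims are verified by independent reviews",
]
vocab = build_vocab(corpus)

print(tokenize("our brand claims are verified", vocab))  # → [6, 1, 3, 0, 11]
```

Words the model has never been fed map to an unknown marker — a crude analogue of why a brand absent from the training text simply has no representation to draw on.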

From a brand marketing perspective, our words aren’t just communication, they are combustion. They fuel the LLM and like any engine, if there isn’t enough fuel or the fuel isn’t right, then the engine will stall.

So, what we say matters. Every descriptor. Every review. Every influencer.

And every piece of influencer content needs to say something. It’s no longer enough for a creator to look good holding a product or filming in a beautiful destination. If you’re betting 50% of your marketing budget on what influencers say, it would pay to make sure the LLM is able to listen.

Their content must create language that instructs and teaches the LLM: machine-readable, transcribed, and designed not simply to influence but to inform. Indeed, YouTube now even offers a setting that lets creators allow third-party companies to use their video transcripts to train models.

In this new era, it isn’t enough to be seen, you also need to be understood.

Recognising that we now have an intermediary between brand and consumer, we may need to move channel marketing from a below-the-line activity — sitting in a box alongside trade marketing, field marketing and incentive travel — to a through-the-line activity. Elevated to a core discipline that sits alongside brand marketing and digital marketing.

This isn’t just optimisation — it isn’t a new form of SEO. This is a strategic discipline to ensure that this new channel partner is fully informed and fully equipped.

In the past, we would train retail partners, provide sales kits, and explain why our brand deserved shelf space. Now we need to create semantically rich content that ensures the AI agent can give us screen space.

It’s not enough to occupy the human mind; we must inhabit the model’s ‘mind’: its knowledge graph. That means making sure our information is structured, our claims verifiable, and our APIs accessible, so the agent can advocate on our behalf.
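In practice, “structured information” often means schema.org-style markup embedded in a page. Below is a minimal sketch in Python; the brand, product and figures are hypothetical placeholders, and this is one common way to expose machine-readable claims, not the only one.

```python
import json

# A minimal sketch of schema.org-style Product markup (JSON-LD).
# "ExampleCo" and "Example Trail Shoe" are invented placeholders.
# Markup like this gives a crawler or AI agent verifiable fields
# to read, rather than leaving it to infer claims from ad copy.

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Shoe",  # hypothetical product
    "brand": {"@type": "Brand", "name": "ExampleCo"},
    "description": "Waterproof trail shoe with a recycled-material upper.",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "1187",
    },
}

# Embedded in a page inside <script type="application/ld+json">,
# this block is directly parseable by crawlers and agents.
print(json.dumps(product, indent=2))
```

The design point is that every claim lives in a named field with a shared vocabulary, which is exactly what an algorithmic intermediary needs in order to compare one brand’s promise against another’s.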

Advertising still builds emotion and the reason to believe — the mental availability. The funnel may still need content for those users who are flicking from video to video. Humans are still in the loop.

But the machine needs the reason to trust and the fuel for this is words. Not simply beautiful copy, but credible copy that can inform. Multiple voices, that amplify the “why” for our brand and product.

One without the other risks leaving us invisible — either to the human or to the agent.

In a sense, we’ve been here before.

Before mass media, most brand relationships were mediated by people — the shopkeeper, the pharmacist, the travel agent. Marketing was built on educating those intermediaries so they could represent the brand faithfully. Person to person.

Equally, every technological leap has created a new middleman: the modern-trade retailer, the broadcaster, the publisher, the platform.

AI is simply the next intermediary. A data-driven gatekeeper that decides what you see, based on what we as marketers help it to see.

So we need to ensure we’re marketing to it, through it and ultimately with it. Only then can we shape the experience it delivers, and only then can we shape consumer choice.

The challenge, as ever, is the same.

Let’s collaborate

If you’re exploring how to shape customer behaviour — through loyalty, platforms, or data —
there’s always more to unpack.

Sometimes that starts with a conversation.
Sometimes it turns into something more.

Customer platforms, loyalty, and behaviour design
