Will AI kill the influencer?
Generating income of $500,000 in just 10 months sounds attractive, right?
This is just one of many claims made in Reddit threads about how to create AI-generated influencers and drive sales from them. Unsurprisingly, there's a great deal of interest in jumping on the next money-making bandwagon.
But we need to stop for a moment, look past the temptation to earn a quick buck and consider the consequences of doing so.
Interestingly, out of more than 6,000 pieces of content on Reddit, just 18 mentioned "ethical", "ethic" or "ethics". One commenter sums it up perfectly: "They did it because they could do it but didn't think whether or not they should do it. Ethics is not a high priority at any tech job these days".
We are living through a seismic shift in the impact technology will have on our lives, hurtling along at breakneck speed in a global race. And not enough people are stepping back to ask 'so what?'.
Where it hits hardest is trust.
Trust is on the floor
Trust in governments, trust in business, trust in media… trust in people.
Edelman's 2025 Trust Barometer puts this down to a growing public sense of grievance that institutions are not acting in their interest: that businesses are not behaving ethically or properly, that politicians are all the same, that the media pumps out fake news.
Consumers are used to marketers putting a positive spin on things, and we have guidelines and rules in place to keep over-claiming or downright lying in check. But it's widely acknowledged that, where AI is concerned, it's the 'wild west' out there at the moment: there are few guardrails or laws to adhere to, so consumers are fair game.
Where AI influencers are concerned, is it the 'pretending to be real' that is the issue? Marketers and advertisers have always sold products and services that are often not as good as they seem. Equally, characters and mascots have been a mainstay of many marketing campaigns - think Ronald McDonald, the Meerkats, Captain Birdseye, the Honey Monster - the list goes on. But none of these pretended to be human.
AI influencers are already out there
On average, marketers already allocate 25% of their marketing budgets to influencers (adamconnell.me), and 48% of 18- to 29-year-olds follow an AI or virtual influencer (izea.com).
It all comes down to why consumers follow influencers. Is it just for the entertainment value of the content? If so, an AI persona could potentially win the day.
And it comes with complete control for the brand - no skeletons in cupboards, no unreliability, no protracted contract negotiations - the AI influencer will do exactly what the brand wants it to do.
AI-generated influencers are consistent in appearance and tone, can fit any brand's schedule, will never be subject to the human frailties that affect us mere mortals, and won't engage in any outside controversies - oh, and they're cheaper too!
That's why names like Lil Miquela, Imma and Lu do Magalu are being snapped up by global brands such as IKEA, Samsung and Burger King for social campaigns. In 2024, Lu do Magalu had more than 6 million followers and earned an estimated $33,000 per Instagram post (AIT News Desk, 2024), and in just six months Lil Miquela was sponsored by 60 brands (Wiley Online Library).
Consumers are more open to engaging with AI influencers than we might expect. A peer-reviewed study by King's College London researchers found that people are equally happy to follow an AI or human influencer and that the level of personalisation is similar in either case. This was especially true of those 'who have a high need for uniqueness' (1).
Brands that have been early adopters of AI-generated influencers have certainly been grabbing the headlines. It is an approach that has been enthusiastically embraced, particularly by brands in the luxury and fashion sectors, including BMW, Calvin Klein and Prada.
The human domain
However, consumers, especially younger ones, purportedly seek authenticity above all else.
Perhaps it's about the person and not just the content. Followers trust an influencer, and it's that trust that makes the content creator valuable. As one Reddit contributor puts it: "… that trust comes from something AI can't actually replicate - being a human."
Will anybody look at an AI influencer and be influenced to try a product, eat a particular food, go to an event, visit a city? An AI influencer cannot interact in the same way, cannot walk into a store, cannot draw on the same humour, nuance, experience or emotion. An AI influencer cannot mix up its brand content with snapshots of real life - doing chores, going for a hike, going out for a meal, taking the dog for a walk!
Unless of course, the audience doesn't realise that it’s not a real human.
When you cannot easily or quickly tell whether the person on screen is real or a computer-generated image, how does the public know who to trust any longer? And what does this mean for the future of influencer marketing?
Trust signals
The Drum recently reported that 92% of consumers trust individuals, even strangers, more than branded content.
This is at the heart of the argument - how can you trust something that isn't real? An AI influencer is, after all, the ultimate piece of branded content. Without knowing who the puppet-master is, many would argue that establishing trust is impossible.
Risk behind the glitz
While it's arguable that an AI influencer is easier to control than the human equivalent, let's stop for a moment to consider who is behind these AI-generated personas. What are their motivations and ethics? These could be the keyboard warriors of the future - just with a pretty face.
An AI influencer is likely to be much cheaper than a human one, so cash-strapped brands might be tempted to go down the AI influencer route - but will they always be transparent and make clear that this isn't a real human being? If not, that's a serious level of deception with far-reaching implications.
David Edmundson-Bird, Faculty Lead for AI at Manchester Metropolitan University, is calling for clear guardrails to be introduced. He notes that the UK AI Act (due in 2026) can't come soon enough, and predicts that while the Act will likely be more stringent than legislation in the US, it probably won't quite meet EU standards. Fines of £300k have already been levelled at brands in Italy that failed to clearly disclose AI-generated content, with even steeper penalties in France.
Fortunately, the industry is already making moves to self-regulate. Earlier this year, the Cannes Lions International Festival of Creativity stripped Brazilian agency DM9 of its award for using AI-generated and manipulated footage to bolster its winning entry. To protect the integrity of the awards, the much-respected organisation has since revised its policies to strengthen transparency and enforce AI disclosure - certainly a taste of things to come (The Drum).
As we head into 2026
What should brand owners and marketers consider as we plan for the new year?
Long-term goals - trust is hard-won and easily lost. In this cost-of-living era, no company can risk squandering the trust their customers place in them. But it's nuanced. For some brands with a digitally native and curious consumer base, working with an AI influencer may be a good short-term publicity generator and won't necessarily do long-term brand health any harm - as long as it's all transparent and above board.
But for many brands, a short-term spike in share of voice (SOV) could lead to seismic fall-out if consumers feel hoodwinked or simply aren't ready to embrace virtual personas.
How human influencers will win the long game
A period of coexistence between human and AI creators is happening now, with audiences adapting to new and different forms of interaction.
The influencer market is then likely to 'level up', with opportunities for those human creators who adapt strategically to the evolving landscape.
We can expect to see a shift from content producers to content partners: brands that see beyond a one-dimensional transaction and seek partners who can provide versatile raw footage, share behind-the-scenes moments, and meet consumers face-to-face.
All things an AI influencer cannot do - yet.
We will be discussing this and more at Prolific North Live on Thursday 6 November at University Academy 92 (UA92), Brian Statham Way, Old Trafford, Stretford, Manchester M16 0PU. We have a stellar line-up of speakers, including body positivity influencer Sophie Noa, Manchester Met University's AI Faculty Lead David Edmundson-Bird and our very own Senior Influencer Manager, Collette Reid. Graham McGilliard, Strategic Comms Director, will be asking the questions we all want answers to.
If this has raised questions about your influencer strategy and campaigns, try our free Influencer Audit to find out just how influential your content creators are.
Or if you just fancy a chat about how influencers can help supercharge your business in 2026, contact Collette Reid, Democracy’s Senior Influencer Manager at collette@democracypr.com or call 0161 881 5941.
References
https://ourownbrand.co/influencer-marketing-statistics-2025-uk-trends-spend-compliance/
https://onlinelibrary.wiley.com/doi/10.1002/mar.22105
1. Sands, S., Campbell, C., Plangger, K., & Ferraro, C. (2022). Unreal influence: Leveraging AI in influencer marketing. European Journal of Marketing, 56(6), 1721-1747. https://doi.org/10.1108/EJM-12-2019-0949
https://www.thedrum.com/news/cannes-lions-tightens-ai-rules-after-dm9-award-scandal
Reddit: Since May 2025, we have recorded 6,668 pieces of content on Reddit containing the term "AI influencer". Of these, 1,092 were original posts; the rest are engagements with those posts. The conversation is growing: from 1 September to 31 October we recorded 2,674 pieces of content - nearly 60% more than in the 60 days before 1 September.
https://www.reddit.com/r/socialmedia/comments/1o19rtl/sora_2_might_be_the_point_where_influencers/