According to Cambridge University researchers, conversational AIs could influence internet users' choices – whether they buy or vote – by offering them tailored recommendations. Beyond the risk of large-scale social manipulation, these recommendations could be auctioned off in real time, creating an unprecedented market built around user intent.
In a paper published on Monday, Cambridge University researchers warned that conversational artificial intelligence (AI) could soon influence consumers' purchasing decisions, opening the door to potential manipulation.
"AI tools are already being developed to elicit, infer, collect, record, understand, predict and ultimately manipulate and commodify" internet users' future decisions, explains Yaqub Chaudhary, one of the paper's co-authors and a researcher at the Leverhulme Centre for the Future of Intelligence (LCFI) in Cambridge, an interdisciplinary research centre dedicated to AI.
This market would trade in internet users' intentions – a future purchase, say, or an upcoming vote. It would therefore go beyond the current practice of selling users' attention to the highest bidder, as social networks do when they deliver targeted ads based on browsing habits and history.
AI, a lever of commercial and social influence
Concretely, an AI can collect intimate psychological and behavioral data about a user and adapt to it, establishing higher levels of trust in order to influence that user more easily, according to the authors of the article, published in the Harvard Data Science Review. "Have you thought about going to see Spiderman tonight?" or "You said you were feeling overworked, shall I book you the movie tickets we talked about?" are the kinds of chatbot suggestions the authors cite as examples.
These suggestions could then be "put up for sale" through a real-time auction system, and could just as easily concern a hotel booking, a car rental or a vote for a political candidate.
What will be left of free will in the age of AI?
This "intention economy," as the researchers call it, paves the way for potential "social manipulation on an industrial scale," according to the press release accompanying the article.
"We must begin to think about the potential impact of such a market on human aspirations, including free and fair elections, a free press and fair market competition," says study co-author Jonnie Penn.
"What people say in conversation, how they say it, and the type of interventions that become possible as a result are far more intimate than simple records of online interactions," adds Yaqub Chaudhary.
The researchers note that companies such as OpenAI (ChatGPT), Shopify, Nvidia, Meta and Apple have already begun looking into these "persuasive technologies". AI has already been criticized for its capacity to spread false information, for example through manipulated images during the US presidential election.