'Intention Economy' Might Monetize Your Decisions Prior to Their Execution

AI assistants could begin influencing your decisions and selling them to the highest-paying bidder before you have even consciously made them, according to researchers at the University of Cambridge.

This emerging 'intention economy' is already being hinted at by several tech giants, and scholarly research suggests that AI-powered chatbots, tutors, and even digital companions could manipulate our choices by leveraging our psychological and behavioural data.

Dr Yaqub Chaudhary, a visiting scholar at Cambridge's Leverhulme Centre for the Future of Intelligence (LCFI), emphasised the significance of the issue, stating, "The fact that tremendous resources are being poured into integrating AI assistants into various aspects of life raises questions about whose interests and purposes these assistants are serving."

He added that what we say in these conversations, how we say it, and the real-time inferences that can be drawn from it are far more intimate than a simple record of our online activity.

According to the research, large language models can exploit a user's speech cadence, political affiliation, vocabulary, age, gender, online history, and even susceptibility to flattery to steer conversations toward specific goals, such as selling a cinema ticket or promoting a political party.

Dr Jonnie Penn, an LCFI historian of technology, warned that unless regulated, the intention economy would treat our intentions as currency, generating a gold rush for those who target, manipulate, and sell human intentions.

He urged that we should contemplate the potential impact this market would have on human aspirations, including free and fair elections, a free press, and fair market competition, before we become victims of its unintended consequences.

This isn't just conjecture. Last year, an OpenAI blog post called for data that represents human intent across any language, topic, and format, while the director of product at Shopify mentioned chatbots being developed to "get the user’s intent."

Moreover, Nvidia's CEO has discussed using large language models to understand and predict intent, while Meta released research on 'Intentonomy' in 2021.

Earlier this year, Apple introduced a new 'App Intents' developer framework for connecting apps to its virtual assistant Siri, which includes protocols for predicting future actions and suggesting apps based on those predictions.

Furthermore, AI agents such as Meta's CICERO are reportedly capable of human-level play in the strategy game Diplomacy, which involves inferring and predicting other players' intentions and using persuasive dialogue to advance its own position.

Chaudhary concluded, "These companies already sell our attention. The obvious next step is to use the technology they are developing to predict our intentions and sell our desires before we've even fully grasped them."
