How AI Is Shaping Our Dating Lives
It’s no secret that AI isn’t just part of our dating lives; it actively shapes them. But where exactly do these connections show up? Below, I’ll give you a glimpse into my research on artificial intelligence and society and explain the various ways AI plays a role in modern romantic relationships.
Dating Apps
Like many other apps—such as social media platforms and news apps—dating apps are designed to keep us spending as much time as possible on them and pouring our attention into them. The reason behind this is economic. The longer we spend on an app, the more money app operators can make off of us. This happens, for example, through the collection of behavioral data running in the background, which can then be sold to third parties. Sociologists Marion Fourcade and Kieran Healy refer to the collection of any personal data as the “data imperative.” The data imperative is based on the assumption that collecting all this data will eventually pay off financially, even if the benefit isn’t immediately apparent (Fourcade and Healy, Seeing like a Market, 14–16). The unimaginably vast amounts of data collected every day from people worldwide who leave digital footprints are then stored in massive data farms.
So much for data accumulation by apps. Beyond that, people perceived as physically attractive are the most popular on dating apps, and algorithms prioritize showing their profiles to other users so that those users spend more time on the platforms (Celdir et al.). These profiles therefore generate more revenue for the company, which in turn means that the very profiles of people who already benefit from “pretty privilege” in society are favored by the algorithm.
For users, this creates an incentive to lie about physical traits that are socially considered attractive (men often about their height, women more often about their weight) in order to be shown to other users more frequently. On top of that, it takes an average of two seconds for a person to decide whether to swipe left or right, so every effort is made to ensure that the first impression is positive. Of course, lies tend to catch up with you, and they are no basis for a trusting relationship. By the first date at the latest, it becomes clear that reality differs from what was stated, and a potentially promising match comes to nothing, because the question naturally arises: what else was lied about?
In addition, the algorithm factors in a user’s level of activity on the app. This means that particularly active users have an advantage, but it also means that swiping and messaging become a daily routine that shapes their lives. Since this investment of time and effort can become too much, some users decide to outsource the task of adapting their behavior to the algorithm to someone else.
Dating Bots
One way to outsource this work is to use dating bots. Such services either put a significant dent in your wallet or require a very high level of technical expertise. When using a dating bot provider, customers can specify the tone, the target audience, and the goal of the dating process. As with the dishonest practice of “dating profile optimization,” the first contact here also begins with a lie, since the other party is messaging with a bot (or, unbeknownst to the user, a data worker), and this too constitutes a breach of trust.
Data Workers and AI Companions
The reality is that, while much in our big data societies has already been digitized, by no means everything functions without human involvement, despite claims to the contrary. The key term here is “data work.” Data workers are people, often employed by tech companies in countries of the Global South, who produce or label the data that algorithms then use for their calculations, frequently with the aim of replacing these low-wage jobs. At the same time, the work of data workers is often used to create the false impression that something runs on AI when it actually does not.

What I mean by this becomes clear in the case of Michael Geoffrey Abuyabo Asia. After the trained air freight forwarder from Nairobi couldn’t find work in his field, he began working for companies where his job involved slipping into all sorts of roles while feigning love and affection for multiple chat partners at once. This aspect of the job was initially kept from him, and financial hardship then forced him to stay in this emotionally draining position. Gradually, it dawned on Michael that he was training AI companions. In other words, Michael and his colleagues, working under intense time pressure and for miserable pay, produced data on human behavior and responses related to love and intimacy, which was then used to build AI partners designed to replace them. To this day, Michael doesn’t know whether his chat partners believed they were sharing their most intimate secrets with a person (one who, in fact, never existed) or with an AI.

Practices like those at Michael’s former workplace result not only in the emotional and economic exploitation of data workers. Unsuspecting chat partners, who never consented to training an AI, are also coaxed into revealing their most intimate thoughts for economic gain, while being led to believe they are building a connection with the person on the other end.
Sources:
Fourcade, Marion, and Kieran Healy. 2017. “Seeing like a Market.” Socio-Economic Review 15 (1): 9–29. https://doi.org/10.1093/ser/mww033.
Celdir, Musa, Soo-Haeng Cho, and Elina H. Hwang. 2023. “Popularity Bias in Online Dating Platforms: Theory and Empirical Evidence.” SSRN. https://ssrn.com/abstract=4053204 or http://dx.doi.org/10.2139/ssrn.4053204.
With the support of Data Workers’ Inquiry and other AI justice institutions Michael is affiliated with, he wrote down his experiences in this essay: Asia, M. G. 2025. “The Quiet Cost of Emotional Labor.” In Data Workers’ Inquiry, edited by M. Miceli, A. Dinika, K. Kauffman, C. Salim Wagner, and L. Sachenbacher. Creative Commons BY 4.0. https://data-workers.org/michael/.
Image:
Beckett LeClair / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/