The need for some degree of control and adjustment of action and reaction with man-made technical artifacts is itself proportional to their purposes, be they, for instance, tool-like or toy-like in nature. From a primitive situation in which processes were executed and register values checked manually, to the automation of "commands" and feedback (aptly named cybernetics), there has been constant human-object interaction. But a kind of bilateral "communication" between humans and their artifacts is a phenomenon of the last century, one that has recently grown dramatically with the advancement of artificial intelligence. In particular, generative AIs (GenAIs) have profoundly transformed the way people interact with machines and will continue to do so to an even greater extent in the foreseeable future.

In this context, it is worth exploring some epistemic aspects of AIs’ collaboration with users and the changing boundaries in the human-machine relationship. We can begin by analyzing the causes and consequences of the phenomenon known as the ELIZA effect. In the 1960s, the ELIZA chatbot developed by Joseph Weizenbaum at MIT, which used basic pattern-matching and repetition techniques, successfully led many users to attribute emotional understanding and genuine intent to it. Its name is now used to describe the human tendency to attribute mental states and cognitive abilities to AI systems that mimic human language. On the basis of a philosophical and psychological analysis, we explore how this mental attribution reflects mechanisms that are characteristic of interpersonal relationships and point out the challenges this poses in the contemporary world. Far from being anecdotal, the situation reveals a human predisposition to project subjectivity onto any entity that uses language consistently, a trait that persists and is amplified by today's natural language processing (NLP) technologies and advanced AI models.
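The pattern-matching and repetition technique that made ELIZA so persuasive can be conveyed in a few lines. The following is a minimal, illustrative sketch, not Weizenbaum's original script: the rules and pronoun-reflection table are invented here to show how a few regular expressions and word swaps suffice to produce replies that feel attentive.

```python
import re

# Hypothetical pronoun-reflection table: swaps first- and second-person
# words so the reply echoes the user's own statement back at them.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

# A tiny invented rule set; Weizenbaum's script was larger but
# structurally similar: a pattern, and a response template that
# reuses the matched fragment.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"(.*)", re.I), "Please tell me more."),
]

def reflect(fragment: str) -> str:
    # Replace each word with its first/second-person counterpart, if any.
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance: str) -> str:
    # Try each rule in order; the catch-all pattern guarantees a reply.
    for pattern, template in RULES:
        match = pattern.match(utterance.strip())
        if match:
            return template.format(*[reflect(g) for g in match.groups()])
```

For example, `respond("I feel misunderstood by my family")` yields "Why do you feel misunderstood by your family?" — there is no understanding anywhere in the program, only substitution, which is precisely what makes the attribution of a mind to it an error worth examining.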

This phenomenon is not due to cognitive limitations, but to a deep physiological and psychological inertia that leads humans to seek reciprocity and understanding in any interlocutor, even if it is a machine with a human likeness. By simulating complex dialogues, GenAI reinforces this tendency, generating an epistemic shift: a change in perception that leads to considering machines as agents equipped with a mind and perhaps subjectivity. However, this mental attribution is a conceptual error that obscures the essential distinction between personal intelligence and the computational processes of machines. Although AI can mimic linguistic patterns with amazing results, its operations are based on models and algorithms that lack conscious experience or intentionality.

To address this problem, it can be helpful to delve into the importance of the second-person perspective (2PP), a key notion in philosophy and psychology that emphasizes the role of joint attention and the dynamics of reciprocity in interpersonal relationships. Whereas the first-person perspective (1PP) emphasizes the incommunicability of the individual's experience and introspection, and the third-person perspective (3PP) carries the scientific difficulty of interpreting seemingly similar processes in another person as being internally the same, the 2PP is eminently pragmatic: it relies on the actual confirmation, by at least two individuals, that an experience is truly shared.

Whereas interpersonal communication involves this level of mutual recognition, interactions with AI lack this essential element; although GenAIs can collaborate with people effectively, they are not capable of engaging in a genuine relationship. They can facilitate tasks, expand access to information, and improve productivity, but they do not possess an identity with which to establish a true bond.

The distinction between “someone” and “something” is crucial to preserve the richness of human relationships in a world increasingly mediated by technology. The increasing personalization of machines, if not balanced by critical reflection, could lead to a reification of people and a personalization of artifacts, blurring the line between humanity and its creations.
