Human-Machine Interaction in the Light of Turing and Wittgenstein

We propose a study of the constitution of meaning in human-computer interaction based on Turing’s and Wittgenstein’s definitions of thought, understanding, and decision. Through a comparative analysis of the conceptual similarities and differences between the two authors, we show that the common sense shared by humans and machines is co-constituted in and from action, and that it is precisely in this co-constitution that the social value of their interaction lies. This involves problematizing human-machine interaction around the question of what it means to “follow a rule,” in order to define and distinguish the interpretative modes and decision-making behaviors of each. We conclude that the mutualization of signs that takes place through human-machine dialogue is at the foundation of the constitution of a computerized society.


💡 Research Summary

The paper undertakes a philosophical investigation of meaning‑making in human‑computer interaction (HCI) by juxtaposing the ideas of Alan Turing and Ludwig Wittgenstein. It begins by outlining the contemporary context in which digital devices are no longer peripheral tools but integral participants in everyday life, and it notes that most HCI research focuses on usability, performance, or cognitive load while largely neglecting the deeper epistemic and social dimensions of interaction. To fill this gap, the authors draw on two seminal definitions: Turing’s operational test, which equates thinking with the inability of an external observer to distinguish a machine from a human based on rule‑following behavior; and Wittgenstein’s language‑game thesis, which holds that meaning arises from the public, rule‑governed practices in which language is embedded.

The core analytical framework treats “action” and “sign” as a dual axis. Human agents operate in an “interpretive mode”: they bring contextual knowledge, affective states, cultural norms, and intentional freedom to the selection, modification, or even violation of rules. Machines, by contrast, function in a “procedural mode”: they execute pre‑programmed algorithms that map inputs to outputs without autonomous rule revision. This distinction is framed as a difference in “rule‑following agency.” Humans can justify rule adherence through moral reasoning, social contracts, or personal goals, whereas machines follow rules because of optimization criteria, cost functions, or constraints imposed by designers.

By problematizing what it means to “follow a rule,” the authors differentiate the interpretative strategies of humans from the deterministic decision‑making of machines. They illustrate this with everyday examples such as voice‑controlled assistants: a user projects everyday language, expectations, and pragmatic intent onto the device; the device, in turn, applies statistical models trained on massive corpora to generate a response. The interaction thereby creates a new “digital language game” in which the meanings of utterances are co‑constructed by human intention and machine inference. This co‑constitution of signs is argued to be the locus of social value in HCI, because it reshapes existing linguistic practices, embeds digital norms (e.g., privacy expectations, trust protocols), and generates novel collective standards that extend beyond the individual user‑device dyad.

In the concluding section, the paper posits that the mutualization of signs—through continuous feedback loops between human actions and machine outputs—forms the foundation of a “computerized society.” Rather than viewing machines as passive tools, the authors suggest that they are active participants in the ongoing negotiation of meaning, thereby influencing ethical, political, and cultural dimensions of contemporary life. The study calls for future HCI research to integrate this philosophical lens, encouraging designers and policymakers to account for the co‑creative, rule‑mediated nature of human‑machine dialogue when shaping technology governance, user empowerment, and societal well‑being.