Larghi, S., Datteri, E. (2023). How people understand AI: from folk psychology to folk cognitive science. Paper presented at: AIC 2023 - 9th Workshop on Artificial Intelligence and Cognition 2023, Bremen, Germany.
How people understand AI: from folk psychology to folk cognitive science
Larghi, S.; Datteri, E.
2023
Abstract
Understanding how people understand AI systems is crucial to unravelling the dynamics of human-AI interaction. It may also inform the design of AI systems, offering information about people's expectations and needs. Empirical and theoretical research on this topic has so far focused mainly on the attribution of mental states and intentionality to robots, and more specifically on whether people adopt an intentional stance (à la Dennett) towards AI systems, based on the attribution of beliefs, desires, intentions and other propositional attitudes. Here it is claimed that people may occasionally adopt a predictive and explanatory style that differs in interesting ways from the intentional stance. It will be dubbed "folk cognitive science" and, unlike the intentional stance, it involves the decomposition of the AI system into a set of functional modules responsible for certain transformations between input and output representations. Some notional examples of folk-cognitivist accounts of AI systems will be provided. By distinguishing between a folk-psychological and a folk-cognitivist understanding of AI systems, this paper aims to contribute to the development of theoretical frameworks useful for the empirical assessment of how people understand AI systems in their everyday activities.