If A.I. Only Had a Heart: Why Artificial Intelligence Research Needs to Take Emotions More Seriously

https://doi.org/10.1142/S2705078521500120

Artificial Emotional Intelligence research has focused on emotions in a limited "black box" sense, concerned only with emotions as inputs and outputs for the system, disregarding the processes and structures that constitute the emotion itself. We're teaching machines to act as if they can feel emotions without giving them the capacity to actually feel them.

Serious moral and social problems will arise if we stick with the black-box approach. As A.I.s become more integrated into our lives, humans will require more than mere emulation of emotion; we'll need them to have "the real thing." Moral psychology suggests emotions are necessary for moral reasoning and moral behavior. Socially, the role of "affective computing" foreshadows the intimate ways humans will expect emotional reciprocity from their machines.

Three objections are considered and responded to: that giving machines genuine emotions is (1) not possible, (2) not necessary, and (3) too dangerous.