Empathic inference

From Lesswrongwiki

Empathic inference is everyday, commonsense mind-reading. It is an inference about another person's mental states made using your own brain as a reference: by making your brain feel or think the way the other person does, you can emulate their mental state and predict their reactions. This method is far less costly than modeling the other brain as a physical system and calculating its expected reaction. It can even be used in science; in cognitive psychology, for example, it is standard procedure to use a panel of reviewers to ascribe emotional states and expected reactions to other individuals. However, it does not reveal the underlying mechanisms behind those reactions and emotional states[1].

Empathic inference is often wrongly used to predict the behavior of non-human agents, a manifestation of anthropomorphism. In mythology and religion it is commonly applied to forces of nature[2]: attributing intentionality to hurricanes, for instance, by calling them a manifestation of God's wrath. Today, AGI is often subject to the same bias: some people assume a superintelligent AGI would necessarily be benevolent or malignant, based on "putting themselves in the AI's shoes". Putting oneself in an AI's shoes cannot be used to evaluate AI behavior, because an AI will not have human-like motivations unless they are explicitly programmed in[3].

Blog Posts

See Also