It is a truism in evolutionary biology that conditional responses require more genetic complexity than unconditional responses. Developing a fur coat in response to cold weather requires more genetic complexity than developing a fur coat whether or not there is cold weather, because in the former case you also have to develop cold-weather sensors and wire them up to the fur coat.

But this can lead to Lamarckian delusions: Look, I put the organism in a cold environment, and poof, it develops a fur coat! Genes? What genes? It's the cold that does it, obviously.
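A minimal sketch of that asymmetry, in code rather than genes (the classes, the threshold, and the environment dictionary below are all hypothetical, invented purely for illustration):

```python
class UnconditionalOrganism:
    """Grows a fur coat no matter what: one fixed response, no sensors."""

    def develop(self):
        return "fur coat"


class ConditionalOrganism:
    """Grows a fur coat only when it is cold. This takes the coat-growing
    machinery PLUS a temperature sensor PLUS the wiring between them."""

    COLD_THRESHOLD_C = 5.0  # hypothetical trigger point

    def sense_temperature(self, environment):
        # Extra machinery #1: the cold-weather sensor.
        return environment["temperature_c"]

    def develop(self, environment):
        # Extra machinery #2: the wiring from sensor to response.
        if self.sense_temperature(environment) < self.COLD_THRESHOLD_C:
            return "fur coat"
        return "no fur coat"


arctic = {"temperature_c": -20.0}
print(ConditionalOrganism().develop(arctic))  # "fur coat"
print(UnconditionalOrganism().develop())      # "fur coat", cold or not
```

Drop the conditional organism into a cold environment and the coat appears as if the cold alone produced it; but the sensor and the `if` were there all along, paid for in advance.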

This fallacy underlies a form of anthropomorphism in which people expect that, as a universal rule, particular stimuli applied to any mind-in-general will produce some particular response - for example, that if you punch an AI in the nose, it will get angry. Humans are programmed with that particular conditional response, but not all possible minds would be.

Similarly: You've seen apples, touched apples, picked them up and held them, bought them for money, cut them into slices, eaten the slices and tasted them. Your eyes can see the raw pixels of an apple, and your visual cortex processes them into a 3D shape, which your temporal lobe recognizes as an apple similar (despite differences of shape and color and angle of vision) to other apples in your experience. Your motor cortex and cerebellum can move your fingers to pick up the apple.

That is all the complex machinery your brain has for apples; and you can pull the lever on that complex machinery just by saying "apple" to a fellow English-speaking human, since the two of you have both learned to use the sound "ap-ple" to activate all that complicated machinery.

But if that machinery isn't there - if you're writing "apple" as five ASCII letters inside a so-called AI's so-called knowledge base, when the AI can't recognize apples or do anything with apples - then the word is just a lever detached from its underlying machinery.
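To make the detachment concrete, here is a toy sketch (the `KnowledgeBase` class is hypothetical, standing in for no real system): the word goes in, and the only thing wired to it is a string-membership test.

```python
class KnowledgeBase:
    """A so-called knowledge base that stores bare tokens: a detached lever."""

    def __init__(self):
        self.facts = set()

    def add(self, token):
        self.facts.add(token)  # five ASCII characters go in...

    def knows(self, token):
        return token in self.facts  # ...and a membership test comes out


kb = KnowledgeBase()
kb.add("apple")
print(kb.knows("apple"))  # True -- yet the system still cannot recognize an
                          # apple in an image, pick one up, or predict its taste
```

In the human case, the sound "ap-ple" is wired into recognizers, motor routines, and taste memories; here the same five letters are wired into nothing, so pulling the lever moves nothing.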
