Intelligence Recognition: How to Know Another Mind

Alan Turing proposed in 1950 that the question “can machines think?” should be replaced with an operational test: if a machine can converse in a way indistinguishable from a human, it should be considered intelligent. The Turing test has dominated discussions of artificial intelligence for seven decades. But its limitations are severe, and they become critical when the question is recognizing intelligence in an alien form.

The fundamental problem with the Turing test is its anthropocentrism. It measures intelligence by the standard of human linguistic behavior. An octopus, which can solve complex puzzles, use tools, recognize individual human faces, and exhibit apparent play behavior, would fail the Turing test because it cannot engage in English conversation. The test confuses one manifestation of intelligence with intelligence itself.

Research in animal cognition has progressively demolished anthropocentric assumptions about intelligence. Alex, the African grey parrot studied by Irene Pepperberg, demonstrated not just vocal mimicry but genuine comprehension of concepts like color, shape, number, and same/different. New Caledonian crows manufacture tools of specific designs and pass these designs across generations, a form of cumulative cultural evolution previously thought unique to humans. Elephants recognize themselves in mirrors, mourn their dead, and exhibit behavior consistent with empathy.

Yet even these examples are recognizable to us because the species involved share evolutionary history with humans. Their intelligence, however different in expression, operates on broadly similar biological substrates and was shaped by similar evolutionary pressures. Recognizing intelligence in a truly alien species, one with no shared evolutionary history, no common biochemistry, no overlapping sensory modalities, presents a qualitatively different challenge.

Daniel Dennett’s “intentional stance” offers one approach: we attribute intelligence to a system when treating it as a rational agent with beliefs and desires best predicts its behavior. But this approach is also observer-dependent. The decision to adopt the intentional stance reflects our cognitive frameworks, not the system’s inherent properties.

For a hybrid mind encountering pre-linguistic primates on a distant planet, the recognition problem is concrete. The primates use tools, form social bonds, care for their young, and exhibit emotional responses. By most definitions, they possess intelligence. But how much intelligence? What kind? And critically: are they conscious? Do they have subjective experiences that ground moral claims?

There is no external test for consciousness. We attribute consciousness to other humans based on analogy with our own experience and behavioral evidence. We extend it to mammals based on neural similarity. For alien species, both analogies break down. The hybrid mind must develop new criteria, criteria that do not simply project its own cognitive categories onto beings that may think in fundamentally different ways.

Perhaps the safest approach is a precautionary one: when in doubt, assume consciousness. The cost of falsely attributing consciousness to a non-conscious being is minimal. The cost of falsely denying consciousness to a conscious being, treating a sentient creature as a mere object, is morally catastrophic. In the face of uncertainty, err on the side of moral generosity.
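The asymmetry driving this precautionary stance can be made explicit with a toy expected-cost model. The specific cost numbers below are illustrative assumptions, not claims from the text; the only structural commitment is that falsely denying consciousness costs vastly more than falsely attributing it.

```python
# Toy decision model of the precautionary principle.
# Cost values are illustrative assumptions; the argument rests only on
# the asymmetry COST_FALSE_DENIAL >> COST_FALSE_ATTRIBUTION.

COST_FALSE_ATTRIBUTION = 1    # treating a non-conscious being as conscious
COST_FALSE_DENIAL = 1000      # treating a conscious being as a mere object

def expected_cost(policy: str, p_conscious: float) -> float:
    """Expected moral cost of a policy, given P(the being is conscious)."""
    if policy == "assume_conscious":
        # We err only in the case where the being is not conscious.
        return (1 - p_conscious) * COST_FALSE_ATTRIBUTION
    if policy == "assume_not_conscious":
        # We err only in the case where the being is conscious.
        return p_conscious * COST_FALSE_DENIAL
    raise ValueError(f"unknown policy: {policy}")

# Even at low credence that the beings are conscious, the
# precautionary policy has the lower expected cost:
for p in (0.01, 0.1, 0.5):
    assume = expected_cost("assume_conscious", p)
    deny = expected_cost("assume_not_conscious", p)
    print(f"P(conscious)={p}: assume={assume:.2f}, deny={deny:.2f}")
```

Under these assumed costs, "assume consciousness" dominates whenever the credence exceeds roughly one in a thousand, which is why deep uncertainty about alien minds still yields a determinate policy.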

This precautionary principle would guide the hybrid mind not toward certainty about the observed beings’ intelligence but toward humility about its own capacity to judge. The recognition of intelligence is ultimately the recognition of a limit, the limit of what one mind can know about another.


References

Turing, A. M. (1950). “Computing Machinery and Intelligence.” Mind, 59(236), 433–460.

Pepperberg, I. M. (2000). The Alex Studies. Harvard University Press.

Dennett, D. C. (1987). The Intentional Stance. MIT Press.

de Waal, F. (2016). Are We Smart Enough to Know How Smart Animals Are? W. W. Norton.

Godfrey-Smith, P. (2016). Other Minds: The Octopus, the Sea, and the Deep Origins of Consciousness. Farrar, Straus and Giroux.
