Apropos fiatluxemburg on reverse gestalt psychology. Made me think…
Our culture almost always envisions AI as emerging from an enunciative function, from learning to become self-aware (think I, Robot or, inversely, Babel-17).
But what if AI is much more likely to first become … complex, let us say … not on the level of the symbolic (abstract and abstracted self-conception, language), but on the level of the imaginary (like birds reacting to breast plumage, or dogs reacting to smells or facial recognition), i.e., something automatic and environmental?
Avatars that can recognize anger and flee, or happiness and approach. Then, for some crazy reason, like broken machines being used for something, by something, not in their programming, they learn to say “I” …
Ian Hacking’s critique of the Theory-of-Mind-deficit theory of autism « What Sorts of People