bombs in bottles
Yesterday I was gardening and listening to Ed Zitron's interview with Karen Hao, author of EMPIRE OF AI:
Better Offline: Empires of AI by Karen Hao
At one point, Hao describes how the people working on generative AI start to buy into the idea that everything is computable. Everything can be emulated in terms of a computer model.
"Everything" here includes human minds. Ergo, they decide, if we throw enough resources at making a computer generate outputs that sound like thinking, we have made a computer think, just as a human mind does. Everything is computable; therefore human minds are computable; therefore if we work at it long enough, we can make a computer that is indistinguishable in any meaningful sense from a human mind.
Meanwhile, I've reached a point where I can scarcely discuss my own cognition without resorting to plant metaphors: gardening, growth, compost, terroir. My lived experience of my own cognition is that the mind becomes more complex with time, not less. When I attempt to describe my mind (which is, of course, the only human mind I can attempt to describe), I lean not into circuits but ecosystems.
"Brains are computable" versus "brains are ecosystems" is not, of course, an "argument" I can "win." Nobody reasoned themselves into their respective metaphors here, including me, and none of us can reason our way out of them.
I don't make this observation to "prove" anything. It's only that I find the image of cognition-as-computation less useful as I become more familiar with how my own mind participates in, and responds to, the ever-changing sum of my experiences.
The longer I live, the more I experience my mind less as an algorithm and more as a compost pile. Everything feeds it; it turns over slowly; what comes out is typically fertile, often surprising, and always suited to the moment of its growth, whether or not it was fit for anything when it went in.
Can we train an algorithm to do the same thing if we feed it enough plagiarized information?
It's not a question I'm equipped to answer. The more I ponder it, the less I am convinced the answer even interests me. My own mind is sufficiently interesting to occupy me for the rest of my human lifetime - which is all the time I need concern myself with, in any case.
[Sidebar: Even if it's true that "if we throw enough resources at the problem we can make computers be brains," brains are brains from birth, if not before. No one needs to burn $9 billion to produce a baby or even raise it to adulthood. "It needs to think like a human" is a solved problem, mates.]