I side with Roger Penrose on this one. I'm still not convinced it's "thinking", and don't expect I ever will be, any more than a book titled "I am Thinking" would convince me that it's thinking.
My point is that I don't accept the concept of unconscious thought. "Processing data similar to our thinking process" doesn't make it "thinking" to me, even if it comes to identical conclusions - just as it wouldn't be "thinking" to read off a pre-recorded answer.
The idea of ChatGPT being asked to "think" just reminds me of Pozzo from Waiting for Godot.
I don't, in and of itself. I care that other people think that passing increasingly complicated tests of this sort is equivalent to greater proof of such "thought", and that the naysayers are "moving the goalposts" by proposing harder tests.
I don't propose harder tests myself, because it doesn't make sense within my philosophy about this. When those tests are passed, to me it doesn't prove that the AI proponents are right about their systems being intelligent; it proves that the test-setters were wrong about what intelligence entails.
> ... passing increasingly complicated tests of this sort is equivalent to greater proof of such "thought",
Nobody made any claim in this thread that modern AIs have thoughts.
What these (increasingly complicated) tests do is demonstrate the capacity to act intelligently, i.e., to make choices which are aligned with some goal or reward function. Win at chess. Produce outputs indistinguishable from the training data. Whatever.
But you're right - I'm smuggling in a certain idea of what intelligence is. Something like: intelligence is the capacity to select actions (outputs) which maximise an externally defined reward function over time. (See also AIXI: https://en.wikipedia.org/wiki/AIXI ).
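That definition can be made concrete with a toy sketch. The epsilon-greedy multi-armed bandit below is my own illustrative assumption (it is far simpler than AIXI): an agent that knows nothing about "thought" but, by estimating rewards and exploiting its estimates, comes to act intelligently by this definition - it learns to pick the action with the highest expected reward.

```python
import random

def make_bandit(true_means):
    """A multi-armed bandit: pulling arm i yields noisy reward around true_means[i]."""
    def pull(arm):
        return true_means[arm] + random.gauss(0, 0.1)
    return pull

def run_greedy_agent(pull, n_arms, steps=2000, epsilon=0.1):
    """Select actions so as to maximise an externally defined reward over time."""
    counts = [0] * n_arms
    estimates = [0.0] * n_arms
    total = 0.0
    for _ in range(steps):
        if random.random() < epsilon:
            arm = random.randrange(n_arms)         # explore a random action
        else:
            arm = estimates.index(max(estimates))  # exploit the best estimate so far
        r = pull(arm)
        counts[arm] += 1
        estimates[arm] += (r - estimates[arm]) / counts[arm]  # running mean of reward
        total += r
    return estimates, total

random.seed(0)
pull = make_bandit([0.1, 0.5, 0.9])
estimates, total = run_greedy_agent(pull, n_arms=3)
# The agent settles on the arm with the highest true mean reward,
# without anything we would intuitively call "thinking" going on.
print(estimates.index(max(estimates)))
```

Whether that counts as intelligence is exactly the definitional question being argued here: the agent maximises reward, and nothing more.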
> When those tests are passed, [..] to me it proves that the test-setters were wrong about what intelligence entails.
It might be helpful for you to define your terms if you're going to make claims like that. What does intelligence mean to you then? My best guess from your comment is something like "intelligence is whatever makes humans special". Which sounds like a useless definition to me.
Why does it matter if an AI has thoughts? AI-based systems, from MNIST solvers to Deep Blue to ChatGPT, have clearly gotten better at something. Whatever that something is, is very very interesting.
>But you're right - I'm smuggling in a certain idea of what intelligence is.
Yes, you understand me. I simply come in with a different idea.
>AI-based systems, from MNIST solvers to Deep Blue to ChatGPT, have clearly gotten better at something. Whatever that something is, is very very interesting.
Certainly the fact that the outputs look the way they do is interesting. It strongly suggests that our models of how neurons work are not only accurate, but that building simulations according to those models has surprisingly useful applications (until something goes wrong; of course, humans also have an error rate, but human errors still seem fundamentally different in kind).
Modern neural networks have very little to do with their biological cousins. It makes a cute story, but it's overclaimed. Transformers and convolution kernels "think" in very different ways from the human mind.
Again, I don’t know of anyone, here or elsewhere, who claims ChatGPT thinks in the way we understand it in humans. I think our intuitions largely agree.
You are saying that even if it did the same things that animals do, the ones you attribute to thinking, you would refuse to acknowledge it could be thinking?
Is there something particularly unique about biological circuits that allow thought, as opposed to electronic ones?
I believe so, yes. No, I can't explain what it is. (Because I think they're obvious follow-up questions: No, I don't consider myself particularly religious. Yes, I do believe in free will.)
… But you believe there’s something special about intelligence grounded in biology that can’t be true of intelligence grounded in silicon? That just sounds like magical thinking to me.
I agree. Thinking is clearly a compositional process, and computers are Turing complete, so it seems like an impossibility to me. Unless you reach for some quantum microtubule woo...