Which basically means the brain doesn’t think the way the brain thinks the brain thinks.
Aside from the horrible phrasing, it's quite funny to think about, leading us to:
My brain thinks it is funny to think about how the brain doesn’t think the way the brain thinks the brain thinks.
In the same way that atoms don’t exactly work the way the atoms think the atoms work since, as Niels Bohr said, "A physicist is just an atom's way of looking at itself."
Reminds me of some comment I heard somewhere, about how the brain and the body don’t come with an instruction manual, and how it makes sense from an evolutionary perspective: we so far apparently haven’t needed to know how they work to use them well enough to survive.
I was annoyed by the phrasing of the submission, which could have been "The way the brain thinks is counter-intuitive" (don't assume I don't know what you are going to tell me, or, conversely, that I spent even one second thinking about how the brain thinks), but then we probably would not have had the pleasure of reading your comment with this title.
You are speaking about the editorialized title "Mental Phenomena Don’t Map Into the Brain as Expected", right? Well, yes, now that you mention it, I would have titled a bug report exactly this way.
For reference, the original title is "The Brain Doesn’t Think the Way You Think It Does"; that is the one I reacted to.
For me it is more mundane: there was a time when people thought of human health in terms of the four elements. [Much] later, germ theory was discovered, which uses more useful concepts.
Using "memory" and "perception" to describe how the brain works is like using the four elements to describe how our bodies function.
Except the four element theory was rendered irrelevant by the germ theory. I doubt that understanding the brain will render memory and perception irrelevant.
Robots literally have memory and perception (we know because we built them), so clearly these are real phenomena that exist in the real world.
It seems to me unlikely that we are totally mistaken in conceptualizing ourselves in terms of these known real phenomena.
My intuition is that it isn't totally mistaken, but rather that it is a massive simplification.
We seem to describe the brain as having "memory", as if that were something statically stored and simply "retrieved" (somehow). It seems to me that this overlooks that the device doing this is also running an entire virtual model of simulated reality. It can not only remember things, but replay them, change variables and play them again, play them in reverse, manufacture completely fictional scenarios, run (as its default) a custom modified variation of "actual" reality that it finds more pleasing (this one has gotten lots of news coverage in the last few years), read the contents of realities running inside other minds (tens of millions if it so chooses), see into the future of "actual" reality, and all sorts of other things.
"Brains have memory" is a bit of an understatement of what they actually do.
> "Brains have memory" is a bit of an understatement of what they actually do.
...but, I don't think anyone thinks that "brains have memory" is a complete statement of what they actually do. It's not a complete statement of what computers do either.
> ...but, I don't think anyone thinks that "brains have memory" is a complete statement of what they actually do.
Perhaps. This topic is a bit of a hobby for me, and I haven't really encountered much discussion that explicitly gets into the fine details and distinctions of the matter with respect to the human mind, as opposed to the plentiful discussion on the matter when it comes to computers.
A noteworthy distinction between the two is that computers are man made (such discussions are a necessary component of the development process), whereas the mind is made by ~nature, and so requires no such discussion.
If you can point me to any in-depth discussions on this distinction, or even any keywords that one might google, I would appreciate it.
Four element theory is on approximately the same epistemological plane as modern personality psychology. In some ways it's more useful than psychology, because it has a straightforward logic in terms of metaphor and relationships between parts, whereas psychology is a mess. Of course, it has very little to do with how the brain works, but psychology also has very little to do with how the brain works (it's more about the use of language to describe and regulate behavior), so it's not a big deal.
Sure. But let’s take the average computer user’s analogy from another comment [1].
A random, average smartphone user can use their device every day and accomplish a great number of things without knowing how it actually works at all.
For all they care, it could be powered by tiny fairies running in wheels like hamsters, and the Internet and cell networks could be irascible spirits connecting all smartphones together through a worldwide mycelium network.
Knowing how smartphones and computers work is not necessary for them to accomplish what they set out to do.
In the same way, if we actually needed a deeper understanding of our own inner workings to gain a noticeable survival edge, evolution would probably have taken care of that, just as it has endowed nearly all of us with at least a basic survival instinct: by eliminating from the gene pool most individuals who don’t feel any need to eat food or drink water.
Newton's theory of gravity and mechanics did explain things, just not as accurately as general and special relativity do. But we still use Newton's formulas today as a good-enough approximation in many cases.
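To make the "good enough approximation" point concrete (a standard textbook illustration, not something from the comment itself): in the weak-field, slow-motion limit, the time-time component of the spacetime metric in general relativity reduces to a small perturbation around flat space, and that perturbation is exactly the Newtonian potential:

```latex
% Weak-field limit of general relativity:
% the metric perturbation encodes the Newtonian potential \Phi
g_{00} \approx -\left(1 + \frac{2\Phi}{c^2}\right),
\qquad
\nabla^2 \Phi = 4\pi G \rho
% For a point mass M this recovers Newton's law:
\quad\Longrightarrow\quad
\Phi(r) = -\frac{GM}{r},
\qquad
F = \frac{GMm}{r^2}
```

So Newton's formula isn't "wrong" so much as the leading-order term of the deeper theory, which is presumably the sense in which "memory" and "perception" could survive a deeper theory of the brain.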
I have a feeling concepts like memory and perception will still be used even when we figure out the equivalent of general relativity for the brain, especially since they are higher-level summary phenomena, like personality, or like higher-level languages vs. asm.
For me a key insight is that it's not necessarily possible for a brain to understand itself. Obviously there's a certain "capability floor" that a brain needs to have in order to understand a system of some complexity. A rat's brain is capable of understanding some systems, but it's definitely not sufficiently capable to understand a rat's brain. A human brain can understand various systems that are too complex for rats, but it's also finite in its capabilities, and it's not totally certain that the inherent complexity of how human brains work is small enough to be understandable by human brains; perhaps it would take something more (e.g. brains augmented with tech, or biologically enhanced, or something totally alien) to properly understand how our brains work.
I see this a lot, but I don't think it's accurate. It's definitely not possible for a brain to completely understand itself, as in know the state of every neuron and how all of those states will evolve over time, but that absolutely doesn't preclude a very complete understanding.
The more we learn about the brain, the more it seems to be very friendly to a certain degree of abstraction. If it weren't, it wouldn't be able to turn sense data into world models so well!
Hmm, but perhaps we could manage to understand the brain and behavior of a fruit fly or even a mouse. The "self-conscious" tier is trivial recursion. Hofstadter has this right in "I Am a Strange Loop".