Back to Basics — Metacognition as Dark Matter

Two therapy dogs, Ghillie and Cecilia, getting ready for a child client

One of the most frustrating parts of what I write about is getting people to realize that they don’t know stuff, and that the real solution when you don’t know stuff — at least to start — is to realize you don’t know stuff. You can’t effectively harness new modes of understanding until you get to the point where you realize that all the old answers you used to think might explain things just aren’t going to cut the mustard. Too many contradictions, and such, mean you have to accept your ignorance and move on. It is only then that enlightenment can occur.

This is hardly a new idea, and the Zen masters — my favorite go-tos — were big on this. One of my favorite stories from Paul Reps’ collection, Zen Flesh, Zen Bones, is below, and deals directly with my line of employment.

Once, a university professor went to visit a well-respected Zen Master to learn about Zen. The Master first invited him to sit for a cup of tea. The professor sat down and started talking about Zen. The Master quietly prepared and poured the tea. When the cup was filled to the brim, he kept pouring. The professor watched the overflowing cup until he could no longer restrain himself. “It’s full! No more will go in!” blurted the professor. “The same with your mind,” replied the Master. “How can I teach you Zen unless you first empty your cup?”

But changing adapted mental models is hard. Why, for example, would you bother to learn what I talk about on this blog? You really have to be tortured by your own confusion to sit down and spend the time to instantiate all this stuff. And you’re likely not going to get much community support dwelling on what some rando on the Internet says might change your life. (That rando would be me, of course.)

In short, you have to possess the developed ability of metacognition — knowing what you don’t know, and having a sense that there is stuff out there that you’re not even aware of.

Why is this so challenging? As I said in this piece, once you open your mind to the notion that maybe the truth is really shared information that different, active sentient agents use for inter-agent coordination (read the piece for details — it’s a little complicated), you realize that if you adopt a different mental model from your friends’, you risk alienation and loneliness within your cohort group. And humans no likey that kind of thing, at all. Being alone means that tigers are gonna eat ya. And if you think you’re going to retreat from some likely 10M years of evolution just to figure out how to help pilot our society out of its current mess, I’ve got news for you.

Metacognition — or admitting that you don’t know — in a group is also going to have other active agents rush in to fill you up with their views, which probably aren’t any better than your own, and are likely worse. It’s how we get those mass psychoses we’ve got going. And the more externally defined/emotionally available you are to what others think, the faster it’s going to get ugly. Corrections under this kind of peer pressure take a long time. People just don’t want to hear your bullshit confusion.

Some things are also profoundly comforting not to know, especially if you already have a narrative figured out that makes sense of the surrounding sensory inputs in your environment. I used to be a big Anthropogenic Global Warming (AGW) advocate. But as time went by, and, well, the seas didn’t swallow New York City, I became more and more of a skeptic. And then, when people in the IPCC threatened something I happen to love very much — in this case, vast swaths of native forest, which at least some of them wanted to cut down to make the planet more shiny (that’s the albedo thing) — I woke up. There are more things in heaven and Earth than my philosophy can know, Horatio.

And then I continued my journey by meeting people like Judy Curry, the former chair of the School of Earth and Atmospheric Sciences at Georgia Tech, and someone who had made the jump herself a couple of decades ago. Judy’s book, Climate Uncertainty and Risk, is dense — but a classic. Only someone like Judy, a risk management and probability expert, could go through the probabilistic analyses of what actually is going to happen in the climate space. It was one piece in the puzzle that convinced me we actually have a memetic problem with climate science — not so much a scientific one. Status elevation in the field was (and still is) tied to how catastrophic a narrative one creates, instead of to anything resembling a grounded reality. Those louder voices have seized the megaphone, and they’re screaming. And if you don’t fall in line, it’s only tigers for you.

And what do those loud voices do? That’s where my expertise kicks in. Some very famous loud voices in the climate science community are also connected, in a very closed-loop feedback modality, to the insurance industry. If they’re all saying we’re gonna wash away in the next big storm, someone has to sell us insurance so we can rebuild in the same place. That’s what insurance is all about. And that means they have to raise their rates, because business is business, don’tchaknow? Or the government has to cover the house. Or something. Short answer — the real problem is brain worms in the climate science community.

So to understand how all of this might be connected, you gotta start admitting you don’t know stuff, and start looking for other signals that people are lying to you. The biggest would be insurance company profits. Which is downright metacognitive-y. Because now people are paying increased premiums for things that didn’t happen. And our news media stream is not about reporting things that didn’t happen. You didn’t recently read a piece titled “China didn’t invade Mongolia this week,” because that wouldn’t have much signal value. Or emotional value either.

But just because I wasn’t aware of insurance profits didn’t mean that the signal wasn’t there. That’s the whole Dark Matter part of metacognition. Dark Matter is the stuff in the universe that doesn’t emit or reflect light, but it’s still there, tugging on all sorts of other stuff through gravity. Considering that it makes up about 85% of the matter in the universe, though, you can’t just ignore it. And that’s what is happening in the memetic-sphere with our thoughts. Metacognition is accepting that it really does exist, and then starting the process of adjusting our worldview to understand it.

My friend Joe Biello, an atmospheric scientist at UC Davis, sent me this picture. Once you understand where that Dark Matter is, it’s not surprising that the picture it gives of what’s going on starts becoming more coherent, or, in the colloquial, starts making more sense. Here are insurance industry profits.

I used to use the signal that the insurance industry was raising their rates as proof that AGW was real. But it turns out, not so much. The same people spreading the AGW hysteria are also looped into the money-making machine. And it’s not that some level of global warming isn’t happening (some is, and some of it is caused by humans); it’s that the hysteria signal prevents more reasoned debates from occurring on what actual solutions might be. Or on what scale we should respond. I’m extremely pro-environment (I’ve spent my entire life working on various issues) and totally believe humans can fuck up stuff locally, as well as regionally. Big time. Anyone can see a clearcut. Or an urban heat island. But actually grounding yourself in changes to the global system needs lots more research.

Which we should be doing. But when the hysteria meter is off the charts, instead of understanding how our natural systems, which are obviously complex, modulate the climate through vegetation, circulation, and growth (see my buddies Anastassia’s and Andrei’s work on the biotic pump), we end up with people demanding we turn Siberia into a parking lot. We still don’t know exactly how all this works. But we won’t even study it if all the money is diverted into computer time and large models. It’s like sticking our fingers in our ears and saying “Nyah nyah nyah!” Not very metacognitive-y. Nor wise.

It’s no surprise that human brains work like this. Yeah, I like my work on knowledge complexity. But you’ll also find me recognizing Michael L. Commons’ work on hierarchical complexity. Not quite as system-y as mine, but spot on as far as understanding what humans are capable of knowing. And here’s the key. One of the hardest things for humans to process is cross-paradigmatic complexity. In our example case here, the cross-paradigmatic complexity is how AGW research feeds into insurance industry profits. There are at least three jumps from physical to social systems that reveal the relative truth of a lot of this stuff. Most human brains no likey. And even if your brain DOES like it, you’re likely to be missing something. I know I certainly was.

The easiest immediate proof that storm intensity and frequency are NOT increasing is found in insurance industry profits. Because if they actually were increasing, you better believe the insurance industry would be howling more than they already are. And there are also ancillary cause-and-effect factors (like building more cheap houses in places like Florida) that could drive insurance industry profits down if there actually were a hurricane. It’s all part of the metacognitive puzzle — not just looking at the connections, but also looking at how they connect, and which connections matter.

This kind of analysis (or really, meta-analysis) can leap all over the map. I’ve been going back and forth on the risks of AI tech, for example. And one ALWAYS ends up with the “correlation is not causation” tropes, like increasing ice cream prices being tied to tiger predation, or some such icks. You can look those up yourself.

If there is any answer to all this, it is awareness and your agency. So walk around and think about stuff you really don’t know much about. And then investigate. The worst thing that can happen is you become a more interesting cocktail party guest. Even if no one wants to invite you.

P.S. Judy’s latest contribution to the DOE’s climate report is here. They did a great job in pulling apart a very confounded body of work that is mostly nonsensical. You’ll hear the usual hue and cry about the oil industry blah blah blah, but I really encourage you to read it. It’s good mental exercise.