4 Comments

Neural Foundry

Your parallel processing metaphor is really compelling. The idea that we need distributed subroutines to tackle hyperobjects makes a lot of sense, especially when you consider how the brain operates. I've been thinking a lot about how decentralized systems might actually mirror natural intelligence, but hadn't connected it to this planetary wisdom architecture you're describing. Do you think we're at risk of creating a system that's too complex for us to even understand, or is that maybe the whole point?

Sterlin

Hi Neural!

100% I do. Chardin referred to this ramping complexity as "complexification." In The Phenomenon of Man, he also said we would "evolve" to become more fully human as a result of this process, a development he called hominization. The idea is that we will come to a better general understanding of what is happening, but by the same token, we will not be able to control its development fully. This is the Omega Point: the stage at which we lose comprehension of the predicament and merge with the infinite. In a way, we already do not really understand what is happening with tech acceleration. We're just in for the ride. Nonetheless, we can actively and purposely build this parallel mind. It is a tangible and achievable goal for the entire species.

Parallel Citizen

Intriguing idea: only by stepping outside of existing minds and structures can we posit alternative, and possibly competing, structures that could save us from ourselves.

Sterlin

Absolutely, primarily because we can activate new governance structures and dissolve many of the zero-sum game dynamics that legacy systems operate on.