The title of this piece was "Is Tribalism a Natural Malfunction?" but that seems an incorrect characterization.
It's
all about trust, which is one of the reasons globalists have a problem
convincing ordinary people to share their grand dreams and visions. Many
of the things globalists have promised turned out not to be true, so people retreat to the population size they feel they can trust.
Can't trust the U.N. after the Oil-for-Food frauds and the Rwandan genocide? Let's try the nation-state.
Can't trust nation-states because one part of the populace cheats or shows itself to be hypocritical?
(And it is at this very point, Orwell's “All animals are equal, but some animals are more equal than others,” that globalists lose the masses.)
Let's try states.
And then city-states, and if you can't trust your fellow metropolitans, we'll go with blood relations: first tribes, and if there are schisms there, immediate family. Consanguinity and all that.
Tribalism isn't a "mal" anything; it's a survival mechanism for when you really, really have to increase the odds that you will be able to trust another person.
From Nautil.us, August 22, 2019:
What computers teach us about getting along.
From an office at Carnegie Mellon, my colleague John Miller and I had evolved a computer program with a taste for genocide. This
was certainly not our intent. We were not scholars of race, or war. We
were interested in the emergence of primitive cooperation. So we built
machines that lived in an imaginary society, and made them play a game
with each other—one known to engender complex social behavior just as
surely as a mushy banana makes fruit flies.
The game is called
Prisoner’s Dilemma. It takes many guises, but it is at heart a story
about two individuals that can choose to cooperate or to cheat. If they
both cheat, they both suffer. If they both cooperate, they both prosper.
But if one tries to cooperate while the other cheats, the cheater
prospers even more.
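For readers who want the payoff structure pinned down, here is a minimal sketch in Python. The specific numbers are illustrative assumptions, not the values used in the authors' simulations; any payoffs in which the lone cheater earns the most, mutual cooperation comes next, mutual cheating next, and the lone cooperator earns the least produce the same dilemma.

    # Illustrative Prisoner's Dilemma payoffs (example values, not the authors').
    # Moves: "C" = cooperate, "D" = cheat (defect).
    PAYOFF = {
        ("C", "C"): 3,  # both cooperate: both prosper
        ("D", "D"): 1,  # both cheat: both suffer
        ("C", "D"): 0,  # I cooperate, the other cheats: I am exploited
        ("D", "C"): 5,  # I cheat, the other cooperates: I prosper even more
    }

    def score(my_move, their_move):
        """Return my payoff for a single round."""
        return PAYOFF[(my_move, their_move)]

The trap is in the ordering 5 > 3 > 1 > 0: whatever the other player does, cheating pays better for you, yet mutual cheating leaves both sides worse off than mutual cooperation.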
The game has a generality that appeals to a
political philosopher, but a rigorous specificity that makes it possible
to guide computer simulations. As a tool for the mathematical study of
human behavior, it is the equivalent of Galileo’s inclined plane, or
Gregor Mendel’s pea plants. Do you join the strike, or sneak across the
picket line? Rein in production to keep prices high, or undercut the
cartel and flood the market? Pull your weight in a study group, or leave
the work to others?
Our
simulation was simple: In a virtual world, decision-making machines
with limited powers of reasoning played the game over and over. We, as
the unforgiving account-keepers, rewarded the ones who prospered and
punished the ones who did not. Successful machines passed their
strategies to the next generation, with the occasional slight variations
designed to imitate the blind distortions typical of cultural
evolution.
We also gave the machines a simple language to think
with and enough resources to have memories and to act on them. Each
generation, paired machines faced each other multiple times. This is how
life appears to us: We encounter our trading partners over and over,
and how we treat them has consequences. Our model for the world was two
Robinson Crusoes encountering each other on the sands.
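What the last two paragraphs describe is, in essence, an evolutionary algorithm over strategies for the repeated game. The sketch below shows the shape of such a loop; the memory-one strategy encoding, population size, match length, mutation rate, and fitness-proportional selection are all simplifying assumptions for illustration (the names are mine), and this encoding is far too small to express the multi-move codes discussed later.

    import random

    MOVES = ("C", "D")
    HISTORIES = [(a, b) for a in MOVES for b in MOVES]
    PAYOFF = {("C", "C"): (3, 3), ("D", "D"): (1, 1),
              ("C", "D"): (0, 5), ("D", "C"): (5, 0)}  # illustrative values

    def random_strategy():
        # A memory-one rule: an opening move plus a response to each
        # (my last move, their last move) pair; a toy stand-in for the
        # richer "language" the authors gave their machines.
        return {"first": random.choice(MOVES),
                **{h: random.choice(MOVES) for h in HISTORIES}}

    def play_match(s1, s2, rounds=20):
        """Iterated play between two strategies; return their total payoffs."""
        m1, m2 = s1["first"], s2["first"]
        t1 = t2 = 0
        for _ in range(rounds):
            p1, p2 = PAYOFF[(m1, m2)]
            t1, t2 = t1 + p1, t2 + p2
            m1, m2 = s1[(m1, m2)], s2[(m2, m1)]
        return t1, t2

    def mutate(strategy, rate=0.05):
        """Copy a strategy with occasional random changes ("blind distortions")."""
        child = dict(strategy)
        for key in child:
            if random.random() < rate:
                child[key] = random.choice(MOVES)
        return child

    def evolve(pop_size=40, generations=500):
        population = [random_strategy() for _ in range(pop_size)]
        for _ in range(generations):
            fitness = [0.0] * pop_size
            # Each generation, paired machines face each other multiple times.
            for i in range(pop_size):
                for j in range(i + 1, pop_size):
                    a, b = play_match(population[i], population[j])
                    fitness[i] += a
                    fitness[j] += b
            # Successful machines pass their strategies on, with slight variation.
            population = [mutate(random.choices(population, weights=fitness)[0])
                          for _ in range(pop_size)]
        return population

A toy loop like this will not necessarily reproduce the authors' results, but it makes the moving parts of the experiment concrete: repeated play, selection on accumulated payoff, and noisy inheritance.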
When we
ran these little societies forward, we expected to confirm what many
believed to be the optimal strategy for playing Prisoner’s Dilemma:
tit-for-tat. A machine playing this strategy begins by keeping its
promises, but retaliates against an instance of cheating by cheating,
once, in return. Tit-for-tat is the playground rule of honor: Treat
others well, unless they give you reason otherwise—and be reasonably
quick to forgive.
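Tit-for-tat is simple enough to state in a few lines; this is the generic textbook rule, not the authors' code.

    def tit_for_tat(their_history):
        """their_history: list of the opponent's past moves ("C" or "D")."""
        if not their_history:
            return "C"            # begin by keeping promises
        return their_history[-1]  # retaliate once per cheat, then forgive

Because it echoes only the opponent's most recent move, a single cooperative gesture is enough to end the retaliation, which is what makes the rule "reasonably quick to forgive."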
Yet when we looked at the output of our
simulations, where the strategies were free to evolve in arbitrary
directions, we saw something very different. After an early, chaotic
period, a single machine would rise rapidly to dominance, taking over
its imaginary world for hundreds of generations until, just as suddenly,
it collapsed, sending the world into a chaos of conflict out of which
the next cycle arose. An archaeologist of such a world would have
encountered thick layers of prosperity alternating with eras of ash and
bone.
Instead of an orderly playground ruled by cautious,
prideful cooperators, the population produced bizarre configurations
that made no sense to us. That is, until one evening in the office, after filling up pads of graph paper, we stumbled onto the truth. The
dominant machines had taken players’ actions to be a code by which they
could recognize when they were faced with copies of themselves.
In the opening moves
of the game, they would tap out a distinct pattern: cooperate, cheat,
cheat, cooperate, cheat, cooperate (for example). If their opponent
responded in exactly the same fashion, cheating when they cheated,
cooperating when they cooperated, they would eventually switch to a
phase of permanent cooperation, rewarding the opponent with the benefits
of action to mutual advantage.
Woe, however, to those who
did not know the code. Any deviation from the expected sequence was
rewarded with total and permanent war. Such a response might take both
machines down, in a kind of digital suicide attack. Because the
sequence was so hard to hit upon by accident, only the descendants of
ruling machines could profit from the post-code era of selfless
cooperation. All others were killed off, including those using the
tit-for-tat strategy. This domination would last until enough errors
accumulated in the code handed down between generations for dominant
machines to stop recognizing each other. Then, they would turn against
each other as viciously as they once turned against outsiders, in a kind
of population-level autoimmune disease.
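Putting the last two paragraphs together, a Shibboleth-style strategy of the kind the simulations evolved can be sketched roughly as follows. The opening sequence is the example given above; treating any deviation as grounds for permanent defection is an assumption drawn from the description, not a transcript of the evolved machines, and the names in the sketch are mine.

    # The example opening code from the text: cooperate, cheat, cheat,
    # cooperate, cheat, cooperate.
    SHIBBOLETH = ["C", "D", "D", "C", "D", "C"]

    def shibboleth_player(my_history, their_history):
        """Choose the next move given both players' past moves."""
        turn = len(my_history)
        # If the opponent has ever strayed from the code, wage permanent war.
        checked = min(turn, len(SHIBBOLETH))
        if their_history[:checked] != SHIBBOLETH[:checked]:
            return "D"
        # Still tapping out the code ourselves.
        if turn < len(SHIBBOLETH):
            return SHIBBOLETH[turn]
        # Both sides knew the code: permanent cooperation from here on.
        return "C"

Note that the punishment is unconditional: a mismatch drags both machines into mutual defection, the "digital suicide attack" described above, and a single copying error in the inherited sequence is enough to make two descendants of the same ruling lineage treat each other as outsiders.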
As long as the codes lasted, we called them Shibboleths, after the
tribal genocide recounted in the Old Testament Book of Judges:
And the Gileadites took the passages of Jordan before the Ephraimites: and it was so, that when those Ephraimites which were escaped said, Let me go over; that the men of Gilead said unto him, Art
thou an Ephraimite? If he said, Nay; / Then said they unto him, Say now
Shibboleth: and he said Sibboleth: for he could not frame to pronounce it
right. Then they took him, and slew him at the passages of Jordan: and
there fell at that time of the Ephraimites forty and two thousand.
Shibboleths are a common feature of human culture and conflict. Finns who could not pronounce yksi (meaning
“one”) were identified as Russians during the Finnish Civil War.
Tourists in downtown Manhattan quickly out themselves if they pronounce
Houston Street like the city in Texas.
Here our machines had used
them to dominate a population so effectively that no others could
survive. Even after the era was over, it was their descendants that
inherited the ashes. The blind hand of evolution had found a simple, if
vicious, solution....
....
MUCH MORE