The study of eudaimonic community sizes began with a seemingly silly method of calculation: Robin Dunbar calculated the correlation between the (logs of the) relative volume of the neocortex and observed group size in primates, then extended the graph outward to get the group size for a primate with a human-sized neocortex. You immediately ask, "How much of the variance in primate group size can you explain like that, anyway?" and the answer is 76% of the variance among 36 primate genera, which is respectable. Dunbar came up with a group size of 148. Rounded to 150, and with the confidence interval of 100 to 230 tossed out the window, this became known as "Dunbar's Number".
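To make the mechanics concrete, here is a minimal sketch of that kind of extrapolation: fit a line to log group size against log relative neocortex volume across genera, then read off the prediction at the human value. The data points and the human neocortex ratio below are placeholders purely for illustration, not Dunbar's actual dataset.

```python
# Sketch of a log-log regression extrapolation in the style described above.
# The numbers are illustrative placeholders, NOT Dunbar's primate data.
import numpy as np

# hypothetical (relative neocortex volume, observed group size) pairs for a few genera
neocortex_ratio = np.array([1.2, 1.7, 2.1, 2.6, 3.1, 3.5])
group_size      = np.array([5.0, 12.0, 20.0, 35.0, 55.0, 80.0])

# least-squares fit in log-log space: log(group size) = a * log(ratio) + b
a, b = np.polyfit(np.log(neocortex_ratio), np.log(group_size), 1)

human_ratio = 4.1  # assumed human neocortex ratio, for illustration only
predicted_group = np.exp(a * np.log(human_ratio) + b)
print(f"extrapolated human group size ~ {predicted_group:.0f}")
```

The point of the sketch is just the shape of the argument: a regression over other primates, extended past the data to a human-sized neocortex, with all the uncertainty that extrapolation implies.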
It's probably fair to say that a literal interpretation of this number is more or less bogus.
There was a bit more to it than that, of course. Dunbar went looking for corroborative evidence from studies of corporations, hunter-gatherer tribes, and utopian communities. Hutterite farming communities, for example, had a rule that they must split at 150 - with the rationale explicitly given that it was impossible to control behavior through peer pressure beyond that point.
But 30-50 would be a typical size for a cohesive hunter-gatherer band; 150 is more the size of a cultural lineage of related bands. Life With Alacrity has an excellent series on Dunbar's Number which exhibits e.g. a histogram of Ultima Online guild sizes - with the peak at 60, not 150. LWA also cites further research by PARC's Yee and Ducheneaut showing that maximum internal cohesiveness, measured in the interconnectedness of group members, occurs at a World of Warcraft guild size of 50. (Stop laughing; you can get much more detailed data on organizational dynamics if it all happens inside a computer server.)
And Dunbar himself did another regression and found that a community of 150 primates would have to spend 43% of its time on social grooming, which Dunbar interpreted as suggesting that 150 was an upper bound reached only when groups were strongly incentivized to stay together, rather than an optimum. 150 people does sound like a lot of employees for a tight-knit startup, doesn't it?
Also from Life With Alacrity:
A group of 3 is often unstable, with one person feeling left out, or else one person controlling the others by being the "split" vote. A group of 4 often devolves into two pairs... At 5 to 8 people, you can have a meeting where everyone can speak out about what the entire group is doing, and everyone feels highly empowered. However, at 9 to 12 people this begins to break down -- not enough "attention" is given to everyone and meetings risk becoming either too noisy, too boring, too long, or some combination thereof.
As you grow past 12 or so employees, you must start specializing and having departments and direct reports; however, you are not quite large enough for this to be efficient, and thus much employee time that you put toward management tasks is wasted. Only as you approach and pass 25 people does having simple departments and managers begin to work again...
I've already noted the next chasm when you go beyond 80 people, which I think is the point that Dunbar's Number actually marks for a non-survival oriented group. Even at this lower point, the noise level created by required socialization becomes an issue, and filtering becomes essential. As you approach 150 this begins to be unmanageable...
LWA suggests that community satisfaction has two peaks, one at size ~7 for simple groups, and one at ~60 for complex groups; and that any community has to fragment, one way or another, by the time it approaches Dunbar's Number.
One of the primary principles of evolutionary psychology is that "Our modern skulls house a stone age mind" (saith Tooby and Cosmides). You can interpret all sorts of angst as the friction of a stone age mind rubbing against a modern world that isn't like the hunter-gatherer environment the brain evolved to handle.
We may not directly interact with most of the other six billion people in the world, but we still live in a world much larger than Dunbar's Number.
Or to say it with appropriate generality: taking our current brain size and mind design as the input, we live in a world much larger than Dunbar's Function for minds of our type.
Consider some of the consequences:
If you work in a large company, you probably don't know your tribal chief on any personal level, and may not even be able to get access to him. For every rule within your company, you may not know the person who decided on that rule, and have no realistic way to talk to them about the effects of that rule on you. Large amounts of the organizational structure of your life are beyond your ability to control, or even talk about with the controllers; directives that have major effects on you, may be handed down from a level you can't reach.
If you live in a large country, you probably don't know your President or Prime Minister on a personal level, and may not even be able to get a few hours' chat; you live under laws and regulations that you didn't make, and you can't talk to the people who made them.
This is a non-ancestral condition. Even children, while they may live under the dictatorial rule of their parents, can at least personally meet and talk to their tyrants. You could expect this unnatural (that is, non-EEA) condition to create some amount of anomie.
Though it's a side issue, what's even more... interesting... is the way that our brains simply haven't updated to their diminished power in a super-Dunbarian world. We just go on debating politics, feverishly applying our valuable brain time to finding better ways to run the world, with just the same fervent intensity that would be appropriate if we were in a small tribe where we could persuade people to change things.
If people don't like being part of large organizations and countries, why do they stick around? Because of another non-ancestral condition - you can't just gather your more sensible friends, leave the band, and gather nuts and berries somewhere else. If I had to cite two non-regulatory barriers at work, it would be (a) the cost of capital equipment, and (b) the surrounding web of contacts and contracts - a web of installed relationships not easily duplicated by a new company.
I suspect that this is a major part of where the stereotype of Technology as the Machine Death-Force comes from - that along with the professional specialization and the expensive tools, you end up in social structures over which you have much less control. Some of the fear of creating a powerful AI "even if Friendly" may come from that stereotypical anomie - that you're creating a stronger Machine Death-Force to regulate your life.
But we already live in a world, right now, where people are less in control of their social destinies than they would be in a hunter-gatherer band, because it's harder to talk to the tribal chief or (if that fails) leave unpleasant restrictions and start your own country. There is an opportunity for progress here.
Another problem with our oversized world is the illusion of increased competition. There's that famous survey which showed that Harvard students would rather make $50,000 if their peers were making $25,000 than make $100,000 if their peers were receiving $200,000 - and worse, they weren't necessarily wrong about what would make them happy. With a fixed income, you're unhappier at the low end of a high-class neighborhood than the high end of a middle-class neighborhood.
But in a "neighborhood" the size of Earth - well, you're actually quite unlikely to run into either Bill Gates or Angelina Jolie on any given day. But the media relentlessly bombards you with stories about the interesting people who are much richer than you or much more attractive, as if they actually constituted a large fraction of the world. (This is a combination of biased availability, and a difficulty in discounting tiny fractions.)
Now you could say that our hedonic relativism is one of the least pleasant aspects of human nature. And I might agree with you about that. But I tend to think that deep changes of brain design and emotional architecture should be taken slowly, and so it makes sense to look at the environment too.
If you lived in a world the size of a hunter-gatherer band, then it would be easier to find something important at which to be the best - or do something that genuinely struck you as important, without becoming lost in a vast crowd of others with similar ideas.
The eudaimonic size of a community as a function of the component minds' intelligence might be given by the degree to which those minds find it natural to specialize - the number of different professions that you can excel at, without having to invent professions just to excel at. Being the best at Go is one thing, if many people know about Go and play it. Being the best at "playing tennis using a football" is easier to achieve, but it also seems a tad... artificial.
Call a specialization "natural" if it will arise without an oversupply of potential entrants. Newton could specialize in "physics", but today it would not be possible to specialize in "physics" - even if you were the only potential physicist in the world, you couldn't achieve expertise in all the physics known to modern-day humanity. You'd have to pick, say, quantum field theory, or some particular approach to QFT. But not QFT over left-handed bibble-braids with cherries on top; that's what happens when there are a thousand other workers in your field and everyone is desperate for some way to differentiate themselves.
When you look at it that way, then there must be many more than 50 natural specializations in the modern world - but still far fewer than six billion. By the same logic as the original Dunbar's Number, if there are so many different professional specialties that no one person has heard of them all, then you won't know who to consult about any given topic.
But if people keep getting smarter and learning more - expanding the number of relationships they can track, maintaining them more efficiently - and naturally specializing further as more knowledge is discovered and we become able to conceptualize more complex areas of study - and if the population growth rate stays under the rate of increase of Dunbar's Function - then eventually there could be a single community of sentients, and it really would be a single community.