There's a well-known rule that if an article has a headline in the form of a question, the answer is NO.
This article isn't really an exception ;-)
Is a human borg-mind inevitable? The answer, I think, is kinda ...
A "borg" mind, as popularized by Star Trek: The Next Generation, is a group of people all controlled by a single collective will, consciousness and memory. It's obviously an extreme invented for entertainment purposes. A more common term is "hive mind", but there are many kinds of hive minds, of which Star Trek's Borg Collective is one particular variety.
What I do think is very likely is that individual humans get sidelined via the emergence of some sort of mindplex ... a term I introduced over a decade ago to indicate a network of minds that has its own emergent consciousness, will, memory and individuality -- yet also allows the individual minds in the network to have these aspects on their own.
"Mindplex" is a very broad concept and encompasses options that are more "society of individuals"-like alongside some that are more borg-like.
Why do I think the dominance of mindplexes over 2015-style human individuals is very likely?
In short, because of: Brain-computer interfacing (BCI), the inefficiency of current means of communication, and the human love of togetherness and socializing.
What could stop mindplexes from becoming dominant? Apart from horrible global calamities, the most likely thing to stand in their way would be the very rapid emergence of advanced AGI, along with the capability for humans to upload and fuse with advanced AGI. If something more advanced than human-based mindplexes emerges before BCI gets to the point of enabling powerful mindplexes, then all bets regarding mindplexes are off....
Let me share a little more of what I've been thinking....
Why Many Businesses Stay Small
Reading Why Information Grows by Cesar Hidalgo, I was pleased with his summary of ideas regarding why companies tend to become less efficient when they expand beyond a certain size. Basically, following prior research of others, he attributes this to the cost of building links (links between people or companies in this case).
In the business world, building links between individuals in different companies is costly, because it requires lots of negotiation, legal overhead, etc. Linking between different individuals in the same company is generally cheaper.
Yet, when a company becomes too big, this is no longer necessarily the case. In a big company, it's often easier for a department to outsource work to an external contractor, than to deal with another department of the same company. This may occur because of complex internal politics, or simply due to the bureaucracy that seems to inevitably spring up when a company grows beyond a certain size.
When the cost of building the internal links needed to get something done exceeds the cost of getting the same thing done using external links, a company may stop growing in size and begin to grow in capability via networking with external entities instead.
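To make that threshold concrete, here is a toy numerical sketch (my own illustration, not a model from Hidalgo's book): suppose the coordination cost of one internal link grows with headcount, because of bureaucracy and internal politics, while the market cost of an external link stays roughly flat. The firm's natural size ceiling is wherever the two curves cross. All the numbers and function names here are invented for illustration.

```python
# Toy model of the "firm size ceiling" argument.
# Costs are in arbitrary integer "hassle units"; all values are assumptions.

def internal_link_cost(n_employees):
    """Cost of one internal link; bureaucracy grows with headcount."""
    return 10 + n_employees

EXTERNAL_LINK_COST = 70  # assumed flat cost of contracting externally

def growth_ceiling():
    """Smallest headcount at which outsourcing beats an internal link."""
    n = 1
    while internal_link_cost(n) <= EXTERNAL_LINK_COST:
        n += 1
    return n

print(growth_ceiling())  # -> 61: beyond this size, external links win
```

With these made-up numbers the firm stops benefiting from internal growth at 61 employees; the point is only the shape of the argument, not the specific figures.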
Further, one thing we see happening in the tech business world now is that the cost of linking between different companies, or between companies and external individual contractors, is getting cheaper. This is the move toward a "gig economy", as it's been called. Cheaper links between organizations will tend to lead to smaller, leaner organizations.
One thing that I kept wondering while reading Hidalgo's book, though, is why our society is dominated by organizations that are glued together by ECONOMIC transactions.
I mean, economic interactions are important, but they are not the only kinds of links between people. There are also emotional and relational links, intellectual links, spiritual links, and so on. Yet it's organizations based on economic links that are currently dominant.
The obvious conclusion is that organizations based on economic links are currently so powerful, in large part, because economic links are so easy to form. More easily formed links tend to lead to bigger organizations, as Hidalgo points out. And bigger organizations, on the whole, have more potential to exert power....
What could change the landscape fundamentally, then, would be if other kinds of links became much easier to form.
Forming links based on, say, friendship or sexual relationship or intellectual interchange or shared goals is currently much more difficult and time-consuming than forming links based on monetary exchange. So, given the current state of things, groups founded on other sorts of exchange are going to be smaller and less able to grow rapidly than groups founded on economic exchange.
I experience this phenomenon quite concretely in my AGI work. It's possible to pull together great contributors for a science or engineering project without paying anyone -- just by recruiting people with a common goal and vision and building a shared feeling and community among them. But in many ways this is MUCH MUCH HARDER than simply hiring and paying people.
Of course, a great community of unpaid contributors can have a self-organizing, self-motivated aspect that MOST groups of employee collaborators won't have. But it's also possible to get great collaboration and enthusiasm among a paid team -- if you hire the right people who gel with each other.... Once a non-monetary link with a project contributor is made, it can have great persistence (or it can evaporate when the person's life situation changes and they suddenly need to spend more time earning an income). But forming non-monetary links just tends to be a lot slower, whereas hiring a contractor is almost instantaneous these days, with sites like Upwork and Elance.
But what if we had, for instance, brain-computer interfacing technology?
Brain Computer Interfacing and Social Self-Organization
What if we had BCI hooked up to allow different people to directly interface with each other's brains?
Real BCI tech will presumably be a lot less picturesque and cool-looking than this, but hey...
What if you could drink from the firehose of somebody else's mind? -- directly suck in their thoughts or feelings? This isn't possible yet, but first steps are being taken in the lab already. Would this sort of multidimensional exchange, manifested more fully, make it easier for networks of people to establish other sorts of mutually valuable relationships beyond "merely" economic ones? I would tend to think so....
And networks of people that cohere together based on deeper forms of exchange -- intellectual, emotional and spiritual -- are likely to be much more effective than networks cohering based on economic exchange. Encoding information about needs, desires and motivations in economic terms is terribly inefficient, really.
Paying an employee to align their goals with one's own has meaningful yet erratic effectiveness. Spelling out one's needs and desires to a subcontractor, in a requirements specification coupled with a legal contract, is always a terrible oversimplification of one's actual needs.
How much better to have a collaborator who really gets one's goals at the deep level, or a subcontracting organization that understands one's requirements at a deep and intuitive level. And these things happen sometimes. But what if they could happen systematically?
For this reason I think BCI will be the death of corporations -- they will simply pale in effectiveness compared to networks of people that self-organize based on deeper kinds of exchange than the economic. But the implications are much broader than this. BCI may also lead -- perhaps quite rapidly -- to the obsolescence of individuals as we know them.
Between Self and Borg, Mindplex
So much of modern culture is focused on exalting the joy and moral value of "coming together": lovers who feel and act as one; parents who give their all for their children; work teams that act in harmony (e.g. agile software teams), thus achieving much more than the sum of their parts. BCI could enhance all these things -- lovers could really be in each other's minds, minimizing misunderstandings; work teams could share thoughts directly, avoiding all sorts of communication bottlenecks....
Most of the use we get out of Internet and computing tech these days is oriented toward communication. With Facebook, SMS, video-chat and all the rest, we ensconce ourselves in interaction with others as richly and constantly as we can. If BCI were rolled out, it would immediately be applied to various forms of brain-to-brain social networking. Sufficient use of this kind of technology will cause brains to adapt physiologically to BCI-powered neurosocial networking.
So -- Will this make us a borg? Not exactly. But it will make us part of something new, a new kind of mindplex, something between present-day notions of individual and society.
Incipit Homo Mindplexicus
A true undifferentiated borg mind is unlikely to be optimal as a problem-solving system, for the same reasons that island models work in genetic algorithms (and why OpenCog's evolutionary program learning component, MOSES, works by evolving distinct "demes" of programs). Given realistic resource constraints, one often gets more innovation by letting different pools of resources evolve somewhat independently. The overall system can then choose the best (by its own explicit or implicit criteria) of what the various somewhat silo'd off subsystems have created or discovered.
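As a rough illustration of why semi-independent demes help, here is a minimal island-model genetic algorithm sketch in Python (a toy of my own construction, not MOSES itself, and all parameters are arbitrary): several populations evolve bitstrings toward an all-ones target independently, with each deme's champion migrating to its neighbour every few generations.

```python
# Minimal island-model GA sketch: independent demes plus periodic migration.
import random

GENOME_LEN = 20    # bits per individual; fitness = number of 1-bits
DEME_SIZE = 30     # individuals per deme
N_DEMES = 4        # number of semi-isolated populations
GENERATIONS = 60
MIGRATE_EVERY = 10 # generations between migration events

def fitness(genome):
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit independently with small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def evolve_deme(deme):
    """One generation within a deme: tournament selection + mutation."""
    new = []
    for _ in range(len(deme)):
        a, b = random.sample(deme, 2)
        parent = a if fitness(a) >= fitness(b) else b
        new.append(mutate(parent))
    return new

def run(seed=0):
    random.seed(seed)
    demes = [[[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(DEME_SIZE)] for _ in range(N_DEMES)]
    for gen in range(1, GENERATIONS + 1):
        demes = [evolve_deme(d) for d in demes]
        if gen % MIGRATE_EVERY == 0:
            # Ring migration: each deme receives its neighbour's champion.
            champs = [max(d, key=fitness) for d in demes]
            for i, d in enumerate(demes):
                d[random.randrange(DEME_SIZE)] = champs[i - 1]
    return max(fitness(max(d, key=fitness)) for d in demes)

print(run())
```

The demes mostly explore on their own, so they don't all collapse onto the same solution; migration then lets the overall system pick up the best of what each silo has found -- the same trade-off the paragraph above describes.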
So one fairly likely-looking possibility is that, after the emergence of powerful BCI: Instead of individuals looking out for their own personal good, and banding together into organizations based crudely on economic exchange -- we will have networks of tightly bound group-minds, interacting based on directly exchanging goals, values and ideas ... and periodically re-shuffling or merging within a broader network of mindplex-like emergent intelligent patterns.
One big question, though, is how this will interrelate with advances in AGI. The same tech that will let us network our minds together will let us execute Google search queries and access calculators and general software programs from within our minds. The same tech will also let us share thoughts with any AGI software that exists at a given point in time.
A Few Plausible Scenarios
For sake of having an interesting discussion, let's assume a positive post-Singularity world where humans have options and choices (see my chapter Toward a Human-Friendly Post-Singularity World in The End of the Beginning for a more detailed discussion of this sort of world; free PDF version here).
Once AGIs are much more cognitively powerful than humans, then any human mindplexes that exist will, just like human minds, need to decide how far they want to fuse with these AGIs. Full-on fusion with AGIs will likely reduce the human component of any individual or mindplex to insignificance relative to the more powerful AGI component.
So various scenarios are possible:
- Advanced AGI comes before advanced BCI. Then the only people who fuse into mindplexes, rather than fusing with AGIs, are ones who value humanity but not individuality.
- Advanced BCI comes before advanced AGI. Then human mindplexes will form, and various whole and partial mindplexes will make their own decisions about fusing with AGIs.
- Advanced AGI and advanced BCI come about at around the same time. Then things really get complexicated!
Yadda yadda ... interesting times ...