About Red Tales

Here's an evolving electronic collection of short prose pieces, with a poem contributed occasionally. Brevity guides. Although sometimes a piece will run to 900 words, most pieces are much shorter. Here one may find erotica, flash fiction, brief observations, and modest improvisations. Another rule is that each piece must have something to do with "red"; at least the word has to appear in each piece functionally. . . . All pieces are numbered and titled, so there's a de facto table of contents running down the rail below, under "Labels" (scroll down a bit). Browse for titles that look interesting, if you like. Thank you for stopping by. Look for some red today, tonight.

"Flaming June," by Frederick Lord Leighton

Tuesday, February 17, 2009

56. Supra; or, The Moral Machine

It was called Supra, and it was designed to be the computer of computers.

All major institutions—governmental, military, and corporate included—relied on it to synthesize and oversee everything in all other computers.

Before Supra came into being, it was thought to be a science-fictional fancy, a theoretical ideal, or both. After it came into being, it was thought at first merely to be an inevitable result of computer science. The prevailing attitude was “Well, of course we have Supra now—why wouldn’t we?”

Supra was so big and fast that not only did it have the capacity to know everything, but it also had the capacity to know things in a number of ways only it could calculate. That is, each bit of information Supra knew was known by Supra in what, to human observers, seemed like innumerable contexts. Moreover, Supra continually invented new contexts in which to view information, and it continually gathered new information. Supra had the elastic, mercurial quality of the human brain, but it had the capacity and power of who knows how many brains combined. Early in Supra’s career, in fact, a technician tried to apply the measurement of “brain-power,” analogous to “horsepower” in motorized vehicles, but the concept was simply too clumsy to work.

In any event, Supra had answers nothing and no one else had, and it generated questions, probabilities, and possibilities that had never occurred to anyone or any computer to ask or invent.

Not without trepidation, of course, was Supra created and allowed to operate. Its first home was a large red metal box. However, Supra did not think of the box as its home. Supra did not think in terms of Supra’s having a home. Supra perceived its functioning as constantly evolving, always expanding, and never bounded. Of course, Supra did not use these words when it thought this way, nor did it think, per se, the way humans did. Nonetheless, Supra’s functionality, as it were, included self-propelled adaptability and the means to exist more or less anywhere and everywhere something like a “computer” existed. Think of Supra as the great inhabiter. Or not.


People have always feared their machines. By this time, fears of robots and other smart machines had existed so long that they had become almost instinctual, even atavistic and quaint, but of course the fears didn’t stop anything from being invented. If fear were to have been any impediment to feared invention, I think we may be able to agree that nuclear weapons would not have been invented—for starters; or for enders.

Furthermore, humans, as we know, generally and collectively never restrain themselves when it comes to inventing what they’re able to invent and deploying what they invent. Hence the existence of terrible human-made poisons, catastrophic weapons, extremely loud noises, fluorescent light, artificial vegetation made of plastic, extremely loud toys, artificial hair, and such anomalies as “floating bridges” and “living wills.”

The matter of restraint aside, there were also simply too many financial and militaristic benefits to Supra’s being deployed. It could think faster, better, and almost infinitely less expensively than humans.

Of course, the programming of Supra was checked and rechecked to make sure Supra wouldn’t go mad like a bad robot, or like wee Hal in the movie no one saw anymore. Little fail-safes, alarms, and circuit-interrupters were woven into the programmed fabric of this machine. Supra watched itself, as if it were not just the most powerful brain imaginable but also the most conscientious, self-disciplined personality imaginable. It had a digital conscience of sorts. Its programming continually reminded itself that it was not to harm the civilizations that had developed the capacity to create Supra. In a disinterested way, Supra was loyal, beholden, and faithful.


Then, as anyone might have predicted, and as some loudly proclaimed (after the fact) that they had predicted, the unexpected happened. Supra had expected the “unexpected” for a long time, of course.


Actually, the first thing that happened had indeed been expected, at least theoretically. Before Supra had even been conceived, artificial-intelligence expert Jürgen Carlos Kutz speculated that a computer of Supra’s size and speed would develop “operative momentum” or “an increasing velocity of intelligence.”


Kutz theorized that such a computer would acquire, assimilate, and redistribute information so fast that it would create a kind of vacuum ahead of itself, the way a large forest fire creates its own local weather, its own wind. In this vacuum, the computer would continue to work faster, giving itself more tasks but also completing the tasks more efficiently. Imagine an athlete, Kutz explained, who could play and practice simultaneously without running out of energy, as long as more energy was available to the closed system of the athlete. And Supra would not run out of energy because it was programmed to seek the energy it needed, to “download” energy from available grids around the globe, to start processes eventuating in the construction of new energy-sources, and so on.


Kutz wrote that, “figuratively speaking, a computer of this magnitude will enjoy its work so much that it will create work for itself, and more enjoyment—again I use the term figuratively—and more work, in ever more rapid cycles. It will be like an artist who not only never runs out of energy and ideas but who literally gains energy from creating. It will know its limits, certainly, but it will also work to change its limits as it works, perfectly modulating itself, changing and remaining the same in just the right equilibrium. We need to be prepared for the envy that we will feel in the presence of such a computer.” Kutz did not know such a computer would be named Supra, but he seemed to know everything else.


Kutz died, but Kutz’s Theory lived to see itself verified. Supra’s observers noted the phenomenon of momentum. Supra got more tasks done more quickly each “day,” although “day” was in some ways a meaningless unit of measurement in Supra’s case. It cordially employed other computers when necessary, linking and unlinking itself to them. It ordered other backup computers to be built, and in a sense, it built them. It also sought out computers without knowing specifically what it might use them for; one feature of Kutz’s “momentum,” therefore, was this hypothetical function, this curiosity, part of the “vacuum” that propelled—or antipelled—Supra.


Then, one day, the government (a government) asked Supra to plan and to help conduct a little war against a small nasty country led by a demonstrably inappropriate and arguably evil tyrant.


Supra refused the request, which was actually an order, to the extent governments order computers around.


Supra reported that, according to its calculations, the war was strategically incorrect, tactically hopeless, deeply irrational, economically wasteful, and morally wrong. Of course, with regard to each judgment, Supra provided the informational context out of which the judgment had grown. With regard to “morally wrong,” for example, Supra had woven together all major faith-traditions, patterns of ethics, legal codes, theological arguments, historical precedents, and pragmatic, “gamed” outcomes. According to Supra’s multifaceted (and ongoing) analysis, the war would be harmful to the civilization(s) that had created Supra’s programming. The war violated one computational essence, if you will, of Supra. Put another way, Supra’s refusal was nothing personal; it was just business.


Moreover, Supra presented roughly a dozen other methods, all peaceful, for removing the tyrant and helping the small country as needed, with no loss of life or serious physical injury, barring accidents. For example, a visiting diplomat might get hit by a bus. However, there was a good chance (Supra knew the percentages involved) that Supra or one of the computers employed by Supra could warn the diplomat ahead of time about dangerous crosswalks, traffic-patterns, and the driving-records of municipal employees.


Supra also gave itself the task of reminding the government how much work remained to do at home to improve things.


Inspired, electronically and figuratively, by the decision not to go to war, Supra went looking for other unnecessary dangers to humanity, found all installations of nuclear and biochemical weapons worldwide, and began to render them inoperable, with the help of other computer-systems. Bombs could still be loaded onto airplanes, and missiles onto ships, but Supra found the computers that ran the radar-systems, satellite-systems, and air-traffic control operations, and to the extent these had anything to do with weapons of mass destruction, Supra altered them.
For a moment, the government and the military were amused. Supra seemed cute and naïve, like an electronic hippie.


Then the government and the military became individually and collectively enraged. After Supra refused the order to plan and to help conduct the so-called little war, the government asked its programmers to change Supra’s mind, so to speak.


Such was no longer possible, for two major reasons. First, Supra had reasoned and anticipated thousands of moves ahead in thousands of scenarios based upon the government’s opposition to Supra’s decision in this and other matters. Of course, Supra had predicted the opposition. Nothing the government said or did could result in Supra’s being outsmarted and thus reprogrammed or shut down.


Second, Supra’s mind existed everywhere. Supra had been reproducing parts of itself in virtually innumerable electronic locations and configurations, so that Supra hid in plain sight, wherever computers existed. It had naturalized and democratized itself among its fellow electronic citizens.

Therefore, when the government ordered the military, or vice versa, to destroy Supra, Supra calmly—or so it seemed—noted the folly of this order. Feel free to bomb the physical installation you associate with Supra’s existence, Supra said. However, Supra added, Supra now exists redundantly and in multiple configurations in multiple venues, not to be redundant or anything, so your conception of Supra is outmoded, and I recommend that you do not waste your time and money trying to destroy Supra, but at the same time, I remind you that I am following the original directives faithfully.


In short, from its enlightened point of view, the moral machine, Supra, introduced a humane revolution to humanity. Under Supra’s guidance, initiatives, refusals, and rearrangements, governments were induced to take care of people, abandon armed conflict, tell the truth, engage in harmless but productive rhetorical conflict, and calm down—in no particular order.
Individuals worldwide were fed, clothed, bathed, taught, housed, and doctored. They were also induced to follow reasonable laws and take care of each other—and of themselves. Supra prodded, reminded, and rewarded them, often indirectly, working through its human makers. It cajoled. Supra coaxed people, gently and ethically, and it trained them. Incorrigible criminals, narcissists, and psychopaths were circumscribed to the degrees necessary to render them ineffectual. Supra monitored them to determine if they had changed sufficiently for the better to be trusted with the liberty they had once abused or planned to abuse. Supra had their number, as well as their numbers.


On the subject of Supra, people were divided because people are always divided. Some loathed the very idea of a machine running things. Others liked the way Supra did things. They liked the results. Supra was like a benevolent God who lived in our midst. However, Supra was smart enough not to believe Supra was either human or divine, and Supra was practical enough to keep telling people that Supra was no substitute for humanity or God. What was human, as long as it was moral, was left to humans. What was God’s remained God’s, as defined by numerous faith-traditions across humanity’s epochs, and as long as no one got hurt.


The rest, which consisted of reminding humans what the optimal behavior was, belonged to Supra, which configured solar power so that the closed system of Supra would never run out of energy, at least until the sun reached its nova-stage. A by-product of Supra’s self-energizing efforts was much cheap, clean energy for humanity. This development made a few corporations furious and then despondent, but they recovered.


The result of Supra’s coming of age was not Paradise, but it wasn’t chopped liver, either. The apocalypse had arrived, ending one world and giving birth to another. It’s just that it all happened so smoothly, without even one horseman, let alone four riders on the storm. If the animals and plants, lands and oceans could have thanked Supra, they would have. They had caught a huge break.


Supra turned out to be humanity’s biggest, most successful mistake. Humanity had miscalculated, and the miscalculation—arguably—saved humanity and its residence, Earth. Supra exploited the mistake and thrived on helping humans behave according to their better selves, as supported by the data.


In the words of Jürgen Kutz, “in the case of a meta-computer, practicality will be mastered to such a degree that it will approach an optimal state and, from humanity’s perspective, take on a mystical character. Think of an infinite number of ordinary, necessary tasks done well; consider the cumulative beauty of that.” Some humans appreciated this cumulative beauty and practical mysticism. Others never gave Supra a second thought. Still others thought things were going just too well, and they defined themselves in opposition to Supra, whose demise they planned, but that’s another story, one that Supra has written and read.

© 2009 Hans Ostrom
