There were sound, seemingly base-reality metamathematically convincing and inescapable reasons for believing that all concerned in this ongoing debate about simulational ethics were genuinely at the most basic level of reality, the one that definitely wasn’t running as a virtual construct on somebody else’s substrate, but — if these mooted super-beings had been quite extraordinarily clever and devious — such seemingly reliable and reassuring signs might all just be part of the illusion.
There was also the Argument of Increasing Decency, which basically held that cruelty was linked to stupidity and that the link between intelligence, imagination, empathy and good-behaviour-as-it-was-generally-understood — i.e. not being cruel to others — was as profound as these matters ever got. This strongly implied that beings capable of setting up a virtuality so convincing, so devious, so detailed that it was capable of fooling entities as smart as — say — Culture Minds must be so shatteringly, intoxicatingly clever they pretty much had to be decent, agreeable and highly moral types themselves. (So; much like Culture Minds, then, except more so.)
But that too might be part of the set-up, and the clear positive correlation between beings of greater intellectual capacity taking over from lesser ones — while still respecting their rights, of course — and the gradual diminution of violence and suffering over civilisationally significant periods of time might also be the result of a trick.
A bit, after some adjustments for scale, like the trick of seeding another society with the ideas for a holy book that appeared to tell the truth on several levels but which was basically just part of an experiment, the Contents May Differ thought, as it reviewed the results of the latest sim runs.
The sims it was setting up and letting run were all trying to answer the relatively simple question, How much difference will it make if the Gzilt find out the Book of Truth is a fake?
And the answer appeared to be: Who the fuck knows?
Once you started to think that the only way to model a population accurately would be to read the individual mind-states of every single person within the real thing — something even more immoral than it was impractical — it was probably time to try another approach entirely.
As a good, decent, caring and responsible Culture Mind, the Contents May Differ would never run a sim of the Gzilt people at the individual level to find out anyway, even if it could have, and — apart from anything else — had decided some time ago that even resorting to such desperate measures wouldn’t solve anything in any case. Because there were two Problems: the Simming Problem and the Chaos Problem.
The Chaos Problem meant that in certain situations you could run as many simulations as you liked, and each would produce a meaningful result, but taken as a whole there would be no discernible pattern to them, and so no lesson to be drawn or obvious course laid out to pursue; it would all depend so exquisitely on exactly how you had chosen to tweak the initial conditions at the start of each run that, taken together, they would add up to nothing more useful than the realisation that This Is A Tricky One.
The real result, the one that mattered, out there in reality, would almost certainly very closely resemble one of your simulated results, but there would have been no way at any stage of the process to have determined exactly or even approximately which one, and that rendered the whole enterprise almost entirely futile; you ended up having to use other, much less reliable methods to work out what was going to happen.
These included using one’s own vast intelligence, pooled with the equally vast intelligences of one’s peers, to access the summed total of galactic history and analyse, compare and contrast the current situation relative to similar ones from the past. Given the sort of clear, untrammelled, devastatingly powerful thinking AIs and particularly Minds were capable of, this could be a formidably accurate and — compared to every other method available — relatively reliable strategy. Its official title was Constructive Historical Integrative Analysis.
In the end, though, there was another name the Minds used, amongst themselves, for this technique, which was Just Guessing.
The mount was called Yoawin. It was old and in no particular hurry, though still strong and tireless. Well, as tireless as Tefwe needed it to be; she got weary and saddle-sore before it started showing signs of complaint.
Tefwe had chosen an old-fashioned saddle: tall and unwieldy if you were planning on performing any fancy stuff, but comfortable. Comfortable for the aphore as well as her; you always had to think of your mount. They’d had intelligent animals for hire at the stables in Chyan’tya, too; ones you could talk to, both amended bio and what were effectively walking, talking slightly dumb drones made to look biological. She guessed talkative people might hire those. Maybe people who were so talkative they couldn’t persuade other humans to ride with them. But a talking mount had always struck Tefwe as taking things a little too far. Aphores were quite smart enough, and sufficiently companionable to provide a sort of silent friendship.
They’d arrived in the middle of the night, on this side of the Orbital. She left at first light, riding out through the quiet town. There was some sort of festival happening during the day and some flower garlands, stretched across the street leading out of town towards the hills, had sagged with the dew during the night; she had to lift them out of the way to get underneath. One pale blue flower, loosened, started to fall. She caught it, sniffed it, stuck it in her hair, rode on.
The town was much as she’d remembered. It sat like a rough brush stroke along one side of the Snake river, cliffed on the shifting sands of tawny and grey-pink that marked the desert edge; a fragrant oasis of bell-blossom and strandle flower, even-cluss and jodenberry, the low, flowing buildings half submerged by their own orchards and groves.
Across the river, past some stunted, half-hearted dunes and the silted-up entrance to a long-dry oxbow lake, the brush and scrub of the low prairie began. The few scattered bushes looked like an after-thought to the land: quick, light scribbles of brittle-dry vegetation, prone to fires that in the right wind could move so fast you were better turning to face their heat and running straight through, because you’d never outrun them.
The river was very low; just a trickle at the bottom of the crack-dried muds and in-flowed spills of sand like fanned ramps. High season. The rains would come in a third of a year from now, falling in the Bulkheads, which were so far off that even in the clearest weather you could strain your eyes for ever and you’d never see them, night or day, not this far down in the thick of air.
A few tens of days after the rains started falling in the high lands of the Honn-Eynimorm Bulkhead Range, the river would swell, generally pushing a plugged mess of old leaves, scrawny twigs, gnarled branches, stripped tree trunks and the bleached hides and bones of dead animals before it, like a moving barricade of half-forgotten decrepitude and death.
She rode out across the Pouch, the bay of desert and patchy set-sand that lay between the river and the town on one side and the hills on the other. A pair of raptors wheeled high up, following her for half the morning, then found something else to watch, kilometres off anti-spinwards. She lost sight of them in the building heat of the cloudless day.