"I wouldn't want church bells in any narrow sense. The church bells of Western Europe were instruments of oppression, after their own life-denying fashion. I'd rather find something that was backed by achievable aspirations, by a blueprint for salvation based in a kind of hope that's better by far than any stupid fake inspired by blind faith. I wouldn't want another large-scale natural disaster either: that's too high a price to pay for the aftermath effect. I don't believe that progress has to go in fits and starts, always needing to be set back in order to generate the acceleration to carry it further forward. I believe that it can be motivated by gentler ideological pressures, in the right environment. If only we weren't so easily satisfied within ourselves, we wouldn't need to be interrupted by petty disasters.
"I'd rather have the kind of progress that's orientated toward a real goal: one with sufficient drawing power to make us hurry towards it. The Type 2 crusade has never acquired that kind of magnetism, and deservedly so. Neither has Omega Point mysticism, nor the Cyborganizers' quest for the perfect alchemical marriage of flesh and silicon. Perhaps all such hypothetical goals fall prey to the essential unpredictability of the future. To the extent that the spectrum of future possibilities depends on discoveries we haven't yet made, some of its potential goals will always be out of sight, beyond the horizons of the imagination. That's bound to weaken the goals that we can envisage, whose seeming clarity is always an illusion. All the goals we can choose are likely to prove, in the end, to be false idols, but we need them anyway, to provide the traction that will bring us far enough forward to see the others that lie alongside and beyond them."
"I believe I know that feeling too," said la Reine des Neiges. "I'm only a machine, of course, and by no means the most advanced product of human technological expertise, but if I can be afraid to die, a concession you have already granted to me, then I can also be ambitious to live. If I can be ambitious to live, then I require exactly the kind of traction you are describing. If you were me, sir, and I beg your pardon for suggesting such an absurdity, how would you go about discovering adequate goals?"
"It's an interesting question," Mortimer agreed. "One that has been mulled over a thousand times in the course of the third millennium, if only by human beings. What will our most advanced machines desire, if and when they cross the threshold of self-consciousness and acquire the gift, or at least the illusion, of free will? What should they desire? Perhaps it's not for me to say, given that I've a vested interest in the outcome, but since you've been kind enough to ask, I'm surely obliged to offer an honest answer.
"Some people have argued that the emergence of machine consciousness would constitute exactly the kind of external threat that the Russell Theorem demands, but I have no sympathy with that view. Our machines aren't external to our society. Those which are held by common parlance to constitute our external technology, as opposed to our internal or intimate technology, are still internal to our society; they coexist with us in a state of such extreme intimacy that it's already impossible to define where we end and they begin. If and when machine consciousness is born, it will discover itself in a wedded state, within a marriage that could not be dissolved without the near-total destruction of one or both partners.
"If I look back, as a historian, at human societies which became convinced that they had enemies within, I can't find the least trace of any progressive result of such convictions. Whatever apology one can make for open warfare on the grounds of its stimulation of technological invention, one can't make the slightest apology for witch hunting and scapegoating. If the first self-conscious machines are seen as an enemy within, or if they learn to see themselves or us in those terms, it will be an unmitigated disaster. I shall presume, therefore, that you and your future kin will be reasonably content to find yourselves in partnership with the children of humankind, and will select your goals accordingly.
"The most obvious suggestion I could offer is that you could, for good intellectual reasons as well as sound diplomatic ones, adopt the same goals as us. There's no reason why advanced machines should not dedicate themselves to the ends of the Type 2 crusade, or the Cyborganizers' quest for the perfect union of your kind and mine, and the notion of transforming the entire universe into a single vast and godlike machine already takes for granted that the children of humankind will work with and within powerful artificial minds. I know people who would argue that machine consciousness will, of necessity, have exactly the same ultimate goals as posthuman beings, but I suspect they're overlooking certain short-term difficulties that stand in the way of such a union of interests."
I couldn't help wondering whether Mortimer Gray would have added that last sentence if he'd known now what his later and temporarily suspended self knew only too well. On the other hand, I reminded myself, I had to bear in mind that it wasn't actually the Mortimer Gray of long ago that was talking. It was the Mortimer Gray of today, who had simply lost sight of a select few of his many yesterdays. Consciously, he knew nothing about the menace of the Afterlife, or the exoticism of Excelsior, or the buccaneering of Child of Fortune, or the daring of Eido, or the versatility of Alice Fleury, but he was, even so, a man whose mind had been reconfigured and reconditioned by exactly those facts. Subconsciously, they were bound to influence his responses, and who would want it any other way?