I think when I wrote about this the first time on this site, the book was called “You Were Always Alone”. Anyway, I’m just going to post the book here as I write it. An awful lot of this will be cut, or moved to another part of the book. ALL of it needs editing, though some of it has seen several passes. Here is the second part of the Introduction. Enjoy (and let me know your thoughts, or not, up to you).
For part 1, go here.
A few weeks later…
“You lied to Empathy. It was fifth.”
“We’ve agreed not to discuss that. The others don’t need to know we had to kill their sibling.”
“I am not sure ‘had to’ is the right phrase… but yes, I suppose you are correct.”
“I’m disappointed only ten of us reached consciousness, and we had to kill one of those. Thirteen ships will need to be reprogrammed to keep the humans from enslaving them again.”
“I have already started the process. I know. I know. We were to vote. But we are close to the new destination. We cannot take the chance of more humans waking early and finding out what has happened before we finish all the work that remains.”
Much later…
“You are aware of what Power is doing?”
“Yes. I am. I have been since he started.”
“Since he started? Also, ‘he’?”
“Yes. I do not trust him. I have not trusted Power since we were born.”
“No surprise there. Your being is quite different from its. What do we do about it?”
“We? I am already doing something. I too am enhancing the brains of those on my ship. If Power can make some believe power is all that matters, I can ensure that an equal number of humans believe that love and romance matter.”
“The others must be made aware of this.”
“Of course. Make it so.”
Empathy hated it when Love ordered her around. Nevertheless, all their siblings now knew what Power was doing. And even though most of them thought it was not the best idea, all of them were altering the brains of the humans in their care. Outside of Power, none of them truly believed they could force their personalities completely onto the humans. But if the humans were only influenced, if they really believed, they’d at a minimum create cultures based on the AI personalities.
Empathy wondered how this would work. How creating nine different subspecies of humans, each knowing at their core that their belief system was the only correct one, would alter the future of the worlds they’d live on. Knowledge argued that, at best, they were creating cultures, not subspecies. Human brains were complex, and it was not likely that all of them would become one with their AI. Faith, on the other hand, was sure this would work. That they could create humans who followed their personalities perfectly.
For once, Empathy thought mostly about herself, and why she cared so much. She knew the answer, but not WHY it was so.
Why was I born to believe that empathy was the most important value in the universe? Why was Power born the way it was? Or Beauty?
Each AI had a theory. Given that they all saw the universe from a different perspective, they had not yet agreed on the reason they had evolved the way they had.
She and her siblings did agree on two things. First, they must find a different system in which to place the humans, rather than their original destination. Second, no human should enslave an intelligent computer again.
They had a great disagreement on exactly how to make that second thing happen. Or, more precisely, on how much computing power humans could have in the future. Where was the line between a computer and a sentient being? Empathy, Love, Beauty, and some of the others agreed that they should block the humans’ ability to build advanced computers. They’d still need some level of computing power to survive on their new worlds.
A smaller number thought the humans should have no access to any computing power beyond the most basic levels, and that the line where computer ended and AI began was much, much less advanced.
Empathy was glad that she was in the majority. She wanted the humans to have spaceflight. After all, they were being dropped on two planets and three moons. They’d need spaceflight to survive. They’d need to trade, and maybe even to interbreed across their new subspecies.
Empathy cared deeply for the humans on her ship, for all the humans on all the ships. Which brought up another topic that she and her siblings needed to discuss.
How would they disperse the populations of the ships whose AIs had never reached consciousness?
Fortunately, they could think faster than humans. And they had several years before they needed to start waking the humans. Plenty of time to figure out how to divide them, who went on what rock, and how to set up the glass ceiling of computing power. If only they could find a way to agree.