"Charles Stross - Accelerando" - читать интересную книгу автора (Stross Charles)

"They phoned me." With heavy irony: "It's hard for an upload to stay subsentient these days, even if it's just
a crustacean. Bezier labs have a lot to answer for."
Pamela's face is unreadable. "Bezier labs?"

"They escaped." Manfred shrugs. "It's not their fault. This Bezier dude. Is he by any chance ill?"
"I —" Pamela stops. "I shouldn't be talking about work."
"You're not wearing your chaperone now," he nudges quietly.
She inclines her head. "Yes, he's ill. Some sort of brain tumor they can't hack."
Franklin nods. "That's the trouble with cancer – the ones that are left to worry about are the rare ones. No
cure."
"Well, then." Manfred chugs the remains of his glass of beer. "That explains his interest in uploading.
Judging by the crusties, he's on the right track. I wonder if he's moved on to vertebrates yet?"
"Cats," says Pamela. "He was hoping to trade their uploads to the Pentagon as a new smart bomb guidance
system in lieu of income tax payments. Something about remapping enemy targets to look like mice or birds or
something before feeding it to their sensorium. The old kitten and laser pointer trick."
Manfred stares at her, hard. "That's not very nice. Uploaded cats are a bad idea."
"Thirty-million-dollar tax bills aren't nice either, Manfred. That's lifetime nursing-home care for a hundred
blameless pensioners."
Franklin leans back, sourly amused, keeping out of the crossfire.
"The lobsters are sentient," Manfred persists. "What about those poor kittens? Don't they deserve minimal
rights? How about you? How would you like to wake up a thousand times inside a smart bomb, fooled into thinking
that some Cheyenne Mountain battle computer's target of the hour is your heart's desire? How would you like to
wake up a thousand times, only to die again? Worse: The kittens are probably not going to be allowed to run.
They're too fucking dangerous – they grow up into cats, solitary and highly efficient killing machines. With
intelligence and no socialization they'll be too dangerous to have around. They're prisoners, Pam, raised to sentience
only to discover they're under a permanent death sentence. How fair is that?"
"But they're only uploads." Pamela stares at him. "Software, right? You could reinstantiate them on another
hardware platform, like, say, your Aineko. So the argument about killing them doesn't really apply, does it?"
"So? We're going to be uploading humans in a couple of years. I think we need to take a rain check on the
utilitarian philosophy, before it bites us on the cerebral cortex. Lobsters, kittens, humans – it's a slippery slope."
Franklin clears his throat. "I'll be needing an NDA and various due-diligence statements off you for the
crusty pilot idea," he says to Manfred. "Then I'll have to approach Jim about buying the IP."
"No can do." Manfred leans back and smiles lazily. "I'm not going to be a party to depriving them of their
civil rights. Far as I'm concerned, they're free citizens. Oh, and I patented the whole idea of using lobster-derived AI
autopilots for spacecraft this morning – it's logged all over the place, all rights assigned to the FIF. Either you give
them a contract of employment, or the whole thing's off."
"But they're just software! Software based on fucking lobsters, for God's sake! I'm not even sure they are
sentient — I mean, they're what, a ten-million-neuron network hooked up to a syntax engine and a crappy
knowledge base? What kind of basis for intelligence is that?"
Manfred's finger jabs out: "That's what they'll say about you, Bob. Do it. Do it or don't even think about