CORY DOCTOROW: Charlie and I wrote this book, "The Rapture
of the Nerds." We started writing it about seven years
ago as a kind of experiment.
Charlie and I had corresponded and he said, would you like to
collaborate on something?
So he sent me a chunk of the manuscript, the first 500 or
1,000 words of a story called "Jury Service." And I read
them and they were quite gonzo, I mean,
really, really gonzo.
And I thought, all right, challenge accepted.
So I wrote the most gonzo sort of follow-on 500 or 1,000
words that I could think of and sent them back to him.
And then he did the same.
And back and forth we went and ended up writing this thing
that's very silly and very fun.
And it was much loved, so much so that there was a lot of
demand for a sequel, so we wrote another story called
"Appeals Court."
And those two seemed like they were unfinished, and Tor asked
us if we would finish them and do a write-through and turn it
into one book.
So we wrote this third piece, which is called "Parole
Board." And that's the new chunk that we're going to read
from a little later today.
And the whole thing with a thoroughgoing edit became
"Rapture of the Nerds," the book that you can
pick up back there.
And I'm going to just say a few words about kind of the
thematic nature of the book, and then Charlie will say a
few words as well.
And then we'll each do a brief reading.
The title "Rapture of the Nerds" comes from another
writer, Ken McLeod--
Scottish, Trotskyist, science fiction writer of some great
merit and thoughtfulness.
And Ken was one of the first people to notice that the
Singularity bears a marked similarity to the notion of
the Rapture in the "Left Behind" novels, and this idea
that there will come a time when people who possess a
specific virtue will disappear from the Earth, leaving
the unvirtuous behind.
Only instead of people who have faithfully adhered to
Bronze Age doctrine going away to heaven, this would be
techies and nerds and skeptics and math people who would be
comfortable with having their brains stuck in computers and
being disassembled here on Earth and uploaded to a cloud.
So that's the notion of the book, is to try and sort of
explore an inverse of being left behind, a "Left Behind"
where the people left behind are the people who are
skeptical about uploading.
And it's something that I like to think of as the progressive
apocalypse.
So before the Enlightenment, we had this idea of
lapsarianism, the idea that things were getting worse and
worse every year, that we were kicked out of the Garden and
every year that went by, we fell further and
further from grace.
And it's kind of easy to understand how lapsarianism
would seem like a natural, credible idea.
After all, by the time you hit sort of the last 10% of your
natural lifespan, it seems pretty apparent that things
are getting worse.
After all, didn't everything used to taste better?
And didn't everything used to smell better?
Weren't the people of your preferred gender prettier?
Didn't you hurt less?
I mean, clearly the world is a worse place than it was.
And I think even the human imagination balks
at unbounded systems.
We think, well, if things are going to get worse and worse,
eventually they'll reach a point of, as Spinal Tap would
have put it, "none more worse." And that "none more
worse" is the Apocalypse, right?
When things can't get any more worse, you have a break.
Everything comes to a crashing halt.
Now along comes the Enlightenment and the idea
that things are just going to get better, that we will stand
on the shoulders of giants, and then people stand on our
shoulders, and then they'll stand on theirs, and so on,
all the way up to the contemporary notion of Moore's
law and the idea that things are getting better and better.
And again, we confront this unbounded system and we go,
well, there must come a time when things can get none more
better, when the balloon has inflated so much that it
bursts and we reach a break with history.
We cease to be humans as we understand it.
And that's the other theme that we explore in this book,
this idea that the Singularity is in many ways attractive not
because it's credible, but because our brain likes the
idea of a bounding condition on this
otherwise unbounded system.
So that's the book, and I'm going to let Charlie talk
about it a bit first and then do some reading.
And then I'll come back and do some reading, and then we'll
take your questions.
CHARLES STROSS: Well, so, the Singularity.
It's an interesting term.
It really got introduced in the science fiction field
around 1992 by Vernor Vinge, who you've probably heard of
and may well have read.
Science fiction writer, Hugo and Nebula winner, also a
now-retired professor of computer science.
Vernor's version of a singularity was a lot more
constrained and, dare I say it, grounded than the version
that's become common currency among
transhumanists these days.
Vernor's formulation was essentially a hypothesis about
artificial intelligence, that if we can build a
human-equivalent AI, then it should, in principle, be
possible to make it think faster than a regular human
being just by throwing more computing resources at it,
assuming it's not bottlenecked--
and as the brain is a thoroughly parallel
architecture, there's no reason to
suppose it would be.
He also speculated further that if there are
fundamentally more powerful modes of cognition than human
consciousness, if there are forms of intelligence that are
as much more powerful than our thinking as we are than, say,
a frog, then once you have really, really, really fast
human-equivalent AIs, they will make a breakthrough to a
strongly transhuman form of intelligence, if such a
breakthrough is possible at all.
And at that point, the brakes are off and we are not setting
the agenda for the universe we inhabit.
This was Vernor's original context for singularity.
And he wrote some novels about it, starting from, I think,
the late 1980s with "Marooned in Realtime" and subsequent works.
But it wasn't long before this cross-fertilized with some
interesting subcultures relating to each other via the
internet-- the early transhumanists, the
Extropians.
I don't know if anyone here was on the Extropians mailing
list in the late '80s, early '90s.
I see some recognition.
And like a giant Katamari of weird ideas, it began sort of
accreting baggage as it rolled downhill.
Among the bits of baggage it acquired--
well, molecular nanotechnology and magic nanite pixie dust
probably need no further explanation.
But mind uploading, the idea that we can dissect bits of
our brains, analyze the signal processing therein, and
ultimately port whatever representation we have of a
human mind to run on different hardware.
And of course, once we are running in a computer, there's
no reason we can't make our sim very pleasant to live in.
It's AI Heaven time.
And hey, we're all going to fly up into the computers and
live in the Google Cloud, for that matter.
I'm not sure that's such a good idea, but I digress.
[LAUGHTER]
CHARLES STROSS: And this sort of began accreting all sorts
of theological baggage.
It got to the point where I found myself being sort of
pinned into corners at SF conventions by people with no
context of personal space who wanted to know how long it
would be before they could abandon the meat puppet.
[LAUGHTER]
CHARLES STROSS: Now this struck me as slightly
disturbing.
And when one goes looking for parallels, it doesn't take
long to find them.
From around 950 AD onwards, Western Europe was sort of hit
by wave after wave of cults, if you like, where the
peasants would sort of abandon their farms, abandon their
clothes, trash the joint and go and form free-love
communes, marauding around Europe, because they knew
Jesus was going to come and spirit them away to heaven
within the next few weeks.
And this sort of kept up for about 50 to 100 years,
periodically being put down by [INAUDIBLE]
by the nobility, with the enthusiastic backing of the
Church, who disapproved of this kind of thing.
But it has had echoes ever since.
Christian eschatology came up with the Revelation of St.
John around the second century, and ever since then,
as Cory noted, there has been the apocalyptic trend embedded
very deeply in our culture.
If we fast forward to the 18th century and the Scottish
Enlightenment, we have a changeover--
to some extent a reaction against a very dire,
fundamentalist, Presbyterian theology of the theocracy that
ran Scotland for a couple of centuries until
the early 18th century--
resulting in a new more optimistic vision of the world
whereby things can get better and better all the time and we
can improve ourselves.
But it's still the same apocalyptic imagery and urge.
And "The Rapture of the Nerds" was, to some extent, an
attempt to grapple with the theological and aesthetic
underpinnings of Singularitarianism.
More recently, if you want to find the missing link to the
Extropians and the transhumanists, you really
need to look at the 1920s Russian movement
known as the Cosmists.
If you think sort of 1920s Russian revolutionary
Communist Extropians--
most of them would be American libertarians these days--
that's where you find them.
They got their scheme from Nikolai Fedorov, a
late 19th-century Russian Orthodox
theologian and teacher.
He taught Konstantin Tsiolkovsky, the father of
rocketry among other things.
And Fedorov came up with an almost barking teleology
framed in terms of Christian thought.
First of all, given that we have a theological impetus to
improve ourselves towards perfection, to converge with
godliness, Fedorov concluded that a necessary step on this path
was the complete unification of humanity and the
abolition of war.
How do you do this?
Well, you find them something more important to wage war on--
like, say, death.
So Fedorov came up with this proposal that we should work
on human immortality, expansion into space, giving
human beings the ability to photosynthesize and move
between worlds.
We're talking 1890s here, I should add.
And this, however, was not where the buck stopped.
Because let us suppose we've inherited the entire cosmos,
we're de facto omnipotent and immortal.
It is an insult to our existence that there are still
human beings moldering in the grave, that we must therefore
then seek to resurrect every human being who had ever
lived, and moreover, every human being on every timeline
that could have given rise to the present day.
Now if you put that in your pipe and smoke it, it's some
pretty heady stuff.
And as I said, this stuff--
1920s Leninists?
1890s Russian Orthodox Christian theologians?
2010 Raymond Kurzweil?
It has what the police in the UK refer to as "form." You
know, it's come up on their radar before.
Shall I start?
CORY DOCTOROW: Yeah.
You're reading.
CHARLES STROSS: And so I'd like to read a chunk to you
today, if I can get this machine to recognize me,
explaining how Huw finds himself--
Huw, our protagonist, finds herself, I should add.
Huw started out male and ends up female and it gets
complicated thereafter.
[LAUGHTER]
CHARLES STROSS: How Huw ends up in the digital afterlife.
And I should add, Huw is a curmudgeonly Welsh Green who
runs a pottery and trusts no technology more complex than
his bicycle.
CORY DOCTOROW: Her.
CHARLES STROSS: Her, by this point.
[LAUGHTER]
CHARLES STROSS: No, the--
I thought the bike had been crushed.
Anyway.
CORY DOCTOROW: Continuity.
CHARLES STROSS: Continuity.
[LAUGHTER]
CHARLES STROSS: Huw--
CORY DOCTOROW: It's a new bike.
CHARLES STROSS: Yeah.
It's a new bike.
Huw is holding her right hand under the cold-water tap and
swearing when there's another knock at the door.
"Who is it?" she calls down the hall.
"It's the Singularity," a booming voice replies.
"What do you want?"
"Everything is different now!"
"I don't want any."
"If I could just have a moment of your time?" It takes a lot
of skill to make a stentorian voicebox emit a credible
wheedle, but the bell ringer at the door has clearly
practiced it to a fine art.
Huw turns the faucet back up and puts her fingers into the
cold stream.
There are vicious little burns, red welts that her
honest, baseline human cells will take
weeks to properly heal.
Of course, you could just ride over to the McNanite's and get
some salve that'd make them vanish before her eyes, but
Huw's endured much worse and she's still got enough stubborn
stockpiled to last her a couple of eons.
There's another thud at the door.
Thud.
Thud.
Thudthudthud.
Then a transhuman tattoo of thuds in rising frequency,
individual thuds blurring into a composite buzz that gets the
bones of the old house rattling in sympathy,
shivering down little hisses of plaster dust from the
joints in the ceiling.
Huw uses her good hand to wrench the faucet off, then
wraps a tea towel around her throbbing, dripping fingers
and walks to the door, gritting her teeth with every
step as she forces herself not to run.
It feels like the house might rattle down around her ears
any second, but she won't give the infinity-botherer outside
the satisfaction.
She opens the door with the same measured calm.
Let one of these fundies know you're on edge, and he'll try
to grab the psychological advantage and work it until
you agree to his balls.
Sorry.
Pitch.
Freudian slip there.
[LAUGHTER]
"I said," Huw says, "I don't want any."
"I'm afraid I rather must insist," says the
infinity-botherer through his augmented, celestial voicebox.
The force of that voice makes Huw take an involuntary
wincing step backwards, like a blast from an air horn.
"Huw, this is mandatory, not optional."
This is mandatory, not optional.
The words send Huw whirling back through time, back to her
boyhood, and a million repetitions and variations on
this phrase from his--
"Mum?" she asks, jaw dropping as she stares up at the giant
borg on the doorstep.
It's at least three meters high, silvery and fluid, thin
as a schwa, all ashimmer with otherworldly
transcendent wossname.
It's neither beautiful nor handsome, though it's
intensely aesthetically pleasing in a way that demands
some sort of genderless superlative that no human
language has ever managed.
Huw hates it instantly--
especially since she suspects that the loa riding it might be
descended from one of her awful parents.
"Yes, dear," the Singularity booms.
"I like the regendering.
It really suits you.
Your father would send his best, by the way, if he was
still hanging around the solar system."
Huw last saw her parents at their disembodiment.
They'd already had avatars running around in the cloud
for years, dipping into meatspace every now and again
for a resynch with their slowcode
bioinstances dirtside.
When they were finally deconstituted into a fine
powder of component molecules, it'd been a technicality
really, a final flourish in their transhumanification.
But the finality of it, zeroing out their bodies, had marked a
break for Huw.
Mum and Dad were now technically dead.
They were technically alive, too, but that
was beside the point.
Until Mum donned a golem and came to talk.
"Mum, I don't talk to dead people," Huw says.
"Go away." She deliberately does not slam the door, but
closes it, and turns the latch, and heads back to the
sink, deliberately ignoring the fragments of cloud wearing
her mum's memories.
She manages to go three steps before the door splinters and
tears loose of its hinges, thudding to the painstakingly
restored tile floor in the front hall with a merry
tinkling of shattered antique glass.
"Love, I know you're not best pleased to see me, but you've
been summoned, and that's that."
The spirit of adolescence descends on Huw in a red mist.
Her mum has always been able to reduce her to a screeching
teakettle of resentment.
"Get out of my house, Mum!
I hate you!"
Her mum's avatar grabs Huw in a vicious hug that feels like
foam rubber paddings wrapped around titanium armatures.
"Poor thing," it says.
"I know it's been hard for you.
We did our best, you know, but well, we were only human.
Now, come along, sweetie."
It's Tripoli all over again, but this time the golem whose
grasp she can't escape emits a steady stream of basso
profundo validations of Huw's many gifts and talents and how
proud her parents are of all she's achieved and suchlike.
Huw tries to signal a beedlemote, but her mum's got
some kind of diplomatic semaphore that makes all the
enforcementware give it free passage.
Mum's bot stops at every traffic signal, and several
times, Huw tries to get passersby to help her, with
lines like, "I'm being kidnapped by the bloody
Singularity!"
Unfortunately, nobody seems interested in lending a hand.
And even if they did, Mum goes about 200 kilometers per hour
between traffic lights, her gait so fast that every time
Huw opens her mouth to scream, it fills with wind and her
cheeks wibble and wobble while she tries to breathe past the
air battering at her windpipe.
Then they've arrived.
The alien consulate is midfab, its hairy fractal edges
radiating heat as nanites grab matter out of the
sky to add to it.
The actual walls are only waist high, though the spindly
plumbing, mains, and network infrastructure are already in
place and teeter skyward like a disembodied nervous system
filled with dye for an anatomical illustration.
The consul is an infinitely hot and dense dot of
eyeball-warping fuzz in the exact center of what will be
the ground floor.
Well, it's not exactly infinite, but it does seem to
bend the light around it, and it certainly radiates too much
heat to approach closely.
"Thank you for coming," it says.
"You brought your invitation, I
hope?" "*** you!
No!" Huw screams.
She's gathering breath for another outburst, but Mum
shakes her--
gently by golem standards, but hard enough to rattle the
teeth in her jaws.
"Bad idea, darling." A palpable cone of silence
descends around Huw's ears as Mum confides, "When I said it
was mandatory, I was serious.
If you don't comply, it'll delete everyone."
"Fuuuu--" Huw pauses.
"Delete?" She realizes that everything outside the cone of
silence has stopped, stuck in a bizarre meatspace cognate of
bullet time--
that's hanging on the wing in midair, leaves frozen in
midfall, that sort of thing.
"Yes, dear.
I'm not exaggerating.
It's come to pass: a visit from the Next Level, and faster,
smarter thinkers than you or I are crapping themselves." Huw
is rattled.
Mum always had an accurate appreciation of her own
abilities, and as a Fields Medal winner, she wasn't
inclined to hide them under a bushel.
"But it's playing by the rules, apparently.
There's got to be a Public Inquiry.
Which means statements by witnesses and friends of the
court and so on and so forth--
all very tiresome, I'm sure, but it seems your name came
out of the hat first.
So I'm afraid you're back on jury duty, like it or not.
If it's any consolation, I'll try to make this painless."
The birds and the bees resume their respective chirping and
buzzing as the cone of silence collapses on Huw like an icy
waterfall of fear.
"Shitbiscuits!" she screams as Mum gently wraps a band of
silvery-shimmering nanomanipulators around Huw's
head and saws off the top of her skull.
Over to Cory.
[APPLAUSE]
CORY DOCTOROW: So as you've just heard, Huw is forcibly
transcended.
And later on in the story, the section I'm going to read,
she's finally been taken off to the galactic civilization's
holding pen for expert witnesses, which is a giant,
tasteless replica of the Burj Khalifa made out of the bones
of the moons of Jupiter, which have been co-opted as a
temporary computronium outpost.
It's so tasteless that the doorman is a giant gorilla.
"I hope you enjoy the facilities here," says the
gorilla, with a wink.
"Nothing but the best for our expert witnesses--
we have hot and cold running everything."
It's a far cry from jury duty accommodation in the crappy
backpacker's hostel in dusty Tripoli.
Huw dials her time right up (sinfully extravagant) and
orders the whirlpool-equipped hot tub with champagne to
appear in the bathroom.
Then she climbs in to marinate for subjective hours (a
handful of seconds in everyone else's timeframe) and to
unkink for the first time in ages.
After all, it's not as though she's consuming
real resources here.
And she needs to relax, needs to recenter her emotions the
natural way, and do some serious plotting.
Of course, the sim is far too realistic.
A virtual champagne bath should somehow manage to keep
the champagne drinking-temperature cold
while still feeling warm to the touch.
And it shouldn't be sticky and hot and flat.
It should feel like champagne does when
it hits your tongue--
icy, bubbly, and fizzy.
And when Huw's non-bladder feels uncomfortably full and
relaxed in the hot liquid and she lets loose a surreptitious
stream, it should be magicked away, not instantly blended in
with the vintage Veuve Clicquot to make an instant
tub's worth of *** mimosa.
[LAUGHTER]
This is what comes of having too much compute-time at one's
disposal, Huw seethes.
In constraint, there is discipline, the need to choose
how much reality you're going to import and model.
Sitting on an Io's worth of computronium has freed the
Galactic Authority--
and isn't that an imaginative corker of a name?--
from having to choose.
And with her own self simulated as hot and wide as
she can be bothered with, she can feel every unpleasant
sensation, each individual sticky bubble, each droplet
clinging to her body as she hops out of the tub and into a
six-jet steam-shower for a top-to-bottom rinse, and then
grabs a towel--
every fiber slightly stiff and plasticky, as if fresh out of
the wrapper and never properly laundered
to relax the fibers--
and she dries off.
She discovers that she is hyperaware, hyperalert,
feeling every grain of not-dust in the not-air
individually as it collides with her not-skin.
Oh, enough, she wants to shout.
What is the point of all this rubbish?
This is the thing that Huw has never wanted to admit.
Her primary beef against the Singularity has never been
existential--
it's aesthetic.
The power to be a being of pure thought, the unlimited,
unconstrained world of imagination, and we build a
world of animated gifs, stupid sight gags, lame van-art
avatars, brain-dead "playful" environments, and brain-dead
flame wars augmented by animated emoticons that allow
participants to express their hackneyed ad hominems,
concern-trollery, and violations of Godwin's law
through the media of cartoon animals and
oversized animated genitals.
Whether or not sim-Huw is really Huw, whether or not
uploading is a kind of death, whether or not posthumanity is
immortal or just kidding itself, the single inviolable
fact remains.
Humanity's simspace is no more tasteful than the
architectural train wreck that the Galactic
Authority has erected.
The people who live in it have all the aesthetic sense of a
senile jackdaw.
Huw is prepared to accept-- for the sake
of argument, mind--
that uploading leaves your soul intact, but she is never
going to give one nanometer on the question of whether
uploading leaves your taste intact.
If the Turing test measured an AI's capacity to conduct
itself with a sense of real style, all of simspace would
be revealed for a machine-sham.
Give humanity a truly unlimited field, and it would
fill it with Happy Meal toys and holographic, sport-star,
collectible trading card game art.
There's a whole gang of dirtside refuseniks who make
this their primary objection to transcendence.
They're severe Bauhaus cosplayers, so immaculately
and plainly turned out that they look more like
illustrations than humans.
Huw's never felt any affinity for them--
too cringeworthy, too like a Southern belle who comes down
with the vapors at the sight of a fish knife lying where
the dessert fork is meant to go.
It's always felt unserious to counter a major debate over
human evolution with an argument about style.
But Huw appreciates their point, and has spent his and
then her entire life complaining instead about the
ineffable and undefinable humanness that is lost when
someone departs for the cloud.
She's turned her back on her parents, refused to take their
calls from beyond the grave, she's shut herself up in her
pottery with only the barest vestige of a social life,
remade herself as someone who is both a defender of humanity
and a misanthrope.
All the while, she's insisted--
mostly to herself, because, as she now sees with glittering
clarity, no one else gave a ***--
that the source of her concerns all along has been
metaphysical.
The reality that stares her in the face now, as she reclines
on the impeccably rendered 20-million-count non-Egyptian
noncotton nonsheets, is that it's always been a perfectly
normal, absolutely subjective, totally meaningless dispute
about color schemes.
And now she's got existential angst.
The Burj Khalifa's in-room TV gets an infinity of channels,
evidently cross-wired from the cable feed
for Hilbert's hotel.
It uses some--
[LAUGHTER]
It uses some evolutionary computing system to generate
new programs on the fly, every time you press
the channel-up button.
This isn't nearly as banal as Huw imagined it might be when
she read about it on the triangular-folded cardboard
standup that materialized in her hand when she reached for
the remote.
That's because--
as the card explained--
the Burj has enough computation to model captive
versions of Huw at extremely high speed, and to tailor the
programming by sharpening its teeth against those
instances-in-a-bottle so that every press of the button
brings up eye-catching, attention-snaring material.
It's mostly soft-core ***
that involves pottery.
[LAUGHTER]
Huw would like nothing better than to relax with the
goggle-box and let her mind be lovingly swaddled in
intellectual flannel, but her mind isn't having any of it.
The more broadly parallel she runs, the more meta-cognition
she finds herself mired in, so that even as she lies abed,
propped up on a hill of pillows the size of a Celtic
burial mound, her thoughts are doing something like this.
Oh.
That's interesting.
Never thought of doing that sort of thing with glaze.
Oh, too interesting.
If you ask me, it's not natural, that kind of
interesting.
They've got to be simulating gigaHuws to come up with that
sort of realtime optimization.
There'll be hordes of Huw instances being subjected to
much-less-interesting versions of this program and winking
out of existence as soon as they get bored.
Hell, I could be one of those instances, my life dangling by
a frayed thread of attention.
Every time I press the channel-up button, I execute
thousands--
millions?
billions?--
of copies of myself.
Why don't I care more about them?
It's insane and profligate cruelty, but here's me blithely
pressing the channel-up button.
Whoa, that's interesting--
she looks awfully like Bonnie, but with a bum that's a little
bit more like that girl I fancied in college.
I could die at any instant, just by losing attention and
pressing channel-up.
That's wild, I never noticed how those muscles--
the quadrati lumborum?--
spring out when someone's at the wheel, that bloke's got
QL's for days.
If I were really aesthetically opposed to this sort of thing,
I'd be vomming in my mouth with rage at the thought of
all those virtual people springing into existence and
being snuffed out.
But I'm not, am I?
Hypocrite, liar, poseur, mincing aesthete, that's me.
So long as it's interesting and stylish,
I'll forgive anything.
I've got as much existential
introspection as a Mario sprite.
Enough, already, she tells herself, and cools herself
down to a single thread, then throws that down, hunting for
the sweet spot at the junction of stupidity and calm.
Then finding it, she settles down and watches TV for one
hundred subjective years, slaughtering invisible hordes
of herself without a moment's further thought.
Satori.
So that's the reading.
[APPLAUSE]
Is there a power button on here?
Or is that controlled from a console?
So we're now free to take your questions.
We can share the mic if we can't figure
out how to get one.
How many science fiction writers does it take to turn
on a wireless mic?
All of them, apparently.
So are there any questions?
Yeah.
AUDIENCE: Would I be correct in guessing that you wrote
sections primarily by yourselves?
CORY DOCTOROW: Uh, no.
CHARLES STROSS: I don't think so.
CORY DOCTOROW: No.
They really-- there's a lot of interwriting in
both of those passages.
They're really interwritten, those two, particularly.
Those are among the passages where, when I read them, I'm
like, that looks like my tic and that looks like Charlie's
tic and who the *** wrote that?
[LAUGHTER]
CHARLES STROSS: Yeah.
We were swapping over about every 500 to 1,000 words, and
both of the passages we read are well over 1,000 words, so.
CORY DOCTOROW: And we did lots of re-editing of
each other's stuff.
So there are some interesting offcuts lying around in our
hard drives, too.
Other questions?
We actually could figure out who wrote those, because I at
least did version control with them.
I--
a friend of mine wrote me some Python scripts that are on
GitHub called flashbake, and every 15 minutes, they grab
all my working files and then they check them into a local
Git repo with the--
it figures out what time zone I'm in from my IP address,
where I am from my IP address, the last three songs I
listened to, the last three headlines I posted on Boing
Boing," and then and logs it.
And I figure, you know, in like 10 years, it'll be kind
of interesting to go back and look over it.
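The flashbake-style workflow described above can be sketched in a few lines of Python. This is a minimal, illustrative sketch, not flashbake itself: the real project gathers its context (time zone, location, recent songs and posts) through plugins, and the metadata fields and repo path here are assumptions for the example.

```python
import subprocess
import time


def build_commit_message(timezone, location, songs, headlines):
    """Compose a flashbake-style commit message embedding context metadata."""
    lines = [
        "flashbake checkpoint @ " + time.strftime("%Y-%m-%d %H:%M:%S"),
        f"Timezone: {timezone}",
        f"Location: {location}",
        "Recent songs: " + "; ".join(songs),
        "Recent headlines: " + "; ".join(headlines),
    ]
    return "\n".join(lines)


def checkpoint(repo_dir, message):
    """Stage all working files and commit them to the local Git repo."""
    subprocess.run(["git", "-C", repo_dir, "add", "-A"], check=True)
    # --allow-empty so the periodic checkpoint lands even when nothing changed
    subprocess.run(
        ["git", "-C", repo_dir, "commit", "--allow-empty", "-m", message],
        check=True,
    )
```

Run `checkpoint()` from cron or a 15-minute timer against a working directory that is already a Git repo; `git log` then reconstructs which draft existed at any moment, and under what circumstances it was written.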
I've also been thinking that I could probably do something
more interesting these days, like take a picture every 15
minutes and--
CHARLES STROSS: Blood pressure.
CORY DOCTOROW: Blood pressure from the picture.
Skin galvanometry, skin response from the touch pad.
And probably more.
I mean, I'm sure, like, eventually the mic will be
sensitive enough to grab things like heart
rate and so on, too.
AUDIENCE: Is that Thomas Gideon?
CORY DOCTOROW: That's Thomas Gideon, yeah.
He's a good guy.
You know him?
He has a great podcast called "The Command Line." And he
works for New America Foundation now.
He's one of the internet-in-a-box guys.
CHARLES STROSS: I like living dangerously.
I just basically left my backups to Time
Machine on the laptop.
Having said that, we worked on this as just a markdown file
bounced back and forth in email, and as long as the
email folders are thoroughly backed up, I've got different
snapshots all the way through.
CORY DOCTOROW: Yeah.
I mean, this all started with the Merril Collection in
Toronto, which is the largest public science fiction
reference library in the world, started by Judy Merril,
who was kind of my mentor when I was younger.
They lodge my papers for me.
Because when I started selling novels, I was moving
continents like every 18 months.
And especially back then, every novel you sold involved
three lumps of paper at least about this big.
There was the initial manuscript, and then the
marked-up manuscript, and then the typed script, and so on.
And there was no way I could manage them.
And so they just keep my papers for me, and one day,
the head librarian, Lorna Toolis, said, you know, it's
such a pity, because in the old days, we used to get
multiple distinct drafts from our writers.
And this was obviously very interesting to scholars in
subsequent years.
And now there's just kind of the rolling text file that
then gets turned into a book.
If only there were some way to keep track of the
changes to a text file.
[LAUGHTER]
CORY DOCTOROW: And I was like, you know, this is the
canonical solved problem, right?
So that's where it all started.
Are there other questions?
Yeah?
AUDIENCE: [INAUDIBLE]
CHARLES STROSS: We will talk about it at the end of this
talk, if we haven't strangled each other first.
CORY DOCTOROW: Yeah, I mean, everything is possible.
We might upload and fork new instances to work on books
together or something.
CHARLES STROSS: Yeah.
CORY DOCTOROW: It's possible.
We both have a lot of stuff on the go at the moment.
And we are-- we're both contributing to a project,
although we're not collaborating directly on it.
Neal Stephenson is working on this thing, "Hieroglyphics,"
with Arizona State.
It's science fiction stories built on real science,
credible, sort of plausible science.
I'm working on a story about Burners who create a
playa-dust printer that is so successful, they drop it on
the Playa at the Fourth of Juplaya celebration at the start of
the summer, and by Labor Day weekend, it's printed out
their yurt.
And they live in it for Burning Man, and then they
chop it up into pieces and pack it out.
And they're so happy with it that they decide to build one
that does lunar regolith.
And they use a private space-exploration vehicle to
drop one on the moon, and they spend a generation
reprogramming its firmware by bouncing ham radio signals off the
moon and building a lunar habitat that their
grandchildren can move into.
And Charlie, what story are you working on?
CHARLES STROSS: Well, I was meant to be working on the
third of a trilogy beginning with "Halting
State" and "Rule 34"--
this being the political one titled "The Lambda
Functionary." But it's going to take me about two years to
write as it's sort of set 15 years out.
And trying to do near-future SF like that is really hard
these days, thanks to people like you.
[LAUGHTER]
So I'm negotiating with my agent to hand in a different--
the fifth "Laundry" novel instead.
CORY DOCTOROW: But what are you working on for Neal?
For the "Hieroglyph" thing?
CHARLES STROSS: I haven't even confirmed I'm on that yet.
I want to be, but I'm juggling too many balls right now.
CORY DOCTOROW: Seth.
AUDIENCE: So did you guys have to cut many parts of this?
Like when you're collaborating on a shared work like this,
are there parts of it that you had to give up?
CORY DOCTOROW: Maybe nothing that we wish we could've kept, but
there are definitely big chunks that got cut.
CHARLES STROSS: Yeah, a lot of arguing over whether something
belonged in it and then rewriting.
CORY DOCTOROW: Yeah.
There's nothing that I miss overmuch, but there's
plenty of stuff on the cutting-room floor.
CHARLES STROSS: I mean, think of it as a kind of half-assed
version of pair programming, where you email source files back
and forth to each other and work 400 miles apart rather
CORY DOCTOROW: Yeah.
You know, I just read an early edition of the new David Byrne
book, "How Music Works." It's an amazing book, and there's a
section in it on how he collaborated with Eno on
"Everything That Happens Will Happen Today," the album they
did a couple of years ago.
And Eno had actually written all the melodies.
And so he just-- they were sort of sitting on his hard
drive, and so he fired them off to Byrne.
And then Byrne figured out the words.
And he was like, I could go back and ask for, like,
melodic changes, but there's kind of a hassle to that, so
I'm just going to work with this as a constraint.
It's a pretty interesting passage in a
very, very good book.
I heartily recommend it.
It'll be out in about a week.
I've got a review queued up on Boing Boing that may remind
you, if you read Boing Boing.
Scott, you had a question.
AUDIENCE: [INAUDIBLE]?
CORY DOCTOROW: I mean, I think it would have been bad form to
do too much in one go.
CHARLES STROSS: Yeah, but the main thing was to keep
momentum going so we had to turn it
around within 48 hours.
And you know, 500 to 1,000 is a reasonable day's polished
output, especially when you're trying to do
something high concept.
CORY DOCTOROW: And when you're rewriting the stuff that came
before, so you're taking a pass through the piece that
came before.
CHARLES STROSS: I think the longest chunk I mailed to Cory
at one point was about 2,500 words, at a time when you were
traveling and not able to work on it for a couple of days.
CORY DOCTOROW: Yeah.
And you know, I did that novella
with Ben Rosenbaum, "True Names," and that's about how
we did that too.
Although Ben--
Ben likes to talk a lot about what you're writing.
He used to send me long emails, like, so here's what I'm thinking.
I don't know if you've ever met Ben, or heard him talk.
It comes through.
It's like-- (FRANTICALLY) So here's what I'm thinking.
There's this thing.
And there's another thing.
And this other thing.
And this other thing.
And I'd be, like, yeah, Ben?
Can we just write it?
I know, I know.
I know.
And he was like, I'm overthinking it again.
We were a little more compatible in that regard.
There was less--
less foreplay, more of the important stuff.
[LAUGHTER]
CHARLES STROSS: I didn't know you cared.
CORY DOCTOROW: Yeah.
[LAUGHTER]
AUDIENCE: [INAUDIBLE]?
CORY DOCTOROW: Would Google Docs be more
beneficial or less?
CHARLES STROSS: This probably isn't the right venue to
confess that I hate, loathe, and fear cloud computing
systems but don't actually have an
in-my-pocket fall-back.
The trouble with Google Docs for this sort of thing is, first,
you'd be working on one long continuous scrolling text.
And by the time you're up to about 90,000 words, Google
Docs is not terribly happy.
The other aspect is it requires you to be online with
a good internet connection.
And if you're traveling or you live somewhere where the
internet is up and down, that's not so good either.
CORY DOCTOROW: And in particular, if those moments
in which there's no network are the moments that you're
like, oh, wow, there's nothing else that I can be doing right
now, no high-priority items that I could take care of
if I had network access, now's the
moment I can write.
Then Google Docs is no good to you at all.
I've got the sequel to "Little Brother" coming out in
February, and I wrote big chunks of that at like 2,000 words
a day, and I did it while touring.
And a lot of it while touring Germany.
So the German tour is really interesting.
I was touring a young adult book, so I would go to schools
with my translator.
My translator would read for half an hour, and then I'd
read for half an hour, because, you
know, they're Europeans.
They speak better English than I do.
And while my translator was reading in German, I would
have half an hour to sit there on stage and write.
And if I'd needed network access, I would've been hosed.
I never would have finished that book.
AUDIENCE: First of all, I loved the reference to "I was
an infinitely hot and dense dot."
CORY DOCTOROW: Yeah.
Mark Leyner.
AUDIENCE: Yeah.
That was really my introduction, in a lot of
ways, to [INAUDIBLE].
But the other thing I was going to ask is I was actually
introduced to your writing by Paul Krugman's blogs.
And I was just wondering if you--
he's never mentioned meeting you, that I've read, but I was
just wondering--
CHARLES STROSS: Oh, yeah, we have met.
It was at the World Science Fiction Convention about three
years ago in Montreal, where they sort of arranged to put
us on stage.
And it's the second most terrifying thing I've ever
done on stage: going up in front of 1,000 people
and cameras with Paul Krugman and trying not
to look like a fool.
CORY DOCTOROW: It's on YouTube.
[LAUGHTER]
CHARLES STROSS: Note, I say the second
most terrifying thing.
[LAUGHTER]
CHARLES STROSS: The thing that exceeds that for
sheer blind terror was an event about 15 years ago, when I
went on stage while a guy who was into medieval longsword
reenactment demonstrated killing strokes with a
broadsword, using me as a model.
[LAUGHTER]
CHARLES STROSS: And what I knew and the audience didn't
know was that this guy had macular degeneration and was
registered blind.
He only had 2% of his visual field left.
[LAUGHTER]
CORY DOCTOROW: Wow.
CHARLES STROSS: Going up with Paul Krugman was sort of the
intellectual equivalent.
[LAUGHTER]
CORY DOCTOROW: Are you saying Paul's blind?
CHARLES STROSS: No, I'm saying he's the equivalent of the
sort of blind sensei.
CORY DOCTOROW: Oh right, I see.
CHARLES STROSS: Doing something that should be
impossible.
CORY DOCTOROW: In the back.
AUDIENCE: First of all, thank you both for your blogs and
[INAUDIBLE].
But I wanted to ask you both [INAUDIBLE].
Ray Kurzweil, the debate I've heard about him is whether he
is scientifically grounded with a very strong speculative
component, or basically speculative fiction marketing
itself as scientifically grounded.
I've heard the science perspective on that, but if
two authors in speculative [INAUDIBLE].
I'm curious what sort of perspective you have
[INAUDIBLE], especially in the context of this novel.
CHARLES STROSS: Um, I just got a real strong sense of deja vu
when I read some of his books, because I'd seen the stuff 10
years earlier on the Extropians mailing list.
I should say no more.
CORY DOCTOROW: Well, I mean, I interviewed Ray for "Asimov's
Science Fiction Magazine." It's online.
And you know, the summary of kind of where I netted out
with this is that there's an underlying question that Ray
treats as kind of one on which there is broad consensus, that
in fact we don't have any broad consensus on, and that's
the locus of identity.
Not the locus of consciousness, but the locus
of identity.
So imagine that you uploaded a copy of yourself.
And you wanted to, like--
what's the command?
Tar -t, or whatever, to verify the archive?
You wanted to verify your archive, right?
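(The half-remembered command is close: with GNU tar, -t lists an archive's contents and -d/--diff compares it against the files on disk. A minimal sketch, assuming GNU tar, with hypothetical filenames:)

```shell
# Verify a "backup of yourself": GNU tar assumed, filenames hypothetical.
rm -rf /tmp/tar-demo && mkdir -p /tmp/tar-demo && cd /tmp/tar-demo
echo "locus of identity" > self.txt
tar -cf self.tar self.txt    # -c: create the archive
tar -tf self.tar             # -t: list what's inside (prints: self.txt)
tar -df self.tar             # -d/--diff: compare the archive against the live file
```

tar -d exits non-zero if the archived copy and the original have drifted apart, which is roughly the Chinese-room test the passage goes on to describe.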
So one way to do that is you could do a Turing test.
So two Chinese rooms--
you're in one, the uploaded you is in the other, and
someone gives you stimulus.
They ask you questions, and if the answers line up, the
same identical response to
stimulus, we'd say, OK, it's the same person.
But there's a problem with that, which is that if I took
you five years ago and stuck you in a room and then took
you today, you wouldn't give the same answers.
So there's a kind of reductionist story, there's a
kind of Socratic dialogue that takes place in the Singularity
literature, in which you have someone who's a skeptic and
someone who's a guru or a believer.
And the skeptic says, if you upload me into a computer, I
will no longer be me.
And the guru leads them through this dialogue where
they say, well, what if I replaced one of your legs with
a robot leg, would you still be you?
And the person says, why, of course I'd still be me.
And what if we moved it one inch higher, right?
And you just keep going till you get to the brain stem.
And then you go, well, at what point do you cease to be you?
Or you can do it in the other direction, right?
Your pupil is the end of your optic nerve, so you can see--
just like your teeth are the end of your skeleton.
Right?
So if I took that part of your brain right there that we can
all see and I replaced that with a microchip--
as there's already CCDs that people hook up to optic nerves
and experiment in a medical context-- are you still you?
Part of your brain is now a machine.
And you just go millimeter by millimeter back, again, until
you reach the brain stem, and you say, at what point do you
cease to be you?
And that sounds plausible.
It's a nice word game to play.
But actually, if you were a concert pianist and I cut your
hands off and replace them with robot hands,
you wouldn't be you.
Right?
If I put you prior to this gross insult to your anatomy
and you afterwards in Chinese rooms and ask you questions,
you wouldn't be you.
And I've come to think that although we exist in a
continuum of identity-- from moment to moment, the
difference between the answers that we would give in
the Chinese room experiment is near enough to identical
as makes no never-mind-- across certain distances,
across certain singularities, across certain punctuated
events, those answers change.
In the same way that all of the stages between Latin and
French are mutually intelligible, but Latin and
French are not mutually intelligible.
Or all of the intermediate stages in the speciation of
two finches are mutually compatible, but once you're
fully speciated, by definition, you're no longer
compatible.
And because we can't agree on who I am and who you
are and where the locus of identity is-- because we've
never had to answer the question, because we've never had
a meaningful way to say, well,
here's two of you, which one is the real one?--
because we can't do that, there's this enormous
question that we elide when we talk about uploading.
This is an important question in the Kurzweilian sense.
Maybe not in the Vingean sense.
"Are you still you once you're in the computer?" is a
question that matters a lot to you.
Right?
It may not matter a lot to the Singularity, but it matters an
awful lot to you.
And if we can't even agree on what "you" means, then I think
we have a hard time answering that question.
And I think that little cheap Socratic dialogue is a trick
that is used to misdirect us, deliberately or otherwise,
from a pretty important existential
problem that's new.
Or at least new in the sense that it may be
non-hypothetical in the near future.
And that newness is something that we have yet to come to
grips with.
CHARLES STROSS: On the other hand, I will add that it is an
interestingly constrained question, because it is one
that is subject to empirical experimentation.
CORY DOCTOROW: Mmhm.
Sure.
CHARLES STROSS: Although I wouldn't recommend starting
with human volunteers.
Take something like lobsters.
CORY DOCTOROW: Charlie has written
about uploaded lobsters.
Yeah?
AUDIENCE: So this thing about uploading yourself,
and is it still you?
We can have a much smaller version of the argument.
People who change their hairstyle and say, oh, this is
the new me.
Just that very statement, that by changing your hairstyle,
you become a new you, should really give us the answer of
what happens when you upload yourself.
CORY DOCTOROW: Well, you could change nothing, and someone
could fly an airplane into a building in your hometown, and
you would still-- and you would probably be a different
person, too, right?
CHARLES STROSS: Let me ask a different variant on it.
Here's a tablet.
Is this tablet part of me?
Obviously not.
OK, take into account the combination of programs,
configuration, and data loaded onto it, including chunks
which definitely came out of my mind which are part of my
extended phenotype.
Is it part of me?
This is the argument on, are tools part of identity?
We relate to them as if they're
extensions of our body.
Or are they separate?
It's sort of reverse of the amputation and
artificial leg question.
CORY DOCTOROW: Mmhm.
Sure.
I mean, if you've got 100 times more non-human cells in
your body, symbiotes, than you have human cells--
which you do--
and if you remove them, you would die-- which you would--
then who are you?
Are you a colony organism that just thinks that
it's a single entity?
Or are you a ship that many passengers, which are
nevertheless somehow under your volitional control,
hitch a ride on?
I mean, I think that these are really important, weird
questions that we haven't quite answered.
CHARLES STROSS: Oh.
Another version.
Life-logging.
I assume everybody's more or less familiar with the idea of
life-logging.
You carry video cameras, mics, GPS, other data recorders
around with you-- blood pressure monitors--
CORY DOCTOROW: Compromised mobile phones.
CHARLES STROSS: Yep.
Internal biological monitors.
You record everything.
You tag everything.
We have reasonable speech-to-text.
We have reasonable recognition of objects in the real world.
I mean, you guys here have been part of a company that's
developed a lot of that stuff.
OK.
Imagine everything around me is logged and tagged for the
whole of my life.
But you can then ask it questions, such as, who was
I having lunch with in such-and-such a city on Tuesday
the 2nd last year?
And get answers back.
If you then couple that with an avatar generated from
recordings of me, and some sort of model that has been
evolved via a genetic algorithm to try and give the same
responses as me, you then have an artificial
representative of me.
An avatar.
Now suppose somebody shoots me.
Is the avatar part of me?
Is it me?
One argument is no.
Another argument is well, it's a component of
your extended identity.
It's what's left over.
CORY DOCTOROW: In the same way that all the stuff in your
hard drive is, too.
Or this was a big question when AOL did
that search term dump.
Your search queries are an enormously
personal part of you.
I just--
I don't know if any of you saw, if you read Boing
Boing, I wrote an obit for a good friend of mine in June.
He was a hacker, free software guy who was my age.
Died in his sleep totally randomly.
He was a vegetarian who did yoga and rode a bicycle
everywhere but had a cerebral hemorrhage.
It was like, you just multiply all the blood vessels in your
body by the number of heartbeats you have and divide
it by the failure rate, and at a certain point, some of you
will have brain hemorrhages.
So his family, not very technical, didn't know what to
do with his computer.
It was on when he died, logged in to his [INAUDIBLE] machine.
And they were just going to turn it off and stick it in a
box until they could figure it out.
And I was like, you know, if you do that, you'll lose it.
The platters will seize up.
The computer-- you'll get burgled and it'll get stolen.
It'll get shorted out.
It'll get wet.
And this is him, right?
This is everything, all the software he's written,
everything he's done since he was about 14 years old, copied
over and over and over.
And I went with a hard drive over to
the commune he founded--
he lived in a kind of anarchist commune in Toronto--
and copied off a terabyte of his data, which was mostly nested
backups of his old data, which like many of us, he hadn't
really thought this through, and didn't really plan on what
people would do with his data after he was dead.
But yes, by the time I watched rsync copy the 10 millionth
copy of Python, I was like, I probably should have de-duped
this before I started.
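(De-duplicating nested backups by content hash, before a sync like that, can be sketched like this; a toy version, with the helper name hypothetical:)

```python
import hashlib
import os


def unique_files(root):
    """Map each distinct file content (by SHA-256) under root to one
    representative path, so nested duplicate backups are synced only once."""
    seen = {}
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            seen.setdefault(digest, path)  # first copy wins; later dupes skipped
    return sorted(seen.values())
```

Run over a tree of backups-of-backups, only one copy per distinct content survives, however many times it was re-copied.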
But in any event, I stuck it on S3 and paid for 10 years of
storage so they could figure out what to do with it in 10 years.
But this really was part of him.
And I think we haven't come to grips at all with
what to do with that.
And I think we're already kind of symbolically starting to
deal with what to do with backups of people, because
there's an enormous amount of personal information.
Charlie, you were saying the other day that--
what was the year?
2050?
CHARLES STROSS: Around-- yeah, some estimates.
Around 2050, whatever social networks are still running, we
can expect internet adoption to have penetrated the entire
population of Earth who are functionally literate, and
even some who aren't. That's the flip-over
point at which more people on the internet will
be dead than alive.
Because by then, the internet will have been around for, in
one form or another, about 80 years and will have hit
serious take-off 50 years earlier.
A lot of people will have been on there and will have died.
I mean, I would like to think I'll be alive in 2050, but I'm
not optimistic, because I would be 86.
CORY DOCTOROW: If any of you want to quit your job and do a
start-up, I've got a great idea.
Not that I'm going to start--
I'm going to write about it for "The Guardian" this week
because I owe them a column.
But basically the idea is that you just deploy an army of biz
dev people in, like, sailcloth blue suits and khakis to go
around and do deals with all the major social networks--
Facebook and Google and so on--
to authenticate that someone is dead and that there is
someone who is empowered to receive their login
credentials.
And then you sell it to funeral directors.
And so you come and you're, like, oh, I'm grieving, I
don't know what to do.
And as they're trying to upsell you on a giant, tacky
aluminum coffin, they're also saying, by the way, we have a
service that we work with.
They're very good, and they will make sure that your loved
one's Facebook, Gmail, and so on, all those credentials, are
turned over to you so you can manage them.
We deal with all the CAs and so on.
It's a single point of contact.
And they'll just turn over the credentials to you on a
certified basis.
I think it would be a big business.
AUDIENCE: [INAUDIBLE].
CORY DOCTOROW: Really?
AUDIENCE: I've seen [INAUDIBLE].
CORY DOCTOROW: With--?
AUDIENCE: I don't remember where.
CORY DOCTOROW: There are ones that will set up a memorial to
export your Facebook, but there's no one, as far as I
know, who will ease--
AUDIENCE: [INAUDIBLE].
CORY DOCTOROW: Yeah, but there's no one who can handle the
authentication credentials.
That's the hard part, right?
It's, like, convincing Yahoo or Google or Facebook to hand
over auth credentials to someone.
And that's the part that's logistically intensive
and relationship-intensive.
Yeah, there are tons of companies that'll do an online
memorial, where you can export the Facebook
data and get it in.
But you need to have the login credentials to do it.
So unless you know how to log into that person's account,
then you're hosed.
AUDIENCE: [INAUDIBLE].
CORY DOCTOROW: For not dying.
Yeah.
[LAUGHTER]
CORY DOCTOROW: Yeah.
Yeah, Sam?
AUDIENCE: So I've always noticed that in singularity
and sort of uploader culture, it's never mentioned who
administers the systems that run it.
[LAUGHTER]
AUDIENCE: You want to get a Singularity [INAUDIBLE] now?
[LAUGHTER]
AUDIENCE: I am [INAUDIBLE], yeah.
I think we're the people who are probably
most interested in uploading, but who are we gonna
trust to run us?
AUDIENCE: Google.
CHARLES STROSS: Actually, there's one novel I can think
of where the novelist in question tackled
that question head-on--
Iain M. Banks in "Surface Detail." And you probably
won't like the answer.
[LAUGHTER]
CORY DOCTOROW: You know, we kind of tackle it in this.
I mean, for one thing, we have a whole group of people who
are uploaded and kind of LARP bureaucracy.
They feel like you need a bureaucracy, and so they go to
work and they do boring work.
Even though it's all automated, they feel like
someone needs to do it.
They're called the World Gov LARP.
CHARLES STROSS: And the clothing for their LARP is
really boring.
CORY DOCTOROW: Yeah.
Yeah.
And we do actually-- we do have some self-governed systems
that run a capabilities environment.
So there's a Capabilities bar where you can't get into a
fight unless the other person allows you to fight with them,
and then you have a contract that allows you to fight.
And there's things like, you could be diffed back if you
violate the contract and so on.
It's--
it's a pretty fun bar.
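(The "capabilities" here are the object-capability security model from computer science: you can't perform an operation on someone unless they've handed you a reference that permits it, which is exactly the bar's consent-to-fight contract. A toy sketch in Python, with all names hypothetical:)

```python
class Patron:
    """A bar patron in a capabilities environment: there is no global
    "punch anyone" operation, only references that patrons hand out."""

    def __init__(self, name):
        self.name = name
        self.hits = 0

    def consent_to_fight(self):
        """Grant a capability: the returned closure is the ONLY way to hit me."""
        def punch():
            self.hits += 1
            return f"{self.name} takes a punch"
        return punch


huw = Patron("Huw")
bonnie = Patron("Bonnie")

# Bonnie can hit Huw only because Huw explicitly handed her the capability;
# she holds no such reference for anyone who hasn't consented.
cap = huw.consent_to_fight()
cap()
```

Revoking the contract is just dropping the reference; nobody without `cap` can touch Huw at all.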
AUDIENCE: "Permutation City," I think, has a pretty
interesting take on it, that essentially it
doesn't require hardware.
But that also has some really good tips about what it would
be like to be [INAUDIBLE].
CORY DOCTOROW: Yeah.
CHARLES STROSS: But bear in mind, Greg Egan has since
fairly vehemently recanted the approach to AI via
genetic algorithms he was pitching in that novel.
Doesn't like it anymore.
Has ethical qualms.
Over--
CORY DOCTOROW: I'm sorry.
PRESENTER: That's all we have time for.
I'm sorry.
CORY DOCTOROW: Oh, we did have someone who hadn't asked a
question at all.
PRESENTER: Uh, oh, OK.
CORY DOCTOROW: Just one quick one?
[LAUGHTER]
AUDIENCE: Uh, so the worlds you write about and the worlds
you live in are a little bit different, today, at least.
How do you stay creative?
How do you generate these ideas?
Do you have a process?
CORY DOCTOROW: There's a post office box in Schenectady.
You send a self-addressed stamped envelope and they send
you back science fiction ideas.
[LAUGHTER]
CHARLES STROSS: There's--
actually, in my case, there's a little closet in the
basement of the MIT Media Lab, and I have to report there for
reprogramming every two years.
[LAUGHTER]
CORY DOCTOROW: That's the one Aaron Swartz got busted for
sneaking into.
No.
It's the 21st century.
If you can't come up with cool science fiction ideas before
breakfast, you're not paying attention, you know?
[LAUGHTER]
[APPLAUSE]