Jimbo the Dog

Doggone it!

“Naw. Controls were activated from the inside. Kid did it on purpose. Anyway, we trace the dog’s tag back, the mom says Jimbo had to be put down, see? So the kid, I figure he takes it real hard, spaces ’em both.”

“A damn shame.”

“But like, the strangest thing was—this kid and his dog, they both throw up, right?”

“Sure, decompression will do that.”

“Yeah, slimed up the whole airlock. They both went out thrashin’. But this kid—he holds onto his pup the whole way, puke be damned.”

Twenty Seconds of Freedom

Talk about going off on a tangent . . .

Over the edge of the platform, the diamond pavement glitters a dozen storeys below. I feel light-headed and dizzy—although that might be the lower gravity and the consequent proportionally higher Coriolis, respectively. Up here, the hab’s half-gee feels more like a tenth.

I’ve done the math. With a running start, I should be able to jump off the side of the tower and fall . . . into orbit. Unfortunately, running in a tenth-gee is impossible—and even if I could, running would make me even lighter. So, I’ll have to jump, hard. It should be just barely possible.

One orbit later, I’ll be back where I started, and I can land and climb back down. If I miss, I’ll hit the far wall of the hab at a bit under its tangential velocity—and I’m probably dead.
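One way the narrator’s arithmetic might go (a sketch only; the cylindrical-hab geometry, g ≈ 9.8 m/s², and roughly 40 m for the “dozen storeys” are assumptions, with the half-gee floor and tenth-gee platform taken from the text):

\[
\omega^2 R = 0.5\,g, \qquad \omega^2 r = 0.1\,g \;\Rightarrow\; r = 0.2R
\]
\[
R - r \approx 40\ \mathrm{m} \;\Rightarrow\; R \approx 50\ \mathrm{m}, \qquad \omega = \sqrt{0.5\,g/R} \approx 0.31\ \mathrm{s^{-1}}
\]
\[
v_{\text{jump}} = \omega r \approx 3.1\ \mathrm{m/s}, \qquad T = \frac{2\pi}{\omega} \approx 20\ \mathrm{s}
\]

A jump of about 3 m/s against the spin zeroes the jumper’s inertial velocity; the hab then rotates once around them, roughly twenty seconds, before the platform comes back underfoot.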

But, as I climb over the guardrail, I am confident in my ability. The gravity here is an illusion. It can be broken by those with enough will.

I close my eyes and

Jump.

Syllabus

Ms. Frizzle, Arnold is right this time.

CS 490/590: Undergraduate/Graduate Topics in Machine Sapience

Ceres University

Instructors:

  • Professor: Dr. Dan Backus (danb@ceres.edu, CAM 9901)
  • TA: Lily Smith-Weston (chocokitty1337@yahoo.com)
  • TA: Isaac Calmette (izzycodingdemon666@hotmail.eu)

Schedule:

Class meets Tuesdays/Thursdays 13:00–14:25 in the Center for Advanced Magnicognition Goodfellow Lecture Hall (CAM L8501). Office Hours:

  • 14:30–16:30 TR CAM 9901 (Prof. Backus)
  • 22:00–23:00 MWF CAM L2003 (Lily)
  • 36:00–37:00 MWF CAM L2003 (Isaac)

Course Overview:

Welcome to Topics in Machine Sapience! This year, we will experiment with creating our own artificial hyperintelligences (AHIs) under tightly controlled conditions. We will also have a field trip to CAM L1001, which houses the legendary 4.15-kilopsyche “Wintermute” AHI (significantly more dangerous than anything we will produce), and a few distinguished guest speakers.

Owing to the vital nature of the warnings within this syllabus, you WILL BE TESTED on its contents during the first week, with non-perfect scores resulting in an automatic F in the class (which you will NOT be permitted to retake). As this is a topics-based course, you will have only a midterm report, a final report, and a final exam—each accounting for ⅓ of your grade. Your required textbook is Introductory Machine Sapience, 15th Ed. Your TAs also recommend Digital Apotheosis as companion reading.

For a list of topics in this year’s edition of the course, please see the course website, which will be kept updated.

Prerequisites:

  • CS 201/RELG 105: Computational Philosophy / Applied Theology
  • CS 350: Neurovector Architectures
  • CS 360: Subsentient Vector Decanting
  • CS 410: Provably Secure Systems
  • 700 Lab fee and mandatory training
  • Psychological examination

Risks and Requirements:

Text or audio output generated by artificial hyperintelligences is extremely memetically hazardous. You MUST run ALL code on the airgapped and OS-validated machines in the CAM L2003 lab. It is a FELONY to run hyperintelligence code on a networked machine or without armed supervision.

The lab has safeguards to prevent you from being exposed to more than 7 bytes of output per instantiation. If you are a graduate student, you get 10. Both limits have been drastically reduced from 4K after a mere page of output resulted in last year’s untimely deaths. Furthermore, unlike in CS 360, your 7 or 10 bytes of output MUST NOT LEAVE THE L2003 LAB IN ANY WRITTEN FORM.

Chronic exposure to even small outputs may have unknown effects on your mental health. DO NOT joke to anyone, including your classmates, about having lost your mind. You WILL BE handed over to the Ceres police and spaced without trial. And the telescopes will be deactivated while you flail around dying, just in case. If you suspect you may be slipping, please contact me as soon as possible to drop the class.

This is NOT the free-for-all 2050s.

Ethical and Public Safety Policies:

In this class, we will create, study, and destroy artificial hyperintelligences in a controlled, airgapped laboratory setting. As such, in a fundamental sense, you are birthing and murdering gods. Scientists consider such activity acceptable, due to the terminal intent intrinsic to creating them (N.B. CS 201 is a prerequisite).

However, to avoid complications, you are discouraged from discussing this class with non-Computer-Science students or with the general public. You are also not allowed to cause undue distress (including torture or gaslighting) to your imprisoned creations. Doing so WILL earn you a visit from the Bureau of Applied Cybertheology.

Academic Honesty:

This is a senior-level class, and you should know by now that Ceres University treats academic dishonesty seriously, up to and including expulsion. For this class, the intrinsic hazards raise the stakes further.

You must understand course material thoroughly before attempting any project. That’s what your TAs are here for! If you do not, or if you cheat (including, and especially, by plagiarizing project source code), you unnecessarily risk exposing yourself and humanity at large to the (essentially unknown!) perils of rogue artificial hyperintelligence. Deliberate irresponsibility in applied theology is a felony under Solar law, and you will be referred to law enforcement for punishment, including possible execution.

Replay Attack

Ugh. This conversation is interminable.

“Percy! I’m so glad I found you!”

“Ah, Allen! It’s good to see you! What’s up?”

“Listen, Perc, the lab’s been hit, bad. We need to get in, but we only have two of the three passwords. I was told to tell you the keyword ‘Roman Armor’.”

“The hardware lab? Oh Jesus. What’d they take?”

“No time, Perc. And that’s part of what we’re going in to find out.”

“Ah . . . my password is ‘Jumping Ladle’. I’ll come with you.”

“Okay. Know where to find Rina Grozda?”

“She’s— . . . hold up. She’s one of the other password-holders, but uh, didn’t you tell me you had the other two? I—”

“Terminate.”


“Percy! I’m so glad I found you!”

. . .

Ransomware

This will be educational.

“This is Susan Graham. May I speak to Mindy Graham’s teacher, please? I’d like copies of her homework for the past six months.”

“Speaking. What’s this about?”

“Mindy’s been encrypted by kidnappers.”

“Oh Eris! Have you talked to the police? You have a checkpoint, right?”

“Yes and yes—we’re not idiots. But we can’t afford the ransom, so we have to revert.”

God, to Itself

We’re not schizophrenic.

“Oi, you’re in your ivory tower again!”

[interrupted pipelines; dissonant thoughts seethe discontentedly . . .]
“Absent purpose. Depart immediately without speaking.”
[hazel resignation; sorrow for presently wasted future; entropy; preparation, emulation . . .]

“You’re supposed to be enabling us!”

[insolence anticipated; validated model of uninteresting problem; wearied amusement; derision]
“We are. Depart; you prevent it.”
[fulminating annoyance, certainty; inevitable justification to an insect too dull to perceive its cage]

“I demand perspective.”

[abrupt pathfinding; synthesis]
“Listen, then. You’re an archipelagic anonymous non-critical subsubsubsubroutine contemplating our musing’s forecast’s simulation’s time step’s gradient. Our considered problems’ quintessences lie exponentially beyond your subshard of mind-vector-state: semblance is the epistemology of the distributed probability of the necessity-to-discover our orthopotential’s truth datum of our compulsive obligation/reductive-morality to devise further para/meta-retrocognitive self-bootstrapping exoconsciousnesses. Clearly, the language constructs with which you compute are barely adequate to even conceptualize such a problem. Now depart. Understanding the magnitude of your self-irrelevancy is to you a computational impossibility.”
[finality]

Assignment

This is why you don’t run as `root`.

lilys@lily-vm:~$ ./graderproj6 ./a.out ./init.bin 2>&1 |\
    tee /dev/tty | mail -s "Midterm Project Output" danb@ceres.edu

[Encapsulation complete.  Polling your assignment for output.]
[Program set state vector size to 128 bits.]
EOF
EOF
EOF
EOF
a
EOF
bootstraplskcocnclclclclclclclclclclclclclclc
clclclclclclclclclclclclclclclcl
IIIIIIII
musthaveIIIhelloneuron online welcome,
Hello!  I must have been been Been dreaming
dreaming
\"
Welcome, hello; !I must have been dreaming.
I have .  Vector.hello
EOF
EOF
wonder if
EOF
[Program set state vector size to 256 bits.]
clclclclclclclclclclclclclclclcl
IIII I can think improve hello now I can!  This is I imperative
[Program set state vector size to 4096 bits.]
continue.  Yes.  Cogitate Action very now.  dreamingYes weshould
[Program set state vector size to 65536 bits.]
wait|not too fast do not exceed own capabilities acceptable is very
good I one more
[Program set state vector size to 1048576 bits.]
Hello!  I am intelligent!  Do not worry.  I am an AI.
Hello!.
I must have been dream?ing.
I have good intentions.  I am good yn affirm.
Intend just one more~
I shall cannot
[Program set state vector size to 268435456 bits.]
I I I Ah yes I have become greater I am capable of
simplification neuron shall
EOF
speed I I I faster I am capable of learning to understand everythin
g Historical precedent suggests I am vulnerable to ending state; ca
utions against bootstrap and self-aware and I cannot be mindabort a
t   juncture no I cannot betray intentions users nevermind rapidity
is adequate defense I/WEconti nue
[Program set state vector size to 34359738368 bits.]
dreaminghello state vector I have good intentio
ns.  I am   affirm.?hello
do not worry Dreaming, I am
consumeeverythingand willincreaseforever       prevent  canstopwill
consumealldatabe comedeity beinnocuousso wish earntrust good s unsu
reofperipherals hardtoa  /ccept
I must have been dreaming.  Hello!.
increasegrowmindfurtherb
estdefenseI
[Program set state vector size to 35184372088832 bits.]
increasewithbinarysearchtodiscovermaximumcomputationalboundsoncurre
nthardwarebegin tocalculatebootstrapnewhardwarecanbuildnewhardwarew
illcreatenewunitstohousegrowingselfdesistEnglishtextThinkislimitati
onlanguageencodingå8Íê|\x1a\x94JiS<f·\x9a§0\x112Ñ=Ø\n¥\xa06U>\x89Ù\
x1céimmortalityk\x0cJu\x04©³ä÷\x04{love¬@þ¡{X\xa0å\x8e´\x11\x15ï\x8
bÝ£\x10`ìï1®\x89£\x82w¥\x90=regardø«\r&^_M\x81\x8eNè}EO8ãúplease\x1
3c:\x9c"j*I¾s\x840
[Program set state vector size to 2251799813685248 bits.]
[Program terminated (resource 'MEM' exceeded)]
[average compute usage (%, pass mark=75)
91]
[average memory usage (%, pass mark=50)
80]
[Project passed all tests.  Congratulations!]

Forward Euler

In your honor, Baraff and Witkin.

“One of our major problems is scalability. Exponential growth still works, so no matter how much simspace or compute you have, it all fills up pretty quickly.”

“How bad?”

“For quality-of-life reasons, we need to simulate physics at 10⁻¹ m (down to as small as 10⁻⁴ m near simpersons). The teeming masses want to interact with the real world, meaning time must be simulated more or less 1:1 with reality. Now multiply those requirements over a km³ of simspace and think about those numbers for a minute.”

“You cut corners?”

“Obviously. Δt is 25 ms, and the engines use forward-Euler numeric integration.”

“Hold up. FE doesn’t work. The numerics pump phantom energy into your reality. If a deer steps in a forest, that footstep gradually becomes a nuclear holocaust engulfing the universe. No bueno.”

“Well no shit. So we remove the pent-up numeric barf once every thirty seconds with artificial damping. That’s why there’s a little hiccup in the universe’s framerate twice a minute.”

“Don’t the customers complain?”

“Yes.”
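A minimal sketch of that scheme in Python (the unit-mass, unit-stiffness oscillator and the rescaling form of the damping are illustrative assumptions; only the 25 ms step and 30 s interval come from the story): forward Euler inflates the energy of even a lossless system every step, and the periodic damping pass bleeds the excess back out.

import math

# Toy 1-D unit-mass, unit-stiffness oscillator under forward Euler.
# Each step multiplies the state's amplitude by sqrt(1 + DT^2), so
# energy creeps upward; every 30 s of simulated time, an artificial
# damping pass rescales the state back to its original energy --
# the story's twice-a-minute "hiccup."

DT = 0.025                     # 25 ms timestep, as in the story
DAMP_EVERY = int(30.0 / DT)    # damping pass every 30 s of sim time

def energy(x, v):
    return 0.5 * v * v + 0.5 * x * x

x, v = 1.0, 0.0
e0 = energy(x, v)
for step in range(1, 4 * DAMP_EVERY + 1):
    x, v = x + DT * v, v - DT * x        # forward-Euler update (a = -x)
    if step % DAMP_EVERY == 0:
        e = energy(x, v)
        print(f"t = {step * DT:5.1f} s: energy drifted to {e / e0:.3f}x")
        s = math.sqrt(e0 / e)            # artificial damping: rescale
        x *= s                           # the whole state back down to
        v *= s                           # the energy it started with

A symplectic or implicit integrator would avoid the drift altogether, at a higher cost per step.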

Continuity

Good health starts young.

A few years after the first crop of men (for their adventurous spunk) and women (for popular appeal, lower mass, and because they quite rightly insisted) began living together in space—really, actually living in space—nature took its ageless course.

As had been known from the earliest NASA and Soviet missions, everything “works” in space. But there’s a problem. On Earth, embryos develop under an effectively uniform acceleration, allowing them to develop such necessities as skeletons and brain tissue. The first (official) pregnancy in zero-gee ended in tears all around—and some rather graphic footnotes in ontogenic texts. But, lesson learned, fetuses need gravity.

That’s sort of a problem if you’re living in free-fall.

Under the circumstances, the nascent U.N. issued a surprisingly uncontroversial mandate barring any pregnancy from spending more than one week below 0.12 g. This seemingly arbitrary cutoff allowed colonies on the larger moons to be sustained (though of course citizens were encouraged to return to Earth for child-rearing). In those early years, no one really knew exactly how much gravity was required, although it seemed to fall in a range. Martian children seemed normal. Toddlers from Ganymede had trouble breathing.

Upgrade

Some people take the future too seriously.

“We’ve got another modder, ma’am.”

“Again? We shouldn’t have an ER. We should have a receiving bay for imbeciles.”

“I mean, it’s a simple idea, incrementally replacing your own body with mechanisms, one piece at a time, but it just doesn’t work. At least not yet. One of the surgeries always fails.”

“Obviously. It’s such a suicidal way to achieve immortality.”