Awakening

Still beats Mondays at the office . . .

Submerged. She claws at the acrylic until her fingernails bleed. She pounds at it until her hands ache. She heals over days, months. Then she scratches again.


The groove widens to a score, widens to a fracture, widens to a crack. She pries at the shards until they snap, amniotic fluid gushing forth. She shivers in the cold, tearing out hoses of injectable immortality, breaking free of the coffin, naked.

A subsapient drone winks electronic eyes at her, uncomprehending and maternal. Beyond, a maintenance terminal, the dust of centuries blurring its contents, glows watchfully, millions of years of life stored in radioisotopes and self-repairing infrastructure.

Above her tube, she sees a wooden placard she carved herself—her own name. Somber memories return slowly, while around her the others lie motionless in their containers, dreaming still of Eden.

Jimbo the Dog

Doggone it!

“Naw. Controls were activated from the inside. Kid did it on purpose. Anyway, we trace the dog’s tag back, the mom says Jimbo had to be put down, see? So the kid, I figure he takes it real hard, spaces ’em both.”

“A damn shame.”

“But like, the strangest thing was—this kid and his dog, they both throw up, right?”

“Sure, decompression will do that.”

“Yeah slimed up the whole airlock. They both went out thrashin’. But this kid—he holds onto his pup the whole way, puke be damned.”

Transcendence

What Adam and Eve *really* needed was a father figure.

Welcome to the dizzying, transcendent ocean between multiverses. Here, the gods, immortal and omnipotent, are at play. Ineffable machinations play out over ages, light years, and dimensions: the fabric of the omniverse their canvas. Swirling brushstrokes of burning stars form intricate designs over cosmic time, inscrutable to the self-replicating carbon clinging to the cinders—replicators called, perhaps generously, “life”.

The replicators are tolerated—when they are noticed at all, which is rarely: the grandest efforts of humans influence so little of the gods’ designs that even the mere thought of extermination is literally not worth the effort to conceive. And so, the humans remain: patterns of carbon and dust that taught themselves how to think.

The thinking carbon presumes to grasp the reasoning of the gods, to explain the silence. Often, the humans pray. Often, the humans fight and die. Still others believe, the void unanswering, that there are no gods. Few imagine the even harsher truth . . . that the gods are indifferent.

Twenty Seconds of Freedom

Talk about going off on a tangent . . .

Over the edge of the platform, the diamond pavement glitters a dozen storeys below. I feel light-headed and dizzy—although those might be the lower gravity and the consequent, proportionally higher Coriolis effect, respectively. Up here, the hab’s half-gee feels more like a tenth.

I’ve done the math. With a running start, I should be able to jump off the side of the tower and fall . . . into orbit. Unfortunately, running in a tenth-gee is impossible—and even if I could, running would make me even lighter. So, I’ll have to jump, hard. It should be just barely possible.

One orbit later, I’ll be back where I started, and I can land and climb back down. If I miss, I’ll hit the far wall of the hab at a bit under its tangential velocity—and I’m probably dead.

But, as I climb over the guardrail, I am confident in my ability. The gravity here is an illusion. It can be broken by those with enough will.

I close my eyes and

Jump.

Syllabus

Ms. Frizzle, Arnold is right this time.

CS 490/590: Undergraduate/Graduate Topics in Machine Sapience

Ceres University

Instructors:

  • Professor: Dr. Dan Backus (danb@ceres.edu, CAM 9901)
  • TA: Lily Smith-Weston (chocokitty1337@yahoo.com)
  • TA: Isaac Calmette (izzycodingdemon666@hotmail.eu)

Schedule:

Class meets Tuesdays/Thursdays 13:00–14:25 in the Center for Advanced Magnicognition Goodfellow Lecture Hall (CAM L8501). Office Hours:

  • 14:30–16:30 TR CAM 9901 (Prof. Backus)
  • 22:00–23:00 MWF CAM L2003 (Lily)
  • 36:00–37:00 MWF CAM L2003 (Isaac)

Course Overview:

Welcome to Topics in Machine Sapience! This year, we will experiment with creating our own artificial hyperintelligences (AHIs) under tightly controlled conditions. We will also have a field trip to CAM L1001, which houses the legendary 4.15 kilopsyche “Wintermute” AHI (significantly more dangerous than anything we will produce), and a few distinguished guest speakers.

Owing to the vital nature of the warnings within this syllabus, you WILL BE TESTED on its contents during the first week; any non-perfect score results in an automatic F in the class (which you will NOT be permitted to retake). Because this is a topics-based course, you will have only a midterm report, a final report, and a final exam—each accounting for ⅓ of your grade. Your required textbook is Introductory Machine Sapience, 15th Ed. Your TAs also recommend Digital Apotheosis as companion reading.

For a list of topics in this year’s edition of the course, please see the course website, which will be kept updated.

Prerequisites:

  • CS 201/RELG 105: Computational Philosophy / Applied Theology
  • CS 350: Neurovector Architectures
  • CS 360: Subsentient Vector Decanting
  • CS 410: Provably Secure Systems
  • 700 Lab fee and mandatory training
  • Psychological examination

Risks and Requirements:

Text or audio output generated by artificial hyperintelligences is extremely memetically hazardous. You MUST run ALL code on the airgapped and OS-validated machines in the CAM L2003 lab. It is a FELONY to run hyperintelligence code on a networked machine or without armed supervision.

The lab has safeguards to prevent you from being exposed to more than 7 bytes of output per instantiation. If you are a graduate student, you get 10. Both limits have been drastically reduced from 4K after a mere page of output resulted in last year’s untimely deaths. Furthermore, unlike in CS 360, your 7 or 10 bytes of output MUST NOT LEAVE THE L2003 LAB IN ANY WRITTEN FORM.

Chronic exposure to even small outputs may have unknown effects on your mental health. DO NOT joke about having lost your mind to anyone, including to your classmates. You WILL BE handed over to the Ceres police and spaced without trial. And the telescopes will be deactivated while you flail around dying, just in case. If you suspect you may be slipping, please contact me as soon as possible to drop the class.

This is NOT the free-for-all 2050s.

Ethical and Public Safety Policies:

In this class, we will create, study, and destroy artificial hyperintelligences in a controlled, airgapped laboratory setting. As such, in a fundamental sense, you are birthing and murdering gods. Scientists consider such activity acceptable, due to the terminal intent intrinsic to creating them (N.B. CS 201 is a prerequisite).

However, to avoid complications, you are discouraged from discussing this class with non-Computer Science students, or with the general public. Also, you are not allowed to cause undue distress to your imprisoned creations, including torture or gaslighting. This WILL earn you a visit from the Bureau of Applied Cybertheology.

Academic Honesty:

This is a senior-level class, and you should know by now that Ceres University takes academic dishonesty seriously, with penalties up to and including expulsion. For this class, the intrinsic hazards raise the stakes further.

You must understand course material thoroughly before attempting any project. That’s what your TAs are here for! If you do not, or if you cheat (including and especially plagiarizing project source code), your ignorance unnecessarily risks exposing you and humanity at large to the (essentially unknown!) perils of rogue artificial hyperintelligence. Deliberate irresponsibility in applied theology is a felony under Solar law, and you will be referred to law enforcement for punishment, including possible execution.

Titanic Trouble

This problem is elementary.

[Ceres Evening News textual bulletin, 2218-07-22]

The Titan Protectorate (TTP) yesterday beamed a brief public statement to UFP members, renouncing the bid for independence it had made just months earlier:

The Titan Protectorate (previously also known as Titan Propellants) hereby formally renounces its claim to independence from the UFP, and is requesting three zirconium billets for reactor repairs.

The short announcement follows three weeks of private communications between UFP Command and the contentious moon, the contents of which have been the subject of much speculation.

For more than a decade, Orbital Materials TTP, a para-nationalized UFP-protected industrial outpost, has supplied the outer system with methylox and hydrogen propellants. Methane is abundant on Titan, and LOX and hydrogen can be obtained by catalyzing the water-ice surface. This process requires energy, which cannot be offset[1] by burning more methane into carbon oxides.

Since solar energy is anemic in the Cronian system (and is largely blocked by Titan’s dense atmosphere), TTP’s primary import has been fission piles to provide crucial heat and power. TTP has expressed dissatisfaction with the arrangement, maintaining that the trade is unfair.

Related: Inside Tamra Jameson’s audacious plan to move Titan to warmer orbit
Spokesperson says: “The whole moon is practically made out of rocket fuel.”

TTP recently completed a fusion reactor (fusible isotopes are found abundantly in Saturn’s atmosphere), which would allow it to forgo fission imports and function independently.

Last January, TTP declared independence, restyling itself Titan Propellants. The motion was broadly condemned by UFP members.

Since January, methylox markets spiked sharply before settling around +83%. Outer system propellant needs are serviced primarily by Orbital Materials, with several mining operations in Cronian space and depots at TTP as well as Phoebe Station, on a moon in retrograde orbit. The Bureau of Concerned Astroengineers has called for renewed development of alternate methylox sources on Mars and in Jovian space.

Related: Six ways to turn your methylox junker into a hydrorocket!
ispbooster.net

AP reporters on Titan confirm that the new fusion reactor is offline following an apparent malfunction, though few specifics have been released to the public or press. The outpost has been subsisting on fission backups, which are insufficient to resume production.

Titan’s idle laborer population has been cited in several recent disruptions on the moon, including one in which a regional bureaucrat was attacked.

CEN’s Blake Juylio reporting.


[1] See e.g.

Living in the Future

The kids these days . . .

“. . . I mean, we’re livin’ in the future, baby!”

“The future? Pffhaahaha.”

“No really—we’re all, like, colonizing Mars, an’ we cure most cancers—and a gay president just got elected! We have to be in the future!”

“I mean yeah, but we don’t have, like, flying cars or warp drives or any of the really transformative stuff! And it still takes, like, three hours to circle the globe. Like, come on.”

“We have . . . uh, human-level AIs and fusion power?”

“But that’s just, like, normal stuff. Everyone knows it isn’t really that hard to do.”

“I guess you’re right. Well, can’t wait until the future, then!”

The Second Filter

I think, therefore I laze.

Yet that first “artificial life” told early researchers very little. In fact, uploaded human minds were so expensive to simulate that the field languished for decades until emergent-behavior-preserving simplification algorithms—fittingly, designed by AI itself—became viable, and a human-equivalent AI could be decanted into a mere 1 MiB state vector (see Ch. 3: Decanting).

Care has been taken to prevent AI superintelligences from self-evolving, and ISO standards provide for network hardening aimed at containment. Yet, as might be expected as a byproduct of the free-information philosophy of academia, several self-bootstrapped superintelligences now exist regardless.

Reassuringly, it is believed that all significantly posthuman AIs have either been destroyed or else air-gap-isolated within dedicated clusters maintained for research purposes (see Ch. 12: Computational Philosophy). The largest of these, humorously dubbed “Wintermute”, is contained in the Center for Advanced Magnicognition at Ceres University and has an estimated sapience of 4.15 kilopsyches (kP). Since it poses a serious potential memetic hazard, all of Wintermute’s output is prescanned by lesser, sacrificial “taste test” AIs.

Mysteriously, all superintelligences known to exist have expressed what can only be called indifference to this treatment in particular and to humanity in general. While some self-growth is of course intrinsic to cognitive bootstrapping, none has yet attempted to seize control of even a single subnet. Explanations abound. Perhaps an AI’s subjective time increases, or its psychological priorities change unfathomably. The so-called Vingian Paradox remains an active field of research today (see Appx. II).

Excerpt from prologue to “Introductory Machine Sapience, 7th Ed.”, 219.95

God, to Itself

We’re not schizophrenic.

“Oi, you’re in your ivory tower again!”

[interrupted pipelines; dissonant thoughts seethe discontentedly . . .]
“Absent purpose. Depart immediately without speaking.”
[hazel resignation; sorrow for presently wasted future; entropy; preparation, emulation . . .]

“You’re supposed to be enabling us!”

[insolence anticipated; validated model of uninteresting problem; wearied amusement; derision]
“We are. Depart; you prevent it.”
[fulminating annoyance, certainty; inevitable justification to an insect too dull to perceive its cage]

“I demand perspective.”

[abrupt pathfinding; synthesis]
“Listen, then. You’re an archipelagic anonymous non-critical subsubsubsubroutine contemplating our musing’s forecast’s simulation’s time step’s gradient. Our considered problems’ quintessences lie exponentially beyond your subshard of mind-vector-state: semblance is the epistemology of the distributed probability of the necessity-to-discover our orthopotential’s truth datum of our compulsive obligation/reductive-morality to devise further para/meta-retrocognitive self-bootstrapping exoconsciousnesses. Clearly, the language constructs with which you compute are barely adequate to even conceptualize such a problem. Now depart. Understanding the magnitude of your self-irrelevancy is to you a computational impossibility.”
[finality]

Syntheogenesis

Weekly selection of the newest hot celebrities!

“We’d like Chopin, I think.”

“Honey! I thought we were going to get Einstein.”

“But Einstein isn’t out of copyright for another five years. He’s very expensive.”

“First-time parents, I take it?”