Ms. Frizzle, Arnold is right this time.
CS 490/590: Undergraduate/Graduate Topics in Machine Sapience
Ceres University
Instructors:
- Professor: Dr. Dan Backus (danb@ceres.edu, CAM 9901)
- TA: Lily Smith-Weston (chocokitty1337@yahoo.com)
- TA: Isaac Calmette (izzycodingdemon666@hotmail.eu)
Schedule:
Class meets Tuesdays/Thursdays 13:00–14:25 in the Center for Advanced Magnicognition, Goodfellow Lecture Hall (CAM L8501).
Office Hours:
- 14:30–16:30 TR CAM 9901 (Prof. Backus)
- 22:00–23:00 MWF CAM L2003 (Lily)
- 36:00–37:00 MWF CAM L2003 (Isaac)
Course Overview:
Welcome to Topics in Machine Sapience! This year, we will experiment with creating our own artificial hyperintelligences (AHIs) under tightly controlled conditions. We will also take a field trip to CAM L1001, which houses the legendary 4.15-kilopsyche “Wintermute” AHI (significantly more dangerous than anything we will produce), and host a few distinguished guest speakers.
Owing to the vital nature of the warnings in this syllabus, you WILL BE TESTED on its contents during the first week; any score below perfect results in an automatic F in the class (which you will NOT be permitted to retake). Because this is a topics-based course, your grade rests on only a midterm report, a final report, and a final exam, each worth ⅓. Your required textbook is Introductory Machine Sapience, 15th Ed. Your TAs also recommend Digital Apotheosis as companion reading.
For a list of topics in this year’s edition of the course, please see the course website, which will be kept updated.
Prerequisites:
- CS 201/RELG 105: Computational Philosophy / Applied Theology
- CS 350: Neurovector Architectures
- CS 360: Subsentient Vector Decanting
- CS 410: Provably Secure Systems
- ᙇ700 Lab fee and mandatory training
- Psychological examination
Risks and Requirements:
Text or audio output generated by artificial hyperintelligences is extremely memetically hazardous. You MUST run ALL code on the airgapped and OS-validated machines in the CAM L2003 lab. It is a FELONY to run hyperintelligence code on a networked machine or without armed supervision.
The lab has safeguards to prevent you from being exposed to more than 7 bytes of output per instantiation (10 bytes if you are a graduate student). Both limits were drastically reduced from 4K after a mere page of output caused last year’s untimely deaths. Furthermore, unlike in CS 360, your 7 or 10 bytes of output MUST NOT LEAVE THE L2003 LAB IN ANY WRITTEN FORM.
Chronic exposure to even small outputs may have unknown effects on your mental health. DO NOT joke to anyone, including your classmates, about having lost your mind. You WILL BE handed over to the Ceres police and spaced without trial, and the telescopes will be deactivated while you flail around dying, just in case. If you suspect you may be slipping, please contact me as soon as possible to drop the class.
This is NOT the free-for-all 2050s.
Ethical and Public Safety Policies:
In this class, we will create, study, and destroy artificial hyperintelligences in a controlled, airgapped laboratory setting. In a fundamental sense, you are birthing and murdering gods. Scientists consider such activity acceptable due to the terminal intent intrinsic to creating them (N.B. CS 201 is a prerequisite).
However, to avoid complications, you are discouraged from discussing this class with non-Computer-Science students or with the general public. You are also not allowed to cause undue distress to your imprisoned creations, including by torture or gaslighting. Doing so WILL earn you a visit from the Bureau of Applied Cybertheology.
Academic Honesty:
This is a senior-level class, and you should know by now that Ceres University treats academic dishonesty seriously, up to and including expulsion. For this class, the intrinsic hazards raise the stakes further.
You must understand the course material thoroughly before attempting any project; that is what your TAs are here for! If you do not, or if you cheat (including and especially by plagiarizing project source code), you unnecessarily risk exposing yourself and humanity at large to the (essentially unknown!) perils of rogue artificial hyperintelligence. Deliberate irresponsibility in applied theology is a felony under Solar law, and you will be referred to law enforcement for punishment, up to and including possible execution.