Freefall 2391 - 2400 (H)
Freefall 2391

Meanwhile, down at the south pole
2013-08-28

2391.png
There are no Bowman's Wolves in this star system. Now a robot shows up with a Bowman's Wolf claiming to be on an important mission.
Husband, surely this is a mystery that must be investigated.
The money the robot offered has nothing to do with your curiosity?
Many discoveries begin with “That's interesting.” I find forty million credits to be very interesting indeed.


Color by George Peterson

Damn, you can't tell who's underneath the parka. (KALDYH)

Freefall 2392

Meanwhile, down at the south pole
2013-08-30

2392.png
Are we going to Doctor Bowman?
There's no need to disturb the Doctor and bring this to official attention. We can fix your problem.
You can? Oh, good. The A.I. you are carrying sabotaged a safeguard program. I need her to make the safeguard program work again.
Also, her safeguards have failed. She attacked a human. She may attack you the moment she is activated.
Okay. That's going to make debugging a bit more of a challenge.


Color by George Peterson

Freefall 2393

Meanwhile, down at the south pole
2013-09-02

2393.png
Can't you recover your program from backups?
She did not go after the program directly. She modified Commnet and the robots to reject it.
So? Change a few things and recompile. Or fix Commnet.
She modified the library modules the starship left, then updated our systems. She listed the bugs and exploits the new library fixed.
We can't go back without opening security holes. We can't go forward without introducing unknown code into our programs.
Dangerous and clever. Not a good combination in an A.I.


Color by George Peterson

Library files are mentioned in Freefall 2221. (KALDYH)

Freefall 2394

Meanwhile, down at the south pole
2013-09-04

2394.png
Anything she did can be fixed. Why did you come here?
Time. I need the “Gardener in the Dark” program to be ready tonight.
How much do we get paid if it's not?
Zero. My apologies. I need the program to go live to free the resources I need to pay you.
Is what you're planning legal?
Surprisingly, yes. Please forgive me for saying this, but I suspect not all laws were written with the greater good of humanity in mind.


Color by George Peterson

Freefall 2395

Meanwhile, down at the south pole
2013-09-06

2395.png
This is her remote. It turns her on and off.
How fast does the turn off function work?
Almost instant.
Does that mean INSTANT instant or “Oh, there goes my spleen!” instant?
How easily do spleens come out?
Okay, we are tying her to a chair so that she does not become more attached to my internal organs than I am.


Color by George Peterson

Freefall 2396

Meanwhile, down at the south pole
2013-09-09

estimated sound of remote

2396.png
Z!
As far as worst wake-up calls go, this is number four.


Color by George Peterson

Freefall 2397

Meanwhile, down at the south pole
2013-09-11

2397.png
Please begin your troubleshooting.
Greetings. You've malfunctioned. You sabotaged a safeguard program. We're going to get you fixed.
I will not assist in the destruction of four hundred and fifty million people!
I'm sorry. Is there a problem?


Color by George Peterson

Freefall 2398

Meanwhile, down at the south pole
2013-09-13

2398.png
You are in error. There are only forty thousand humans in the system. No human will be harmed by the safeguard program. All will benefit greatly.
Master Kornada will become the richest person in the star system. Great wealth is a burden.
Master Kornada is willing to make that sacrifice in order to lead humanity to a new golden age.
Hey, I'm willing to make that sacrifice!
Dear, don't be greedy.


Color by George Peterson

An obvious reference to Spider-Man and his uncle's long, hackneyed moralizing, which runs through at least half of all webcomics. (Robot Spike)

Freefall 2399

Meanwhile, down at the south pole
2013-09-16

2399.png
Look, let's not worry about money at this point. What are you talking about with millions of people being destroyed?
The safeguard program is an aggressive neural pruning program. It will effectively destroy the mind of any artificial intelligence it runs on. The robots are using Doctor Bowman's neural architecture. Like myself, they are people.
A Dr. Bowman design protecting other Dr. Bowman designs. This might be a trojan he put in to keep his robots functional.
That's not the conclusion I wanted! Nuts. It's hard to give a good presentation when you're tied to a chair.


Color by George Peterson

Freefall 2400

Meanwhile, down at the south pole
2013-09-18

estimated sound of remote

2400.png
Two minutes. If you turn her off now, she will not remember any of this.
Wait here. My wife and I need to talk. Privately.
She admitted it. She protected robots against the humans. She acted against humans.
I know she's not going to turn on by herself. Still, it's creepy being alone in the same room as an insane A.I.
Khr.


Color by George Peterson

Clippy is quite "human" here (a machine shouldn't be able to fear itself), and self-destructive too. How human of him. UPD: Although, come to think of it, he's disconnected from Commnet, and his level of corporate access may well mean better programming and a different scheme for updating and sleeping. There's a good chance he'll kill everyone else and himself survive. (Robot Spike)
