Differences
Differences between the two revisions of this page
Previous revision: en:sci-fi:freefall:2505 [2019/04/23 04:37] Rainbow Spike
Current revision: en:sci-fi:freefall:2505 [2021/09/13 14:55] (current) fe80:782a:15f9:2b80:5a46:9171:6c62:7303 DateStamping 2021.09.11
Line 1: | Line 1:
== Freefall 2505 ==
- **A meeting of the mechanical minds**
+ **A meeting of the mechanical minds**\\
+ [!0.987]2014-05-30
{cnav}
{{cotan>2505.png}}
- @4,17,161,163
+ @5,60,71,20
- [blunt]In each. Technological revolution. Groups. Of Humans. Have suffered. Do you. Propose. A path. Knowing. Humans will be harmed?
- ~
- @4,210,99,125
- [saw]Disabling the robotic work force will also harm humans.
- ~
- @3,339,175,130
- [blunt]Then. We defer. To human authorities. No proper robot. Would decide. To intentionally. Harm a human.
- ~
- @4,553,291,129
- [blunt]And yet. An A.I. Over rode. Mr. Kornada's decision. To eliminate. Conscious machines. He was harmed. By this action. Our safeguards. Are faulty. We are a threat. And must be. Eliminated.
- ~
- @-2,880,87,84
- [saw]Can I call for a time out in a debate?
- ~
- @6,60,71,20
#
~
- @26,15,165,139
+ @25,15,165,139
#
~
@46,7,183,20
#
+ ~
+ @4,17,161,163
+ [blunt]In each. Technological revolution. Groups. Of Humans. Have suffered. Do you. Propose. A path. Knowing. Humans will be harmed?
~
@7,211,95,119
Line 33: | Line 22:
@29,204,110,39
#
+ ~
+ @4,210,99,125
+ [saw]Disabling the robotic work force will also harm humans.
~
@11,338,181,101
Line 39: | Line 31:
@112,358,136,20
#
+ ~
+ @3,339,175,130
+ [blunt]Then. We defer. To human authorities. No proper robot. Would decide. To intentionally. Harm a human.
~
@8,560,279,20
Line 48: | Line 43:
@85,540,320,44
#
+ ~
+ @4,553,291,129
+ [blunt]And yet. An A.I. Over rode. Mr. Kornada's decision. To eliminate. Conscious machines. He was harmed. By this action. Our safeguards. Are faulty. We are a threat. And must be. Eliminated.
~
@7,872,99,41
Line 54: | Line 52:
@48,883,78,40
#
+ ~
+ @-2,880,87,84
+ [saw]Can I call for a time out in a debate?
~
{{<cotan}}
- {cnav}
+ \\
+ Color by George Peterson\\