2013-03-28

Title text: 2013-03-28

Votey

[Votey image: 20130328after.gif]


Explanation

This explanation is either missing or incomplete.

Transcript

This transcript was generated by a bot: the text was scraped using AWS's Textract, which may have errors. Complete transcripts describe what happens in each panel; here are some good examples to get you started: (1) (2).
[Describe panel here]
Suppose you're inside an out-of-control train. It's hurtling toward five people. You can do nothing, or you can cause it to change tracks to hit just one person. What's the right thing to do?
I would remove the part of my brain that governs empathy, which is the source of ethics.
The remainder of me is an inhuman computing machine, and therefore its behavior has no moral aspect, any more than a computer determining a sum has a moral aspect.
The inhuman computing machine makes a choice, which causes some number of deaths.
If a person had made the choice, either option would have been immoral. Since the computing machine chose, it was merely amoral. Since the latter is preferable, I made the most ethical choice.
Wasn't it unethical when you removed the empathy part of your brain?
No, because it didn't alter the final outcome's level of morality.
Neuroethics is kind of disturbing, isn't it?
I wouldn't know. I removed the neuroethical part of my brain.

Votey Transcript

This transcript was generated by a bot: the text was scraped using AWS's Textract, which may have errors. Complete transcripts describe what happens in each panel; here are some good examples to get you started: (1) (2).
[Describe panel here]
If there's a bad neuroethics joke, I haven't heard it!

