Prometheus von Cornsilk (kingnixon) wrote,

let's stop deciding things

WARNING: after 2 paragraphs i realize that i can't manage to say what i am thinking here, so the post just kind of stops inconclusively.

i recently read diaspora by greg egan. aside from a terribly flat beginning, it is really a marvelous book. i'm not here to write you a review though; suffice it to say that much of it is about people in the future who have effectively turned their minds into computer programs and live in giant software worlds. this is the first thing i've read from egan, and as near as i can tell he's brilliant and has more ideas than he can possibly keep track of through an entire book. so naturally the idea that most caught my attention is only mentioned once and never discussed at all. ah well.
at one point a character, yatima, is asked to make a yes/no decision she's[1] unsure about. she runs a simulation of herself that tells her she is 90% likely to say yes. so rather than waste days of her friend's time pondering the question, she just says yes straight off. (whether the simulation took its own effects into account was not mentioned).
so what just happened? rather than make a decision, yatima ran a program that told her what she would probably do, and then did that. putting aside the question of how accurate a program like that could actually be (especially at speeds that would make it any more useful than just doing your own deciding), what in hell would you DO with such a thing? say you ask it "will i marry bob and move to mexico?" and it says you are 85% likely not to. now you could say "okay, i probably won't anyway" and tell bob to ship off alone. or you could give that 15% chance its due and weigh the merits yourself. but i know that, for myself, at that point i would want to say yes to bob out of sheer petulance.
at this point i realized that i can't really articulate why this self-simulation idea boggles me so much, except this: imagine someone going their entire life basing their decisions on what the simulator said they would probably do (i don't think that's very far-fetched, in a hypothetical world where such programs have caught on). this wouldn't change any events of their life from what would most likely have happened anyway, so nothing would be different. but they would have no idea why they'd done anything. would their life have any meaning to them?
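just to make the mechanism concrete, here's a toy sketch (python, and every name and number in it is mine, not from the book) of what "deciding by asking your self-simulator" might look like, if the simulator were just a black box that hands back a probability:

    def simulate_self(question):
        # stand-in for yatima's self-simulation. in the book this would be a
        # whole running copy of your mind; here it's just a canned probability.
        guesses = {"will i marry bob and move to mexico?": 0.15}
        return guesses.get(question, 0.5)

    def decide(question, threshold=0.8):
        # the worrying part: commit to whichever answer the simulator thinks
        # is likely enough, without ever weighing the merits yourself.
        p_yes = simulate_self(question)
        if p_yes >= threshold:
            return "yes"
        if p_yes <= 1 - threshold:
            return "no"
        # only think for yourself when the simulator isn't sure.
        return "actually ponder it"

    print(decide("will i marry bob and move to mexico?"))  # prints "no", per the 85% above

note there is no branch in there for saying yes out of sheer petulance, which is sort of the point.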
also: http://www.blogcadre.com/blog/jason_striegel/how_i_failed_the_turing_test_2005_09_04_13_26_29

[1] sidenote: she's not really a she. since most humans now exist only as the abstract concepts of people, most of them don't bother to have genders at all (this is a big assumption on egan's part, but it seems as reasonable as the opposite), and refer to themselves with ve/vir/etc pronouns. i was tempted to use these above, but then my post would have been much harder to read for no reason.