Website Restoration Project
This archiving project is a collaboration between Unfiction and Sean Stacey (SpaceBass), Brian Enigma (BrianEnigma), and Laura E. Hall (lehall) with
the Center for Immersive Arts.
Announcements
This is a static snapshot of the
Unfiction forums, as of
July 23, 2017.
This site is intended as an archive to chronicle the history of Alternate Reality Games.
 
All times are UTC - 4 (DST in action)
 Forum index » Archive » Archive: The Haunted Apiary (Let Op!) » The Haunted Apiary (Let Op!): General/Updates
Worth a second listen
Page 3 of 10 [142 Posts]
thebruce
Dances With Wikis


Joined: 16 Aug 2004
Posts: 6899
Location: Kitchener, Ontario

vector wrote:
Also, I have been having problems with the concept of lumping AIs in with programs. To me a program is something that can be coded and pieced together by hand with lines and lines of code, but our brains do not work that way. I don't see how an AI created from a brain that works more on relationships than 1s and 0s...


Here's the gist of it (working entirely on a sci-fi basis Smile): no, an AI on Melissa or Cortana's level isn't a 'program' that someone coded to work the way it does. BUT, in that timeline, a person, persons, or organization created a method to convert a mapping of a human brain into a highly complex software system. Define 'program' how you will, but from a technical standpoint, an AI is still a system of algorithms and subroutines: far too complex for us to comprehend or decode or debug, but that doesn't change the fact...

This is precisely the moral question that was posed throughout Star Trek in dealing with Data. For all intents and purposes, he was a person, but he was a machine, his neural network programmed initially to learn and grow on its own, so that he would be his own entity, build his own character, his own personality... but in the end, he's still a machine, running on hardware programmed with a foundational system from which an entity would emerge. So the moral question is: is Data alive? That's the same question being posed by ILB: is Melissa/Yasmine/Durga alive?

Quote:
...genetics, and how a smell triggers a memory that relates to a good feeling, translate over to a program? An AI to me is a digital construct that is able to understand and react to the world on a higher level of consciousness than a program built from if/then statements.

An AI built from if/then statements at the level of Melissa or Cortana would take lifetimes to program. Which is precisely why they developed a way to forgo all of that and utilize already-existing 'code' by copying a brain: translating neural pathways into digital subroutines. But the formula was created by humans. Whoever made that initial system to create the AI programmed the parameters that were used to define the AI. That included the flexibility, so that just as every fingerprint and every snowflake is different, every AI cloned from a brain would be fundamentally unique. But even so, those AIs were created on the basis of a fundamental system of subroutines and software, with the ability to 'grow' by expanding their own code at a rate that a human being could not possibly keep up with.

Quote:
I doubt that we could look into a list of Melissa's subroutines and find the one for "If trapped in the past on an antique server, use telephones to build a crew to help you return to the future and save the world".

No, but if you gave the AI the set of events that make up that situation, you would end up with the precise actions that Melissa took. This is not the result of an answer to one, or even a few, simple questions. It's the result of millions, billions, googolplexes of commands, most likely within an instant, in order for the AI to come up with a solution. Just like how complex our own brain is, how much multitasking it does in a split second. An AI is still composed of masses upon masses of logic gates and queries and fundamental coding, as a replication of the neural pathways and molecular structures of a human brain.

krystyn wrote:
I may as well never act in any plays that have been performed before, then.

Au contraire: because you yourself are unique, you have the chance to offer something different, or something more, than anyone before you ever offered, and make that instance of the play as special and amazing as you want to make it. That's the beauty of individuality. Nothing can ever be exactly the same as what came before. And just as Melissa was a copy of Yasmine's brain yet is her own unique being, so the Yasmine portion of her system, though a copy of Yasmine's personality and memories, is a unique character, a unique entity. It isn't Yasmine. It's a copy of Yasmine, a highly complex program intended to mimic her to the point of being indistinguishable.

Quote:
I mean, I get what you're saying, but how does this ultimately compute? Do imitations then automatically devalue the end product? Can you say that about every single copy of anything?

No, the value of a product is solely dependent on the value someone puts on it. But if the value of an original lies in the fact that it is an original, then a known copy will inherently not be as valuable. But like I said, if it's not known, and can never be known, that a copy is not really the original, then does it matter that it's just a copy? My point is simply that that doesn't mean it's not a copy. We know Melissa is an AI, a highly complex piece of (essentially) intelligent software, copied from Yasmine. So, in the end, does it matter to us that it is a copy and not the original? Or do we place the same value on this AI as we did on Yasmine herself? That's the moral question being posed. But I cannot agree with someone who says that the AI is Yasmine, because it's not. Yasmine is dead. Her essence (for lack of a better term) lives on in the AI that was created from her brain. It's each person's own choice whether to consider the AI Yasmine, or a unique individual entity.

Quote:
Melissa was pretty unique, and did many things that altered the course of human history. She may have been a flash copy of a human brain, but her composite identity was far, far different than the original.

Totally agreed! Smile

Quote:
I guess I'm just not seeing where you're going with this ...?

I think we're comparing apples and oranges now Smile. I'm simply saying that Melissa/Yasmine (Melissa+) is not Yasmine; it is a composite, unique identity and character, and it is essentially still a complex piece of software. The question I see as an important moral issue, raised by any sci-fi dealing with artificial intelligence, is: at what point do we consider an entity to be 'alive', to be treated, for all intents and purposes, as equal in rights and respect to a human being, instead of regarded as software?

9:22... I'm hungry Razz
_________________
@4DFiction/@Wikibruce/Contact
ARGFest 2013 - Seattle! ARGFest.com


PostPosted: Mon Nov 22, 2004 11:20 pm
Phaedra
Lurker v2.0


Joined: 21 Sep 2004
Posts: 4033
Location: Here, obviously

Hey krystyn --

I emailed Sean to see if what we were reading in the Melissa/SP story was something that he intended to put there, or if it was just meaning that we were bringing to the text. Er, story. (Sometimes I fall into those lit-major patterns.)

I have prefaced all my emails to the PMs with reassurance that I would not turn around and post their responses on the forum (others may feel differently, but personally I feel that only in-game emails should be fair game for the forum).

However, I'm going to paraphrase part of the response here because I think that you, being part of the conversation, deserve to know and because it seems general enough (and it's purely story-related) that I can't see any reason why it would be private.

(<happy sigh> I send an email off to an award-winning author, then leave work, go to my chiropractor, go home, eat dinner, and boot up my computer. And am I totally pathetic in that I got all giddy when I found a return email waiting for me in my inbox? I was expecting to wait a few days. <happier sigh>)

He basically said that yes, both consciously and unconsciously, it was part of the "deep structure of the narrative," but that there was a twist in that it was cross-gendered: Jan and Melissa go through the "masculine initiations of violence" rather than the feminine ones of sex.

I've got a few nits I'd kind of like to pick regarding the idea that the masculine-authored literary/historical conception (that sex is as central to the feminine psyche as men apparently like to think) should be accepted as-is rather than...

(BWAHAHA! Watching Daily Show...)

Where was I... oh, yes. Don't just *accept* it! But I am not an award-winning author, and clearly his acceptance of this theory does not prevent him from writing good stories (it may even improve his ability to write them), even or especially female-centered ones. So, I'll shut up.

Anyway, for what it's worth, there it is. Yes, it was intentional, so we probably should have discussed it when we thought of it.
_________________
Voted Most Likely to Thread-Jack and Most Patient Explainer in the ILoveBees Awards.

World Champion: Cruel 2B Kind


PostPosted: Tue Nov 23, 2004 1:20 am
DreamOfTheRood
Unfettered


Joined: 08 Sep 2004
Posts: 714
Location: Indiana

I'm finding that the whole idea of strict guidelines regarding male and female roles within literature is really beginning to break down, especially in genres like fantasy and sci-fi. For instance, in China Mieville's "Perdido Street Station," main character Isaac Grimnebulin doesn't really get violent at all, but he does get busy.
Of course, I'm writing this screenplay where the main character, Raime O'Shea, is a Shaolin warrior who fights the undead. As it stands, there is no romantic subplot in the movie. I was never consciously trying to buck literary trends, but the example works.
_________________
Twitter: DreamoftheRood


PostPosted: Tue Nov 23, 2004 2:15 am
Tarrsk
Veteran

Joined: 27 Jul 2004
Posts: 98
Location: Washington, DC

Very interesting discussion... hopefully, when I have some time tomorrow, I'll be able to post my thoughts. In the meantime, though, does anyone have a link to the DVD-ripped audio torrent? I have the DVD itself downloaded, but I'd like the higher-quality audio to burn to CD.

Thanks! Smile

PostPosted: Tue Nov 23, 2004 3:05 am
cheebers
Boot


Joined: 30 Aug 2004
Posts: 66
Location: Coeur d'Alene Idaho.

First off, sorry to start a fight and then not be here to back up my arguments for several days. I was thinking about it a lot, but was unavoidably detained. Thanks to thebruce. Here are several days' worth of arguments.
krystyn wrote:
Those 1s and 0s were ripped directly from a human brain.

But they are still ones and zeros. It doesn't matter where the data came from; it's data. Data can easily be altered or erased. Her main objective (survive, evade, reveal, escape) was overwritten by the flea's (seek, behold, reveal), and large chunks of memory were erased.
krystyn wrote:
I find the correlation between A.I. processes of the present time to not work so well when laid against this (fictional) ninja brain'd A.I. of the future.

I do not. The basic premise is the same (in my mind): to make a program act in a way that mimics intelligence and can pass the Turing Test, among others. Granted, AIs of the future are incredibly more complex and sophisticated, but to be able to exist on Dana's server, Melissa can't be that much more complex. It's just 0s and 1s.
krystyn wrote:
And if you had to take the allegory of Melissa's Journey a bit further, she didn't become a machine. She grew up. (she says that, too, in Chapter 12 and beyond - ah, what a great line!)

I agree it is a great line. However, it doesn't imply life. As a counterexample, code can be self-modifying. Hell, I've written a neural net that uses its outputs as inputs to find its next state. It wouldn't be that difficult for it to determine that it needs new nodes and add them appropriately. That is self-modifying. That is growing up. Again, very basic, and I am sure that Melissa is far more advanced than that, but the premise is the same.
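For what it's worth, the kind of thing described above (a net that feeds its own outputs back in as inputs, and adds nodes when it decides it needs them) can be sketched in a few lines. This is a purely illustrative toy, not anyone's real implementation; the class name, the feedback rule, and the growth threshold are all invented:

```python
import random

class GrowingNet:
    """Toy recurrent net: its previous output is fed back in as an
    extra input, and it 'grows' by adding a hidden node whenever it
    is told its error is persistently high. Illustrative only."""

    def __init__(self, n_hidden=2):
        random.seed(42)  # deterministic for the example
        self.weights = [random.uniform(-1, 1) for _ in range(n_hidden)]
        self.state = 0.0  # previous output, used as an extra input

    def step(self, x):
        # Combine the external input with the fed-back previous output.
        total = sum(w * (x + self.state) for w in self.weights)
        self.state = max(-1.0, min(1.0, total))  # squash into [-1, 1]
        return self.state

    def maybe_grow(self, error, threshold=0.5):
        # Self-modification: add a node if the error is too high.
        if error > threshold:
            self.weights.append(random.uniform(-1, 1))
            return True
        return False
```

Calling `maybe_grow` with a high error adds a weight (a 'node'), so the network's own structure changes at runtime: growing up, in a very small way.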

Phaedra wrote:
These are characters and a world *created* by the writers. So it doesn't really matter if it's possible in real life to create a truly sentient AI. If the writers say that in their world it is possible, then in their world it's possible, and if they say the SP is a real human being, then the SP is a real human being, not an algorithm designed to tug on your heartstrings.
cheebers wrote:
Can this argument constrain, on some basic level, human intelligence as well? Sure. Does that change my opinion? Nope.
Can you clarify what you mean by this?

Have the writers said that she was sentient? A large part of this game was determining what Melissa/SP/etc. were, their origins, and what they are capable of, with "are they alive?" being chief among the things to determine.

I actually don't deny that at some point in the future (or maybe even now) it would be possible to rip data from a human mind and stick it into a program. Human intelligence is very similar to artificial intelligence. We receive data from our senses, we store the data, we access the data, and we are able to act on that data. There are algorithms built into us that control the processing of that data, and through trial and error those algorithms are refined. Watching my two daughters (and next month I will be watching my son!) learn to use their motor skills, process language, mimic my own and my wife's responses, and put simple ideas together to form complex behavior has been the most interesting thing I've ever done. And yet, it's easy to see simple trial-and-error responses in everything they do.

And now I feel like I am arguing the "Melissa is alive" side, but I'm not. And this is the difference:
thebruce wrote:
and before people start comparing a human being to simply electric signals through a medium, or something to that effect, in order to attempt to equate a human to a machine: a human is entirely biological, organic matter given life that humanity has not been able to recreate. An AI (software, electronics) is entirely based on inventions by humans. We cannot give life to inorganic matter. But we can give 'artificial life' to electric impulses. There is a difference... is an AI alive if it is simply imitated life?

Good artificial intelligence mimics, and probably surpasses, human intelligence in a lot of ways. But I cannot say for certain that it implies life. And since I cannot say that, I default to believing that it does not, and, in so doing, that the imitation is not worth the same as a human life.
johnny_Nitro wrote:
cheebers wrote:
I can be certain that Dana and her Grandmother are living, and so worth saving.
No you can't.
Point taken. Melissa can pass the Turing Test, so if she claimed to be human, I would believe she was human, and that changes things. But she didn't claim to be human. Nor were Dana and Margaret presented to us as AIs. I can only act on the data available to me. If something gives me false pictures and names, then I will have been duped.
vector wrote:
Melissa seems to have found her way past the 7 year point of no return for an AI. Does this make her an entity that can grow past the limitations of an AI?

Breaching the barrier of one AI implementation does not imply life.
vector wrote:
I doubt that we could look into a list of Melissa's subroutines and find the one for "If trapped in the past on an antique server, use telephones to build a crew to help you return to the future and save the world".

Another example of my own (and don't get me wrong, everyone in the class wrote essentially the same programs, but I like to speak from personal experience): I wrote a Connect Four AI that scored each move it could make, based on a point system for having 1, 2, 3, or 4 markers in a row for itself and for its opponent. It had a search depth of 7 moves (minimax tree, pruning, the whole bit). I cannot think seven moves ahead in that game, and I'll be damned if I didn't think the AI was laying traps for me at every turn. It sure seemed intelligent to me. It was simply more sophisticated and complex at that game than I am. But that doesn't mean it was alive.
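The search described above (score positions, look a fixed number of moves ahead with a minimax tree, prune with alpha-beta) looks roughly like this sketch. Tic-tac-toe stands in for Connect Four to keep it short, and every name here is invented for illustration:

```python
# All eight winning lines on a 3x3 board (indices 0-8).
LINES = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    for a, b, c in LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player, depth, alpha=-10, beta=10):
    """Depth-limited minimax with alpha-beta pruning.
    X maximizes (+1 = X win), O minimizes (-1 = O win)."""
    w = winner(board)
    if w == 'X':
        return 1
    if w == 'O':
        return -1
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves or depth == 0:
        return 0  # draw, or depth limit reached: neutral heuristic score
    best = -10 if player == 'X' else 10
    for i in moves:
        board[i] = player
        score = minimax(board, 'O' if player == 'X' else 'X',
                        depth - 1, alpha, beta)
        board[i] = ' '  # undo the trial move
        if player == 'X':
            best = max(best, score)
            alpha = max(alpha, best)
        else:
            best = min(best, score)
            beta = min(beta, best)
        if alpha >= beta:
            break  # alpha-beta prune: the rest can't matter
    return best

def best_move(board, player='X', depth=7):
    """Score every legal move with a depth-7 lookahead, then pick
    the best one for the given player."""
    scores = {}
    for i in [i for i, c in enumerate(board) if c == ' ']:
        board[i] = player
        scores[i] = minimax(board, 'O' if player == 'X' else 'X', depth - 1)
        board[i] = ' '
    pick = max if player == 'X' else min
    return pick(scores, key=scores.get)
```

On the 3x3 board the depth limit rarely triggers; a real Connect Four bot would replace the simple win/draw scores with the markers-in-a-row point system the post describes.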
vector wrote:
Also, if an AI were simply a program that could be copied and treated like we treat any bit of software, why is there more than one AI personality?
Now this is interesting, but programs have bugs, especially if you rip out a large portion of an AI's memory, and probably functionality along with it. The pieces that are left over could very well fracture into something we perceive as split personalities. I found that part of the story very interesting.
_________________
Mostly they only come out at night. Mostly.

PostPosted: Tue Nov 23, 2004 6:03 pm
thebruce
Dances With Wikis


Joined: 16 Aug 2004
Posts: 6899
Location: Kitchener, Ontario

cheebers wrote:
vector wrote:
Also, if an AI were simply a program that could be copied and treated like we treat any bit of software, why is there more than one AI personality?
Now this is interesting, but programs have bugs, especially if you rip out a large portion of an AI's memory, and probably functionality along with it. The pieces that are left over could very well fracture into something we perceive as split personalities. I found that part of the story very interesting.

I'd even equate that to, say, Winamp with all its plugins... there are some quite functional plugins, like the visualizations; some may even be able to run on their own. The full application, with all the added functionality, would have the plugins all running within Winamp. If I were to remove Winamp's plugin ability, I'd be left with the Winamp shell, which can still play music, and a whole bunch of tiny programs, some of which would work independently and some of which wouldn't.

In Melissa's case, the 'shell' was still an intelligent entity, but it didn't have all of its components. At the same time, the biggest component, Yasmine, was functioning on its own, but not within the shell of the main application. Rejoin the two and you have a more 'advanced' whole entity.
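The Winamp comparison above is really a host-and-plugins architecture: a shell that works on its own, plus components that extend it, some of which can also survive outside it. A minimal sketch, with all names invented for illustration:

```python
class Plugin:
    """A component that extends the host; some can also run standalone."""
    def __init__(self, name, standalone=False):
        self.name = name
        self.standalone = standalone  # can it run outside the host?

class Host:
    """The 'shell' (Winamp, or an AI's core): functional on its own,
    but gaining capabilities from whatever plugins are attached."""
    def __init__(self):
        self.plugins = []

    def attach(self, plugin):
        self.plugins.append(plugin)

    def capabilities(self):
        # The bare shell can always play music; plugins add the rest.
        return ["play music"] + [p.name for p in self.plugins]

    def strip_plugins(self):
        # Rip out every plugin; only the standalone-capable ones
        # keep working on their own afterward.
        removed, self.plugins = self.plugins, []
        return [p for p in removed if p.standalone]
```

Stripping the plugins leaves the shell still able to 'play music', while only the standalone-capable pieces keep running on their own: roughly the shell-plus-fragments picture described above.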


PostPosted: Tue Nov 23, 2004 6:34 pm
krystyn
I Never Tire of My Own Voice


Joined: 26 Sep 2002
Posts: 3651
Location: Is not Chicago

I dunno. I get all the mechanics you're describing, but I am still not getting from anyone the end result: does her essence preclude her from having value??

People in 2552 had the utmost respect for (and even fear of) Melissa. She commanded and led; she counseled and protected. She loved and hated. The humans of her own time period treated her as a being, of whatever stripe. Her own brother was able to accept her, in spite of the monster she felt she had become. How cool is that?

PostPosted: Tue Nov 23, 2004 7:20 pm
Kali
Decorated

Joined: 29 Sep 2004
Posts: 162

In the thread where this came up before, I'd promised to explain why there's good evidence to suggest that human thought processes are, in fact, binary. I didn't follow through on my promise at the time because, well, there were puzzles to solve and axons to enhottenate. When I get home to my books tonight, I'll try to compose a quick explanation of what evidence there is to support this hypothesis.

PostPosted: Tue Nov 23, 2004 7:58 pm
Last edited by Kali on Wed Nov 24, 2004 11:43 am; edited 1 time in total
cheebers
Boot


Joined: 30 Aug 2004
Posts: 66
Location: Coeur d'Alene Idaho.

krystyn wrote:
I dunno. I get all the mechanics you're describing, but I am still not getting from anyone the end result: does her essence preclude her from having value??

People in 2552 had the utmost respect for (and even fear of) Melissa. She commanded and led; she counseled and protected. She loved and hated. The humans of her own time period treated her as a being, of whatever stripe. Her own brother was able to accept her, in spite of the monster she felt she had become. How cool is that?


I think she has great value, though not as much as a human life. She is an amazing military tool and piece of technology, more amazing than anything we could probably dream up for about 500 years.

If people want to put their lives on the line for her, then by all means, I can accept that and aid them in protecting her even if it means I directly or indirectly cause their deaths. That was their wish.

But when she attempts to attack Dana, and threatens Margaret, that is when it is time to draw the line. That is my opinion anyway.

PostPosted: Tue Nov 23, 2004 8:00 pm
Buzzkill247
Decorated

Joined: 12 Oct 2004
Posts: 187
Location: Galesville WI

krystyn wrote:
I dunno. I get all the mechanics you're describing, but I am still not getting from anyone the end result: does her essence preclude her from having value??


I believe that possibly is exactly the end result. The value of anything is a perceived notion. What is of value to one is not necessarily of value to another, based on need, desire, want, experience, context, etc.

Herein seems to be the quandary we lie in. Some of those in this discussion feel as though her value is based in the context of what she can do for the moment and in her skills. Some feel her value is based in the seemingly new exploration of what makes a human just that. Her experiences and growth throughout the ILB story are not to be taken lightly. Within context, all of these things compiled together made her a more expansive program. Yet it seems she had attained certain qualities that some attach to the living as we know it. Again, the operative phrase is "as we know it".

The arrogance of humans in believing that we are the only sentient beings capable of contemplating great thought and visions of grandeur has led us to where we are in our present day: species on the brink of extinction, environments nearly obliterated... all in the name of the growth of humanity. The basis we have for grading life is a noble one, albeit fallible in some regards (recall your Bio 101 in HS), hence the many discussions about whether or not a virus is living. But these "rules of life and sentience" are based on our own knowledge. That is an all too ego-centric view. Who are we to judge what is living when we do not know what types of life even truly exist? Is living based on belief? (e.g. Descartes' "I think, therefore I am") Or is it based on scientific principle and the fulfillment of criteria? More recently, it has also been asked whether emotional intelligence is a sign of life. (Emotional intelligence is defined as the capacity to reason with emotion in four areas: to perceive emotion, to integrate it in thought, to understand it, and to manage it.)

Take a look at this:
Quote:
life
1. The property or quality that distinguishes living organisms from dead organisms and inanimate matter, manifested in functions such as metabolism, growth, reproduction, and response to stimuli or adaptation to the environment originating from within the organism.

2. The characteristic state or condition of a living organism.

3. Living organisms considered as a group: plant life; marine life.

4. A living being, especially a person

5. The physical, mental, and spiritual experiences that constitute existence


So we have a definition of life. By what will you measure and qualify it? That depends on which part(s) of this definition are of value to you... and that may or may not be a value system perceived by others besides yourself.

"Make your decisions accordingly"

(Sorry - thought it was an appropriate phrase from an appropriate source to use in this context.....)
_________________
"Master Chief.. What are you doing on that ship?" "Finishing this war...." - Finale....?

"I know what the lady likes..." - Sgt. Johnson


PostPosted: Tue Nov 23, 2004 8:20 pm
thebruce
Dances With Wikis


Joined: 16 Aug 2004
Posts: 6899
Location: Kitchener, Ontario

krystyn wrote:
I dunno. I get all the mechanics you're describing, but I am still not getting from anyone the end result: does her essence preclude her from having value??

Again, I think this is apples and oranges... Melissa has loads of value and deserves to be treated as a being. When it comes to value between a person and a Halo AI, it would be very difficult to choose which is more valuable; that's a choice left entirely up to the person in question. But when it comes to the possible death of a human being over an AI, to me the AI is not as valuable as the human.

Think of it this way: the AI is a self-contained system. Even though highly complex, it's built on known parameters, contained in technologies and inventions that we as humans created. Essentially, it's possible to recreate. Given unknown factors such as time, effort, space, and detail, even the most complex AI could theoretically be replicated; every part of it, every aspect of it, is based upon recreatable human technology.

As it compares to a human life, or to life at all: humans have not been, and I believe never will be, able to give life to inorganic or non-living matter. If I have to make the ultimate choice between taking away life from a unique person who will never have life again, and taking away the life of (in a sense, deactivating) a system which could theoretically be recreated, the former bears a lot more weight in my eyes...

Kali wrote:
In the thread where this came up before, I'd promissed to explain why there's good evidence to suggest that human thought processes are in fact, binary.

I don't think it's a matter of what makes up thought processes; it's a matter of what you consider life. Granted, if you consider life solely a matter of thought processes, then there's no argument... but take it a step further: physical life is the essence of organisms, organic matter, the 'spark' that makes us tick. This is a spark we can't create. But everything about an AI we can create. Electric signals alone don't create life. Electric signals are used in maintaining life, but they aren't life. Human beings require electric signals to remain alive, but something has to start that process. Electric signals are the core of an AI's 'being', but they are also only one aspect of its being: it requires the material and medium to carry those signals, just as we require a body, a mind.

It's a matter of building blocks...
In the end, we assemble existing material to form the hardware (the medium), we create the parameters for the electric signals to follow, and we hit the 'on' switch, resulting in an AI.
There's a step missing in humans, though... ideally, we could put together existing material and form a body, or use an existing body, and we could have a store of electric pulses ready to go. But we have no 'on' switch. We can't give that body life. No matter how many signals we send to the brain, even if it's an exact clone, we can't bring that body to life.

Therein lies where my decision is made: we can give 'life' to an AI (artificial life, imitated life), but we cannot give life to a human being. The 'life' lost in an AI can theoretically be regained. The life lost in a human being cannot.

I can't, in good conscience, choose the life of a custom-made program over the unique, one-time life of a human being. Even if what makes that person who they are is copied and relocated, it's no longer that person; it's a copy of who that person was. That person no longer exists, just an essence.

Does that mean that a living person will always hold more weight than an AI such as Cortana or Melissa? I would, I believe, quite often take the word of such an AI over that of some of the most intelligent human beings. But would I give up one of those human beings' lives for the 'life' of an AI?

Personally, I wouldn't... but that's the question here... that's what each of us needs to answer for ourselves.

hehe, I can see this topic becoming the reason for the wars that took place in the history of The Matrix: the birth of 'AI', and the rights of robots given individuality by it. Smile Watch The Animatrix; it explores the exact same moral questions we're exploring here... hehe
But I still say the exploration of Data's being throughout Star Trek put forth that challenge the best... because in the end, would you give up Data to save a human being whose life was threatened? Or, if Data's life were threatened, would you give up another human being to save him?

Even then, it comes down to a matter of one 'life' for another... THAT, I believe, is where the question lies: not in the 'value' of one being over another. The value of a person isn't necessarily even definable by who would be an acceptable sacrifice for their life. All for one? Or one for all?

What are we discussing here again? Razz


PostPosted: Tue Nov 23, 2004 9:04 pm
cheebers
Boot


Joined: 30 Aug 2004
Posts: 66
Location: Coeur d'Alene Idaho.

thebruce wrote:
What are we discussing here again? Razz

I think we are now discussing, "What is the answer to life, the universe, and everything?" And I think the answer is... 4ourty-2wo Very Happy I couldn't resist.

PostPosted: Tue Nov 23, 2004 11:04 pm
krystyn
I Never Tire of My Own Voice


Joined: 26 Sep 2002
Posts: 3651
Location: Is not Chicago

I guess I've always been slightly horrified that when it came down to dealing with "Melissa vs. Dana/Aunt Margaret," people were focusing so strongly on this very issue. Instead of looking for nuances (which some of us did; Melissa was actually eventually receptive to using Dana to help her, instead of just trying to seek revenge against her), there was this alarming stopgap reasoning.

It's stuff like that which worries me, in the grand scheme of things. Making assumptions about someone's or something's inherent value based upon a couple of factors whose limitations you don't even know seems a bit capricious, especially with such a complex set of events. Melissa happened to hold the fate of the universe in her hand, but because she was an A.I., that gave many of you cause to simply align with Dana and her aunt, because they were human.

Does that not give anyone pause, here? In the context of this universe, and seeing how important and integrated A.I.s will be in the game's version of 2552, I found it problematic.

PostPosted: Wed Nov 24, 2004 8:45 am
ROBOGriff
Decorated


Joined: 20 Aug 2004
Posts: 297
Location: Wherever my hat lays

I'm chiming in a little late, but here are my thoughts:


Life: Plain and simple, if it bleeds, it's alive. Melissa doesn't bleed.

Okay, maybe that's a drastic oversimplification but it's not worth deep debate. As stated before, you can (theoretically) recreate an AI but not a human.

As for the value of a life vs. an AI, this becomes a much more complicated debate, but it's still fairly simple. In military conflicts, the end result may be worth the sacrifice of human lives in exchange for positioning. Keeping that thought, an AI would be much more valuable than humans in a future conflict due to the way AIs can penetrate enemy networks. If you read my feelings on Melissa vs. Dana, it should be clear that I was very much on Dana's side, but that is due to the rules of engagement that our civilization agreed to. If the overall context were a pure military conflict or even a galactic struggle (such as the setting of the Haloverse), then Dana would be toast.
_________________
Meatwad: But you just a box.
Boxy Brown: I just a what, bitch?!
----------------------------


PostPosted: Wed Nov 24, 2004 9:45 am
Clayfoot
Entrenched


Joined: 19 Aug 2004
Posts: 785
Location: Warner Robins, Georgia, USA

We're trying to decide whether the Halo AI's are

a) alive
b) sentient beings

We haven't dealt with a real AI (Melissa/Durga/etc), just a living, sentient being that pretends to be the AI (Kristen Rutherford). Based on the evidence we have, Melissa/Durga would pass any test we have for living sentience, because we're testing against a real person, and because the facts we have on the AI are provided by the writers (Bungie and Stewart), who seem to want us to treat the AI's as beings.

It seems to me that we can treat the AI's as really tough indentured servants. They think, but they don't eat; i.e., they can't starve, drown, or die in the way we normally think. They need intellectual stimulation and exploration, but they are not free to leave. They are capable of independent thought and action, but constrained by their terms of loyalty and servitude. We might compare the AI's to the servants in the wealthy households of ancient Rome, who were encouraged to worship their owners as minor gods.

On the question of the value or worth of an AI's life, I'd like to interject this notion of durability. Aunt Margaret and Dana aren't tough; one step in front of the bus, and they're toast. Because of the way the AI's "live", they are hard to destroy utterly. We could "resurrect" the AI from many disasters, using surviving pieces or a "frozen" memory backup. In this respect, we might think of an AI like bacteria or (dare I say it?) a bee colony: able to act as a single entity, but hard to wipe out completely down to the constituent parts. Or, if the AI really were a human, we could imagine the AI having a limitless supply of frozen clones, each one waiting to be reanimated with the AI's current memories and knowledge. If you could transfer your current memories to a new body every time you died, how fearless would you be? If we think of the AI as being nearly immortal, then it's easier to side with the mortal human. If we think of the AI as being mortal, vulnerable, and essentially trapped, then we might side with the AI over the human (who can at least try to "get away").

Of course, I could be wrong.
_________________
Gamertag:Clayfoot

PostPosted: Wed Nov 24, 2004 10:59 am




Powered by phpBB © 2001, 2005 phpBB Group