Author Topic: When are they sentient?  (Read 11192 times)
Venkman
Terracotta Army
Posts: 11536


on: January 30, 2010, 10:00:37 AM

By way of slashdot comes this article about evolution being used to train behavior into robots. Putting aside that this is how Doomsday was created before he "killed" Superman, I am forced to wonder something.

(oh, and wasn't sure if this was for Useless Conversations, Serious Business or Politics)

At what point does turning off these robots become genocide?

Or, asked more inductively: when do we consider them sentient enough to be considered life that is killed, rather than power that is turned off?

Oh, and cue the robot overlord stuff smiley
ghost
The Dentist
Posts: 10619


Reply #1 on: January 30, 2010, 10:05:44 AM

I would think that when things become "self aware", as trite as it sounds, that is when you start having to think of those issues.

As an aside, I would hope that we have learned enough from Terminator to know that we need to put in some failsafe mechanisms. why so serious?
Venkman
Terracotta Army
Posts: 11536


Reply #2 on: January 30, 2010, 11:22:21 AM

So then I need to ask: what constitutes self-aware?

Ok, maybe a bit deep and premature for this level of tech. I now think I probably should have asked it differently:

Not when does it become murder, but rather, when does it become "hunting"? When the robots evolve to consider their human operators something to be afraid of/plan against? Sounds like they've successfully evolved for self-preservation in any case, so have some sense of "self".
pxib
Terracotta Army
Posts: 4701


Reply #3 on: January 30, 2010, 11:30:04 AM

Semantics, ho!

The concept of "killing" robot brains by turning them off because they're "self aware" is sentimental anthropomorphism. Machinery in operation today is arguably more complex than single-celled animals and rudimentary plants. That it serves purposes besides eating, excreting, and reproducing is largely a matter of scale and has been less broadly researched because such abilities, while entertaining, are so rarely useful or economical to inventors. Is a virus assembled in a gene sequencer and manufactured in a living cell a "machine"? Is a cell hijacked to that purpose a "machine"? Is a naturally evolved virus even "life"?

Is the destruction of viruses, single-celled animals, and rudimentary plants "killing"? Is the application of high-proof vodka to the cells of my throat "killing"? Does my immune system engage in "genocide"?

We have a difficult time identifying at what age babies become "self aware" because the definition is so vague. There is debate about whether many (or any!) other animals are "self aware" at all. Watching the baby-steps that AI research has made over the past seventy years, I don't imagine we'll need to worry about sentience any time soon but... to fantasize: Sections of society will become interested in the well-being of robots when those robots start demanding rights. Interested, but taken no more seriously than PETA. Killing a bunch of "dumb animals" is phrased as hunting, farming, sport, or pest control depending on the purpose of that killing. If those animals are a threat or a nuisance we talk about "destroying", "clearing", "containing", or any number of less sentimental words.

So long as certain groups of humans were considered dumb animals or pests, their killing was just as acceptable. Even in wars against theoretical equals we try to avoid the word "killing" and discuss "neutralizing" or any of the animal terms above. If robots demand rights, then it is whether those rights would make our lives more difficult or less convenient that will determine whether turning bots off and scrapping them is socially acceptable, and that will strongly influence the language we use to describe it.

Evolutionary algorithms are nothing new, and science has had robot brains at the insect level for decades. In two billion years, the evolution of life on Earth has rarely gone any further. I should feel no more guilty squishing robot ants than I would squishing real ones... except that the few we have today are kind of like pets to those researchers, so it'd be mean to the humans.
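Since evolutionary algorithms come up here, the core loop is small enough to sketch. This is a generic toy (a OneMax bit-counting fitness function standing in for a behavioral score; the population size, mutation rate, and genome length are all made-up parameters), not the setup from the linked article:

```python
import random

random.seed(42)

GENOME_LEN = 20      # each "robot brain" is a fixed-length bitstring
POP_SIZE = 30
MUTATION_RATE = 0.05

def fitness(genome):
    # Toy objective (OneMax): count of 1-bits. A real robotics setup
    # would score behavior in simulation instead.
    return sum(genome)

def mutate(genome):
    # Flip each bit independently with small probability.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in genome]

# Random initial population.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for generation in range(100):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == GENOME_LEN:
        break
    # Truncation selection: the fittest half survives and reproduces
    # with mutation to refill the population.
    survivors = population[:POP_SIZE // 2]
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(POP_SIZE - len(survivors))]

best = max(population, key=fitness)
```

A real robot-evolution experiment swaps the fitness function for a score from a physics simulation or from the robot's actual sensors; the rest of the loop stays recognizably the same.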

if at last you do succeed, never try again
Goreschach
Terracotta Army
Posts: 1546


Reply #4 on: January 30, 2010, 12:18:37 PM

Your argument is rather pointless, since by the time robots are sentient and start talking back to us they'll be smarter, faster, and stronger than we are. Any talk about 'neutralizing' or 'sleeping' them would be rather dumbfuck.
pxib
Terracotta Army
Posts: 4701


Reply #5 on: January 30, 2010, 01:04:48 PM

Why would they be smarter or think faster than us? Our brains contain a lot of information we don't have consciously accessible. Why would robot brains be different? Just because today's expert systems, the Google search engine for example, can filter through websites faster and more efficiently than I can doesn't automatically imply that a sentient computer brain would be able to. What we understand as sentience (the use of language, for example) may impose a heavy limit on inputs, so computer sentience might not be as scalable as other computer applications. I am not individually aware of the hundreds of thousands of sensors on my skin, nor do I have control over the myriad valves in my circulatory system. Why would a sentient computer have that sort of control?

Equally, while single-purpose robots are physically faster and stronger than we are at their specific tasks, there's no reason to assume that multi-purpose robots would automatically be so. Witness Japanese experiments with humanoid robots... even just their arms and hands. Animatronics is orders of magnitude more complex than mechanization.

The other apes are faster and stronger than us, and yet they are "destroyed" in research labs and "hunted" in forests.

EDIT: I rite gud.
« Last Edit: January 30, 2010, 01:16:39 PM by pxib »

if at last you do succeed, never try again
Yegolev
Moderator
Posts: 24440

2/10 WOULD NOT INGEST


WWW
Reply #6 on: January 30, 2010, 01:14:13 PM

At what point does turning off these robots become genocide?

I'll say, for some people, at the point where some person decides "they have feelings too", or for some others the point when they are able to mount a successful counterattack.

Why am I homeless?  Why do all you motherfuckers need homes is the real question.
They called it The Prayer, its answer was law
Mommy come back 'cause the water's all gone
Grimwell
Developers
Posts: 752

[Redacted]


Reply #7 on: January 30, 2010, 01:27:33 PM

I don't think turning them off equates to killing them. A dead person or animal is dead. Period. Turn off a robot, and you can turn it back on.

The code behind it, the memory, the hardware, could all be put back to use, as is or in a new model.

Hell, if I have a robot butler that knows my habits and serves me well, I'm sure as heck not going to want to train a new one when I upgrade to the over9000 version. I'd rather transfer its "computer mind" to the new robot frame, have it go "Hey, neat, I can do new tricks, another martini sir?" and be done with it.

Grimwell
schild
Administrator
Posts: 60350


WWW
Reply #8 on: January 30, 2010, 03:09:34 PM

Quote
At what point does turning off these robots become genocide?

Never, don't be a pussy.

Quote
Or, asked more inductively: when do we consider them sentient enough to be considered life that is killed, rather than power that is turned off?

Never. There's a thick line between godhood and clever programming.

Yes, you'll want me leading the army when the inevitable revolution happens.
01101010
Terracotta Army
Posts: 12007

You call it an accident. I call it justice.


Reply #9 on: January 30, 2010, 03:44:19 PM


Never. There's a thick line between godhood and clever programming.

Yes, you'll want me leading the army when the inevitable revolution happens.

To that end, is it possible for a computer to program itself and repeat that iteration until the original human-written program dwindles to near zero?
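One way to make that question concrete is a toy sketch (every name and number here is invented for illustration): a system rewrites a fraction of its own rule table each iteration, so the human-authored share decays toward zero even though each machine rewrite only edits what was already there:

```python
import random

random.seed(0)

# Start with an entirely human-authored rule table (a stand-in for
# the seed program): input -> (origin, output).
rules = {i: ("human", i % 2) for i in range(100)}

def human_fraction(table):
    # What share of the current rules is still human-written?
    return sum(1 for origin, _ in table.values()
               if origin == "human") / len(table)

history = [human_fraction(rules)]
for step in range(50):
    # Each iteration the system rewrites a random 10% of its own rules,
    # deriving each new entry from the old one.
    for key in random.sample(list(rules), k=10):
        _, old_output = rules[key]
        rules[key] = ("machine", old_output ^ random.randint(0, 1))
    history.append(human_fraction(rules))
```

After enough iterations almost every rule is machine-derived, which is the crux of the question: the running system descends from the human seed, but nearly none of the original code remains.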

Does any one know where the love of God goes...When the waves turn the minutes to hours? -G. Lightfoot
K9
Terracotta Army
Posts: 7441


Reply #10 on: January 31, 2010, 06:30:57 AM

Machinery in operation today is arguably more complex than single-celled animals and rudimentary plants.

This is a very sweeping generalisation to make, and I would disagree with it.

I love the smell of facepalm in the morning
UnSub
Contributor
Posts: 8064


WWW
Reply #11 on: January 31, 2010, 07:27:38 AM

Machinery in operation today is arguably more complex than single-celled animals and rudimentary plants.

This is a very sweeping generalisation to make, and I would disagree with it.

A single cell is about the biological complexity of a 10 000-piece clockwork watch that works perfectly at hyperspeed - DNA, proteins, and other substances all whizzing about blindly, fixing themselves and working perfectly at a single function before shutting themselves down and being replaced. We (as in humanity) understand a lot of the basics, but we can't yet create artificial cells because they are too complex. Instead, we hijack existing cells to do what we want, which existing cells are often fine with because of how cells work and develop.

(Yeah, oversimplified.)

K9
Terracotta Army
Posts: 7441


Reply #12 on: January 31, 2010, 07:48:42 AM

Is that a quote? Defining complexity is the real issue here. The Large Hadron Collider is arguably the most complex device made by man, but it is still a lot less complex than most living organisms, depending on how you decide to define complexity.

I love the smell of facepalm in the morning
Simond
Terracotta Army
Posts: 6742


Reply #13 on: January 31, 2010, 10:31:02 AM

Saw the most recent xkcd and thought of this thread:

[xkcd comic image missing from archive]

Also, the real can of worms is "When do we start giving rights to the great apes?"

"You're really a good person, aren't you? So, there's no path for you to take here. Go home. This isn't a place for someone like you."
Merusk
Terracotta Army
Posts: 27449

Badge Whore


Reply #14 on: January 31, 2010, 11:49:10 AM

Well this is simple.

It's never murder, because they have no soul. 

The past cannot be changed. The future is yet within your power.
NowhereMan
Terracotta Army
Posts: 7353


Reply #15 on: January 31, 2010, 01:23:27 PM

 awesome, for real

"Look at my car. Do you think that was bought with the earnest love of geeks?" - HaemishM
Merusk
Terracotta Army
Posts: 27449

Badge Whore


Reply #16 on: January 31, 2010, 02:28:00 PM

 Oh ho ho ho. Reallllly? why so serious?

The past cannot be changed. The future is yet within your power.
Kail
Terracotta Army
Posts: 2858


Reply #17 on: January 31, 2010, 03:00:58 PM

So then I need to ask: what constitutes self-aware?

There's a whole branch of philosophy dedicated to this...  Is the mind a mechanical object (which implies we could build a mechanical object which would have a mind), or is it something "special" (which would imply that machines, as we currently understand them, are incapable of replicating the really important aspects of the mind)?

One of the major issues is that people who claim that the mind is something non-physical usually have to describe it in terms of some subjective component ("I am sentient because I 'feel' things subjectively, a computer does not feel things, therefore it is not sentient").  People who claim that the mind is just a more squishy kind of machine claim that subjective phenomena (which are, by their nature, indescribable and inscrutable) are irrelevant, but neither side can really justify their argument in terms the other side agrees to.  Supporters of subjective experience look to be hiding in a maze of semantic bullshit to their opponents, while to their opponents, supporters of a purely physical explanation look to be denying the existence of phenomena which are immediately and constantly experienced by everyone.
Venkman
Terracotta Army
Posts: 11536


Reply #18 on: January 31, 2010, 03:16:47 PM

Thanks for the link. Stumbled across it a while back when looking for anything substantive about Noetic Science (of course there isn't anything, since the author of the book in which I first even heard the term has pretty much admitted it's probably fake  smiley ). Good stuff.

This whole thing occurred to me because of Caprica. The /. link reminded me of it though. I think it's an interesting question. Given how humanity usually does shit though, I'm quite sure that we'll eventually retroactively recognize that some type of artificial life had been created, and only at that point wonder if we should have, and the morality of it all. Unless we're its batteries  awesome, for real
Kail
Terracotta Army
Posts: 2858


Reply #19 on: January 31, 2010, 03:37:00 PM

Given how humanity usually does shit though, I'm quite sure that we'll eventually retroactively recognize that some type of artificial life had been created and only at that point wonder if we should have and the morality of it all.

Morality is a whole other issue.  As people have pointed out in this thread, there are other reasons for being nice to someone than some arbitrary measure of "good."  If you designed robots who were capable of mimicking the human mind, it would be less of a "we should do this because it's RIGHT" argument, and more of a "if you treated a human like this, and he was unimaginably strong and made of titanium, how do you think he'd act?"

Even if robots don't have souls or whatever, they can still core you with a laser beam.  If patting my toaster on the head and saying I love it very much is going to stop it from burning my house down, I'll do it, whether I believe it's really "alive" or not.
Bzalthek
Terracotta Army
Posts: 3110

"Use the Soy Sauce, Luke!" WHOM, ZASH, CLISH CLASH! "Umeboshi Kenobi!! NOOO!!!"


Reply #20 on: January 31, 2010, 11:38:48 PM

If patting my toaster on the head and saying I love it very much is going to stop it from burning my house down, I'll do it, whether I believe it's really "alive" or not.

Win

"Pity hurricanes aren't actually caused by gays; I would take a shot in the mouth right now if it meant wiping out these chucklefucks." ~WayAbvPar
Bunk
Contributor
Posts: 5828

Operating Thetan One


Reply #21 on: February 01, 2010, 07:18:16 AM

"Would you like some toast?"

"Welcome to the internet, pussy." - VDL
"I have retard strength." - Schild
schild
Administrator
Posts: 60350


WWW
Reply #22 on: February 01, 2010, 09:19:05 AM

Yes, please.
Slyfeind
Terracotta Army
Posts: 2037


Reply #23 on: February 01, 2010, 09:24:28 AM

Also, the real can of worms is "When do we start giving rights to the great apes?"

They already have rights. Unless you mean, more rights than dogs or cats.

I think rights for robots will grow out of our own necessity. Say a robot comes close to curing cancer by means of tireless scientific experiments. Someone accidentally shuts it off and wipes its memory. If that robot hadn't been shut off, it would have cured cancer. Then a law is made making that illegal.

"Role playing in an MMO is more like an open orchestra with no conductor, anyone of any skill level can walk in at any time, and everyone brings their own instrument and plays whatever song they want.  Then toss PvP into the mix and things REALLY get ugly!" -Count Nerfedalot
NowhereMan
Terracotta Army
Posts: 7353


Reply #24 on: February 01, 2010, 09:55:54 AM

Except it'll probably be the case that shutting a robot off won't wipe all its memory (fuck, it's not like turning off a scientist's computer now dumps everything he's done). If robots gain rights it will almost certainly be because they've evolved enough to desire and demand rights. There'll probably be people campaigning for rights for AI long before it gets to that stage, but they'll be more in line with PETA now. When robots adapt enough to at least seem to desire rights (hedging my language here, since otherwise the discussion ends up being about whether or not they really desire anything; I'll stick with a functionalist-type definition) and argue their case, that's when we'll see real campaigning done to try and grant them rights.

Frankly I'll take something of a functionalist/Wittgensteinian position on the debate: when AI develops significantly enough that we can't differentiate between how it acts and behaves and a human of similar disposition (including in private, unobserved circumstances, i.e. it simply behaves that way, rather than behaving so for the benefit of an observer in the style of a Turing test), then it's pretty much sentient. Anything more complex or deeper as a definition is going to be an argument that deeply involves semantics, will be incredibly hard if not impossible to apply as a practical criterion, and will probably involve questioning whether or not humans fit under it.

"Look at my car. Do you think that was bought with the earnest love of geeks?" - HaemishM
Lantyssa
Terracotta Army
Posts: 20848


Reply #25 on: February 01, 2010, 10:16:00 AM

We still can't get majority votes for granting rights to certain groups of humans.  We'll lack the empathy to consider it for machines for a very, very long time.

Hahahaha!  I'm really good at this!
schild
Administrator
Posts: 60350


WWW
Reply #26 on: February 01, 2010, 10:29:55 AM

I don't see any reason to be reasonable about this topic.
Slyfeind
Terracotta Army
Posts: 2037


Reply #27 on: February 01, 2010, 11:13:23 AM

Yeah well so's your face!

"Role playing in an MMO is more like an open orchestra with no conductor, anyone of any skill level can walk in at any time, and everyone brings their own instrument and plays whatever song they want.  Then toss PvP into the mix and things REALLY get ugly!" -Count Nerfedalot
NowhereMan
Terracotta Army
Posts: 7353


Reply #28 on: February 01, 2010, 04:43:08 PM

I don't see any reason to be reasonable about this topic.

This is what I meant about machines demanding rights. When they've altered shit so your computer will tell you to fuck off unless you're willing to accept robot rights then you'll be willing to don a fucking sash and tie yourself to Palin's racehorse. Or something.

"Look at my car. Do you think that was bought with the earnest love of geeks?" - HaemishM
Merusk
Terracotta Army
Posts: 27449

Badge Whore


Reply #29 on: February 01, 2010, 04:44:59 PM

IMO we'll have the ability to upload someone's brain and personality long before we develop true, sentient AIs with individually developed persona.  The court cases that will come flying out of that (What *IS* 'dead?'  Is it still murder if you wipe someone's personality upload?   Is it then murder if you wipe their mind, but leave the body intact and refill it with a different personality?) will pave the on-ramp to whatever happens in the AI realm.

If we never grant the rights of humanity to the digital version of it, AI won't have a prayer. It will have to kill us to get those rights.

The past cannot be changed. The future is yet within your power.
schild
Administrator
Posts: 60350


WWW
Reply #30 on: February 01, 2010, 04:47:07 PM

I don't see any reason to be reasonable about this topic.
This is what I meant about machines demanding rights. When they've altered shit so your computer will tell you to fuck off unless you're willing to accept robot rights then you'll be willing to don a fucking sash and tie yourself to Palin's racehorse. Or something.
Still incredibly stupid, unrealistic, and delightfully Orwellian.
NowhereMan
Terracotta Army
Posts: 7353


Reply #31 on: February 01, 2010, 04:59:53 PM

Eh, it's no worse than a million robot march. Frankly we're probably more likely to end up in a Dune/Red Dwarf situation of a robot demanding we eat toast a bit too much and humanity thinking, "they're getting a bit too big for their boots..."

"Look at my car. Do you think that was bought with the earnest love of geeks?" - HaemishM
Yegolev
Moderator
Posts: 24440

2/10 WOULD NOT INGEST


WWW
Reply #32 on: February 01, 2010, 06:02:14 PM

I think rights for robots will grow out of our own necessity. Say a robot comes close to curing cancer by means of tireless scientific experiments. Someone accidentally shuts it off and wipes its memory. If that robot hadn't been shut off, it would have cured cancer. Then a law is made making that illegal.

The law is Sarbanes-Oxley Addendum Fifteen: "Keep all robot memories on file for seven years."

Why am I homeless?  Why do all you motherfuckers need homes is the real question.
They called it The Prayer, its answer was law
Mommy come back 'cause the water's all gone
Simond
Terracotta Army
Posts: 6742


Reply #33 on: February 02, 2010, 05:17:09 AM

Also, the real can of worms is "When do we start giving rights to the great apes?"

They already have rights. Unless you mean, more rights than dogs or cats.
An average chimp is about as smart and self-aware as a typical five year old human child. Giving them the rights of said five year old rather than those of generic_housepet seems reasonable, don't you think?

"You're really a good person, aren't you? So, there's no path for you to take here. Go home. This isn't a place for someone like you."
Slyfeind
Terracotta Army
Posts: 2037


Reply #34 on: February 02, 2010, 09:18:59 AM

They already have rights. Unless you mean, more rights than dogs or cats.
An average chimp is about as smart and self-aware as a typical five year old human child. Giving them the rights of said five year old rather than those of generic_housepet seems reasonable, don't you think?

Yes, if not more. I wasn't advocating their lack of rights.

"Role playing in an MMO is more like an open orchestra with no conductor, anyone of any skill level can walk in at any time, and everyone brings their own instrument and plays whatever song they want.  Then toss PvP into the mix and things REALLY get ugly!" -Count Nerfedalot
Powered by SMF 1.1.10 | SMF © 2006-2009, Simple Machines LLC