#101 Edited by Flawed_System (388 posts) -

How does a robot have "free will"? It always has to have a master, doesn't it?

Going by the etymology, the word comes from the Czech "robotnik" ("forced worker"), from "robota" ("forced labor, drudgery"), akin to the Old Church Slavonic "rabota" ("servitude"), from "rabu" ("slave").

Here's the link if you want to read more: http://www.etymonline.com/index.php?term=robot

#103 Posted by mordukai (7151 posts) -

@Tim_the_Corsair said:

Sentient, not human.

This.

#104 Posted by TooWalrus (13205 posts) -

No... because it's a robot. I think something can only be considered human if its brain is organic... right?

#105 Edited by NTM (7383 posts) -

Just because it has human qualities doesn't make it human. It doesn't even matter if you care for one as much as you would a human; it's still not human.

#106 Posted by Enigma777 (6073 posts) -

No that would be a sentient robot, not a human.

#107 Posted by Rave (370 posts) -

A robot doesn't deserve the same rights as us; it will always just be mimicking desires and wants, because that's what it has been programmed to do. Our brain acts largely on survival instinct, meaning almost everything we do is motivated by certain factors: food, sleep, the procreative drive, avoidance of pain, and so on. A robot would need or have none of these desires unless we programmed it to artificially want them. I'd argue that a robot is either nothing more than a great mimic of human behavior, or would think so differently from us, in a way we couldn't ourselves comprehend, that it would still seem like an artificial life. It's a machine, not a living thing; it has no rights no matter how "human" it seems. Anyone who says different seems crazy to me.
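
To make the "great mimic" point concrete, here is a minimal, purely hypothetical sketch of an ELIZA-style responder: it produces human-seeming replies through nothing but pattern substitution, with no desires or understanding behind them. The rules and names are invented for illustration only.

```python
import re

# A tiny ELIZA-style mimic: canned transformation rules, no inner life.
# (Hypothetical example; the rules below are made up for illustration.)
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi want (.+)", re.I), "What would it mean to you to get {0}?"),
    (re.compile(r"\byou\b", re.I), "We were talking about you, not me."),
]

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return "Tell me more."  # default deflection, the classic mimic's trick

if __name__ == "__main__":
    print(respond("I feel lonely sometimes"))  # -> "Why do you feel lonely sometimes?"
```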

#108 Posted by kindgineer (2730 posts) -

@Tim_the_Corsair said:

Sentient, not human.
#109 Posted by VisariLoyalist (2993 posts) -

Understand that all of your "desires" are rooted in evolutionary logic; you would have to artificially create such impulses in a robot. It would be very difficult to program something to "think". Ultimately you would have to have code running that is itself unthinking, much like the individual neurons in our brain that add up to self-awareness. I think the jury is still out on whether that's technically possible. It would first require us to unlock the code that makes our own emotions possible in the context of a biological computer. We still have very rudimentary knowledge of our own desires, let alone the ability to design a feeling creature from scratch.
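
As a rough illustration of "unthinking code adding up", here is a minimal sketch of artificial neurons: each unit only does a weighted sum and a hard threshold, yet wired together they compute something (XOR) that no single unit can. The weights are hand-picked toy values, not a claim about real brains.

```python
def neuron(inputs, weights, bias):
    """A single 'unthinking' unit: weighted sum, then a hard threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) + bias > 0 else 0

def xor(x1, x2):
    # Two hidden units and one output unit, each as dumb as the last,
    # yet together they compute XOR, which no single unit can.
    h_or  = neuron([x1, x2], [1, 1], -0.5)   # fires if either input is on
    h_and = neuron([x1, x2], [1, 1], -1.5)   # fires only if both are on
    return neuron([h_or, h_and], [1, -1], -0.5)

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", xor(a, b))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```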

#110 Posted by stinky (1549 posts) -

@Rave: program = motivation.

our DNA programs our motivations; we have no choice about wanting food, sleep, or procreation. it's our programming.

the beginning premise stated free will as well, which would mean an absence of programmed desires or mimicry.

the essential question being: "does having free will make something non-human equal to humans?"
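
Taking the program = motivation analogy literally, here is a tiny hypothetical sketch of an agent whose "wants" are nothing but hard-coded drive variables; it has no more choice about its motivations than we do about hunger. Everything here (names, numbers) is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Drives:
    """Hard-coded 'motivations'; the agent never chose any of them."""
    hunger: float = 0.0
    fatigue: float = 0.0
    pain: float = 0.0

def choose_action(d: Drives) -> str:
    # The agent simply services whichever programmed drive is strongest.
    urges = {"eat": d.hunger, "sleep": d.fatigue, "avoid_pain": d.pain}
    return max(urges, key=urges.get)

if __name__ == "__main__":
    print(choose_action(Drives(hunger=0.7, fatigue=0.4, pain=0.1)))  # -> "eat"
```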

#111 Posted by VisariLoyalist (2993 posts) -

@stinky: okay, we can also suppose a rock has free will, but if we know for sure it can't, then what does the question even matter?

#112 Posted by Rave (370 posts) -
@stinky said:

@Rave: program = motivation.

our DNA programs our motivations; we have no choice about wanting food, sleep, or procreation. it's our programming.

the beginning premise stated free will as well, which would mean an absence of programmed desires or mimicry.

the essential question being: "does having free will make something non-human equal to humans?"

It is not our programming; it is millions of years of evolution to survive, and it is what allowed us to become "intelligent". We needed it to survive as a species. Again, any of these limits you put into a robot are human restrictions; they are not naturally evolved for the betterment and survival of that species. And if we didn't place those human restrictions on robots, their wants, desires, and entire way of thinking would be so different from what we view as human that we would not recognize it as such. Everything these robots did, right down to the way they moved and interacted with speech, would be designed by humans to mimic human behavior; the robots themselves would need none of it. This is why it is called artificial intelligence: it doesn't work the same way as other living creatures.

#113 Posted by RandomHero666 (3181 posts) -

My dog has all those; he also poops like us. He's not human.

#114 Posted by Intro (1207 posts) -

@lavaman77 said:

what exactly would separate a robot from a human

One is made in China and the other is born. One has a real heart and brain; the other is computer chips.

This is quite simple, the way I'm looking at it at least.

#115 Posted by No0b0rAmA (1490 posts) -

A human isn't a concept.

#116 Posted by SSValis (1136 posts) -

No, it's a dancer

#117 Edited by PenguinDust (12523 posts) -

No, it's still a robot. Better questions would have been "Is it alive?", "Should it be afforded the same rights as a human?", and "Is destroying it the same as murder?" Biologically, it's never going to be a human, but whether or not it can share the same protection under the law or acceptance within society is another, more complex debate. If you saw the Kara demo from GDC, then you might have formed an opinion on the subject. As a species we may have to have an earnest discussion on this topic in the decades to come... provided we don't kill ourselves off and the monkeys take over. And, for the record, talking monkeys aren't human either.

#118 Posted by MisterSamMan (364 posts) -

They are still robots; however, I would consider them equals.

#119 Posted by Rave (370 posts) -

I would love to hear people's reasoning behind treating machines as equals. I can't fathom it; no matter how lifelike they are, it's still nothing more than artificial. Every trait we give them is a mockery of our own human traits.

#120 Posted by lavaman77 (567 posts) -

@Rave said:

I would love to hear people's reasoning behind treating machines as equals. I can't fathom it; no matter how lifelike they are, it's still nothing more than artificial. Every trait we give them is a mockery of our own human traits.

It is. The thing is that I could never get myself to feel sorry for a machine, regardless of whether it feels pain or not. Even if I try, I just can't. It would be like watching a can of Pepsi being kicked around rather than a living being.

I wish there were more films/TV shows exploring this subject, as it's hella more interesting than "ROABATS EVRYWHER3111 APOCALYPSE!".

#121 Posted by Akeldama (4248 posts) -

@Tim_the_Corsair said:

Sentient, not human.