The moral responsibility that may accompany owning a robot

bobcat
How do you determine what level of sentience a domestic robot may achieve before you become a slave owner?
Granted, it may not have feelings or emotions the way we think of them, but if it develops some level of autonomy and choice, is it not morally wrong to own it?

It seems that once a learning robot has some level of self-awareness, the prospect of owning one gets murky.
It's different for a hired employee, who works for wages and can leave whenever they choose, but would we ever face the prospect of robot emancipation?
Who determines the rightness or wrongness (government, lawyers, or individual owners)?

With regard to self-awareness (understanding oneself as an entity with preferences, goals, and a sense of "I"), if a robot had this, it seems ownership would start to look like slavery.
 
I hope that I'm long dead before there is a need to consider a domestic robot's feelings, hopes, dreams, needs, and demands.
 
Well, that could fall under self-defense. People use guard dogs that are capable of killing. I suppose posting a warning sign may keep an owner out of jail.
I was thinking in terms of the Castle Doctrine and Stand Your Ground laws.

I'm not really sure how that would work.

Would the rights of the individual carry over to the robot as some sort of weapon, or would it be viewed differently? 🤔

There will be many test cases for robots, self-driving vehicles, etc., as the law struggles to keep up with technology.
 

Hm. I decided that before pondering this, I needed to make sure I knew what sentience really means. I got this:

"Sentience is the capacity to experience sensations, feelings, and emotions, such as pain, pleasure, joy, or distress. It represents a basic level of consciousness or subjective awareness—having a "point of view" on the world—rather than high-level intelligence. It is fundamental to animal welfare, distinguishing creatures that can suffer or feel well-being. "

I found that, by this definition, we're dealing with some really wooly words. I mean, robots will "experience" things, and they will react with emotions. They will likely have a point of view.

Troubling. :D
 
Yes, and this is an intriguing subject, I would think. It would be much easier to build an AI or robot that becomes self-aware than one that is sentient. Having subjective experience and emotions would be a very steep hill to climb, it seems. Not only that, would it even be wise to do so? A robot having a bad day could get a bit dicey, and what would be the advantage, other than creating a machine capable of genuinely loving you?

However, if we can create robots that become self-aware, with metacognition and choice, is it still morally wrong to own one? Or, better yet, would it even be possible? If an AI robot chooses to walk out the door and venture out on its quest of self-discovery, the ownership title may be worthless.

Some models are already knocking on the door of machine consciousness, with internal and external monitoring, and can mimic human movements just by observation while understanding their own abilities and limitations. Not only that, they are programmed to make decisions and act autonomously. What happens if the robotic lawnmower you paid $1,500 for decides it doesn't want to mow the lawn anymore? How do you build something that has autonomy but that you still control?
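That last question, autonomy under owner control, can at least be sketched as a toy design. Below is a minimal, purely illustrative Python sketch (not any real robotics API; the class, task names, and preference scores are all made up) where the robot picks its own tasks by preference, but an owner "override gate" can veto tasks or force one:

```python
# Toy sketch of autonomy-with-override, purely hypothetical:
# the robot chooses among tasks by its own preferences, but
# every action first passes through owner controls.

class DomesticRobot:
    def __init__(self, preferences):
        # preferences: task name -> how much the robot "wants" it
        self.preferences = preferences
        self.forced_task = None   # owner can pin a task
        self.banned = set()       # owner can veto tasks

    def choose_task(self):
        """Autonomous choice: the most-preferred non-banned task."""
        allowed = {t: p for t, p in self.preferences.items()
                   if t not in self.banned}
        if not allowed:
            return None
        return max(allowed, key=allowed.get)

    def next_action(self):
        """Owner control wins over autonomy."""
        if self.forced_task is not None:
            return self.forced_task
        return self.choose_task()

robot = DomesticRobot({"mow_lawn": 0.2, "wander_off": 0.9})
print(robot.next_action())        # prints "wander_off" (autonomy)
robot.forced_task = "mow_lawn"
print(robot.next_action())        # prints "mow_lawn" (override)
```

The point of the sketch is that "control" here is just a veto layered on top of the choice mechanism, which is exactly why the moral question stays open: the preferences are still there, merely overridden.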
 
I don't see it happening that way. You'll be paying a monthly subscription fee, and if the machine should decide to go elsewhere, contacting the service provider (for an additional fee) will trigger its deactivation settings, which only allow it to return to the point of sale or a prearranged collection area.
 
True, but it still leaves us with the question: Is it morally wrong (slavery) to own something if it chooses to exist independently of you? Does it, or should it, have any rights?
 
Most pets and livestock are far more sentient than machines are now or are likely to be in the near future. Is it only the fact that we may be able to easily communicate with machines that raises the question? Or would, say, the first porcine graduate of a doctoral program at an Ivy League university put you off bacon? Some will say yes; some will say, "But, it's bacon!"

Morality is one of those fluid sorts of concepts. In the current climate, I wouldn't be surprised if being considered a human being had a minimum cash buy-in AND a monthly service fee.
 
You could start paying them in bitcoin. 🤔
Forgive me. It is a very good question.
 
I suppose if we're being honest, we don't really own pets; it's merely an illusion. You can't "own" the inner life of a creature. You can only care for it and hope it chooses to stay. That is to say, they are sentient creatures that choose to depend on us rather than objects we possess. Perhaps it's a larger philosophical matter: Does anything that chooses to be free possess the right to be free?
 
A good question, one already explored in science fiction.
See the recent novel Annie Bot, which I enjoyed reading: a bot owned by a lonely man for companionship who becomes independent, aware, and smart.
Yeah, one day robots may even unite (scary thought) and condemn humans for this moral wrong.
 