bobcat
Well-known Member
- Location
- Northern Calif
How do you determine what level of sentience a domestic robot may achieve before you become a slave owner?
Granted, it may not have feelings or emotions the way we think of them, but if it develops some level of autonomy and choice, is it not morally wrong to own it?
It seems that once a learning robot has some level of self-awareness, the prospect of owning one gets murky.
It's different for a hired employee, who works for wages and can leave whenever they choose, but would we ever face the prospect of robot emancipation?
Who determines the rightness or wrongness (government, lawyers, or individual owners)?
With regard to self-awareness, meaning understanding oneself as an entity with preferences, goals, and a sense of "I": if a robot had this, it seems ownership would start to look like slavery.