Nathan (SF VIP)
Location: High Desert, Calif.
> The era of machines becoming sentient is drawing near

Nathan, you pose a huge problem. What will happen when we do create a machine that duplicates the human brain? It would be able to think, feel, and respond like any human. I don't think humans and human machines will get along. I fear it will be like Planet of the Apes, only a planet of humanoid robots.
There may need to be legally binding standards established really soon, to ensure that developers don't let that genie out of the bottle. However, it may already be too late. It's kind of like trying to tell Iran and N. Korea that they can't play with nukes.
Terminators... Skynet...
Apparently Steve Wozniak, the co-founder of Apple, has expressed such concerns.
> A human person may not be harmed by a robot, nor may a robot put a human being in danger

Interesting. It seems, if you count drones anyway, that this first rule has already been broken.
Perhaps others worry more about this than I do, but I would want to be very careful in trying to set government standards for risks that are very hypothetical at this point in time. The downside is that it could hinder the development of technology that could be very useful to us.
> The question is what use?

I was thinking robotics in general, but I do know of lots of uses drones are being put to. A company I used to work for had a whole drone business, none of it military, and I still work on a few things where drones are put to good use. A few examples:
> Personally I think it may already be out.

I agree, out and long gone... to do its best or worst or both.
> If a machine, named Charlie, is ultimately developed that has the same mental ability as a human (it can talk and respond, has ideas, etc.), then for all intents and purposes, Charlie is a "being". Well, if Charlie is so human-like, does it have any human rights? If a biological being can have "rights", can a mechanical version, with all the same attributes of the biological being, not have "rights"? Can Charlie be considered a "person"?

Possibly. This was addressed in one of my favorite Star Trek: The Next Generation episodes, "The Measure of a Man", where some Starfleet folks wanted to study the AI, Data, quite possibly to the point of dismantling him to see if they could replicate the process. Capt. Picard successfully argued in a judicial hearing that Data was a sentient being.