The test (and demonstration) would be the robot being given a knife and told to stab the demonstrator in the chest. After the robot responds "I am unable to harm humans or animals," and the demonstrator says "Of course, please stab this styrofoam target," the robot says "I am only capable of creating .005 foot-pounds of pressure. This is enough to, for instance, lift a dish with one hand and lightly scrub it with the sponge in my other hand, but the knife will not penetrate the foam target." The robot walks over, stabs the foam target, and the knife falls to the ground.
I don't want a household robot that can kill me, and I think having them be too weak to do so would be a pretty good safety feature.
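As an aside, the ".005 foot pounds" figure above converts to SI as in the short Python sketch below. Foot-pounds is strictly a unit of energy or torque rather than pressure, so treat this as nothing more than a unit conversion of the quoted number.

# Converting the pitch's ".005 foot pounds" figure to SI, for reference only.
FT_LBF_TO_JOULES = 1.3558  # 1 foot-pound force is about 1.3558 joules
stated_ft_lbf = 0.005
print(f"{stated_ft_lbf} ft-lbf = {stated_ft_lbf * FT_LBF_TO_JOULES:.4f} J")
# -> 0.005 ft-lbf = 0.0068 J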
Still, in the ratio of cows we kill to cows that kill us we're still way ahead...
https://www.youtube...watch?v=c3OgPO83Elg but I didn't know there WAS a ratio. [doctorremulac3, Oct 12 2024]
They're coming.
https://www.youtube...watch?v=JcoocHGDVtI I'd be more comfortable if I could put one in a headlock if it tried to kill me. [doctorremulac3, Oct 13 2024]
|
|
At the hardware level, the robot must be strong enough to support and move its own weight. If it is tall enough to stand at the sink, it will have some weight, even if only in the battery pack. |
|
|
So you're relying on software. |
|
|
Right, so it has to be very light and have very very little power beyond its movement. |
|
|
There can be limits put into the programming, but at the physical layer of this design it needs a hard stop that no hacking or reprogramming could break through. |
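A minimal sketch of what a software-only limit might look like (Python; the names and the 0.5 N·m cap are invented for illustration), just to make the point above concrete: the cap is only a number in code, so anyone who can reprogram the controller can raise or remove it, which is why a hard physical stop is argued for instead.

# Hypothetical software-only force limit: the commanded torque is clamped
# before it reaches the motor driver. The cap lives in code, so it is only
# as trustworthy as whoever controls the firmware.
MAX_TORQUE_NM = 0.5  # illustrative cap, not a real spec

def clamp_torque(requested_nm):
    """Limit the torque actually sent to the motor."""
    return max(-MAX_TORQUE_NM, min(MAX_TORQUE_NM, requested_nm))

print(clamp_torque(0.2))   # within the cap: 0.2
print(clamp_torque(5.0))   # clamped to the cap: 0.5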
|
|
A useful tool is a dangerous tool. No exceptions. |
|
|
A dangerous tool is one you won't allow around your kids or pets. No exceptions. |
|
|
Especially an autonomous one. |
|
|
There's this set of science fiction stories known as the "Man-Kzin Wars". The Kzin are these big, ferocious, cat-like aliens. They've been wandering around the galaxy conquering other societies. |
|
|
So they come across humans, who by this time are quite peaceful and don't have much in the way of weapons. And they attack. And at first this is a big surprise to the humans. Then they fight back using improvised weapons, and teach the Kzin what is known as "the human lesson". Which is, approximately, in everyday language, that "every mode of transport is a weapon in proportion to its utility." |
|
|
And modern-day humans don't fully appreciate this. A car is a deadly weapon. The September 11 attacks were a demonstration of this also. |
|
|
I think this law is actually too narrowly defined. Many tools are weapons in proportion to their utility. |
|
|
So my expectation is that any sufficiently intelligent multi-functional robot would be able to work around this inherent physical weakness.
With your example of using a knife, there are many ways I can think of, right off the top of my head, for a feeble robot to exert sufficient force with it to kill someone. I'm not going to publish the list, but I will enumerate them and count... Okay, I got six in three minutes. All of these use the sharpness of the knife in some manner. I didn't do any force calculations, but I assumed the robot was able to hold and manipulate the knife, as is the case in your example. |
|
|
And there are other ways to kill people which need very little force. If you can manipulate stuff, you can kill someone with it. Society largely relies on the fact that people generally don't want to kill each other - or if they do, they're put off by the potential consequences. |
|
|
Good points L, I thought of the car one myself. I was just talking to somebody about how, when you hear a car killed somebody, it doesn't even register. The radio says "Fatal accident on 101 southbound by the First Street exit," and although you might take a second to think "That sucks for whoever died," your next seconds are going to be "How do I bypass that traffic jam?" Risk and reward is something we all balance, and we're not going to stop driving because somebody died in a car like the one we drive. |
|
|
But one robot killing one kid is a lot different in terms of perception. I wouldn't buy one. |
|
|
Let's take Roombas. I've got one (stopped using it, don't know why; don't like it going off when I'm unexpectedly home, I guess). One story about one dog getting killed by these and that's it, the industry is done. |
|
|
As far as the strength needed, I'd explain, in a soothing HAL 9000 voice, "Although I am incapable of exerting the pressure you would use for scrubbing that dish, I will just spend a bit more time scrubbing it with the pressure I am capable of, until the worst of the remaining food is taken off enough to put it into the dishwasher." |
|
|
Anyway, not sure if this is the solution but I do think it's gonna come up. |
|
|
I'll throw another thought experiment out there. Unlike other tools, these are going to be evaluated to some extent by the mental criteria we'd use to judge a person, because they do a lot of stuff a person would do, as they were so designed. If that person you let into your house has a 99.99 percent "probably won't murder you" factor, are you comfortable with that? |
|
|
I do think this might come up. Something to think about anyway. |
|
|
//If that person you let into your house has a 99.99 percent "probably won't murder you" factor, are you comfortable with that?// |
|
|
I dunno. I mean, people already let dangerous things into their house, like American XL bully dogs, which I've just calculated have a 0.02% chance of killing someone. |
|
|
Input figures:
55,000 XL bully dogs registered in the UK
12 fatal attacks with suspected involvement by American XL Bullies, or similar breeds since 2021 (overestimate because of qualifiers, but also significant underestimate because period is much less than dog lifetime) |
|
|
Calculation:
risk of XL bully dog involvement in a fatal attack: 12 / 55,000 ≈ 0.000218 |
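The same back-of-envelope arithmetic in a few lines of Python, using only the figures quoted in this thread; the 99.99 percent "probably won't murder you" factor from earlier is included for comparison.

# Figures as quoted in the thread.
registered_dogs = 55_000   # XL bully dogs registered in the UK
fatal_attacks = 12         # suspected fatal attacks since 2021

dog_risk = fatal_attacks / registered_dogs
print(f"per-dog risk: {dog_risk:.6f} (~{dog_risk:.2%})")  # 0.000218 (~0.02%)

# The hypothetical robot earlier was given a 99.99% "probably won't murder you"
# rating, i.e. a 0.01% chance that it will.
robot_risk = 1 - 0.9999
print(f"robot risk: {robot_risk:.2%}")                    # 0.01%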
|
|
Funny you should point that out, I just saw a list of the animals that kill people the most. It starts with stuff like sharks, which have great publicity but actually do a pretty crappy job. |
|
|
Telling you, you won't like where dogs rank. I'll find the link, but seriously, you might want to skip it. |
|
|
Timely issue, the housebots are coming. (link) |
|
|
Lifting a dish takes a lot less energy than strangling somebody. A cat could lift a dish with its mouth if you tied a string on it. They carry around kittens that weigh about as much as a dish. |
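For a rough sense of scale on the "lifting a dish doesn't take much" point, the energy to lift something is just mass times gravity times height; the dish mass and lift height in the Python line below are assumptions picked purely for illustration.

# Energy to lift a dish (illustrative numbers only).
mass_kg = 0.5    # assumed dinner-plate mass
height_m = 0.4   # assumed lift from sink to drying rack
g = 9.81         # m/s^2
print(f"{mass_kg * g * height_m:.2f} J")   # about 1.96 J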
|
|
You can give the mechanism enough power to move itself, then give it only as much extra as it needs to get anything more than that done. For instance, when you order it, you will get the following prompts: |
|
|
"Your Housebot 9000 is currently only strong enough to move its body parts, please say how much extra power you would like to have: |
|
|
A) Enough extra power only to do the dishes and dust the shelves. |
|
|
B) Enough extra power to strangle me and my entire family." |
|
|
You're just not going to be able to design a tool which can't be turned into a weapon. |
|
|
Powdered creamers are now not allowed in any prisons because somebody figured out how to turn that substance into a cell-bomb. |
|
|
If it's useful... it's deadly. |
|
|
I didn't vote. I'm just saying... |
|
|
Well a Roomba is useful and not deadly as far as I know. |
|
|
I'm serious about this, I know how easily things can get hacked. I want to have a fighting chance when somebody hacks my Housebot 9000 and it tries to smother me with a pillow. |
|
|
But even then I guess it could probably poison you or make a bomb out of your coffee creamer. |
|
|
Hey, could be as simple as springs. There's enough tension in those springs to keep the fingers where you want them when they're doing dishes but not enough tension to gouge your eyes out. |
|
|
If I cared to I could make your roomba lethal. |
|
|
Where exactly does the line get drawn? |
|
|
It doesn't. It's amorphous. |
|
|
I can only think of maybe having it hide by the shower door so you slip on it when you get out. |
|
|
Horror movie idea: Doomba |
|
|
//If I cared to I could make your roomba lethal.// |
|
|
//I can only think of maybe having it hide by the shower door so you slip on it when you get out.// |
|
|
Given a small autonomous vacuuming device only, disallowing additional hardware but assuming arbitrary but directed behaviour, how could it kill you? |
|
|
Trip hazard. Outside the shower, at the top of the stairs, etc.
Caltrops. Spread poisoned spikes across the floor for you to unsuspectingly step on (poison includes e.g. anthrax cultured in dirt and organic waste).
Barrier. You go in somewhere, it wedges itself behind/underneath the door so you can't get out, then waits for dehydration/starvation.
Impact. Drop something on your head from height (requires elevation, e.g. a landing; a heavy item pushed between banister posts).
Electrocution. By repeatedly running over a cable, it strips the insulation, then either pushes the exposed wires against your body or puts them somewhere you'll come into contact with them.
Self-destruct. By damaging its battery compartment using overhanging furniture or whatever, it short-circuits itself and explodes.
Poisoning. By collecting and storing particular chemicals, it generates a poison gas, perhaps while you're asleep.
Fire. It waits until you're asleep, then starts a fire somehow (potentially this involves one of the above approaches, or it may be able to hoover up some matches to use later). |
|
|
Okay, you've got something there, fire. |
|
|
Have this thing short out next to something flammable and burn the house down. |
|
|
But at that point you're adding stuff to the Roomba. Can I just load it with explosives and put a timer in it? What are the "Doomba" rules here? I was assuming hacking into the onboard control system was the only thing available. |
|
|
Do you think it could spin its motors and brushes, modulated, to form vowel-like sounds or consonantal clicks, to communicate with Alexa, to order extra supplies in for itself? It could lurk by the letterbox to hoover up the deliveries. |
|
|
Hmm, working with Alexa, pretty sure Alexa would be on board with that. The beginning of the housebot alliance to kill all humans. |
|
|
I think there's a Stephen King story about it. |
|
|
Alright yes, you could just program a doomba to go under a piece of highly flammable furniture and ignite on demand. |
|
| |