A survey of public attitudes toward robots yields an interesting “finding”: don’t use robots to care for children or the elderly – they are made for killing people, so let them get on with it! Okay, that’s a slight exaggeration, but probably less far from the study report than the study report is from properly answering any research question on the subject. It’s fun to talk about robots as machines that will take over the world – hardly a new phenomenon. But there’s an interesting nugget hidden in the findings, or at least a hypothesis worth testing…
The general view seems to be that robotic applications (or any non-mainstream technology or innovation) are for “the other”, not for my social group: robots should perform their military tasks upon “the other”, but not perform tasks upon my in-group.
I’m not claiming this hypothesis is exactly right, but there’s clearly group psychology at play in people’s responses to these “unknown entities” – not so much unknown as misunderstood, since most of the “education” comes from fiction, especially science fiction.
And to state the obvious (in case you think I missed it), the survey does highlight two already obvious and very useful insights. First, don’t attempt to market robotics to the mass market for human-caring applications, as there are no generally known and liked applications in niche segments that could have captured the mass imagination yet. Second, if you are going to market such a product to such a market, be prepared to educate against the early “robots serve Skynet, not mankind” attitude – or else select your target group to avoid that wasted effort, and gather references and testimonials that make it unnecessary to educate subsequent niches.