The AI Book
    Daily AI News

    Advanced army robots more likely to be blamed for deaths

    14 March 2024


    Advanced killer robots are more likely to be blamed for civilian deaths than conventional military machines, new research has revealed.

    The University of Essex study shows that high-tech bots will be held more responsible for fatalities in identical incidents.

    Led by Dr Rael Dawtry of the Department of Psychology, it highlights the impact of autonomy and agency.

    It showed that people perceive robots to be more culpable when they are described in more advanced terms.

    It is hoped the study — published in The Journal of Experimental Social Psychology — will help influence lawmakers as technology advances.

    Dr Dawtry said: “As robots are becoming more sophisticated, they are performing a wider range of tasks with less human involvement.

    “Some tasks, such as autonomous driving or military uses of robots, pose a risk to people’s safety, which raises questions about how — and where — responsibility will be assigned when people are harmed by autonomous robots.

    “This is an important, emerging issue for law and policy makers to grapple with, for example around the use of autonomous weapons and human rights.

    “Our research contributes to these debates by examining how ordinary people explain robots’ harmful behaviour and showing that the same processes underlying how blame is assigned to humans also lead people to assign blame to robots.”

    As part of the study Dr Dawtry presented different scenarios to more than 400 people.

    One saw them judge whether an armed humanoid robot was responsible for the death of a teenage girl.

    During a raid on a terror compound its machine guns “discharged” and fatally hit the civilian.

    When reviewing the incident, the participants blamed a robot more when it was described in more sophisticated terms despite the outcomes being the same.

    Other studies showed that simply labelling a variety of devices ‘autonomous robots’ led people to hold them more accountable than when they were labelled ‘machines’.

    Dr Dawtry added: “These findings show that how robots’ autonomy is perceived — and in turn, how blameworthy robots are — is influenced, in a very subtle way, by how they are described.

    “For example, we found that simply labelling relatively simple machines, such as those used in factories, as ‘autonomous robots’ led people to perceive them as agentic and blameworthy, compared to when they were labelled ‘machines’.

    “One implication of our findings is that, as robots become more objectively sophisticated, or are simply made to appear so, they are more likely to be blamed.”
