
Do I Have a Moral Duty to Submit to a Police Robot?

A request for help:

I am concerned that law enforcement agencies are increasingly using robots to neutralize threats, conduct surveillance, and intervene in hostage situations. Maybe I’ve just watched RoboCop too many times, but I am wary of machines making life-or-death decisions, especially given how frequently real human officers abuse their authority. Do I have a moral obligation to submit to a police robot?

“SUSPECT”

Dear Suspect,

Hollywood has not been particularly optimistic about robots in positions of authority. RoboCop is just one example in a broader science fiction canon about the dire consequences of leaving critical tasks to inflexible machines, which are prone to a literalism that can turn prime directives deadly yet can be flummoxed by a flight of stairs. The message of these films is clear: rigid automatons are incapable of the improvised solutions and moral nuance that moments of crisis so often demand.

Perhaps it was this stereotype that Boston Dynamics, which has been supplying some robots to police departments, hoped to dispel when it released a video of its models dancing to the Contours’ 1962 hit “Do You Love Me.” Maybe you saw it? Among the robots were Atlas, an android that resembles a stormtrooper, and Spot, the machine that inspired the killer dogbots in the “Metalhead” episode of Black Mirror. Neither machine seems designed to allay fears of a robot takeover, so what better way to endear them to the public than to show off their agility at a skill we consider so distinctly human that we invented a dance move (the Robot) to mock an automaton’s inability to perform it? When we see machines shuffle, shake, and spin, it is hard not to see them as living, embodied creatures with the same flexibility and sensitivity as us.

Never mind that Spot’s pincers could snap your finger, or that police robots have already been used to deliver lethal force. There is a way to answer your question, Suspect, on purely pragmatic grounds, without any recourse to moral philosophy: like most of us, if you plan on surviving, then yes, you should fully obey a police robot.

But I take it that is not really what you are asking. And I agree that it is important to weigh the trade-offs that come with handing police duties over to machines. The Boston Dynamics video, incidentally, was released in late 2020 to “celebrate the start of what we hope will be a happier year.” A week later, insurrectionists stormed the Capitol, and images of police offering little resistance to the mob proliferated on social media, a stark contrast to the harsher responses to the Black Lives Matter protests of the previous summer.

At a time when many police departments are facing a crisis of legitimacy over racialized violence, the most compelling argument for robotic policing is that machines have no innate capacity for prejudice. To a robot, a person is a person, regardless of skin color or gender. As the White House noted in a 2016 report on algorithms and civil rights, new technologies “can help law enforcement make decisions based on factors and variables that empirically correlate with risk, rather than on flawed human instincts and prejudices.”

Of course, if existing police technology is any indication, things are not so simple. The predictive policing algorithms used to identify high-risk people and neighborhoods are notoriously biased, something the roboticist Ayanna Howard has called “the original sin of AI.” Because these systems rely on historical data (past court cases, previous arrests), they end up singling out the same communities that have been unjustly targeted all along, reinforcing structural racism. The automated predictions become self-fulfilling, locking certain quadrants into a pattern of over-policing. (Officers who arrive at a scene expecting to find a crime are primed to discover one.) These tools, in other words, do not so much neutralize prejudice as formalize it, turning existing social inequities into systems that perpetuate them unconsciously and mechanically. As the digital ethics professor Kevin Macnish warns, the values of an algorithm’s authors are “frozen into the code, effectively institutionalizing those values.”
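The mechanism at work here, predictions trained on arrest records steering patrols back toward the same neighborhoods, is a feedback loop, and a toy simulation makes it easy to see. The sketch below is purely illustrative, with invented numbers and no connection to any real system: two districts have identical true crime rates, but district 0 starts with more recorded arrests, and patrols are allocated in proportion to those records.

# A minimal, hypothetical sketch of the predictive-policing feedback
# loop described above; all figures are invented for illustration.
import random

random.seed(0)  # reproducible illustration

TRUE_CRIME_RATE = 0.1    # identical underlying rate in both districts
PATROLS_PER_ROUND = 100
arrests = [60, 40]       # historical records: district 0 over-represented

for _ in range(20):      # twenty rounds of predict, patrol, record
    total = sum(arrests)
    # Allocate patrols in proportion to past arrest counts.
    patrols = [round(PATROLS_PER_ROUND * a / total) for a in arrests]
    for district, n_patrols in enumerate(patrols):
        for _ in range(n_patrols):
            # A crime enters the record only where a patrol is present.
            if random.random() < TRUE_CRIME_RATE:
                arrests[district] += 1

print(f"District 0's share of recorded arrests: {arrests[0] / sum(arrests):.0%}")

Even though both districts offend at exactly the same rate, nothing in the loop ever pulls the allocation back toward parity: the extra patrols sent to district 0 generate the extra records that justify sending them again, which is precisely the self-fulfilling pattern described above.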


