London
Monday, November 28, 2022

Peers challenge police use of artificial intelligence

By Shreya Christina
Law enforcement agencies' use of artificial intelligence and facial recognition technology is not subject to proper oversight and risks exacerbating discrimination, peers have warned.

New technologies were being created in a “new Wild West” without the law and public awareness keeping up with developments, a parliamentary committee said.

The Lords Justice and Home Affairs Committee warned that the lack of oversight meant “users are in effect making it up as they go along”.

A board detailing facial recognition technology in use in Leicester Square, London (Kirsty O'Connor/PA)

The cross-party group said AI had the potential to improve people’s lives but could have “serious implications” for human rights and civil liberties in the justice system.

“Algorithms are being used to improve crime detection, aid the security categorisation of prisoners, streamline entry clearance processes at our borders and generate new insights that feed into the entire criminal justice pipeline,” the peers said.

The committee said scrutiny was not taking place to ensure that new tools were "safe, necessary, proportionate and effective".

“Instead, we uncovered a landscape, a new Wild West, in which new technologies are developing at a pace that public awareness, government and legislation have not kept up with.”

Police forces and other agencies were buying equipment in a “worryingly opaque” market, with details of how systems work kept secret due to firms’ insistence on commercial confidentiality.

The peers also had concerns about AI being used in “predictive policing” – forecasting crime before it happened.

There was a danger it could make problems of discrimination worse by embedding in algorithms the “human bias” contained in the original data.

Professor Karen Yeung, an expert in law, ethics and informatics at the University of Birmingham, told the committee that “criminal risk assessment” tools were not focused on white-collar crimes such as insider trading, due to the lack of data, but were instead focused on the kind of crimes for which there was more information.

Prof Yeung said: “This is really pernicious. We are looking at high-volume data that is mostly about poor people, and we are turning it into prediction tools about poor people.

“We are leaving whole swathes of society untouched by those tools.”

The peers called for a mandatory register of algorithms used in criminal justice tools, a national body to set standards and certify new technology and new local ethics committees to oversee its use.

Baroness Hamwee, the Liberal Democrat chairwoman of the committee, said: “What would it be like to be convicted and imprisoned on the basis of AI which you don’t understand and which you can’t challenge?

“Without proper safeguards, advanced technologies may affect human rights, undermine the fairness of trials, worsen inequalities and weaken the rule of law. The tools available must be fit for purpose, and not be used unchecked.”
