
The debate over racial bias in tech has been renewed after a university in the United States claimed it can “predict criminality” through facial recognition.
Researchers at Harrisburg University claim that they can “predict if someone is a criminal based solely on a picture of their face” through software “intended to help law enforcement prevent crime”.
One member of the Harrisburg research team claimed that “Identifying the criminality of [a] person from their facial image will enable a significant advantage for law-enforcement agencies and other intelligence agencies to prevent crime from occurring.”
The university had said that the research would be included in a Springer Nature book; however, Springer has stated that the paper was “at no time” accepted, saying the research “went through a thorough peer review process. The series editor’s decision to reject the final paper was made on Tuesday 16 June and was officially communicated to the authors on Monday 22 June.”
Whilst the Harrisburg researchers claim their technology holds “no racial bias”, the research has drawn considerable backlash, with 1,700 academics signing an open letter demanding it remain unpublished.

The Coalition for Critical Technology, which organised the open letter, stated that “Such claims are based on unsound scientific premises, research, and methods, which numerous studies spanning our respective disciplines have debunked over the years”, and that “all publishers must refrain from publishing similar studies in the future”.
The group has drawn attention to the distorted data feeding perceptions of what a criminal “looks like”, pointing to a number of studies suggesting harsher treatment of ethnic minorities throughout the criminal justice system.
Krittika D’Silva, a computer-science researcher at the University of Cambridge, commented: “It is irresponsible for anyone to think they can predict criminality based solely on a picture of a person’s face.
“The implications of this are that crime ‘prediction’ software can do serious harm – and it is important that researchers and policymakers take these issues seriously.”
D’Silva also pointed to the body of research revealing bias in machine learning: “Numerous studies have shown that machine-learning algorithms, in particular face-recognition software, have racial, gendered, and age biases.”
Harrisburg University has decided not to publish the paper on the facial-recognition software, stating that the news release outlining the research, titled “A Deep Neural Network Model to Predict Criminality Using Image Processing”, has been removed at the request of the faculty involved, and that the publication the research was to appear in has since decided against publishing it. The university states on its website:
“Academic freedom is a universally acknowledged principle that has contributed to many of the world’s most profound discoveries. This University supports the right and responsibility of university faculty to conduct research and engage in intellectual discourse, including those ideas that can be viewed from different ethical perspectives. All research conducted at the University does not necessarily reflect the views and goals of the University.”
