IBM says it is getting out of the facial recognition business over concern about how it can be used for mass surveillance and racial profiling. Picture: AP Photo/Mary Altaffer
IBM announces end of facial recognition software after damning study

By Floyd Matlala Time of article published Jun 10, 2020

Tech giant IBM has announced the end of its general-purpose facial recognition software, citing concerns over mass surveillance and racial profiling, after a US government study found that such systems can be less accurate at identifying African-American faces.

IBM’s announcement comes as the US faces calls for police reform following the killing of George Floyd.

According to the BBC, IBM said in a letter to the US Congress that AI systems used in law enforcement needed to be tested "for bias".

In the letter, the company’s chief executive, Arvind Krishna, said the fight against racism is as "urgent as ever", and set out three areas where the firm wanted to work with Congress: police reform, responsible use of technology, and broadening skills and educational opportunities.

"IBM firmly opposes and will not condone the uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms," he said in a letter.

"We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies", he wrote.

Instead of relying on potentially biased facial recognition, the firm urged Congress to adopt technology that would bring "greater transparency", such as body cameras on police officers and data analytics.

Data analytics is more integral to IBM's business than its facial recognition products were. The company has also developed technology for predictive policing, which has likewise been criticised for potential bias.

IOL TECH
