AI bias is a complex issue, especially in security and building management applications. Eliminating algorithmic bias on its own is not sufficient: even an unbiased algorithm can be used in ways that introduce bias and compromise privacy.
Actuate is built from the ground up to avoid algorithmic and user bias, respect privacy, and remain compliant by design. Here’s how we did it.
How Actuate Addresses Bias and Privacy:
Our Team's Perspective
We regularly publish articles on bias and privacy.
We Don’t Have to Sacrifice Students’ Privacy for Campus Security
Colleges and universities face a distinctly modern conundrum: They want and need to keep students safe, but smart security technologies that can track and monitor students’ activities on and off campus threaten their right to privacy. Schools and technology vendors must collaborate to find solutions that increase campus security while also protecting individual privacy.
The False Choice Between Bias, Privacy and Safety in Smart Surveillance
Must we choose among privacy, bias, and safety in surveillance? As artificial intelligence continues to advance, we’ve seen an increased focus on bias and privacy. A 2019 report from the American Civil Liberties Union contains an ominous warning: artificial intelligence-enabled video surveillance will soon compromise our civil liberties in dangerous ways. We consider this a false choice between privacy and safety in smart surveillance.
Privacy-First Tech Doesn’t Have to Suck
The world is on the cusp of a reckoning about the value and sensitivity of data. In Europe, GDPR is forcing companies to think harder about how they store and manage data, especially across borders. Even in the United States, long the Wild West of data privacy and regulation, high-profile voices are starting to call for regulation. How should leaders respond?