
AI Bill of Rights shows Big Tech’s teeth

05 October 2022

White House powerless to stop it

Attempts by the White House to bring in a bill of rights for the age of algorithms have shown how weak elected governments are against the money and power of Big Tech.

The White House Office of Science and Technology Policy (OSTP) released the Blueprint for an AI Bill of Rights after gathering input from companies like Microsoft and Palantir, as well as AI auditing startups, human rights groups, and the general public. It looks rather good on paper.

Its five principles state that people have a right to control how their data is used, to opt out of automated decision-making, to live free from ineffective or unsafe algorithms, to know when AI is making a decision about them, and to not be discriminated against by unfair algorithms.

However, unlike the better-known US Bill of Rights, which comprises the first 10 amendments to the constitution, the AI version will not have the force of law—it’s a nonbinding white paper. That means Big Tech can, and probably will, ignore it.

The OSTP said that its Blueprint for an AI Bill of Rights is “just the beginning and the start.”

Annette Zimmermann of the University of Wisconsin-Madison told Wired that while she is impressed with the AI Bill of Rights, she believes the blueprint shies away from acknowledging that in some cases rectifying injustice can require not using AI at all.

“We can’t articulate a bill of rights without considering non-deployment, the most rights-protecting option,” she says. Zimmermann would also like to see enforceable legal frameworks that can hold people and companies accountable for designing or deploying harmful AI.

The limited bite of the White House’s AI Bill of Rights stands in contrast to the toothier AI regulation currently under development in the European Union. But that might be because EU governments have a little more confidence about telling big corporates what to do.

