US to Enforce Rights Laws With Respect to AI Harms, Agencies Say – Bloomberg Law

The Department of Labor, the FTC, and seven other federal agencies are doubling down on their commitment to enforce existing laws, including civil rights and fair competition laws, as artificial intelligence spreads.

In a statement Thursday, the agencies pledged to “vigorously use our collective authorities to protect individuals’ rights regardless of whether legal violations occur through traditional means or advanced technologies.”

The statement was signed by officials from the Consumer Financial Protection Bureau, the Federal Trade Commission, the Equal Employment Opportunity Commission, and the departments of Justice, Housing and Urban Development, Education, Health and Human Services, Homeland Security, and Labor.

Companies increasingly rely on artificial intelligence to make decisions about individuals, from choosing job applicants to setting mortgage rates. But biases can creep into models through bad data, opacity in how the models perform, and incorrect use of AI tools, the statement said—and they can cause widespread harms when systems are deployed broadly.

“Although many of these tools offer the promise of advancement, their use also has the potential to perpetuate unlawful bias, automate unlawful discrimination, and produce other harmful outcomes,” the agencies said.

The FTC has already taken action on AI-related infractions, including banning Rite Aid’s use of facial recognition technology to catch shoplifters, which the FTC said incorrectly flagged many women and people of color. The Labor Department’s Office of Federal Contract Compliance Programs has said it will look at adverse impacts from AI selection tools. And agencies including the CFPB, the EEOC, and HHS are releasing guidance on how existing laws apply to AI.

Thursday’s statement follows a similar one released last April by the FTC, EEOC, CFPB, and DOJ warning companies that the agencies would enforce existing law even when AI was involved.
