Experts tell legislators about ‘black box’ AI

Artificial intelligence has captured the attention of the public and lawmakers alike, including members of the New Mexico Legislature.

The Interim Legislative Science, Technology and Telecommunication Committee discussed at its meeting Monday how to build transparency into artificial intelligence programming used for public resources and services.

The discussion opened with a presentation by Santa Fe Institute Professor Cristopher Moore, who noted that he would be talking about algorithmic issues rather than generative AI.

“There are various automated methods, or as they’re sometimes called, automated decision systems, which both the public and private sectors use to make decisions, including decisions about us, the citizens,” Moore said.

In the public sector, these systems are used for pretrial detention and predictive policing, and in health care and other social services to detect fraud, offer suggestions to caseworkers and manage public housing waiting lists.

In the private sector, these algorithms can be used for automated hiring and for credit, lending and tenant screening.

A term that often comes up in AI discussions is “black box” AI: a system that does not reveal its inputs and internal operations to users or other interested parties, and therefore lacks transparency.

Because of that opacity, Moore opposes the use of black box AI technology.

“For a lot of these algorithms, they’re not written in the traditional sense,” Moore said. “It’s not that a human programmer sits down and writes the rules. Typically, they’re given some data and said, ‘Okay, you figure out a rule which would perform well on that data’ and that’s what we call the training process and there are different methods. But I think there’s also other levels of transparency that are important.”
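For readers who want a concrete picture of the training process Moore describes, the sketch below (in Python, using the widely available scikit-learn library) shows a rule being learned from invented historical data rather than written by a programmer. The feature names, data and outcomes are hypothetical, not drawn from any system discussed at the hearing.

```python
# A minimal, invented sketch of how an "automated decision system" learns its
# rule from historical data instead of a programmer writing the rule by hand.
# The feature names, data and outcomes are hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Hypothetical historical records: [prior_missed_appointments, years_at_address]
X_train = rng.integers(0, 10, size=(500, 2))
# Hypothetical past outcomes the model is asked to imitate (1 = flagged).
y_train = (X_train[:, 0] > 5).astype(int)

# "Training": the algorithm searches for a rule that performs well on this data.
model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_train, y_train)

# The learned rule is then applied to new people -- possibly from a population
# that looks nothing like the one the training data came from.
new_applicant = np.array([[3, 1]])
print(model.predict(new_applicant))  # the decision for this person
print(export_text(model, feature_names=[
    "prior_missed_appointments", "years_at_address"]))  # the rule it learned
```

The printed rule illustrates Moore’s point: whatever pattern the system finds in its training data is the rule it applies to new people, wherever and whenever that data was collected.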

Those levels of transparency include the principle that when a government agency makes a decision about someone, that person has a “right to understand how that decision was made. It’s a matter of due process,” Moore said.

Other issues include how an AI system or algorithm was trained.

“Many of these systems have been trained elsewhere in the country, from data from other states, other cities, and often data that’s several years old,” Moore said.

Because of New Mexico’s distinctive racial and economic demographics, data an AI system was trained on elsewhere may not reflect the state’s population.

The presentation raised more questions than answers. Part of the discussion turned to what other states are doing on AI transparency.

One of those states was Vermont, which passed a law in 2022 stating, among other things, that “No state agency shall enter into any contract to purchase, lease, or use a tool unless the vendor discloses enough about the algorithm to make these independent audits possible.”
