In late April, the Consumer Financial Protection Bureau, along with three other federal regulatory agencies, released a joint statement outlining a commitment to enforce their respective laws and regulations as they apply to artificial intelligence.

On Tuesday, the CFPB built on that pledge, made jointly with the Civil Rights Division of the Department of Justice, the Federal Trade Commission, and the U.S. Equal Employment Opportunity Commission, by announcing that the bureau will create rules to curb “harmful data broker practices.”

Speaking at an event convened by White House National Economic Council director Lael Brainard and White House Office of Science and Technology Policy director Arati Prabhakar, CFPB director Rohit Chopra outlined the bureau’s regulatory intentions.

To ensure that modern-day data companies assembling profiles about consumers are meeting their obligations under the Fair Credit Reporting Act, Chopra said the CFPB will be developing rules to prevent “misuse and abuse” by data brokers.

Chopra explained that rules under consideration will define a data broker that sells certain types of consumer data as a “consumer reporting agency” to better reflect today’s market realities. He said the CFPB is also considering a proposal that would generally treat a data broker’s sale of data regarding, for example, a consumer’s payment history, income, and criminal records as a consumer report, because that type of data is typically used for credit, employment, and certain other determinations.

This rule would trigger requirements for ensuring accuracy and handling disputes of inaccurate information, as well as prohibit misuse, according to Chopra.

Chopra also mentioned that a second proposal under consideration will address “confusion” around whether “credit header data” constitutes a consumer report. Chopra said that much of the current data broker market runs on personally identifying information taken from traditional credit reports generated by Equifax, Experian, and TransUnion.

Chopra pointed out that key identifiers such as name, date of birth, and Social Security number are contained in consumer reports generated by the credit reporting companies.

He said the CFPB expects to propose to clarify the extent to which credit header data constitutes a consumer report, reducing the ability of credit reporting companies to impermissibly disclose sensitive contact information that can be used to identify people who don’t wish to be contacted, such as domestic violence survivors.

“Today, artificial intelligence and other predictive decision-making increasingly relies on ingesting massive amounts of data about our daily lives. This creates financial incentives for even more data surveillance. This also has big implications when it comes to critical decisions, like whether or not we will be interviewed for a job or get approved for a bank account or loan. It’s critical that there’s some accountability when it comes to misuse or abuse of our private information and activities,” Chopra said.

“The Consumer Financial Protection Bureau is pleased to be part of an all-of-government effort to tackle the risks associated with AI. After conducting an inquiry into the practices of data brokers in the surveillance industry, we have decided to launch a rulemaking to ensure that modern-day digital data brokers are not misusing or abusing our sensitive data,” he continued.

Chopra indicated the CFPB will publish an outline of proposals and alternatives under consideration for a proposed rule next month. Then the rule will be available for public comment in 2024, according to the CFPB director.

“We look forward to obtaining public input on the proposals under consideration. More importantly, as AI increases the processing of sensitive personal data, we hope this will bring much-needed accountability to the dark corners of the data broker market,” Chopra went on to say.