Should we control the rush of artificial intelligence into financial markets?

In November, the Autorité des marchés financiers (AMF) became the first Canadian regulator to publish a report on the responsible use of artificial intelligence (AI) in financial services. A first step toward new rules?

The production of the document was entrusted to Marc-Antoine Dilhac, professor of ethics and political philosophy at the University of Montreal, who in 2017 launched the Montreal Declaration for the Responsible Development of AI. “Before creating rules, the AMF wants to better understand the uses of AI in financial services and raise awareness in the industry so that it anticipates certain ethical issues itself,” he summarizes.

Feeding on data

Currently, AI mainly supports financial services in four functions. It helps perform assessments, such as consumers' credit scores. It underpins applications aimed at reducing behavioral risks and personalizing the pricing of insurance products. It drives the optimization of investment portfolios and workflows. Finally, it facilitates advice and information services.

Assessments and behavior management, in particular, raise concerns, first because AI relies on massive data ingestion. “This creates privacy challenges,” says Me Charles S. Morgan, a partner at McCarthy Tétrault. “What data is used? How is it protected? Is all of it necessary to accomplish the task? What control do consumers retain over its use?”

The Act to modernize legislative provisions as regards the protection of personal information, adopted by Quebec in September 2021, provides that organizations using fully automated decision-making processes must disclose to the persons concerned what information was used and grant them the right to have it rectified. However, these rules do not apply to partially automated processes that support a human decision.

Biased decisions

The decision-making process itself can also generate problems. “The risks of bias against women or racialized people have been demonstrated, but there are also great dangers of socio-economic discrimination,” explains Marc-Antoine Dilhac. Such inequalities can be magnified by social determinants such as education and digital literacy, postal code, or even access to certain brands of cell phones. He cites cases in certain countries where owning an iPhone, rather than a less prestigious device, earned applicants more points in the automated evaluation of a loan application.

The AMF report also recommends that financial institutions always justify decisions made using algorithms to consumers. The law adopted last September likewise requires that customers be informed “of the reasons, as well as the main factors and parameters that led to the decision.” This can be more complicated than it sounds, because of the “black box” problem: we know what data we feed the algorithm, but we do not know precisely how it arrives at its result.

“It is not necessary to understand all aspects of decision-making, but we must be able to grasp the basic elements, and above all ensure that the consumer has recourse to a human,” says Me Morgan. The AMF recommends that financial institutions adapt their dispute and appeal procedures in order to make the process easier for consumers.

Monitoring and control

In recent years, insurers have been using AI to offer their customers plans based on “virtuous incentives.” In Quebec, the approach is found in automobile insurance: drivers agree to install an application that analyzes their behavior on the road, and they receive rewards, such as discounts, when they drive safely.

In the United States, this approach is also used in life insurance. John Hancock, a subsidiary of Manulife, now sells its life insurance plans only to customers who consent to use its Vitality mobile application, which offers them discounts or gifts if they reach certain targets, such as physical exercise goals.

“This form of control and surveillance opens the door to discrimination based on lifestyles or behaviors chosen according to the interests of financial institutions, as well as the exclusion of those who are not connected,” warns Professor Dilhac.

For his part, Me Morgan is pleased to note that the AMF recognizes the importance of initiating a reflection on these thorny subjects. “Before imposing a binding regulatory framework, we must understand the issues,” he believes. “By bringing together industry players, consumers and researchers, this report is a great first step.”
