Ottawa turns to AI to help manage homelessness

(Ottawa) Ottawa is the latest city to turn to artificial intelligence (AI) to manage its homelessness crisis.


The city is partnering with a Carleton University researcher who is working on a system that can predict an individual’s risk of chronic homelessness.

The researcher who designed the Ottawa project, Majid Komeili, said the system will predict how many nights individuals will spend in a shelter over six months.

“It will be a tool in the toolbox, ensuring that no one falls through the cracks due to human error. The final decision maker will remain a human being,” he said in an email.

The system will use data about people experiencing homelessness, such as age, gender, Indigenous and citizenship status, as well as factors such as the number of times they have been refused shelter services.

It will also use external data such as weather information and economic indicators.
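The inputs described above can be pictured as a per-person record combined with external signals that together yield a predicted number of shelter nights. The sketch below is purely illustrative: the actual Carleton model is not public, and every feature name, weight, and the scoring rule here is a hypothetical stand-in.

```python
# Illustrative sketch only: the real Carleton/Ottawa model is not public.
# All feature names and weights below are hypothetical.

def predicted_shelter_nights(person, external):
    """Toy linear score estimating shelter nights over six months."""
    score = (
        2.0 * person["times_refused_service"]        # service-refusal history
        + 0.5 * person["prior_shelter_nights"]       # past shelter use
        + 5.0 * external["cold_days_forecast"] / 30  # weather signal
        + 10.0 * external["unemployment_rate"]       # economic indicator
    )
    # Clamp to the number of nights in a six-month window.
    return max(0.0, min(score, 183.0))

person = {"times_refused_service": 3, "prior_shelter_nights": 40}
external = {"cold_days_forecast": 60, "unemployment_rate": 0.07}
print(predicted_shelter_nights(person, external))
```

A real system would learn such weights from historical records rather than hand-coding them, which is precisely where the data-quality and bias concerns raised later in this article come in.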

The Ottawa project follows similar initiatives in London, Ontario, and Los Angeles, California.

The information is available in the first place because homeless people are already “heavily monitored” to receive various benefits or treatments, said Renee Sieber, an associate professor at McGill University.

“Unfortunately, homeless people are incredibly monitored and the data is very intrusive.”

Renee Sieber, Associate Professor at McGill University

Data may include details about medical appointments, addictions, relapses and HIV status.

Sieber says the question is whether AI is really necessary. “Do you know more about chronic homelessness from AI than from a spreadsheet?”

A matter of time

It was only a matter of time before AI arrived in the field, said Tim Richter, president of the Canadian Alliance to End Homelessness.

While they aren’t widespread, such tools “can probably, to some extent, anticipate who is most likely to end up chronically homeless,” he said. “Using AI to do that could be very useful in targeting interventions to people.”

Most cities do not have enough reliable data to establish such systems, Richter said.

His organization is working with cities across the country, including London and Ottawa, to help collect better information “in real time, specific to each person,” but “in a way that protects their privacy.”

Chronic homelessness means that a person has been without shelter for more than six months or has experienced repeated episodes of homelessness during that time.
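The definition just given is a two-pronged rule that can be encoded directly. This sketch only restates the article's wording; the "repeated episodes" threshold of two is an assumption, not an official criterion.

```python
def is_chronically_homeless(months_unsheltered, episodes_in_period):
    """Encode the article's definition of chronic homelessness:
    without shelter for more than six months, or repeated episodes
    of homelessness over that period.
    The episode threshold (>= 2) is an assumed reading of "repeated"."""
    return months_unsheltered > 6 or episodes_in_period >= 2

print(is_chronically_homeless(8, 1))  # True: more than six months unsheltered
print(is_chronically_homeless(3, 3))  # True: repeated episodes
print(is_chronically_homeless(2, 1))  # False: brief, single episode
```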

About 85 percent of people move in and out of homelessness quickly, while 15 to 20 percent “get stuck,” Richter said.

AI systems should be able to spot at-risk individuals by looking at aggregate, community-level data, without knowing the specific identity of the individuals involved, Richter said.

This is the approach taken by the Ottawa project. Identifiable information such as names and contact details is replaced by codes.
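Replacing names and contact details with codes can be sketched with a keyed hash: each identifier maps to a stable opaque code, so records about the same person still link up across datasets without exposing who they belong to. This is an assumed design for illustration, not a description of Ottawa's actual pipeline, and real deployments must also guard against re-identification from the remaining fields.

```python
import hashlib
import hmac

# Hypothetical secret: in practice this would be a managed key,
# since an unkeyed hash of a name can be reversed by guessing.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(identifier: str) -> str:
    """Map a name, email, or phone number to a stable opaque code."""
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:12]

code1 = pseudonymize("Jane Doe <jane@example.com>")
code2 = pseudonymize("Jane Doe <jane@example.com>")
print(code1 == code2)  # True: same person always gets the same code
```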

Komeili noted that the system relies on data already collected in previous years, not data gathered specifically for AI processing.

Vinh Nguyen, the City of Ottawa’s manager of social policy, research and analysis, said in a statement that any sharing of data collected by the city “is subject to rigorous internal review.”

“The data we share is often aggregated and where this is not possible, all identifiable information is removed to ensure strict anonymity of users,” he said, adding that collaborations with academics must be reviewed by an ethics committee before work on the data takes place.

Nguyen said the city is currently conducting “internal testing and validation” and plans to consult with the shelter sector and clients before implementing the model, with consultations scheduled for late fall.

Beware of bias

Alina Turner, co-founder of HelpSeeker, a company that uses AI in products that address social issues, said AI’s “superpowers” can be useful when it comes to a comprehensive analysis of the factors and trends that fuel homelessness.

But her company made a conscious choice not to predict risks at the individual level, she said.

“You can have a lot of problems with bias,” she said, noting that the data varies across different communities and that “the racial bias in that data is also a major challenge.”

For example, due to systemic factors, Indigenous individuals are at higher risk of homelessness.

If an AI system were to automatically assign a higher score to someone the moment they enter a shelter and identify as Indigenous, “that approach raises a lot of ethical issues,” said Turner.

Luke Stark, an assistant professor at Western University, is working on a project that explores the use of data and AI for homelessness policy in Canada.

He says new technologies may risk obscuring the root causes of the problem and preventing politicians from thinking about solutions.

“One of our concerns is that all this focus on triage-based solutions then takes the pressure off policymakers to actually look at the structural causes of homelessness that exist in the first place,” he said.

He added that another issue human decision-makers need to think about is how predictions can miss certain segments of the homeless population.

Women are more likely to avoid shelters, for safety reasons, and turn to options such as staying on a friend’s couch.

An AI system using shelter data will focus on “the type of people who are already using the shelter system […] and that leaves a whole bunch of people out.”

