Minister Bernard Drainville’s education dashboard raises fears.

We now have, in education, a brand new tool: a dashboard. It provides real-time information on nine categories of data: the number of students; graduation and qualification rates by secondary school cohort; vocational training graduation rates; the proportion of students entering secondary school late; results on the ministerial examinations in Secondary 4 and 5; vacant positions among school staff; air quality in schools; and the structure of the school network.

The reactions were quick. For my part, I wonder how we ever managed without this data, which is essential for action. And we need even more of it. We want to know what is going well and what is going badly, and to be able to try to understand why. Many share this view. But not everyone.

Very negative, even outright hostile, reactions were heard. They were predictable. They come from people whose a priori rejection of measurement and evidence has long been known and well documented. We saw this again recently with regard to the National Institute of Excellence in Education (INEE).

The minister seems not to pay much attention to these voices. He is right.

But there were other, more nuanced critical reactions that deserve to be heard. For the fact is that, in an immensely complex field where facts and values, means and ends inevitably mix, and where countless ideological debates are constantly playing out, the idea of measuring must be approached with caution.

To see why, let's take a step back from all this.

Measure. Do it wisely.

To begin with, what we choose to measure and how we define it may already reflect questionable ideological choices. It is not the same thing to quantify, say, the number of students as it is to quantify the ability to exercise one's citizenship well.

Then, the way of measuring can itself be debated. We must not only clearly define what we are measuring, but also ensure that the measurement does its job well. In technical terms, it must be valid and reliable. Methodologists and statisticians, welcome to the table.

Finally, and at the risk of going too quickly, there is the question of what we will do with these measurements, something that is sometimes, as I said, already perceptible in the choice of what is measured and how. Who will have access to this data? What consequences will we draw from it? And who will draw them?

I hope I have given you a glimpse of the fact that nothing here is simple.

And it is in particular on the question of a ranking of schools – which the data collected would make it possible to establish – that criticisms, in my opinion legitimate, have arisen.

Such a ranking could, for example, encourage actors to falsify data in order to look better, or fuel a competitive race among parents toward higher-performing schools, thereby deepening inequalities in the school system that are already distressingly large.

But I think there is no avoiding it: with this data accessible, under this name or another, one or more school rankings will be offered here and there.

How to do it accurately and wisely?

A Quebec IVAL…

To do this, we must remember that the composition of the student body in each school reflects economic, social and cultural inequalities that have an immense impact on student success. An honest ranking must take this into account, which means, in simple terms, that a modest rate of progress in a school whose students come from a very economically and culturally advantaged background does not have the same meaning, the same value, as the same rate in a school that serves very disadvantaged students.

As it happens, France, in the indicators it publishes on the value of its educational establishments, has put in place something that goes in this direction, called the indicator of the added value of high schools (IVAL). I think we could design an indicator of this type that takes into account the different student populations and the costs of their education for the community. Surprising light could then be shed on our three-tier school system, on its costs and on its merits.
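To make the idea concrete, here is a minimal sketch, and not the actual IVAL methodology: predict each school's result from the socioeconomic profile of its students, then look at the gap between the observed and the predicted result. All figures, variable names and the use of a simple linear fit below are illustrative assumptions.

import numpy as np

# Hypothetical data only. ses_index measures how advantaged a school's
# student body is (higher = more advantaged); grad_rate is its observed
# graduation rate.
ses_index = np.array([0.9, 0.7, 0.5, 0.3, 0.1])
grad_rate = np.array([0.92, 0.88, 0.80, 0.78, 0.70])

# Expected result given student composition: a simple linear fit.
slope, intercept = np.polyfit(ses_index, grad_rate, 1)
expected = intercept + slope * ses_index

# "Value added" = observed minus expected. A school serving very
# disadvantaged students that beats its prediction ranks above an
# advantaged school that merely matches its own.
value_added = grad_rate - expected
for school, va in enumerate(value_added, start=1):
    print(f"School {school}: value added {va:+.3f}")

A real indicator would of course require richer data, proper statistical modelling and careful validation, which is precisely where the methodologists and statisticians mentioned above come in.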

We need data, valid and reliable data. Has a given practice implemented here (banning cell phones, for example) had the expected results? We want to know.

Do we have a credible way to measure student well-being? Let’s measure it. That of teachers? Let’s measure it too. (This could be very useful, especially these days…)

And, more than ever, we must salute the creation of the INEE.
