Joe Rogan hosted Spotify's most-listened-to podcast of 2021. He is tied to Spotify by an exclusivity deal reportedly worth millions of dollars. Rogan has been accused of discouraging vaccination among young people and of promoting a coronavirus treatment not authorized by health authorities. A large group of American medical professionals expressed their concerns after Rogan gave the floor to Robert Malone, a physician popular in anti-vaccine circles.
Legendary artists such as Neil Young, Joni Mitchell and Gilles Vigneault have expressed their discomfort with the presence of Joe Rogan's podcast on Spotify. They withdrew their works from the platform, prompting some subscribers to desert it as well.
These mega-platforms move only under popular or political pressure. For now, the main means left to artists to protest the availability of problematic content on Spotify is to withdraw their own works. But as professors Romuald Jamet and Guillaume Blum recently pointed out in a piece published in Le Devoir, the issue is more fundamental. These online platforms are data-harvesting machines, and it is on this front that they must be held accountable.
A few months ago, Twitter and Facebook were called out. They were accused of arbitrarily excluding certain content, closing accounts, and judging the acceptability of content on a whim or in response to popular pressure. For the time being, when things go off the rails, we must rely on the capacity for indignation of political leaders or music icons. This says a lot about the aberrant nature of the legal framework that applies to social media.
The operation of these platforms is incompatible with several requirements considered essential in most democratic countries. First, these platforms are not configured to ensure upstream that the content they convey complies with the law. For example, Spotify is not equipped to distinguish fact from falsehood in controversial statements, such as positions on the merits of vaccination. As on other social networks, concerns are addressed, from time to time, by attaching warnings to certain content. But it is not reassuring, and is even contrary to the basic principles of the rule of law, that a commercial enterprise finds itself in a position to judge whether a given podcast or musical work contravenes the law or veers into conspiracy theorizing.
These platforms use algorithms and compile massive amounts of data in order to maximize their revenue. Through various methods, platforms such as Spotify, but also YouTube or, in different ways, Twitter and Facebook, collect, compile and monetize masses of user data. This allows a company like Spotify to feed the algorithms that generate the playlists served to users and, above all, to sell targeted advertising.
Spotify makes music and other content available on a "revenue sharing" basis between the platform and creators. Beyond legitimate questions about the paltry levels of compensation paid to artists, we must tackle the opacity of the processes that determine what is displayed on each user's screen. Otherwise, it is our freedom of attention that stands to be lost.
To feed the systems that predict what is likely to match users' tastes, as well as to sell targeted advertising, Spotify accumulates masses of data. Compiling this data makes it possible to refine personalized music recommendations and, above all, to ensure the platform's profitability, since advertisers can be offered advertising targeted to individual preferences.
Spotify, like other platforms, aims to keep its users on the service for as long as possible. It does this by recommending songs that make users keep listening or clicking, meaning songs they are likely to enjoy the moment they hear them. This leads to recommending works matching users' immediate preferences, generally the music they already like, and leaves little room for the promotion of new works or works from minority cultures. In short, this model tends to neglect the music that users might come to like if it were offered to them.
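To make that logic concrete, here is a minimal, hypothetical sketch of a similarity-based recommender of the kind described above. The track names, "taste profile" features and scoring are invented for illustration only; Spotify's actual system is proprietary and far more complex, but the basic bias toward what a user already likes can be seen even in this toy version.

```python
# Toy sketch: recommend tracks most similar to what the user already plays.
# All catalogue entries and feature values are invented for illustration.
from math import sqrt

# Each track is described by a made-up feature vector (e.g. energy,
# acousticness, tempo), normalised to the 0-1 range.
catalogue = {
    "mainstream_pop_hit":   [0.90, 0.10, 0.80],
    "similar_pop_track":    [0.85, 0.15, 0.75],
    "minority_folk_song":   [0.20, 0.90, 0.30],
    "experimental_jazz":    [0.40, 0.60, 0.50],
}

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def recommend(listening_history, n=2):
    """Rank catalogue tracks by similarity to the user's average profile."""
    profile = [sum(col) / len(listening_history) for col in zip(*listening_history)]
    scores = {track: cosine_similarity(profile, features)
              for track, features in catalogue.items()}
    return sorted(scores, key=scores.get, reverse=True)[:n]

# A user whose history is only mainstream pop is served more of the same:
print(recommend([catalogue["mainstream_pop_hit"]]))
# -> ['mainstream_pop_hit', 'similar_pop_track']
# Novel or minority works rank last, which is the neglect described above.
```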
Spotify and other social media platforms embody environments that are only beginning to take hold in our daily lives. These mechanisms for disseminating ideas and creations depend on the data produced by users' own actions.
When a company makes such massive use of information, it is essential that the logic underlying its calculations can be known and, above all, debated. That is why the law must impose obligations on the uses of this data, not only to protect individuals' privacy, but also, and above all, to guarantee that the decisions made with this mass of data are transparent and fair. There must be more transparency about the effects of these technical devices on the choices offered to users and the content made available to them.
With regard to the distribution of musical and audiovisual works, Bill C-11, tabled last week in Ottawa, aims to put in place mechanisms that could increase transparency about the operating logic of the algorithms of platforms such as Spotify. It is a step in the right direction.