New report shows internet platforms are not doing enough to fight misinformation

Report co-authored by DCU’s Institute for Future Media, Democracy and Society

The CoP Monitor, a new report co-authored by the EDMO Ireland hub in DCU’s Institute for Future Media, Democracy and Society (FuJo), shows that social media platforms are not doing enough to fight misinformation. The report is an international collaboration between nine academics, who conducted a systematic analysis of all the information provided by Google, Meta, Microsoft, TikTok and X (formerly Twitter) in the first self-reports submitted under the Strengthened Code of Practice on Disinformation.

The report finds that 55% of qualitative information requests across the signatories were either missing or incomplete due to irrelevant or unclear answers. That figure rises to 64% for quantitative information requests, where signatories often provided imprecise data or no data at all.

Only the ‘Political Advertising’ section of the Code received an ‘adequate’ grade in the report, while commitments relating to ‘Empowering Researchers’ and ‘Empowering the Fact-checking Community’ were rated particularly poorly.

Lead author Dr Kirsty Park, a post-doctoral researcher at FuJo and Policy Lead of the EDMO Ireland hub, said:

“The Code is really all about transparency. It’s easy for platforms to say that they take these issues seriously, but what the Code asks them to do is to show us evidence that this is true. While these reports show great progress and we can see platforms are taking action, there are still many instances where requested data is missing or answers are irrelevant, and that simply isn’t acceptable when you are talking about some of the biggest companies in the world. They have the resources and a societal responsibility to do better, as well as new legal obligations under the Digital Services Act.”

The Code, which was redeveloped in 2022, contains 44 commitments covering topics such as demonetising disinformation in online advertising, labelling political advertising, combatting deepfakes and AI-manipulated content, and empowering users, researchers and fact-checkers. Signatories select which commitments they agree to meet, and every six months they must provide detailed information against Key Performance Indicators to demonstrate that they are meeting their obligations.

X, formerly known as Twitter, withdrew from the Code this year, provoking a strong reaction from the European Commission, with Internal Market Commissioner Thierry Breton warning that “You can run but you can’t hide”. However, the CoP Monitor report shows that X was already displaying signs of non-cooperation in its reporting under the Code, scoring a ‘Poor’ grade in all sections and leaving most of its report blank.

Since 25 August, fighting disinformation has been a legal obligation for the companies cited in the report, due to the implementation of the Digital Services Act. While participation in the Code is not mandatory, it is an ideal way for platforms to show they are complying with their responsibility to mitigate the risks of disinformation on their services. A refusal to participate, or poor-quality participation, may be taken as evidence of non-compliance and could expose platforms to large fines.

The report also highlights the need for a strong monitoring framework to ensure that platforms provide all the requested information, as well as investigatory work to verify whether that information is accurate. Ensuring effective monitoring of the Code is one of the priorities covered in the Government’s development of the National Counter Disinformation Strategy. Ireland’s former media regulator, the Broadcasting Authority of Ireland (recently replaced by Coimisiún na Meán), previously commissioned a series of three reports monitoring the Code.

You can view the full report here.