Seven complaints registered with new online watchdog centre

Coimisiún na Meán to follow up reports to its contact centre with relevant digital services companies

Seven complaints made to the new online watchdog’s contact centre in its first week are now under review, an Oireachtas committee has heard.

The Coimisiún na Meán (CnaM) centre has passed the concerns on to its digital supervision team, which will raise them with technology platforms as part of the State’s new online regulation regime.

In total, some 108 people have sent emails or made telephone calls to the contact centre since it opened last week, CnaM told the Oireachtas enterprise committee on Wednesday, describing this as a “manageable” volume.

The seven cases have been escalated to its digital services complaints team, which will follow them up with the relevant companies as they concern potential breaches of the European Union’s Digital Services Act (DSA).
The other contacts from members of the public were either general queries, broadcasting complaints referred to the commission’s broadcasting unit or requests for the removal of content that the person did not like, which is not its role, the regulator said.

CnaM’s contact centre, which has been outsourced to service provider Fexco, opened on Monday, February 19th, two days after new EU rules for the regulation of online services came into effect.

John Evans, who was appointed Digital Services Commissioner last July, told the committee he was building up CnaM’s platform supervision and investigations team, which will engage directly with online platforms as part of the State’s enforcement of the DSA.

“We can’t take on the role of being the general monitor of the internet,” Dr Evans said, stressing it was the responsibility of platforms to deal with illegal and harmful content in the first instance and that platforms should be the first port of call for complainants.

But CnaM will invest in expertise in areas such as data, algorithms and digital forensics as part of its intelligence gathering, he added. Overall, across four divisions, CnaM now employs 102 people and expects to reach 160 this year.

The regulator also anticipates it will appoint “trusted flaggers” – organisations with specialist knowledge of particular types of illegal and harmful content online – later this year, after an appeal to “qualified entities” to apply for this status.

Where a trusted flagger identifies illegal content, they may submit a notice to the relevant online platform and those platforms will be obliged to give their notices priority.

Under the DSA, responsibility for supervising the 22 companies designated as either Very Large Online Platforms (VLOPs) or Very Large Online Search Engines (VLOSEs) is shared between the European Commission and the regulator of the member state in which the company has its European headquarters. Some 13 of the designated VLOPs and VLOSEs are headquartered in the State.

“Ireland has an outsize role in this respect, so it is very important that we start thinking like a big regulator,” Dr Evans told the committee.

The European Commission last week initiated its first formal proceedings under the DSA, announcing it will investigate social video app TikTok for potential breaches of the legislation linked to the protection of minors, advertising transparency, data access for researchers as well as its risk management for “addictive design” and harmful content.

CnaM, as the digital services coordinator for TikTok, will “provide assistance” to the European Commission in its investigation, it has confirmed.

In response to a question about the risks to democracy that might arise from disinformation-spreading online “deepfakes”, Dr Evans said “everybody was very aware of the potential impact that an unregulated space could have for elections” and that this highlighted the importance of the DSA now being in force.

CnaM does not act as a content moderator, an appeal body or a judge in disputes between different parties or users, but it does have a remit to assess whether online service providers are complying with their obligations under the DSA.

While misinformation and disinformation are not necessarily illegal, large social media platforms are obliged to assess a range of risks that their services may pose, including risks to civic discourse, electoral processes, public health and public security.

Laura Slattery

Laura Slattery is an Irish Times journalist writing about media, advertising and other business topics