Calls for Meta to build safer products and engage in Coimisiún na Meán investigation

Coimisiún na Meán examines whether Meta’s platforms limit user choice and manipulate access to non-profiled content feeds under EU law
Media regulator Coimisiún na Meán has begun two investigations into Meta to examine whether the online giant has breached the EU Digital Services Act (DSA), including through the use of “dark patterns” on Facebook and Instagram.

The Irish Examiner reports that the regulator has concerns that so-called “dark patterns” may be preventing people from accessing a feed not based on profiling. Profiling is the use of automated systems to personalise content or ads based on patterns in a person’s data or behaviour.

The investigation will examine whether the Facebook and Instagram interfaces deceive or manipulate users away from choosing a recommender system feed that is not based on profiling of their personal data.

A recommender system is an information filter that suggests content, products, or services to users. Dark patterns - manipulative or deceptive interface designs - may prevent people from exercising their right to choose a recommender system feed not based on profiling.

The investigations will also examine whether users can select and modify their preferred recommender system and whether this functionality - which must be directly and easily accessible - is available through Facebook and Instagram’s interfaces.

John Evans, digital services commissioner with Coimisiún na Meán, said the regulator recognises the concerns that many people have about recommender systems, "and the potential harm that these algorithms can cause by repeatedly pushing harmful content into the feeds of users, especially children and young people".

"We want to remind users of Very Large Online Platforms, the household name companies most of us would recognise, that they have a right to choose a recommender system feed that is not based on the profiling of their personal data.

"Furthermore, Very Large Online Platforms have an obligation to ensure that users can opt for this alternative feed at any time and that it is easily accessible. Platforms also have a duty to ensure that they do not design or operate their interface in such a way as to manipulate users away from exercising their rights.

"Our message is clear: it is unacceptable for platforms to prevent people from using their rights under the law, or to try to manipulate people away from making empowered choices about whether or not recommender system feeds control what they see online."

Evans said that just over two years into the application of the DSA, it has played "a crucial role in re-balancing the rights of people and online platforms, putting greater obligations on platforms to keep people safe online, while providing people with greater rights than they had previously".

A Meta spokesperson said: "We disagree with any suggestion that we have breached the DSA. We have introduced substantial changes to our processes and systems to meet our regulatory obligations, and will engage with Coimisiún na Meán to share details of this work."

The spokesperson said Meta introduced a non-profiling option on Facebook and Instagram in 2023, in response to the DSA.

Noeline Blackwell, Online Safety Coordinator with the Children's Rights Alliance, has called on Meta to build safer products and systems and to engage with Coimisiún na Meán in its investigation.

Speaking on RTÉ radio's Morning Ireland, Blackwell explained that at present, when anybody goes online, the platform they use gathers information about them, which can be used to build a profile. That profile, she said, should not be used to push new material at people unless they want it.

“People should be able to switch that off easily. The concern that the regulator has, and that they're investigating, is that the company is using children's and young people's profiles in particular to push information at them.

“So they might be looking at something that they genuinely want to look at, but that the information that they get from the company after that might not be in their best interest and might not be suitable and might be harmful.”

Such information could take people "down rabbit holes", she warned.

“Look, that doesn't only happen to children and young people. Lots of people can go down rabbit holes, can find that they have an interest in one thing, and they're fed more and more information, and that they get into a stage where they could end up getting anxious about something.

“These companies are very big, profitable companies. They are an industry themselves. They want to make money.”

EU regulations require companies to operate in a safe way, she added.

“In the Children's Rights Alliance, we always say there's almost nothing in the European Union that you can do without a Certificate of Safety about it. These are the exceptions. So this is where, when the companies are not behaving in a way that is safe, then the regulator can come in.

“The regulator isn't the fastest operator in the world. It's not the quickest, but it can be the most comprehensive. And if Coimisiún na Meán investigates this and finds that these dark patterns are in fact being applied, it can hold the company accountable. And the problem in some ways is that Meta is saying, again, nothing to see here.

“Meta will insist on fair processes. There will be a preliminary finding of fact. And what would be really great would be if Meta would engage with this. And instead of saying, as they are doing, nothing to see here, if they would actually say, we can build a safer system. Because that's really what the industry needs to do, is build safe products and systems.”
