Open Review of Management, Banking and Finance

«They say things are happening at the border, but nobody knows which border» (Mark Strand)

Digital disinformation is a threat to competition in the European Single Market. The impact of misinformation and the danger of a declining fact-checking system on consumer choices and the EU’s regulatory response

By Michele Sances* and Marco Sepe**

ABSTRACT: This paper seeks to examine the phenomenon of fake news and disinformation and its impact on consumer choices and the European Single Market. Analysing some relevant empirical cases, it aims to investigate in some depth the economic impact of fake news and its effects on information transparency, one of the key principles of the European norms governing competition. Furthermore, the EU’s regulatory response will be examined, paying particular attention to the strategies adopted by the Digital Services Act (DSA) and the Digital Markets Act (DMA) to combat digital disinformation.

SUMMARY: 1. Fake news as a factor of economic distortion: empirical and theoretical evidence. – 2. Fake news and transparency: a crucial issue for EU legislation regarding competition. – 3. The European regulatory framework: existing instruments and declared objectives. – 4. Some criticalities of the European regulatory framework: between regulatory ambitions and structural limitations.

1. Transparent and reliable information is a fundamental element of the correct functioning of the market, as it permits consumers to make informed choices and prompts companies to compete fairly.

However, increases in fake news and reductions in fact-checking practices by digital platforms, as highlighted very recently by Meta’s decision to abandon fact-checking in the United States in favour of “Community Notes” [1], have engendered significant problems for the market due to the intensification of information asymmetries capable of influencing people’s ability to choose.

The economic theory of information asymmetries formulated by George Akerlof, known as the “theory of lemons”, clearly illustrates the mechanism by which a lack of transparency can profoundly distort the functioning of a market. According to Akerlof, when consumers are unable to distinguish effectively between high-quality and low-quality products (the so-called “lemons”), they inevitably end up preferring less expensive goods, often of lower quality, thus leading to the gradual exclusion of better products from the market[2]. This phenomenon, generated by the presence of information asymmetries, compromises economic efficiency and general well-being.
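Akerlof’s unravelling mechanism can be sketched numerically. In the toy simulation below (all figures are hypothetical and chosen purely for illustration), buyers cannot observe quality and therefore offer the average value of the goods still on sale, plus a small premium; sellers whose goods are worth more than the offer withdraw, dragging average quality, and with it the next price, down round after round:

```python
# Toy simulation of Akerlof's "market for lemons" (hypothetical figures).
# Buyers cannot observe quality, so they offer the average value of the
# goods still on the market (times a small premium). Sellers whose goods
# are worth more than the offer withdraw, lowering average quality and
# the next round's price, until only the worst goods remain.

qualities = [100, 80, 60, 40, 20]   # each seller's valuation of their own good
BUYER_PREMIUM = 1.1                 # buyers value any good 10% above its seller

def market_rounds(qualities, premium):
    on_market = list(qualities)
    history = []
    while on_market:
        # Under quality uncertainty the offer equals the expected value.
        offer = premium * sum(on_market) / len(on_market)
        remaining = [q for q in on_market if q <= offer]
        history.append((round(offer, 1), len(remaining)))
        if len(remaining) == len(on_market):
            break                   # no seller exits: the market is stable
        on_market = remaining
    return history

history = market_rounds(qualities, BUYER_PREMIUM)
print(history)  # offers fall round after round; only the "lemon" survives
```

Here the first offer of 66.0 already drives the two best goods out of the market, and the process stabilises only when the cheapest good is left. The informational analogue is that, when consumers cannot tell reliable from unreliable content, low-quality information can crowd out the rest.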

Digital disinformation produces similar effects, causing a situation where consumers are no longer able to distinguish between reliable and deceptive information[3].

The COVID-19 pandemic highlighted this phenomenon even further, when disinformation came to represent one of the foremost problems of the information age and revived the concept of the “infodemic”[4]: an excessive and uncontrolled diffusion of information, often unverified and/or inaccurate, which made it difficult for people to make sense of a given topic because trustworthy sources of information could no longer easily be separated from unreliable ones.

The spread of false information today appears to be characterised by the intent to harm individuals or organisations, or to gain political and economic advantage. The fake news involved assumes multiple forms, such as deceptive marketing, political propaganda or knowingly altered content passed off as authentic[5].

Within this scenario, social platforms play a fundamental role: thanks to their structure based on immediate interaction and the possibility of sharing content instantly and globally, they can act as fertile ground for the proliferation of fake news. Furthermore, given the popularity some of them enjoy, the risk that consumer choices, and consequently the right to fair competition, may be altered is very high.

It is important to add that these platforms do not always apply information verification procedures, something that considerably increases the risk of the uncontrolled dissemination of fake news[6].

A significant example of how misinformation can concretely alter the market is the Arla Foods case, which recently affected the dairy sector in the United Kingdom.

Arla Foods, one of the UK’s largest dairy cooperatives, announced that in 2024 it would conduct a trial to reduce methane emissions from cattle through the introduction of an additive to the fodder fed to the animals. This initiative, born of a praiseworthy intent, that of reducing the greenhouse gas emissions caused by cattle farming, became the subject of a violent disinformation campaign on social media that falsely claimed that the additive was part of an alleged plot to depopulate the planet. According to this defamatory campaign, the product added to the cattle feed would cause human fertility problems, a claim that was baselessly linked to figures such as Bill Gates. The disinformation campaign, supported by the circulation of fake news on social media, resulted in a boycott of Arla Foods products, which led to significant economic losses, despite the additive having been officially approved by the relevant authorities, which deemed it absolutely safe[7].

Another rather similar case concerned the United States and the Wayfair Company. In 2020, Wayfair was the target of a massive campaign of disinformation that falsely accused it of child trafficking, erroneously associating the names of its products (e.g. furniture with female names) with alleged victims of kidnapping. This false narrative, which quickly spread across social networks, generated significant economic and reputational damage to the company, undermining consumer trust and negatively affecting its performance on the market[8].

Another example is represented by the conspiracy theories that proliferated regarding 5G during the COVID-19 pandemic. According to some theories, the spread of the virus was caused or favoured by 5G technologies. Thanks to their widespread diffusion, especially through social networks, these theories caused serious economic and social damage, including physical attacks on technicians and damage to telecommunications infrastructures in several countries, events that showed once more how disinformation can produce real and devastating effects[9].

These are just some examples of how the uncontrolled spread of fake news can cause not only direct and immediate damage to the companies involved, but also create structural distortions of the market, altering consumer behaviour, compromising their trust and generating alterations of information that undermine fair competition based on transparency and quality. It is, therefore, essential to implement rigorous regulatory strategies, accompanied by educational and preventive action, to combat the phenomenon of disinformation effectively and protect the proper functioning of the markets.

2. One of the principal objectives of EU legislation regarding competition is the protection of the transparency of information, an essential requirement when seeking to safeguard informed consumer choices and guarantee that competition on the market be based effectively on merit.

Several studies[10] have pointed out how disinformation, by exploiting the discrepancies existing between the supply of and the demand for accurate information, takes particular advantage of contexts where there is a lack of systematic and structured verification of news. Within this scenario, the intentional dissemination of fake news is able to manipulate consumer preferences, altering the natural decision-making mechanisms of clients and consequently compromising free competition. This phenomenon becomes even more problematic when disinformation comes directly or indirectly from gatekeepers, that is, entities that control the dominant digital platforms on the market. Actors of this type, by exercising a privileged position in the circulation of information, can generate indirect violations of Article 102 TFEU, which expressly forbids the abuse of a dominant position.

From this perspective, European legislation on competition plays a crucial role in ensuring a fair and competitive internal market and guaranteeing, among other things, rigorous transparency of information as an essential prerequisite for the protection of the freedom of choice of customers and authentic competition based on merit. Article 102 TFEU specifically represents an effective regulatory tool aimed at preventing and repressing possible distortions of the market caused by an abuse of a dominant position, including situations generated by direct action and the failure to control and verify the contents conveyed by large digital platforms.

Disinformation, in this sense, may be configured as a systematic and intentional dissemination of false or misleading contents, capable of altering competitive dynamics significantly by structurally compromising the transparency of information, hence determining negative impacts on users’ decisions regarding purchase and consumption. The failure to adopt adequate fact-checking systems can therefore be seen as a clear failure of the information market, one that justifies targeted regulatory intervention capable of restoring balance, efficiency and consumer protection.

Platforms of this kind, by virtue of networking effects, economies of scale and control over data flows, are in a privileged position which permits them to influence consumer behaviour, making the spread of fake news particularly insidious and pervasive.

The COVID-19 crisis brought to light, dramatically, how disinformation constitutes one of the main challenges of the information age. The concept of “infodemic”, already introduced during the SARS epidemic of 2003, emerged again and with greater force to describe an overabundance of information, often inaccurate or false, regarding the SARS-CoV-2 virus. The consequences have been serious, contributing to dangerous behaviour that has worsened the health crisis. Within this scenario, disinformation has shown itself better able to anticipate and orientate the public’s demand for information than traditional sources of information, due to its greater reactivity, its lesser editorial inertia and its ability to respond selectively to the public’s interests, as emphasised by the study conducted by Gravino et al.[11]

This study combined data regarding journalistic production (“General News”) and demands for information expressed via Google Trends (“Searches”), developing an improved vector auto-regression (VAR) model. The results of the investigation revealed a significant correlation between increases in the demand for information and the subsequent supply of content, especially in relation to keywords such as “coronavirus”. However, this analysis also revealed that fake news tends to align better, both quantitatively and semantically, with the public’s demand for information than generalist sources do. This semantic alignment suggests that fake news fills an information gap left by the traditional media by intercepting specific unsatisfied interests. From the point of view of competition, this phenomenon represents a systemic threat. Fake news manipulates consumer preferences, altering the normal functioning of the market and reducing the effectiveness of competitive dynamics based on truthful and comparable information.
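The core intuition of such supply-and-demand analyses, namely that news supply trails search demand with a measurable lag, can be illustrated with a minimal sketch. This is not the authors’ VAR model: the two series below are entirely synthetic, with the “news” series built to trail the “searches” series by one step, purely to show how a lagged correlation is computed and read.

```python
# Synthetic illustration of the supply-demand lag that Gravino et al.
# formalise with a VAR model (this is NOT their model: both series are
# invented, with "news" constructed to trail "searches" by one step).
from math import sqrt

searches = [2, 3, 8, 15, 30, 24, 18, 12, 9, 7, 6, 5]   # information demand
news     = [1, 2, 3, 9, 16, 29, 25, 17, 11, 10, 8, 6]  # information supply

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def lagged_corr(demand, supply, lag):
    """Correlate demand at time t with supply at time t + lag."""
    if lag == 0:
        return pearson(demand, supply)
    return pearson(demand[:-lag], supply[lag:])

# The lag at which supply best tracks demand is a proxy for reactivity:
# a smaller best lag means lower "editorial inertia".
best_lag = max(range(4), key=lambda k: lagged_corr(searches, news, k))
print(best_lag)  # here supply follows demand with a one-step delay
```

In the study’s terms, a source whose output correlates with demand at a shorter lag, or aligns more closely in topic, is more “reactive”; the finding that fake news outperforms generalist sources on exactly these measures is what makes it competitively insidious.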

Were gatekeeper platforms to spread content of this ilk, as in the case of social networks and dominant search engines, this would lead, potentially, to indirect violations of art. 102 TFEU, as distorted use of their power to diffuse information may be deemed tantamount to an abuse of a dominant position. It is also important to emphasise that the information asymmetry produced by behaviour of this kind, combined with an absence or inefficiency of verification mechanisms, impedes the entry of innovative, responsible players into the market and their growth, hence damaging dynamic competition.

It is to this context that the recent effort made by the European Union to integrate legislation regarding competition with ex-ante tools (such as the Digital Markets Act (DMA), which imposes specific behavioural obligations upon gatekeepers aimed at ensuring fairness and contestability of the market) belongs. The DMA aims, in fact, at bridging the gaps of traditional tools (such as art. 102 TFEU) that act mainly ex-post by intervening more promptly and incisively against behaviour that, although not yet punishable as abuse, risks compromising competition. In particular, the inadequate management of disinformation by dominant platforms may be considered a form of abuse deserving regulatory attention. Finally, the growing attention being paid to the quality of information as a relevant parameter for competition justifies an extension of the scope of the application of antitrust legislation, so that it may also consider informational failures that, although not deriving from anti-competitive practices in the strict sense, generate exclusionary or distorting effects. From this perspective, the creation of independent indicators for monitoring disinformation, such as those based on semantic misalignment or the loss of the predictability of editorial behaviour, could provide regulatory and competition authorities with tools capable of promoting a more transparent, reliable and competitive information ecosystem.

3. In recent years, the European Union has assumed an increasingly prominent role when it comes to regulation within the ambit of digital space, by responding to the challenges posed by online disinformation, algorithmic manipulation and the concentration of informational and economic power in the hands of a limited number of large digital platforms. This role has become necessary also by virtue of the pervasiveness, the transversal nature and acceleration of the digital phenomenon, which risks leading to an “algocratic” drift, that is, a society where decision-making power is dominated by algorithms, with possible negative implications for transparency, for the market and even for democracy itself [12].

Within this context, the recently adopted Digital Services Act (DSA) and Digital Markets Act (DMA) represent a crucial step towards a new, more assertive phase of European digital governance, one oriented towards the protection of the public interest.

Viewing the two acts one at a time, we find that the Digital Services Act (DSA – EU Regulation 2022/2065), fully applicable since February 2024, introduces a sophisticated multi-level regulatory solution aimed at holding digital service providers accountable for the information they convey, with obligations that vary according to their size and systemic impact on the internal market and on European society. In particular, Very Large Online Platforms (VLOPs) are subjected to specific and stringent obligations, given their ability to influence the public debate significantly and spread harmful content.

Among the obligations introduced by this regulation, Very Large Online Platforms are required to conduct regular in-depth assessments of the systemic risks arising from the design and management of their services, including those related to the spread of disinformation, hate speech, manipulation of public opinion and violations of fundamental rights[13]. On the basis of these periodic analyses, adequate and effective measures are required to be taken to mitigate these risks, by means of proportionate and targeted interventions that may even include changes to algorithms, systems capable of moderating content and procedures designed to prevent abuse on the platform[14].

Furthermore, the DSA imposes specific requirements regarding the transparency of algorithms conveying recommendations (tools based on artificial intelligence that propose customised content to users) by requiring VLOPs to provide users with clear and accessible information regarding the criteria used to customise and propose content[15]. Such transparency is essential if users are to be permitted to understand and, if they so wish, to change how they receive and interact with online information. A further qualifying element of the DSA is the obligation for VLOPs to undergo periodical independent audits, aimed at verifying the correct application of the norms of the regulation, the effectiveness of the measures adopted and compliance with the obligations of due diligence, such as to strengthen the credibility and effectiveness of the regulatory system[16]. Finally, VLOPs are required to implement clear and efficient procedures guaranteeing timely reporting and subsequent removal of illegal content[17], while simultaneously ensuring the protection of freedom of expression and providing mechanisms of appeal for users affected by decisions they may deem incorrect or disproportionate[18].

Now let us take a look at the Digital Markets Act (DMA), introduced by EU Regulation 2022/1925, which intervenes incisively on the competitive dynamics of the digital market with the aim of ensuring the fairness and contestability of the markets. In particular, the regulation addresses gatekeepers[19], i.e. those entities that, due to their size and strategic position, control access to core online platform services and significantly influence the commercial relationships existing between consumer and business users.

Among the obligations imposed on gatekeepers is the proscription of exclusionary and discriminatory practices, such as so-called “self-preferencing”, which consists in favouring one’s own services or products over those offered by competitors[20] on user interfaces or search engines. This practice is considered detrimental to competition and to the free choice of consumers, since, by unduly exploiting the dominant position of the gatekeeper, it limits the visibility and growth possibilities of competitors.

In addition, the DMA emphasises the interoperability of services, by requiring gatekeepers to allow interconnection and access by third parties to certain services and functionalities, in particular within the ambit of interpersonal communications[21]. This measure aims at reducing barriers to entry and promoting a more open and competitive environment in the digital market.

A further relevant aspect concerns data portability, that is, the obligation for gatekeepers to ensure that end and business users have the possibility to utilise, access and transfer their data continuously and in real time to other competing digital services [22]. This norm seeks to combat the phenomenon of user “lock-in” by facilitating mobility between different platforms and fostering competition based on the quality and innovation of services.

By means of these measures, the DMA intends, therefore, to correct the imbalances generated by unfair and anti-competitive practices typical of some digital gatekeepers, ensuring contestability and a fairer, more transparent and dynamic market environment. The DMA may well prove to be a central tool in the fight against disinformation and the proliferation of fake news. By actually requiring gatekeepers to ensure greater interoperability and data portability, a vaster plurality of points of view and information sources is guaranteed, while the risk that disinformation may grow and spread within closed and self-referential ecosystems is reduced. Similarly, the prohibition of “self-preferencing” practices prevents gatekeepers from privileging sensationalist or misleading content capable of engendering high rates of engagement to the detriment of the reliability of the information.

Furthermore, by imposing greater transparency on commercial relations between business and consumer users, the Digital Markets Act promotes the emergence and development of operators specialised in fact-checking and the verification of information, thus creating conditions for a healthier and more competitive digital environment, also in terms of the quality of the content provided. It follows, therefore, that although it does not intervene directly with regard to content, the DMA helps mitigate the systemic risks associated with disinformation.

The adoption of the Digital Services Act and the Digital Markets Act reflects awareness on the part of the EU of the “algocratic” risk[23] by placing particular emphasis on the need for algorithmic education, effective governance and algorithmic transparency. This marks an important step forward in European regulatory strategy and represents an integrated and multidimensional approach capable of addressing the risks of digitalisation, while attempting to balance freedom, innovation and the protection of fundamental rights.

4. However, despite the ambitious, innovative scope of the new regulatory framework, numerous critical issues risk compromising its effective implementation and transformative impact.

First of all, we find the imbalance of power and information existing between public institutions and digital platforms. The latter not only possess considerable technical and legal resources but also wield strategic lobbying power at European level[24].

This fuels the risk that the power of digital platforms may turn into a de facto form of hegemony, where platforms, by availing themselves of forms of collaboration and lobbying, manage to shape the constraints to which they are subjected, thus maintaining control over the cognitive infrastructures of the public debate.

Furthermore, as regards the Digital Services Act, while this introduces relevant obligations such as algorithmic transparency and seeks to combat deceptive design models that may induce users to undertake undesired actions, it does not structurally address the impact of generative artificial intelligence, nor does it possess robust mechanisms capable of countering the automatic production of manipulative or polarising content. The lack of an integrated regulatory measure embracing the DSA, the AI Act and other initiatives fragments the approach to the governance of emerging technologies, rendering the response to systemic and transnational threats rather weak.

In addition, we find a scarcity of empirical tools capable of monitoring the effectiveness of the measures adopted. Without solid ex-post evaluation methodologies, the regulations risk remaining confined to a rhetorical or normative ambit. Finally, as suggested by several scholars[25], perceptions of disinformation as a “threat” subject to regulation are by no means neutral, but reflect a form of competition between alternative views of the public sphere and democracy. To address these challenges effectively, it is necessary to reach beyond a purely normative and legal approach and promote an inclusive governance that actively involves civil society, independent journalism, the scientific community and citizens, with a view to re-establishing not only the truthfulness of information, but also the legitimacy and plurality of the European democratic debate.


[1] M. Ferrari, ‘The Change in Digital Reality: Protecting Consumers from Fake News’ (2021) Boston Hospitality Review.

[2] G.A. Akerlof, ‘The Market for “Lemons”: Quality Uncertainty and the Market Mechanism’ (1970) 84 Q J Econ 488.

[3] T.H.M. Le, ‘The Spread of Fake News: Disclosure Willingness Role’ (2024) 10(14) Heliyon e34468 https://doi.org/10.1016/j.heliyon.2024.e34468.

[4] https://www.treccani.it/vocabolario/infodemia_(Neologismi)/ .

[5] T.H.M. Le, ‘The Spread of Fake News: Disclosure Willingness Role’.

[6] M. Ferrari, ‘The Change in Digital Reality: Protecting Consumers from Fake News’.

[7] A. Raval, ‘The Disinformation Storm Is Hitting Companies Harder’ Financial Times (2024) https://www.ft.com/content/0aa9725d-e423-4a6b-b842-866ad4541dc2.

[8] BBC News, ‘Coronavirus: “Infodemic” of False Information Must Be Fought’ (13 July 2020) https://www.bbc.com/news/world-53416247.

[9] BBC News, ‘Coronavirus: How False Information Is Spreading’ (30 June 2020) https://www.bbc.com/news/53191523.

[10] P. Gravino, G. Prevedello, M. Galletti and V. Loreto, ‘Assessing Disinformation through the Dynamics of Supply and Demand in the News Ecosystem’ (2021) arXiv preprint https://doi.org/10.21203/rs.3.rs-577571/v1.

[11] P. Gravino, G. Prevedello, M. Galletti and V. Loreto, ‘Assessing Disinformation through the Dynamics of Supply and Demand in the News Ecosystem’ (2021).

[12] See M. Sepe, ‘Innovazione Digitale, tra rischi di deriva algocratica e possibili rimedi’ (2023) Rivista trimestrale di diritto dell’economia, supplemento al n 4/2023.

[13] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act), art. 34.

[14] Regulation (EU) 2022/2065, art. 35.

[15] Regulation (EU) 2022/2065, art. 27.

[16] Regulation (EU) 2022/2065, art. 37.

[17] Regulation (EU) 2022/2065, art. 16.

[18] Regulation (EU) 2022/2065, art. 20.

[19] Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on Contestable and Fair Markets in the Digital Sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act), art. 3.

[20] Regulation (EU) 2022/1925, art. 6 (5).

[21] Regulation (EU) 2022/1925, art. 7.

[22] Regulation (EU) 2022/1925, art. 6 (9).

[23] See M. Sepe, ‘Innovazione Digitale, tra rischi di deriva algocratica e possibili rimedi’ (2023).

[24] L.B. García and A. Oleart, ‘Regulating Disinformation and Big Tech in the EU: A Research Agenda on the Institutional Strategies, Public Spheres and Analytical Challenges’ (2023) J Common Mkt Stud 1395.

[25] L.B. García and A. Oleart, ‘Regulating Disinformation and Big Tech in the EU: A Research Agenda on the Institutional Strategies, Public Spheres and Analytical Challenges’ (2023) J Common Mkt Stud 1395; M. Sepe, ‘Innovazione digitale, tra rischi di deriva algocratica e possibili rimedi’ (2023) Rivista trimestrale di diritto dell’economia, supplemento al n. 3/2023, 235–237; P. Gravino, G. Prevedello, M. Galletti and V. Loreto, ‘Assessing Disinformation through the Dynamics of Supply and Demand in the News Ecosystem’ (2021) 1–2; T.H.M. Le, ‘The Spread of Fake News: Disclosure Willingness Role’ (2024) 10(14) Heliyon e34468, 1–2.

Author

* Michele Sances is a PhD candidate at the Department of Law and Economics, International Telematic University UNINETTUNO.

** Marco Sepe is Full Professor of Economic Law at Unitelma Sapienza University of Rome.

Although the work is the result of a shared reflection, paragraph 1 is attributable to Prof. Marco Sepe and paragraphs 2, 3 and 4 to Dr. Michele Sances.

Information

This entry was posted on 04/06/2024.