Digital transformation and the regulation of "Big Tech" in the EU

The digital transformation is putting pressure on the European Union (EU) to amend the directives that seek to protect the sensitive data of citizens and organizations. This is a topical issue: given the speed of technological development and the growth of so-called Big Tech, gaps in legislation emerge and will continue to emerge, along with the need to adapt the law to rapid change. The EU is therefore making legislative adjustments to respond to the technological transformation. The question remains, however, how effectively these directives can cope with the rapid evolution of technology.

EU AND BIG TECH: ACTORS IN INTERNATIONAL RELATIONS

The first actor in international relations concerned with this issue is the EU. In terms of international relations, the European Union can be understood along two dimensions: one deals with internal affairs, the other with external (non-EU) affairs. In both conceptions, defining the EU as an actor in international relations is complex. Indeed, the EU itself may exhibit certain elements of an international organization, an alliance, or even a "supranational state". In terms of foreign and security policy, the EU's actorhood has been defined as a "highly institutionalized multilateral forum for encouraging regular international cooperation on foreign policy issues among independent states". Regarding the EU's power as an actor, the tendency is to compare it with the US or other powerful actors in international relations.

It is also important to keep in mind the definition of power that Ian Manners coined for the EU – so-called normative power. Normative power is also known as ideological power, the power of opinion, or the ability to shape the notion of what is "normal". In practice, this means that the EU is able to influence the global environment through its norms and values: peace, freedom, democracy, human rights, social solidarity, sustainable development, and anti-discrimination.

The second actor – large technology companies (non-state actors) – is known as "Big Tech". These companies include Google, Amazon, Facebook, Apple, and others. Today, they are creating a new type of authority: they are entering international relations at a level previously reserved for sovereign states, taking over decision-making responsibilities from states in certain areas. States, in turn, have sought to resist this growing influence by creating "antitrust, privacy, and speech laws". This new private authority affects voting and political behaviour ("influencing up to twenty percent of undecided voters") and results in information overload. Moreover, by promoting viral content (clickbait), it spreads misinformation. Such authority raises the question of the legitimacy and accountability it would need in order to be recognized in international relations. However, Big Tech does not comply with the norms of international relations: the users who could confirm its status as a private authority do not know how algorithms affect them, it is difficult even for the creators themselves to estimate exactly how the algorithms work, and it is uncertain who bears responsibility for them. Should it be Big Tech itself?

RESPONSES TO TECHNOLOGICAL DEVELOPMENT

The rapid development and transformation of technology is forcing the adoption of new legal norms, not only by states but above all by the European Union. In line with its character as a normative power, the EU can set the direction for both state and non-state actors on what the standard of handling security should look like in the face of technological development and Big Tech.

Historically, the first directive on digital regulation was the Directive on Electronic Commerce, adopted on 8 June 2000. In addition to promoting transparency and ensuring communication and consumer information in the commercial environment, it sets out criteria for concluding electronic contracts as well as rules on online purchases, advertising, commercial communications, and content monitoring. As technology evolves, however, this essential pillar of digital regulation needs to be adapted and extended.

Following uncertainties about the sufficiency of the existing regulation, the European Commission launched a public consultation in 2015. A key element was to collect data on the liability of platform intermediaries. Criticisms were raised about the transparency of platforms' search results, the methods by which reviews are created, the primary sources of information, and the amount of personal information collected about users (the full extent of which is not known to them); about illegal content spreading various forms of abuse, racism, or xenophobia; and about the lack of transparency of cloud service providers regarding the security of users' data.

In 2018, following an action by national enforcers of the Consumer Protection Cooperation Network, coordinated by the Commission in 2016, Facebook, Google, and Twitter modified their terms of service to conform with EU consumer protection legislation and to ensure the immediate removal of prohibited advertisements.

Currently, a package is in place that includes two acts focused on regulating a safer digital space for users. The Digital Services Act (DSA) came into force in 2022. It regulates online intermediaries and platforms to prevent the spread of misinformation and other illegal content in the digital environment, and it protects users' rights as well as their safety.

A second act, the Digital Markets Act (DMA), also came into force in 2022. It addresses fair competition among platforms in the EU market. Large platforms are referred to as "gatekeepers", meaning that they serve as a gateway between consumers and the services they provide. A platform qualifies as a gatekeeper if it achieves an annual turnover of at least €7.5 billion in the EU; the number of users in the EU and the number of member states in which it controls at least one major platform service also play a role. This applies to companies such as Alphabet, Amazon, Apple, ByteDance, Meta, and Microsoft. Under the Act, gatekeepers may not favour their own products to the disadvantage of other companies, may not use users' personal data for targeted advertising without consent, and must allow a level playing field for users who make money from their activities on the platform. The Act also regulates the terms of software applications, signing in and out of services, transparency of information regarding a company's designation as a gatekeeper, and the provision of basic services to users. It further sets out the fines a gatekeeper faces, thereby ensuring compliance with EU rules. Besides this, the DMA also keeps smaller companies in mind, whose competitiveness is strengthened by transparent market conditions.

However, there are also concerns that the DMA leaves room for many clarifications that would need to be addressed. It has been noted that the DMA itself does not provide measures to enable data portability (a matter addressed in the Data Act) and that there are issues with compliance with the rules on the use of users' personal data, which may pose a danger to EU citizens whose data could be misused in non-democratic countries. This includes criticism of the large amount of space the DMA leaves to the free market in certain areas. Experts therefore suggest ways in which the DMA could be modified or supplemented. First, they propose changes that would allow the rules to be better adapted to different situations and needs. Second, policymakers should clarify how transactions and data disputes should work in order to avoid problems. Third, intermediaries should be given a greater role in helping to ensure that the rules are followed and in protecting the interests of citizens and companies. Fourth, further issues need more attention, such as the market power of smaller players, private-sector enforcement, interaction with intellectual property rights, and the distinction between personal and non-personal data, as only then can the new rules ensure a fair and efficient digital market.

Alternatively, the situation can be summarised through the ideological role of the EU, which seeks to seize the opportunity to become an exemplary actor in the development of digitalisation and related measures, a role that stems from the normative nature of EU power. At the same time, critics point to the EU's inability to counter the continuous and increasing influence of Big Tech through its new measures. This leads to the belief that the directives not only fail to address this problem sufficiently, but that in certain circumstances they could even worsen the situation. After all, data protection works well only when we are really talking about the collection and exchange of personal data; even when these rules are followed, problems such as unfair profit can still arise. Similar problems may occur in data protection and artificial intelligence laws. A further criticism in this sense is that the rights these laws protect, such as protection from discrimination, freedom of expression, and privacy, are political and civil rights. It is questionable whether such rights can solve the complex social problems arising from new technologies and innovation. The EU has limited powers and focuses mainly on the internal market, which may be insufficient to deal with the expansion of large technology companies. This market orientation may lead to areas such as healthcare or education being perceived as market products, which is problematic. It is therefore suggested that regulation should focus more on socio-economic rights, such as health and education, rather than only on market-based rights.

THE CHALLENGES AND FUTURE OF REGULATION

In response to technological transformation and the growing influence of Big Tech, the EU has made significant legislative changes to protect the digital environment and ensure fair competition. The Digital Services Act (DSA) focuses on regulating online intermediaries and platforms to prevent misinformation and protect users' rights, while the Digital Markets Act (DMA) targets large digital platforms, or "gatekeepers", imposing strict rules on large tech companies to promote fair competition and prevent them from abusing their market power. While the two Acts exemplify a proactive approach, the digital environment is constantly evolving. Criticism of these and other EU directives suggests that their future effectiveness will depend on their adaptability, enforcement, and comprehensive scope. This applies to areas such as data protection (greater user control over personal data), socio-economic rights, the regulation of algorithms, and the promotion of transparency and accountability.

Given the nature of the EU as a normative power and its efforts to adapt its directives to developments, it is possible that the EU will become a global example for other actors in international relations in regulating Big Tech. However, even though the EU's legislative adjustments represent a major step in the fight against an unregulated digital space, the ongoing evolution of Big Tech will remain a significant challenge.


BIBLIOGRAPHY

Source of the picture: https://techxplore.com/news/2021-11-eu-ministers-bloc-big-tech.html

Directive 2000/31. Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce), accessed July 14, 2024, https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=CELEX%3A32000L0031

European Commission (2016) “Results of the public consultation on the regulatory environment for platforms, online intermediaries, data and cloud computing and the collaborative economy”, Shaping Europe’s Digital Future, accessed July 14, 2024, https://digital-strategy.ec.europa.eu/en/library/results-public-consultation-regulatory-environment-platforms-online-intermediaries-data-and-cloud

European Commission (2018) “Facebook, Google and Twitter accept to change their terms of services to make them customer-friendly and compliant with EU rules”, Newsroom, accessed July 15, 2024, https://ec.europa.eu/newsroom/just/items/614254/en

European Commission (2022) “Data Act & amended rules on the legal protection of databases”, accessed July 28, 2024, https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/13045-Data-Act-amended-rules-on-the-legal-protection-of-databases_en

European Commission (2022) “e-Commerce Directive”, Shaping Europe’s Digital Future, accessed July 14, 2024, https://digital-strategy.ec.europa.eu/en/policies/e-commerce-directive

European Commission (2023) “Digital Markets Act: Commission designates six gatekeepers”, Press release, accessed July 22, 2024, https://ec.europa.eu/commission/presscorner/detail/en/ip_23_4328

European Commission (n. d.) “About the Digital Markets Act”, Digital Markets Act (DMA), accessed July 22, 2024, https://digital-markets-act.ec.europa.eu/about-dma_en

European Commission (n. d.) “The Digital Services Act: Ensuring a safe and accountable online environment”, accessed July 15, 2024, https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_e

European Council and Council of the European Union (2024) “A digital future for Europe”, accessed July 15, 2024, https://www.consilium.europa.eu/en/policies/a-digital-future-for-europe/

European Council and Council of the European Union (2024) “Digital Markets Act”, accessed July 21, 2024, https://www.consilium.europa.eu/en/infographics/digital-markets-act/

European Council and Council of the European Union (2024) “Digital Markets Act”, accessed July 21, 2024, https://www.consilium.europa.eu/en/policies/digital-markets-act/

Ginsberg, R. H., Smith, M. E. (2007) “Understanding the European Union as a global political actor: Theory, practice, and impact”, State of the European Union, Volume 8, accessed July 7, 2024, https://aei.pitt.edu/7882/

Manners, I. (2002) “Normative Power Europe: A Contradiction in Terms?”, JCMS, Volume 40 – Number 2, Pages 235–258, Blackwell Publishers, accessed July 7, 2024, https://www.princeton.edu/~amoravcs/library/mannersnormativepower.pdf

Picht, P. G., Richter, H. (2022) “EU Digital Regulation 2022: Data Desiderata”, GRUR International, Volume 71 – Number 5, Pages 395–402, accessed July 28, 2024, https://doi.org/10.1093/grurint/ikac021

Sharon, T., Gellert, R. (2023) “Regulating Big Tech expansionism? Sphere transgressions and the limits of Europe’s digital regulatory strategy”, Information Communication & Society, Pages 1–18, accessed August 03, 2024, https://doi.org/10.1080/1369118x.2023.2246526

Srivastava, S. (2021) “Algorithmic Governance and the International Politics of Big Tech”, Perspectives on Politics, Volume 21 – Number 3, Pages 989–1000, accessed July 14, 2024, https://doi.org/10.1017/s1537592721003145


Written by Bára Auerová
