Regulation of Data Abuse in Digital Platforms: History

Data abuse refers to the unauthorized acquisition and use of personal data by enterprises, organizations, or individuals, including but not limited to identity information, transaction records, and health records, without the explicit consent of users. With the booming platform economy, the governance of digital platforms has become a top priority for policymakers, regulators, and competition authorities around the world.

  • data abuse
  • digital platforms
  • risk of privacy leakage
  • governance mechanisms

1. Introduction

In recent years, the increasing prevalence of data scandals on digital platforms has brought data abuse and privacy security to the forefront of public attention. Unlike traditional economic models, digital platforms are characterized by the collection, processing, and mining of data through complex algorithms, which generates economic value. The more data a platform collects, the greater the economic value it can obtain, which enhances its market power [1]. The availability of data resources significantly influences the formation of competitive economic structures for platforms. Consequently, data competition [2], non-monetary transactions [3], and privacy protection have become crucial issues that affect the market dominance of platforms [4][5][6]. In this context, some platforms may choose to abuse data in pursuit of monopolistic profits. This phenomenon has become increasingly common and has resulted in several high-profile cases. For instance, in July 2022, the State Internet Information Office of China imposed a fine of RMB 8.026 billion on Didi Global Inc., which was found to have engaged in 16 illegal activities across eight categories. Specifically, Didi Global Inc. collected users’ face recognition information, precise location information, and affinity information through illicit means, seriously violating users’ personal information rights and interests. This case underscores the importance of data security for digital enterprises and has heightened public concern about data abuse on digital platforms.
Data abuse refers to the unauthorized acquisition and use of personal data by enterprises, organizations, or individuals without the explicit consent of users, including but not limited to identity information, transaction records, and health records. From the perspective of competition law, data abuse can be classified into two categories: exploitative abuse and exclusionary abuse [7]. Exploitative abuse occurs when digital platforms leverage their position to extract benefits from consumers who exchange their data for the right to use digital services. In practice, consumers have limited choice over data sharing and must accept the data-for-service terms unilaterally set by the platform, which can lead to excessive collection of personal information and even data-related harm. Exclusionary abuse, by contrast, covers the tactics digital platforms use to restrict competition by leveraging their data advantage, such as exclusive contracts, cross-use of data, and refusal to share data. Table 1 outlines the common practices and main hazards of exploitative and exclusionary abuses.
Table 1. The common practices and main hazards of data abuse.
Table 1 illustrates how data abuse by digital platforms has negatively impacted both consumer rights protection and platform economic development. Exploitative abuses, such as the Trip.com case, the excessive collection of personal information in WeChat Moments, and the Pegasus spyware scandal, violate consumers’ right to fair trade and personal privacy. Given how widely personal information circulates, including home addresses, vehicle information, telecommunication information, and courier information, consumers face substantial risks of having their personal data used for commercial purposes without their consent. However, the relevant companies have yet to provide adequate protection against illegal data extraction and sale. This seriously impedes the growth of the platform economy, as consumers increasingly limit their online footprint to mitigate data abuse concerns [8]. In addition, exclusionary abuses, exemplified by the ‘Two Choose One’ case and the Google and Facebook synergistic monopolies, exacerbate data monopolies, inhibit industry innovation, squeeze the viability of SMEs, jeopardize fair competition in the marketplace, and disrupt market order. As a result, governing the data abuse behaviors of digital platforms has become a critical issue for the development of the platform economy.
Why does data abuse happen? Scholars have undertaken extensive research into its causes and have identified three main factors. The first is the profit-seeking nature of platforms. Because platforms exhibit strong network externalities [9], user multi-homing [10], winner-take-all dynamics [11], and similar characteristics, monetizing data enhances their market power, so digital platforms have a motive to commit data abuse in pursuit of monopoly profits. The second is that governments value efficiency over equity. Governments typically provide preferential policies and a relaxed regulatory environment to improve platforms’ market efficiency; for example, the Chinese government offers a preferential tax rate of 10% for some platforms [12], which leaves the door open for data abuse. The third is that users relinquish control over their data, enjoying services without attending to privacy exposure. On the one hand, the data collected by the platform improves user services; on the other hand, that data may be accessed by third parties in ways that harm users and create a risk of privacy leakage [13][14]. The trade-off between better service and loss of privacy shapes users’ game-theoretic choice of whether to participate in supervising platforms’ data use. Overall, data abuse on digital platforms emerges from a game among multiple forces.

2. Regulation of Data Abuse in Digital Platforms

With the booming platform economy, the governance of digital platforms has become a top priority for policymakers, regulators, and competition authorities around the world. The vast literature on digital platform governance explores issues such as mergers and acquisitions, pricing [5][15], collusion [16], and the abuse of market dominance [17]. This study focuses on the governance of data abuse in digital platforms, that is, the harmful conduct arising from the excessive collection or use of user data. In the platform economy, it manifests as excessive collection of user data [18][19], refusal to share data, use of data advantages for self-preferential treatment, forced free-riding, big-data discriminatory pricing [20], abuse of market leverage, and so on. Instead of analyzing the specific harm of each form of data abuse, the researchers employ evolutionary game theory to explore the key factors that influence the decision making of governments, platforms, and users in different data use scenarios, and to examine the mechanisms that governments and users can employ to prevent platforms from abusing data. The relevant literature covers three aspects: the motivation for data abuse by platforms, the regulation of data abuse by platforms, and evolutionary game models.

2.1. Research on the Motivation for Data Abuse by Platforms

The research on the motivation behind digital platforms’ data abuse is fragmented, as scholars often focus on single factors or subjects, and no systematic research framework is yet in place. Researchers therefore draw on the relevant literature to identify the motives scholars have discussed, which can be broadly categorized into direct and indirect factors. One direct factor is data-based revenue: data is considered an indispensable factor of production in the platform economy [21], directly contributing to economic growth and enhancing the efficiency of social production in enterprises [1], so digital platforms have an incentive to monetize data. Enterprises are inherently profit-seeking, however, and in order to capture more value, a few platforms with strong market power choose to abuse data to maximize revenue [17]. Indirect factors, such as government policy and user behavior, also influence platforms’ data use strategies. Government policies, such as tax rates and penalties, shape these strategies indirectly by affecting the platform’s profits. Some studies [20][22] argued that charging different tax rates on access revenue and data use revenue would reduce platforms’ incentives to over-collect personal data. Liu, W. et al. [20] considered that high fines reduce the profitability of a breach and thereby stop platform companies from carrying out data abuse. Similarly, users’ privacy exposure [8] affects the platform’s ultimate data usage strategy by influencing the amount of data available to it. Bourreau, M. et al. [23] believed that the data users provide during their consumption on the platform is used to optimize business and increase market power. Once users reduce consumption or falsify consumption data, the revenue the platform obtains from data falls, and with it the platform’s incentive to carry out data abuse.
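The incentive logic above can be made concrete with a toy expected-payoff comparison. This is an illustrative sketch only: the payoff form and all numbers below are assumptions for exposition, not a model taken from the cited studies.

```python
# Illustrative sketch (not from the source): how the tax rate, the
# detection probability, and the fine shift a platform's incentive to
# abuse data. All payoff forms and numbers are assumed for illustration.

def payoff(revenue: float, tax_rate: float, detect_prob: float = 0.0,
           fine: float = 0.0) -> float:
    """Expected after-tax payoff, net of the expected fine."""
    return (1 - tax_rate) * revenue - detect_prob * fine

def abuse_is_profitable(r_honest: float, r_abuse: float, tax_rate: float,
                        detect_prob: float, fine: float) -> bool:
    """Abuse pays off iff its expected payoff exceeds honest data use."""
    return payoff(r_abuse, tax_rate, detect_prob, fine) > payoff(r_honest, tax_rate)

# With a low fine, the extra data revenue makes abuse attractive...
print(abuse_is_profitable(r_honest=100, r_abuse=130, tax_rate=0.10,
                          detect_prob=0.5, fine=20))   # True
# ...but a sufficiently high expected fine removes the incentive.
print(abuse_is_profitable(r_honest=100, r_abuse=130, tax_rate=0.10,
                          detect_prob=0.5, fine=80))   # False
```

In this toy comparison, either raising the fine or raising the detection probability (e.g., through user supervision) closes the gap between abusive and honest payoffs, which is the mechanism the cited studies appeal to.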

2.2. Research on the Regulation of Data Abuse by Platforms

There are many regulatory studies on data abuse on digital platforms, and a variety of regulatory paths have been proposed. They can be mainly divided into two categories: platform technical interventions and government policy interventions.
On the technical front, scholars have proposed a range of measures to regulate data abuse. Examples include differential privacy protection technology [24], which effectively safeguards user privacy on digital platforms; blockchain encryption technology [25][26][27], which encrypts data and protects it from malicious access and exploitation; and machine learning-based attacker detection methods [28], which identify issues in data collection, storage, processing, and sharing. On the government front, policymakers can adopt several measures to curb data abuse. For instance, Liu, W. et al. [20] proposed regulating tax rates, which would discourage firms from expanding and thus weaken platforms’ incentive to abuse data; however, the tax systems of many countries remain highly controversial, so this path may not be feasible. Another approach, suggested by Grewal, R. et al. [29], is to regulate the amount of fines and urge the government to exercise its administrative punishment functions, but the standard for fines remains unsettled, as evidenced by China’s current case-by-case (“one case one meeting”) penalty practice. Shi, T. et al. [30] believed that data privacy policies can be adjusted, and that strict privacy protection policies increase the cost of violations for data-abusing digital platforms but discourage innovation. Clearly, there are multiple trade-offs in platform economic development [31]. The above research affirms the government’s leading role in supervision but does not account for the conflicting interests among the government, the platform, and users. A power game exists among the three parties, and the design of the supervision mechanism must also attend to this game relationship.
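To illustrate the first technical measure, here is a minimal sketch of the Laplace mechanism, the standard building block of differential privacy, applied to a counting query. The query, the epsilon value, and the seed are illustrative assumptions, not details from the cited work [24].

```python
import math
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
    u = rng.random() - 0.5  # uniform in [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(count: int, epsilon: float, rng: random.Random) -> float:
    """Answer a counting query (L1 sensitivity 1) with
    epsilon-differential privacy by adding Laplace(1/epsilon) noise."""
    return count + laplace_noise(1.0 / epsilon, rng)

# A platform can release an aggregate statistic without exposing any
# individual record: the noisy answer stays close to the true count
# while masking whether any single user is in the data.
rng = random.Random(0)
print(private_count(1000, epsilon=0.5, rng=rng))
```

Smaller epsilon means more noise and stronger privacy; the design trade-off mirrors the innovation-versus-protection tension noted above, since heavier noise degrades the data’s commercial usefulness.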
Overall, a more comprehensive and nuanced approach is needed to effectively regulate data abuse on digital platforms, which takes into account the interests of all parties involved and considers the potential unintended consequences of various regulatory paths.

2.3. Research on Evolutionary Game Theory

Evolutionary game theory is a tool used to study interactions among animals, humans, and other organisms with regard to their behavioral strategies [32]. Common forms include bipartite evolutionary game models [33][34] and tripartite evolutionary game models [35][36]. In such games, the choices made by one organism can notably affect the actions of others, ultimately leading to varying levels of success or failure in individual survival and reproduction within a population. The evolutionary game method [33][35][36], which focuses on the evolutionary process and stable states of a population, can therefore help to study cooperation [37], competition [38][39], and regulation [40][41][42] in social and commercial decision making. Since the problem of data abuse involves a game among the government, platforms, and users, which together form large but bounded groups, the evolutionary game approach can be used to explore regulatory mechanisms that stop data abuse on digital platforms.
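The tripartite setting can be illustrated with a minimal replicator-dynamics sketch. The payoff structure and every parameter below (extra abuse revenue R, fine F, regulation cost C, supervision benefit B, supervision cost c) are illustrative assumptions, not the model of the cited papers.

```python
# A toy tripartite evolutionary game: government (regulate or not),
# platform (abuse data or not), users (supervise or not), evolved by
# Euler-discretized replicator dynamics. All payoffs are assumptions.

def replicator_step(x, y, z, dt=0.01, R=3.0, F=5.0, C=1.0, B=4.0, c=0.5):
    """One Euler step of the replicator dynamics.
    x: share of governments choosing strict regulation
    y: share of platforms choosing data abuse
    z: share of users choosing active supervision
    """
    # Assumed payoff advantage of each pure strategy over its alternative:
    dg = F * y - C        # regulating pays off when abuse is widespread
    dp = R - F * x        # abusing pays off unless regulation is likely
    du = B * x * y - c    # supervising pays off when abuse gets punished
    x += dt * x * (1.0 - x) * dg
    y += dt * y * (1.0 - y) * dp
    z += dt * z * (1.0 - z) * du
    return x, y, z

x, y, z = 0.5, 0.5, 0.5
for _ in range(20000):
    x, y, z = replicator_step(x, y, z)
print(round(x, 3), round(y, 3), round(z, 3))
```

In this toy model, the interior rest point satisfies x* = R/F and y* = C/F, so raising the fine F lowers both the regulation intensity needed to deter abuse and the long-run abuse share, echoing the fine-setting discussion in Section 2.2.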

This entry is adapted from the peer-reviewed paper 10.3390/systems11040188

References

  1. Ichihashi, S. Competing data intermediaries. RAND J. Econ. 2021, 52, 515–537.
  2. Ichihashi, S.; Kim, B.C. Addictive Platforms. Manag. Sci. 2022, 69, 1127–1145.
  3. Zhu, Y.; Grover, V. Privacy in the sharing economy: Why don’t users disclose their negative experiences? Int. J. Inf. Manag. 2022, 67, 102543.
  4. Just, N. Governing online platforms: Competition policy in times of platformization. Telecommun. Policy 2018, 42, 386–394.
  5. Belleflamme, P.; Peitz, M. Managing competition on a two-sided platform. J. Econ. Manag. Strategy 2019, 28, 5–22.
  6. Teh, T.H. Platform governance. Am. Econ. J. Microecon. 2022, 14, 213–254.
  7. Graef, I. When data evolves into market power: Data concentration and data abuse under competition law. In Digital Dominance; Moore, M., Tambini, D., Eds.; Oxford University Press: Oxford, UK, 2018; pp. 71–97.
  8. Mousavi, R.; Chen, R.; Kim, D.J.; Chen, K. Effectiveness of privacy assurance mechanisms in users’ privacy protection on social networking sites from the perspective of protection motivation theory. Decis. Support Syst. 2020, 135, 113323.
  9. Rochet, J.C.; Tirole, J. Platform competition in two-sided markets. J. Eur. Econ. Assoc. 2003, 1, 990–1029.
  10. Gawer, A.; Cusumano, M.A. Industry platforms and ecosystem innovation. J. Prod. Innov. Manag. 2014, 31, 417–433.
  11. Parker, G.G.; Van Alstyne, M.W.; Choudary, S.P. Platform Revolution: How Networked Markets Are Transforming the Economy and How to Make them Work for You; WW Norton & Company: New York, NY, USA, 2016.
  12. SAT, C. Notice on Issues Related to Enterprise Income Tax Preferential Policies for Software and Integrated Circuit Industry. 2016. Available online: http://www.chinatax.gov.cn/chinatax/n810341/n810755/c2128416/content.html (accessed on 15 December 2022).
  13. Cloarec, J. The personalization–privacy paradox in the attention economy. Technol. Forecast. Soc. Chang. 2020, 161, 120299.
  14. Fainmesser, I.P.; Galeotti, A.; Momot, R. Digital privacy. Manag. Sci. 2022. Epub ahead of print.
  15. Armstrong, M. Competition in two-sided markets. RAND J. Econ. 2006, 37, 668–691.
  16. Calvano, E.; Calzolari, G.; Denicolo, V.; Pastorello, S. Artificial intelligence, algorithmic pricing, and collusion. Am. Econ. Rev. 2020, 110, 3267–3297.
  17. Gilbert, R.J. Separation: A Cure for Abuse of Platform Dominance? Inf. Econ. Policy 2021, 54, 100876.
  18. Choi, J.P.; Jeon, D.S.; Kim, B.C. Privacy and personal data collection with information externalities. J. Public Econ. 2019, 173, 113–124.
  19. van Hoboken, J.; Fathaigh, R. Smartphone platforms as privacy regulators. Comput. Law Secur. Rev. 2021, 41, 105557.
  20. Liu, W.; Long, S.; Xie, D.; Liang, Y.; Wang, J. How to govern the big data discriminatory pricing behavior in the platform service supply chain? An examination with a three-party evolutionary game model. Int. J. Prod. Econ. 2021, 231, 107910.
  21. Arrieta-Ibarra, I.; Goff, L.; Jiménez-Hernández, D.; Lanier, J.; Weyl, E.G. Should we treat data as labor? Moving beyond “free”. AEA Pap. Proc. 2018, 108, 38–42.
  22. Bloch, F.; Demange, G. Taxation and privacy protection on Internet platforms. J. Public Econ. Theory 2018, 20, 52–66.
  23. Bourreau, M.; Kraemer, J.; Hofmann, J. Prominence-for-data schemes in digital platform ecosystems: Implications for platform bias and consumer data collection. In Innovation Through Information Systems; Ahlemann, F., Schütte, R., Stieglitz, S., Eds.; Springer: Cham, Switzerland, 2021; Volume 48, pp. 512–516.
  24. Yuan, S.; Pi, D.; Zhao, X.; Xu, M. Differential privacy trajectory data protection scheme based on R-tree. Expert Syst. Appl. 2021, 182, 115215.
  27. Koppu, S.; Somayaji, S.R.K.; Meenakshisundaram, I.; Wang, W.; Su, C. Fusion of Blockchain, IoT and Artificial Intelligence—A Survey. IEICE Trans. Inf. Syst. 2022, 105, 300–308.
  26. Wang, W.; Yang, Y.; Yin, Z.; Dev, K.; Zhou, X.; Li, X.; Qureshi, N.M.F.; Su, C. BSIF: Blockchain-based secure, interactive, and fair mobile crowdsensing. IEEE J. Sel. Areas Commun. 2022, 40, 3452–3469.
  27. Wang, W.; Chen, Q.; Yin, Z.; Srivastava, G.; Gadekallu, T.R.; Alsolami, F.; Su, C. Blockchain and PUF-based lightweight authentication protocol for wireless medical sensor networks. IEEE Internet Things J. 2021, 9, 8883–8891.
  28. Yang, Y.; Wei, X.; Xu, R.; Wang, W.; Peng, L.; Wang, Y. Jointly beam stealing attackers detection and localization without training: An image processing viewpoint. Front. Comput. Sci. 2023, 17, 173704.
  29. Grewal, R.; Chakravarty, A.; Saini, A. Governance mechanisms in business-to-business electronic markets. J. Mark. 2010, 74, 45–62.
  30. Shi, T.; Xiao, H.; Han, F.; Chen, L.; Shi, J. A Regulatory Game Analysis of Smart Aging Platforms Considering Privacy Protection. Int. J. Environ. Res. Public Health 2022, 19, 5778.
  31. Steppe, R. Online price discrimination and personal data: A General Data Protection Regulation perspective. Comput. Law Secur. Rev. 2017, 33, 768–785.
  32. Tanimoto, J. Fundamentals of Evolutionary Game Theory and Its Applications; Springer: Berlin/Heidelberg, Germany, 2015.
  33. da Silva Rocha, A.B.; Salomão, G.M. Environmental policy regulation and corporate compliance in evolutionary game models with well-mixed and structured populations. Eur. J. Oper. Res. 2019, 279, 486–501.
  34. Ji, S.F.; Zhao, D.; Luo, R.J. Evolutionary game analysis on local governments and manufacturers’ behavioral strategies: Impact of phasing out subsidies for new energy vehicles. Energy 2019, 189, 116064.
  35. Bao, A.R.H.; Liu, Y.; Dong, J.; Chen, Z.P.; Chen, Z.J.; Wu, C. Evolutionary Game Analysis of Co-Opetition Strategy in Energy Big Data Ecosystem under Government Intervention. Energies 2022, 15, 2066.
  36. Encarnação, S.; Santos, F.P.; Santos, F.C.; Blass, V.; Pacheco, J.M.; Portugali, J. Paths to the adoption of electric vehicles: An evolutionary game theoretical approach. Transp. Res. Part B Methodol. 2018, 113, 24–33.
  37. Yang, Z.; Shi, Y.; Li, Y. Analysis of intellectual property cooperation behavior and its simulation under two types of scenarios using evolutionary game theory. Comput. Ind. Eng. 2018, 125, 739–750.
  38. Cai, G.; Kock, N. An evolutionary game theoretic perspective on e-collaboration: The collaboration effort and media relativeness. Eur. J. Oper. Res. 2009, 194, 821–833.
  39. Yu, H.; Zeng, A.Z.; Zhao, L. Analyzing the evolutionary stability of the vendor-managed inventory supply chains. Comput. Ind. Eng. 2009, 56, 274–282.
  40. Mahmoudi, R.; Rasti-Barzoki, M. Sustainable supply chains under government intervention with a real-world case study: An evolutionary game theoretic approach. Comput. Ind. Eng. 2018, 116, 130–143.
  41. Li, B.; Wang, Q.; Chen, B.; Sun, T.; Wang, Z.; Cheng, Y. Tripartite evolutionary game analysis of governance mechanism in Chinese WEEE recycling industry. Comput. Ind. Eng. 2022, 167, 108045.
  42. Mirzaee, H.; Samarghandi, H.; Willoughby, K. A three-player game theory model for carbon cap-and-trade mechanism with stochastic parameters. Comput. Ind. Eng. 2022, 169, 108285.