Dialectical view on data sharing and privacy protection

BY LIU JIE, SONG RONG | 09-19-2024
Chinese Social Sciences Today

It is important to dialectically consider the relationship between data sharing and privacy protection. Photo: TUCHONG


Privacy issues have gradually become salient amid the digitalization of society and personal information. It is essential to avoid indiscriminate data collection and “algorithmic autocracy” by dialectically considering the relationship between data sharing and privacy protection.


Conflict between data sharing and privacy protection

First, extensive collection of personal data in both physical and virtual spaces has become a direct source of privacy breaches. Moreover, big data technology systems possess massive data storage capacity, raising concerns about potential data misuse. 


Second, the “secondary use” of data often occurs without user authorization, applying the data to projects unrelated to the original collection purpose. This may result in discrimination against individuals, information distortion, and disinformation during the process of data aggregation and information dissemination.


Third, as online interactions have permeated people’s everyday lives, privacy violations and threats are increasingly diverse.


Fourth, faced with the potential for large-scale collection and repeated use of personal information, individuals either willingly experiment, trading their privacy for better services, or remain cautious in their use of technology. High-profile privacy security incidents can lead to increased vigilance, reduced public trust in technology, and resistance to the adoption of new technologies, all of which negatively affect data sharing.


These conflicts essentially stem from the limitations of existing technologies, unregulated use of technology, and friction between various stakeholders. Privacy protection in the context of digitalization and intelligentization must be embedded in optional data sharing, with continuous technological improvement and adherence to reasonable social norms.


Context dependence of privacy protection

Privacy has both absolute and relative value. Its absolute value lies in an individual’s instinctive need for personal isolation, or for intimate relationships within a small group under certain circumstances, as well as in the inherent dignity and freedom that humans deserve. Its relative value refers to the significance of privacy in terms of regional culture, recognition by others, or legal norms. The relative value of privacy is the adaptation of its absolute value to the social environment. A person’s ability to enjoy, safeguard, and manage privacy largely depends on the attributes of the material, informational, and social environments in which they are situated. Privacy protection therefore requires exploring the specific context of privacy.


With the deep integration of virtual and physical spaces, the division between public and private spaces presupposed by traditional privacy theories can no longer effectively explain privacy issues in a big data environment, and privacy protection entails more than strict control of private information. People now rely heavily on information and communication technology in their daily lives and seek to ensure the appropriate flow of information. This necessitates a framework of contextual integrity to analyze when individuals perceive new technologies and contexts as threats to their privacy.


This contextual integrity involves two aspects: first, adherence to various social contexts and corresponding social norms, and second, maintaining consistency between the information and its original context when using or disseminating information belonging to others. Regarding privacy protection in a big data environment, individuals do not hold absolute power over their data. Relative and reasonable privacy protection should be founded on the examination of specific contexts such as the purpose of data collection, processing methods, and the practical value of its application.


Limitations of data sharing

The digitalization of society, while serving as the basis for data sharing, is itself based on a human understanding of the world, which can only be achieved within certain limits. As a result, both data generation and data sharing are inherently limited.


First, the representation of data is limited. Data provides a selective or approximate description of objective reality, rather than a complete replication of the complex and diverse real world.


Second, data practices are limited. Currently, technologies involved in data sharing, such as artificial intelligence, still suffer from “black box” issues, limiting their range of application, capabilities, and interpretability.


Lastly, data sharing should be regulated by social institutions. Technology must truly contribute to human well-being in order to have long-lasting vitality in society. Data sharing should therefore be developed within the framework of social institutions, which includes privacy protection regulations. It is important to respond to public demand and continuously fix technical vulnerabilities in the process of data sharing, ensuring the accuracy, integrity, and availability of data. This allows a high rate of data sharing to be maintained while individuals’ privacy is also protected.


Liu Jie and Song Rong (professor) are from the School of Marxism at Central China Normal University.




Edited by WANG YOURAN