
Research on the Legal Framework of Content Regulations for Network Platforms

Contemporary Social Sciences, 2024, Issue 1

Zhang Yin and Liao Xinyue

Southwest University of Political Science and Law

Abstract: In the Internet era, various network platforms have evolved into new hubs for information dissemination. China has established a platform-centered content regulation framework, wherein platforms proactively enforce content regulations in accordance with legal censorship obligations, while platform policies and user agreements augment their authority in content regulation. Platforms can achieve cost-effective and highly efficient content regulation by leveraging the strategic advantages afforded by their technical capabilities and extensive coverage. The platform self-regulation model, however, still faces challenges. First, accurately evaluating content remains a formidable task; second, self-regulation can hardly ensure that platforms fulfill their public obligations; third, users may be placed at a disadvantage by the platform’s exercise of self-regulation; and fourth, digital copyright owners face difficulties defending their rights in copyright disputes under the safe harbor rule. It is therefore imperative to establish, review, and revise the legal framework for content regulation of network platforms in order to enhance the efficiency of their governance systems. The formulation of this legal framework may encompass the following aspects: rationalizing obligations pertaining to platform content regulation, enhancing supervision over platform self-regulation, and establishing a dual-track responsibility system for digital copyright content regulation. This will ensure a harmonious balance among public interests, users’ personal rights and interests, and commercial benefits in regulating the content on network platforms.

Keywords: content regulations, platform self-regulation, legal framework

With the advent of the Internet era, digital technology has transformed contemporary society into a “network society” in terms of social structure, a “flexible society” in terms of institutional systems, a “cyber society” in terms of cultural development, and a “self-directed society” in terms of consciousness recognition (Kim & Kim, 2018, pp. 6–8). Network platforms have evolved into new hubs for information dissemination. In 2021, the Communist Party of China (CPC) Central Committee and the State Council jointly issued a guideline on developing a more civilized and well-regulated cyberspace environment. The guideline emphasizes the imperative of building a more civilized cyberspace environment and highlights the urgency and necessity of establishing behavioral norms in cyberspace and strengthening governance to create a healthier online ecosystem. The regulation of network platform content is the fundamental aspect of comprehensive governance in cyberspace. The content on network platforms, being contributed by diverse users, is characterized by a substantial volume of information, rapid transmission, extensive influence, and persistent damage that is challenging to eliminate. These factors amplify the potential for social risks and heighten the complexity of content regulation of network platforms. Network platforms, as privately owned entities of a commercial nature, possess distinctive technical and capital advantages in content regulation, and platform-centered cyberspace governance has become a popular choice for nations worldwide. The robust growth of the Internet economy has created vast opportunities for commercial interests in the business relationships associated with network platform content. In the current Internet era, a new game pattern has emerged in which public power, private power, and private rights intersect, so content regulation of network platforms must be carried out under a tripartite structure consisting of the government (state), platforms, and users. The regulation of content on network platforms therefore necessitates a legal framework that addresses the challenges posed by the self-regulation model, protects user rights and public interests, and leverages the advantages of platform self-regulation.

Current Situation of Content Regulation of Network Platforms

China has established a platform-centered content regulatory framework based on the Cybersecurity Law of the People’s Republic of China (the “Cybersecurity Law”) and the Administrative Measures on Internet Information Services, alongside numerous regulations and normative documents. These legal obligations exert external pressure on platforms, prompting them to proactively engage in self-regulation. The platforms’ inherent technical capabilities and extensive coverage effectively address the technical and economic challenges encountered by traditional administrative regulation, thereby establishing their dominant position in platform governance.

Power Source of Platform Self-Regulation

Chinese laws and regulations stipulate that platforms must assume substantial supervisory obligations for information and content regulation and bear corresponding administrative responsibility if they fail to fulfill such obligations. The Administrative Measures on Internet Information Services, implemented in 2000, was the first legislation to impose substantive supervision obligations on network platforms for information and content regulation. It delineated nine categories of information and content that are prohibited from being disseminated,① while also imposing record-keeping and reporting obligations on platforms.② The Administrative Measures on Internet Information Services (revised draft for opinion-seeking in 2021) has reinforced the obligation of network platforms in information and content regulation. The Cybersecurity Law, introduced in 2016 as the first specialized law in the field of cybersecurity in China, also stipulated that network operators assume responsibility for regulating user-generated information and content while implementing appropriate measures to deal with illicit content.③ The Cybersecurity Law imposes more stringent regulatory obligations on network platforms than the Administrative Measures on Internet Information Services. First, the scope of illegal content is expanded beyond the nine categories of prohibited content to include a recapitulative statement, thereby broadening the coverage of illegal content. Second, it imposes obligations on network platforms to identify and remove illegal information and content, and it eliminates the qualifier “obviously illegal content” found in the Administrative Measures on Internet Information Services, thereby imposing stricter censorship obligations on platforms. Finally, it stipulates that network operators shall bear corresponding administrative responsibility if they fail to fulfill their regulatory obligations, with penalties extending beyond the previous operational sanctions.④

① See Article 15 of the Administrative Measures on Internet Information Services: “Internet information service providers shall not produce, reproduce, distribute or disseminate information that includes the following contents: (1) content that is against the basic principles determined by the Constitution; (2) content that impairs national security, divulges State secrets, subverts State sovereignty or jeopardizes national unity; (3) content that damages the reputation and interests of the State; (4) content that incites ethnic hostility and ethnic discrimination or jeopardizes unity among ethnic groups; (5) content that damages State religious policies or that advocates sects or feudal superstitions; (6) content that disseminates rumors, disturbs the social order or damages social stability; (7) content that disseminates obscenity, pornography, gambling, violence, homicide and terror, or incites crime; (8) content that insults or slanders others or that infringes their legal rights and interests; and (9) other content prohibited by laws or administrative regulations.”

② See Article 16 of the Administrative Measures on Internet Information Services: “If an Internet information service provider discovers that information transmitted by its website clearly falls within the contents listed in Article 15 hereof, it shall immediately discontinue the transmission of such information, keep relevant records and make a report to relevant State authorities.”

③ See Article 55 of the Cybersecurity Law of the People’s Republic of China: “When a cybersecurity incident occurs, the cybersecurity incident response plan shall be initiated immediately, and the incident shall be investigated and assessed. Network operators shall adopt technical measures and other necessary measures to eliminate hidden security threats and prevent the spread of harm, and must promptly issue warnings which are relevant to the public.”

④ See Article 68 of the Cybersecurity Law of the People’s Republic of China: “If a network operator violates Article 47 of this Law by failing to stop the transmission of information which is prohibited from being published or transmitted by laws or administrative regulations, failing to employ disposition measures such as deletion, or failing to retain relevant records, the relevant competent department shall order the operator to take corrective action, issue a warning, and confiscate any illegal income. If corrective action is refused or in serious circumstances, a fine of RMB 100,000 to RMB 500,000 shall be levied and the operator may be ordered to temporarily suspend business, take down its website, or the operator’s business permits or licenses may be revoked; the directly responsible supervisor and other directly responsible personnel shall be subject to a fine of RMB 10,000 to RMB 100,000.”

The platform’s capacity to exercise self-regulatory authority in content regulation derives from the obligations imposed by laws and regulations pertaining to platform content regulation. The legal framework for regulating platform content in China can be divided into three levels: the first pertains to the criteria for identifying illegal content; the second involves the platform’s obligations in addressing illegal content; and the third addresses the consequences of a platform’s failure or refusal to fulfill its mandated processing obligations (Kong, 2020, pp. 133–148). Under this legal framework, platforms proactively regulate user-generated content to fulfill their obligations. From the perspective of administrative law theory, self-regulation is one of the mechanisms through which the state leverages private entities in society to assist in accomplishing public tasks (Gao, 2015, pp. 73–98). Chinese laws and regulations delineate the obligations of network platforms in content regulation but do not confer upon them any specific authority or powers. Network platforms are neither explicitly authorized by laws and regulations nor entrusted by relevant administrative entities to carry out self-regulation. Under the administrative law of the civil law system in the Chinese mainland, such self-regulatory authority falls under neither administrative authorization nor delegation; rather, platforms engage in self-regulation because of the obligations imposed by laws and regulations, and in doing so, platforms, as private entities, are drawn into the process of regulating network platform content. Therefore, within China’s Internet governance system, platforms leverage their technical capabilities and extensive coverage to establish a dominant position and enable self-regulation through the formulation of platform policies and user agreements that align with the content regulation obligations stipulated by laws and regulations.

Platform policies and user agreements are the formal embodiment of the power source underlying platform self-regulation. The platform gains the authority to carry out governance activities over its users through private law processes. The vast majority of network platforms have established internal platform policies, which require users to agree to and enter into a user agreement in order to register an account and access the complete range of services provided by the platform. The introduction of relevant regulations and normative documents signifies that the platform’s governance approach, through the formulation of platform policies and user agreements, has gained recognition from administrative authorities.① The platform possesses the authority to regulate user behaviors because users rely on the platform infrastructure for information dissemination. By establishing platform policies and signing user agreements with users, the platform gains the authority to conduct governance activities in the realm of private law.

① See the Provisions on the Management of Internet Forum Community Services, the Provisions on the Management of Internet User Public Account Information Services, the Regulations on Internet-Based Live Broadcasting Businesses, and other normative documents.

Methods for Content Regulation Under the Platform Self-Regulation Model

With the advancement of network technology, network platforms play an increasingly vital role in information dissemination, making it difficult for administrative authorities to regulate the vast amount of information and content on these platforms using traditional censorship methods. Highly efficient and cost-effective content regulation of network platforms can therefore be achieved, to a certain extent, by leveraging the platforms’ dominant position in the Internet architecture and establishing legal obligations pertaining to content regulation for platforms while capitalizing on their inherent technical advantages.

To effectively achieve content regulation, platforms have established two parallel governance mechanisms, namely pre-examination and post-processing, to address illegal content. In practice, the pre-examination mechanism primarily prevents illegal content from entering the platform through a pre-set keyword-shielding system. The automatic filtering, identification, and interception of inappropriate content through technological means has become a crucial strategy for alleviating the burden of content regulation on network platforms and enhancing their content regulation capabilities. Meanwhile, the recent development and application of automatic identification and filtering technologies also provide a compelling rationale for assigning greater content regulation responsibilities to network platforms (Wei, 2020, pp. 27–33). Upon conducting a thorough review or receiving user complaints, the platform evaluates instances of content violations and implements appropriate post-processing measures based on the severity of each violation. These measures primarily encompass content shielding and deletion, user blocking and expulsion, and restrictions on external link access.
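To make the two mechanisms concrete, the sketch below illustrates a minimal keyword-shielding pre-examination step of the kind described above. The blocklist, the normalization step, and the three-way outcome are illustrative assumptions, not any platform’s actual mechanism; production systems combine far larger, continuously updated term lists with machine-learning classifiers and manual review queues.

```python
import re

# Hypothetical keyword-shielding lists; real platforms maintain far larger,
# continuously updated lists alongside ML classifiers and manual review.
BLOCKED_KEYWORDS = {"forbidden_term_a", "forbidden_term_b"}
SUSPECT_KEYWORDS = {"suspect_term_x"}  # routed to manual post-processing

def normalize(text: str) -> str:
    """Collapse case, spacing, and punctuation tricks used to evade filters."""
    return re.sub(r"[\W_]+", "", text.lower())

def pre_examine(post: str) -> str:
    """Return 'block', 'review', or 'allow' for a user submission."""
    flat = normalize(post)
    if any(normalize(k) in flat for k in BLOCKED_KEYWORDS):
        return "block"    # intercepted: the content never enters the platform
    if any(normalize(k) in flat for k in SUSPECT_KEYWORDS):
        return "review"   # held for the manual post-processing mechanism
    return "allow"

print(pre_examine("This post mentions forbidden_term_a in passing."))  # -> block
print(pre_examine("An ordinary, compliant post."))                     # -> allow
```

The normalization step reflects a design consideration noted in the literature on automated filtering: users routinely insert spacing or punctuation to evade simple keyword matching, which is one reason purely list-based pre-examination is insufficient on its own.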

Owing to their dominant position in Internet governance, platforms’ content regulation efforts go beyond merely removing illegal content. With the advancement of big data technology, platforms leverage their technological and capital advantages to deliver personalized content to specific users, promote promotional content to non-specific users, and establish user account weight mechanisms. This also signifies that the platform possesses actual control over the dissemination of compliant information and content.

The advantages of platform self-regulation of content are primarily manifested in the following aspects. First, the platform serves as both the source of the information to be regulated and the target of regulation, enabling prompt handling of illegal content upon detection and thereby overcoming the time delay associated with traditional administrative regulation. Second, the massive, instantaneous, and interactive nature of Internet information dissemination makes regulation relying on technology and algorithms the optimal approach to regulating content on network platforms; the platform, equipped with essential technical advantages and expertise, outperforms the traditional administrative regulation mode in both cost and efficiency. Third, owing to the borderless nature of the Internet, both the content and the users of network platforms are decentralized across national boundaries, and platform self-regulation can overcome the jurisdictional restrictions those boundaries impose. Fourth, the platform is not only bound by external legal obligations but also internally driven to enforce self-regulation of network content: cultivating a favorable communication environment can effectively attract and retain users, thereby promoting the platform’s economic benefits and long-term development.

Dilemma of Content Regulation of Network Platforms

Under the self-regulation model, the network platform has effectively established a dominant position in content regulation and emerged as the entity possessing “discourse power” in cyberspace, owing to its advanced technical capabilities and extensive coverage. Given their nature as private entities, the design of governance mechanisms for network platforms often prioritizes the maximization of private economic interests while also striving for highly efficient and cost-effective governance. Consequently, the self-regulation model for content regulation on these platforms entails both risks and deficiencies.

Platforms Face Challenges in Accurately Assessing Content

Current laws and regulations set relatively vague standards for identifying illegal information, leaving platforms without clear judgment criteria for content regulation. The Cybersecurity Law and the Administrative Measures on Internet Information Services have provided a more comprehensive definition of illegal information; however, the absence of clear criteria for content judgment remains an ongoing issue. Some scholars have summarized the issues in determining harmful information on the Internet as follows: despite the existence of corresponding provisions in current legislation, deficiencies persist within the legislative system in terms of its unity, rationality, and clarity, owing to the presence of diverse legislative bodies, varying normative levels, and an excessively broad design of relevant provisions. Consequently, problems may arise such as “different verdicts in the same kind of case,” subjective interpretations, misuse of authority, and challenges in implementation (Yin, 2015, pp. 102–113).

The diverse content and varied forms of expression on network platforms make it challenging for a platform to judge content accurately during the examination process. The borderless nature of the Internet enables the unrestricted exchange of content and information on a global scale, transcending national boundaries. Although content that poses a threat to national security has been classified as illegal information and legal norms are in place for its handling,① it remains challenging to accurately identify specific instances involving “invisible” cultural export and negative incitement related to social culture, national culture, and the dissemination of opinions on major social events, given the limitations of current technological capabilities. The platform cannot effectively block such content through its keyword-based pre-examination mechanism, while the personnel in the post-processing mechanism lack the capacity to make the corresponding value judgments, making such imperceptible illegal content difficult to identify and assess.

① See Article 50 of the Cybersecurity Law of the People’s Republic of China: “The national cyberspace authorities and relevant departments shall be responsible for monitoring and managing the security of online content. Where they discover the publication or transmission of information that is prohibited by laws or administrative regulations, they shall request that network operators stop the transmission of such information and employ removal measures such as deletion, as well as retain relevant records; for information described above that comes from outside of the territory of the People’s Republic of China, they shall notify the relevant organization to adopt technical measures and other necessary measures to block the transmission of such information.”

The stringent regulatory responsibilities that administrative authorities impose on platforms, coupled with their inclination to prioritize final outcomes when evaluating platform governance performance, have left platforms with little intrinsic motivation to make accurate value judgments on content. Chinese laws and regulations stipulate that platforms are responsible for regulating content, with corresponding penalties for violations. Although stricter restrictions on users may harm user stickiness, platforms often opt for more rigorous screening mechanisms rather than investing in higher examination costs or risking penalties. As a result, in practice, platforms usually lack sufficient motivation to judge content rigorously.

The Self-Regulation Model Can Hardly Ensure That Platforms Fulfill Their Public Obligations

The obligations that Chinese laws and regulations impose on network platforms require them to assume certain public management functions in content regulation. The active engagement of platforms in network information and content regulation constitutes a crucial component of the comprehensive network governance system: platforms effectively assume the role of a primary entity responsible for social governance and play a certain role in safeguarding the order of cyberspace and the rights and interests of users, and their public nature as market regulators has become increasingly prominent (Liu, 2019, pp. 42–56). These platforms, being private entities primarily driven by commercial interests, are inevitably inclined to establish rules and regulations within their systems that promote their own development and pursuit of commercial interests. This may inadvertently limit or even deprive other entities of their rights and resources, owing to the “Matthew effect” of economic growth and the profit-seeking nature inherent in any commercial private entity (Sun, 2021). For instance, a foreign browser exploits its dominant market position by using summaries, news, and other content from media websites in its search engine without charge, leading to a decline in user traffic for those websites. Moreover, certain search engines prioritize paid content, or information from enterprises that have a partnership with the platform, through algorithm settings, which may mislead users and increase the risk of misjudgments based on inaccurate information.

With the advancement of the Internet economy, certain inevitable transformations and challenges have emerged in network content, primarily characterized by the Matthew effect, substandard content, susceptibility to platform manipulation, and the growing influence of Internet troll armies (Zeng & Tian, 2019, pp. 166–167). The emergence of network public relations companies further demonstrates the substantial commercial benefits derived from the content regulation of network platforms. These companies perceive the combination of the good online reputations maintained by some enterprises and individuals and the platforms’ actual control over content as a business opportunity of the Internet era. Network public relations companies in fact function as “intermediary traders” who profit from the price spread by charging users in need of public relations services and paying fees to the platform in exchange for the authorization to publish or remove targeted content. Users with a large demand for network public relations services can also establish direct collaborations with the platform. For instance, a TV series or film may dominate the trending hashtags for an entire month after its release, or content related to a celebrity may be banned and the relevant keywords removed. Because of the commercial nature of network platforms, such public relations behaviors pose little technical difficulty for the platform and carry significant commercial interests, which makes platforms consistently pleased to see them develop. Platforms, however, tend to overlook the risks of distortion or manipulation in transmitting information and the resulting harm to public interests caused by such network public relations behaviors.

In addition, there is a potential risk that the platform may exploit its dominant position to infringe upon users’ right to engage in public discourse. The network platform, serving as a new hub for information dissemination in the Internet era, leverages its technical capabilities and extensive coverage to connect vast amounts of information with diverse user groups, thus establishing itself as a public domain in which the public acquires and disseminates information. Through content regulation, platforms can exert a degree of control over social opinion and public life. Owing to their technical advantages and privileges, platforms employ algorithmic recommendations, trending topics, search rankings, and other technical means to grant specific or non-specific users preferential access to certain content. This approach is more covert, yet exerts a greater influence, than paid news and advertising in traditional media.

In the social interactions they facilitate, platforms establish online spaces governed by platform architecture and algorithms, and in such spaces the potential risks associated with content can have significant negative impacts on society (Price, 2021, pp. 238–261). The lower operating costs of network platforms enable them to attract a large user base, making oligopoly more likely than among traditional enterprises. The Internet industry has witnessed the emergence of a new type of competition in which a few dominant enterprises hold the majority of the market share; often the largest player captures more than half of the market, surpassing its closest competitors by several folds. Examples include Facebook, Apple, Microsoft, Google, and Amazon in the United States, as well as Baidu, Alibaba, and Tencent in China (Han & Li, 2020, pp. 104–110). The closure of Trump’s social media accounts by Twitter and Facebook in 2021 demonstrated the noteworthy influence that leading network platforms can exert on public opinion and even political affairs. There is thus a risk that super platforms and technology giants may exert absolute control over speech and information flow in the digital era, an issue that also presents an inevitable challenge for China’s regulation of network platform content.

Platforms’ Exercise of Self-Regulation Undermines Users’ Position

The self-regulatory authority possessed by network platforms falls within the realm of private power. This authority stems from the dominance and influence that platforms derive from their technical advantages (Zhou, 2015, pp. 37–43) and is manifested in the unbalanced or asymmetrical legal relationships between private entities (Xu, 2018, pp. 105–121). In the Internet era, digitalization has transformed the formation mechanism of power through “technological prowess” that indirectly impacts others’ capabilities by intervening in or altering their natural and artificial living conditions (Meier & Blum, 2020, pp. 70–75). By establishing platform policies and signing user agreements, platforms have formed a contractual relationship with users in the realm of private law. However, owing to their technological prowess, information superiority, and architectural advantages, platforms have in fact gained dominance over users, resulting in an imbalanced power dynamic in which users lack substantial bargaining power. In the formulation of platform policies and user agreements, users are not allowed to participate in or negotiate the terms; they can only choose whether to accept them, and declining acceptance means exclusion from the services provided by the platform. The self-regulation model grants the platform direct and “private” authority to oversee and regulate its content, thereby enhancing the platform’s discretionary power in evaluating such content (Li, 2019, pp. 834–842). The disadvantageous position of users is manifested in the following aspects:

First, the platform’s examination mechanism, whether technical or manual, lacks the openness and transparency that procedural justice requires in both the examination process and its criteria. This limitation hampers users’ right to be informed and right to redress to a certain extent. Moreover, platforms may manipulate content examination for commercial profit, thereby compromising the rights and interests of users (Madio & Quinn, 2023).

Second, although platform policies provide reporting and appeal channels through which users can raise objections or seek redress, these channels are still handled by the platforms themselves. In practice, the queue for manual customer service tends to be lengthy, and even users who wait patiently for a response often receive formulaic “official” replies that hinder effective communication with the platform. Consequently, it is difficult for users to have their rights and interests adequately protected and redressed through the platform’s internal channels.

Third, platform policies and user agreements typically use standard terms to specify the transfer of a portion of users’ rights and interests. Although this promotes effective platform governance, it further exacerbates the imbalanced contractual power dynamics between the platform and its users. For example, the Weibo Online Service Agreement stipulates that users shall not, by any means, cause harm to the business reputation or other legitimate rights and interests of its operator Weimeng Company and its associated companies, nor engage in any activities that disrupt the normal operation of Weimeng, undermine the business model of Weibo, or otherwise jeopardize the integrity of Weibo’s ecosystem. In essence, this provision limits users’ ability to provide constructive criticism and express dissatisfaction with the platform.

Fourth, the Weibo Online Service Agreement also includes a provision stating that once a user registers a Weibo account, the platform will assume that the user agrees to have various types of commercial advertisements or other commercial information placed by the operator in different ways throughout the provision of Weibo services (including, but not limited to, placing advertisements on any page of the Weibo platform website), and consents to receiving product promotions or other relevant commercial information from the operator through email, private messages, or other means. Evidently, the platform has generated significant commercial profits by mandating that users transfer their rights and interests through standard terms, while users bear the burden of discerning whether the content they browse genuinely reflects other users’ sentiments or contains covert advertisements placed by the platform.

Challenges Faced by Digital Copyright Owners in Safeguarding Their Rights Under the Safe Harbor Rule

The safe harbor rule provides legal protection for the platform, exempting it from tort liability that may arise during the content regulation process. The rule was initially introduced in the Digital Millennium Copyright Act enacted by the United States in 1998, and the EU likewise incorporated an exemption system for tort liability in the E-Commerce Directive 2000/31/EC. The principle conditionally restricts the copyright infringement liability of providers engaged in information transmission, system caching, information hosting, and information location services, and was initially applied within the realm of copyright protection. In the early stages of Internet development, network intermediary service providers had limited capacity to conduct prior content examination and were therefore generally presumed unaware of infringing information. As a result, the “notice-and-takedown” rule was implemented to restrict the indirect infringement liability of these providers.

The application of the safe harbor rule in China is primarily reflected in the relevant provisions of the Regulation on the Protection of the Right of Communication to the Public on Information Networks, which specify the conditions under which various Internet service providers can be exempted from liability and enjoy safe harbor treatment.① In principle, the safe harbor rule also applies to network platforms. The Measures for the Administrative Protection of Internet Copyright stipulate that “Where there is no evidence to prove that an Internet information service provider knows the facts of a tort, or the Internet information service provider has taken measures to remove relevant contents after receipt of the copyright owner’s notice, the Internet information service provider shall not assume the administrative legal liabilities.”②

① See Articles 20, 21, 22, and 23 of the Regulation on the Protection of the Right of Communication to the Public on Information Networks.

② See Article 12 of the Measures for the Administrative Protection of Internet Copyright.

With the continuous development of China’s network industry and advances in technology, the safe harbor rule established for copyright protection is proving inadequate to address the challenges posed by digital copyright in the Internet era. Instead, it has become a “safe harbor” for network platforms and may evolve into a protective shield enabling them to evade liability for tort compensation. This is evident in practices such as engaging in deliberate infringement before receiving a notice of infringement, claiming no liability if no notice is received, and claiming no liability once the content is removed upon receiving a notice. In the practice of content regulation of network platforms in China, it is difficult to prove that “an Internet information service provider clearly knows the facts of tort.” This challenge frequently leaves platforms inactive when it comes to seeking evidence, under the safe harbor rule, as to whether the communicated content constitutes infringement. Consequently, the determination of tort liability generally relies on copyright owners proactively notifying the Internet information service provider of the fact of tort; in practice, ensuring that platforms are aware of copyright infringement depends mainly on copyright holders raising objections. However, since network platforms are numerous and vary greatly in size, it is excessively costly and challenging for copyright owners to individually identify and give notice of each instance of infringement in order to protect their rights effectively.

Feasible Paths for Establishing a Legal Framework for Content Regulation of Network Platforms

The advent of the intelligent Internet has given rise to a new and complex game pattern in which public power, private power, and private rights intersect, and cooperation and confrontation can arise between any two of them. This has significantly transformed the past structure and function of the relationship between the state and society, as well as between power and rights (Ma, 2018, pp. 20–38). While employing the self-regulatory capabilities of network platforms, enabled by their technical and information advantages, to implement content regulation, it is necessary to establish and improve the corresponding legal framework. The mode of content regulation of network platforms should involve both the intervention of public power and the regulation of private power (Kong, 2020, pp. 133–148).

Rationalizing Obligations for Platform Content Regulation

The legal framework for content regulation of network platforms should refrain from imposing overly rigid requirements on platforms’ content regulation obligations. Excessively strict obligations significantly harm the platform business model, hinder technological innovation, and impede market competition (Lefouili & Madio, 2022, pp. 319–351). The public nature of network platforms should be emphasized in regulating platform content, while the market characteristics of these platforms as for-profit commercial entities should also be taken into consideration when determining their regulatory obligations. The digital information industry follows a distinctive operating law, characterized by rapid innovation in both technology and business models (Tang & Tang, 2023, pp. 59–72). An improper emphasis solely on the platform’s primary responsibility will result in excessive operating costs and an overly formalized approach to content regulation, thereby compromising public interests and potentially excluding the government from bearing corresponding regulatory obligations (Liu, 2022, pp. 79–93). Relying solely on legal liability as an external pressure has proven insufficient to curb platforms’ manipulation of content examination for their own commercial gain. Establishing a sound platform liability system requires striking a balance between commercial autonomy and content compliance (Wei, 2020, pp. 27–33). Therefore, the commercial behaviors resulting from platform content regulation should be integrated into the legal regulatory framework. While respecting the commercial nature of the platform, it is important to encourage reasonable commercial competition and cooperation among platforms within the market laws of Internet economy development. Administrative authorities need to implement dynamic supervision of platform governance and promptly identify and deal with platform behaviors that may undermine public interests or users’ rights and interests.

Moreover, administrative authorities should avoid prioritizing final outcomes when evaluating platform governance performance. In China, administrative supervision of content regulation on network platforms is primarily carried out through interviews conducted by the Office of the Central Cyberspace Affairs Commission with platform leaders, as well as special administrative law enforcement actions. However, both interviews and special enforcement actions are post hoc means of administrative accountability, and administrative authorities tend to prioritize final outcomes when evaluating whether platforms have fulfilled their regulatory obligations. Some scholars argue that platform responsibility is predicated on the mere presence of illegal information rather than on any fault on the part of the platform, which essentially imposes strict content regulation obligations on platforms (Yao, 2019, pp. 31–42). To be exempt from liability, a platform can only adopt stricter restrictions on content examination, for example by setting more detailed filtering terms and embracing an examination principle of “presumed guilt for any suspected violation,” thereby directly prohibiting information that may potentially infringe rights or violate the law. Excessively stringent obligations will turn platforms into examiners of social discourse and undermine the diversity of content (Hogan Lovells, 2018, pp. 1–28); they may also affect users’ freedom of expression and impede the platform’s role as a hub for gathering and sharing information.

Platforms should be empowered to ascertain the legality of content based on legal principles where laws and regulations are ambiguous. In the process of content regulation, if promptly ascertaining the legality of content proves difficult, platforms can employ principles such as the clear and present danger doctrine, case-by-case evaluation, and content classification governance to determine and address each case. The clear and present danger doctrine can be summarized as the principle that limitations on freedom of speech are justified when the content may pose significant potential harm owing to specific circumstances. The principle of case-by-case evaluation was initially applied by courts in adjudicating relevant cases, granting judges greater discretionary power; in the context of content regulation, the “balance of interests theory” can be employed to weigh the various conflicting rights and the respective consequences of protecting or suppressing them, after which a judgment can be made as to which rights should receive what degree of protection before proceeding with regulation. The principle of content classification governance involves organizing content into political, commercial, and unprotected categories; since legal protections differ for each category, network platforms must regulate content according to its category (Xie & Song, 2022, pp. 67–79), as the sketch following this paragraph illustrates.
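As a rough illustration of classification-based governance, the sketch below maps content categories to differing default handling rules. The category names follow the classification above, but the specific rules and the upstream classifier are illustrative assumptions rather than a statement of any existing law or platform policy.

```python
from dataclasses import dataclass

# Illustrative defaults only: each category from the classification principle
# above is mapped to a different level of protection and handling.
CATEGORY_RULES = {
    "political":   "hold for manual, case-by-case evaluation (highest protection)",
    "commercial":  "automated screening; act on verified complaints",
    "unprotected": "remove immediately upon detection",
}

@dataclass
class ContentItem:
    text: str
    category: str  # assumed to be supplied by an upstream classifier

def handling_rule(item: ContentItem) -> str:
    """Return the default governance rule for an item's category."""
    # Unknown categories fall back to case-by-case evaluation.
    return CATEGORY_RULES.get(item.category, "escalate to case-by-case evaluation")

print(handling_rule(ContentItem("a product promotion post", "commercial")))
```

The fallback branch reflects the case-by-case principle discussed above: where no category rule clearly applies, the balance-of-interests weighing, rather than an automated default, should decide the outcome.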

Improving Supervision over Platform Self-Regulation

Platforms possess technical and operational advantages in implementing content regulation through self-regulation, yet the absence of effective supervision may result in the abuse of platform governance power. It is therefore imperative to establish and improve the legal framework for content regulation of network platforms so as to bolster supervision over platform self-regulation.

Introducing Due Process of Law in Platform Self-Regulation

The supervision of content regulation by administrative authorities should not be limited to imposing obligations in advance and holding platforms accountable afterward. Administrative authorities should exercise regulatory functions in the process of establishing platform policies and user agreements, guiding the implementation of platforms’ self-regulatory mechanisms and ensuring that they operate in accordance with legal rules.

The exercise of national public power is bound by the principle of due process of law, but a platform may avoid the requirements of due process when exercising its private power, which places users at a disadvantage and makes it difficult for them to protect and redress their rights and interests. The platform, with its distinctive resource advantages and technical capabilities, has the potential to establish a monopoly over rulemaking, rule implementation, penalties, and other related rights, and such a monopoly serves as a pathway to acquiring power (Mann, 2018, p. 29). The advantages of platforms in technology, information, and architecture can empower them to exert control over their users. When platforms engage in self-regulation of platform content, they exercise “quasi-legislative power,” “quasi-administrative power,” and “quasi-judicial power” similar to national public power in their actual interactions with users. This is primarily manifested in three aspects: the establishment of platform policies and user agreements, the handling of illegal content, and the handling of user complaints. The platform’s quasi-legislative power is manifested through the rule-making authority it exercises in establishing platform policies and user agreements. In practice, the exercise of this rulemaking power deprives users of the right to participate in the process or negotiate relevant terms; as bound parties, users are granted no opportunity to engage in rulemaking or raise objections, so the platform exercises complete control over rulemaking, leveraging the advantageous position derived from its extensive coverage. The “quasi-administrative power” refers to the platform’s power to address illegal content during content regulation, which resembles administrative enforcement power: the platform reserves the right to block and delete relevant content, or even ban or expel users, if it considers the content involved illegal. During this process, however, it is difficult for users to safeguard their rights to be informed, to make inquiries, and to seek redress regarding the platform’s examination criteria and handling measures. The “quasi-judicial power” arises when users submit redress applications to the platform: because the platform’s internal organs are the agents responsible for accepting users’ appeals and processing redress applications, the decisions the platform makes on these applications are effectively final.

The principle of due process of law serves as the premise and guarantee for the platform to achieve efficient governance in the course of self-regulation. It is therefore both necessary and reasonable to incorporate this principle into platform self-regulation; only in this way can a harmonious and healthy relationship between the platform and users be established under the self-regulation model. In the exercise of the “quasi-legislative power,” it is important to ensure that users have the right to participate in the formulation of platform policies and user agreements and to raise objections, as the principle of due process of law requires. In the exercise of the “quasi-administrative power,” platforms are required, under the same principle, to actively disclose their key filtering term lists, keep their examination processes and criteria transparent, and take responsibility for any negligence. The exercise of the “quasi-judicial power” necessitates disclosure of the appeal process and its grounds, allowing users to re-appeal the platform’s processing procedures and outcomes in accordance with the principle of due process.

Establishing Relevant Supporting Systems

First, establishing a comprehensive system for recording and reviewing platform policies and user agreements. With the superimposed development of algorithms, big data, and the digital economy, platform policy formulation is becoming progressively more sophisticated, technical, and intricate. It is therefore imperative for government agencies to scrutinize platform policies in order to mitigate the risk of misinterpretation stemming from the knowledge gap between users and platforms, while also lowering the threshold for public comprehension of those policies. Relevant departments of the administrative authorities should carefully examine platform policies and user agreements and then provide corresponding suggestions for rectification and modification. This will ensure that platform policies accurately reflect their public nature and safeguard public interests, thereby bolstering the rationality of such policies. In addition, the Office of the Central Cyberspace Affairs Commission and other departments can take the lead in providing normative guidance for formulating platform policies, user agreements, platform service content, privacy policies, and other key platform regulations, ensuring a balanced approach to the interests of various stakeholders during the formulation process.

Second, building government redress channels for users to submit reports and appeals. When enforcing the law, administrative authorities are obligated to adhere to the fundamental requirements of due process, which encompass providing prior notification and offering opportunities for the counterpart to present statements and defenses. The counterpart has the right to raise objections regarding both the substantive and the procedural aspects of enforcement actions, as well as to file an administrative lawsuit asking judicial authorities to review the legality and rationality of the administrative behavior. In the content regulation of network platforms, however, the internal redress channels provided by the platform remain part of the platform’s own governance mechanism and represent its own interests. In the absence of any channel for redress other than the platform’s internal mechanism and resort to legal action, users’ rights and interests may not be fully safeguarded. Even if users believe the platform’s governance behavior is inappropriate and file a civil lawsuit with the court, they may face heavy costs in safeguarding their rights owing to the lengthy litigation process and the principle that civil cases are heard in the jurisdiction where the defendant is located. Moreover, because platform policies and user agreements predominantly consist of standard terms, and courts can only adjudicate disputes over the specific content of the user agreement signed between users and platforms, judicial evaluation of such standard terms varies significantly in China’s judicial practice; courts are likewise inconsistent in assessing whether a platform has fulfilled its obligation to reasonably call attention to standard terms (Hu & Li, 2019, pp. 53–62). The effectiveness of judicial redress based on the civil legal relationship between users and platforms is therefore inferior to that of administrative redress. Given this, the feasible approach to offering efficient and convenient redress for users is to establish redress channels akin to administrative redress. In fulfilling their supervisory responsibilities, relevant government departments should create mechanisms for administrative review, enabling users to submit reports and appeals regarding platform behaviors and to question platform policies and user agreements. Furthermore, the provision of redress channels for users should be included within the scope of administrative authorities’ supervisory functions.

Third, developing a regular public reporting system. To ensure the transparency required by due process, it is instructive to refer to the relevant provisions of Germany’s Network Enforcement Act of 2017. Under this Act, social network platforms that receive more than 100 complaints per year are obligated to produce a semiannual report and publish it both in the Bundesanzeiger and on their websites’ home pages, providing comprehensive details of its contents. The UK’s Online Harms White Paper, published in 2019, likewise mandates regulated platforms to produce and publish annual transparency reports covering the dissemination of harmful content on the platform and the corresponding measures the platform has implemented; the reports are published by regulatory authorities, who possess the authority to demand additional information from the platform, including about algorithmic operations. Drawing on the experience of these foreign systems, China should also establish a robust public reporting system. Reports should include regular public disclosures in the form of semi-annual or annual platform reports, as well as reports triggered by a specific number of user complaints. Administrative regulatory authorities should be responsible for determining the content of these reports, organizing users to engage in discussions of that content, and providing detailed provisions stipulating the specific requirements.

Establishing a Dual-Track Responsibility Mechanism for Digital Copyright Content Regulation

The Red Flags Rule, a fundamental component of the safe harbor rule, has been invoked by numerous copyright owners in infringement litigation against platforms functioning as website operators. The safe harbor rule is a general principle, while the Red Flags Rule represents its application in specific cases: Internet service providers can be exempt from legal liability only if they were not clearly aware, and should not have been aware, that the information they transmit constitutes infringement, or that linked works, performances, or audio and video recordings are infringing works. Article 23 of the Regulation on the Protection of the Right of Communication to the Public on Information Networks embodies the Red Flags Rule.① However, during the initial phase of Internet development in China, courts held divergent interpretations of how the safe harbor rule should apply. Consequently, numerous piracy websites exploited the rule to evade accountability, while search engines and sharing platforms still perceive it as a crucial safeguard against liability. Therefore, the application of the Red Flags Rule should be emphasized in legislation or through guiding cases of the Supreme People’s Court, highlighting its priority over the safe harbor rule when determining network infringement.

① See Article 23 of the Regulation on the Protection of the Right of Communication to the Public on Information Networks: “A network service provider which provides searching or linking service to a service recipient and which, upon receiving a written notification of the right owner, disconnects the link to an infringing work, performance, or sound or video recording in accordance with the provisions of these Regulations bears no liability for compensation; however, if it knows or has reasonable grounds to know that the linked work, performance, or sound or video recording is an infringement, it shall bear the liability for contributory infringement.”

The safe harbor rule is widely recognized as a crucial legal incentive that promoted the innovation of early Internet technology and business models, serving as a foundational pillar of the thriving Internet economy. The advancement of Internet technology has since greatly facilitated automatic filtering technology, significantly reducing the difficulty and cost for network platforms to conduct content pre-examination and creating a development context in which the Red Flags Rule can prevail. Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC stipulates that content-sharing service platforms will be held directly liable for “communicating infringing content to the public” when users publish copyrighted works in violation of those rights. This Directive excludes the application of the safe harbor rule for content-sharing service platforms in cases of user-generated content infringement and imposes an obligation on such platforms to obtain prior authorization for copyright and related rights, thereby avoiding post-event general content examination. It also aims to reduce the platform costs of obtaining copyright authorization through collective licensing mechanisms and negotiation mechanisms for various rights. These measures overcome certain limitations inherent in the safe harbor rule’s reliance on copyright owners issuing infringement notices.

The establishment of an efficient copyright content regulation responsibility system necessitates a dual-track accountability framework in which the responsibilities of rights owners and platforms are clearly defined (De Chiara, Manna, Rubí-Puig, et al., 2021). China should therefore also prioritize the implementation of the Red Flags Rule. The administrative authorities should take the lead in establishing a comprehensive digital copyright information database and actively encourage copyright owners to register their copyrighted works in it, thereby effectively safeguarding their rights. Platforms, in turn, should adopt proactive measures, such as advanced algorithms and other technical means, to compare their content against the registered digital copyright information database. It is imperative for platforms to address any relevant content that has been registered in the database rather than relying solely on notifications from copyright owners. This approach delineates the respective responsibilities of platforms and copyright owners with regard to being clearly aware of copyright infringement, thereby enhancing the protection of digital copyright. A minimal sketch of the comparison step follows.
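The sketch below illustrates the comparison step under stated assumptions: a hypothetical registry of content fingerprints stands in for the administrative digital copyright information database, and an exact SHA-256 hash stands in for the robust perceptual fingerprinting (e.g., audio or video fingerprints) that a production system would require. All names are illustrative.

```python
import hashlib
from typing import Optional

# Hypothetical registry standing in for the administrative digital copyright
# information database described above; keys are fingerprints of registered
# works, values identify the registered work.
registered_fingerprints = {
    hashlib.sha256(b"bytes of a registered work").hexdigest(): "Registered Work #001",
}

def fingerprint(content: bytes) -> str:
    """Exact hash as a stand-in for a robust perceptual fingerprint."""
    return hashlib.sha256(content).hexdigest()

def check_upload(content: bytes) -> Optional[str]:
    """Return the registered work an upload matches, if any.

    Under the dual-track scheme, a match obliges the platform to act
    proactively instead of waiting for an owner's infringement notice.
    """
    return registered_fingerprints.get(fingerprint(content))

match = check_upload(b"bytes of a registered work")
print(f"proactive handling required: matches {match}" if match else "no registered match")
```

The division of labor in the sketch mirrors the dual-track idea: the owner’s responsibility ends with registering the work, while detection of registered content shifts the platform from notice-driven to proactive handling.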

Conclusion

Platform-centered content regulation has become a prevalent approach to cyberspace governance in many countries today. However, given the emerging game pattern of the Internet era, in which public power, private power, and private rights intersect, it is imperative to establish and refine the legal framework for content regulation of network platforms in order to address the shortcomings and risks of the platform self-regulation model. The involvement of public power should not be overlooked in formulating specific legal rules, yet its intervention should not be excessively stringent: if the platform becomes a mere instrument for implementing administrative decisions, it will lose its advantages in regulating network content and hinder information exchange, content innovation, and fair competition in the Internet era. In constructing the legal framework for content regulation of network platforms, public power should play the role of supervisor and guide, and efforts should be made to strike a balance between public interests, users’ individual rights, and platforms’ commercial interests in regulating the content on these platforms.
