Media Matters for Democracy conducts an initial analysis of the new social media rules and their potential impact on digital rights and economy in Pakistan

(Islamabad, Pakistan)

Recently, a new set of ‘Rules’ was notified by the Ministry of Information Technology and Telecommunication (MOITT) to govern social media in Pakistan. The new rules are titled ‘Removal and Blocking of Unlawful Online Content (Procedure, Oversight, and Safeguards) Rules 2020’ (hereinafter ‘Rules’) and they outline the implementation of Section 37 of the Prevention of Electronic Crimes Act (PECA) 2016.

The Rules empower the telecom regulator, the Pakistan Telecommunication Authority (PTA), to ‘block online speech critical of the government and public officers’, and delegate similarly overbroad powers to law enforcement agencies.

The Rules are essentially a reincarnation of the much-debated and contested Citizen Protection (Against Online Harm) Rules 2020 (CP Rules), which were ‘approved’ and later suspended by the federal government earlier this year. The CP Rules outlined the government’s plan to localise data and centralise control of digital information and expression under a single ‘National Coordinator’, among other regressive and dangerous provisions that could seriously impact fundamental rights and hamper the growth of Pakistan’s digital economy.

Civil society organizations including Media Matters for Democracy (MMFD), Pakistan Federal Union of Journalists (PFUJ), and the Islamabad High Court Bar Association declared the CP Rules unacceptable and demanded an immediate revocation.

Media Matters for Democracy (MMFD) has conducted an initial clause-by-clause analysis of the recently published social media Rules and their potential impact on the Internet, digital economy, and human rights in Pakistan.


“Removal and Blocking of Unlawful Online Content (Procedure, Oversight, and Safeguards) Rules 2020”, as stated earlier, outlines the rules and procedures for the implementation of Section 37 of PECA 2016, a section that empowers the telecom regulator to remove or block access to any online content based on vague qualifiers.

The Rules further broaden the scope for content blocking and takedown based on vague parameters, and also allow for barring criticism of the federal government.

Following is a brief clause-by-clause analysis of the new social media Rules:

Section 1 (XIII) – Definition of the Social Media Company (or Companies)

Social Media Company is defined in the Rules as “any Person that owns or manages Online Systems for provision of social media”.

This provision may create legal liabilities on individuals associated with technology companies rather than targeting only the companies themselves.

This definition may also be used to illogically categorise digital companies that may not qualify as social media companies.

Section 4 (01) (II) – Integrity, Security, and Defence of Pakistan

This section extends the scope of illegal content to any information that can create “hatred”, “contempt”, or “disaffection” towards the government and public servants.

In effect, this means any information that discloses wrongdoing, criticises the government, or seeks to hold the government and public servants accountable for their actions can now legally be removed or blocked by the Pakistan Telecommunication Authority.

It is important to note that other sections of the Prevention of Electronic Crimes Act (PECA) 2016 – including Section 20 and 10A – that have similar provisions have been politically abused in the recent past, more specifically against the critics of the government policies and journalists.

Section 4 (01) (IV) – Decency and Morality (Section 292, PPC)

For decency and morality, the Rules refer to various sections of the Pakistan Penal Code, including Section 292, which criminalises the sale of obscene books and creates liabilities for anyone who sells, distributes, or benefits from the distribution of “obscene books, pamphlets, paper, drawing, painting, representation or figure or any other obscene object”.

The term ‘obscene’ is not defined, creating space for subjective and arbitrary interpretations.

In the recent past, we’ve seen that a number of digital content items have been taken offline or blocked through Section 37 using vague definitions of decency and morality, including but not limited to dating applications and online games.

Similarly, for decency and morality, the Rules refer to various sections of the Pakistan Penal Code including Section 294, which criminalises obscene acts and songs, including the singing, recitation, and uttering of songs, ballads, and words that may cause annoyance to the public.

This section is highly likely to lead to more arbitrary blocking of creative content online. 

Section 4 (02) – Precedence over Community Guidelines

This section holds that the Rules will prevail and take precedence over any contrary Community Guidelines of service providers and social media companies.

It assumes that big technology companies, such as Google, Facebook, and Twitter, will be willing to create a separate regulatory system for Pakistan. This is highly unlikely, and may also push digital companies out of Pakistan.

This possibility was already indicated by the Asia Internet Coalition, an alliance of tech companies.

It is also important to note that global technology companies including Google and Facebook – the members of the Asia Internet Coalition (AIC) – maintain that they have not been consulted in the process of policymaking and they remain apprehensive of the Rules. AIC members have also indicated that these Rules will make them “rethink their presence in Pakistan”.

Thus, the assumption that social media companies will simply comply with these demands – such as those in Section 4 (02) – is extremely uninformed and could put billion-dollar technology companies on a collision course with the Pakistani government, the fallout of which would be extremely damaging.

Section 6 (06) – Retention of Data regarding Access to Online Content

This section makes it mandatory for service providers and social media companies to retain information including traffic data linked to the blocked content if asked by the PTA.

There is no indication of how this data will be used, how long it would be retained, and/or why it is being collected. Given the lack of a data protection law, non-existent transparency, and history of targeting citizens who are vocal online, this clause may lead to abuse of power and targeted intimidation of citizens who use the Internet for information.

Section 7 – Obligations for Removal and Blocking

This section gives service providers, social media companies, and owners of information systems 6 to 24 hours to abide by PTA’s instructions to remove or block online content.

This timeline remains impractical for most international corporations, which deal with millions of users and massive amounts of content. It is very likely that big technology companies will resist these timelines.

More importantly, there remains an absolute lack of transparency in the content-takedown policies of the government and the PTA. Every year, the transparency reports of Facebook, Twitter, and other platforms indicate that hundreds of thousands of content items were removed at the request of the government; however, no data whatsoever is available to ascertain whether or not these takedown demands were justified.

Similarly, according to unconfirmed figures, nearly a million websites remain blocked in Pakistan with zero transparency. Provisions under Section 07 of the rules are highly likely to be used in a similar non-transparent manner, creating another layer of legal provisions for censorship.

Section 8 – Blocking of an Online System

The section indicates that the PTA, under Section 37 of PECA 2016, will have the authority to block entire online systems or any services provided by such service providers.

This means that the PTA can block not only platforms such as Facebook, but also Facebook’s associated applications, such as WhatsApp and Instagram.

This section is completely ultra vires, i.e. it extends PTA’s powers beyond the scope of the parent law (the Prevention of Electronic Crimes Act 2016).

This also raises concerns about extending the scope of ‘legalised censorship‘ in digital spaces.

Section 9 (02) – Other Obligations of Service Providers and Social Media Companies

Among the obligations placed on social media companies, the Rules require them to ‘create and display’ Community Guidelines as defined by the Rules, in addition to the companies’ own Community Guidelines.

As stated above, the global social networks are highly unlikely to abide by these conditions, which ask them to create a separate regulatory system for a country with a relatively small digital market. This section is likely to push companies to stop operating in Pakistan, which would be detrimental to the country’s digital economy.

Section 9 (03) – Deploying a Monitoring Mechanism

This section asks service providers and social media companies to deploy appropriate mechanisms to identify online content that should be blocked under these rules.

This indicates the creation of elaborate mechanisms to monitor user-generated content, user activity, and pre-emptive blocking and censorship of content without awaiting a complaint. This is an extremely dangerous provision with serious implications for both the freedom of expression and privacy of citizens, especially given the track record and the lack of transparency in content-takedown by the telecom regulator.

Section 9 (04) – Monitoring and Removal of Content by Private Actors

This section states that service providers and social media companies shall not knowingly host, display, upload, publish, transmit, update or share any online content barred under the Rules, through various sections and qualifiers, including extremely vague terms, such as decency and morality.

This section and its sub-sections indicate that the PTA aims to privatise censorship and get private actors, including service providers and social media companies, to make proactive decisions about content-takedowns through both automated and reporting mechanisms. 

This provision is directly in conflict with the constitutionally guaranteed rights of citizens of Pakistan with regards to free expression and access to information.

Section 9 (05) (a) – Registration of Service Providers and Social Media Companies

Social media companies with more than 500,000 users have to register with PTA within nine months.

PTA has neither the authority nor the legal framework to register the global social media companies locally either under PECA or under the Pakistan Telecommunication Authority (Re-organisation) Act through which the PTA was established as a telecom regulator.

This section is ultra vires and extends PTA’s authority beyond the jurisdiction provided in the parent laws. 

Section 9 (05) (b & c) – Office and Focal Person in Pakistan

The Rules indicate that it is now mandatory for social media companies to establish a registered office in Pakistan and appoint a focal person based in Pakistan.

Said obligations extend beyond the scope of the parent law (PECA 2016), which does not impose any such restrictions upon the social media companies, or any other intermediary.

On the contrary, PECA 2016 limits the liability of intermediaries under ‘Section 38 – Limitation of liability of service providers’, making this yet another instance where the Rules are ultra vires of the parent law.

In addition, these obligations completely overlook the reality of Pakistan’s security situation. It is, thus, another section that is likely to push global corporations out of Pakistan, creating a devastating impact on our digital economy.

Section 9 (05) (d) – Establishment of Data Servers in Pakistan

The Rules state that social media companies and global service providers must establish one or more database servers in Pakistan. Again, social media companies are unlikely to comply, as this obligates them to invest a substantial sum of money, significantly increasing the local cost of doing business for global digital companies.

As indicated in the statement issued by AIC, the global social media companies are likely to resist and ‘reassess their presence’ in Pakistan if need be, a situation that might cost Pakistan a massive setback to its digital economy.

In addition, given the history of abuse of power by local law enforcement, locally hosted personal data may be more vulnerable than data hosted elsewhere.

Section 9 (07) – Provision of Decrypted, Readable Data to FIA

This section requires social media companies to provide ‘decrypted, readable and comprehensible information’, including subscriber information, traffic data, and content data, among other kinds of information, as and when required.

The definition enables an interpretation that allows local law enforcement agencies to demand encrypted content data, including conversations on messaging apps and platforms such as WhatsApp. The Rules also indicate that a failure to provide decrypted data may result in a penalty of 500 million rupees (approximately 3 million US dollars) for social media companies.

There are no qualifiers for the conditions under which private information may be requested. There is no judicial process for accessing private information, and this clause remains a major threat to the right to privacy and the security of personal data.

Section 9 (09) – Monitoring of Live Streams

This section demands that service providers and social media companies deploy mechanisms to ‘ensure the prevention of uploading and live streaming of content related to hate speech, pornography, incitement of violence, and national security’.

Not only are these qualifiers totally undefined and subjective, but this clause also privatises censorship, asking private actors to make decisions about content restrictions and holding the private sector liable for user-generated content.

This clause has serious implications for freedom of expression and privacy.

Additionally, monitoring and regulating live streams during broadcast is a monumental task, one that private actors are unlikely to undertake.

Section 9 (10) – Legal Liabilities for Social Media Companies

This section allows for the imposition of a hefty fine on the service providers and social media companies in case of a failure of the implementation of the Rules. It is important to note that under the parent law (PECA 2016), PTA is not empowered to impose any such fine.

Thus, this section is ultra vires and delegates powers beyond the jurisdiction of the law. With this clause, the Rules also enter into the domain of legislation, which is the sole prerogative of the Parliament.

Additionally, this is in contradiction to Section 38 of PECA that limits the legal liability of service providers. Therefore, the rules go directly against the provisions and protections provided to service providers in PECA.


Overall, we find the Rules excessive in scope, contradictory in text, and detrimental to Pakistan’s digital growth. We also find them ultra vires of their parent law (PECA 2016), i.e. the Rules seem to delegate powers beyond the scope of the Prevention of Electronic Crimes Act 2016, and of Section 37 in particular, which is the basis of the said Rules.

The Rules seem to have, once again, put the Pakistani government and billion-dollar digital companies on a collision course. It is important to mention here that the global social media companies are essential to Pakistan’s digital economy and instrumental for nearly all businesses in Pakistan, especially SMEs and emerging digital startups. Any interruption in services, or in access to the global platforms, could cause irrecoverable damage.

We also believe that the Rules alone are not the problem. Section 37 in particular, and PECA 2016 in general – the legislation that the Rules build upon – are equally problematic, if not more so, and have been used as tools to suppress political dissent and target the media and journalists. Thus, amending or suspending the Rules alone will not suffice without a comprehensive, rights-based amendment of the PECA law itself.

We call upon the federal government in particular, and all the parliamentary parties in general, to address the flawed nature of Section 37, and of PECA 2016 itself. We urge them to take this opportunity to initiate a parliamentary discourse on the draconian nature of the Prevention of Electronic Crimes Act 2016 – Section 37 in particular – and to work together on the necessary amendments to PECA 2016, including Section 37.

Media Matters for Democracy (MMFD) has published a comprehensive statement on the Rules which can be seen here.