Module 2: Restricting Access and Content

Overview of censoring, blocking and filtering of content

Access to information, and increasingly access to knowledge, is a central tenet of the internet.  However, efforts to restrict access have developed in step with the internet's growth.  Technical measures are being implemented by state and non-state actors to limit, influence, monitor and control people's access to the internet.  These measures include censoring, blocking, filtering and monitoring content.  While these measures may not be as extreme as complete internet shutdowns, they can equally hinder the full enjoyment of the right to freedom of expression.

Censorship and blocking: Typically refers to the prevention of access to specific websites, domains, IP addresses, protocols or services included on a blacklist.(1) Justifications for blocking often include the need to prevent access to illegal content, or content that is a threat to public order or is objectionable for a particular audience.(2)

Filtering: Generally refers to restricting or limiting access to information (or related services) that is either illegal in a particular jurisdiction, is considered a threat to public order, or is objectionable for a particular audience.  Filtering can relate to the use of technology that blocks pages by reference to certain characteristics, such as traffic patterns, protocols or keywords, or based on their perceived connection to content deemed inappropriate or unlawful.

Note: This distinction might be considered semantic, but it can also be viewed as a matter of scale and perspective.  The key commonality is that both limit access to content on the internet.(3)

As explained by ARTICLE 19, there are different ways in which access to content can be restricted, for example (see the illustrative sketch after this list):(4)

  • URL blocking blocks a specific web page.
  • IP address blocking prevents connection to a host.
  • DNS tampering blocks entire domain names.
  • Blacklisting compiles a list of URLs to be filtered, while whitelisted URLs are not subject to blocking or filtering.
  • Keyword blocking is generally used to enable the blocking of specific categories of content.
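These techniques differ mainly in granularity: a URL rule removes a single page, while an IP or domain rule removes every page sharing that host.  The following minimal sketch (not drawn from ARTICLE 19 or any real filtering product; all URLs, addresses, domains and keywords are hypothetical) shows how the techniques above might be combined in a simplistic filter:

```python
# Toy sketch of the blocking techniques listed above. All URLs, IP addresses,
# domains and keywords are hypothetical; real censorship systems operate in
# network infrastructure (DNS resolvers, routers, middleboxes), not in
# application code like this.
from urllib.parse import urlparse

BLACKLISTED_URLS = {"http://example.org/banned-page"}    # URL blocking: one page
BLACKLISTED_IPS = {"203.0.113.7"}                        # IP blocking: a whole host
BLACKLISTED_DOMAINS = {"blocked-example.net"}            # DNS tampering: a whole domain
WHITELISTED_URLS = {"http://example.org/allowed-page"}   # whitelist: never filtered
BLOCKED_KEYWORDS = {"forbidden-topic"}                   # keyword blocking: a category

def is_blocked(url: str, resolved_ip: str, page_text: str) -> bool:
    """Return True if any of the toy blocking rules match."""
    if url in WHITELISTED_URLS:                        # whitelist overrides all rules
        return False
    if url in BLACKLISTED_URLS:                        # URL blocking
        return True
    if resolved_ip in BLACKLISTED_IPS:                 # IP address blocking
        return True
    if urlparse(url).hostname in BLACKLISTED_DOMAINS:  # DNS-level blocking
        return True
    # Keyword blocking: match categories of content in the page text
    return any(kw in page_text.lower() for kw in BLOCKED_KEYWORDS)

if __name__ == "__main__":
    print(is_blocked("http://example.org/banned-page", "198.51.100.1", "harmless text"))        # True
    print(is_blocked("http://example.org/other", "198.51.100.1", "a forbidden-topic article"))  # True
```

The sketch also shows why over-blocking is easy: an IP or domain rule sweeps in every page that shares the host, which is precisely the collateral effect at issue in Ahmet Yıldırım v Turkey, discussed below.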

The rise of disinformation has also contributed to an increase in blocking and filtering, with states trying to mitigate the spread of false information and, in some instances, legislating to permit blocking and filtering in order to prohibit and punish the dissemination of false or inaccurate statements.

Applicable international human rights standards

The same general considerations relating to access, online rights and freedom of expression discussed above are applicable here, alongside specific considerations relating to filtering and blocking.  In 2011, in a Joint Statement on Freedom of Expression and the Internet, a collective of Special Rapporteurs and experts stated the following in relation to filtering and blocking:

  • Mandatory blocking of entire websites, IP addresses, ports, network protocols or types of uses (such as social networking) is an extreme measure – analogous to banning a newspaper or broadcaster – which can only be justified in accordance with international standards, for example, where necessary to protect children against sexual abuse.
  • Content filtering systems which are imposed by a government or commercial service provider and which are not end-user controlled are a form of prior censorship and are not justifiable as a restriction on freedom of expression. 
  • Products designed to facilitate end-user filtering should be accompanied by clear information to end-users about how they work and their potential pitfalls in terms of over-inclusive filtering.

In a 2016 Report, the UN Special Rapporteur on Freedom of Expression (UNSR on FreeEx) explained that:

“States often block and filter content with the assistance of the private sector. Internet service providers may block access to specific keywords, web pages or entire websites.  On platforms that host content, the type of filtering technique depends on the nature of the platform and the content in question.  Domain name registrars may refuse to register those that match a government blacklist; social media companies may remove postings or suspend accounts; search engines may take down search results that link to illegal content.  The method of restriction required by Governments or employed by companies can raise both necessity and proportionality concerns, depending on the validity of the rationale cited for the removal and the risk of removal of legal or protected expression.

Ambiguities in State regulation coupled with onerous intermediary liability obligations could result in excessive filtering.  Even if content regulations were validly enacted and enforced, users may still experience unnecessary access restrictions.  For example, content filtering in one jurisdiction may affect the digital expression of users in other jurisdictions.  While companies may configure filters to apply only to a particular jurisdiction or region, there have been instances where they were nevertheless passed on to other networks or areas of the platform.”

Blocking and filtering in Ethiopia

Ethiopia has in the past been regarded as problematic in relation to its use of blocking and filtering.  Between 2012 and 2018, hundreds of websites were blocked, including the websites of LGBTIQ organisations, media outlets and CSOs such as the Electronic Frontier Foundation.  In 2017, during a spate of anti-government protests, Facebook, Twitter, WhatsApp, and Dropbox were frequently blocked.

In 2018, Freedom House noted that, with the change of regime, over 250 websites were unblocked.  Despite this, politically motivated blocking and filtering remain a threat in Ethiopia.  As of 2019, Freedom House confirmed that there were still no procedures for determining which websites are blocked or for appealing blocking decisions.

Blocking and filtering in Turkey

Turkey’s government has recently received sustained criticism for the “systematic actions the Turkish government has taken to restrict Turkey’s media environment, including closing media outlets, jailing media professionals, and blocking critical online content.”(5) In 2018, Freedom House found that over 3,300 URLs containing news items were blocked.

In 2019, the Wikimedia Foundation, which owns and operates Wikipedia, petitioned the European Court of Human Rights (ECtHR) in relation to the blocking of Wikipedia in Turkey.  While the petition was still pending, the Turkish Constitutional Court ruled that blocking Wikipedia was unconstitutional, and in January 2020 the Turkish government restored access.

Concerns of blocking and filtering in 2020 Togolese Elections

Presidential elections were held in Togo in February 2020.  There were heightened tensions in the lead-up to the elections, with protests against the Gnassingbé family’s 53-year rule.  In a joint letter to the government of Togo, CSOs noted their concerns that the Togolese government would restrict access to the internet during the elections.  Reports suggest that it is highly likely that social media platforms such as WhatsApp, Telegram, and Facebook Messenger were blocked on election day.

Blocking and filtering remain a contemporary concern.  While blocking and filtering may be justifiable in limited instances, the trend is toward unjustifiable blocking and filtering, with limited guidance to the public and little to no regulation or oversight of the state.(6)

Unjustifiable limitations

There may be circumstances in which measures such as the blocking and filtering of content are justifiable.  The protection of children’s rights may be one such justification: blocking and filtering techniques can be developed and used to prevent the proliferation of, and exposure to, damaging material and to protect children from harmful and illegal content.  However, despite this important purpose, UNICEF’s 2017 Report on ‘Children’s Rights and Business in a Digital World: Freedom of Expression, Association, Access to Information and Participation’ recognised the inherent concerns around blocking and filtering, including a lack of transparency; the indiscriminate nature of filters; the lack of evidence to show where and when they have been deployed; and the threat of legitimate content being limited.(7) The children’s rights example illustrates that even where there appears to be a legitimate purpose, rights can be unduly limited if the elements of legality, necessity and proportionality are not thoroughly and independently tested.

As discussed above, and as with all limitations of the right to freedom of expression, restrictions are only permissible if they are provided by law, pursue a legitimate aim and conform to the strict tests of necessity and proportionality.  In terms of “blanket” or “generic” bans, the UN Human Rights Committee’s 2011 General Comment No. 34 found that “generic bans on the operation of certain sites and systems are not compatible” with article 19 of the ICCPR.  Where restrictions constitute “generic” bans, they will generally amount to an infringement of the right to freedom of expression.

In digital rights litigation, practitioners would do well to test all tenets of the limitations analysis before determining the appropriateness or otherwise of an imposed restriction.  The ECtHR’s 2012 decision in Ahmet Yıldırım v Turkey provides guidance on the limitations analysis in relation to blocking and filtering.

Case note: Ahmet Yıldırım v Turkey

The applicant owned and ran a website on which he published his academic work and his views on various topics.  In 2009, the Denizli Criminal Court in Turkey ordered the blocking of another website, whose owner was accused of insulting the memory of Atatürk, as a preventive measure in the context of criminal proceedings.  The Court subsequently ordered the blocking of all access to Google Sites, the platform hosting both websites, as this was the only technical means of blocking the offending website.  The applicant unsuccessfully tried to have the blocking order lifted, and applied to the ECtHR, submitting that the blocking of Google Sites amounted to indirect censorship.

The ECtHR held that the impugned measure amounted to a restriction stemming from a preventive order blocking access to a website.  The ECtHR found that the impugned measure produced arbitrary effects and could not be said to have been aimed solely at blocking access to the offending website, since it consisted of the wholesale blocking of all websites hosted by Google Sites.

The ECtHR reasoned that specific legal provisions are necessary, as general provisions and clauses governing civil and criminal responsibility do not constitute a valid basis for ordering internet blocking.  Relying on General Comment 34, the Joint Declaration on Freedom of Expression and the Internet and the 2011 UNSR FreeEx Report, the concurring opinion of Judge Pinto de Albuquerque went further, stating:

“In any case, blocking access to the Internet, or parts of the Internet, for whole populations or segments of the public can never be justified, including in the interests of justice, public order or national security.  Thus, any indiscriminate blocking measure which interferes with lawful content, sites or platforms as a collateral effect of a measure aimed at illegal content or an illegal site or platform fails per se the “adequacy” test, in so far as it lacks a “rational connection”, that is, a plausible instrumental relationship between the interference and the social need pursued.  By the same token, blocking orders imposed on sites and platforms which remain valid indefinitely or for long periods are tantamount to inadmissible forms of prior restraint, in other words, to pure censorship.”

Furthermore, the ECtHR held that the judicial review procedures concerning the blocking of websites in Turkey were insufficient to meet the criteria for avoiding abuse, as Turkish domestic law did not provide for any safeguards to ensure that a blocking order in respect of a specific website was not used as a means of blocking access in general.  Accordingly, the ECtHR found that there had been a violation of the right to freedom of expression.

Similar considerations to those relating to litigation in respect of internet shutdowns are applicable in the context of blocking and filtering.  However, there are further practical considerations that might be of use to potential litigators and activists.

Tips for measuring restrictions

The Open Observatory of Network Interference (OONI) is a useful, free resource that detects censorship and traffic manipulation on the internet.  Its software can help measure:

  • Blocking of websites.
  • Blocking of instant messaging apps (WhatsApp, Facebook Messenger and Telegram).
  • Blocking of censorship circumvention tools (such as Tor).
  • Presence of systems (middleboxes) in your network that might be responsible for censorship and/or surveillance.
  • Speed and performance of your network.

This tool can be a helpful way to collect data that can be used as evidence of restrictions to access.
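Before deploying OONI's own tooling, a litigator or activist can obtain a first rough signal with a few lines of scripting.  The sketch below is not OONI's methodology; it simply attempts an HTTP fetch for a list of hypothetical URLs and records failures, which is at most weak preliminary evidence that would need corroboration through tools such as OONI Probe:

```python
# Minimal reachability check: NOT OONI's methodology, just a crude first-pass
# signal. A failed fetch can have many innocent causes (outages, local
# misconfiguration), so results need corroboration before being treated as
# evidence of blocking.
import urllib.request
import urllib.error

# Hypothetical URLs to test; replace with the sites of interest.
TEST_URLS = [
    "https://example.com/",
    "https://example.org/",
]

def check(url: str, timeout: float = 10.0) -> str:
    """Try to fetch the URL and describe the outcome."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return f"reachable (HTTP {resp.status})"
    except urllib.error.HTTPError as e:
        return f"server responded with an error (HTTP {e.code})"
    except (urllib.error.URLError, TimeoutError) as e:
        return f"unreachable ({e})"

if __name__ == "__main__":
    for url in TEST_URLS:
        print(url, "->", check(url))
```

Persistent, reproducible failures for specific sites, while control sites remain reachable from the same network, are the kind of pattern that OONI's tests are designed to detect and document rigorously.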

Conclusion

Activists and litigators should remain vigilant in relation to blocking and filtering and, where necessary, apply the principles of legality, proportionality and necessity to establish when the restriction of content amounts to a rights violation.  As international pressure against internet shutdowns mounts, litigators should be cognisant that blocking and filtering may increase as an alternative means of restricting the free flow of information.
