Student Voices: Content Filtering — A Successful Japanese Model?

The Internet is like an oversized encyclopedia: it holds enormous amounts of information on almost every topic, and anyone can access and use it. Despite these benefits, however, it also contains content that nobody wants to see, or at least should not see. Some of it sets dangerous traps that can alter one's life for the worse or corrupt the minds of children. Such content should be eliminated, or at least filtered for those who may be severely affected by it. Many content-filtering programs, such as “SafeSquid”, “SurfWatch”, “NetNanny”, and others, attempt to protect these users.

Such software is used in many settings, including public schools, libraries, and home computers. There are also organizations that help regulate content filtering for phones and other network services used by minors. Some countries, such as China, choose to filter content at the state level.

However, these solutions raise several problems, both technical and moral. Many critics point out that government filtering of viewpoints on moral and political issues is the first step toward propaganda. They also note that when private companies produce filtering software, those companies are free to censor whatever they want. This is evidenced by software that blocks content it classifies as religious, anti-religious, or political, at the discretion of the company's owners. The software “X-Stop” was shown to block Quaker websites, the National Journal of Sexual Orientation Law, the Heritage Foundation, and The Ethical Spectacle. Another program, “CYBERsitter”, blocks sites such as that of the National Organization for Women. “CyberPatrol”, developed by the Anti-Defamation League and Mattel's The Learning Company, has been found to block political sites it deems to be engaging in hate speech, as well as human-rights websites, such as Amnesty International's page about Israel, and gay-rights websites. In the United States, many public schools and libraries use the same filtering software used by many Christian organizations.

The technical problems of filtering include two kinds of errors: overblocking, where sites containing no harmful information are nevertheless censored as dangerous, and underblocking, where harmful content slips through, typically because new content is uploaded to the Internet faster than the software company can update its blocklists.
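Both kinds of errors can be seen in even the simplest filter design. Below is a minimal, purely illustrative sketch (the blocklist term and page titles are invented examples, not from any real product) of a naive substring-based filter, showing how it overblocks innocent pages and underblocks anything its list does not anticipate:

```python
# A hypothetical, deliberately naive filter: block any page whose title
# contains a blocklisted term as a substring.
BLOCKLIST = ["sex"]  # example blocklist entry, for illustration only

def is_blocked(title: str) -> bool:
    """Return True if any blocklist term appears as a substring of the title."""
    lowered = title.lower()
    return any(term in lowered for term in BLOCKLIST)

# A genuinely targeted page is blocked...
print(is_blocked("Explicit sex content"))               # True
# ...but so is a legal-studies journal (overblocking),
# because "sexual" contains the substring "sex":
print(is_blocked("Journal of Sexual Orientation Law"))  # True
# Meanwhile, harmful material that avoids the listed term
# passes straight through (underblocking):
print(is_blocked("Totally innocent-looking spam page")) # False
```

Real filtering products use far more elaborate rules, but the underlying trade-off is the same: widening the blocklist increases overblocking, while narrowing it increases underblocking.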

These examples make it clear that private companies are unreliable arbiters of what counts as dangerous content. The best solution is to leave the responsibility for censorship in the hands of parents. However, not all parents are familiar with computers, much less with blocking sites (which can be done even without filtering software). The second-best option, therefore, is a comprehensive organization of experts from different fields (to ensure a range of perspectives) who judge what should be permitted and what should be censored in their country. Japan already has something similar: the Content Evaluation and Monitoring Association (EMA), formed in 2008, which focuses on filtering and monitoring mobile sites aimed at young people.
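As an aside, blocking a site without any filtering software, as alluded to above, is commonly done with a hosts-file entry that maps the unwanted domain to the local machine (the domain below is a placeholder, not a real site):

```
# /etc/hosts on Unix-like systems
# (Windows: C:\Windows\System32\drivers\etc\hosts)
# Mapping a domain to the loopback address makes it unreachable:
127.0.0.1   example-unwanted-site.com
```

This only affects the one computer whose hosts file is edited, and a technically savvy child can undo it, which is part of why the essay argues that many parents need outside help.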

Unlike some commercial filtering software, this organization does not filter content related to subjects such as religion, politics, and homosexuality, most likely because these are not directly harmful (unless they amount to extremist speech). These topics are subjective, and every individual should learn and think about them before reaching a conclusion. Although relatively new, the organization has already had considerable success in reducing the number of child victims of crime, and hopefully other countries (especially the U.S., where school shootings are a problem) will follow the same pattern in the near future.