We’ve all probably seen goatse, tubgirl and a bunch of other repugnant content on the Internet since we first got online. I bet one of the first sites many of you were told to visit was Rotten.com. And you probably went there, even after you were told what you could expect to see on that site. It’s human nature. We’re all inquisitive beings and we all have different tastes.
I also bet that most of you who went to that aforementioned site probably never went there again, and you probably also hoped that your kids would never get to see it. Yet the chances are they already have, and either got grossed out or visit it regularly out of some morbid fascination. So would you want a filter to potentially stop the fruit of your loins from viewing such content? Or would you let them roam free on the Internet and hope that, if they see it, they’ll form their own opinion, and perhaps never go there again? The government wants to make that choice for you and your kids.
Its Internet filtering proposal involves restricting flagged content at the ISP level, using dynamic filters. A list of sites will be inaccessible by Australians, but this list of sites won’t be made public — so will the site you don’t want your kids to see be on it and vice versa?
Imagine the increase in calls to ISPs when users ring up complaining about not being able to reach a certain site, and the ISP not knowing how to answer, because staff don’t know whether that site is on the blacklist or not (and if they do know, they probably can’t say for fear of breaking the law!). Will an attempt to access a blacklisted site cause the user’s IP address to be logged, with authorities swooping on their front door to take them in for questioning?
In fact, there will be two blacklists created by the government: one list will contain sites that are not suitable for minors, while another list will contain sites not suitable for adults. So if porn is deemed inappropriate for minors, all of it should be blocked, right? What about violent acts as reported in the news? Isn’t violence inappropriate? What about sites on severe body modification and tattooing? And if this content gets blocked for kids, doesn’t that mean it also automatically gets blocked for adults? Or will we be able to access it if we pay a tax using our credit card?
The second blacklist is perhaps more disturbing in that it essentially tells us that the government thinks we, as adults, can’t distinguish appropriate from inappropriate content ourselves. We all like to think that we’re old enough to make decisions on what we want and don’t want to see on the Internet, and let’s face it, most of us have seen very inappropriate content on the Internet and it hasn’t harmed us in any significant way. We’ve become desensitised to a lot of things — seeing dismembered bodies as the result of war; seeing abnormal sexual acts being performed that we never want to see again; seeing planes fly into buildings and people falling from buildings... oh, right, that was actually shown live on TV.
Why is it that all of a sudden the government thinks we need to be protected from nasties on the Internet? The Howard government spent $85 million on an Internet filter for users to download and run on their own computers, and spent another $15 million advertising it. It was created as an option for anyone who wanted to censor the Internet in their own home. It wasn’t law. However, only approximately 140,000 people downloaded this filter in its first 12 months of availability, which perhaps indicates that the Australian public didn’t really want or need it in the first place. Yet now the government thinks that the entire populace needs to be protected — mostly from sexual predators. Whatever happened to teaching kids about stranger danger, and even supervising them while they are online?
A mandatory ISP-level filter designed to dynamically block inappropriate content (and apparently sexual predators) will likely slow down the entire Internet experience for all of us as a consequence, and no doubt filter legitimate content in the process. There needs to be major uproar over this plan.
If Internet censorship in this country is to sit at the ISP level, it’s unlikely that ISPs can be made to comply unless legislative changes are made. But the government shouldn’t be putting so much pressure on ISPs to regulate content if it feels the content is so bad to begin with. Rather than leaving it up to the ISPs themselves to implement the blacklist of sites it doesn’t want us to see, why doesn’t it route traffic through a huge proxy, filter out all the content that’s likely to harm us, and then let it flow from the ISPs to us? That’s what countries like Saudi Arabia do.
Or how about the government just lets us decide what to view for ourselves, rather than trying to ‘nanny’ us? The fact of the matter is, most routers come with content filtering built in. Parents already have the ability to restrict content for themselves and their kids if they want to. It’s their choice. It’s true that some routers are harder to set up than others, so less tech-savvy users probably can’t configure these filters effectively. A lack of technical knowledge is probably part of the reason the Howard government’s filter saw so few downloads. So perhaps the government needs to take a different tack: instead of restricting content for the entire populace, why not offer a service for less tech-savvy users, in which professionals can be booked to make a house call, set up the filters, and educate the parents on how to use them?
The task of mass-filtering the Internet is a magnificently tricky one and is doomed to fail if not done correctly. Will the filters be based on keywords, specific sites or entire domains? What about newsgroup content, chat rooms, e-mail and peer-to-peer traffic? The Web isn’t the only place to view inappropriate content, so these will have to be regulated, too. ISPs have already bowed to pressure from copyright holders; they shut down their binary newsgroups for fear of being caught accidentally hosting copyrighted material uploaded by anonymous users, which also filtered out a multitude of legitimate content in the process — screenshots of troubleshooting steps on technical newsgroups, for example.
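To see why the keyword-versus-blacklist question matters, here’s a minimal, hypothetical sketch in Python of the two approaches a filter might take. The domain names and keyword rules are invented for illustration and don’t reflect any actual government or ISP implementation; the point is that matching on keywords in a URL overblocks in ways a curated domain blacklist doesn’t.

```python
from urllib.parse import urlparse

# Hypothetical blacklist and keyword rules, purely for illustration.
BLOCKED_DOMAINS = {"example-blocked-site.invalid"}
BLOCKED_KEYWORDS = {"sex"}


def blocked_by_domain(url: str) -> bool:
    """Block only if the URL's host is (or is under) a blacklisted domain."""
    host = urlparse(url).hostname or ""
    return host in BLOCKED_DOMAINS or host.endswith(
        tuple("." + d for d in BLOCKED_DOMAINS)
    )


def blocked_by_keyword(url: str) -> bool:
    """Block if any flagged keyword appears anywhere in the URL."""
    return any(keyword in url.lower() for keyword in BLOCKED_KEYWORDS)


# A blacklisted site is caught either way:
blocked_by_domain("http://example-blocked-site.invalid/page")   # True
blocked_by_keyword("http://example-blocked-site.invalid/sex")   # True

# But naive keyword matching also trips on perfectly innocent URLs,
# because "sex" is a substring of "essex":
blocked_by_keyword("http://www.essex-council.example/services")  # True (false positive)
blocked_by_domain("http://www.essex-council.example/services")   # False
```

Scale that false positive up to every URL an entire country requests and the collateral damage of a keyword-based dynamic filter becomes obvious — which is exactly why an undisclosed blacklist plus dynamic filtering invites overblocking of legitimate content.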
If implemented, these filters will once again shift the responsibility away from the parents, whose job it is to raise their children in a loving environment and to teach them to be aware of the dangers around them. Granted, a lot of parents out there don’t have a clue how to raise their kids in such a way, but the entire nation shouldn’t have its Internet access restricted because of a responsibility-challenged minority. Let’s hope the government’s master plan is rejected in the Senate, so that we can continue to make up our own minds as to what we want to see on a free and unfettered Internet. Let’s not become China.