I have to agree with Mr. Lange.

There are two things that will probably make this strategy incredibly time-consuming and difficult to maintain.

1. Maintaining the same hosts file on every computer.
2. Maintaining the list of hosts in the hosts file.
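For anyone following along, the approach under discussion is a hosts file that points each unwanted domain at a non-routable address so lookups on that machine go nowhere. A minimal sketch (the domain names here are just illustrative, not an actual blocklist):

```
# /etc/hosts -- example blocking entries (domains are illustrative)
127.0.0.1   localhost
0.0.0.0     example-porn-site.com
0.0.0.0     example-gambling-site.com
0.0.0.0     facebook.com
0.0.0.0     thepiratebay.org
```

Every blocked domain needs its own line, on every machine, which is exactly where the two maintenance problems above come from.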

Especially when it comes to porn, the number of sites you would have to list is just overwhelming, and on top of that, they keep changing. New ones appear and old ones change domains all the time. I think you're creating a lot of work that could easily be avoided by purchasing software specifically intended for this purpose. Frankly, I find it hard to imagine a more time-consuming way of doing this.

Just my two cents. :)

Kind regards,
Helgi Hrafn Gunnarsson
helgi@binary.is

On Mon, Aug 9, 2010 at 10:01 AM, John Lange <john@johnlange.ca> wrote:
On Sat, 2010-08-07 at 17:13 -0500, Mike Pfaiffer wrote:

>       The teacher at the lab is absolutely sold on the idea of using the
> hosts file to block out places like porn sites, gambling sites,
> facebook, and places like the pirate bay. For classroom machines I can
> see his point. Particularly since they are going through a different
> subnet than we are.

I don't think I can agree with this strategy. It would take about two
seconds for a user to delete the hosts file, thereby defeating the
blocking. The reality is that there is always a way around the filters,
but blocking at the firewall at least in theory prevents users from
disabling it.

Another problem I see with manipulating "hosts" is that you'd have to
maintain it on every machine, which seems like a lot of work.
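To give a sense of what that maintenance would look like, here is a rough sketch of regenerating a hosts file from a blocklist and pushing it out. The file names, domains, and lab machine names are all made up for illustration; this is not a recommendation, just the kind of script someone would end up babysitting.

```shell
#!/bin/sh
# Hypothetical sketch: rebuild a hosts-based blocklist from a plain
# text file of domains, one per line. All names here are examples.
printf 'example.com\nexample.org\n' > blocklist.txt   # stand-in blocklist

# Start from a minimal stock hosts file, then map each blocked
# domain to 0.0.0.0 so lookups on the machine resolve to nowhere.
printf '127.0.0.1\tlocalhost\n' > hosts.generated
while read -r domain; do
    printf '0.0.0.0\t%s\n' "$domain" >> hosts.generated
done < blocklist.txt

# Copying the result onto every lab machine is the painful part,
# and it has to happen again after every blocklist change, e.g.:
# for host in lab01 lab02 lab03; do
#     scp hosts.generated "root@$host:/etc/hosts"
# done
```

And none of that stops a user from simply overwriting /etc/hosts afterwards, which is the other half of the objection.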

--
John Lange
http://www.johnlange.ca


_______________________________________________
Roundtable mailing list
Roundtable@muug.mb.ca
http://www.muug.mb.ca/mailman/listinfo/roundtable