At the lab the teacher advocates using the Windows equivalent of the /etc/hosts file to prevent access to certain sites from classroom computers. He and I have been having an ongoing chat about this for a few months. I've been reading up on the way the file is used to redirect requests to a different address (e.g., 127.0.0.1). Is there a difference in the way Windows parses the file compared to Linux?
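For reference, the Windows counterpart of /etc/hosts lives at %SystemRoot%\system32\drivers\etc\hosts and uses the same layout: an IP address followed by one or more host names per line, with # starting a comment. A minimal illustration (the names here are hypothetical):

127.0.0.1    localhost
127.0.0.1    ads.example.com      # requests for this name go to the local machine
192.168.0.5  labserver            # an ordinary static entry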
One reason for the above question is I was thinking it might be useful to redirect requests to advertising sites to 127.0.0.1 to speed up access on days when things seem to crawl. One article I read on Digg suggested a lot of the wait time for web pages was due to slow and misconfigured ad servers. I found one site which has example files which are updated every so often. I tried one and I got almost nothing when surfing the web. Using that file as a pattern I created a smaller version which works well with the Chromium browser but fails to display text in Firefox.
These are the lines I've added. Yes I know there are duplicates.
127.0.0.1 media.fastclick.com media.fastclick.net
127.0.0.1 *.tribalfusion.com a.tribalfusion.com
127.0.0.1 cdn.optmd.com
127.0.0.1 ad.doubleclick.com ad.doubleclick.net *doubleclick.net googleads.g.doubleclick.net
127.0.0.1 as.casalemedia.com
127.0.0.1 ads.adsonar.com
127.0.0.1 seeker.dice.com
127.0.0.1 townhall.com
127.0.0.1 s3.amazonaws.com
127.0.0.1 pixel.quantserv.com
127.0.0.1 st.blogads.com
127.0.0.1 *.rackspacecloud.com
127.0.0.1 js.adsonar.com
127.0.0.1 ads.pointroll.com
Would the "*" in the domain name cause problems? Like I said, I used the Windows file as an example.
Later Mike
I don't know the answer to your question, but it seems easy enough to find out:
(Windows XP SP3)
C:\>ping foo.hhjjhhjjhh.com
Ping request could not find host foo.hhjjhhjjhh.com. Please check the name and try again.

C:\>echo 127.0.0.1 *.hhjjhhjjhh.com > c:\windows\system32\drivers\etc\hosts

C:\>ping foo.hhjjhhjjhh.com
Ping request could not find host foo.hhjjhhjjhh.com. Please check the name and try again.

C:\>echo 127.0.0.1 foo.hhjjhhjjhh.com >> c:\windows\system32\drivers\etc\hosts

C:\>ping foo.hhjjhhjjhh.com
Pinging foo.hhjjhhjjhh.com [127.0.0.1] with 32 bytes of data:
Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
...
I'll leave it as an exercise to you to test the same on Linux.
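For reference, a sketch of how that test might look on a glibc-based Linux box; the responses shown are what one would expect rather than captured output, and getent exercises the same resolver path that ping uses:

$ getent hosts foo.hhjjhhjjhh.com
(no output: the name does not resolve)

# echo '127.0.0.1 *.hhjjhhjjhh.com' >> /etc/hosts
$ getent hosts foo.hhjjhhjjhh.com
(still no output: glibc treats the * as a literal character, not a wildcard)

# echo '127.0.0.1 foo.hhjjhhjjhh.com' >> /etc/hosts
$ getent hosts foo.hhjjhhjjhh.com
127.0.0.1       foo.hhjjhhjjhh.com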
BTW, why are you blocking Amazon S3?
Sean
On 10-08-07 01:19 PM, Sean Walberg wrote:
I don't know the answer to your question, but it seems easy enough to find out:
(Windows XP SP3)
...
That never occurred to me. I'll give it a try when I get back.
I'll leave it as an exercise to you to test the same on Linux.
That may resolve one problem. Thanks.
BTW, why are you blocking Amazon S3?
It was appearing as an ad server when Chromium was loading an unrelated web page. Besides, I don't order anything from Amazon. I thought the domain was actually amazonaws.com and unrelated to Amazon...
FYI, that example I mentioned earlier had listed over 1500 ad servers.
Later Mike
On Sat, Aug 7, 2010 at 1:51 PM, Mike Pfaiffer <high.res.mike@gmail.com> wrote:
BTW, why are you blocking Amazon S3?
It was appearing as an ad server when Chromium was loading an unrelated web page. Besides, I don't order anything from Amazon. I thought the domain was actually amazonaws.com and unrelated to Amazon...
S3 is Amazon's "disk in the cloud" offering. Many people store static assets on S3 and use it as a cheap CDN. There might be some ads there, but there's more likely to be someone storing the images for their blog there, too. It would be like blocking everything under google.com because you don't like seeing AdSense content-network ads.
I probably should have mentioned this in my original response, but fscking around with the hosts file is generally a bad idea. If you want to block ads, run the Adblock plugin.
Sean
On 10-08-07 03:43 PM, Sean Walberg wrote:
S3 is Amazon's "disk in the cloud" offering. Many people store static assets on S3 and use it as a cheap CDN. There might be some ads there, but there's more likely to be someone storing the images for their blog there, too. It would be like blocking everything under google.com because you don't like seeing AdSense content-network ads.
Oh. That makes sense. I would guess that since they would be loading external images from there, blocking the original ad server would suffice.
I probably should have mentioned this in my original response, but fscking around with the hosts file is generally a bad idea. If you want to block ads, run the Adblock plugin.
That's already included in Firefox, isn't it? Then I already have that running. Unfortunately Firefox doesn't always block popups. It will get about 80% of them, but the remaining 20% can be just as annoying. Then there are the banner ads too. The increase in speed from blocking just a few of their servers is noticeable.
Later Mike
On Sat, Aug 7, 2010 at 4:27 PM, Mike Pfaiffer <high.res.mike@gmail.com> wrote:
That's already included in Firefox, isn't it? Then I already have that running. Unfortunately Firefox doesn't always block popups. It will get about 80% of them, but the remaining 20% can be just as annoying. Then there are the banner ads too. The increase in speed from blocking just a few of their servers is noticeable.
I don't know what kind of sites you visit, but I don't get many popups these days. Now that my day job involves Internet marketing, I'm actually interested to see how people monetize their sites and what's being advertised out there, so I don't use any ad blocking.
Sean
On 10-08-07 04:49 PM, Sean Walberg wrote:
I don't know what kind of sites you visit, but I don't get many popups these days. Now that my day job involves Internet marketing, I'm actually interested to see how people monetize their sites and what's being advertised out there, so I don't use any ad blocking.
It's going to sound childish for someone my age... I visit comic strip pages, for example comics.com and gocomics.com. I really admire people who can express themselves through drawing. I was commenting to one of the sysadmins how their site was slow. I suggested they were running Windows. They got indignant and said they were running Solaris. ;-) As I said, blocking the ad sites makes a noticeable difference in page load times. Of course other sites like Digg seem to be having generic problems regardless of the ad sites I block.
The teacher at the lab is absolutely sold on the idea of using the hosts file to block out places like porn sites, gambling sites, Facebook, and places like The Pirate Bay. For classroom machines I can see his point. Particularly since they are going through a different subnet than we are.
Later Mike
On Sat, Aug 7, 2010 at 5:13 PM, Mike Pfaiffer <high.res.mike@gmail.com> wrote:
The teacher at the lab is absolutely sold on the idea of using the hosts file to block out places like porn sites, gambling sites, Facebook, and places like The Pirate Bay. For classroom machines I can see his point. Particularly since they are going through a different subnet than we are.
Doing that at the DNS level misses out on proxies, any site with DNS wildcards, and also relies on the application to Do The Right Thing. Your friend might want to spend some time configuring Squid in transparent proxy mode. There's also the squidGuard redirector that makes blocking bad sites easier.
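For anyone wanting to try that, a minimal sketch of the pieces, assuming a Squid 2.6-era install on a Linux gateway (the "transparent" keyword became "intercept" in Squid 3.1, and the paths are guesses that vary by distro):

# /etc/squid/squid.conf (fragment)
http_port 3128 transparent
# hand each requested URL to squidGuard, which checks it against its blacklists
url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf

# on the gateway, steer the classroom's outbound web traffic into Squid
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j REDIRECT --to-ports 3128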
Sean
On 10-08-07 05:46 PM, Sean Walberg wrote:
Doing that at the DNS level misses out on proxies, any site with DNS wildcards, and also relies on the application to Do The Right Thing. Your friend might want to spend some time configuring Squid in transparent proxy mode. There's also the squidGuard redirector that makes blocking bad sites easier.
Quite correct. The teacher spends his time going through and looking for proxies when the students access the "inappropriate sites". Whatever makes him happy...
The Squid proxy we have in the lab is not accessible from the classroom. The classroom has been converted to a wireless system which is administered by the building's IT folks. They already have Barracuda running but there appear to be problems. The building's IT folks have put the lab in the DMZ and let us administer it ourselves.
Later Mike
On Sat, 2010-08-07 at 17:13 -0500, Mike Pfaiffer wrote:
The teacher at the lab is absolutely sold on the idea of using the hosts file to block out places like porn sites, gambling sites, Facebook, and places like The Pirate Bay. For classroom machines I can see his point. Particularly since they are going through a different subnet than we are.
I don't think I can agree with this strategy. It would take about 2 seconds for the user to delete their hosts file, thereby defeating the blocking. The reality is there is always a way around the filters, but blocking at the firewall at least in theory prevents users from disabling it.
Another problem I see with manipulating "hosts" is that you'd have to maintain it on every machine, which seems like a lot of work.
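To illustrate the firewall approach John mentions, the rules on a Linux gateway might look something like this (a sketch with hypothetical targets; note that iptables resolves a host name to its current addresses once, when the rule is added, so even a list like this needs upkeep):

iptables -A FORWARD -p tcp -d www.facebook.com --dport 80 -j REJECT
iptables -A FORWARD -p tcp -d thepiratebay.org --dport 80 -j REJECT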
On 10-08-09 10:01 AM, John Lange wrote:
I don't think I can agree with this strategy. It would take about 2 seconds for the user to delete their hosts file, thereby defeating the blocking. The reality is there is always a way around the filters, but blocking at the firewall at least in theory prevents users from disabling it.
The firewall used in the classroom is out of our control. The building's IT folks are trying to keep on top of it. The actual lab itself is in a separate room and we can control that firewall.
Essentially the teacher maintains a copy of the hosts file on a USB stick. He reinstalls it at noon every day and again just before he leaves.
Another problem I see with manipulating "hosts" is that you'd have to maintain it on every machine, which seems like a lot of work.
Yup. Sure is. Especially since the building blocks sites like YouTube and Google/Yahoo images.
Later Mike
I have to agree with Mr. Lange.
There are two things that will probably make this strategy incredibly time-consuming and difficult to maintain.
1. Maintaining the same hosts file on every computer.
2. Maintaining the list of hosts in the hosts file.
Especially when it comes to porn, the number of sites you would have to list is just overwhelming, and on top of that, they keep changing. New ones come in and old ones change domains all the time. I think you're creating a lot of work that could easily be avoided by purchasing software that is specifically intended for this purpose. Frankly I find it hard to imagine a more time-consuming way of doing this.
Just my two cents. :)
Kind regards, Helgi Hrafn Gunnarsson helgi@binary.is
On 10-08-07 01:19 PM, Sean Walberg wrote:
I don't know the answer to your question, but it seems easy enough to find out:
(Windows XP SP3)
...
Immensely interesting results... Apparently the "*" is treated as an actual character rather than as a wildcard. Judging from your results, the example I was using was wrong. I took out all the entries with the "*". I'll try it out for a few weeks and see what more I can learn.
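Incidentally, if wildcard-style blocking is what's wanted, a hosts file can't express it, but a small local resolver such as dnsmasq can; a sketch, assuming dnsmasq is installed and the machine's resolver points at 127.0.0.1:

# /etc/dnsmasq.conf (fragment)
# answer 127.0.0.1 for doubleclick.net and every name beneath it
address=/doubleclick.net/127.0.0.1
address=/tribalfusion.com/127.0.0.1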
Unfortunately the 64-bit version of Firefox still didn't display text. I commented out all the added sites and still didn't get any text. What solved the problem was going into the preferences and telling the program to use my fonts rather than letting sites select their own. The 32-bit version of Firefox was working fine before, so if it ain't broke, don't fix it.
Later Mike
I tried a little experiment with Windows ME. I first opened the web page www.cbc.ca/news without any LMHOSTS file. (Yes, Windows uses caps.) It took: 5 seconds to load the page header, an additional 5 seconds to load the bulk of the web page, and 9 seconds to load the counts of forum comments on the links to news stories.
I then added LMHOSTS with the entries that Mike provided. However, following the instructions from the example LMHOSTS.SAM file, I ensured one entry per line. So I split the lines that Mike had combined and added the local IP address to all of them. To ensure the timing was not due to any cache, I closed IE before each test. The test web page is one I often look at, so there is no difference between the first test and later tests. It took: 4 seconds to load the page header, an additional 4 seconds to load the bulk of the page, and 8 seconds to load the counts of comments.
This is a little difference, but not a lot. Within experimental error? I do have an ad blocker, so some of the work is already done.
Rob Dyck
On Sat, Aug 7, 2010 at 5:25 PM, Robert Dyck <rbdyck2@shaw.ca> wrote:
This is a little difference, but not a lot. Within experimental error? I do have an ad blocker, so some of the work is already done.
I see two problems here.
#1 is that closing the browser does not clear the cache.
#2 is that LMHOSTS only works for NetBIOS name requests. The HOSTS file is used for DNS.
Folks, if you want to test page load speed, use Firebug, Tamper Data, Chrome Developer Tools, or any one of a dozen other tools out there. Hitting F5 with a stopwatch is not a valid measurement.
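For example, a rough but repeatable number can be had with curl, using its standard -w timing variables (this fetches only the base HTML, none of the ads or images a browser would pull, so it understates the full page load):

$ curl -o /dev/null -s -w 'lookup: %{time_namelookup}s  total: %{time_total}s\n' http://www.cbc.ca/news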
Sean
Does Windows ME have a HOSTS file? Remember, this isn't Windows NT. Microsoft Windows 2000 is just Windows NT version 5.0 with a fancy name, Windows XP is Windows NT 5.1, Windows Vista is Windows NT version 6.0, and Windows 7 is Windows NT version 6.1. But Windows ME is part of the Windows 9x series.
I created a short-cut (Windows version of a symbolic link) called HOSTS, linked to LMHOSTS. I tried the same web page but found no difference in load times at all.
My initial test was with a web page that I often visit, so there is no cache difference between any of the tests.
I restarted my computer in case Windows was loading the HOSTS file into memory, tried the same test and found no difference either.
Rob Dyck
Yes, Win9x has a HOSTS file. The problem you've run into is that a shortcut isn't a symlink - instead, it's a file called "HOSTS.LNK" that the shell treats specially. Delete the shortcut and copy the file instead. -Adam
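Concretely, something like this from a DOS prompt, assuming the Win9x system directory is C:\WINDOWS (where both files live on 9x/ME):

C:\>cd \WINDOWS
C:\WINDOWS>del HOSTS.LNK
C:\WINDOWS>copy LMHOSTS HOSTS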
--- Adam Thompson wrote:
Yes, Win9x has a HOSTS file. The problem you've run into is that a shortcut isn't a symlink - instead, it's a file called "HOSTS.LNK" that the shell treats specially. Delete the shortcut and copy the file instead. -Adam
A shortcut is handled by the desktop (Explorer.exe) instead of the file system. Doesn't make sense, but I did that anyway. I rebooted Windows in case it caches HOSTS in memory. Still no difference on the same web site I use for these tests.
Rob Dyck
I need to take that back; there are changes. After copying LMHOSTS to hosts (yes, it's lower case), I found the ad banner in the page header was replaced with a black rectangle and a white icon with the red "X". And the video pop-up window showed the news article without the ad that normally runs in front of it, but it only did so once. Thereafter it would not show the news article video. This worked for a couple of minutes; then, after refreshing the window, it displayed a black area where the ad would have been and played news video, but showed 2 ads before every news article. But after closing windows and re-trying a couple of times, it returned to playing one video ad before each news video clip.
All that was yesterday. I restarted my computer today and tried again. Again I found that when playing video of a news story it stuttered a couple of times, but it did play the news article without any ad, though only once. Thereafter it would not play any news story. It appears the CBC News website blocks video if you block their video ads. I commented out the entries for DoubleClick (leaving googleads.g.doubleclick.net in place) and found it would play video, and it did precede each video news story with a video ad. But sometimes it would play two video ads; other times it would not play the news video. I had to restart my computer before getting a reliable response. Now it plays one video ad before each news video, as it was before.
Rob Dyck
On 10-08-07 05:25 PM, Robert Dyck wrote:
This is a little difference, but not a lot. Within experimental error? I do have an ad blocker, so some of the work is already done.
Some days the gocomics.com site I mentioned can take close to a minute and a half to load. By blocking some ad sites I saw the load time reduced by more than the 20% you observed. It depends on which ad sites are being accessed, and how many. One day before I started blocking ad sites, I noticed it had half a dozen separate pages under the main page. The usual number of pop-unders, as I believe they are called, is around two to three.
Sean does raise an interesting point. Blocking the sites by domain is inefficient. It would be better to block them by type. It would reduce the number of accesses to a minimum. If that makes sense...
Later Mike