On 10-08-07 01:19 PM, Sean Walberg wrote:
I don't know the answer to your question, but it seems easy enough to find out:
(Windows XP SP3)
C:\>ping foo.hhjjhhjjhh.com
Ping request could not find host foo.hhjjhhjjhh.com. Please check the name and try again.

C:\>echo 127.0.0.1 *.hhjjhhjjhh.com> c:\windows\system32\drivers\etc\hosts

C:\>ping foo.hhjjhhjjhh.com
Ping request could not find host foo.hhjjhhjjhh.com. Please check the name and try again.

C:\>echo 127.0.0.1 foo.hhjjhhjjhh.com>> c:\windows\system32\drivers\etc\hosts

C:\>ping foo.hhjjhhjjhh.com
Pinging foo.hhjjhhjjhh.com [127.0.0.1] with 32 bytes of data:
Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
...
That never occurred to me. I'll give it a try when I get back.
I'll leave it as an exercise to you to test the same on Linux.
That may resolve one problem. Thanks.
BTW, why are you blocking Amazon S3?
It was appearing as an ad server when Chromium was loading an unrelated web page. Besides, I don't order anything from Amazon. I thought the domain was actually amazonaws.com and unrelated to Amazon...
FYI, the example file I mentioned earlier listed over 1500 ad servers.
Sean
Later Mike
On Sat, Aug 7, 2010 at 1:04 PM, Mike Pfaiffer <high.res.mike@gmail.com> wrote:
At the lab the teacher advocates using the Windows equivalent of the
/etc/hosts file to prevent access to certain sites from classroom computers. He and I have been having an ongoing chat about this for a few months. I've been reading up on the way the file is used to redirect requests to a different address (e.g. 127.0.0.1). Is there a difference in the way Windows parses the file compared to Linux?
One reason for the above question is I was thinking it might be
useful to redirect requests to advertising sites to 127.0.0.1, to speed up access on days when things seem to crawl. One article I read on Digg suggested that a lot of the wait time for web pages is due to slow and misconfigured ad servers. I found one site which hosts example files that are updated every so often. I tried one and got almost nothing when surfing the web. Using the file as a pattern, I created a smaller version which works well with the Chromium browser but fails to display text in Firefox.
These are the lines I've added. Yes I know there are duplicates.
127.0.0.1 media.fastclick.com media.fastclick.net
127.0.0.1 *.tribalfusion.com a.tribalfusion.com
127.0.0.1 cdn.optmd.com
127.0.0.1 ad.doubleclick.com ad.doubleclick.net *doubleclick.net googleads.g.doubleclick.net
127.0.0.1 as.casalemedia.com
127.0.0.1 ads.adsonar.com
127.0.0.1 seeker.dice.com
127.0.0.1 townhall.com
127.0.0.1 s3.amazonaws.com
127.0.0.1 pixel.quantserv.com
127.0.0.1 st.blogads.com
127.0.0.1 *.rackspacecloud.com
127.0.0.1 js.adsonar.com
127.0.0.1 ads.pointroll.com
Would the "*" in the domain name cause problems? Like I said, I used
the Windows file as an example.
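[For what it's worth, Sean's test above suggests both resolvers treat each name in the hosts file as a literal string, with no glob matching, so a "*.tribalfusion.com" entry would only "block" a host literally named "*.tribalfusion.com". A minimal sketch of that exact-match lookup, illustrative only and not the actual resolver code:]

```python
def lookup(hosts_text, name):
    """Resolve a name roughly the way hosts files are parsed: each
    entry is a literal hostname, compared case-insensitively.
    There is no wildcard or glob expansion."""
    for line in hosts_text.splitlines():
        fields = line.split('#', 1)[0].split()   # strip comments, tokenize
        if len(fields) >= 2:
            addr, names = fields[0], fields[1:]
            if name.lower() in (n.lower() for n in names):
                return addr
    return None

hosts = """
127.0.0.1 *.tribalfusion.com a.tribalfusion.com
127.0.0.1 ad.doubleclick.net
"""

print(lookup(hosts, "a.tribalfusion.com"))  # 127.0.0.1 (listed literally)
print(lookup(hosts, "b.tribalfusion.com"))  # None -- the "*" entry never applies
```

[So the "*" lines shouldn't break anything, but they don't block subdomains either; each hostname has to be listed out explicitly.]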
Later Mike
Roundtable mailing list Roundtable@muug.mb.ca http://www.muug.mb.ca/mailman/listinfo/roundtable