Bad Websites? Don't Go There

One of the best ways to protect your computer from websites that serve malicious content is simply not to visit them. If a site's content includes malicious code, why would you think that any of its content is desirable? Don't go there, or if you do, go armed with knowledge.

If you must surf to dodgy websites, at least know which ones are known to be malicious. The power of the Internet includes online, real-time advice from the good guys.

Besides online malicious website analysis, the classic protection strategy is plain old avoidance. Various security experts maintain lists of websites that you should avoid, and they distribute those lists on the web. The lists are pretty big, and they change frequently - generally every month. To save you from examining a list by hand each time you consider following a link, you put these lists into the Hosts file on your computer, and let the computer do the work for you.

You can get a Hosts file from several trusted sources.


The Hosts file is a simple text file, stored in a recognised location on your computer. The operating system finds it via the registry entry [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\DataBasePath]. Generally, this entry points to "%SystemRoot%\System32\drivers\etc", though malicious software, if installed on your computer, may change it.
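For illustration, here is a minimal cross-platform sketch of resolving that location. The function name is my own; on Windows the authoritative value is read from the DataBasePath registry entry described above, while on other systems the conventional path is /etc/hosts:

```python
import os
import platform

def hosts_file_path():
    """Return the conventional Hosts file location for this OS.

    On Windows, read the DataBasePath registry entry; elsewhere,
    fall back to the conventional /etc/hosts.
    """
    if platform.system() == "Windows":
        import winreg  # standard library, Windows only
        key = winreg.OpenKey(
            winreg.HKEY_LOCAL_MACHINE,
            r"SYSTEM\CurrentControlSet\Services\Tcpip\Parameters")
        path, _ = winreg.QueryValueEx(key, "DataBasePath")
        winreg.CloseKey(key)
        # The registry value usually contains %SystemRoot%, so expand it.
        return os.path.join(os.path.expandvars(path), "hosts")
    return "/etc/hosts"
```

Note that the file itself has no extension; "hosts" is the complete file name.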

If you use a Hosts file from only one of the above sources, you'll simply copy the file into the folder, as discussed above. If you do as I do, and use combined sources (since each source has different criteria for what counts as undesirable content), you won't want to edit and merge the files by hand. Fortunately, there are several tools that do this for you.
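The core of what such a merge tool does can be sketched in a few lines. This is my own illustrative version, not any particular tool: it takes several Hosts-format texts, drops comments and duplicate hostnames, and emits one blocking entry per unique host.

```python
def merge_hosts(*sources):
    """Merge several Hosts-format texts into one blocklist.

    Keeps one '127.0.0.1 hostname' line per unique hostname,
    discarding comments and duplicates across all sources.
    """
    seen = set()
    merged = []
    for text in sources:
        for line in text.splitlines():
            line = line.split("#", 1)[0].strip()  # strip comments
            if not line:
                continue
            parts = line.split()
            if len(parts) < 2:   # need an address plus at least one host
                continue
            for host in parts[1:]:  # one line may list several hostnames
                if host not in seen:
                    seen.add(host)
                    merged.append("127.0.0.1 " + host)
    return "\n".join(merged) + "\n"
```

A real merge would also preserve the standard "127.0.0.1 localhost" entry and any local entries you've added by hand, so don't blindly overwrite your existing file with the output.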

All of the above are free and reliable. But if you're skeptical about whether to trust any of the sources listed above, that's good. Do some research.

With the exception of the issue below, using a Hosts file as part of a layered security strategy is simple yet effective. Support for the Hosts file is built into every network operating system that uses Internet Protocol. Installing a Hosts file simply consists of merging entries into the existing file (as described above), or copying the file into the folder if none is in use right now.

Now, using a Hosts file is not without cost. A Hosts file entry identifies one individual subdomain in a given domain; there are no wildcards. If "hackersrus.net" has separate addresses for "servera", "serverb", and "serverc", you'll need

127.0.0.1 servera.hackersrus.net
127.0.0.1 serverb.hackersrus.net
127.0.0.1 serverc.hackersrus.net

and this can make the Hosts file pretty large. The HPGuru Hosts file, for instance, is well over 1 MB in size.

If you're running the DNS Client service, which provides centrally managed DNS / Hosts lookup, the Hosts file is cached automatically. When the system starts up, and any time you update the Hosts file, the DNS Client service will recache the whole file. This is a very CPU-intensive process - on my computer (the last time I used it), the service would take 10 - 15 minutes to cache the file, and during that time the computer was pretty useless.

The solution, in that case, is to Stop and Disable the DNS Client service.
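On Windows, the DNS Client service's internal name is "Dnscache", and it can be stopped and disabled with the built-in sc tool. A Windows-only sketch, wrapped in Python for illustration (it must be run from an elevated Administrator context, and note that on much newer Windows versions this service may be protected from being disabled this way):

```python
# Windows-only sketch: stop and disable the DNS Client service ("Dnscache").
# Requires an elevated (Administrator) prompt; shown, not tested, since it
# changes system configuration.
import subprocess

subprocess.run(["sc", "stop", "Dnscache"], check=True)
# sc's syntax requires the space after "start=".
subprocess.run(["sc", "config", "Dnscache", "start=", "disabled"], check=True)
```

You can achieve the same thing interactively through Services (services.msc): right-click DNS Client, Stop, then set Startup type to Disabled.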

This should be a relevant issue only on small LANs that don't have a dedicated DNS server. If your domain includes a DNS server for local name resolution, you need to set up both the clients and the server very carefully; in that case, you'll want to centralise your website blocking rather than maintain separate files on each client. If you don't have a dedicated DNS server, there are free DNS server utilities that provide local caching of DNS information without having to precache the Hosts file.

Note that one of the downsides of Hosts file based protection is update latency. To surf safely, you have to be using the most up to date Hosts file - how often do you intend to update yours? If your Internet activity consists mainly of browsing, a browser add-on that references an online database makes much more sense.


1 comment:

Unknown said...

Interesting. I had two Windows 2000 Professional installations go south on me because of boot-time slowness and other things (like modifying the Hosts file, and sometimes for no apparent reason). CPU at 100% for MINUTES, not seconds. Disabling the DNS Client might have prevented the reinstalls. Oh well. Haven't run into that for a couple of years now.