How to Block Website Copier


HTTrack is an open source offline browser (website copier).

For HTTrack users: Advice & what not to do when you are using HTTrack

For webmasters having problems with bandwidth abuse / other abuses related to HTTrack: Abuse FAQ for webmasters

Advice & what not to do
Please follow these common sense rules to avoid any network abuse

Do not overload the websites!
Downloading a site can overload it if you have a fast pipe, or if you request too many dynamically generated (CGI) pages at once. Do not download overly large websites: use filters and the limit options listed below (a sample command line follows this list)

Do not use too many simultaneous connections

Use bandwidth limits

Use connection limits

Use size limits

Use time limits

Only disable robots.txt rules with great care

Try not to download during working hours

Check your mirror transfer rate/size

For large mirrors, first ask the webmaster of the site
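With HTTrack itself, most of these limits can be set directly on the command line; a sample polite invocation (the URL and the values are placeholders; check the option letters against the manual of your httrack version):

httrack "http://www.example.com/" -O /tmp/mirror -c2 -%c1 -A25000 -M100000000 -E7200
# -c2          use at most 2 simultaneous connections
# -%c1         open at most 1 new connection per second
# -A25000      limit the transfer rate to 25,000 bytes per second
# -M100000000  stop after roughly 100 MB have been downloaded
# -E7200       stop the mirror after 2 hours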

Ensure that you can copy the website
Are the pages copyrighted?

Can you copy them only for private purpose?

Do not make online mirrors unless you are authorized to do so

Do not overload your network
Is your (corporate, private..) network connected through a dialup ISP?

Is your network bandwidth limited (and expensive)?

Are you slowing down the traffic?

Do not steal private information
Do not grab email addresses

Do not grab private information

Abuse FAQ for webmasters
How to limit network abuse
HTTrack Website Copier FAQ (updated - DRAFT)
Q: How to block offline browsers, like HTTrack?

A: This is a complex question; let's study it.

First, there are several different reasons for wanting to do so.
Why do you want to block offline browsers?
Because a large part of your bandwidth is used by a few users, who slow down everyone else

Because of copyright questions (you do not want people to copy parts of your website)

Because of privacy (you do not want email grabbers to steal all your users' email addresses)

Bandwidth abuse:

Many webmasters are concerned about bandwidth abuse, even if this problem is caused by a minority of people. Offline browser tools, like HTTrack, can be used in a WRONG way, and are therefore sometimes considered a potential danger.
But before thinking that all offline browsers are BAD, consider this: students, teachers, IT consultants, websurfers and many people who like your website may want to copy parts of it, for their work, for their studies, to teach or to give demonstrations in class or at shows. They might do that because they are connected through an expensive modem connection, because they would like to consult pages while travelling, to archive sites that may be removed one day, to do some data mining, or to compile information ("if only I could find this website I saw one day..").
There are many good reasons to mirror websites, and this helps many good people.
As a webmaster, you might be interested in using such tools, too: to test broken links, move a website to another location, control which external links appear on your website for legal/content reasons, test the webserver's response and performance, index it..

Anyway, bandwidth abuse can be a problem. If your site is regularly "clobbered" by evil downloaders, you have
various solutions, ranging from radical to intermediate. I strongly recommend not using the
radical ones, because of the previous remarks (good people often mirror websites).

In general, for all solutions:
the good thing: it will limit the bandwidth abuse
the bad thing: depending on the solution, it will be either a small constraint, or a fatal nuisance (you'll get 0 visitors)
or, to be extreme: if you unplug the wire, there will be no bandwidth abuse

Inform people, explain why ("please do not clobber the bandwidth")
Good: Will work with good people. Many good people just don't KNOW that they can slow down a network.
Bad: Will ONLY work with good people
How to do: Obvious - place a note, a warning, an article, a drawing, a poem or whatever you want

Use "robots.txt" file
Good: Easy to setup
Bad: Easy to override
How to do: Create a robots.txt file in the top-level directory, with the proper parameters
Example:
User-agent: *

Disallow: /bigfolder

Ban the User-agents of known offline browsers
Good: Easy to setup
Bad: Radical, and easy to override
How to do: Filter the "User-agent" HTTP header field
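A minimal php sketch of such a filter (the user-agent substrings listed, and the choice of filtering in a script rather than in the webserver configuration, are assumptions):

<?php
// Reject requests whose User-Agent matches a known offline browser.
// Trivial to bypass (the client can send any User-Agent it likes),
// which is exactly the "easy to override" drawback mentioned above.
$banned = array("HTTrack", "WebCopier", "Offline Explorer", "wget");
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : "";
foreach ($banned as $needle) {
    if (stripos($ua, $needle) !== false) {
        header("HTTP/1.1 403 Forbidden");
        exit("Offline browsing is not allowed on this site.");
    }
}
?>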

Limit the bandwidth per IP (or by folders)
Good: Efficient
Bad: Multiple users behind proxies will be slowed down, and it is not really easy to set up
How to do: Depends on webserver. Might be done with low-level IP rules (QoS)

Prioritize small files over large files
Good: Efficient if large files are the cause of abuse
Bad: Not always efficient
How to do: Depends on the webserver

Ban abuser IPs
Good: Immediate solution
Bad: Annoying to do, useless for dynamic IPs, and not very user friendly
How to do: Either ban IPs at the firewall, or on the webserver (see ACLs)

Limit abusers' IPs
Good: Intermediate and immediate solution
Bad: Annoying to do, useless for dynamic IPs, and annoying to maintain..
How to do: Use routing QoS (fair queuing), or webserver options

Use technical tricks (like javascript) to hide URLs
Good: Efficient
Bad: The most efficient tricks will also make your website heavy and less user-friendly (and therefore less attractive, even for surfing users). Remember: clients or visitors might want to consult your website offline. Advanced users will still be able to note the URLs and fetch them. It will not work on non-javascript browsers, and it will not work if the user clicks 50 times and puts the downloads in the background with a standard browser
How to do: Most offline browsers (I would say all, but let's say most) are unable to "understand" javascript/java properly. Reason: it is very tricky to handle!
Example:
Replace the plain HTML link to the file (the "Foo" anchor) with a link written out by a small javascript routine, as in the sketch below.
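A php sketch of the idea (the file name bigfile.zip, the link text and the exact javascript are assumptions; the point is only that the URL is assembled at display time instead of sitting in the static HTML):

<?php
// Instead of a plain  <a href="bigfile.zip">Foo</a>  in the page,
// emit the link from javascript, which most offline browsers do not interpret.
$name = "bigfile";  // file name kept in two pieces so it never appears whole in the source
$ext  = "zip";
echo "<script type=\"text/javascript\">\n";
echo "document.write('<a href=\"" . $name . "." . $ext . "\">Foo</a>');\n";
echo "</script>\n";
echo "<noscript>Foo (javascript is required to download this file)</noscript>\n";
?>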


You can also use java-based applets. I would say that it is the "best of the horrors". A big, fat, slow, bogus java applet. Avoid!

Use technical tricks to lag offline browsers
Good: Efficient
Bad: Can be avoided by advanced users, annoying to maintain, AND potentially worse than the illness (CGIs often consume some CPU). It will not work if the user clicks 50 times and puts the downloads in the background with a standard browser
How to do: Create fake empty links that point to cgi's, with long delays
Example: Use things like (example in php:)
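A sketch of the idea, assuming a hypothetical slow.php reached through an empty link such as <a href="slow.php"></a> that human visitors never click:

<?php
// slow.php - target of a fake, empty link hidden in the page.
// An offline browser that blindly follows every link will stall
// here on each request, while normal visitors are unaffected.
sleep(60);                          // hold the connection for one minute
header("HTTP/1.1 404 Not Found");   // then return nothing useful
?>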

Use technical tricks to temporarily ban IPs
Good: Efficient
Bad: Radical (your site will only be consultable online, for all users), not easy to set up
How to do: Create fake links with "killing" targets
Example: Use things like (again an example in php:)

function add_temp_firewall_rule($addr) {
    // The chain chhttp is flushed in a cron job to avoid ipchains overflow
    system("/usr/bin/sudo -u root /sbin/ipchains -I chhttp 1 -p tcp -s ".$addr." --dport 80 -j REJECT");
    syslog(LOG_WARNING, "user rejected due to too many copy attempts: ".$addr);
}
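One possible way to wire the function up (the file names trap.php and firewall.inc.php and the hidden link are assumptions, not part of the original example): an empty link that no human visitor follows, pointing to a page that bans the caller's IP for a while.

<!-- in your normal pages: an empty "killing" link, invisible to human visitors -->
<a href="trap.php"></a>

<?php
// trap.php - hypothetical target of the hidden link above
require "firewall.inc.php";   // assumed to contain add_temp_firewall_rule() as defined above
header("HTTP/1.1 403 Forbidden");
add_temp_firewall_rule($_SERVER['REMOTE_ADDR']);
?>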

Copyright issues

You do not want people to "steal" your website, or even copy parts of it. First, stealing a website does not
require an offline browser. Second, a direct (and credited) copy is sometimes better than disguised
plagiarism. Besides, several of the previous remarks also apply here: the more protected your website is,
the less attractive it may also become. And there is no perfect solution. A webmaster once asked me
for a solution to prevent any copy of his website, not only by offline browsers, but also against "save as",
cut and paste, print.. and print screen. I replied that it was not possible, especially for the print screen - and
that another potential threat was the evil photographer. Maybe with a "this document will self-destruct in 5 seconds.."
notice, or by shooting users after they consult the document.
More seriously, once a document is placed on a website, there will always be a risk of copying (or plagiarism)

To limit the risk, the earlier solutions from the "bandwidth abuse" section can be reused, in particular informing users and the javascript tricks for hiding URLs

Privacy

This may be related to the copyright issues above.
But the greatest risk is probably email grabbers. One solution is to use javascript to hide email addresses.
Good: Efficient
Bad: Non-javascript browsers will not have the "clickable" link
How to do: Use javascript to build mailto: links
Example:
Build the mailto: link from javascript, and leave a plain-text fallback such as "smith at mycompany dot com" for non-javascript browsers, as in the sketch below.
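A php sketch (the address and the splitting scheme are assumptions; the idea is that the address never appears verbatim in the HTML an email grabber downloads):

<?php
// Emit a mailto: link assembled by javascript, with a plain-text
// fallback for non-javascript browsers.
$user   = "smith";
$domain = "mycompany.com";
echo "<script type=\"text/javascript\">\n";
echo "var u = '" . $user . "'; var d = '" . $domain . "';\n";
echo "document.write('<a href=\"mailto:' + u + '@' + d + '\">' + u + '</a>');\n";
echo "</script>\n";
echo "<noscript>" . $user . " at " . str_replace(".", " dot ", $domain) . "</noscript>\n";
?>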

Another option is to render email addresses as images
Good: Efficient, and does not require javascript
Bad: There is still the problem of the link (mailto:), images are bigger than text, and they can cause problems for blind people (a good compromise is to use an ALT attribute with the email written like "smith at mycompany dot com")
How to do: Not so obvious if you do not want to create the images by hand
Example: (php, Unix)
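A sketch using php's GD image functions (the address table, the image size and the file name email.php3 are assumptions; the id is the first 8 hex characters of md5 of the login, matching the shell helper below):

<?php
// email.php3 (sketch) - maps an opaque id to an email address and
// returns it as a small image, so the address never appears as text.
$emails = array(
    substr(md5("smith"), 0, 8) => "smith@mycompany.com",   // example entries
    substr(md5("john"),  0, 8) => "john@mycompany.com",
);
$id = isset($_GET['id']) ? $_GET['id'] : "";
if (!isset($emails[$id])) {
    header("HTTP/1.1 404 Not Found");
    exit;
}
$text = $emails[$id];
$im = imagecreate(8 * strlen($text) + 10, 20);      // rough width for the built-in font
$bg = imagecolorallocate($im, 255, 255, 255);       // first allocation = background
$fg = imagecolorallocate($im, 0, 0, 0);
imagestring($im, 3, 5, 3, $text, $fg);
header("Content-Type: image/png");
imagepng($im);
imagedestroy($im);
?>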


The script to find the id:

#!/bin/sh

# small script for email.php3
echo "Enter login:"
read login
echo "The URL is:"
printf "http://yourhost/email.php3?id="
printf '%s' "$login" | md5sum | cut -c1-8
echo

You can also create temporary email aliases, each week, for all users
Good: Efficient, and you can give your real email in your reply-to address
Bad: The addresses are only temporary
How to do: Not so hard to do
Example: (script & php, Unix)
#!/bin/sh
#
# Anonymous random aliases for all users
# changed each week, to avoid spam problems
# on websites
# (to put into /etc/cron.weekly/)

# Each alias is regenerated each week, and valid for 2 weeks

# prefix for all users
# must not be the prefix of another alias!
USER_PREFIX="user-"

# valid for 2 weeks
ALIAS_VALID=2

# random string
SECRET="my secret string `hostname -f`"

# build
grep -vE "^$USER_PREFIX" /etc/aliases > /etc/aliases.new
for i in `cut -f1 -d':' /etc/passwd`; do
  if test `id -u $i` -ge 500; then
    off=0
    while test "$off" -lt $ALIAS_VALID; do
      THISWEEK="`date +'%Y'` $[`date +'%U'`-$off]"
      SECRET="`echo \"$SECRET $i $THISWEEK\" | md5sum | cut -c1-4`"
      FIRST=`echo $i | cut -c1-3`
      NAME="$USER_PREFIX$FIRST$SECRET"
      echo "$NAME : $i" >> /etc/aliases.new
      #
      off=$[$off+1]
    done
  fi
done

# move file
mv -f /etc/aliases /etc/aliases.old
mv -f /etc/aliases.new /etc/aliases

# update aliases
newaliases

And then, put the aliased email address in your pages through a small php lookup.
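A minimal sketch of that lookup, assuming the "alias : user" lines written into /etc/aliases by the script above (the helper name current_alias_for, the example login and the host name are assumptions):

<?php
// Return the current weekly alias for a given login, by scanning
// /etc/aliases for the first "user-..." entry that maps to it
// (the script above writes the current week's alias first).
function current_alias_for($login) {
    foreach (file("/etc/aliases") as $line) {
        if (strpos($line, ":") === false) {
            continue;
        }
        list($alias, $target) = explode(":", $line, 2);
        $alias = trim($alias);
        if (trim($target) == $login && substr($alias, 0, 5) == "user-") {
            return $alias;
        }
    }
    return false;
}

$alias = current_alias_for("smith");                  // "smith" is an example login
if ($alias !== false) {
    echo "Contact: " . $alias . "@mycompany.com\n";   // host name is an assumption
}
?>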