How to protect your website from hackers

Fight back against web hackers
The XSS Me plug-in will give your website a thorough workout, looking for potential cross-site scripting vulnerabilities

Previously, we saw how hackers spend a lot of time surveying websites they want to attack, building up a detailed picture of their targets using information found in DNS records, as well as on the web and from the site itself.

This information helps hackers learn the hardware and software structure of the site, its capabilities, back-end systems and, ultimately, its vulnerabilities. It can be eye-opening to discover the detail a hacker can see about your website and its systems.


GIVEN UP BY GOOGLE: 'Robots.txt' files are remarkably easy to find using a Google query

To prevent information about private files from falling into the wrong hands, apply a simple rule: if there's no good reason for a file or directory to be on the server, it shouldn't be there in the first place.

Remove it from the server and from the 'robots.txt' file. Never have anything on your server that you're not happy to leave open to public scrutiny.

Leave false clues

However, 'robots.txt' will also give hackers pause for thought if you use it to apparently expose a few fake directories and tip them off about security systems that don't exist.

Adding an entry for an intrusion detection system, such as 'snort_data' for example, will tell a false story about your site's security capabilities. Other directory names will send hackers on a wild goose chase looking for software that isn't installed.
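As a sketch of the idea, a decoy 'robots.txt' might look something like this. The directory names here are purely illustrative and don't need to exist on your server; 'snort_data' hints at the Snort intrusion detection system, 'tripwire' at a file integrity monitor:

    User-agent: *
    Disallow: /admin/
    # Decoy entries: these directories don't exist, but they suggest
    # monitoring tools that should make an attacker think twice
    Disallow: /snort_data/
    Disallow: /tripwire/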

If your website requires users to log into accounts, ensure that they confirm their registrations by replying to an email sent to a nominated email account.

The most effective way of preventing a brute force attack against these accounts is to enforce a policy of 'three strikes and you're out' when logging in. If a user enters an incorrect password three times, they must request a new password (or a reminder of their current one), which will be sent to the email account they used to confirm their membership.
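A minimal sketch of a 'three strikes' check in Python, assuming a simple in-memory counter and a hypothetical verify_password function supplied by your own application (a real site would persist the counter per account in its database), might look like this:

    # Illustrative 'three strikes' lockout - names and storage are assumptions
    MAX_ATTEMPTS = 3
    failed_attempts = {}          # username -> number of consecutive failures

    def check_login(username, password, verify_password):
        """Return 'ok', 'bad password' or 'locked'.

        verify_password is a hypothetical callable that checks the
        credentials against your own user store.
        """
        if failed_attempts.get(username, 0) >= MAX_ATTEMPTS:
            # Three strikes: the user must now request a new password,
            # sent to the email address they confirmed at registration
            return 'locked'
        if verify_password(username, password):
            failed_attempts.pop(username, None)   # success resets the counter
            return 'ok'
        failed_attempts[username] = failed_attempts.get(username, 0) + 1
        return 'bad password'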

A three strikes policy may be too draconian for your tastes, or you may worry that it invites denial of service attacks against individual users, with an attacker deliberately locking accounts by entering three bad passwords. In that case, it's a good idea to slow things down by not sending the user straight back to the login page.

After a certain number of failed attempts, record the time and refuse further login attempts for that account until a set number of minutes have passed. This makes a brute force attack painfully slow, if not practically impossible to mount.
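As a rough Python sketch of that timer, again assuming an in-memory store and with illustrative threshold and delay values rather than recommended ones:

    import time

    LOCKOUT_THRESHOLD = 5         # failures before the delay kicks in
    LOCKOUT_MINUTES = 15          # how long to refuse further attempts

    last_failure = {}             # username -> (failure count, time of last failure)

    def may_attempt_login(username):
        """Return True if this account is currently allowed another attempt."""
        count, when = last_failure.get(username, (0, 0.0))
        if count < LOCKOUT_THRESHOLD:
            return True
        # Enough failures: only allow a retry once the lockout window has passed
        return (time.time() - when) > LOCKOUT_MINUTES * 60

    def record_failure(username):
        count, _ = last_failure.get(username, (0, 0.0))
        last_failure[username] = (count + 1, time.time())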