There is a lot you can do from the very beginning to extend the life of your website, allowing it to run quickly and reliably while staying safe from malicious attacks. In particular, there are three often-forgotten areas you should be concerned about when developing a new site: security, stability, and performance.
Let’s discuss some aspects of each area and how they can increase the safety, reliability, and performance of your website.
Security
When most people think about web security, they think about someone hacking into the site and stealing customer data. That is not always the case. Sometimes hackers want to use your web server for other illegal or unethical purposes, such as setting up an email server to forward spam, hosting illegal files, or even illegal Bitcoin mining, to name a few.
There’s nothing worse than having your website infect your customers’ computers. Not only will Google mark your website as malicious, but other filtering and antivirus services will blacklist it and block their users from visiting. From being blacklisted as a spammer to having your hosting provider shut you down completely, there is no good outcome.
The cost of clean up can vary depending on how complicated your website is, the type of infection, and the quality of your backups.
If you are storing customer information, you may need to contact your insurance company and potentially report the breach. It’s a mess no matter how you look at it.
Below are some of the methods you can employ to reduce the risk of your web server being hacked as well as some overall best practices to prevent your server from being misused.
1. Prevent SQL Injection Attacks
If you use a data store that relies on SQL and you build SQL statements directly in your code, you open yourself up to the possibility that a hacker will send malicious input that can cripple your site and/or corrupt your data. The best way to prevent this is to use parameterized queries. If you are using Microsoft SQL Server, you can also avoid inline SQL entirely and use stored procedures with typed parameters instead. This prevents arbitrary statements from being executed, and it can also be faster since the SQL is precompiled on the server.
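To make the idea concrete, here is a minimal sketch using Python's built-in sqlite3 module (the table and the payload are illustrative; the same parameter-binding idea applies to any SQL driver or language):

```python
import sqlite3

# Illustrative in-memory database; a real site would connect to its own store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# UNSAFE: string interpolation would let the payload rewrite the query.
# query = f"SELECT * FROM users WHERE name = '{user_input}'"

# SAFE: the driver binds the value as data, never as SQL.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the payload matches no user instead of matching everyone
```

Because the payload is treated as a literal string, the `OR '1'='1` trick never becomes part of the query logic.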
2. Avoid Detailed Error Messages
If errors occur, resist the temptation to expose detailed messages as debugging tools. Handle errors gracefully by giving the user a vague error statement and providing navigation back to the homepage or the page they were on previously. Giving away too much information can hand hackers exactly what they need to exploit your site.
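A simple pattern, sketched here in Python (the function names are hypothetical), is to log the full details server-side while returning only a generic message to the visitor:

```python
import logging

logger = logging.getLogger("app")

GENERIC_MESSAGE = "Something went wrong. Please return to the homepage and try again."

def safe_render(view):
    """Run a page-rendering function; log full details privately, show users a vague message."""
    try:
        return view()
    except Exception:
        # The full stack trace goes to the server log, never to the browser.
        logger.exception("Unhandled error in view")
        return GENERIC_MESSAGE

# A view that fails with a revealing internal error:
page = safe_render(lambda: 1 / 0)
print(page)  # the generic message, with no trace of the ZeroDivisionError
```

You still get every detail in your server logs for debugging; the attacker only sees a dead end.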
3. Prevent Cross-Site Scripting Attacks
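Cross-site scripting (XSS) happens when an attacker gets your pages to echo their script back to other visitors, for example through a comment field. The core defense is to escape user-supplied content before rendering it into HTML; most modern template engines do this by default, but a minimal sketch with Python's standard library shows the idea:

```python
import html

# Hypothetical attacker-supplied comment text:
user_comment = '<script>alert("stolen cookies")</script>'

# Escaping turns markup characters into harmless entities before rendering.
safe_comment = html.escape(user_comment)
print(safe_comment)
# &lt;script&gt;alert(&quot;stolen cookies&quot;)&lt;/script&gt;
```

The browser displays the text of the script instead of executing it.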
4. Use Client and Server-Side Validation
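Client-side validation gives users fast feedback, but it can be bypassed trivially, so the server must re-check everything it receives. A minimal server-side sketch (the field names and rules here are hypothetical examples, not a complete validation library):

```python
import re

# A deliberately simple email pattern for illustration; real-world email
# validation is more nuanced.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_signup(email, age):
    """Return a list of validation errors (empty means the input is acceptable)."""
    errors = []
    if not EMAIL_RE.match(email):
        errors.append("Invalid email address.")
    if not age.isdigit() or not 13 <= int(age) <= 120:
        errors.append("Age must be a whole number between 13 and 120.")
    return errors

print(validate_signup("user@example.com", "30"))   # []
print(validate_signup("not-an-email", "banana"))   # two errors
```

Mirroring the same rules in client-side JavaScript improves the experience, but only the server-side check can be trusted.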
5. Use HTTPS
Encrypting the traffic between the user’s browser and the server with SSL/TLS is always a good idea when there is any potential of transmitting sensitive data. It prevents attackers from grabbing and deciphering the data as it is transmitted.
6. Use Two-factor Authentication to Log In
Use two-factor authentication to log into the management area of your website. Two-factor authentication requires not only a username and password but also a continuously changing token/PIN or some other additional validation (e.g., a prompt on your cell phone) to verify it is really you. Even if someone has your username and password, they can’t get in without the extra piece of information.
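Those continuously changing PINs are usually time-based one-time passwords (TOTP, RFC 6238, built on HOTP, RFC 4226). A compact sketch of the algorithm in Python's standard library (the secret shown is the RFC test secret; a real deployment shares a random secret with the user's authenticator app):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret, counter, digits=6):
    """HMAC-based one-time password (RFC 4226)."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation: pick 4 bytes from the MAC
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret, period=30):
    """Time-based OTP (RFC 6238): the counter is the current 30-second window."""
    return hotp(secret, int(time.time()) // period)

# RFC 4226's published test secret, for illustration only:
print(totp(b"12345678901234567890"))
```

The server and the phone app compute the same six digits independently, so the code never travels anywhere an attacker could reuse it later.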
7. Keep Your Software Up to Date
In this day and age, you should be using a content management system (CMS). If you have an admin area you log into to manage content, then you are using a CMS. The CMS provider regularly provides updates to their core system, and various vendors provide updates to their plugins. Some updates add functionality, but many of the updates in between are primarily to fix security holes. If you don’t keep your system up to date, you are leaving yourself open to known vulnerabilities.
8. File Change Detection
You can run scripts on your server that notify you of any changed files. Some files shouldn’t change often, or at all, unless you install an update. If you see such a file change, you should be on high alert to find out what changed and who changed it. This is essentially a canary in a coal mine: an early detection system.
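A bare-bones version of such a script can be built on file hashing; this sketch compares two snapshots of a directory (in practice you would store the baseline and run the comparison on a schedule, alerting yourself when anything differs):

```python
import hashlib
from pathlib import Path

def snapshot(root):
    """Map each file under root to its SHA-256 hash."""
    return {
        str(p): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(Path(root).rglob("*")) if p.is_file()
    }

def changed_files(before, after):
    """Files added, removed, or modified between two snapshots."""
    return {
        path for path in before.keys() | after.keys()
        if before.get(path) != after.get(path)
    }
```

Dedicated tools (AIDE, Tripwire, or your CMS's security plugins) do this more robustly, but the principle is the same.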
9. Limit the Number of Login Attempts
Most systems these days can block an IP address that has failed multiple authentication requests within a given period. Hackers have scripts that try different combinations to get in; if your website lets them keep trying, they may eventually succeed. If you limit their ability to try new combinations, you can keep them out. An example ruleset: five failed authentication attempts within a three-minute period forces a 15-minute wait before the next try is allowed. You could even block the IP completely after a certain number of attempts.
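The five-failures-in-three-minutes rule can be sketched in a few lines. This toy version keeps state in process memory; a real site would use something shared and persistent such as Redis, or simply a firewall tool like fail2ban:

```python
import time

WINDOW = 3 * 60      # five failures within this many seconds...
MAX_FAILS = 5
LOCKOUT = 15 * 60    # ...locks the IP out for this long

failures = {}  # ip -> list of failed-login timestamps

def record_failure(ip, now=None):
    failures.setdefault(ip, []).append(time.time() if now is None else now)

def is_locked_out(ip, now=None):
    """Locked out if any 5 failures fell within a 3-minute span less than 15 minutes ago."""
    now = time.time() if now is None else now
    attempts = sorted(failures.get(ip, []))
    for i in range(len(attempts) - MAX_FAILS + 1):
        burst = attempts[i:i + MAX_FAILS]
        if burst[-1] - burst[0] < WINDOW and now - burst[-1] < LOCKOUT:
            return True
    return False
```

The login handler would call `is_locked_out()` before even checking the password, and `record_failure()` after each bad attempt.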
10. Think in Layers
Consider someone picking a lock only to be met with another door and another lock. You can protect your website directly, but you should also protect your web server. Hardware or software firewalls, DDoS prevention systems, IP filtering, non-standard ports, and malware scans all add extra layers of protection.
Stability
Stability is a hard thing to define. There are many things you should be aware of during development to make your site perform reliably, such as cleaning up user sessions, guarding against memory leaks, and managing garbage collection. There are also things you can monitor for stability after the site has been deployed:
1. Clean Code
There is no replacement for clean code. Not only will it be more efficient, but it will also be easier to debug and easier for a new developer to understand. Code with no architecture ("spaghetti code," as we call it) is not separated into understandable, independent pieces. Instead, everything is mixed together and potentially duplicated in different areas of the site. There is not much you can do with a site like that.
2. Load Testing
You should be utilizing cloud-based load testing tools if your website is expected to function under heavy load or heavy load spikes. You can create load simulations to see how your website performs under different scenarios. Make sure your testing environment matches your production environment.
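As a rough illustration of the idea (not a replacement for real tools like JMeter or k6), a thread pool can simulate concurrent users. `fetch_page` here is a stand-in that sleeps; you would replace it with a real HTTP call against a staging URL:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_page():
    """Stand-in for one HTTP request; swap in a real call to a staging URL."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server response time
    return time.perf_counter() - start

def load_test(concurrent_users, requests_per_user):
    """Fire simulated traffic and collect per-request latencies."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        futures = [pool.submit(fetch_page)
                   for _ in range(concurrent_users * requests_per_user)]
        return [f.result() for f in futures]

latencies = load_test(concurrent_users=10, requests_per_user=5)
print(f"requests: {len(latencies)}, worst: {max(latencies):.3f}s")
```

Watching how the worst-case latency grows as you raise `concurrent_users` is the essence of a load test.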
3. Customize Memory Limits
If you have your own server, make sure that your site’s memory limit is set to match your site’s requirements as well as the resources of your server. You don’t want to make the website run on too little memory, but you also don’t want to allow one connection to use up all of your memory.
4. Cross Browser Testing
Stability is in the eye of the beholder. Make sure you test in the most popular versions of Internet Explorer, Edge, Firefox, Chrome, and Safari. There are automated cloud tools to help you, but adding manual testing never hurts.
5. Your Web Server
Are you using a dedicated server or a shared server? With a shared server, you are sharing the server’s resources with other websites. Although there should be limits on how many resources one website can use, we have seen servers at bulk hosting providers that may have hundreds of websites on one web server.
Performance
Not only do you want your site to be reliable and stable, you also want it to be fast and easy to use. Below are a few of the things you should monitor to make sure your site performs at its peak.
1. Full Page Loading Times
Measure the time it takes to fully load different pages. Especially measure the ones that contain linked content or things such as embedded content, large images or pages that query a database to pull in content. There are many tools out there to measure page speed. There are various factors to review such as first-byte time, DOM load, the overall file size of the website, compression, image optimization, caching, etc.
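Dedicated tools report all of these factors, but two of them (time to first byte and total download time) can be measured with a few lines of standard-library Python. This is a rough single-sample sketch; real measurement averages many runs and also covers DOM load and rendering:

```python
import time
import urllib.request

def time_page(url):
    """Return (time to first byte, full download time) in seconds for one request."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read(1)                      # the first byte has arrived
        ttfb = time.perf_counter() - start
        resp.read()                       # download the rest of the body
        total = time.perf_counter() - start
    return ttfb, total

# Example: ttfb, total = time_page("https://example.com/")
```

A large gap between the two numbers points at page weight (images, scripts); a slow first byte points at the server or network.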
2. Use a Content Delivery Network (CDN)
Try to test your site’s performance from different locations to make sure it isn’t slow in specific regions. Slowness can come from the number of switches, networks, and servers a visitor’s traffic passes through to reach your site. One solution is a Content Delivery Network (CDN). A CDN caches copies of your website at points of presence (POPs) around the world, reducing the number of switches and servers your user has to go through to view your content. The CDN periodically checks back with your main website for updated content.
3. Dedicated Resources
The cost of dedicated cloud servers has been going down. For the extra amount paid, you are essentially asking your provider to dedicate a certain amount of resources for your web server regardless of whether you are using it or not at that particular time. You are giving your website some breathing room instead of having it compete for resources.
4. Network Latency
Make sure to choose a reputable hosting provider. You can have a beast of a web server, but if the provider’s network has high latency or packet loss, your server’s speed won’t matter; the network will be the bottleneck.
5. Fast DNS Lookups
When a visitor types in your website address or clicks a link on Google, their web browser has to do a DNS lookup: essentially asking which IP address to contact to request the website files. Think of it as looking up a phone number. You want that lookup to be as fast as possible, so make sure your DNS servers respond quickly.
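You can spot-check lookup speed yourself; this small sketch times a resolution with Python's standard library (dedicated tools like `dig` give more detail, including which server answered):

```python
import socket
import time

def dns_lookup_time(hostname):
    """Resolve a hostname and report how long the lookup took, in milliseconds."""
    start = time.perf_counter()
    ip = socket.gethostbyname(hostname)
    elapsed_ms = (time.perf_counter() - start) * 1000
    return ip, elapsed_ms

# Example: ip, ms = dns_lookup_time("example.com")
```

Note that repeated runs will usually be faster because resolvers cache answers; the first, uncached lookup is the one your new visitors experience.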
6. Caching
In simple terms, caching is storing website data for future use. There are many places along the chain where you can use caching, and various types of caching systems. From server-side caching to browser caching, you are essentially telling the server or browser to store pieces of information it needs often or that rarely change. Each cache hit is one less lookup or transmission, and they add up.
7. Image optimization
Not all images are created equal. If you are taking a photo that you will print in a brochure and also use on your website, you actually have different requirements. For the brochure, you need high pixel density (DPI), but your screen needs fewer pixels. Additionally, there are file formats that work best for different images. You can choose between vector images or raster images. You have format options such as .jpg, .gif, .svg, and .png. You have compression options such as lossless compression or lossy compression. In short, you have a lot of options and what you use should be determined by the image itself and the display requirements.
8. Minification and Aggregation
Have you ever received a package where the box was much larger than the contents? Minification fixes the equivalent problem in your code: it strips out unneeded characters without changing how the code functions, making files smaller so they transmit faster. CSS aggregation is a bit different; it’s like ordering five things and having them all arrive in one box instead of five separate boxes. It reduces the number of files a browser has to download in order to render your website.
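To show what minification actually removes, here is a naive CSS minifier sketch; production minifiers (clean-css, cssnano, and the like) are far more thorough and edge-case aware:

```python
import re

def minify_css(css):
    """Toy CSS minifier: strip comments and collapse whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # remove /* comments */
    css = re.sub(r"\s+", " ", css)                     # collapse runs of whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)      # drop space around syntax chars
    return css.strip()

source = """
/* header styles */
h1 {
    color: #333;
    margin: 0;
}
"""
print(minify_css(source))  # h1{color:#333;margin:0;}
```

The comments and indentation help developers, not browsers, so stripping them before transmission costs nothing and saves bytes on every request.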
9. Query Optimization
This one is a bit more difficult because it requires experience and finesse. When building a website that relies on a database to function, you can pull that data from the database in many ways. Additionally, you may be pulling from multiple tables in one database to display the content.
For example, in an eCommerce website, you may store the user information in one table and order information in another table. When a user goes to their profile page to see past orders, you would pull data from the user table first and then use information in that query to pull data from another table. Sometimes, you are pulling data from many database tables. Query optimization is essentially finding the most efficient route to get the information you need. If the query is not designed well, your user may have to wait several seconds for the server to pull up all the information and while that is happening, your server is using up more resources than it should which means it can serve fewer people at once.
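The users-and-orders example above can be shown concretely with sqlite3 (the schema is illustrative). The first version is the inefficient "N+1 queries" pattern; the second hands the whole problem to the database as one JOIN:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL);
    INSERT INTO users  VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO orders VALUES (1, 1, 9.99), (2, 1, 24.50), (3, 2, 5.00);
""")

# Inefficient "N+1" pattern: one query for users, then one more per user.
slow = []
for (uid, name) in conn.execute("SELECT id, name FROM users"):
    for (total,) in conn.execute(
        "SELECT total FROM orders WHERE user_id = ?", (uid,)
    ):
        slow.append((name, total))

# One JOIN lets the database plan the most efficient route itself.
fast = conn.execute("""
    SELECT u.name, o.total
    FROM users u JOIN orders o ON o.user_id = u.id
    ORDER BY o.id
""").fetchall()

print(fast)  # same rows as the loop, in a single round trip
```

With two users the difference is invisible; with ten thousand, the loop issues ten thousand and one queries while the JOIN still issues one.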
Paying special attention to these three areas will help ensure that your website is always safe, reliable, and running at its peak. Designing, developing, and deploying a website is only the beginning. If you compromise sensitive user data, your site is always down, or your site is consistently slow, users won’t want to return, and you’ve done all of that hard work for nothing.
Managing and improving your website is an ongoing process. It is a living entity, and it needs to be given every opportunity to flourish. Contact us today if you want to extend the life of your website by ensuring that it is secure, stable, and performant.