Choosing a Server Size

When a Freedom Unlimited website requires extra server power, we can upgrade it for you. Upgrades give the server more RAM, disk storage and bandwidth, and these can be adjusted per server to provide the optimal level for each website. Regardless of what size server your site is running on, there are also things you can do within the site itself to make it run optimally, which are covered in the "Optimize Your Site" section below.

Determining Server Size

It's very difficult to determine ahead of time how large a server a website will need, because no two websites are the same. Here are the relevant factors:

Traffic

The first factor in determining how powerful a server your website needs is traffic: the number of people visiting your site. Keep in mind, however, that there are different kinds of traffic, and while looking at your website’s statistics can give you a good idea of how much traffic you’re getting, it doesn’t show the whole picture.

For one thing, there are bots, spiders and web crawlers. These are used by search engines and other programs to index your site, and sometimes by attackers and malicious users. Either way, they usually won't show up in your site statistics, but to the server they still count as visits.

Also, if you send out a large email campaign that contains graphics hosted on your site, then each time somebody views the email, your server has to serve those graphics. The same is true if the email, or a third-party website, links to a large file hosted on your server. In each of these cases your server is serving up files, which uses resources, even if people aren't actually visiting your site.

Traffic Peaks

Another important factor in determining how powerful a server you need is how high the peaks in your traffic are. Let’s imagine that two identical sites each get 1,000 visits per day. However, one of these sites receives a steady stream of traffic throughout the day, while the other one receives a huge spike every day at 5pm. The second site will require a more powerful server. To keep from crashing or being unresponsive, a server needs to be powerful enough to handle peaks in usage. Average usage over a day, or even over an hour, is not relevant if occasional peaks are able to take the server down. You need enough headroom to account for these peaks.
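To make the headroom idea concrete, here is a rough back-of-the-envelope sketch of the two-site example above. The daily visit count comes from the example; the pages-per-visit figure and the assumption that half of the second site's traffic arrives in a single hour are illustrative, not measurements:

```python
# Rough capacity sketch: compare average vs. peak request rates for the
# two hypothetical sites above (numbers are illustrative assumptions).

VISITS_PER_DAY = 1000
PAGES_PER_VISIT = 3          # assumed pages viewed per visit

def requests_per_second(visits, seconds):
    """Average page requests per second over a time window."""
    return visits * PAGES_PER_VISIT / seconds

# Site 1: traffic spread evenly across the whole day.
steady = requests_per_second(VISITS_PER_DAY, 24 * 60 * 60)

# Site 2: the same daily total, but half of it arrives in the
# single hour around 5pm.
peak = requests_per_second(VISITS_PER_DAY / 2, 60 * 60)

print(f"steady site: {steady:.3f} req/s")
print(f"peak hour:   {peak:.3f} req/s  ({peak / steady:.0f}x the steady rate)")
```

Even though both sites serve the same number of visitors per day, the second one must be sized for a request rate an order of magnitude higher.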

Site Design

The way your site is built can also have a huge effect on the load placed on your server. To give you an example, let's imagine two sites, Site A and Site B, getting exactly the same amount of traffic. The pages on Site A consist of nothing but text and a few small, web-optimized images, while Site B's homepage has a photo rotator, an embedded video, several JavaScript files, a product lister, 1 MB worth of images and a newsletter signup form. Each time a visitor arrives on Site A's homepage, the server just needs to serve up text and a few images, while on Site B, all of the graphics, video and scripts have to load every time.

Making the Determination

The good news is that with AccriCloud, you can always change your server level once you have a better idea of what size you need. Changes to your server require no downtime and no DNS changes. Since a company's website is a crucial part of its business, we always suggest going bigger at first, and then cutting the server power back if you can.

Optimize Your Site

There are also things you can do to make your site run better. Here is a list of some of the top reasons why sites perform slowly, and what you can do about each.

1. Large images/videos

Suppose one of your clients regularly uploads photo albums to their site.  They want the photos to be as high-quality as possible so that visitors can see their work, but you know that large images can dramatically slow a site down.  Or perhaps a client wants large, high-resolution images in a rotating slider on the home page, but the slider displays them at only half their actual width and height.

Having your server send out huge images takes an unnecessary toll on the resources available, and is especially wasteful if users do not view all the images on the page or the images are scaled down from their native size to fit on an element of the page.  Similarly, even small videos can easily make up the bulk of the data that a server sends out to users.  Offloading these resources to other sites can save an enormous amount of server resources at a relatively small cost.

What you can do

The three most important factors when optimizing your images for the web are choosing the right file format, choosing a good compression ratio, and ensuring images are not served at a larger size than they are displayed.  Properly optimizing these three parameters will drastically reduce the load on your server, enabling you to support more concurrent users.

There are many file types available, but the most commonly used are JPG, PNG and GIF, in that order [1].  Each of these formats has optimal use cases which will deliver the best quality per byte of file size.  A tutorial on the differences between these formats can be found here.  If you are unsure about which file type or how much compression to use, you can always save an image with several different types and amounts of compression and compare the results for quality and file size.

Additionally, you should ensure that your images' dimensions are not being scaled down by your website.  Stack Overflow has more detail on why this matters.  Good programs for these editing tasks include GIMP and Photoshop, but there are also online tools if you prefer not to install software.
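As a rough illustration of spotting oversized images, the sketch below reads the native width and height out of a PNG file's header using only the Python standard library, and flags images whose native size exceeds the size at which they are displayed.  The sample header bytes and the display dimensions are made up for the example:

```python
import struct

def png_dimensions(data: bytes) -> tuple[int, int]:
    """Read (width, height) from a PNG byte stream.

    A PNG starts with an 8-byte signature, followed by the IHDR chunk:
    a 4-byte length, the 4-byte type b"IHDR", then the width and
    height as big-endian 32-bit integers.
    """
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG file")
    if data[12:16] != b"IHDR":
        raise ValueError("malformed PNG: IHDR chunk not first")
    width, height = struct.unpack(">II", data[16:24])
    return width, height

def oversized(native: tuple[int, int], displayed: tuple[int, int]) -> bool:
    """Flag an image whose native size exceeds its display size."""
    return native[0] > displayed[0] or native[1] > displayed[1]

# Example with a hand-built minimal PNG header (1200 x 800 pixels):
header = (b"\x89PNG\r\n\x1a\n" + struct.pack(">I", 13) + b"IHDR"
          + struct.pack(">II", 1200, 800))
print(png_dimensions(header))              # (1200, 800)
print(oversized((1200, 800), (600, 400)))  # True: serve a smaller copy
```

If an image's native dimensions are much larger than its display dimensions, resize it once in an editor rather than letting every visitor download the full-size file.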

Finally, you should consider offloading work to third parties where possible.  Many services will host your videos for free or at low cost; YouTube and Wistia are two great examples of sites to which you can offload heavy videos that would otherwise eat up server resources.  For smaller resources that don't make sense to upload to a third party, such as small PDF forms, it is best not to link to another site hosting the resource; instead, keep a copy on each site that needs it.  This avoids extra connections to outside hosts and ensures that no single site takes the brunt of the traffic from several others.

2. JavaScript/CSS issues

There are many different issues that can be caused by CSS and JavaScript files.  For instance, you may be linking to many external files, you may have ordered them improperly, you may not be deferring JavaScript, or the files you're using may not be minified.  All of these issues can slow down a site and negatively impact performance for the end user.  Fortunately there are many resources online that make it easier for you to fix these problems.

What you can do

Combine external JavaScript and CSS files and order them properly.  When a browser encounters a JavaScript file, it waits until the script has been downloaded and executed before moving on.  By ensuring that the CSS files are referenced before the JavaScript files, you ensure that the JavaScript does not block the parallel download of additional resources.  Combining and minifying the CSS and JavaScript files can save even more server resources.  This tool can help you combine and minify external JavaScript files while giving you additional information about any potential errors in your code, while this tool does the same for CSS files.  Finally, Google has a good tutorial on the hows and whys of JavaScript deferral.
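As a sketch of that ordering, a page's head might reference one combined stylesheet before one combined script, with the script marked defer so it downloads in parallel and runs only after the document has been parsed.  The file names here are placeholders:

```html
<head>
  <!-- CSS first, so stylesheets do not wait behind scripts -->
  <link rel="stylesheet" href="/css/site.min.css">

  <!-- One combined, minified script; "defer" downloads it in parallel
       and runs it only after the document has been parsed -->
  <script src="/js/site.min.js" defer></script>
</head>
```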

3. Many HTTP requests

Each time a page requests a resource from a different URL, it adds to the time it takes the page to load.  These contributions are most significant when the page's overall size is small, so as you optimize your site's size, this category of optimization becomes more important.  If you are linking to external resources in many different locations, have links that redirect, or have broken links, you will add to the overall load time your end user experiences.

What you can do

To minimize HTTP requests, you should serve combined CSS and JavaScript files from one location where possible, rather than grabbing external files from a hodgepodge of different URLs.  Additionally, ensure that you have no links that simply redirect the user to another location, and remove or fix any links to pages that return 404 or other 4xx error codes, as both of these add HTTP requests while providing no additional functionality.  You can use this tool to crawl your site looking for broken links.
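A very small sketch of what such a crawler does at each page, using only the Python standard library, is to parse the HTML and collect every link so each one can then be requested and its status code checked.  The sample page is made up, and the fetching and status-checking step is omitted:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A made-up page with two internal links to check:
page = """
<html><body>
  <a href="/contact.html">Contact</a>
  <a href="/old-page.html">Old page</a>
</body></html>
"""
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/contact.html', '/old-page.html']
```

A real link checker would then request each collected URL and report any that redirect or return a 4xx status.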

4. Large amounts of traffic

A server's resources are finite, meaning that there is a cap on the number of requests the server can accommodate at any given time.  Having more concurrent users requesting data from your site than it can handle will slow down or potentially even crash the site.  Setting aside malicious denial-of-service attacks and sudden enormous spikes in popularity from larger sites linking to yours, there are several types of traffic which you can analyze and prepare for.

General user access comes in two basic forms: a slow trickle and a huge flood.  Most web sites experience the former: a certain number of visitors a day, mostly spread out over time.  Perhaps there is a small spike around 6 pm when many people arrive home from work and log on to browse the internet, but for the most part traffic is constant throughout the day.  If a site gets most of its visitors in a short period of time, the server has to work harder to serve the same number of people than a site whose traffic arrives more evenly throughout the day.  Depending on your site's traffic patterns, you may therefore need more powerful hardware than other similar sites, and it is especially important to ensure that your server is powerful enough to handle the peak load, not just the rest of the day's traffic.

Robots, or web crawlers, are automated pieces of software that scour the web harvesting data.  A site can experience a performance impact even from well-intentioned robots, especially during peak load times.  Fortunately, there is a simple way to ask well-behaved bots not to browse parts of your web site: a text file called robots.txt.

What you can do

Robots.txt is a file that lives at the root of your web directory and gives robots detailed information about exactly which directories you do or do not want them to browse.  To access the robots.txt file in Accrisoft Freedom, go to Silver > Tools > Search Optimization > Robots & Spiders.  Details on how to structure your robots.txt file can be found here.
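As an illustrative sketch, a robots.txt that lets crawlers index the site while keeping them out of a couple of directories might look like this.  The directory names and the bot name are placeholders:

```text
# robots.txt, served from the site root (e.g. https://example.com/robots.txt)

User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Block one misbehaving crawler from the whole site:
User-agent: BadBot
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but malicious bots can simply ignore it.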

Once all of the other recommendations on this page have been followed, the only way to ensure that your site can run during peak load times is to give your server enough resources to handle them.

Ready to Get Started? Contact Us for Pricing