Best Bang for Buck: Improve Hosting Speed?


Name Trader (formerly @stub)
OK. What is the best bang for your buck to improve the speed of your VPS/dedicated hosting?

Change CPU
Add more cores
Add more RAM
Change to ECC RAM
Change HDD to SSD
Upgrade port to 1 Gbps
Anything else?
 
•••
The views expressed on this page by users and staff are their own, not those of NamePros.
Wouldn't it be better to write quality code once than to run one or more Apache modules on every page request to cure a crappy template at runtime? What nice advice, congrats.
Well, thank you, but I believe you haven't read what mod_pagespeed does. It turns your "quality code" (which, unbeknownst to you, may be a horrendous pile of garbage created by a paid template company) into what Google considers quality code.

It does this once for every possible device class (iOS, Android, IE, etc.). SSDs accompanied by Varnish do the trick quite well: Varnish builds the entire site and caches it in RAM, with the SSDs as a backup source to read from if there is too much traffic.
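For anyone wanting to try the Varnish side of this, a minimal sketch of that setup might look like the following (the backend host/port and TTL here are hypothetical; adjust to your stack):

```vcl
# /etc/varnish/default.vcl -- minimal sketch, assuming Apache listens on 127.0.0.1:8080
vcl 4.0;

backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_backend_response {
    # Keep rendered pages cached for an hour (example TTL only)
    set beresp.ttl = 1h;
}
```

The RAM-with-SSD-fallback behavior comes from the daemon's storage flags rather than the VCL, e.g. `-s malloc,1G` for a memory cache, optionally alongside file-backed storage such as `-s file,/var/cache/varnish/storage.bin,10G`.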

Once the main process builds all the files (compressed, static, etc.), which takes mere milliseconds, they are good until there is a page update. Since these are primarily minisites, that's rarely an issue.

Configured correctly, it will also push you to a PageSpeed score of 99/99, which is a small SERP bonus.
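For reference, enabling mod_pagespeed on Apache only takes a few directives; a minimal sketch (module and cache paths vary by distro):

```apache
# pagespeed.conf -- minimal sketch; paths are examples
LoadModule pagespeed_module modules/mod_pagespeed.so

ModPagespeed on
ModPagespeedFileCachePath "/var/cache/mod_pagespeed/"

# CoreFilters is the default rewrite level and covers minification,
# image recompression, CSS/JS combining, etc.
ModPagespeedRewriteLevel CoreFilters
```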
 
•••
Well thank you, but I believe you haven't read what mod_pagespeed does. It turns your "quality code" (which you may not know is a horrendous pile of garbage created by a paid template company), into what Google considers quality.

I'm sorry, but you probably misunderstood what I tried to say, which I admit is my fault.

I don't know how Google could examine your PHP (or whatever server-side code you use). What I meant was simply this: when your code is quality code, you don't need to install/configure/run/log/test dozens of Apache/WP/JS modules whose only purpose is to correct your crap code... (The code is not the HTML sent to the browser; it's a program written in your server-side language: PHP, Java, JS, Python, etc.)

Varnish/caching is, as Disney would say, a whole different world, and has nothing to do with putting JS in the head vs. the bottom of the page. If you need one more Apache module to accomplish that simple task, that alone says a lot about the quality of your code.
 
•••
I think this is a good learning opportunity for everyone to go over what mod_pagespeed accomplishes.

No, it does not fix your server-side scripting in languages such as PHP and Python, which can take forever to render the final output an end user sees. Once the output is rendered, it is cached server-side as a static file, never to be processed again across multiple threads the way it once was.

However, it does minify your static JS files to be compliant with PageSpeed, and it places them accordingly within the rendered code for both mobile and desktop systems.

The final output is static files: images in formats chosen for optimal speed (JPEG, PNG, WebP, etc.), CSS, JavaScript, and HTML, which are served instead of going through the 30-second loop you caused by having 20 active WordPress plugins and a poor theme. That said, this only happens once, unless the page changes.

Varnish and caching have nothing to do with putting JavaScript in the head or in noscript tags, unless it's deemed necessary there for above-the-fold functionality. Otherwise, all CSS and JavaScript is automatically moved to the footer to load on execution. A human could accomplish this with multiple tools (an online minifier, etc.), but it would take much longer to implement correctly enough to earn cookie points with Google.

If you'd like, I could toss up an example page in a second, and within three refreshes it will surpass what a human can do in eight hours (it doesn't hog all resources by doing everything at once).
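The JS/CSS placement behavior described above corresponds to specific mod_pagespeed filters, not all of which are in the core set; a sketch of opting in explicitly:

```apache
# These filter names come from mod_pagespeed's documented filter set;
# defer_javascript in particular is not enabled by CoreFilters
ModPagespeedEnableFilters defer_javascript
ModPagespeedEnableFilters prioritize_critical_css
ModPagespeedEnableFilters combine_javascript,combine_css
```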
 
•••
@David Walker, care to tell which site you are using to check page speeds? What do 95/95 or 99/99 mean? Thanks
 
•••
gtmetrix.com/ and webpagetest.org/ will serve you well.
 
•••
if you need to use one more apache module to accomplish that simple task that just says a lot about quality of your code

Most people don't write code. Most people buy solutions.

If someone complains that their commute from the suburbs to the city is too slow, your advice is to Stage-3 their commuter car... where others would say to take a park-and-ride train.
 
•••
Something left out here....
If you have images (like affiliate banners, etc.) served from elsewhere, that could be a big reason page delivery slows down.
Plus, ad networks like AdSense, Chitika, etc. can also slow down delivery of pages.
Just something to think about.
 
•••
I analyze my sites straight from Google. It gives me a better read than other tools such as Pingdom, since it provides constructive feedback on what Google wants for brownie points in their algorithm.

XX/XX refers to Mobile/Desktop. I think I've only seen one website with 100/100; that's nearly impossible, as you need to get below 200 milliseconds of latency, and DNS alone can take more than that, even with Anycast and a CDN.

Mobile can score lower because the test is picky about how close links are to each other: it may give the end user a bad experience if they're trying to tap "About Us" but keep landing on "Contact Us" (right next to it in the navigation).

There are several other factors beyond this that are analyzed on both platforms. Even though they don't let you select a server or device to test from, I believe they are testing from specific locations and devices as well. I've come to that conclusion because you must wait 30 seconds between tests, I see Googlebot accessing my site with different user agents, and the scores bounce around (+/-) even though nothing has changed.

A typical minisite will have a 96/98 score after everything Google suggests is corrected, and the remainder is out of my control. Google wants me to leverage browser caching, so I did. However, running it again, they want me to leverage browser caching on AdSense and Analytics too. In other words, they dock you points for not doing that (which you could, but it would be against the TOS). -_-
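For context, the "leverage browser caching" suggestion is usually satisfied on Apache with mod_expires; a minimal sketch (the TTLs here are illustrative examples, not Google's recommendation):

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Example TTLs only; tune per asset type
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Third-party scripts like AdSense set their own short cache headers on their own domains, which you can't override; hence the docked points.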

Either way, I see my position in the search engines rise after following their constructive feedback and fixing everything (in my case, doing it once and automatically, using the Apache module mod_pagespeed).

Going from my first tests in the mid-60s/70s to 98/99, I saw a one-to-three-page jump for certain keywords once the site was crawled again, and I've stayed in the same spots since.

It leads me to believe that Google puts page speed high on the pedestal, as it gives the user the best possible experience: everything they see loads first and quickly, while what they won't see for another second is still rendering.
 
•••
I have a VPS that runs multiple scripts per day for sorting through pre-release and dropping-domain lists. When I switched from HDD to SSD, processing times dropped from 2-3 hours per script to about 15 minutes each. Not sure how much bearing that has on your particular situation (sorry, didn't read the entire thread), but I thought it was worth mentioning that it can make a significant difference.
 
•••
mod_pagespeed is pretty powerful. Its output is pretty much illegible: it's not quality code, it's just smaller, so it transmits faster. It makes various other optimizations as well, based on Google's performance observations over the years. It's designed for Apache because sites on Apache tend to be the least optimized; developers who put time and expertise into optimization steer clear of Apache in the first place, since its multithreading techniques are primitive and outdated.

If you're looking to increase your speed, it really depends what sort of slowdown you're observing. Typically, growing websites/servers first start to notice problems with MySQL, especially when they're using large, heavyweight frameworks like WordPress. The key to optimizing MySQL in these cases is usually more RAM, but you can't just throw memory at it and expect it to work better: you have to tune your database and engine settings accordingly. In most cases it's important that you use InnoDB as the engine for each table, not MyISAM. It's also important that you allocate as much RAM as possible to the various InnoDB caches. Fine-tuning InnoDB/MySQL is a complex process best explained by people who specialize in that area; there are some nice blog posts about the topic floating around the internet. Note that under certain circumstances, mod_pagespeed and easy-to-use CDN services like CloudFlare can help take some of the load off your MySQL server.

Edit: I should note that tuning MySQL sounds scary at first, but it's pretty easy to do as long as you do your reading and follow the directions of the experts. Switching from MyISAM to InnoDB is also easy if you have something like phpMyAdmin.
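As a rough illustration of the tuning Paul describes, a my.cnf fragment might look like this (the values are placeholders; size the buffer pool to your available RAM and workload, not to these numbers):

```ini
# my.cnf fragment -- example values only
[mysqld]
default-storage-engine  = InnoDB
innodb_buffer_pool_size = 1G      ; biggest single win: keep hot data in RAM
innodb_log_file_size    = 256M
```

Switching an existing table over from MyISAM is a one-liner, e.g. `ALTER TABLE wp_posts ENGINE=InnoDB;` (table name hypothetical), which is also what phpMyAdmin does for you behind its engine dropdown.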
 
•••
Some great insight from David & Paul. Thanks for starting the thread, Stub.

I am responding here to bookmark this :)
 
•••