Search engines increasingly factor the actual loading speed of a page into their search results.
Are you prepared for that? Fast web pages are a must today if you want to keep up with your competitors. Information can be found everywhere, and whoever "serves" this information fastest wins.
Pagespeed measures only the main domain (https://yourdomain.de),
not subpages such as https://yourdomain.de/analyse.html or https://yourdomaindomain.de/whatelse/.
Subfolders and submissions of individual files are filtered out.
Tests by Google™ have shown that pages with a loading time of more than 4 seconds are abandoned immediately 90% more often. The speed tests are run from a dedicated server in Germany with a 120 Gbit connection.
The results are colour-coded. Green values are generally good; red values signify high loading/latency times and may indicate a large data volume and/or a slow response time. Excessive data volume in individual areas is flagged as well.
Fast premium hosting is available from SEO NW (Cool Webhosting), with which you, too, could improve your results.
Much of the information given by pagespeed.de can be applied to your own website. If you act on it, the site will load faster and visitors will have a better browsing experience. Your Google Pagespeed Score should then also increase; Pagespeed, however, displays more data and points out the weaknesses of your own website. Beyond that, we recommend a premium hoster. These cost a bit more, but may also improve your results on Google and Pagespeed.
We develop continuously. A lot of data is displayed, especially in the domain archive. If you would like a new feature, feel free to contact us. As we are actively working on the script, it can take some time until pagespeed.de is adapted to your wishes.
A recent feature is the ability to ban the pagespeed.de crawler. We respect it if you do not want your website to be tested, so there are instructions for prohibiting the pagespeed.de crawler from testing your site. We regret this, but it is your right: you only need to add two lines to your robots.txt and the pagespeed.de crawler will no longer visit your website. The crawler adheres to robots.txt. Nevertheless, we recommend running the Pagespeed test and taking the data into account; should your website be slow, you will find valuable advice in the archive for increasing its speed.
The entry "User-agent: pagespeedbot" prohibits pagespeed.de from reading your URL. Please be careful with robots.txt and test the whole thing, so that you do not ban all crawlers/bots from your page, for instance through a wildcard.

User-agent: pagespeedbot
Disallow: /

With these two lines, only our bot will stay away from your page. If you write "User-agent: *" instead, ALL bots are banned from your page (if they adhere to it...), including the Google Bot. Be careful with that!
We also want to make the web faster. Take the advice you find on pagespeed.de to heart and provide a pleasant experience for the visitors of your website. Fast websites will give you and your visitors the necessary "aha" moment. Stay one step ahead of your competitors!
A ping is the simplest and quickest way to check whether a server is reachable. The speed measured here (in ms), however, reflects only the network connection; the value says nothing about the server's performance. If a ping does not get through, either the server is unreachable or the hoster is blocking the ping protocol.
A ping can also be blocked by an entry in the firewall.
Within Germany, server-to-server ping times should be under 20 ms; for servers abroad, values around 30 ms count as acceptable.
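A real ICMP ping needs raw sockets (and usually root rights), so the following minimal Python sketch measures the TCP connect time instead, which is a similar pure-network round trip. The demo host and port are placeholders, not part of the original text.

```python
import socket
import time

def tcp_ping(host, port=80, timeout=2.0):
    """Return the TCP connect time in milliseconds as a rough
    stand-in for an ICMP ping (network round trip only, no
    statement about the server's application performance)."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close immediately
    return (time.perf_counter() - start) * 1000.0

# Demo against a local listening socket, so the sketch runs without
# touching the network; in practice you would call e.g.
# tcp_ping("example.org") and compare the result with the ~20 ms guideline.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
ms = tcp_ping("127.0.0.1", server.getsockname()[1])
server.close()
print(f"connect time: {ms:.2f} ms")
```

If the connection attempt fails entirely (server down, or the port is filtered), `create_connection` raises an OSError, matching the "ping does not get through" case above.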
Latency: the period between the start of a trigger and the appearance of a visible/measurable reaction to it (see Wikipedia). Latency says more about the performance of a server; here the response of the web server is examined.
Poor values here can be caused by an overloaded web server, but also by an SQL database that keeps the server waiting.
The latency should be at most 2-3 times the ping time to indicate a quick server response.
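The 2-3x rule of thumb can be checked by timing both the raw connect and the arrival of the first HTTP response bytes. This is a hypothetical sketch using only the standard library; host, port, and path are placeholders.

```python
import time
from http.client import HTTPConnection

def connect_and_ttfb(host, port=80, path="/", timeout=5.0):
    """Return (connect_ms, ttfb_ms): the TCP connect time (network
    round trip) and the time until the HTTP response headers arrive
    (network plus server/database work)."""
    start = time.perf_counter()
    conn = HTTPConnection(host, port, timeout=timeout)
    conn.connect()
    connect_ms = (time.perf_counter() - start) * 1000.0
    conn.request("HEAD", path)
    conn.getresponse()  # returns once the response headers arrive
    ttfb_ms = (time.perf_counter() - start) * 1000.0
    conn.close()
    return connect_ms, ttfb_ms

# Usage sketch (real host assumed, so left commented out):
# connect_ms, ttfb_ms = connect_and_ttfb("example.org")
# healthy = ttfb_ms <= 3 * connect_ms   # the 2-3x guideline from the text
```

If `ttfb_ms` is far above three times `connect_ms`, the delay is happening on the server side (overload, slow SQL queries) rather than on the network.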
Generally the front page of a domain is measured, and the front page should always receive special attention. News magazines such as Focus.de pack their front pages to the brim and make visitors load a lot of data. Smaller sites should be more diligent, because long loading times increase the risk that visitors leave very quickly. In our test we therefore also pay attention to data volume and speed.
A fast connection naturally delivers more data within the test period, which is put into relation. A CDN can speed things up further.
Today hardly any web page can exist without images and graphics. But you should not overdo it in this area either, because many images, possibly quite large ones, keep a website from loading fast. Every image means an HTTP request, and each request requires bandwidth.
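Since each image costs a request, a quick first estimate is simply counting the img tags in a page. A small sketch with Python's standard HTML parser; the sample markup is made up for the demo:

```python
from html.parser import HTMLParser

class ImageCounter(HTMLParser):
    """Count <img> tags in an HTML document; each one normally
    costs the browser one extra HTTP request."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.count += 1

html = """<html><body>
<img src="logo.png"><p>text</p><img src="banner.jpg">
</body></html>"""
counter = ImageCounter()
counter.feed(html)
print(counter.count)  # 2
```

This ignores CSS background images and scripts, so it is a lower bound on the real request count, but it makes the cost of image-heavy pages visible at a glance.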
Many test pages recommend combining graphics into so-called sprites. The advantage is that only one graphic has to be loaded; the big disadvantage is that the CSS must then specify the coordinates of every individual sub-graphic, which the average webmaster/blogger usually cannot manage. Sprites are therefore not very suitable for everyday use. (For example, Bilder.Rocks is a larger picture website.)
Some relief is offered by Google™, from which you can, for example, also load current AJAX libraries from external sources.
HTML is practically the base language of all web pages. Without HTML a browser would not know how to display the page. While formatting used to be defined inside the HTML code, this is now handled by definitions in so-called style sheets (CSS). Style sheets, too, can be combined into one large file loaded in the header instead of many small ones.
The HTML code of a front page should stay in the region of 50-100 KB; anything beyond that is either overloaded or so-called spaghetti code, meaning style information sits in the HTML code and not in the style sheet where it belongs. Formatting in HTML code is not saved in the cache! Optimized websites have an advantage with search engines (see https://versicherungen.rocks).
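The 50-100 KB guideline is easy to check mechanically. This minimal sketch classifies a page's raw HTML payload against that rule of thumb; the limit and the dummy page are assumptions for the demo, not values from pagespeed.de itself:

```python
def html_size_verdict(html, limit_kb=100):
    """Classify a front page's HTML payload (bytes) against the
    ~50-100 KB rule of thumb from the text."""
    kb = len(html) / 1024
    if kb <= limit_kb:
        return f"{kb:.1f} KB - ok"
    return f"{kb:.1f} KB - overloaded; check for inline style information"

# Demo with a synthetic ~39 KB page:
page = b"<html>" + b"x" * 40_000 + b"</html>"
print(html_size_verdict(page))
```

In practice you would feed in the downloaded front-page bytes; a page far over the limit is a strong hint that formatting belongs in an external, cacheable style sheet.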
Obviously, online shops live from a wide range of articles and a choice of various providers. Whoever sells more than just their own products in a shop depends on external data. Generally, such data should be refreshed at times of low traffic (at night, for example). Only if the data is loaded from a local database or local XML files can a quick server response be ensured. Permanently polling live for new data can be extremely harmful to good performance.
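The "refresh at night, serve locally" pattern can be sketched as a small file-backed cache: page requests read a local JSON file, and the slow remote fetch runs only when the cache is missing or stale. File name, maximum age, and the fetcher are hypothetical placeholders:

```python
import json
import os
import time

CACHE_FILE = "articles_cache.json"   # hypothetical local cache path
MAX_AGE = 6 * 60 * 60                # refresh at most every 6 hours

def load_articles(fetch_remote):
    """Serve article data from a local JSON cache; call the (slow)
    remote fetcher only when the cache is missing or stale."""
    fresh = (os.path.exists(CACHE_FILE)
             and time.time() - os.path.getmtime(CACHE_FILE) < MAX_AGE)
    if fresh:
        with open(CACHE_FILE) as f:
            return json.load(f)
    data = fetch_remote()            # e.g. triggered by a nightly cron job
    with open(CACHE_FILE, "w") as f:
        json.dump(data, f)
    return data

# Demo with a stand-in for the external provider feed:
items = load_articles(lambda: [{"id": 1, "name": "demo article"}])
print(items)
```

Every request after the first is then answered from the local file, so visitors never wait on the external provider.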
Blogs live from their layouts and the plug-ins they use. If you use elaborate layouts or have installed many plug-ins, you have to live with a sharp rise in latency, especially if external data is loaded, for example Twitter streams (the Twitter API is very slow). In addition, outsourcing image galleries to external services can cause hold-ups. Live searches, for example on Amazon™ for certain keywords, can also delay page loading to some degree.
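One way to keep a slow third-party feed from stalling the whole page is a hard timeout with a fallback: if the external service does not answer in time, serve a cached or empty widget instead. A minimal standard-library sketch; the URLs and timeout are assumptions:

```python
import socket
import urllib.request

def fetch_with_fallback(url, timeout=1.5, fallback=b""):
    """Load an external widget/feed with a hard timeout so a slow
    third-party API cannot stall page generation; on any network
    error, serve the fallback instead."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read()
    except OSError:                  # covers timeouts and URLError
        return fallback

# Demo: a local port nothing listens on refuses the connection,
# so the fallback is served instead of blocking the page.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
free_port = probe.getsockname()[1]
probe.close()
body = fetch_with_fallback(f"http://127.0.0.1:{free_port}/",
                           timeout=0.5, fallback=b"<!-- widget unavailable -->")
print(body)
```

The page then degrades gracefully: visitors see the content without the widget rather than waiting on the external API.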
Whoever places advertising on their pages automatically builds brakes into the site. If you cannot or do not want to do without ads, you should pay attention to how fast a provider delivers its advertisements. Google™ AdSense, for example, is always a good choice: its servers deliver banners promptly and hardly delay page rendering. You should not place more than three banners per page (Google™ does not permit more anyway).
Statistics should, where possible, be compiled internally (local plug-ins, Piwik) and not outsourced to external providers. Counter providers from the USA in particular weaken a website's performance considerably. Another Google™ service worth noting here is Analytics: its tracking servers are spread worldwide and hardly slow down page loading. (Mind data protection!)