Instructions: How to get a 100/100 Google PageSpeed score
What is Google PageSpeed?
Google PageSpeed is an online tool that measures the performance of your website against a wide variety of criteria, now distinguishing between mobile and desktop speed. PageSpeed also tests the site for "usability". Beyond the better user experience, a good PageSpeed score also has a positive effect on your page's ranking in Google search, which makes the optimization doubly worthwhile.
It is important to say right from the start that optimizing the score to 100/100 is just a proof of concept or experiment. It is good and important to optimize for a high score, but since the rules for reaching 100/100 are so strict and sometimes arbitrary, a perfect score cannot and should not be the goal. Some pages simply cannot be pushed to full marks without severely restricting their functionality. Nevertheless, you can learn a lot about page speed optimization from this experiment.
In my experience, an average page that is not explicitly optimized for speed but does not contain any serious errors will achieve a pagescore between 60 and 75. In general, I would consider a score in the mid-90s to be the goal for most projects.
At first I only optimized the home page of my blog to 100/100. My blog runs on the Kirby CMS - some optimizations therefore also relate directly to this system. It is therefore possible that not all pages will achieve a pagescore of 100 with the techniques described.
I have organized most of the sections the same way Google names the error sources in its page analysis, although some techniques will overlap in content.
The starting position and the existing error sources are of course different for each page, so you may still run into problems that I cannot cover here because I did not encounter them myself before the optimization.
If you generally value the performance of your site, a basic requirement is of course not to host it with the cheapest shared host. For advanced users, I recommend Uberspace. If Google notices during the analysis that the server responds slowly, this is flagged as "Reduce server response time".
Optimize images
To better gauge the effect of image size on the score, I tested three far-too-large photos that were scaled down with CSS. With just those three photos, my score dropped to 33/100. This shows how extremely important image optimization is for page speed.
The basic rule: store every image only in the size at which it is actually used, i.e. do not scale oversized images down via CSS. In Kirby, I use the built-in thumbnail function for almost all images. On top of that, one of the many image optimization tools can shrink all images a little further without quality loss.
It gets more complicated with retina images, where you have to make sure you serve the images in the correct sizes. But that is a whole topic in itself that I will cover separately.
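As a sketch of the rule above: serve the image at the size it is displayed, instead of squeezing a large original down with CSS (the file names and dimensions here are made up for illustration):

```html
<!-- Bad: a 2400px-wide original downscaled by CSS -->
<img src="photo-original.jpg" style="width: 400px" alt="Photo">

<!-- Good: a pre-generated 400px thumbnail served at its natural size -->
<img src="photo-400.jpg" width="400" height="267" alt="Photo">
```

The second variant transfers only the bytes the layout actually needs.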
Use browser caching
Browser caching in and of itself is very easy to set up if the server supports it. On an Apache server, the caching rules are defined in the .htaccess file. They define how long the browser should keep downloaded files. This doesn't help on the first visit to the site, but it does on subsequent visits. For static files that do not change frequently, the lifetime can therefore be raised to a month, for example. Here is an example from my own configuration:
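A typical mod_expires rule of this kind looks roughly like the following (the file types and durations are illustrative, adjust them to your own assets):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Cache static assets in the browser for one month
  ExpiresByType image/jpeg             "access plus 1 month"
  ExpiresByType image/png              "access plus 1 month"
  ExpiresByType text/css               "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```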
With this rule, all local caching problems were eliminated for me. If you use static files in a different format, you can add them to the rule.
Minify CSS
Minifying CSS files has become widely accepted by now. Preprocessors or dedicated tools transform the CSS so that all comments, line breaks, and spaces are removed to save bytes. For development, you of course keep working with a non-minified, documented version, and ship the minified one. In addition, individual CSS files should be combined into one file to reduce the number of HTTP requests. This point is easy to tick off.
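To illustrate what minification actually does (in practice you would use a preprocessor or a tool such as cssnano or csso, not hand-rolled code), a toy sketch in Python:

```python
import re

def minify_css(css: str) -> str:
    """Naive CSS minifier: strips comments and collapses whitespace.

    Toy illustration only -- real minifiers also handle strings,
    calc() expressions, and many edge cases this does not.
    """
    # Remove /* ... */ comments
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)
    # Collapse runs of whitespace (incl. line breaks) to one space
    css = re.sub(r"\s+", " ", css)
    # Drop spaces around punctuation that never needs them
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)
    return css.strip()

# minify_css("body {\n  color: red; /* brand */\n}") -> "body{color:red;}"
```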
Eliminate render-blocking CSS above the fold
It gets more complicated when it comes to processing the CSS files. Google suggests splitting your CSS into a "critical" part, which styles everything visible in the above-the-fold area, and a "rest". The "critical" CSS is then inlined in a <style> tag in the <head>, so that no request is triggered and that area can be rendered immediately. The rest of the CSS is loaded from an external file, which, however, is linked not in the <head> but just before the closing </body> tag. According to the HTML5 specification, this is allowed, even though it is almost never done.
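In outline, the resulting markup looks like this (the styles and the file name are placeholders):

```html
<head>
  <style>
    /* "critical" above-the-fold CSS, inlined so no request is needed */
    body { margin: 0; font-family: sans-serif; }
  </style>
</head>
<body>
  <!-- page content -->

  <!-- the rest of the CSS, loaded just before the closing body tag -->
  <link rel="stylesheet" href="rest.css">
</body>
```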
As you can imagine, extracting the critical CSS is tedious and inflexible work. It may be feasible for some projects, but not for many. I settled on a middle ground on my site: since my entire CSS file is only 13 KB minified, I decided to embed it completely inline. I do that in Kirby via a plug-in that provides a "critical" function; a minified version of a stored SCSS file is then automatically inlined. All other CSS resources (currently only the CSS file for my lightbox) I have integrated before the closing </body> tag as described above. For larger and more complex projects, I would definitely accept the minimal loss of speed and include the CSS resources in the classic way.
Enable gzip compression
Compression works as follows: data sent from the server to the browser is automatically compressed by the server software and decompressed again by the browser, in real time. In total, this can save a few kilobytes of transferred data. As long as the server supports compression and lets you configure it, there are no disadvantages. On Apache servers, compression for static files can likewise be defined in the .htaccess:
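A common mod_deflate rule for static files (the MIME types listed are an example, not my exact configuration):

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript image/svg+xml
</IfModule>
```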
The compression of dynamic PHP scripts is usually a bit more complicated. For the setup on Uberspace, I've already written an article about it here. There you will also find a link to an online tool that you can use to test whether the compression works.
Avoid external resources or: The last points are the most difficult
Google Analytics (analytics.js)
Google penalizes the short cache lifetime of the externally hosted analytics.js under "Use browser caching", so the only way to full marks is to host a copy of the file yourself. The unpleasant thing is, of course, that Google updates the code of the file a few times a year, and a self-hosted copy logically does not receive these updates. The only remedy is a PHP script, run automatically and regularly via a cron job, that compares the self-hosted file with Google's and updates it if necessary. Unfortunately, this method is unavoidable if you want to reach 100 points, but in real operation I would forego this adjustment as well.
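A minimal sketch of such an update script, here in Python rather than PHP (the local file path is an assumption; run it e.g. once a week via cron):

```python
import hashlib
import urllib.request
from pathlib import Path

ANALYTICS_URL = "https://www.google-analytics.com/analytics.js"
LOCAL_FILE = Path("assets/js/analytics.js")  # hypothetical path on your server

def needs_update(local: bytes, remote: bytes) -> bool:
    """True if the self-hosted copy differs from Google's version."""
    return hashlib.sha256(local).digest() != hashlib.sha256(remote).digest()

def update_analytics() -> bool:
    """Fetch Google's analytics.js and overwrite the local copy if it changed."""
    with urllib.request.urlopen(ANALYTICS_URL) as resp:
        remote = resp.read()
    local = LOCAL_FILE.read_bytes() if LOCAL_FILE.exists() else b""
    if needs_update(local, remote):
        LOCAL_FILE.write_bytes(remote)
        return True
    return False

# update_analytics()  # call this from the cron job
```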
There is a similar technique for Facebook and Google+ share buttons, which loads the external resources only after a visitor has clicked one of the buttons.
With these adjustments I achieved a PageSpeed score of 100/100 for mobile & desktop. On top of that, 100 points for the user experience on mobile devices, which has nothing to do with performance, only with good responsive design.