Can improving performance for SPA clients negatively affect SEO? Split test says…

The Context:

To tackle the performance and infrastructure issues caused by the increasing number of Single Page Applications, Google engineers announced a small revolution in SEO at Google I/O last year.

Webmasters are encouraged to use dynamic rendering – serving different content based on user-agent detection (search engines can get fully server-rendered HTML, while clients get a hybrid HTML/JS or a pure JS version).

https://developers.google.com/search/docs/guides/dynamic-rendering
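As an illustration, the user-agent detection behind dynamic rendering can be sketched like this (a minimal sketch: the bot list and function names are assumptions for illustration, not Google's or PriceRunner's actual implementation):

```javascript
// Illustrative crawler patterns – a real deployment would use a maintained list.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /yandexbot/i, /duckduckbot/i];

function isBot(userAgent) {
  // Treat a missing user agent as a regular client.
  return BOT_PATTERNS.some((re) => re.test(userAgent || ""));
}

// Pick the rendering strategy for an incoming request:
// bots get the fully server-rendered HTML, humans get the hybrid HTML/JS app.
function renderingModeFor(userAgent) {
  return isBot(userAgent) ? "full-server-render" : "hybrid-html-js";
}
```

A server would call `renderingModeFor(req.headers["user-agent"])` per request and route to the matching renderer.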

At PriceRunner, where we run React.js, we have been really excited by the performance improvements this policy change opens up. Logical improvements allowed by dynamic rendering would be:

  • a. For search engines (SE): serving a 100% server-rendered version / no JS.
  • b. For users: serving an above-the-fold server render + JS.

We are in the habit of testing Google recommendations before scaling them, so this time, we tested alternative b.:

The test:

Can improving performance for clients (by server rendering solely above the fold) negatively affect SEO?

Test Variant (Y): Clients only get the above-the-fold server render + JS

Control (X1): Clients get our standard hybrid rendering*

SE bots: Bots get the same version for both control and variant: our standard hybrid rendering*

*Our standard hybrid rendering means: complete server rendering of the page + JS

Setup of the experiment:

  • Test variant: URLs representing ~50% of visits
  • Control group: the rest of the URLs, representing ~50% of visits
  • Search engines: get the standard hybrid rendering for all URLs

Scope: DK, category TV(2). We split-tested product pages based on the last digit of their product ID, creating two groups of equivalent traffic with a similar distribution of traffic between the pages within them.

Test group URLs: clients get only the above-the-fold server rendering // search engines get our normal hybrid rendering
Control group URLs: clients get our normal hybrid rendering (100% server rendering of the page) // search engines get the same normal hybrid rendering
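The split into two roughly equal groups from the product ID could be sketched like this (a hedged sketch: the even/odd rule and function name are assumptions – the post does not publish the exact bucketing rule, only that it was derived from the product ID):

```javascript
// Assign a product page to the test or control group from its product ID.
// An even/odd split on the last digit yields ~50/50 traffic if IDs are
// uniformly distributed (an assumption for this sketch).
function bucketForProduct(productId) {
  const lastDigit = Number(String(productId).slice(-1));
  return lastDigit % 2 === 0 ? "test" : "control";
}
```

The rule is deterministic per URL, so a page always stays in the same group for the whole experiment – a requirement for comparing the two time series later.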

Output of the experiment:

The data is available as an attachment.

Analysis of the experiment:

We used Google's CausalImpact to estimate the causal effect of our intervention on our test time series (test variant y).

The model assumes that the outcome of the time series can be explained by a set of control time series that were not affected by the intervention (our control group – covariate x1).

One month into the test, the predicted time series (Predicted) outperforms our test series (y) by 6.1%, with a standard deviation of 2.5% and a posterior probability of causal impact of 99.1%.
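As a sanity check on the headline number, the relative effect can be computed from the summed series. A minimal sketch (the function name, the convention of dividing by the observed total, and the numbers in the usage example are illustrative assumptions – CausalImpact reports its own relative effect directly):

```javascript
// Relative lift of the counterfactual prediction over the observed series,
// aggregated over the whole post-intervention period.
function relativeEffect(predicted, observed) {
  const sum = (xs) => xs.reduce((a, b) => a + b, 0);
  return (sum(predicted) - sum(observed)) / sum(observed);
}
```

For example, a predicted total of 1061 visits against an observed 1000 gives a relative effect of 0.061, i.e. the prediction outperforms the test series by 6.1%.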

Conclusion:

So in short, we can assume we would lose about 6% of organic traffic by scaling the test up to our entire site, for at least a month. We will keep the test running to see whether the cost dampens in the second month. We will also run a control experiment to check that we get the same negative trend.

Server rendering only above the fold for clients may still not be a bad idea for mobile users: we might be able to compensate for the organic traffic loss by providing a major performance improvement to users. Maybe the subject of a next blog post.

Finally, we are going to test:
– the other alternative (a.), serving only 100% HTML to bots (by removing the initial state)
– the combination of a. and b., which might surprise us.


AWR Cloud script debugged – SEO

Advanced Web Ranking (AWR) Cloud script debugged: I have debugged the [get ranking] API call and it works fine on WampServer 2.4.

Observation: don't forget to enable allow_url_fopen in your php.ini.
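The relevant directive (the script fetches remote content with file_get_contents(), which needs it):

```ini
; php.ini — allow fopen wrappers such as file_get_contents() over HTTP
allow_url_fopen = On
```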

 
<?php
// NOTE: the opening of this script (the AWR Cloud [get ranking] API call that
// downloads each day's ranking export into tempfile.zip) was cut off in the
// original post; $dates and the download step below are placeholders.
$dates = array('2019-01-01');                 // dates you want rankings for
$path_to_extracted_json = 'extracted_json/';  // where the json files are unzipped

foreach ($dates as $the_date) {
    // ... AWR Cloud API call for $the_date goes here, saving tempfile.zip ...

    if (file_exists('tempfile.zip')) {
        $zip = new ZipArchive();
        $res = $zip->open('tempfile.zip');
        if ($res === TRUE) {
            $zip->extractTo($path_to_extracted_json);
            $zip->close();
        } else {
            echo "Could not extract json files from the zip archive";
            continue;
        }

        $dir_handle = opendir($path_to_extracted_json);

        // Now we can read each extracted json file and output the results.
        while (false !== ($entry = readdir($dir_handle))) {
            $entry = $path_to_extracted_json . $entry;
            if (!is_file($entry)) {
                continue;
            }

            // Use associative arrays (second argument true) since the json
            // file contains nested json objects.
            $rankings = json_decode(file_get_contents($entry), true);

            // Also available: $rankings["searchengine"], $rankings["depth"],
            // $rankings["location"].
            echo "Keyword: " . $rankings["keyword"] . "\n";
            $rank_data_array = $rankings["rankdata"];

            foreach ($rank_data_array as $rank_data) {
                // stripos() can return 0 for a match at position 0, which is
                // falsy, so compare against false explicitly.
                if (stripos($rank_data["url"], "pricerunner") !== false) {
                    echo $rank_data["position"] . ". " . $rank_data["url"]
                        . " result on page " . $rank_data["page"] . "\n";
                }
            }
        }
        closedir($dir_handle);
    } else {
        echo "No results for date=" . $the_date;
    }
}
?>

More info on the AWR Cloud API can be found on their site. The Update action allows you to tag your keywords and is very handy, but I haven't tested that part of the script yet. Good coding!

http://www.advancedwebranking.com/online/developer-api.html