
Google Analytics Goes Down – What Does This Mean For Us?

Many of you may have experienced outages with certain Google tools yesterday morning. Google Analytics, Tag Manager and Optimize are among the platforms that experienced issues, causing disruption for marketers. These outages were the result of a major problem with Google’s Cloud Platform, which was reportedly down for 1 hour and 39 minutes. Does this issue reflect a failure in the Google Analytics infrastructure? And what does it mean for us?

Although it was only down for about an hour and a half, it still had a big impact. Issues like this can cause deadlines to be missed, daily analysis to be interrupted and overall productivity to dip.

What Caused It?

It is still unclear what caused the disruption, as Google has stated that it is still carrying out internal investigations. The tools that went down include Analytics, Tag Manager and Optimize, which could point to a shared flaw in the underlying infrastructure.

For now, the disruption appears to be limited to a few tools being temporarily unavailable. Let’s hope the issue doesn’t lead to any data gaps in reports, as we saw with the Search Console bug this April. Time will tell whether the downtime has caused any other problems.

What Can We Take From This?

As marketers, you’ll understand how frustrating it is when an outage like this occurs. We need to be able to access these tools in order to carry out daily tasks, and working in a fast-paced industry means we need reliable tools. Overall, the disruptions didn’t last too long, so they shouldn’t impact us too much – unless any data has been lost. We’ll keep an eye out for any further updates.

Did you experience any issues with Google Analytics tools yesterday? Tweet us with your thoughts.

Stop Using Robots.txt Noindex By September

Let’s all get prepared ahead of time, ready for the 1st of September, when Google will officially stop supporting noindex in the robots.txt file.
Over the past 25 years, robots.txt files have been widely used across the internet as an unofficial standard for making crawling easier. Although the approach was never formally adopted as a web standard, Googlebot generally follows robots.txt rules to decide whether to crawl and index a site’s pages or images, whether to follow links and whether or not to show cached versions.

It’s important to note that robots.txt files only act as a guide and can’t force spiders to obey the requests. However, Google has announced that it plans to completely stop supporting the use of noindex in the robots.txt file. So, it’s time to adopt a new way of instructing robots not to index the pages you don’t want crawled and indexed.

Why is Google stopping support for noindex in robots.txt?

As previously mentioned, the robots.txt noindex rule isn’t considered an official directive. Despite being unofficially supported by Google for the past quarter of a century, noindex in robots.txt is often used incorrectly and has failed to work in 8% of cases. Google’s decision to standardise the Robots Exclusion Protocol is another step towards making crawl control more predictable, and the standardisation also prepares the ground for potential open source releases in the future, which won’t support unofficial rules such as noindex in robots.txt. Google has been advising for years that users should avoid relying on noindex in robots.txt, so this change, although a major one, doesn’t come as a big surprise to us.

What Other Ways Can I Control The Crawling Process?

To get prepared for the day Googlebot stops following noindex instructions in the robots.txt file, we need to adopt different processes so that we can still control crawling as much as possible. Google has provided a few alternative suggestions on its official blog; however, the two we recommend for keeping pages out of the index are:
• Robots meta tags with ‘noindex’
• Disallow in robots.txt

Robots meta tags with ‘noindex’

The first option we’re going to explore is using noindex in robots meta tags. In brief, a robots meta tag is a small piece of code placed in the header of a web page. This is the preferred option, as it carries at least as much weight as robots.txt noindex and is highly effective at stopping URLs from being indexed. Using noindex in a robots meta tag still allows Googlebot to crawl your site, but it prevents the URLs from being stored in Google’s index.
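As a quick illustration, the tag is a single line placed inside the page's <head>; the snippet below is simply the standard form of the tag:

<!-- Tells compliant crawlers not to include this page in their index -->
<meta name="robots" content="noindex">

Bear in mind that Googlebot needs to be able to crawl the page in order to see the tag, so don't also disallow the same URL in robots.txt if you want the noindex to take effect.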

Disallow in robots.txt

The other method for keeping pages out of the index is to use disallow in robots.txt. This directive tells robots not to visit or crawl the specified URLs, which in most cases means they won’t end up being indexed (although a disallowed URL can still appear in the index if other sites link to it).

A disallow to exclude all robots from crawling the whole site should look like this:
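Based on standard robots.txt syntax, a minimal sketch would be:

# Applies to every crawler
User-agent: *
# Block the entire site from being crawled
Disallow: /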

A disallow to exclude one particular robot from crawling the whole site should look like this:
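Again a minimal sketch of the standard syntax; "ExampleBot" is simply a placeholder for whichever crawler's user-agent you want to target:

# "ExampleBot" is a placeholder user-agent name
User-agent: ExampleBot
# Block this one robot from the whole site
Disallow: /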

A disallow for certain pages not to be crawled by any robots should look like this:
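A minimal sketch of the standard syntax; the page paths here are hypothetical and purely illustrative:

User-agent: *
# Hypothetical pages used for illustration only
Disallow: /old-page.html
Disallow: /draft-page.html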

To exclude just one folder from being crawled by all robots, a disallow should look like this:
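And a minimal sketch for a single folder; the folder name is again hypothetical:

User-agent: *
# Hypothetical folder used for illustration only
Disallow: /private-folder/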

Important things to bear in mind

There are some important things to keep in mind when using robots.txt to request that pages are not crawled or indexed:
• Robots can ignore the instructions in your robots.txt. Malware robots, spammers and email address harvesters are particularly likely to do so, so think carefully about whether what you’re blocking is something that genuinely shouldn’t be seen by all robots.
• Robots.txt files are not private, which means anyone can see which parts of your site you don’t want robots to crawl. So remember: you should NOT be using disallow in robots.txt as a way to hide sensitive information.

And over to you

We’ve given you an overview of our two recommended alternatives for keeping pages out of the index. It’s now up to you to implement a new method ahead of the 1st of September so that you’re prepared for Google to stop supporting noindex in robots.txt. If you have any questions, make sure to get in touch with us.

Sign up for our newsletter at the bottom of this page and follow us on Facebook and Twitter for the latest updates.

Google Search Console Data Outage

As digital marketing professionals, we rely heavily on Google Search Console to extract important information and to better understand how websites are performing in Google. However, the recent data outage in Google Search Console, which resulted in a major loss of data, means we should be more concerned about the reliability of Google’s reporting channels.
It was the 5th of April when Google reported an indexing bug, which hit 4% of Google’s indexed pages. Four weeks on, the bug has finally been resolved. However, despite the issue being fixed, many users have noticed bad or missing data in their Search Console reports. Not only does this major indexing issue negatively impact our analysis, it also raises an important question about the reliability of Google.

Ineffective April Reports

The data outage means that reports for April cannot be deemed accurate, which is extremely problematic for Google Search Console users. Without correct data for the majority of April, users are unable to determine whether any of their website pages were affected by the indexing bug or whether any other major changes occurred.
Following on from this, users are unable to use their inaccurate April reports from Google Search Console to improve the optimisation of their websites. A data loss of this scale sets marketing professionals back significantly: this data is central to understanding how websites perform in Google’s search results and is a major part of planning the optimisation process.

Can we really rely on Google?

An important question we should ask is: was the bug acting randomly or systematically? If the bug was systematically targeting certain sites, that raises the possibility that Google could be testing a new algorithm. The fact that the bug took so long to resolve also calls the reliability of Google’s data channels into question. How can we fully trust a medium that is unable to resolve a bug more efficiently?
Though many marketing professionals rely on Google’s tools as their main source of data, the recent bugs should lead users to question the reliability of Google’s software. The de-indexing bug highlights the importance of using a variety of channels, so that you not only have enough data to work with and optimise, but also have the traffic to minimise the impact should Google encounter another bug in the future.

Sign up for our newsletter at the bottom of this page and follow us on Facebook and Twitter for the latest updates.

Technology for Marketing 2018

Bell Digital & Wizaly will be exhibiting at Technology for Marketing this September, the UK’s only event dedicated to martech, featuring 100 marketing specialists and a programme of over 250 expert speakers across 11 theatres.

At our stand, we'll be showcasing our Algorithmic Attribution platform and our Digital Marketing Services, offering the industry solutions to maximise ROI.

Moreover, our Head of Data Analytics, Ben Johnston, will be speaking about how to improve on- and offline marketing performance with algorithmic attribution, and will talk you through the solutions we implemented for Cambria Automobiles.

What is TFM?

Technology for Marketing is the UK's only event dedicated to martech: one spot for marketers to nurture their next big idea and gather actionable inspiration from marketing experts who’ll share their glimpse of the future. It’s a great opportunity for networking and for accessing content and new features not seen before.

 

When?

26th and 27th September 2018.
Wednesday, 26th September 2018: 9:30 - 17:00.
Thursday, 27th September 2018: 9:30 - 16:30.

Speaking Session

Understanding On And Offline Marketing Performance With Algorithmic Attribution.

Join us to learn how one of the UK’s leading car dealership groups is able to truly understand the effect its online marketing has on the sales happening on each and every one of its forecourts. Ben Johnston from Bell Digital & Wizaly will talk you through the solutions implemented for Cambria Automobiles, allowing them to understand the driving force behind the cars they sell, where their investments are best spent and where they can save money.

 

When?

Thursday, 27th September 2018.
13:50 - 14:15.
Marketing Automation, Email & Multichannel Theatre

Where?

Olympia London
Hammersmith Road, Hammersmith, W14 8UX.
Stand: T676.

Don’t miss out on our upcoming events. Sign up for our newsletter at the bottom of this page and don’t forget to follow us on Facebook and Twitter to stay in the loop.

The good news!

We are proud to announce that Bell Digital has been named a finalist in this year's Premier Partner Awards, presented by Google Partners.

The Premier Partner Awards honour innovation in digital marketing across Search, Mobile, Video, Display, Shopping and Growing Businesses Online.

The award submission included a profile describing our recent work for Tiffany Rose and was entered into the Growing Businesses Online category.

Tiffany Rose

Tiffany Rose began in 2003 with a simple aim: to offer pregnant women the chance to wear exciting, elegant, beautiful and well-made designs for special occasions. Every single garment is designed and made in Britain. In April 2018, Tiffany Rose was named as a winner of The Queen’s Awards for International Trade - the UK’s highest accolade of business success - as a result of their continued growth in international markets, aided by the partnership with ESV Digital. This follows their original 2013 Queen’s Awards for Enterprise.

“We’ve been working with Bell Digital since 2014 on our paid search. On the PPC side, they are hands down the best agency I have worked with in my 20 years of working online (and in that time I have worked with a few). They are keeping us one step ahead of the changing PPC environment, the team is very responsive to our changing demands and of particular value to me is their ability to run global campaigns on a local language basis.”

Christian Robinson, Director - TIFFANY ROSE

Google Partners

Bell Digital is part of a select group of digital specialists that Google celebrates as Premier Partners. To qualify as a Premier Partner, digital marketing agencies and professionals must pass a series of exams and prove their expertise in using and applying Google's advertising products.

Premier Partner Awards 2018 winners will be announced on October 15-17 in Dublin.

Sign up for our newsletter at the bottom of this page and follow us on Facebook and Twitter for the latest updates.

Following the success of our first event, Google Analytics Best Practices, we hosted a beginner’s guide to the top Excel features used by our PPC Managers on the 27th of July. The event was led by our Head of Account Management for Paid Search, Alastair Poole.

The ESV Digital Team welcomed digital marketing professionals to a morning event and breakfast, followed by a networking session. During the event, we covered the likes of Pivot Tables, VLOOKUPs, INDEX MATCH, LEN, SUMIFs and many other relevant tools to help PPC Managers take their Excel skills to the next level.

 

Don’t miss out on our upcoming events. Sign up for our newsletter at the bottom of this page and don’t forget to follow us on Facebook and Twitter to stay in the loop.

This July, Google AdWords (and other ad services Google owns) is changing its name. It will be Google Ads going forward. Here’s the rundown on what’s happening and why.

What’s Happening

The new AdWords/Google Ads UI was built from the ground up - and has been far from universally welcomed - so the new brand is launching at the same time as the formerly beta UI takes over completely from the old experience.

In addition to the Google Ads rebrand, DoubleClick’s advertiser products and Google Analytics will be brought together under the name “Google Marketing Platform,” while what was “DoubleClick for Publishers” and “DoubleClick Ad Exchange” will become a single, unified platform called “Google Ad Manager.” All of these new brands are cleverly structured so you MUST say “Google” when you refer to the products rather than some form of shorthand (e.g. “AdWords”).

Acquisitions

Google has taken some time to absorb its DoubleClick purchases, both in terms of branding and technology, but it has also been investing heavily in Google Analytics and in a new user interface (UI) for AdWords. This rebrand is the culmination of all of that work.

Practical Changes

To be clear, this is far from a pure rebranding exercise, and the new tools are significantly different from those of, say, a year ago. However, the biggest change - and the one that has led to miles of blog and community discussion threads - is the AdWords/Google Ads UI.

We won’t dwell on the intricacies of this, but there is a growing belief that dropping “Words” from the name of the PPC platform signals a continued de-emphasis of keywords as the main lever of advertiser control.

Beyond this, note that new campaign formats are coming (seemingly aimed primarily at small advertisers) and that in-market audiences are now available on search (again giving you the option of moving further away from search keywords and towards search audiences).

Conclusion

So, we’re looking at different brands, different UIs and we’re going to embrace a different way of managing PPC during 2018. Now, as much as any time in this industry, it will really pay to stay very attentive to all these changes over the next 6-12 months in order to keep ahead of the pack.

Google Analytics Best Practices Event

On the 29th of June, we hosted our first in-house event, Google Analytics Best Practices, led by our Head of Data Analytics, Ben Johnston. We welcomed our clients, partners and digital marketing professionals for a morning event, breakfast and networking session. During the event, we covered Google Analytics basics, view filtering, goal tracking, event tracking, the difference between goal tracking and event tracking, common issues and measurement planning. If you missed the event, check out the presentation below.

Google Analytics Best Practices from ESV Digital

Don’t miss out on our next events: sign up for our newsletter at the bottom of this page and follow us on Facebook and Twitter for the latest updates.

We are thrilled to announce that our clients Tiffany Rose and TOWER London were recognised in 2018 for their excellent work.

Tiffany Rose has been awarded The Queen's Award for International Trade 2018 for the second time. The Queen’s Awards for Enterprise recognise outstanding business achievement in four categories: International Trade, Innovation, Sustainable Development and Promoting Opportunity, the last of which recognises businesses that promote social mobility.

“We want to thank you, all of our lovely customers and incredible partners that we work with, as none of this would be possible without your incredible support.”
Tiffany Rose

We are also proud to announce that our client TOWER London has been awarded Independent Footwear Retailer of the Year for a third time at the Drapers Footwear Awards 2018, the most highly respected industry awards recognising the top-performing businesses. As the only footwear-focused fashion retail event, the Drapers Footwear Awards celebrate the very best in the industry.

“The whole team wanted to say a massive thank you for your continued support and partnership.”
TOWER London

Congratulations to Tiffany Rose and TOWER London on their achievements.

Sign up for our newsletter at the bottom of this page and follow us on Facebook and Twitter for the latest updates.