Has Google just removed the business purpose of Google+ overnight?

Google+ and Google Local disconnected

As the New Year approaches, Google has decided to roll out significant updates to its struggling social network Google+. It’s no secret that Google+, whilst attracting billions of sign-ups, has a tiny active user-base of just 9%. Of these hopefuls, I would love to see a breakdown of communities of people versus companies who just pump information out on the network.

When Google announced in November that it was rolling out a new design for Google+, it was really signalling a fundamental change for the network. The focus is now on communities and ‘interest’ collections; technically useful features but only if active users on the network increase and spam limitations are put in place.

It warranted a sort of ‘meh’ response, but then Google sneaked in something big. When Google+ launched it was seen as the central hub of Google online, connecting services such as YouTube and Local; in the same way it disconnected ‘mandatory’ YouTube integration, it’s now completely disconnected Google Local.

This means Google Local business information, such as reviews, categories, directions, star ratings, photo uploads, interior photos, maps, hours, and app integrations, is no longer shown on Google+ business profiles. Not only this, but thousands of unverifiable business accounts (considered spam) were removed without notice.

Has Google just removed the business purpose of Google+ overnight?

The integration between Google Local and Google+ kept businesses tied to the network, knowing that their business profile information would appear in branded search listings and that their activity on Google+ would be previewed above paid-for AdWords links in search. In some ways, it was the Google Authorship (another archaic feature) of business.

It’s worth noting that this is a staged roll-out by Google, available when choosing to view the new design of Google+. It will likely go the same way as Google Maps: the old version will be provided as an option, but once all fixes have been made, the new version will become permanent.

What does this mean for businesses?

  • It may no longer be worth maintaining a Google+ page, as business-critical ‘local page’ information has been moved across to ‘Google My Business’. None of the information in Google+ will appear high in search and the SEO benefits are now questionable;
  • Google+ is going through a radical redesign; it’s deleted and disconnected a number of features over the last year. As a result, Google+ is an unpredictable network with a low number of active users – expect no engagement and little traffic directed to websites;
  • Word on the cyber street is that businesses should stop investing time in Google+ and focus purely on Google Local. The only exception to this rule may be visual businesses who are able to contribute valuable content to community and interest discussions.

If you do run Google+ business pages, then you’re likely to find that your page has either been removed or turned into a basic profile without the Google Local information. Either way, consider whether Google+ is now worth it for business. Unless a business-friendly update happens, the message from Google is that ‘our social network is just for people’.


The importance of Google Knowledge Graph for online reputation

Google Android garden

Even if you haven’t heard of Google Knowledge Graph, you’ve probably seen it. Google updated its search algorithm in May 2012 to present a box on the right of search to show people, places, and things. So when you search for well-known or popular subjects, you’ll be delivered top-line information immediately.

Google Knowledge Graph, Bank of England

Just where does Google source its facts and figures for this box? This is a critical question if you’re tasked with managing the reputation of a person or an organisation because it’s the first thing people see. What’s more, Google Knowledge Graph appears above Google Business information (also known as the Google Maps listing).

A new paper published by the Oxford Internet Institute (OII), entitled “Semantic Cities: Coded Geopolitics and the Rise of the Semantic Web”, reveals all: Google Knowledge Graph sources its facts from Wikipedia as part of the Google-funded Wikidata project.

This isn’t surprising; Wikipedia has steadily grown a reputation over the years for improved reliability and ever-increasing scale. It’s not just an online encyclopaedia; it is an online community of 25 million registered users who have created 5 million articles in English. The crowd-sourced element of Wikipedia keeps data fresh on the most important subjects.

The danger revealed by the OII is that we can take this ‘linked data’ online information economy of the internet for granted, potentially not questioning the facts and figures presented. If Google Knowledge Graph sources information from Wikipedia, then what happens when that information is wrong? Furthermore, what if that information is politically sensitive?
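That linked-data economy is at least open to inspection: Wikidata exposes a public API that anyone can use to see the structured facts behind an entity. Here is a minimal, offline sketch of building and parsing a `wbgetentities` request; Q1218 is commonly cited as the Wikidata item for Jerusalem, and the sample response below is trimmed down, with its description string invented purely for illustration:

```python
import json
from urllib.parse import urlencode

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def entity_url(entity_id, lang="en"):
    """Build a wbgetentities request URL for a single Wikidata item."""
    params = {
        "action": "wbgetentities",
        "ids": entity_id,
        "props": "labels|descriptions",
        "languages": lang,
        "format": "json",
    }
    return WIKIDATA_API + "?" + urlencode(params)

def label_and_description(response, entity_id, lang="en"):
    """Pull the human-readable label and description out of an API response."""
    entity = response["entities"][entity_id]
    return (
        entity["labels"][lang]["value"],
        entity["descriptions"][lang]["value"],
    )

# Offline demonstration using a trimmed sample of the shape the API returns.
sample = {
    "entities": {
        "Q1218": {
            "labels": {"en": {"value": "Jerusalem"}},
            "descriptions": {"en": {"value": "city in the Levant"}},
        }
    }
}
print(entity_url("Q1218"))
print(label_and_description(sample, "Q1218"))
```

The point is not the code itself but what it makes visible: the label and description fields are editable, crowd-sourced strings, and whatever they say flows downstream to anyone consuming them.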

The focus of the OII paper was the political status of Jerusalem:

“The fact box is titled ‘Jerusalem’ followed by the statement ‘Capital of Israel’. The fact box contains a paragraph about the city, followed by a list of facts about the area size and population of Jerusalem that is cited to UNData.

The political status of Jerusalem has been widely debated in the media and by multiple stakeholder groups. This is because the city is claimed as capital by both the State of Israel and the State of Palestine. The city’s borders and governance have changed significantly over the years, most recently after the 1967 (Six Day) war between Israel and the neighbouring states of Egypt, Jordan and Syria when Israel annexed East Jerusalem from Jordan. Despite vehement disagreements by governments in the region, however, there is no widespread international recognition for Jerusalem (as composed of both East and West parts) as the capital of either Israel or Palestine.”

You can read a synopsis of the topic on the OII’s blog.

jerusalem political status

In terms of managing online reputation, the paper by the OII is important for three reasons:

  • It highlights the step from user-generated content to the fixed content hosted by Google (you can flag inaccuracies, but the process is not transparent, immediate, or guaranteed to succeed);
  • It recognises that information isn’t just structured data; some information has an emotive underpinning, and Google’s ‘deep learning’ needs to appreciate that not all information presented on reputable crowd-sourced sites is factual. There may still be ongoing debates;
  • It reminds us that behind every smart algorithm is essentially a need for a bin of knowledge (the Knowledge Graph was only possible because of Wikipedia’s extensive database).

Whilst the OII research was based on a location (familiar territory for me, given my experience in managing the reputation of tourism boards), it applies to people and corporations too. Google is attempting to reach a place where it doesn’t necessarily need to drive people to separate websites via links in search. Instead, it’s much more convenient to host all the content directly in Google Search, saving valuable user clicks and generating even more juicy page views for advertising.

It’s inescapable that a monotonous challenge of managing online reputation today is finding ways to tell Google that its information is wrong or damaging. Perhaps the development of Google’s deep learning software will improve Google Knowledge Graph next year? I hope so.

Bigger than Panda and Penguin: Google’s mobile-geddon is coming

The Search Engine Optimisation (SEO) weather report by MozCast has been particularly summery of late. Today we have temperatures of 17°C, but if you work in digital marketing, you will know a storm is coming. On 21st April, Google will release a new search algorithm update that many are calling “mobile-geddon”. It will quickly classify your website as mobile-optimised or not. It’s black or white, one or zero: your website works on mobiles or it doesn’t. No pressure.
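What counts as “mobile-optimised”? One of the simplest signals a crawler can check is a responsive viewport meta tag. The sketch below is an illustrative heuristic of my own, not Google’s actual test, which also weighs things like font sizes and tap-target spacing:

```python
import re

def has_responsive_viewport(html):
    """Check one common mobile-friendliness signal: a responsive viewport meta tag.

    This is only one signal among several, and an illustrative heuristic
    rather than Google's real classifier.
    """
    match = re.search(
        r'<meta[^>]+name=["\']viewport["\'][^>]*>', html, re.IGNORECASE
    )
    return bool(match and "width=device-width" in match.group(0))

# A legacy desktop-only page versus a responsive one.
desktop_only = "<html><head><title>Old site</title></head></html>"
responsive = (
    '<html><head><meta name="viewport" '
    'content="width=device-width, initial-scale=1"></head></html>'
)
print(has_responsive_viewport(desktop_only))  # False
print(has_responsive_viewport(responsive))    # True
```

A binary check like this is exactly why the update feels so stark to webmasters: either the signal is there or it isn’t.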

The new mobile algorithm update is going to be so big that Google has been incredibly noisy. They’ve blogged about it, alerted webmasters via email and it has been covered in every digital marketing publication.

mobile-friendly Google

Despite this, it has been reported that organisations such as the Daily Mail, Nintendo, Dyson and American Apparel are just not ready. Unless their websites are optimised for mobiles, they will lose website visitors, and inevitably customers. Google’s update will be labelled by some (mainly disgruntled, lazy webmasters) as an example of the search giant acting as the great internet dictator; this time, I’m on Google’s side.

According to the Office for National Statistics, last year 38 million adults (76%) in Great Britain accessed the Internet every day, 21 million more than in 2006, when directly comparable records began. Of all those people, the proportion accessing the Internet using a mobile more than doubled between 2010 and 2014, from 24% to 58%. That huge rise shows that mobile-geddon isn’t just inevitable; it’s necessary. If Google were to rank websites without taking mobile-friendliness into account, it would be its users who suffered from poorly designed websites appearing high in search results.

The new mobile algorithm update will come after a relatively long period of peace for webmasters. The last two major Google algorithm updates happened in September and October last year. Panda 4.1 and Penguin 3.0 affected a combined 3-5% of search queries. I predict that Google’s mobile update will affect about 30% of search queries.

So if you’re a webmaster or responsible for the digital marketing of your organisation, Google has given you a clear choice. Become mobile optimised or irrelevant. It’s about to get stormy in the world of SEO.

Google Campus shows the way in breaking down news silos

The smell of tech start-up innovation is intoxicating at the Google Campus in London. The ideas being generated there are helping to drive Britain’s economy, which is why the Financial Times is now focusing so much editorial attention on tech companies. The classic tech lining the walls, the Googleboxes (once phone boxes), and the TVs presenting social feeds all prove one thing: geek is chic.


I was at the Campus thanks to The Media Society, which arranged an event entitled ‘Media and Tech: What’s the story?’. The event was chaired by BBC Technology Correspondent Rory Cellan-Jones, who was joined by a panel of speakers: the European Technology Correspondent for the Financial Times, Murad Ahmed; the Technology Editor for Mirror Online, Olivia Solon; the Head of UK & Ireland Communications at Google UK, Tom Price; and the Vice-President of Global Communications at SwiftKey, Ruth Barnett.


With a complimentary beer in hand, here’s what I learnt about technology and the media.

A day in the life of tech reporting
Rory began by asking each member of the panel what a typical day involved for them. Inevitably, Facebook’s role following the results of the inquiry into Lee Rigby’s murder featured prominently (Interestingly, The Mirror’s line was that Facebook was not to blame). The FT kept Google’s PR team busy over the “right to be forgotten”, no doubt complemented by tax and privacy questions. SwiftKey are in campaign planning mode and the BBC technology desk had a day of writing, TV presenting and attempting to persuade Facebook to do an interview. On both sides of the media fence, clearly their days are hectic.

The intimate relationship between hacks and flacks
We all suspected it, but now it’s been confirmed: top tech journalists can receive up to 500 emails a day from PRs! The majority are, according to the speakers, “untargeted, buzzword-filled c**p”. The Mirror estimated that of this daily deluge, only about ten each day represent anything of news value.

Given the above, it’s not surprising that both sides of the panel agreed that more could be done to improve the hack/flack relationship, largely by making more of an effort to understand each other’s roles. PRs believe most journalists underestimate the stress of their role, often assuming that they are lying!

However, the main cause for concern around the top table was when hack/flack relationships get too close. At this point, journalistic integrity can be lost through inducements such as free trips and products.

For instance, it was made clear that some games companies get journalists to sign contracts withholding poor reviews until two weeks after a game’s launch. Shocking.

Facebook’s PR mess
Facebook had been at the top of the public’s mind this week since it was revealed to be the tech company at the centre of the Lee Rigby murder inquiry. It is (and was) a typical case of poorly planned PR. On the day of the inquiry’s findings, Facebook’s identity was originally hidden in the report; it was revealed at approximately 4pm. At that point the story had legs: the privacy concerns at the centre of the inquiry could be led by Facebook’s supposed failure to act, and that is when Facebook’s PR team should have contacted specialist correspondents to share their side of the story. Facebook is now being used to push the government’s Draft Communications Data Bill (aka the Snoopers’ Charter), something the more libertarian geeks in the room found uncomfortable.

Tech is hard news, not just Christmas gift guides
The two biggest sources of stories for the FT at the moment are banking and technology, as the newspaper judges that these will be two of the key drivers of Britain’s growth. However, the typical newspaper reader still struggles to see technology beyond Christmas gift guides. Technology reporting has yet to have its seminal story about the rise of computers or the growth of mobiles. The focus still tends to sit with political correspondents, but increasingly some mainstream stories rely on a good understanding of technology.

Google’s view is that our news structure is dated and still arranged around silos that may result in readers missing critical details about modern technology. Its premises show that the future lies in a different direction.

Also published on the Keene Communications blog.

Bringing Order to the Web

If you’re the sort of person who finds Search Engine Optimisation (SEO) exciting, then read Page and Brin’s original 1998 paper about Google PageRank. The Google founders provide a stonker of a read.

Originally, the PageRank algorithm was the one-stop equation to detect web page influence and human interest. Today Google uses over 200 ranking signals, all aimed at discovering relevant and reputable web pages. Together these bring order to the web, because even in 1998 commercial manipulation of web results was recognised.
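The core of the original algorithm is still simple enough to sketch. Below is a minimal power-iteration version in Python: the damping factor of 0.85 matches the paper, but the toy link graph and iteration count are my own illustrative choices, nothing like Google’s production scale.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal power-iteration PageRank over a dict of page -> outbound links."""
    pages = list(links)
    n = len(pages)
    # Start every page with an equal share of rank.
    ranks = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_ranks = {}
        for page in pages:
            # A page receives a share of the rank of every page linking to it,
            # split evenly across that page's outbound links.
            incoming = sum(
                ranks[other] / len(links[other])
                for other in pages
                if page in links[other]
            )
            new_ranks[page] = (1 - damping) / n + damping * incoming
        ranks = new_ranks
    return ranks

# A tiny three-page web: A and C both link only to B, so B ranks highest.
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
print(pagerank(graph))
```

Even this toy version shows why link farms worked: rank flows along links, so manufacturing inbound links manufactures rank, which is exactly the manipulation the paper anticipated.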

From p.12 of the original paper, on the manipulation of commercial interests:

“This kind of commercial manipulation is causing search engines a great deal of trouble, and making features that would be great to have very difficult to implement.”

To find out the latest about Google SEO developments, every SEO agency across the world maintains a fixed gaze on the blog of Matt Cutts, Head of Google’s Webspam team. Watch the video below to get a good feel for how SEO works and how WordPress users should respond.



Google: With great power, comes great responsibility

The world I see each day is probably very different to yours. It’s hidden, mostly. Only accessible through smartphones, tablets, laptops – anything with a screen, but it is more than that. It is the unknown. Entire cities bound together by hundreds of data connections, scattered across a dozen different servers, full of anonymous personas. This was my childhood, and it’s not as scary as it sounds: it was the epitome of liberty.

The internet was:

  1. Anonymous communication
  2. No accountability for content uploaded
  3. Everything indexed for quick search

These could widely be accepted as the values of the internet in the 1990s. It’s only with the rise of personalisation and commercial social networks in the noughties that these values began to be challenged, by big corporations whose business models couldn’t keep up with the evolution of online communications. The Stop Online Piracy Act (SOPA), supported by a plethora of entertainment companies, was one of the biggest threats to liberty and freedom of expression we have seen. Today we must save net neutrality.

Content is published freely online every second, mostly with little regard for copyright obligations and certainly not uploaded with knowledge of the consequences. Just look at One Second on the Internet.

Over the last week, mainstream media has been focusing on a critical ruling by the European Court of Justice (ECJ) in favour of a Spanish citizen who argued that Google should delete links relating to the auction of his house after he failed to pay taxes. After Spanish courts upheld the complaint, it was referred to the ECJ because Google refused to remove the content, arguing that it was not responsible for deleting information published legally elsewhere.

Google Search

In this case Google presented itself not just as a content host, but as a controller, acting as a publisher that keeps and deletes information. The ECJ concluded that Google is “… obliged to remove links to web pages” in certain circumstances.

We are now in a situation where freedom of speech campaigners are head-to-head with privacy supporters. As an article in last Friday’s Evening Standard put it (paraphrased), “The ruling by the ECJ will allow a host of reputation management agencies to bury the online details of their shady clients.” Personal, sensitive or damaging information relating to an individual is an incredibly grey area, ripe for exploitation by those who want to remove what are really the breadcrumbs of a shady past.

The internet today is very different to the one I grew up with. It’s become more serious, where information we upload is linked to an overall personal profile. All the content that we have uploaded over the years, or has been published about us, is cleverly delivered and recommended. The tech companies have grown astronomically and this ruling by the ECJ reminds us of this: With great power, comes great responsibility.

Thankfully though, the internet is much bigger than even Google.

Evidence! There is life on Google+

Google+ is notorious for being one of the most difficult social networks to measure. Google just doesn’t provide a measurement dashboard in the same way that Facebook does. Cynics would say this is because Google doesn’t want us to know how small the network is in comparison to other social networks. Personally, I think Google is simply developing the platform gradually and hasn’t yet worked out a measurement dashboard. There is, though, a small feature which will give you basic Google Analytics stats for your page if it’s connected to a website.

Yesterday evening Google introduced a “total views” stat to Google+ pages and accounts. From what I can see, these are not unique views, and it’s only a top-line number. It isn’t broken down for each individual post yet (which is frustrating) and counts from October 2012. What is important to note: it looks like Google+ has ditched its +1 number stat in favour of the total views metric.

This would make a lot of sense. Early last year, many digital agencies reported that the Google+ pages they managed on behalf of clients had seen a dramatic reduction in +1s. Only last week, we at Keene detected another drop in +1s across the network. Removing this stat altogether is a step in the right direction for Google+, and I’m thrilled to find out that I have 32,000+ views of my profile!

Michael White Google+ profile

Let’s hope that Google continues to enhance the measurement capabilities of Google+ and this new feature shows the social media world how big the network has become.

Bloggers: If you accept or write guest posts then you may want to rethink. Google has spoken.

Back in 2006/7 I was proud to be the owner of a blog that received between 4,000 and 8,000 unique visitors a month. It was a real buzz, and although other blogs could boast much higher traffic than this, it didn’t matter. I was blogging for the love of it, and publishing articles allowed me to build relationships with other authors online.

Back then Search Engine Optimisation (SEO) practitioners would focus on techniques such as:

Participating in blog carnivals
Blog owners who all wrote on the same subject would take turns to host a series of 20 or more links featuring the biggest blog posts of the past month. A particular carnival I frequently featured in and hosted was called ‘Carnival of the Godless’: secular thinking and philosophy.

Forum discussions
I used to receive a ton of traffic from various forums, ranging from technology and philosophy to religion. Building authority through frequent, insightful posting was a sure way to generate clicks to a website.

Guest posts
Thanks partly to the previous two activities, it was possible to build relationships with other subject matter experts online. These relationships often culminated in guest-posting opportunities: a way to spread online traffic and gain more coverage.

Then Google came along with its Panda, Penguin and Hummingbird updates, and the face of SEO changed. Between 2011 and 2013, many clients who had relied on SEO agencies came to regret their hiring decisions. The ‘black hat’ tactics that agencies used to provide clients with quick traffic gains were penalised by Google. Businesses folded because they dropped out of Google search results. These changes also negatively impacted the blogosphere.

Google saw:

  • Blog carnivals as link farms, used purely to increase backlinks and raise PageRank.
  • Forums as a way people could generate backlinks and high Click-Through Rates (CTR) without the need for authority.
  • Guest posting as a euphemism for spam article posting, a way marketing agencies could pay their way to Google utopia.

Some of these changes caused so much of a stir that even PR needed defending, with Tom Foremski’s forbidding ‘Did Google just kill PR agencies?’ (to which I graciously responded…). Following on from these changes, there has now been another development.

Matt Cutts, a web engineer at Google, has long published a blog that is very popular among SEO gurus because it makes public a lot of insider information. In his latest post, he has pretty much advised everyone to stop using guest posting as a way to generate traffic. Why? Because, according to him, guest posting has become a mostly spammy activity.

Could guest posting have received the final nail in its coffin? If so, this could very well spell the end of that wonderful thing that makes bloggers tick: sharing our blog space with other talented authors. Cutts’ reasons are clear in his article: guest posting has become an activity that:

  1. Usually means a company paying a blogger to publish an article. In all the cases I’ve seen, such articles are poorly written and show little subject-matter expertise. I know exactly how this feels because, in the past (on a previous blog), I’ve accepted money for such articles.
  2. Puts some bloggers at risk of having their space exploited by companies that will only drag their site into being delisted.

I’m still chewing over this news because the implications are far reaching. A few of my own questions at this stage are:

  1. Does this now mean that guest posting violates Google’s quality guidelines?
  2. How does Google define what a guest post is?
  3. Is Cutts hinting at an upcoming new Google algorithm update / re-run of Penguin?
  4. How will this change impact the way companies choose to work with bloggers?
  5. Can this change be reverse-engineered to negatively impact competitors?

To end, here are a handful of the comments being left on Cutts’ post:

Johan Matt Cutts comment


Kev Matt Cutts comment


Marko Matt Cutts comment