NinjaSuite.com – online marketing since 1999

A simple free URL checker for testing that links are still up
(Mon, 25 Sep 2017)

This is a free URL/Link checker that can be run on a PC without any problems.

Here’s the story….

I wanted an easy way to keep a list of links that I own for clients and to check that those links were still live. This program captures the HTML of a page and looks for the URL you put in the box.

How to use:

  • Add links to urls.txt (one per line)
  • Open URL_Checker.exe
  • Enter the URL you want to check for.
  • Press Choose File and select urls.txt
  • Press Play
  • The program has nice little message windows to tell you how it's doing, and when it's finished.
  • Results will be written to /results/results_randomnumber.txt
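For the curious, here's roughly what the tool does under the hood. This is not the actual source of URL_Checker.exe (which is a .NET app); it's a minimal TypeScript/Node sketch of the same idea, assuming Node 18+ (built-in fetch) and an ES-module setup so top-level await works:

  import { readFileSync, writeFileSync, mkdirSync } from "fs";

  // The string to look for in each page's HTML (e.g. the client link you own).
  const target = process.argv[2];
  const urls = readFileSync("urls.txt", "utf8").split(/\r?\n/).filter(Boolean);

  const results: string[] = [];
  for (const url of urls) {
    try {
      const html = await (await fetch(url)).text(); // capture the page HTML
      results.push(`${url}\t${html.includes(target) ? "FOUND" : "MISSING"}`);
    } catch {
      results.push(`${url}\tDEAD`); // page didn't respond at all
    }
  }

  mkdirSync("results", { recursive: true });
  writeFileSync(`results/results_${Date.now()}.txt`, results.join("\n"));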

Other things

Not just links! You can also use this to look for anything on a page, not just URLs. You could identify sites running specific pieces of code, for example Google Analytics or specific advertising codes.

Multiple lists! You can create lots of different urls.txt files (for example URL_Client_name.txt) and just run those each time.

 

Download Link
(15MB – .NET Framework needed, PC only; most antivirus will show a warning but then say it's okay, so just sit tight and let it work through it!).
MD5 Checksum Value of .zip: eb149ccc3037d4f1c0231aa26fb8e181
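If you want to verify the download yourself, here's one way (a sketch using Node's built-in crypto module, assuming the file is saved as URL_Checker.zip; on Windows, certutil -hashfile URL_Checker.zip MD5 does the same job):

  import { createHash } from "crypto";
  import { readFileSync } from "fs";

  // Compute the MD5 of the downloaded zip and compare it to the published value.
  const md5 = createHash("md5").update(readFileSync("URL_Checker.zip")).digest("hex");
  console.log(md5 === "eb149ccc3037d4f1c0231aa26fb8e181" ? "Checksum OK" : "Checksum MISMATCH");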

 

 

Let’s take a moment to revisit SEO of the 1990s

If you are looking for the best reciprocal link checker download then this link checker tool, which incorporates a free link checker, just happens to be the best reciprocal link checker software on the market. You may have heard of this free link checker tool being talked about, but I personally find that, above all things that might have been stated about this free web link checker, it's the ease of use that makes this link checker tool free. Actually that's not exactly true: this broken link checker is free because I am sure that somewhere on the web there are other people making, not just link checkers, but other types of software that are free without any desire for payment!

 

 

 

 

Ad blockers and change: where advertising fears to tread but must!
(Tue, 20 Oct 2015)


Have you ever clicked a link from Facebook and been served a page with so much advertising on it that by the time it's loaded you've actually forgotten why you are on the page? Or perhaps you have tried to read an online edition of one of the UK's most popular broadsheets? In fact it does not really matter where you go; the typical user–publisher experience can be summed up in the following words: "for fuck's sake please click on this advert" (FFSPCOTA). The digital-savvy managers of many leading publishing organizations are leveraging things like Domain Authority and other shortfalls in Goog's ranking algo to ensure that the thinnest, most tenuous pieces of online content are produced simply to capture website visitors in an attempt to get people to FFSPCOTA.

As if you needed further proof, check out the cookie directive from the EU. The whole industry was up in arms about how this would be a bad thing, how it was going to be complicated, etc. In a nutshell the policy could be summed up like this: "Explain the shit you (the publisher/advertiser) collect, and do so in a language that makes sense; if someone doesn't like it, let them opt out." Does that sound so bad to you? To me it sounds pretty reasonable. However, it's a complete nightmare for the companies selling ad-space, and guess who has the biggest voice online? The companies selling ad-space (in some cases they actually control the information too!).

Google, allegedly one of the world's most respectful organizations for things like copyright control, data scraping and generally requiring the delimiting of your entire business in return for free traffic (while webmasters pay for their content to be served back to them, or thereabouts, in some kind of product), has also been worrying itself into a frenzy. We've seen new AdSense mobile-only ad formats (which will not count towards your ad-block quota per page: read, more crap in the faces of people), plus a brand new product (Customer Match) where companies can upload their email marketing list (to the biggest online advertising company on the planet) to then serve yet more ads to their most trusted customers. This product of course overlooks the fact that if you as a company own an email address, have properly profiled your newsletter subscribers and listen to what they want, then the last thing they (your customers) want you to do is put more of your crap in their faces. Instead they are quite happy to receive the newsletter they signed up for in the first place. But don't take my word for it; trust your instinct too, or hop over to Google church.

Facebook and other tech companies are also struggling to find a footing in Europe after being told that it's not quite okay (English understatement) to ship EU citizens' data outside of the EU and then grant US authorities access to take a look. That's doubly painful because I believe it was precisely the data flow from the EU that allowed the US to eavesdrop on its citizens once upon a time. While in the pet world they say "a dog is for life, not just for Christmas", in Facebook's own closed network email addresses are so yesterday; who needs an email address when a FB customer ID is for life?

Over in the UK, a recent fine handed to Pharmacy2U by the Information Commissioner's Office (ICO) for the sale of 20,000 customers' details came to £130,000 (with a discount of 20% for quick payment). This pretty much brings all of the above together, and it is where I want to take this post.

The ICO, the office that looks after UK citizens' privacy interests, determined that it would be appropriate to place a value of £6.50 (£5.20 if paid quickly) on each customer record: £130,000 divided by 20,000 records. That figure pretty much sums up just how little is understood about the commercial value that behaviourally gathered information can have for companies in this broken digital marketplace. It's only when reading rare posts (or listening to even rarer podcasts) about the games you and I download on our phones that you start to understand just how much is happening behind the scenes, with data being piped up the network to some all-knowing sentient system that simply attempts to fob it off as valuable to anyone prepared to pay for it. Go and see how effective that data is, and what happens when you use an AdWords 'affinity list' developed by Google from tracking people across its network of products from YouTube to Gmail: watch your AdWords conversion rates fall off the earth and your budgets get exhausted as your attempts at Keynesian economics fail.

The thing is this: people don't mind watching adverts; look at TV, where in some cases they can be really fun to watch. Think of Super Bowl ads as a case in point. The problem is that online, unlike TV, it's not just a face-value experience; instead your exposure to an ad turns into a data-set which is then flogged to the highest bidder under the auspices that this person is bang-on target.

The reality of online surfing is that most of the time we human beings are completely irrational in how or what we look for in a day online, and in some cases we just have kids. I have often wondered how the commercial value of my profile has been infected by all the programmes my two watch on YouTube or download via Google Play, and how somewhere in the world my profile is being aggregated into a list and resold as "highly targeted". I'd like to tell the advertiser that purchased it that it isn't!

It’s been said often in online advertising that we are in a race to the bottom, and that it can only get worse. On worse, I think we’re pretty much there already. Online experiences are becoming so utterly infected with advertising that, as history shows, at a certain point human’s just sit up and say enough is enough. Hello ad-blockers.

What Next

I don’t know how to get to the next point in advertising in a step by step approach, but I do know that it considers people’s personal information to be precious and to be valuable and for there to be a partnership in terms of what you give up. Whether you give a company your information and they then cut you a share of what they make off your profile, or they give you some other form of kickback, then that could be a reality, but as the market is showing people place no value on current advertising practice.

On the other side of the fence, advertisers can't pretend that their product is right for absolutely everyone; the advertising networks need to somehow ensure that there is compatibility between the product and the target segment that is being marketed to, otherwise we're back to square one!

Appendix 1: [screenshot – how many trackers the Guardian is using]
Appendix 2: [screenshot – how many trackers the Daily Telegraph is using]
Appendix 3: [screenshot – signs of desperation? Translation: grow your business with AdWords Italia, start with a free €75 coupon]

 

Author Information
Glyn S. H. has been online marketing since 1999 and has developed campaigns for leading luxury brands including Nestlé and Interflora. He works primarily in the Travel and Tourism sector, helping hotels beat down OTA paychecks. He has a web-marketing company, a Masters in Professional Communication, speaks fluent Italian, and is married with two kids. He also has a good sense of humour – essential for survival in web-marketing. He is not employed by Google. To contact via email: glyn@ (this domain).

How hotels can isolate traffic from a specific domain sharing a Google Analytics code (e.g. booking engines, social networks)
(Thu, 05 Feb 2015)


This post is a bit of a work in progress, because you never really know for sure if what you are doing in Google Analytics is right because, well… it's Google Analytics!!

Hotels are at the forefront of a tracking nightmare that Google Analytics actually makes a good job of fixing: tracking across more than one domain. Many hotels prefer to pass the processing of bookings to a third party, which in turn provides additional features (for example remarketing and meta-search programmes) and service, and takes a commission for it. It also removes hotels from the nasty and complicated equation of security. However, a problem with this setup is that a visitor arrives at one domain (hotelwebsite.com) and then, when they book, gets sent to another (reservationsservice.com). All hotels have to do is ensure that:

  • e-commerce tracking is enabled from within their Google Analytics account
  • the Google Analytics code is the same across the two website domains
  • the booking engine provider has done all the technical stuff so that when a person arrives at the order-confirmation page, a piece of JavaScript is fired and all the relevant information is passed back to Google Analytics via the tracking code (see the sketch after this list). They also need to make sure that the channel attribution is not being lost along the way.
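For reference, the confirmation-page JavaScript mentioned in the last point typically looks something like this with Universal Analytics' ecommerce plugin. This is a generic sketch, not your booking engine's actual code, and the transaction values are made up:

  declare const ga: (...args: any[]) => void; // provided by the analytics.js snippet

  // Fired on the booking-confirmation page, after the main tracker has loaded.
  ga("require", "ecommerce");              // load the e-commerce plugin
  ga("ecommerce:addTransaction", {
    id: "BOOKING-1234",                    // unique booking reference (made up)
    affiliation: "reservationsservice.com",
    revenue: "199.00",                     // total booking value
    currency: "EUR"
  });
  ga("ecommerce:send");                    // push the transaction to Google Analytics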

But what happens when you want to look at just the traffic that’s happening on your booking engine? Or you want to isolate traffic from a social network into which you’ve also loaded your Google tracking code? That’s what I’m going to write about here.

Step 1:

Go to the Admin section of Google Analytics and select your main account and the property which is using your Google tracking code. For smaller hotels it is likely that there will be just one property (your main hotel website). You should then create a new VIEW, which you might call "booking engine".

Step 2:

Google Analytics will not show the full domain in its reports, even when it is run across lots of domains (it's a Google thing), so you need to tell it to do things differently.

In the View column (with the view you are creating) click on Filters, then add a new filter and call it "Show Full Domain". Then select the Advanced filter type and enter the details as shown below:

 

[Screenshot: "Show Full Domain" advanced filter settings]

All of that tells Google Analytics to yank out the full domain path when showing this view. That means it will pull out all the reservationsservice.com URLs and show them in full in the Google Analytics report.
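In case the screenshot above doesn't survive, the standard Google Analytics advanced filter for this job is usually configured as follows (a reconstruction from memory rather than a copy of the screenshot, so double-check the field names in your GA version):

  Filter Type: Custom > Advanced
  Field A -> Extract A:  Hostname = (.*)
  Field B -> Extract B:  Request URI = (.*)
  Output To -> Constructor:  Request URI = $A1$B1
  Checked: Field A Required, Override Output Field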

Now that we have set up the filter, we need to go back up the Google Analytics hierarchy to the Property. Underneath the account (still in the Admin section) you can see "All Filters". Here you want to create a new filter. Give the filter a name such as "Booking Engine Traffic", select "Custom", choose to Include, set the Filter Field to "Hostname", and make the Filter Pattern match the root of the domain you are tracking (reservationsservice.com). Leave all the other options unchecked if appropriate, and in the box below you will see a list of available views. You should see your view "Booking Engine"; click "ADD>>" to move it across to the selected views box. Then press Save.

From this day forward (views cannot be applied retroactively) you will have your main Google Analytics view, but also the booking engine traffic isolated and able to be analyzed on its own.

(extra)

Google Analytics showing the same page both with and without a trailing slash (how to fix it)

Create a new filter (as shown below) and apply it to the view you want in Google Analytics.

[Screenshot: trailing-slash rewrite filter settings]

Here's the regex for copy and paste: ^/(.*?)/+$ as the search string, and /$A1 as the replace string.
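If you want to sanity-check what that filter does before applying it, the same search-and-replace can be simulated outside Google Analytics. A TypeScript sketch (GA's $A1 corresponds to the regex's first capture group):

  // GA filter: search string ^/(.*?)/+$  ->  replace string /$A1
  const normalize = (path: string) => path.replace(/^\/(.*?)\/+$/, "/$1");

  console.log(normalize("/offers/"));  // "/offers"  – trailing slash removed
  console.log(normalize("/offers//")); // "/offers"  – repeated slashes handled too
  console.log(normalize("/offers"));   // "/offers"  – already clean, left unchanged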

A year in review: 2014 in the travel and tourism marketing sector
(Thu, 01 Jan 2015)

[Image: Tower Bridge with fireworks, celebrating the New Year in London, UK]

2014 has been a big year for travel and tourism online, but it will pale in significance compared to what’s on the cards for 2015. As we recover from a few days with family and friends and big celebrations, it seems only appropriate to log a few of the developments that have made this industry so exciting to be a part of.

Up until this year – and irrespective of niche – web promotion was pretty easy. Assuming that you were not link-bombing the open-source opportunities (still very much alive in the darker corners of the promo-web), the damage caused by site-wide penalties – manual, partial, or whatever else Google decided to term and then re-term them – was isolated to those groups that didn't see the writing on the wall: organic traffic, or traffic without a fee, was going the way of the dinosaurs. Whether it was front-end design on the travel results pages pushing independent hotels below the fold, or the further opening up of the Google Hotel Finder product (with the same impact of pushing down independent hotels that historically featured in the map packs), hotels took a battering in 2014 in terms of being able to compete for people's attention via traditional optimization. If you were a hotel that managed to stay top of organic, that would translate to a bottom-of-the-(browser)-page listing which would bring in a few crumbs of traffic.

Whereas historically OTAs and other large travel providers were able to compete only with each other in terms of their ability to advertise and be in the faces of people everywhere, 2014 saw a number of interesting technological solutions come onto the market which could be leveraged to help hotels compete in the same space.

One of the most interesting I came across was Marketizator, a Romanian startup, which sought to give webmasters the ability to survey their visitors and serve contextual overlays (think Ajax popups like Groupon, but with the ability to tailor the popup to be annoying!) based on criteria you could define. So: if a person arrived from Facebook, show them this popup; if a person arrived from Google, show them this popup… but also: if a person arrived from Facebook, and then looked at more than two pages, and then looked at the gallery, and then scrolled down 80% of the page, then show them a popup with a breakfast-included rate. I started plugging this technology into both e-commerce and travel and tourism websites with good levels of success. For e-commerce it gave a way to design lifetime incentive schemes that were triggered based on on-site activity. There was also an option to pass customer data via variables back to the platform to re-use this information for personalized marketing.

However, its most useful function was the ability to create surveys and capture leads (emails) at the end of the process – "complete our survey, give us your email address and get this!" The same rule types were available for when to show the survey as for the banners: show it to people that have been on the site for more than 2 minutes, people that arrive from Facebook, people that arrived from a specific campaign with a specific utm code. Imagine being able to segment your audience's intent through a survey based on search, social media, or direct links.

While the money in online travel ensures there is no end to the volume of research supporting the view that OTAs and other travel providers were where people were checking out, our tests brought back results confirming that, with the right approach, it was possible to address and acquire new guests just by developing matching strategies. In some cases those strategies were capturing up to 40% of all the daily visitors to the website.

But it's the economic arguments that gain traction. So: if £25 of every £100 booking is going away as commission, how can we offer £25 worth of extra value or options to the client to get them to book direct, and for the hotel to then own the guest from the start?

Another example: if it costs up to £3 to gain a new visitor that matches our guest profile via AdWords campaigns, but only 25p to remarket to a person that has already visited the website, then throw the remarketing code on the website, update the privacy policy, and set a cookie length of two years for the time being, because even if the hotel isn't ready to remarket now, in the meantime it can start gathering prospects. Again, the theme that came through this year was one of tapping the holes that technology-led solutions provided to hotels as a way to claw back some of their online revenue from OTAs that have been doing this for years.

Fraud has also been a closing statement to 2014 in digital advertising, with some really clear examples of how the digital ad-space has been populated with techniques to inflate the digital marketplace. There have been some excellent reports [pdf] spelling out what many digital practitioners with any conscience have been shouting about in forums for years: at least with a car salesman you've got a car to look at in the flesh, but with digital advertising it's all about trust! For anyone that has paid attention to the biggest marketplace of online advertising, Google, their online traffic fraud information document leaves you with the hope that everything that should be done is being done. However, quite why there is such a volume of 10-second sessions on paid advertising campaigns – campaigns delivering users to pages completely in line with their desire, with a unique offer – is troubling for advertisers looking to understand customer behavior and maximize their ROI. I am sure that advertisers would be prepared to pay more for AdWords campaign modifiers where you could apply Marketizator-type filtering: if a visitor arrives from Google and stays on my website for more than 2 minutes, I will pay Google £10. No-one wants to pay for a 10-second visit, so why charge for them? It just shows that the system doesn't work, because only one side wins, and advertising should be a partnership. While it might not be a concrete rule, it appears that keeping your brand out of CPM marketplaces will expose you to less risk of traffic fraud.

One of the biggest problems for hotels of all sizes is simply staying on top of the technologies that are available and knowing how to leverage them in a strategic way. Marketizator can help you get a handle on what your audiences want when they visit you, and you can split this data up based on the channel that brings them to your website. Zooming out to the global audience, attention spans are decreasing, so throw out the idea that people will work through all the navigational elements on your website to find information about you; start serving up one-page answers that close the visitor and answer all their questions, leaving them to click your book-now button.

Closing people can be a problem, but in 2014 I started using a piece of technology that fills a hole in the marketplace (crap in-house/supplier web-design capacity to deliver landing pages that converted and could be easily tweaked independently of long supplier development times) and offers an easy and scientific way for anyone that wants to test out pages and communications messaging (aka multivariate testing).

In some cases I worked on projects that required setting up landing pages for membership clubs, and using this technology it was possible to test different layouts, sales messages, and the images that drove signups. How it works is just why I love working as a technology-driven professional: you create a landing page via templates, or independently via a WYSIWYG interface, and this is then published to your own subdomain (an easy A-record addition on your domain name) for your hotel. Once you have your master page, you can clone it and tweak an element. Now here's the clever part: you tell the system to split the traffic across the two pages, say 50/50. The system will then serve 50% of visitors the master page while the other 50% get to see the cloned page. If these pages are capturing information, such as an email address, then this will also be recorded over time as the pages are tested. Your control panel will report back stats on the pages that are performing best. You can then promote winning variants to the master page, dupe them, and do all kinds of tests to get the magic formula right. One of the landing pages we started developing converted at 7%; it now converts at 27%.
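Conceptually, the 50/50 split is just a sticky, per-visitor assignment. A toy sketch of the idea, assuming a stable visitor ID from a cookie (the real platform's mechanics will differ):

  // Deterministically assign a visitor to "master" or "variant" at a given split.
  // Hashing the visitor ID keeps the assignment stable across page loads.
  function assignVariant(visitorId: string, variantShare = 0.5): "master" | "variant" {
    let hash = 0;
    for (const ch of visitorId) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
    const bucket = (hash % 1000) / 1000; // map the hash into [0, 1)
    return bucket < variantShare ? "variant" : "master";
  }

  console.log(assignVariant("visitor-abc-123")); // the same ID always lands in the same bucket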

These are technologies that are becoming available to all hotels and are being backed up with really fabulous support as well. The tutorials are spoon feeding this tech back to hotels that are starting to use it.

The industry also went through a number of changes in 2014.

TripAdvisor has just been fined heavily in Italy for not doing enough to police fake reviews on its platform, but aside from that the giant also went through a few iterations of itself. I've yet to see an economic case for the cost effectiveness of their Business Listings service, and this year I really did get to grips with measuring it. All we were able to find out was that, for the most part, revenues were coming in as assisted-booking revenues in Google Analytics (this can be tracked by simply tagging your TA website link with the relevant utm codes). That is to say, TA was participatory in the conversion but not a clean converter, as you would see more cleanly with a Google AdWords or Google organic conversion. So if TA can't be used as a revenue driver, let's look at the total traffic sent (during a trial) and put a CPC cost on that traffic to see what we would be paying for it over 12 months (annual business listing charge divided by total clicks to the hotel website during the 1-month trial x 12). But even here you may find that you're paying a premium CPC rate for what is essentially a click through to your hotel (would they have come anyway?). Your last hope is brand awareness, the largely bullshit metric that web agencies pull out when trying to convince you that to build a brand you just need to be in everyone's faces all the time. A quick look at this year's Christmas advert by Sainsbury's shows that there is a bit more to brand building than pasting your brand logo all over the web, much to the annoyance of its users.

TA also established more firmly the check-rates functionality: the box shown on a hotel's TA page with the "real-time" rate checker, into which independent hotels could propel themselves in an auction-based economy. Note that it's the CPC max bid that wins out here, and note too that people on the website may book one of the rates shown while the cheapest rate is not immediately on the radar, because it sits lower down in the box (at the time of writing). Clearly this is not to the benefit of the people and "reviewers" that are quite happy to provide their reviews to TA for free… but as so often happens online, the difference between what an online brand does in theory and in practice is best not thought about, as it will most likely lead to depression. This was probably the first product TA had released that actually made a few people sit up and take note. In fact, it immediately became a requirement to have a TA Business Listing in order to participate in the program. Then, perhaps because the take-up still wasn't great, that requirement went out the window. The problem with TA check rates is that it is essentially bid-managed CPC, which requires resources to manage the bids and make sure your hotel is getting a good placement.

Chances are high that the OTAs are already using one of the industrial-sized PPC bid management tools. I've not yet found a convincing technology platform offering bid-management strategy for TripAdvisor and the other meta-search networks (there is one, but it seems only available for hotel chains with more than 150 properties), and so hotels that want to do check rates need to find a partner that can plug in to TripAdvisor and manage their campaigns. The problem is that because there is very little money in PPC bid management, the chances are that hotels will get a poor deal: agencies will most likely just set a blanket CPC across the meta-search engines and leave it to run for the month. The outcome will be poor-performing CPC and ROI for meta-search. Solutions are starting to appear in the marketplace which should make the meta-search channels ones to invest more time in during 2015.

Another TA product, launched in 2013 but cemented in 2014, is the Review Express platform. This is where hoteliers provide TA with guests' emails and TA then sends a nicely formatted, newsletter-styled request for a review. As described in the guidance notes provided by TA, the hotel should tell the guest, at the point of collection, that their email address will be provided to TA for the purpose of soliciting their feedback. Now I'm not a gambling man, but I reckon that if there were an audit of the hotel industry today using the Review Express service, a healthy share of those hoteliers would NOT be conforming to the requirements of, not only TripAdvisor's guidance notes, but EU privacy regulations. As with many things being used by hotels, "it's just easier" is a common mantra for why something gets adopted. This isn't a criticism of hotels, but the obligations they need to follow should be made clearer. If a hotel is providing a customer's email address to a third party without their consent, a guest could quite conceivably voice their concern about it. Hotels can, however, send emails themselves to guests that have stayed at their hotel, and there are cleverer ways of spreading happy guests' voices through more channels than just one, mitigating the risk of one review site suddenly going under and all the time your staff have spent on it going up in smoke.

Something hotels could be doing is simply measuring the effectiveness of products such as the Review Express platform before making them standard. Do a straight channel analysis on it: count the number of emails that were sent via the Review Express platform and compare it against the number of reviews that were published to TA. Of course this is not an exact figure, because some people will write a review independently of the nudge sent by a Review Express email. However, you might be surprised at just how poorly your Review Express emails are actually converting (into reviews) for your hotel. These are some of the challenges I've been working on and providing solutions for in 2014.
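As a worked example of that channel analysis, with hypothetical numbers:

  // Review Express effectiveness: reviews published vs. emails sent.
  const emailsSent = 500;      // hypothetical month of post-stay emails
  const reviewsPublished = 15; // new TripAdvisor reviews in the same window
  console.log(`${(emailsSent / reviewsPublished).toFixed(0)} emails per review`);   // "33 emails per review"
  console.log(`${((reviewsPublished / emailsSent) * 100).toFixed(1)}% conversion`); // "3.0% conversion"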

Booking.com also branched out its online portfolio, extending its offering away from the straightforward hotel booking marketplace into the villas niche with villas.com, and no doubt we can expect a lower segment for the guest house and bed-and-breakfast market over the next months as they divvy up their online inventory and spread it across a number of different niches. Looking from the outside, and on just about every channel I participate in, they continued their "we will be everywhere" media buying and online advertising in search, social media, and mobile!

All of these shenanigans by the larger OTAs are, in my view, because they are scared of what is coming around the corner from Google. A report by Evercore (pdf) framed an online travel economy with the search/advertising company in mind, which impacted their evaluation of the OTAs' future prospects. Google is currently running round hotels with 3rd-party photographers that take Street View high-definition pictures of properties (for a fee). Consider that they have Search, Android devices and apps, and while people might not think it's such a big thing, make no mistake that this is going to be a game-changer when their final product gets launched.

So where does that leave the hotels? Will they ever be able to compete with the marketing savvy of OTAs? Probably never; the OTAs are employing the cleverest people for a lot more money than most hotels pay their GMs. OTAs are about discounts and best deals. In a study of the attributes by which hotels and OTAs distinguished themselves in the online marketplace, the only attribute the hotels had over the OTAs was quality. It's quite acceptable for an OTA to put themselves in your face and run scarcity tactics on brand-owned pages to push the user into booking with them, but if that happened on a hotel site it wouldn't feel right. It would feel cheap, and the user would most likely ask why a hotel was doing this: 'were they desperate?'

A hotel is your Porsche dealership; the OTA is a bit more like Arthur Daley from Minder, a second-hand car salesman trying to get you to buy anything the moment you come through the door. In terms of customer service, as long as a person does not deviate too far from the sales funnel of an OTA, their experience will most likely be efficient; but there are a number of websites where people voice vitriol about things that go wrong, and this is where hotels have the opportunity to sell the thing they always come out top on in research: the caring, quality customer-service process.

The skill of the average web-user is increasing and that means that they are able to find the hotel website irrespective of the advertising walls that are put up in front of them at the top of the results pages.

Therein lies the biggest opportunity for hotels in 2015: on-site conversion and capturing visitors for future marketing. Take a look at any of the leading travel providers' websites and, when making a booking, you are pressured into completing it. What is typically a casual reading of a catalogue on a hotel website is starkly different from the "10 people are looking at this hotel", "1 more room at this rate" and "last booked 2 minutes ago". It's a sales tactic that can feel a bit like the paranoia I go through when booking with Ryanair, where I find my excitement at booking a plane ticket being slowly sucked away as I complete an ever more harrowing series of forms and warnings. Yet it is these lessons that hotels need to take on board and incorporate into the way they sell their online inventory, otherwise they will never take direct bookings. Factor in RevPAR, and the question hoteliers should be asking themselves is: "why the hell should someone book with us?" What are we offering that no-one else is? That's where creativity comes in and helps develop strategies that fit into that gap. People will come to a hotel website, but how do you tell them you are better than the OTA right now, during that visit?

It would be remiss of me if I did not say the following two words: Responsive & Performance.

Go and look at the number of people accessing your non-responsive website with their mobile phones. If your website is not fully responsive, you are losing business. Technology and devices have morphed so much over the past 12 months that the chances your analytics are capturing what's really happening on your website are pretty slim. A responsive site should double your on-site engagement (the number of pages people look at) and increase the average visitor's session time. The really important point for hotels, though, is that the messaging on a responsive site is consistent across devices. That means no more mobile-site maintenance and no more updating two versions: the message is the same everywhere. Booking engine providers need to be at the top of their game to make sure their websites are responsive and able to continue the same quality of experience as people are passed from the hotel to a third party to process the payment. Performance is the final point in this post, and it's important. Making sure a website loads really fast is a key ranking mechanism; I've seen it on websites I've built myself and on projects I've been drafted onto. Make websites super-fast and check they run across all the main devices and browsers with equal speed. Performance impacts search engine rankings in a very positive way. Tools like GTmetrix or Google's own PageSpeed tests can be useful starting points, but if you are still loading slowly, then change the server completely to something like AWS.

So in summary, my travel world of 2014 has been about setting down benchmarks and increasing the understanding of the audiences my clients want to tap into. I think 2015 will be a good year for hotels, and it's always good to remember that hotels were around before OTAs and will be around long after, so don't panic if you are hemorrhaging commissions to them – everyone else is! Instead, start to look at how to capture people looking for the best rate, analyze your channels, and switch off those that are not delivering revenue, whether it's TA or any other third party. Measure every channel, set your utm tracking URLs, then sit back (not too far) and take decisions based on data.

Have a good 2015.


Channel Attribution: are your conversions happening on the tracks or out in the desert?
(Wed, 18 Jun 2014)

[Image: train wreck in Saskatchewan]

If your booking engine is hosted off-site by a third party, then you're going to want to read this post.

What's more frustrating than Google Analytics? Google Analytics when it's not even splitting out the traffic-source revenues accurately because of a cross-domain tracking bug!

A clue that things are not going well is when you see a website domain being attributed as the unique referrer of revenue when you know it can’t be true.

I'm going to assume that you have read the following guidelines about the changes that need to be made to your Google Analytics code, and instead focus on how you can test whether things are working as they should.
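For context, the classic ga.js cross-domain setup those guidelines describe looks roughly like this on the hotel side. A sketch only – the tracker ID and domains are placeholders, and your booking engine provider's implementation will vary:

  declare const _gaq: any[]; // provided by the ga.js snippet

  // Hotel site: let the campaign cookies travel to the booking-engine domain.
  _gaq.push(["_setAccount", "UA-XXXXXX-1"]); // placeholder property ID
  _gaq.push(["_setDomainName", "hotelwebsite.com"]);
  _gaq.push(["_setAllowLinker", true]);
  _gaq.push(["_trackPageview"]);

  // Outbound "Book now" links pass the cookies across domains (usually wired to onclick):
  _gaq.push(["_link", "https://reservationsservice.com/book"]);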

Testing that Google Conversions are being properly attributed!

In order to make this test work you will need to have a cookie-viewer installed in your browser. I personally use this one.

1. Go to a search engine, such as Google, and find your website by typing in a keyword search.

2. Click the link to the website.

3. If you're using the plugin I listed above, right-click and select "page info", then visit the "cookies" tab and examine the __utmz cookie. I have highlighted below the three areas you need to look at. You can see that "utmcmd" is organic and the keyword is "(not provided)".

[Screenshot: __utmz cookie showing utmcmd=organic and keyword "(not provided)"]

4. The next step is (still in the same session) to visit the booking engine and then re-examine the cookie. In the example below you can see that the utmcsr value has changed to direct. That's because the cookie was not successfully passed to the third-party booking engine. If this had gone well, the session cookie would still say utmcsr=google|utmccn=(organic). The result is that the revenue in Google Analytics will be attributed to the DIRECT channel, and the referrer for the revenue will be the website and not the search engine.

[Screenshot: __utmz cookie after visiting the booking engine – source changed to direct]
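If you end up checking this often, the __utmz value can be decoded programmatically instead of by eye. A sketch, assuming the usual cookie layout of hash.timestamp.sessionCount.campaignNumber followed by pipe-separated key=value pairs:

  // Parse the campaign fields out of a raw __utmz cookie value.
  function parseUtmz(raw: string): Record<string, string> {
    const campaign = raw.split(".").slice(4).join("."); // skip the numeric prefix
    const fields: Record<string, string> = {};
    for (const pair of campaign.split("|")) {
      const [key, value] = pair.split("=");
      if (key && value !== undefined) fields[key] = value;
    }
    return fields;
  }

  const cookie = "12345678.1402920000.1.1.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=(not%20provided)";
  console.log(parseUtmz(cookie).utmcsr); // "google" – if this reads "(direct)" on the booking engine, attribution broke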

 

 

That’s how to test it!

 

Remarketing: How to be less annoying and a clever spender!
(Fri, 11 Apr 2014)


Google remarketing (retargeting or, to those outside SEO, "that scary shit that follows me around and I don't understand why") and now Facebook remarketing (through Facebook Exchange) are two strategies that any business online should be adopting. The skill is finding a suitably engaging commercial hook to make people want to return to your website. I've been building out a number of strategies for my clients and wanted to share a really simple optimization tactic that makes complete sense, but weirdly doesn't seem to be easily available in the form of a user guide online.

I'll assume that you have set up a remarketing list and already have some people in it, and that you have created a series of banners that you wish to use as part of your remarketing campaign. I'm also going to assume that you will be delivering people from this remarketing campaign to a unique page on your website that could only be found by either a blackhatter (including me & Google!) or someone that arrived at your landing page via your campaign. This is fundamental; while there are ways to do the same thing with a homepage, it's best to isolate conditions as much as possible when you are spending money!

How to make sure people that click on your banner do not get shown the banner again!

It seems pretty obvious, but let's say I get served a banner, I click on it, and I see the offer: chances are that I'm not going to be interested in clicking again in the future, so it's pretty pointless showing me the banner again.

What I wanted was the following setting:

I have a list of 100K people. I want everyone to click on the banner, but once someone does, I don't want them to be shown it again.

This is actually quite cool, because it's the kind of thing I've built using robots and lists: essentially your 100K list automatically ignores positive clicks. But how do you do that in your own remarketing campaign?

You use negative Audiences!!

How to set it up?

Go into Audiences in AdWords and create a new remarketing list based on the list you are using for your primary campaign. In my case I am just using one main list, but if you are segmenting your lists then you could simply apply it to that segment.

Give the remarketing list a name such as "people that have clicked on X campaign". You then need to declare "Who to add to your list": select "Visitors of a page" and then the format "URL contains" (you'll know what you need to match). In the field, enter the unique URL of the landing page (that's why it's important to make it unique!). Then tick "include past visitors that match these rules" and make the list "open". You can set your list duration based on how you are running your campaign. Then save the list.

Next, head over to your remarketing campaign and open your main campaign. You will see some red buttons labeled "+Exclusions". Go into campaign audience exclusions, set the ad group settings to exclude the list you created before, and hit Save.

Now what’s going to happen is the following:

People in your main remarketing list will be shown the banner. Those that click on the banner and hit your landing page will be added to the other remarketing list ("people that have clicked on X campaign"). Members of this list will be constantly removed from the main list, so you've now got a self-optimizing remarketing list. Set these lists up a couple of days before going live and you'll not have any initial burps in the system as the new list is being populated.
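Conceptually, the exclusion is just a set difference that AdWords maintains for you. A toy illustration with made-up IDs:

  // Main remarketing list minus everyone who already clicked through to the landing page.
  const mainList = new Set(["user1", "user2", "user3", "user4"]);
  const clickedCampaignX = new Set(["user2", "user4"]); // built from landing-page visits

  const stillToShow = [...mainList].filter((id) => !clickedCampaignX.has(id));
  console.log(stillToShow); // ["user1", "user3"] – only these keep seeing the banner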

Make sure that you implement impression caps when setting up your campaign. The principle I follow is: whatever Google suggests you do, question it (expand the advanced options), and if in any doubt, Google a question about the option in another browser window and understand what it means. Impression caps will stop your banner being shown 100 times a day to the same person.

et voilà!

ps. I set myself the goal of writing and publishing this post in no more than half an hour (post-edit: failed, it took me 40!). What I have learnt is that it's best to write the post first, before spending time looking at photos on a stock-photo site!

The end of backlinks, what now?
(Thu, 06 Feb 2014)

[Image: end of the road]

It used to be that the more external references or citations (links) your website had, the more popular it would be seen to be by the search engines, and the reward was a high position in the results pages. The quality and relevance metric didn't really figure; as long as there was link volume, there was a good chance of success (and if your links did have quality and relevance, then you would need fewer links overall – although in the early days Google wasn't that clever!). Then SEOs were told that there were certain links that, if a website had gained them, would be considered bad and could have a negative effect on the website. That was great: the day many webmasters woke up to find out that because someone else had linked to their website, it was now their problem!

Now we are at the juncture where pretty much any link that points to your website could be the cause of penalties that stop it from ranking. The open declarations from people like our heavenly father about paid advertising – where specific networks that offer paid-link services are labeled wicked rank manipulators, and therefore liable for prosecution by Google in the form of a penalty and a downgrading of the website's position for keywords in the search engine results pages – are frankly open admissions that:

Links are still an important factor when seeking to augment a website’s rank in Google.

What's more, by highlighting the websites or networks being targeted, Google openly declares the places on the web where companies can purchase links with the goal of creating penalties for their competitors. It's no surprise to find that there are a number of negative SEO services (link) where you can buy these types of service, and there are software tools like Xrumer that will expedite the task for you.

LINKS DON’T COUNT ANYMORE?? Yeah right!

If you're reading forums or listening to experts that tell you link building doesn't work anymore, they are leading you astray. Link building works very well indeed, but – unlike the days of old – relevance and context are EVERYTHING!!

So that's why I decided to write about a tool I find very useful for what can be quite a mundane task but, done well, reaps rewards.

Guest Blogging dead? Matt Cutts calls it…
(Thu, 23 Jan 2014)


With the latest news that guest blogging's days are numbered (according to our heavenly Father), I thought it would be only right to do a quick post to acknowledge, with thanks, the latest alert from the Google spam team. Good luck with detection.

Consider that once upon a time it was much easier for Google to detect spam, because Google was all about links (they still are, but don't let them catch you). As a consequence, open-source platforms that allowed users to post links – WordPress, Laconica, Pligg, to cite but a few – were very quickly hoovered up by the spam crews and then easily addressed by Google. Here's a small sample of some of the old-hat footprints that were used before 2012.

Pligg
inurl:”upcoming” intitle:”pligg”
inurl:”register” intitle:”pligg”
inurl:”cloud.php” intitle:”pligg”
inurl:”live_comments” intitle:”pligg”
inurl:”faq-en.php” intext:”pligg”
inanchor:”Pligg beta 9 Home”
inanchor:”About Pligg”
inurl:”/pligg” inurl:/register.php
inurl:register.php intext:”upcoming” intext:”published” intext:”submit”
inurl:/register intext:”upcoming” intext:”published” intext:”submit” intext:”Tag Cloud” -inurl:.php
inurl:/register intext:”upcoming” intext:”published” intext:”submit” -inurl:.php
inurl:/register intext:”upcoming” intext:”published” intext:”submit” -inurl:.php intitle:”register”
inurl:/register intext:”Powered by Pligg” -inurl:.php
inurl:/register.php intext:”Powered by Pligg”
“Powered by Pligg”
intitle:”Pligg beta”
“What Is Pligg?”
intitle:”Pligg Beta 9″
“http://www.pligg.com”

Wiki
inurl:wiki/index.php
inurl:/wiki/User:
inurl:/TikiWiki/

Guestbooks
.html?page=comments
/?agbook=addentry
/?show=guestbook&do=add
/?t=add
/GuestBook/addentry.php

Based on their content and various 'detectables' within these open-source platforms' page structure, they could easily be found and spammed by SEOs.

Fast forward to now, and just about every open-source platform has either been spammed to death or has been sufficiently modified by webdevs to negate link-dropping by the many tools (now almost obsolete) on the market. But then along comes guest blogging to complicate things for Google's team.

Guest blogging is much harder to detect and address because:

  • The websites offering this kind of opportunity to content producers are not on a unified platform.
  • The websites may not be on the same server/network (unlike the link farms).
  • There may be no historical evidence of the website having ever offered a guest blog to someone in the past (darn it, no detection there even!)
  • There may be no textual identifier within the page that hosts the guest post declaring it a guest post.
  • And last but not least: these guest-post websites might actually be making Google money (person arrives at a crap website and then punches AdSense to get the hell out of there – worst case; or person arrives at a great website and then clicks AdSense because it is relevant – good case)

Sure, there are really basic footprints which will pull back results where link-builders might be able to gather links (see below), but what our heavenly father is basically now saying is that you shouldn't be doing guest blogging for this reason alone. Instead, if you are guest blogging because you want to expose your expertise to readers, then that's okay… this is the same line that the team at Google churn out when they talk about promoting good-quality websites, while asking you to delimit your data so it can be served in the Knowledge Graph with a greyed-out source.

  • inurl:/write-for-us
  • inurl:/write-a-guest-post

For those still guest blogging, the considerations for doing so should not have changed that much, because if you are writing or providing content for a website, you should not be wasting your money on any site that does not add value to your brand. Have I done guest blogging for links? Yes, of course, because it works (as explained above). Will I get hit with a penalty? Probably, or probably not – I'll fix it if I do.

It's worth bearing in mind that the eventual Google algo update on this issue (which will be yet another way to shuffle listings across revenue channels) will not take into consideration the quality metrics you are hoping for. Just look at Penguin or Panda to see how many webmasters that were cleaner than clean got hammered!

With all of these link-building tactics, the mistake is to use them exclusively; they should be complementary to an ongoing marketing strategy, not the marketing strategy!

SEO is scattered with tactics that work for a bit and then stop, and then there is something new. It’s what makes it so interesting, and it’s what keeps Matt in a job 🙂

On a side note – yesterday I published my first post on my software reviews website, and I'm going to cover some of the ways I approach all this guff that comes out of Google HQ from a technology standpoint (you can download a free web-scraper), with the aim of also convincing you to purchase some software tools that I think rock.

 

Predictions for search and social in 2014
(Fri, 03 Jan 2014)

[Image: praying at sunset]

What a complicated year 2013 was for web-marketing people. 2012 saw pain delivered in the form of Penguin, exact-match-domain penalties and lots of little tweaks, leaving many 'web professionals' wondering what had just happened to their job. 2013 was the year of the slow dagger in organic search positioning, by way of results-page overhauls and simply not telling webmasters how their website was being found. The old guard will know that in 2014 link building is not dead, but that it's impossible to rank in an area of the results page that will deliver any traffic without paying. It's denial to tell a client that they are ranking in position #2 and that it's great, when in fact that listing is below the fold and getting zero clicks!

SEOs have been on quicksand for a few years now, and with each new year the job moves closer and closer to traditional marketing, while the escalating cost of CPC traffic makes offline public relations campaigns economically viable (imagine saying that 5 years ago in an SEO forum!).

I’ve no doubt the year will bring surprises – it’s what makes this corner of Communications so interesting and challenging.

I thought I’d have some fun with some crystal-ball gazing to see what could be in store for us, the 0.1% of specialists who know that the first three adverts in Google are paid, and that the CSS colour used to display those PPC ads renders clear (effectively invisible) on most laptops running Windows 7 (and who consider that unethical!).

 What can we expect?

  1. Exclusive PPC for Google listings on Brand and E-commerce terms
    Google plays with the results display daily. The aim is to clutter the top of the page with as much advertising as possible (at the time of writing, even the cookie message pushes down the organic SERPs!), or with Google+ related pages, Google Shopping, or other PPC and Google-owned products. If you’re in the travel sector, this will be your most expensive year for old-hat SEO, with the traffic you lose going through OTAs (Booking, Expedia etc.). Expect this trend to continue. Position #1 in organic search is no longer in view (above the fold) for most searches.

     

  2. Requirement to have Google+ listing for business and Webmaster Tools
    They’ve been carrot-and-sticking the industry for years, whether through Google Webmaster Tools, authorship, or suggesting that fast load times increase your position. I think that this year we might see a requirement to be signed up to a Google property just to be listed in their search engine. We might expect Google to cite vague ideas of security and authenticity to the SEO community, like that NSA fun they had with us in 2013. What if a Google+ page were the only page they showed in the organic results (because it was trusted and verified by a Google process!)?

     

  3. The demise in the economic viability of producing informational websites.
    In 2013 we saw a bigger roll-out of the Google carousel for related searches. If you are in the information niche delivering content, this is probably going to be a really bad year. At the time of writing, Google’s current carrot is to ask webmasters to add tags to make in-depth articles more visible in their search index. At the same time, informational searches in Google are being rehashed from many content providers (like you) with the aim of providing a one-stop shop for answering people’s search queries (probably the fruit of their purchase of the Wavii algo). Think of search results looking like a Windows 8 display, with the user never having to leave Google search (and Google not paying you a cent to pinch your words and pictures). Google is already creaming off a lot of information searches by auto-generating Google+ pages full of third-party reviews and placing them at the top of the search. It’s similar treatment to the brands and artists that are getting auto-generated pages on YouTube.
  4. The Rise of Social Advertising
    Assuming that Facebook doesn’t get closed down for privacy infringement, the huge and ever-increasing cost of CPC in Google AdWords, combined with its lousy conversion rate, will make social media advertising more appealing. However, unlike the early days of Facebook advertising – where people were duped into believing that increasing fans was actually going to benefit their business (it didn’t, because Facebook then turned around and started charging them to reach the fans they’d paid to acquire!) – landing-page optimization (learned from PPC campaigns, with a social twist) will seek to bring people closer to the company. Expect a range of social media tools specifically designed to optimize this creative space.

     

  5. The importance of context and the rise in costs of delivering SEO.
    If we say that everyone with a working online business was investing in SEO, and that the market was made up almost exclusively of agencies that were automating their link-building until those fateful days in 2012, it stands to reason that a large percentage of those online businesses got slapped with a penalty. The irony is that keywords in links will always work in Google, as they are part of its ranking DNA, but the guidelines have been tailored to kick anyone from the listings should a non-advertiser be getting free traffic.

    – QUICK TEST: Take a competitive niche, take the top keywords and run them through Google with 100 results per page. Then strip out the duplicate domains and do some comparison across the vertical markets (a sketch of this de-duplication follows this item). I’ve seen some niches where there were only 38 websites… across the whole range of keywords!

    The impact of the 2012 update was that pretty much across the board websites have disappeared and never come back. That means that you need laser focus on your link-building activities to make sure that the link is super relevant. To quantify with an example: I helped with the launch of a site in December and have popped it onto the second page of Google with just 8 links. That’s because those links are gold, contextual and highly relevant. Take your copy of Xrumer or Scrapebox and stop mining the web for forums or blogs to spam. Instead perform advanced link-analysis and target only those places where there is relevance. Use the tools for data analysis but recognize that you will need to develop relationships to survive in SEO in 2014.
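
Here’s a minimal sketch of that quick test, assuming you’ve already exported the top-100 result URLs for each keyword (with Scrapebox or similar) into one text file per keyword, one URL per line; the serps folder and file layout are assumptions, not anyone’s actual setup:

# Count the distinct websites across a whole set of keyword SERPs.
from pathlib import Path
from urllib.parse import urlparse

def unique_domains(serp_dir):
    """Collect the distinct domains appearing across all exported SERPs."""
    domains = set()
    for serp_file in Path(serp_dir).glob("*.txt"):  # one file per keyword
        for line in serp_file.read_text().splitlines():
            url = line.strip()
            if not url:
                continue
            if "://" not in url:            # tolerate scheme-less exports
                url = "http://" + url
            host = urlparse(url).netloc.lower()
            domains.add(host.removeprefix("www."))  # fold www/non-www together
    return domains

sites = unique_domains("serps")  # hypothetical folder of keyword exports
print(len(sites), "distinct websites across the whole keyword set")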

     

  6. Stop building guff stuff for SEO purposes.
    “We’ll build a Facebook, Twitter, Pinterest and blog page because it will help with our SEO” – this is the output of advice from many agencies. It’s easy to do and looks impressive to the client and the managers they report to. By all means register your brands to keep the names secure, but you should not be building anything that is not developed around your company strategy, because the chances are that, long-term, these things will not help with SEO, and your energies will be rewarded with awkward questions once the initial budgets have been exhausted (they will also be a pain to keep alive!). Define the function of your social networks, starting from the position of “if we had to pay each time we posted or shared this content, would we do it?”. Stack the things you own – whether it’s a video or a piece of written content – on a web property that you own. Make sure that the only way a person can get the whole picture (whether reading an article or watching a video – do a short version for sharing) is to visit your website. While they are at your website, try and close them with some kind of action – whether this is a sale, coupon, newsletter or whatever, the goal is to reconcile the ROI on your advertising spend. And did I say measure everything with something deeper than Bitly – something that arrives at your conversion point on your website (see the sketch below)?
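
One common way to measure more deeply than a shortener is to tag every link you share with campaign parameters, so it’s your own analytics that attributes the visit when it arrives at your conversion point. A minimal sketch using the standard UTM convention; the URL and campaign names below are made up for illustration:

# Append utm_source / utm_medium / utm_campaign to a link before sharing it.
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def tag_link(url, source, medium, campaign):
    """Return the URL with UTM parameters merged into its query string."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # keep any existing parameters
    query.update({"utm_source": source,
                  "utm_medium": medium,
                  "utm_campaign": campaign})
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_link("http://www.example.com/offer", "facebook", "social", "jan-sale"))
# http://www.example.com/offer?utm_source=facebook&utm_medium=social&utm_campaign=jan-sale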

That’s what two weeks away from SEO and SEM, eating Christmas food and drinking, does to someone with a passion for digital communications.

Let’s hope that none of it (except for point 6) comes true, but if it does, do make sure you have a plan to deal with the future.







 

 

Partial match penalty recovery – the promised land!!
http://www.ninjasuite.com/posts/partial-match-penalty-recovery-the-promised-land/
Fri, 25 Oct 2013

[Image: Hope and aspirations]

With one client I have been following their re-inclusion in the Google search index after they received a partial-match manual penalty. I feel this is one of the vaguest and most difficult penalties to recover from, because so little clue is provided as to where the problem might lie. It took six months and upwards of thirty reconsideration requests to see the manual penalty revoked, with the impact on the website’s rankings being almost immediate.

History and Approach

The website in question had a long history of organic search positions and, during the course of its lifetime, had moved through a number of different search engine optimization (SEO) suppliers, each implementing the industry standards of the day [read: lots of link filth in the profile]. After establishing that the traffic drop-offs coincided with a key Penguin refresh date, and noticing discrepancies between the number of URLs submitted in the sitemap and those indexed by Google, it was time to get to work.

According to the industry, the solution was to upload a disavow file – declaring “I think these links are crap, so please [Google] don’t give them any consideration when attempting to work out whether my website is authoritative or not in its niche”.

However, my client couldn’t survive on “wait and see”, and so, after consulting with them, we agreed to submit a reconsideration request to see what Google had to say. It’s worth mentioning that during the course of this whole process, Google made a change in Google Webmaster Tools to start providing webmasters with additional information about any penalties applied to their website. Thus, during the link audit, I learned that a partial-match penalty had been applied to the website for the quality of the links pointing to it.

Tools you won’t need!

There are lots of tools to help with back-link analysis. Some of the old-timers on the market include Majestic SEO, which is used a lot by the industry for its back-link index; by mutating its core functionality you can have some real fun with it. Frankly, though, you might do better to buy yourself a virtual server, hire a coder and start building your own database of the web. The same goes for services such as Link Detox – a service that popped up in parallel with the disavow file. As we’re talking about back-link tools, my own personal preference is SEO SpyGlass (free trial), simply because it’s a flat fee for the software and then a small fee for updates. I hope to get onto sharing some tool tech in a future post, but at the moment there is a lot of news in SEO, so I’ll keep writing about that.

Having been through this entire process, I think the only tools you need to audit your links are those that Google themselves have made available via the Webmaster Tools account. Matt Cutts recently said that they’d be giving webmasters more links to analyze, so don’t waste your money on expensive monthly-recurring-fee tools unless you are a web agency that can offset the high monthly premiums against other clients and the additional services of these providers. Now… back to auditing.

Your Google Kwality [sic] reviewer is unlikely to have access to the tools I’ve mentioned or, for that matter, the know-how to use them. They will most likely be given review guidelines based on the data that both the webmaster and the reviewer have available in the webmaster tools account – otherwise it would not be fair.

Stage 1: Who’s sending the crap?

The first stage was to look at the worst offenders. In WMT, Google currently shows the domains that are linking most heavily to your website. I noticed that three domains were bringing in some 20,000 links to the website. When I went and looked at those pages, I found that what had happened was what I call

“The Adsense Poo”

A webmaster had scraped Google search results using medium/low-volume long-tail keywords, capturing all the first pages of results. These pages, due to the nature of the search queries, were showing snippet text that was pretty random (remember: random = unique in the eyes of a spider). They had then created pages which were simply an AdSense block at the top, followed by a list of links below it*. There were literally thousands of them. I wrote emails to the webmasters concerned, who were naturally unwilling to explain to me the function of these pages but did, to their credit, remove the majority of them. Now, had Google told me which websites were contributing to the partial-match penalty, the client would have been able to issue a claim against the webmaster for loss of sales. But sadly, this information is still not available in WMT, so the client could only be grateful that the webmaster removed the links.
*It’s worth mentioning that at Pubcon in Vegas this year, Matt Cutts hinted that ad-heavy pages – perhaps like these – would be something Google was going to investigate.
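
If you’d rather surface these worst offenders from a raw export than eyeball WMT, counting links per referring domain takes a few lines. A minimal sketch, assuming you’ve downloaded your links sample as a plain text file with one linking URL per line (the filename is hypothetical):

# Rank referring domains by how many known back-links they send.
from collections import Counter
from urllib.parse import urlparse

def links_per_domain(export_path):
    """Count back-links per referring domain from a one-URL-per-line export."""
    counts = Counter()
    with open(export_path) as f:
        for line in f:
            url = line.strip()
            if url:
                if "://" not in url:
                    url = "http://" + url
                counts[urlparse(url).netloc.lower().removeprefix("www.")] += 1
    return counts

for domain, n in links_per_domain("wmt_links.txt").most_common(10):
    print(f"{n:>7}  {domain}")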

Each time an action such as the one above was taken, a further reconsideration request was sent via WMT, updating Google on the situation.
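
Before sending each request, it’s worth confirming that the offending pages really have dropped their links. A minimal sketch using only the standard library; the target domain and input filename are assumptions:

# Re-fetch each offending page and report whether it still mentions the domain.
import urllib.request

TARGET = "myclientsite.com"  # hypothetical: the penalised domain to look for

def still_links_to(page_url, target):
    """Fetch a page and report whether the target domain still appears in it."""
    request = urllib.request.Request(page_url,
                                     headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(request, timeout=15) as response:
            html = response.read().decode("utf-8", errors="replace")
    except OSError:
        return False  # a page that has gone entirely also counts as removed
    return target in html

with open("offending_pages.txt") as f:  # hypothetical list, one URL per line
    for url in (line.strip() for line in f if line.strip()):
        print("STILL LINKING" if still_links_to(url, TARGET) else "clean", url)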

The first four or five reconsideration requests were all returned with the same generic answer – “You still got problems there, son”.

Stage 2: Disavow

I started to analyze domains and create a disavow file. I never used anything other than the domain-level disavow format:

domain:example.com

The first disavow file had a few hundred domains to remove, and it went through five iterations over six months. After each submission a new reconsideration request was made. The replies changed from the standard “problems there, son” to specific citations of problem URLs. Some of these links went back five years. Nothing was safe. As the site had such a long history of web exposure, it was perfectly possible that capturing all the stray URLs was going to be impossible.
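
Collapsing an audited list of bad back-link URLs into that format is easy to script. A minimal sketch – the input and output filenames are hypothetical, and the “#” comment line is one the disavow tool accepts:

# Turn a list of bad back-link URLs into a domain-level disavow file.
from urllib.parse import urlparse

def build_disavow(links_path, out_path):
    """Collapse offending URLs into unique, sorted domain: entries."""
    domains = set()
    with open(links_path) as f:
        for line in f:
            url = line.strip()
            if url:
                if "://" not in url:
                    url = "http://" + url
                domains.add(urlparse(url).netloc.lower().removeprefix("www."))
    with open(out_path, "w") as out:
        out.write("# disavow file generated from the link audit\n")
        for domain in sorted(domains):
            out.write(f"domain:{domain}\n")

build_disavow("bad_links.txt", "disavow.txt")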

You get to a point where you have to take a step back and say, however brutal it might be: What options do I have?

1) completely bin the web-domain and start a new one from scratch

2) get on your knees and pray and wait for a clear day.

If you have the luxury of being able to do a re-brand, now is the time to do so, because a large segment of the industry has penalties in place, and a few of them don’t even understand why they are no longer on the first page!

And this is not because their web suppliers were overly naughty, but because Google’s link policy went retroactive on the industry and caught suppliers with their pants down.

Stage 3: Hadoken Disavow

In the end, our final proposal was to disavow literally everything that Google had on its books for the web domain. I saw 98% of the links the client had amassed over nearly a decade wiped off the slate. Google’s response: “You are now good enough to come back to the fold”.

What’s funny is this: if a website has its links wiped out, it will lose its rankings on the keywords it was doing well for in the past. However, if you have a large website of more than 100K pages, it’s still worth it. The trade-off between the long-tail traffic you will start to receive again and the big delivering keywords that your web-marketing company will now need to work at again to get back on the radar will be worth it in the end (realistically, most of the primary keywords are cluttered up with AdWords anyway, so you’d get only a dribble of traffic even in position #1 organically).

There is also an added bonus: while performing this intensive audit you may, as I did for this client, find a whole heap of problems that, when fixed, will make the site even better for the end user and for search engines at the same time.

Stage 4: Results

  • The manual penalty has been removed.
  • Indexing of the website’s pages has increased.

 

 







 
