Category Archives: Featured

Small Business SEO Advice

I recently contributed to a Business Essentials piece in the Guardian by Mark Smith, offering SEO advice for small businesses; read the full article here.

Further to the comments from my industry colleagues, here is my more in-depth view, answering the questions posed by Mark.

How would you characterise the changes to SEO, past, present and future?

SEO has changed a lot since I first entered the search marketing world in 2007. In the past it was a simple split: ensure the site was well optimised from an on-page point of view, and build links to the relevant pages to have a good impact. Over time more factors have influenced SEO; algorithm changes have meant that we have had to adapt to what Google and other search engines see as the signals for improving rankings. Social media has played a big part from around 2011 to the present, not just for conversation but as another tool on the marketing belt to help promote content as we entered the content marketing world. There are many factors to look at now and in the future, such as the content/query relationship, off-site (backlinks), on-site (content quality), local search, secure/non-secure sites, and mobile-optimised/responsive sites.

Are businesses getting it wrong/right?

The majority of businesses working in-house or with agencies are being guided well and are getting it right most of the time; I have seen a number of employees attending training and taking advice from industry specialists. Those getting it wrong seem to be ill-advised, relying on old-school SEO methods, or simply haven't evolved.

If they are getting it wrong, why is that?

Older SEO techniques are still being practised: large-scale directory submissions, bulk article writing and paid-for links are still very present in the industry, and these can lead to problems with Google. A lack of current SEO knowledge can also hinder; I mentioned not evolving, and this is difficult if you have a small team with responsibilities other than working on the SEO of the site.

What are your personal experiences in relation to whether small businesses are struggling to keep apace of change?

Having worked within an agency for the past 7 years, I know small businesses can struggle to keep abreast of the ever-changing world of SEO; it's hard enough to keep pace with algorithm changes, and if the agency being used isn't keeping up with the industry, this will inevitably cause problems further down the line.

Is a lot of money/opportunity being missed?

Smaller businesses have a lot of opportunity to grow within specialist niches; the issue comes when competing in a larger market. Insurance is a big example: high street brokers want an online presence, but comparison engines dominate and brands with large budgets make it difficult to compete. Looking at local-based terms can help drive business online where there is the opportunity to make more money.

What can small businesses do to rectify this situation? Do they need a strategy? To employ their own staff to concentrate on content? Do they need to employ an agency? How do they discern between a good agency and a bad one? Or are there other measures they should be taking to ensure they don't miss out on opportunities?

I believe there is a place for small businesses to compete in their own market, and creating a strategy to grow with is important. Things to consider: what is the business goal? How can they get there? Also review the competition and look for gaps in the market to compete in. Having staff read up-to-date information and attend training courses to improve knowledge will go a long way. Employing an agency will help depending on budgets, and this can also help distinguish the good from the bad; what you are getting for your money is a big factor, and small budgets can be used effectively for more than just reporting or low-level link building.


Has Digg Dugg a Big Hole, banned by Google? No, just a mistake by Google

Another post on the banning/de-indexing of Digg by Google: datadial popped up in an SEO group with the first instance of this news, and many posts have followed; read my related posts for more opinions.

Update 1800 20th March 2013

Matt Cutts reveals Google inadvertently de-indexed Digg whilst tackling a spammer.

So, let's look at the scenario: Digg is no longer indexed in Google:



Their robots.txt file didn't exist earlier in the day, though it now seems to have recovered, so was this the issue? A missing robots.txt, or one returning an error, can cause Google problems indexing a site.
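For context, it's the rules inside robots.txt that tell crawlers what they may fetch. Here's a minimal sketch of how a crawler reads those rules, using Python's standard library (the rules below are hypothetical, nothing to do with Digg's actual file):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A disallowed path is blocked; everything else is crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page"))   # True
```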

They have millions of links (963,508,932 to be exact). Is it a link penalty? Most unlikely:


Unnatural Link Warnings – Ignore them if you want?!?

OK, did you just get the dreaded Google unnatural links warning message in Webmaster Tools?

Matt Cutts Drawing

Oh no, is my site to be penalised? Usually this would be the way, but no; there was even more confusion and panic after the latest spam messages were received, as Google's head of web spam Matt Cutts explained on G+:

If you received a message yesterday about unnatural links to your site, don’t panic. In the past, these messages were sent when we took action on a site as a whole. Yesterday, we took another step towards more transparency and began sending messages when we distrust some individual links to a site. While it’s possible for this to indicate potential spammy activity by the site, it can also have innocent reasons. For example, we may take this kind of targeted action to distrust hacked links pointing to an innocent site. The innocent site will get the message as we move towards more transparency, but it’s not necessarily something that you automatically need to worry about.

If we’ve taken more severe action on your site, you’ll likely notice a drop in search traffic, which you can see in the “Search queries” feature in Webmaster Tools for example. As always, if you believe you have been affected by a manual spam action and your site no longer violates the Webmaster Guidelines, go ahead and file a reconsideration request. It’ll take some time for us to process the request, but you will receive a followup message confirming when we’ve processed it.

This to me sounds like a load of cobblers, so what do we do next? Ignore the message?

Trawl through the thousands of back links to your site to find the rogue link?

If you’ve had the message what have you done about it?

There has now been an update from Mr Cutts, presumably prompted by all the negative feedback:

Update: Thanks to everyone who gave feedback on this change. An engineer worked over the weekend based on the suggestions here, and starting on Sunday we made two changes so you can tell the “individual links aren’t trusted” messages from the “our opinion of your entire site is affected” messages.

First off, we changed the messages themselves that we’ll send out to make it clear that for a specific incident “we are taking very targeted action on the unnatural links instead of your site as a whole.” So anyone that gets a message going forward can tell what type of action has occurred.

The second change is that these messages won’t show the yellow caution sign in our webmaster console like our other webspam notifications. This reflects the fact that these actions are much more targeted and don’t always require action by the site owner.

Thanks again for the feedback, and we’ll continue to work on ways to provide more useful and actionable information for site owners.

Hat tip for the Matt Cutts graphic by


ionSearch 2012 Videos

Leeds-based search marketing agency Blueclaw Media organised the ionSearch conference at The Carriageworks, Leeds, on the 18th April 2012.

The event saw some of the leading SEO professionals speak on a range of topics. A good blog post from a colleague covers a roundup of ionSearch 2012.

The videos from the event have been released, and here are my chosen few:

Black/Grey/White Hat SEO: Where Do You Draw the Line

Unibet & G+

Dave Snyder – Content Marketing in the Post-Panda World

Visit the official ionSearch site to view all the videos from the day.


Net Squared May Meetup: SEO 101 for Non Profits

On Tuesday May 22, 2012 I spoke at Manchester Net Squared's monthly meetup to talk all things SEO. The evening, over at the Manchester Digital Laboratory and hosted by Steven Flower, featured myself and Danny from PushOn, who talked about Pay-Per-Click & Google Grants for Non-Profits.

Here is the rundown of the talk and links to a few tools I mentioned.


  • Setting the scene
  • What is SEO
  • Value of SEO
  • How search engines rank content
    • On-site ranking factors
    • Off-site ranking factors
  • Leveraging your non-profit status

UK Search Volume, Jan 2012

Let’s set the Scene

I'm Paul Delaney, SEO Account Director at MEC Manchester, working on large-scale clients by day and with a number of SME companies, as well as local celebrity Terry Christian, in my spare time.

To kick things off let's take a look at the search volume in the UK for January: it's a staggering 18.33bn, or nearly 7,000 searches per second! Which basically means there is plenty of traffic to be had.
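The per-second figure is simple arithmetic; a quick sanity check (assuming a 31-day month):

```python
# Back-of-envelope check of the searches-per-second figure.
monthly_searches = 18.33e9            # UK search volume, January 2012
seconds_in_month = 31 * 24 * 60 * 60  # 2,678,400 seconds in a 31-day month

per_second = monthly_searches / seconds_in_month
print(round(per_second))  # roughly 6,800 searches every second
```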

What is Search Engine Optimisation?

SEO isn't a bunch of wizards and warlocks practising a mystic black art; it's about understanding how search engines index and rank content, and using that understanding to rank websites and other digital assets.

What is the Value of SEO?

Why is having an SEO campaign important? Over 70% of clicks come from the organic listings, and a first-page placement will attract 70-90% of all clicks for a search phrase.

Being number 1 earns a 36.4% click-through rate (Optify data), which shows the objective is to rank as high as possible to gain more traffic. The UK search space is dominated by Google, with 90%+ of the share.
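The CTR figures above can be turned into a rough traffic estimate. In the sketch below, only the 36.4% for position 1 comes from the Optify data quoted; the CTRs for positions 2 and 3 (and the search volume) are illustrative assumptions:

```python
# Rough traffic model: monthly searches multiplied by an assumed
# click-through rate for each ranking position.
monthly_searches = 10_000
ctr_by_position = {1: 0.364, 2: 0.125, 3: 0.095}  # position 1 from Optify; 2-3 hypothetical

estimates = {pos: round(monthly_searches * ctr) for pos, ctr in ctr_by_position.items()}
for pos, visits in estimates.items():
    print(f"Position {pos}: ~{visits} visits/month")
```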

A case study in the financial sector shows traffic increased from under 1,000 visits per month at positions 15-16 to over 5,000 after achieving a top 5 position.

As well as ranking websites, SEO can also influence other digital assets such as images, maps, product listings, news and video content.

How do search engines rank content?

Simply put, search engines crawl the web to understand what web pages are about in order to serve relevant pages to people searching. As SEO is more of an art than a science, it's important not to look at ranking factors in isolation: no one knows all of them, and the search engines use different algorithms to rank websites.

The first part of an SEO campaign is to get your house in order, essentially your website, so ranking factors for a page/site need to be looked at.

On-site Ranking Factors

The following factors are what search engines look at when determining what a web page is about. The HTML structure of the page/site can inform the engines of its topic; using some of the following will help (in order of importance):

  • Web-site topic
  • Title tag
  • URL
  • Internal link structure
  • Internal link anchor text
  • H1 heading
  • Words used in first paragraph

The key thing is to make sure the key meaning of the page is understood by search engines, so keyword stuffing is definitely not the way to go! Also make each page as unique as possible, as duplicated content causes issues.
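As an illustration of checking the on-site factors above, the title tag and H1 heading can be pulled out of a page programmatically and compared against the target phrase. A minimal sketch using Python's standard-library HTML parser (the sample page is made up):

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collect the <title> and <h1> text from a page so their
    keywords can be compared against the target search phrase."""
    def __init__(self):
        super().__init__()
        self._current = None
        self.title = ""
        self.h1 = ""

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

# Hypothetical page for illustration.
page = ("<html><head><title>Blue Widgets | Acme</title></head>"
        "<body><h1>Blue Widgets</h1></body></html>")
checker = OnPageChecker()
checker.feed(page)
print(checker.title)  # Blue Widgets | Acme
print(checker.h1)     # Blue Widgets
```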

Keyword Research

Selecting the right keywords to target for a site is important, and a few free tools (we all love freebies!) from Google, such as Insights for Search and the Google AdWords Keyword Tool, can help you understand what users are searching for. You will often hear the phrases short-tail and long-tail keywords, so understanding them is key.

Short-tail keywords are your super-generic terms that have large volumes of searches but are the most competitive; long-tail searches have more words in the phrase and are generally lower in volume, but can have a higher conversion rate as the term is more specific.
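The short-tail/long-tail distinction can be sketched as a crude word-count rule. The three-word threshold below is an assumption for illustration, not an industry standard:

```python
def tail_type(phrase, threshold=3):
    """Classify a keyword phrase as short-tail or long-tail by its
    word count (threshold is an illustrative assumption)."""
    return "long-tail" if len(phrase.split()) >= threshold else "short-tail"

print(tail_type("insurance"))                            # short-tail
print(tail_type("cheap car insurance for new drivers"))  # long-tail
```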

Learning to optimise text is more of a skill than a natural ability, think about synonyms and plurals as well as related keywords.

Off-site Ranking Factors

A website forms part of an ecosystem on a global scale, and the search engines use this to understand what sites are about and how important they are. So what factors do search engines consider when looking at the links pointing to a web page or domain?

  • Words used in the anchor text of external links
  • Quality of links pointing to the web-page
  • Quality of links pointing to the web-site
  • Number of links pointing to the web-page
  • Number of links pointing to the web-site
  • Diversity of link sources
  • Words used on the linking page
  • Relevance of link sources

And the big question is, why is this important? The better the links the better your rankings!
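To make the idea concrete, here is a toy score combining a few of the off-site factors listed above: link quality, relevance of the source, and diversity of linking domains. The weights are invented purely for illustration; real ranking algorithms are far more complex and not public:

```python
def link_score(links):
    """Toy backlink-profile score: reward higher-quality and relevant
    links, scaled by how diverse the linking domains are."""
    unique_domains = {link["domain"] for link in links}
    diversity = len(unique_domains) / max(len(links), 1)
    quality = sum(
        link["quality"] * (1.5 if link["relevant"] else 1.0)
        for link in links
    )
    return quality * diversity

# Hypothetical backlink profile.
profile = [
    {"domain": "news-site.example", "quality": 0.9, "relevant": True},
    {"domain": "blog.example",      "quality": 0.4, "relevant": False},
]
print(round(link_score(profile), 2))  # 1.75
```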

There are 2 ways to get links to your site:

Link-baiting – where you can develop assets to draw links: Press releases, Infographics, Tools/Calculators, App development to name a few.

Link-building – simply just going out and getting links: Directory submissions, Guest blog posts, Advertorials, Article syndication, Widget placement, Partnerships, Press release syndication.

It's not as simple as just getting links from other websites; be mindful of relevance and authority to attract the best quality links for your domain.

Leverage your non-profit status

3 key takeaways from my talk to leverage your non-profit status:

  1. Ask for links – Have relevant content, always ask for links to your site from your content.
  2. Optimise your assets – Remember to think about keywords when making changes to the site.
  3. Use your data – Analyse your analytics to understand your visitors, and as Steven mentioned, hook up with The Analysis Exchange for a free web analytics consultation for non-profits.

Ppppppickup a Google Penguin!

As a kid I would visit my grandparents and every week be given a pack of Penguin biscuits, so when I heard about Google's latest webspam algorithm update being called 'Penguin' I thought: 'penguin', seriously?

The SEO world can be a quirky lot at the best of times, so I thought this was a bit of a laugh, but no, it's real. Essentially the new Penguin update announced by Google is another step to reward high-quality sites; this update will reduce web spam and promote high-quality sites. So what tactics will Google be identifying for this latest update?

Blackhat SEO techniques that I thought were a thing of the past, such as keyword stuffing, link schemes, irregular link patterns and really badly spun content. As always, the Panda will be lurking to check over a site's content, so the two updates complement each other. Google want white hat SEOs to, and I quote:

be free to focus on creating amazing, compelling web sites

This goes without saying, and SEOs that work in this way should see no impact on results, as around 3.1% of search queries will be affected according to Google. Unlike Panda, this update covers all languages, so it will be deployed globally. So what happens if you have been walloped by the Penguin? Danny Sullivan has written a post over on Search Engine Land entitled Google Penguin update tips and advice; read it here.

As this isn't the first and most definitely won't be the last on this topic, feel free to leave a comment with your thoughts or any penalties that have been spotted.


Google Search Quality Update – March 2012 50 Updates

Another month and another Google update from their Inside Search quality changes for March.

Google have announced 50 new search updates; I will look at a few specifics in detail.

As I alluded to in last month's Google February 40 changes post, an interesting update was mentioned regarding link evaluation. Now there is a little more detail, and it will be interesting to see what impact this has on anchor text in links:

Tweaks to handling of anchor text. [launch codename “PC”] This month we turned off a classifier related to anchor text (the visible text appearing in links). Our experimental data suggested that other methods of anchor processing had greater success, so turning off this component made our scoring cleaner and more robust.

Also, an improvement in how anchor text is interpreted for a given query and website:

Better interpretation and use of anchor text. We’ve improved systems we use to interpret and use anchor text, and determine how relevant a given anchor might be for a given query and website.

The Panda update again; it seems Google will now keep this database fresh with updates to improve the content algorithm:

High-quality sites algorithm data update and freshness improvements. [launch codename “mm”, project codename “Panda”] Like many of the changes we make, aspects of our high-quality sites algorithm depend on processing that’s done offline and pushed on a periodic cycle. In the past month, we’ve pushed updated data for “Panda,” as we mentioned in a recent tweet. We’ve also made improvements to keep our database fresher overall.

The full list has a few others but these are my main points from an SEO point of view.

Google Search Quality Update – February 40 Updates

Google updated their Inside Search quality changes for February this week and announced 40 new updates; I will digest the list and feed back on specific SEO-related changes.

Firstly, an interesting update was mentioned regarding link evaluation. As ever, Google are quite vague on the update so as not to unlock Pandora's box with secrets of the algorithm, but it is interesting nonetheless to see what has been 'switched off' in link signals:

Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.

Next, the Panda update: this is not a ranking signals change but more of a data refresh, similar to previous Panda updates – see The Google Panda Update, One Year Later [infographic]:

Panda update. This launch refreshes data in the Panda system, making it more accurate and more sensitive to recent changes on the web.

An update to the site: search query; this could be interesting in seeing how Google ranks a site's pages:

“Site:” query update [launch codename “Semicolon”, project codename “Dice”] This change improves the ranking for queries using the “site:” operator by increasing the diversity of results.

Here are a few others from the Google update post:

Tweak to categorizer for expanded sitelinks. [launch codename “Snippy”, project codename “Megasitelinks”] This improvement adjusts a signal we use to try and identify duplicate snippets. We were applying a categorizer that wasn’t performing well for our expanded sitelinks, so we’ve stopped applying the categorizer in those cases. The result is more relevant sitelinks.

Less duplication in expanded sitelinks. [launch codename “thanksgiving”, project codename “Megasitelinks”] We’ve adjusted signals to reduce duplication in the snippets for expanded sitelinks. Now we generate relevant snippets based more on the page content and less on the query.


Just Host – Just Don’t! – Just Host Review

Just Host, Just Don’t! My Just Host Review

UPDATE: 8th December 2012

Well, it's another year on, and on the day of expiration of the account that had so many issues last year, Just Host are at it again!

Now, correct me if I'm wrong, but doesn't this screenshot from the account management screen say due to expire on 8th December 2012?

Well, I then went into my cPanel to back up and move away my files and, lo and behold, my account is suspended. NOT AGAIN!

I raised a ticket, but as per last time it's a weekend, so probably no answer on this until they delete my files.

Thanks again Just Host, crappy service again!



OK, I don't usually complain publicly about bad experiences with companies, but for Just Host this will be a massive exception!

Roll back 12 months: I created an account with Just Host to host a number of sites for both myself and clients of my web design business, and as of the 9th December 2011, no issues at all.

My domain was up for renewal on the 8th and the automatic PayPal subscription didn't work, so I logged into my Just Host account and paid the subscription; site still working fine.

On the 9th December my site goes down. No problem at first, as I understand the domain may have slipped through the renewal net, but having received a payment notification email, all should be good.


Just Host's “automatic domain holding page” gets switched on, leaving my site down. So, seeing they have 24/7 technical support, I raise a ticket and subsequently call Just Host to report the problem.

Just Host - Problem 1

Just Host's apparent 24/7 tech support

Now I'm speaking to Just Host on the phone, and whilst the technician was polite, he advised I call billing, as they would be able to switch off the Just Host holding page (an affiliate cash-generating page at my expense).

I re-dial and Just Host billing is closed. OK, support are in the USA, so the time difference is a factor here; I then call back 3 hours later and, to my shock and horror, billing are closed at the weekend!

I then speak to support at Just Host and ask what they will do about it, and technical support say they can't do anything until billing open up on Monday!

I then re-raise my Just Host support ticket to make sure they have it in their inbox to action.

Now Monday comes: the site has been down for over 72 hours, Google have removed me from the SERPs for my brand terms, and I receive an email:

Just Host Support email


Just Host apologise and basically say they will refund my hosting costs. Fine, but I now have to back up all the sites on my account and move them to my new preferred host, Vidahost.

I am now going to formally complain about the poor experience with Just Host, the time it has taken for my site to be reinstated, and the loss of potential business.

Updates to follow.

Just Host? Just Don’t!
