Google have introduced a new tag management service that allows website owners to streamline the process of managing analytics, advertising, and conversion tags on their site.
Anyone working in online marketing will be familiar with the headaches involved in managing the snippets of code that need to be included in sites to provide the necessary metrics for tracking site performance. Often these tags have to be tweaked, added, or removed fairly frequently, and coordination between marketers and webmasters is rarely as seamless as it might be.
With Google’s new tag manager, web developers will be able to add one code snippet to a page, and then allow marketers to manage the rest from a dashboard. Google Tag Manager has the potential to significantly increase the responsiveness and flexibility of tracking a site’s analytics. The new service includes a number of features to streamline the process of adding and monitoring tags: easy testing to ensure that tags added to a page are functioning as they should, version control so that users can roll back changes should they need to, and multi-account and user-permission provisions so that marketing agencies can manage the analytics and conversion tracking snippets on their clients’ sites.
There’s been a fair bit of chatter in recent months about the value of nofollow links for SEO. It’s been claimed that when assessing a site’s backlink profile, Google takes into account nofollow links as a signal of naturalness. Whether that’s true or not, it’s useful for any SEO to understand the nature of nofollow links, and especially whether nofollow backlinks are really worth pursuing.
On occasion, sites would rather that Google did not follow particular links or allow them to affect the target’s PageRank. There are various reasons, but one of the main ones is to avoid spam. For example, a common SEO tactic for gaining backlinks was to put thousands of links into the comment sections of blogs and on forums. Many blogs now choose to have all links in comments marked “nofollow” as a way of discouraging such spam. Because nofollow links are ignored for PageRank purposes, spammers now have no incentive to put their links on these pages. Sites like Wikipedia have all their outgoing links marked nofollow for exactly this reason: it discourages people from using Wikipedia’s authority to gain PageRank, as Matt Cutts discusses in the video below.
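Mechanically, all this takes is a `rel="nofollow"` attribute on each anchor tag, which most blog platforms add to comment links automatically. A minimal sketch of how that rewriting might work (illustrative only; real platforms use a proper HTML parser rather than a regex):

```python
import re

def nofollow_links(html):
    """Add rel="nofollow" to every anchor tag that doesn't already carry a rel attribute."""
    def add_rel(match):
        tag = match.group(0)
        if 'rel=' in tag:
            return tag  # leave existing rel attributes alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\b[^>]*>', add_rel, html)

comment = '<p>Great post! Visit <a href="http://example.com">my site</a></p>'
print(nofollow_links(comment))
# → <p>Great post! Visit <a href="http://example.com" rel="nofollow">my site</a></p>
```

Once every comment link carries that attribute, the crawler simply declines to pass PageRank through it, and the spammer’s incentive evaporates.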
Google didn’t release their usual algorithm update news last month, so this week we have a plethora of juicy updates to look at. As with the trend in recent months, Google have been concentrating on tidying up the results page, improving their detection of high-quality content (Panda), and surfacing more information directly on the search page so that users don’t always have to click through to the results.
Results Page Changes
One of the bugbears that’s been bothering the SEO community in recent weeks has been the way that many of the top results for searches have been from the same domains. In the worst cases, a search can result in almost all the results being from the same site. Google have made three changes to their site-clustering algorithms that should improve the diversity of search results.
A couple of weeks ago we looked at negative SEO and what you can do to protect your site. One of the techniques we mentioned was hacking. Competitors, upon gaining access to a site, may alter the content or add malware to pages in the hope that Google will delist or penalize a site. Hackers may also simply attempt to use a site to spread their malware without any particular intentions regarding SEO.
We gave a number of suggestions for dealing with such an intrusion, but often, after having received a warning from Google, it can be difficult to determine exactly what the Googlebot crawler is seeing. Hackers are adept at making a site appear perfectly normal to those who go directly to a page, while serving malware or undesirable keywords and hidden links to search engine crawlers and those who arrive at a site from a search engine. What you see when you visit a site is not necessarily what Googlebot is seeing.
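One quick first check is to request your own pages twice, once with an ordinary browser user-agent and once identifying as Googlebot, and compare what comes back. A rough sketch using Python’s standard library (the URL is a placeholder; note that sophisticated attackers cloak by IP address rather than user-agent, so a clean diff here is not an all-clear):

```python
import urllib.request

# Hypothetical target; swap in the page you want to check.
URL = "http://example.com/"

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(url, user_agent):
    """Fetch a page while claiming to be a particular user agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def suspicious_lines(browser_html, bot_html):
    """Lines served only to the crawler -- a crude cloaking signal."""
    browser_lines = set(browser_html.splitlines())
    return [line for line in bot_html.splitlines()
            if line.strip() and line not in browser_lines]

# Usage (requires network access):
#   extras = suspicious_lines(fetch(URL, BROWSER_UA), fetch(URL, GOOGLEBOT_UA))
#   for line in extras:
#       print("served only to Googlebot:", line)
```

Anything that only appears in the Googlebot version, such as blocks of hidden keyword links, deserves immediate investigation.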
As we all know, backlinks are a critical factor in search engine ranking algorithms. Getting more incoming links from better sources is a sure-fire method for improving a site’s position in SERPs. With the coming of Google’s Penguin algorithm update, techniques that seek to build links in ways that look unnatural are being penalized. Google wants its rankings to reflect the needs and interests of its users, and those sites that are employing shady link-building techniques are thought, reasonably, to be less likely to meet that aim. With that in mind, here are three ways that webmasters and site owners can generate backlinks that aren’t going to ring alarm bells in Mountain View.
Matt Cutts’ recent confirmation that Google don’t use bounce rate as a signal for search engine ranking will be cold comfort to website owners who are confounded by their visitors’ refusal to stick around.
A site’s bounce rate is the percentage of visitors that leave the site from the page on which they arrived without interacting or following navigation. Sites with a high bounce rate are falling at the first hurdle, and it can often be difficult to determine exactly which factors are repelling users. Today we’re going to have a look at the five most likely reasons that your site is failing to engage people.
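The arithmetic behind the metric is simple enough to sketch. Here the session data is a toy stand-in for what an analytics package actually records, just a page-view count per visit:

```python
def bounce_rate(sessions):
    """Bounce rate: percentage of sessions that viewed exactly one page.

    `sessions` is a list of page-view counts, one per visit.
    """
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if pages == 1)
    return 100.0 * bounces / len(sessions)

# Four visits: two left from the landing page, two browsed further.
print(bounce_rate([1, 3, 1, 5]))  # → 50.0
```

What counts as a “good” rate varies wildly by site type, which is why the more useful exercise is diagnosing the causes below rather than chasing a universal number.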
Top Ten Ways To Improve Site Speed
As we saw in Part One of our series on website optimization, a slow site can seriously impact user experience. Anything over 3 seconds is going to be an irritation to users, and irritated users lead to lost sales, reduced conversions, and higher bounce rates. As an example, figures released by Walmart show that increased load times are significantly correlated with lowered conversion rates, with a precipitous drop as load times increase between one and three seconds.
Slow load times can also have an effect on SERP results. Google use site speed as one of their signals for determining ranking. It has a much smaller effect than relevant content, but any decent SEO will tell you that, all else being equal, a quicker loading site can give you a bump compared to competitors.
With that in mind, we’re going to take a look at 10 best practices that website owners can implement on their sites to make sure that their users have the best possible experience.
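Before applying any of the practices below, it is worth measuring where you stand. A minimal timing sketch using Python’s standard library (this only times the base HTML document; a real audit with browser developer tools would also count images, scripts, and rendering):

```python
import time
import urllib.request

def load_time(url):
    """Round-trip time to fetch a page's HTML, in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

# Hypothetical usage -- flag anything over the 3-second irritation threshold.
# if load_time("http://example.com/") > 3.0:
#     print("Users are probably getting impatient.")
```

Run it a few times and at different times of day: a single measurement can be skewed by caching or a momentarily busy server.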
Are Slow Load-Times Killing Your Conversion Rate – Part One: 6 Best Infographics
We’ve all felt that sense of irritation when we click on a link and have to wait while a slow page loads its dozens of unoptimized images and embedded widgets and analytics scripts, fetching components spread across servers in disparate locations and incurring significant extra round-trip lag. Today, we’re going to have a look at 6 of the best infographics on the subject, and next week we’ll show you the top 10 causes of slow web pages and what you can do to fix them.
1) Instant America by Mashable
Americans expect prompt service: we don’t like to wait for waiters and sales staff, and we find waiting for websites especially galling, as this infographic from Mashable neatly demonstrates.
In a clever addition, Google’s Inside Search blog announced last week that the search engine now functions as a graphing calculator: type a function into the search box, much as you would enter it on your old TI, and Google will plot it.
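Under the hood, any graphing calculator, whether a TI or a search box, works the same way: sample the function at a series of x-values and draw the resulting points. A toy sketch of that sampling step (the function and range are arbitrary examples):

```python
import math

def sample(fn, lo, hi, n=5):
    """Tabulate (x, y) pairs the way a graphing calculator would before plotting."""
    step = (hi - lo) / (n - 1)
    return [(lo + i * step, fn(lo + i * step)) for i in range(n)]

for x, y in sample(math.sin, 0.0, math.pi, n=5):
    print(f"{x:.2f} -> {y:.3f}")
```

Google, of course, samples far more densely and renders the result as an interactive plot right on the results page.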