It’s no secret that online marketing has been suffering in recent years. The use of ad blockers is on the rise, fueled primarily by poor advertising practices. That’s bad news for webmasters who draw most of their revenue from advertising partnerships – and even worse news for marketers.
“The majority of feedback from readers is that they block because of the nuisance of ads,” Destructoid founder Niero Gonzalez told Forbes in 2013, after finding out that over half of his visitors used ad-blocking software. “People are fed up with ads that expand and blow up in their face. If I wasn’t in the publishing industry, I would definitely use it.”
I’ve said it before, and it bears repeating: search engine optimization has changed. Back in its early days, when the Internet was still in its infancy, SEO was all about manipulating search algorithms and nailing down the most successful ranking tactics. Content was often a secondary concern.
In other words, the early days of SEO were about the engine rather than the user.
That’s no longer the case. With every update to its algorithms, Google’s making its search engine a little bit smarter and a little bit more capable of determining what a user will find interesting. On top of that, social media comprises a huge chunk of most web traffic – if someone finds a page interesting, they’re probably going to share it with their friends over Reddit, Facebook, or Twitter.
Since it first exploded onto the scene back in 2003, WordPress has established itself as one of the best content management systems in the world. It’s not terribly difficult to see why, either. It’s got one of the most user-friendly interfaces around, and is equipped with excellent features and functionality for both free and paying users. Because of this immense popularity, there exists a staggering number of plugins with which writers can optimize their blogs.
One thing I (and many others) love about the platform is that much of the SEO is already built into the blog design. Using a tool such as All-In-One SEO should frankly be enough for even the biggest SEO newbie to properly optimize their blog posts. However, that doesn’t mean proper optimization is a cakewalk. If you aren’t taking the necessary steps to write SEO-friendly posts to begin with, you’re not tapping into the full potential of the platform.
Web hosting can range in cost from free to pennies a month to hundreds or even thousands of dollars. If you’re contemplating setting up a website, you may wonder exactly what you get for your money. The services that web hosting companies offer vary, but all of them share a basic set of costs that enable them to get your site up and running on the net. We’re going to have a look at those costs and think about how free web hosting and very cheap web hosting companies pay for them.
For a company to offer web hosting, they need servers. Servers are usually high-powered computers capable of processing lots of data and delivering it to web clients. Your average home computer can do much the same thing, albeit far less efficiently, but hosting companies need specialist hardware with lots of RAM, disk space, and processing power.
A little over a year after its introduction, the Penguin algorithm was given a major update. As many have discovered, Penguin has had sweeping effects on the way Google deals with sites it considers to be trying to game the system with over-optimization.
Since Penguin first hit the servers, there have been two significant updates, both of which were largely tweaks or minor data refreshes. Penguin 4 brings a major update to the core algorithm and is expected to delve far deeper into sites in search of spammy tactics.
In case you’re confused by the version numbers: this is the third revision of the Penguin web spam program, which makes it the fourth Penguin release – hence “Penguin 4”. However, it’s the first major revision of the algorithm that underlies the program, so Google is calling it Penguin 2.0.
Google have introduced a new tag management service that allows website owners to streamline the process of managing analytics, advertising, and conversion tags on their site.
Anyone working in online marketing will be familiar with the headaches involved in managing the snippets of code that need to be included in sites to provide the necessary metrics for tracking site performance. Often these tags have to be tweaked, added, or removed fairly frequently, and coordination between marketers and webmasters is rarely as seamless as it might be.
With Google’s new tag manager, web developers will be able to add one code snippet to a page, and then allow marketers to manage the rest from a dashboard. Google Tag Manager has the potential to significantly increase the responsiveness and flexibility of tracking a site’s analytics. The new service includes a number of features to streamline the process of adding and monitoring tags: easy testing to ensure that tags added to a page are functioning as they should; version control so that users can roll back changes should they need to; and multi-account and user permission provisions, so that marketing agencies can manage the analytics and conversion tracking snippets on their clients’ sites.
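To give a sense of the workflow: once the single container snippet is on a page, site code pushes events into a global data layer, and marketers wire tags to those events from the GTM dashboard. Here’s a minimal sketch of that pattern – the event name `purchase_complete` and the `value` field are placeholder assumptions for illustration, not anything Google prescribes:

```javascript
// The container snippet (added once by the developer) defines a global
// dataLayer array. Site code pushes events into it; marketers configure
// tags in the dashboard to react to those events – no further code changes.
// 'purchase_complete' and 'value' are hypothetical names for this sketch.
var dataLayer = dataLayer || [];

function trackConversion(orderValue) {
  dataLayer.push({
    event: 'purchase_complete',
    value: orderValue
  });
}

trackConversion(49.99);
```

The point of the design is the separation of duties: developers touch the page once, while everything tag-related afterwards happens in the dashboard.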
(Image: PageRank illustration. Photo credit: Wikipedia)
There’s been a fair bit of chatter in recent months about the value of nofollow links for SEO. It’s been claimed that when assessing a site’s backlink profile, Google takes into account nofollow links as a signal of naturalness. Whether that’s true or not, it’s useful for any SEO to understand the nature of nofollow links, and especially whether nofollow backlinks are really worth pursuing.
On occasion, sites would rather that Google did not follow particular links or allow them to affect the target’s PageRank. There are various reasons for this, but one of the main ones is discouraging spam. For example, a common SEO tactic for gaining backlinks was to put thousands of links into the comment sections of blogs and forums. Many blogs now choose to have all links in comments marked “nofollow” as a way of discouraging such spam. Because nofollow links are ignored for PageRank purposes, spammers have no incentive to put their links on these pages. Sites like Wikipedia have all their outgoing links marked nofollow for exactly this reason: it discourages people from using Wikipedia’s authority to gain PageRank, as Matt Cutts discusses in the video below.
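The comment-sanitising step described above can be sketched in a few lines. This is an illustrative assumption about how a platform might do it, not any particular blog engine’s actual code, and the regex is for brevity only – a production implementation would use a proper HTML parser or the platform’s own template filter:

```javascript
// Sketch: mark every link in user-submitted comment HTML as nofollow,
// so that spammers gain no PageRank benefit from posting it.
// Skips <a> tags that already carry a rel attribute.
function addNofollow(html) {
  return html.replace(/<a\b(?![^>]*\brel=)([^>]*)>/g, '<a$1 rel="nofollow">');
}

const comment = 'Great article! See <a href="http://example.com/">my site</a>.';
console.log(addNofollow(comment));
// Great article! See <a href="http://example.com/" rel="nofollow">my site</a>.
```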
Google didn’t release their usual algorithm update news last month, so this week we have a plethora of juicy updates to look at. As has been the trend in recent months, Google have been concentrating on tidying up the SERP, improving their detection of high-quality content (Panda), and surfacing better information directly on the search page rather than requiring users to click through to the results.
Results Page Changes
One of the bugbears bothering the SEO community in recent weeks has been the way that many of the top results for a search come from the same domain. In the worst cases, almost all the results on a page can be from a single site. Google have made three changes to their site clustering algorithms that should improve the diversity of search results.