More than 100 links on a page is OK? I really don’t understand Google

I sometimes seriously feel about Google the way men say they feel about women. It seems like their dance around link building never ceases. Either you don’t need links and great content on your site is enough, or you do need them but shouldn’t use exact anchors, or maybe you’re better off with social referrals (mainly G+ ones, of course). It’s not easy to keep up.

The latest announcement regards the maximum number of links allowed on a page. Now, the common convention in the industry has been – don’t go over 100. Some even choose to be stricter and never go over 50 or so. Aggregators and directories, whose entire website concept revolves around placing multiple links on each page, had to come up with workarounds like JavaScript onclick events instead of a simple <a href> tag, in order to avoid looking like spam sites.
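
As a rough illustration of that workaround (the URL and markup here are placeholders I made up, not taken from any specific site), the idea was to swap a normal, countable anchor for an element that only navigates through JavaScript, which search bots at the time generally did not follow:

<!-- A normal link - counted by crawlers and passing link credit -->
<a href="http://www.example.com/category/page-1/">regular link</a>

<!-- The workaround - looks and behaves like a link for users, but there is no href for the bot to count -->
<span class="link-style" onclick="window.location='http://www.example.com/category/page-1/';">not really a link</span>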

Apparently, all this was for nothing. According to Matt Cutts, Google stopped considering this factor back in 2008. You do have to be careful and make sure that each link on your page is there for a good reason, but having over 100 links on a page most likely won’t get you penalized.

I must say that even after watching this video I wouldn’t rush to stack my pages with links. I’ve seen too many directories, especially since April 2012, lose their SERP rankings and PageRank. That might be for different reasons, but then again, you never really know with Google. Moreover, Matt does confirm that the more outgoing links a page has, the less PageRank credit each of those links gets. So what does that say about the linking page? Is it losing its PageRank as well?

By the way, if you’ve read my previous post about PageRank, you know that I no longer regard it as that important when it comes to a site’s authority. However, I do believe that link flow is important, and PageRank might reflect that flow to a certain degree.

Check out the Matt Cutts video here and leave a word with your thoughts about it.

Is this a good link?

I’ve had several conversations with people from the SEO world lately, and I was amazed by how many of them still view link building the way they did before Penguin and Panda.

If you’re not familiar with the list of criteria that used to be strictly upheld when building links, here’s a quick refresher on the main ones:

  • Content relevant to the content on your site. If the Title tag and Meta description of the linking page contain relevant keywords, even better.
  • PageRank – of the home page and of inner pages.
  • A maximum number of outgoing links from the linking site (the exact number was never fully agreed on).
  • No outgoing links to ‘bad sites’, or what you would call ‘a bad neighborhood’.
  • No ‘nofollow’ in the page header and, of course, none in the actual <a href> tag (see the markup sketch right after this list).
  • Link location – could be in content, sidebar or footer. This includes placing the link in existing content.
  • Site Directories, Article Directories and Bookmarking sites – the more the better.
  • Anchor text – exact, medium or long tail.
  • Number of links obtained to your site – the more the better!
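
For anyone who never dealt with that ‘nofollow’ criterion directly, this is roughly what was being checked – a page-level nofollow in the header versus a link-level one on the anchor itself (the URL is just a placeholder for the example):

<!-- Page-level: tells search bots not to pass credit through any link on this page -->
<meta name="robots" content="nofollow" />

<!-- Link-level: only this specific link passes no credit -->
<a href="http://www.example.com/" rel="nofollow">anchor text</a>

<!-- What link builders wanted to see - a plain, followed link -->
<a href="http://www.example.com/">anchor text</a>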

Well, all that has changed. I’m not saying it has completely disappeared, but you can’t keep doing things the same way and expect results. In fact, not only are you unlikely to get ranked like this, you’re also very likely to get a penalty from Google.

There have always been Google guidelines indicating which links are considered natural and valuable and which are simply spam. But now these parameters are examined with a magnifying glass by both automated bots and human reviewers. It’s no coincidence that the number of site owners receiving warnings through their GWT (Google Webmaster Tools) account and getting penalized has increased significantly in the past 7 months. You don’t need statistics to know this; just look at the number of forum threads, blog posts and articles revolving around the topic to understand that it has become a real issue for many mainstream sites. I myself have experienced such a penalty and am still trying to recover from it.

So what changed?

  • First of all, PageRank is no longer a relevant factor. You can read all about it in my previous post about PR. You should now rely mainly on MajesticSEO’s metrics – Trust Flow and Citation Flow.
  • Second, you don’t need a really large number of links in order to get ranked. What you need is what I call the right mixture for you. It’s no longer a numbers game but a game of strategic thinking that puts the user’s behavior, on and toward your site, at the center of it all. We’ll talk more about this in my next article – Custom-made link profiles.
  • Third, for non-content links, the link location has to be either a well-established site directory or a real resources page that a site owner places on his site to genuinely help his users. Of course, you still need content links, but they have to sit in relevant, fresh content.
    This means you cannot simply add your link to an existing article that was published a month ago or more. Make sure it is placed in a new article, with the link highly relevant to the content of that article. For instance – you shouldn’t write an article about video games and drop in it a link to your home page with the anchor ‘video games’.
    It makes much more sense to link to an inner page on your site that deals with a specific piece of information given in that article, something people are likely to click in order to get more relevant information. Something like ‘the latest technology used in video games’.
  • Sidebar and footer links are fine, as long as the linking site keeps them to a minimum and they are all highly relevant to its users. To understand whether a link is relevant or not, picture yourself as a user of that site and ask whether you would click on most of the links located in its footer or sidebar.
  • Directories – web or article – and bookmarking sites are not the best approach, especially the low-quality ones. You can still use them, but keep the number of such links to a minimum and stay within the boundaries of your site’s category and niche.
    Instead of bookmarking sites, try to get yourself into feed reader tools. The way to do that is by creating good content and making sure you have a functional RSS feed. Of course there are ways to promote that feed of yours, but we won’t get into that right now.
  • Social media – social networks should be your main target when doing SEO, and I’m not talking about those old-school methods of spamming groups and walls with a link to your site. You have to use viral marketing methods and create a real buzz.
    If you’ve looked at Google Analytics in the past few months, you’ve seen that it now provides an option to track social conversations about you. This should serve as a perfect indication of how much Google values social networks right now.
  • Traffic through the link – the most valuable link is one that actually gets clicked by users who then stay on your site, browse it and use your services. “Well of course”, you say, “conversion is very important”. I say this is important not only for conversion but also for SEO.
    As far as Google is concerned, if a user followed a link and liked the page he reached, that link is natural and relevant and therefore valuable for your site.
  • Anchor text distribution has also become a big factor in this game of link building. It’s not enough anymore to use a mix of short, medium and long terms; you have to expand your linking text to general phrases like ‘this site’, ‘for more information about this’, ‘a really good source I found’ – I think you’re getting the hang of it already.
    You should also use your domain itself as an anchor, including in a plain, non-hyperlinked form (see the markup sketch right after this list). It’s not complicated if you think about it. Simply imagine how your site would be linked to if people really linked to it naturally.
    If you’re still not sure, use link-analysis tools to research a high-quality site that is likely to have a genuinely natural link profile and find out which anchors it is using.
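
To make that anchor distribution point concrete, here’s a small sketch of what a more natural-looking spread of links might look like. The domain and pages are placeholders I made up; your own mix should come from the kind of research described above:

<!-- Exact-match anchor - still fine, but use sparingly -->
<a href="http://www.example.com/">video games</a>

<!-- Generic, natural-sounding phrases -->
<a href="http://www.example.com/">this site</a>
<a href="http://www.example.com/guides/">a really good source I found</a>

<!-- The domain itself as the anchor -->
<a href="http://www.example.com/">www.example.com</a>

<!-- A plain, non-hyperlinked mention -->
You can read more about this at www.example.com.

<!-- A contextual deep link to a relevant inner page -->
<a href="http://www.example.com/latest-video-game-technology/">the latest technology used in video games</a>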

From all this it’s plain to see that link building nowadays means you also need a good understanding of SMO, viral marketing, content marketing, BI and public relations. It’s a mixture that guarantees you’ll not only get a stable, natural link profile, but also real users who are more likely to convert. I’ll get into the right mixture you should have in a link profile, and how to build it, in my next post.

Where did PageRank go?

Image by Stuart Miles on FreeDigitalPhotos.net

A few months ago I published an article in SearchEngineJournal about how I believe PageRank is no longer a real factor in a site’s authority. When I published the article, the topic was still very controversial. A lot of well-established SEO experts still believed that PageRank, with all of its disadvantages, was a relevant factor and that Google still gave it heavy weight when determining a site’s authority.

However, as the months passed by and no new PageRank updates occurred, more and more people joined what is now an almost universal opinion – PageRank is out of date.

Recently Google announced that there are not going to be any more PageRank updates this year. Now, although 2013 is close to its end, you need to take into consideration that the last update was on February 5th, 2013. If you’ve been in the SEO world for as long as I have, you know that’s a really big gap between updates. Moreover, as I indicated in my article on SEJ, my personal test cases showed that PR cannot be used as a factor for measuring a site’s value or anticipating its rank.

It has been replaced by what I believe is the next generation of PageRank – the Trust Flow and Citation Flow metrics by MajesticSEO. It seems that with these metrics they’ve managed to capture exactly Google’s perception of what a natural, well-established link profile should look like.

It’s not easy for us SEO veterans to completely change our perceptions, especially ones that have lasted for many years. However, there’s no escaping it; it’s time to look at SEO with different eyes and adjust ourselves to the new reality that Google has yet again forced us into.

Alternatives for Google Analytics – Part 2

As promised, this is the second part of my series on alternatives to Google Analytics. If you didn’t go through part one, I suggest you first read the Alternatives for Google Analytics – Part 1 page and only then continue with this post.

In my last post on this subject I promised I’d try to find the most reasonably priced of the alternative software available out there. I think I managed to do that, and I hope this series will help you choose the right website analytics program for you.

Onestat


It’s amazing how much you can learn about a product from its name. Onestat is as dull and pale as its chosen name. I didn’t even get to the stage of testing their free trial version – their interface is mostly built out of tables, with minimal if any graphics, and even if we take these aspects out of the equation you’ll still be getting insufficient data. I mean, let’s face it, if you’re going to offer an alternative to Google Analytics, and one that actually costs money, you at least need to offer the same if not better. In short – not one of my favorites. I do want to say one positive thing about them – they offer a live demo, which is a lot more than some other paid website analytics programs offer.

Metasun


They don’t offer a live demo and their services page doesn’t look very promising. From their screenshots they do seem to have a nice interface (which is more than I can say about their logo), but since their features and services don’t look like much I didn’t bother testing them. You are welcome to try them and add a comment about them here if you happen to have the free time to check them out.

Getclicky


Getclicky are the first ones that actually show some potential for presenting website analytics data properly. Their user interface is clean and welcoming, they have an editable dashboard, they show all the websites in the account on the dashboard with sufficient data about their daily progress, and they offer some cool extra features like live analytics data and Twitter spy. I did however bump into two problems which could not be solved by their support team:

1. If you have special characters in your key phrases and you want to export an Excel report, you get gibberish instead of the word itself. I managed to solve this by exporting the file as XML and then opening it with Excel 2007, which knows how to convert XML documents into an organized table. If you don’t have Excel 2007, you can try one of the programs out there for converting XML to Excel format; their cost ranges between $20-$30.

2. The second and actually bigger problem I found is that there’s no way to filter information in a way that gives you a better overview of visitor stats. For example, in Google Analytics there is an option to see in one table all the key phrases combined with the landing pages that were reached through those words, or the country they were searched from, or the browser, etc. I think you get the idea. Now, it’s not that Getclicky doesn’t provide this information, but it cannot be combined in one table. There is no solution for this one, and I have to say that I am stalling on buying this platform just because of that.

With all this said, Getclicky is still one of the best options I could find out there, and I spent about 6-7 hours in total searching and testing. I also got quick responses from their sales and support people, and you know that’s a great advantage. I suggest you try their free version and decide for yourself.

I regret that I don’t have a complete and perfect solution for you, but I can safely say that I probably saved you some valuable testing and searching time. I’d appreciate it if you could add your recommendations as a comment or email them to me directly. I promise that I’ll test and write about any Google Analytics alternative I am introduced to. I don’t promise to be kind, though. You’ll only get an objective opinion here.

So, for now, this ends the website analytics programs series – or as I like to call it, the Google Analytics alternatives series. Have a great week and stay tuned for my next post. It’s going to be about developing a Facebook application, something I am working on right now. I can tell you this much: you won’t believe how easy it is once you get the hang of it.

Alternatives for Google Analytics – Part 1

Google Analytics is probably the best free tool there is today for tracking the performance of your site when it comes to visitors, keywords, conversions and many more important parameters. With that said, some of you may have been asking yourselves whether exposing your sites to Google Analytics is the best strategic decision. For example, you might have several network sites which you use for link building, and you would prefer not to place all of these different sites in the same GA account. You can open a different account for each of the sites, but that’s time consuming, not to mention frustrating when you need to collect the data on a daily basis.

I’ve faced this dilemma myself recently, and it made me start researching possible alternatives that will provide all the data Google Analytics provides, preferably with no, or minimal, fees involved. These are some of the tools I’ve bumped into and tried. Reading this might save you a lot of time, so I suggest you read this list carefully before heading out on your own search.

Free Google Analytics alternatives

Piwik

Piwik is an open source analytics tool written in PHP. Now, this doesn’t mean that you necessarily have to know how to write PHP. The basic features of the tool are enough for getting a lot of the information you get from Google Analytics. The downside with this tool is that there is no main account dashboard from which you can see and manage multiple sites. Each of your sites will be on a different page with its own access details. The access is not very hard (you just add /piwik to the end of your URL), but still, this doesn’t solve the problem of saving time when collecting data from multiple GA accounts. Most of the other downsides, like the interface and filtering options, can be solved by a gifted PHP programmer, but then again you have to be one or pay for one.

Statcounter


All I can say is OMG!! – and not in a good way. This is one of the most difficult analytics tools I have encountered. Installation is time consuming and frustrating, and I am not sure it’s worth it. The user interface is boring, to say the least, and the free version only keeps data for your last 500 visitors and up to 250,000 page views a month, and that includes all of your sites on that account. This might be OK for accounts with a couple of small or medium sites, but accounts with many big sites probably won’t be satisfied with it. Other than that, there are almost no filtering options for crossing data together in the same table.

HiStats


These guys I tried for the longest period of time. They have many advantages – a nice user interface, properly filtered data, multiple sites and users in one account, and the data retention for a free account is at reasonable numbers. So what made me give up on them too in the end? Well, let’s start by pointing out their main disadvantage – they don’t communicate with their clients. I have tried to contact them several times and never got any response. Another disadvantage is the fact that they don’t present any data about each of the sites in the account on the dashboard. If you want to know what’s going on with each site, you need to go into that site’s actual page to find out. But with all this, I think the biggest disadvantage is the fact that they have many bugs. The monthly presentation of data stopped working after about 2 weeks of using this analytics tool and it hasn’t come back since (it’s been more than a month now). So in short, stay away from this one.

Paid site analytics tools

Woofra

Woofra have a nice user interface, and I couldn’t really find any disadvantages other than that they are not free and not cheap either. If you have one or two sites you can probably afford them, but if, like me, you have more than a hundred sites you need to follow, the costs can be high. They do offer a 3-month free trial, though with no reports.

Goingup

I tested Goingup for only about a week. I stopped using it because I noticed that they mess with the site’s source. I happened to be doing some cloaking checks around the same time I was trying Goingup, and I noticed that their code constantly changes the string length of the page. Now, this does not necessarily mean that something bad is going to happen, but I don’t take chances with things like cloaking.

That’s it for the first part. The second part will contain mostly paid site analytics tools (since I can’t find any good free ones so far), but I will try to present the cheapest or most reasonably priced ones I find.

Cloaking definition

Cloaking is a black hat SEO method that makes a site’s content appear different to users than it does to search bots.

Unintentional Cloaking

Anyone who knows anything about SEO will tell you that getting your site accused of, or even suspected of, cloaking is probably the worst thing that can happen to your site when it comes to appearing in Google.

If Google considers a site to be cloaking, it will most likely disappear from all search results, including a search query for the site’s full domain name.

*If you don’t know what cloaking is, visit our SEO dictionary for the cloaking definition.

The problem is that there are many reasons a site can be considered to be cloaking that you are not even aware of.

These are some that I’ve bumped into so far. If you know of any others, you are welcome to add them in the comments below.

Random tag cloud:

Now, I have experience with WordPress’s and Joomla’s random tag cloud widgets, so I can only refer to those in this post. The random tag widget makes sure that a different set of tags appears on the page with every server request for that page. The problem is that cloaking is checked by the string length of the page. If the string length is different with every server request, this might be suspected as presenting different content to users and to search bots.
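
As a hypothetical illustration (the tags and URLs below are made up, not taken from a real site), the same page could return slightly different markup on two consecutive requests, which is enough to change its total string length:

<!-- First request -->
<div class="tag-cloud">
<a href="/tag/poker/">poker</a> <a href="/tag/seo/">seo</a>
</div>

<!-- Second request - a different random set of tags, so a different string length -->
<div class="tag-cloud">
<a href="/tag/analytics/">analytics</a> <a href="/tag/link-building/">link building</a> <a href="/tag/rdfa/">rdfa</a>
</div>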

Random content plugin:

Also checked only in Joomla and WordPress, but I assume it will produce the same outcome in other random content systems. It has the same effect on the page as the random tag cloud (see above).

I have tried to get Google’s response on this but haven’t had much luck so far.

Site analytics code:

I have searched for several alternatives to Google Analytics, and during my testing of some of these alternatives I bumped into a few that cause a different string length on each server request for a page.

What to do if I am suspected of cloaking:

Well, if you used the tools below and found that you get a different string length for bots and users, I suggest you do the following:

  1. Download some kind of file comparison software. There are plenty out there that are free. I use ‘Beyond Compare’.
  2. Copy the source code of the page you’re checking.
  3. Refresh the page and copy its source code again.
  4. Run the comparison and see where the difference is.
  5. Fix the problem by taking out whatever widget or plugin causes the difference.
  6. Test it again.

With all this said, I would like to emphasize that I’ve run these cloaking tools on sites like amazon.com and got the same suspicious result, so I doubt there’s a real risk here. But, on the other hand, Google might be using the string length of a web page as a first indication and then come back with some more tests. If your site is not totally kosher, you might get a little slapping. So, in other words, if you don’t want special attention from Google, try to avoid things that will make Google examine you with a microscope.

These are some free tools for checking your site for cloaking:

http://web-tool.org/cloak-check/cloak-check.asp

http://best-seo-tools.net/cloaking/

And this is the question I’ve opened in the Google Webmaster Tools forum:

http://www.google.com/support/forum/p/Webmasters/thread?tid=317de42a132d5dcb&hl=en&fid=317de42a132d5dcb00047f29c6f5bcd4

Open Site Explorer

SEOmoz.org has released a new free tool for checking up on a domain’s link profile.

The tool was released on the 22nd of January and they are promoting it very heavily.

They have parameters like Page Authority, which, according to them,

Predicts this page’s ranking potential in the search engines based on an algorithmic combination of all link metrics. Page Authority scores are on a 100-point, logarithmic scale.

And Domain Authority, which is supposed to predict the domain’s ranking chances based on the incoming links to the site, or in their words:

Predicts this domain’s ranking potential in the search engines based on an algorithmic combination of all link metrics. Domain Authority scores are on a 100-point, logarithmic scale.

The tool also lets you filter results by follow, nofollow and 301 links, by internal and external links, and by links to the root domain only or to all pages and subdomains.

They also break out information about each incoming link and its authority rank. This way you can learn not only how many links are pointing to a site but also their quality.

Most of you know that you can get some of this information from Yahoo.com search results by placing your domain in the search query in this form: link:www.domainname.com

But sometimes this gets stuck, and you can’t separate the follow links from the nofollow ones.

In the meantime, I am very excited about this new tool and hope not to be disappointed by it.

Direct access to the Open site explorer: http://www.opensiteexplorer.org

RDFa WordPress Plugin Example

I’ve installed the RDFa WordPress plugin on 2 of my sites now and am waiting to see the effect.

One of them went up 6 pages one day after I activated the plugin, but I can’t say for sure yet that it has anything to do with it. This is why I’ve decided to try it on another site that is fairly new and has not yet had a lot of link building work done on it, as opposed to the first site I tested the RDFa plugin on.

So far I’ve noticed that the RDFa plugin affects only post pages. For example, see the markup it adds below:

<span property="dc:date" content="2010-01-28 02:50:55"
resource="http://www.winner-takes-all.com/sophon-sek-arrested-for-murder/" />
<span rel="http://www.winner-takes-all.com/sophon-sek-arrested-for-murder/"
property="dc:title" resource="http://www.winner-takes-all.com/sophon-sek-arrested-for-murder/">
Sophon Sek – Shock in the Poker Industry</span>

The plugin also adds these tags to the <head> section:

<link rel="meta" href="./wp-content/plugins/wp-rdfa/foaf.php"
type="application/rdf+xml" title="FOAF"/>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:dc="http://purl.org/dc/elements/1.1/">
<link rel="EditURI" type="application/rsd+xml" title="RSD"
href="http://www.winner-takes-all.com/xmlrpc.php?rsd" />

This can give you a good indication of how to implement RDFa manually.
Isn’t open source a great thing?

As you can see, RDFa is simply about adding identifiers to some areas of the content, such as dates, main topics, etc.
Although this blog is not about RDFa but about SEO in general, I am now in the process of learning about RDFa and testing it, so I’ve started a series of articles to summarize what I’ve learned so far.

RDFa – second step

If you’re new to RDFa and trying to figure out what it is and how to use it to your advantage on your websites, you can find most of your answers in this blog. This article is the second in a series (I’m not really sure at the moment how many there are going to be). If you’ve reached this page first, I suggest you read the first part of this series, called RDFa – the beginning.

So, after we’ve figured out what RDF is, what its connection to SEO is and how to use it in HTML pages, we can proceed to the actual way of implementing this wonderful and exciting tool.

The easy way is to use the RDFa WordPress plugin.

For those of you who are willing to put in some more effort and learn a little more about RDFa’s uses in SEO, keep on reading.

The first thing I should mention is that I haven’t yet found a way to place RDFa in WordPress pages. The system doesn’t save this format for some reason. I’ve written to WP about it and am waiting for a response. I’ll update this blog once I get it.

In the meantime, let’s just manage with what we have.

When using RDFa you need to start with a <div> tag that has these attributes in it:

<div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:the type you need for this content">

The Google Webmaster Tools example for this is for a reviews page, and they use this tag:

<div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:Review">

After that there are different attributes for different uses:
To state that this is the item being reviewed, you place this tag:

<span property="v:itemreviewed">Item reviewed - could be anything from hotels to cameras</span> 

For stating that this is the actual review you place this tag:

<span property="v:description">Your review for this item</span>

If you want to add the date the review was written, you do it like this:

<span property="v:dtreviewed">month, day, year</span>

And so on. As you can see, you can place the RDFa attributes in any tag that can contain style attributes. The vocabulary is also pretty intuitive and easy to understand.
Once you see it in use, you can understand how it helps search engines’ bots categorize your content more accurately.
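
Putting the pieces together, a minimal review block might look something like this (the item, review text and date are made-up examples that simply combine the attributes shown above):

<!-- A hypothetical review marked up with the RDFa attributes from this post -->
<div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:Review">
<span property="v:itemreviewed">Example Hotel, Tel Aviv</span> -
<span property="v:description">Clean rooms, friendly staff and a great location near the beach.</span>
Reviewed on <span property="v:dtreviewed">February 15, 2010</span>.
</div>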

W3.org has issued a large list of attributes that can be used as RDFa pointers. You can find the list here: http://www.w3.org/MarkUp/2009/rdfa-for-html-authors

As you can see, this is not a very complicated thing to learn. I simplified it as much as possible, and I am sure there are many more things to write about RDFa, which I might do in the near future, but in the meantime this is it.
The only thing I still owe you is WordPress’s answer about using these attributes in WordPress blogs.
I know there is a little chatter on the web about writing a plugin that will add an RDFa button to the content-box menu, but I haven’t bumped into anything like that yet.

In the meantime, these are some extra RDFa resources:
SitePoint article about Google and RDFa
RDFa add-on for Firefox