Are you one of those bloggers who always has a nagging thought in the back of her mind that goes something like this: “I really need to do something about optimizing my site so search engines can find it”? And are you also one who finds, when she does look into search engine optimization (SEO), that the whole process seems bewildering, laborious, and, well, dubious as well?

I’ve had a site since 2003 – you can find it here – that I launched when optimization was a far different animal.  Google’s algorithm was a lot simpler (get enough valid links in, and you’d start to rise in the rankings), and there were many billions fewer Web pages to compete with. Between that and the fact that the site focused on a narrow and fairly obscure topic, I was able to build traffic up to over a million page views a month within a year or so, and haven’t really had to give a thought to optimization ever since.

Now I’ve got this author site, and it’s a completely different story. Leaving for a later date whether you should even care, how in the world can just another genre writer expect to be found by a search engine today?

Well, there’s some good news and some bad news on that front. The bad news is that she can’t, at least not by exploiting SEO. The good news is that if your general site development and other marketing efforts are successful, your discoverability will rise as well.

Why is that? Mostly because Google’s algorithm has changed in ways that make it much more likely now to lead a user to the good, as compared to the simply optimized, stuff. That’s great news on several fronts. First, it means that most of those laborious, dubious, artificial, traditional SEO tasks, like liberally salting your site with keywords, are now a lot less likely to do you much good, so you’ve now got a great excuse to keep procrastinating. Second, it means that the playing field has become more level, with two sites of equal significance being more equally discoverable, regardless of the fact that one site owner is spending a lot of time and money on SEO and the other isn’t.

I’ve been hearing about the change to Google’s algorithm for some time, but never had a way to tell how much or how little it might actually mean to an author. Today, though, I accidentally created a graphic that really brought the impact of Google search changes into focus.

What I was up to was checking out the traffic of one of the sites (AuthorsDen.com) where I set up an author page some time back, and where I had also signed up to pay a $99 annual fee for greater visibility. While the site itself has a pretty out-of-date look and feel, it’s got wonderful author tools, almost instantaneous support from nice and helpful staff, and upgrade plans (including otherwise free ads) that make a lot of sense – just about everything that you’d want to find – except for awesome traffic. So the question was whether the site has enough traffic to justify renewing. I can tell from the available tools how many page views and clicks my book and author pages received, but which way was the site’s traffic trending?

To find out, I went to Alexa.com, where for free you can plug the address of any Web site in the world (including your own) into the window you’ll see in the upper right-hand corner of the landing page. A moment later, you’ll see a range of interesting data, including how many unique visitors the site in question receives. The numbers for AuthorsDen (as listed, and also as graphed in the chart at left) looked better than expected: its global rank today is 134,751, and its US rank is 73,582, with 77% of the traffic coming from the US, Canada, the UK, and India. While that may not sound like a rank to die for, consider how many hundreds of millions of web sites now exist.
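For the more technically inclined, at the time of writing Alexa also offered a simple XML data endpoint alongside its web interface, so you could pull the same rank numbers programmatically. Here’s a minimal sketch in Python of parsing that kind of response; the sample XML, the field names, and the endpoint itself are assumptions based on how the service worked at the time, and may well have changed since.

```python
import xml.etree.ElementTree as ET

# A hypothetical sample of the XML that Alexa's legacy data endpoint
# (http://data.alexa.com/data?cli=10&url=<site>) returned at the time.
# The attribute names here are assumptions, not a documented contract.
SAMPLE_XML = """<ALEXA VER="0.9" URL="authorsden.com/">
  <SD>
    <POPULARITY URL="authorsden.com/" TEXT="134751"/>
    <COUNTRY CODE="US" NAME="United States" RANK="73582"/>
  </SD>
</ALEXA>"""

def parse_alexa_rank(xml_text):
    """Pull the global and country ranks out of an Alexa-style data response."""
    root = ET.fromstring(xml_text)
    popularity = root.find(".//POPULARITY")  # global rank lives in TEXT
    country = root.find(".//COUNTRY")        # per-country rank in RANK
    return {
        "global_rank": int(popularity.get("TEXT")) if popularity is not None else None,
        "country": country.get("CODE") if country is not None else None,
        "country_rank": int(country.get("RANK")) if country is not None else None,
    }

print(parse_alexa_rank(SAMPLE_XML))
# {'global_rank': 134751, 'country': 'US', 'country_rank': 73582}
```

In practice you’d fetch the XML for each site you’re comparing and chart the ranks over time, which is essentially what the Alexa graphs in this post show.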

Still, in order to take advantage of AuthorsDen, you do have to spend some time there minding the store. So how does this site compare to the 900-pound gorilla of book sites – GoodReads? The answer was only a few clicks away at Alexa, and again you can see the results at left (the full results are here). That’s quite a trendline, isn’t it? GoodReads’ global rank today stands at 248, and its US rank at 138. That’s extremely impressive, especially when you compare it to a site like that of The New York Times, which today boasts a global rank of 119 and a US rank of 138. Obviously, all other things being equal, time, effort, and dollars spent at GoodReads have the potential to reap a much higher return.

Interestingly enough, though, I noticed some other data on these Alexa pages, which brings us back to Google and the effectiveness of SEO. Specifically, Alexa also shows you how many search results bring visitors to the sites in question. And when you compare the charts at left for GoodReads (top) and AuthorsDen (bottom), you see a radically different picture. Moreover, if you look again at the first two charts above, you can see that GoodReads’ rank (measured by total unique visitors) jumped when the Google algorithm changed and continued to rise, while AuthorsDen’s plummeted and continued to fall.

Clearly, whatever GoodReads is doing with its site is being valued much more highly under the new Google algorithm than AuthorsDen’s efforts. To further confuse matters, if you check out Smashwords’ numbers on Alexa, it’s hard to see any impact from the algorithm change at all. And for that matter, take another glance at the 2012 numbers for GoodReads and AuthorsDen on the charts we just looked at. Both charts show a big jump around late February. You can guess the cause, but note that last time, both sites benefited from the algorithm changes.

So what do you do with that? What I take away from this is that if three different book sites vary this widely in their success at maximizing search visibility, yours truly isn’t likely to be able to do any better. So why spend my limited time trying to play SEO tricks, as compared to adding content and expanding direct connections with people who are into books?

For my part, the answer is clear: I’ll stick to putting out quality content and establishing relationships. And when it comes to spending my time externally using the host of author sites that continue to spring up, I’ll continue to use Alexa as one source of data to help decide which three or four sites are most worth focusing my time, effort, and sometimes small amounts of cash on. How about you?

Update 4/19/14: I just typed in the search “John McPhee,” and the third entry – out of 9,260,000 – was this recent entry I posted at this blog:  http://updegrove.wordpress.com/2014/04/12/review-blue-bloods-by-ian-frazier-the-new-yorker/

Now that is truly weird. The only possible explanation for that post ranking that high is that Google’s new algorithm must be giving it high relevance because of the thousands of pages I’ve written here: http://www.consortiuminfo.org/ and the number of links back to them from other sites.

Have you discovered The Alexandria Project?
