Top SEO Experts Chart

#  | Expert (Google pages)                    | Nickname      | Website/Blog (SEJ 2005 rank)                     | Location            | Favorite Forum           | Style
1  | Aaron Wall (276K), seobook (370K)        | seobook       | www.seobook.com (2); www.threadwatch.org         | State College, PA   | SEW Forums               | white hat
2  | Ammon Johns (11K)                        | Black_Knight  | www.e-marketing-news.co.uk                       | Brighton, England   | cre8asite                | gray hat
3  | Andy Beal (259K)                         | -             | www.marketingpilgrim.com                         | Raleigh, NC         | -                        | white hat
4  | Andy Hagans (52K)                        | Andy Hagans   | www.andyhagans.com; www.linkbuildingblog.com     | Cincinnati, OH      | -                        | white hat
5  | Barry Schwartz (387K), rustybrick (188K) | rustybrick    | www.rustybrick.com                               | Suffern, NY         | seroundtable             | white hat
6  | Barry Welford (17K)                      | bwelford      | www.cre8asiteforums.com                          | Montreal, QC, Can.  | cre8asite                | white hat
7  | Ben Pheiffer (30)                        | Phoenix       | www.ranksmart.com                                | Austin, TX          | SEOChat                  | white hat
8  | Bernard (no last name)                   | Bernard       | www.measuring-up.com                             | Friendswood, TX     | SEOChat                  | white hat
9  | Brad Callen (86K)                        | -             | www.seoelite.com                                 | Indianapolis, IN    | -                        | white hat
10 | Brad Fallon (79K)                        | -             | www.bradfallon.com                               | Atlanta, GA         | -                        | white hat
11 | Brett Tabke (89K)                        | Brett_Tabke   | www.webmasterworld.com                           | Austin, TX          | wmw.com                  | white hat
-  | Bruce Clay (162K)                        | -             | www.bruceclay.com                                | Simi Valley, CA     | organic listings         | white hat
12 | Chris Boggs (33K)                        | Chris Boggs   | www.avenuea-razorfish.com                        | Baltimore, MD       | SEW Forums               | white hat
13 | Danny Sullivan (699K)                    | dannysullivan | searchengineland.com; daggle.com                 | Salisbury, England  | SEW Forums               | white hat
14 | David Naylor (178K)                      | DaveN         | www.davidnaylor.co.uk                            | UK                  | SEW Forums               | gray hat
15 | Donna Fontenot (646), dazzlindonna (40K) | dazzlindonna  | www.seo-scoop.com                                | Franklinton, LA     | SEO Refugee              | white hat
16 | Erik Dafforn (78K)                       | -             | intrapromote.com; seo speedwagon (8)             | Stow, OH            | -                        | white hat
17 | Greg Boser (55K)                         | webguerrilla  | www.webguerrilla.com (5)                         | Newhall, CA         | SEO Rock                 | gray hat
18 | Jakob Nielsen (1.4M)                     | -             | www.useit.com                                    | Fremont, CA         | -                        | white hat
19 | Jennifer Slegg (36K), jensense (155K)    | jensense      | www.jensense.com                                 | Victoria, BC, Can.  | wmw.com                  | white hat
20 | Jill Whalen (165K)                       | Jill Whalen   | www.highrankings.com                             | Ashland, MA         | SEW Forums               | white hat
21 | Jim Boykin (141K)                        | webuildpages  | www.webuildpages.com; www.jimboykin.blog (6)     | Troy, NY            | (dp) forums              | white hat
22 | Lee Odden (112K)                         | toprank       | TopRank Results; TopRank.blog                    | Mound, MN           | SEW Forums; seroundtable | white hat
23 | Matt Cutts (1.8M)                        | matt          | www.google.com; www.mattcutts.com/blog (1)       | Mountain View, CA   | -                        | white hat
24 | Michael Gray (854K)                      | Graywolf      | www.wolf-howl.com                                | New York, NY        | -                        | white hat
25 | Mike Grehan (153K)                       | -             | www.search-engine-book.co.uk; www.mikegrehan.com | UK                  | -                        | white hat
26 | Morgan Carey (23K)                       | seo_guy       | www.seo-guy.com                                  | BC, Canada          | SEOChat                  | white hat
27 | Nick Wilson (199K)                       | nick          | performancing.com; performancing.com/blog        | Denmark             | -                        | white hat
28 | Rand Fishkin (100K)                      | randfish      | www.seomoz.org; www.seomoz.org/blog (3)          | Seattle, WA         | seroundtable; SEW Forums | white hat
29 | Roger Montti (2K), martinibuster (806K)  | martinibuster | www.martinibuster.com                            | San Francisco, CA   | wmw.com                  | white hat
30 | Shari Thurow (137K)                      | -             | www.searchenginesbook.com                        | Carpentersville, IL | -                        | white hat
31 | Shawn Hogan (62K)                        | Shawn         | www.digitalpoint.com                             | San Diego, CA       | seroundtable             | white hat
32 | Todd Friesen (20K), oilman (711K)        | oilman        | www.oilman.ca                                    | Victoria, BC, Can.  | SEO Rock                 | gray hat
33 | Todd Malicoat (58K), stuntdubl (370K)    | stuntdubl     | www.stuntdubl.com (4)                            | Troy, NY            | wmw.com                  | white hat
34 | ??? ???                                  | Earl Grey     | www.syndk8.net                                   | Ireland (?)         | -                        | black hat
35 | ??? ???, quadzilla (247K)                | quadzilla     | seoblackhat.com (7)                              | East Teknopolis (?) | seoblackhat.com          | black hat

Google XML Data Feed Optimization Tips

· Add products to the appropriate category.

· Always include searchable keywords in the product description.

· Make use of custom/optional fields for better visibility on comparison shopping engines (CSEs); use them for additional product attributes.

· Add relevant keywords to the product title and description.

· Make sure you have provided an active, correct link to the product landing page.

· Monitor competitors' prices frequently and set your sale price strategically; conduct price analytics.

· Provide attractive offers such as free shipping, discount sales, clearance sales, coupons, and rebates.

· Understand the data feed requirements of each CSE when creating, optimizing, and submitting your product data feed.

· Drop non-performing products.

· Remove out-of-stock products.

· Conduct an FDSI (Features, Description, Specifications, Images) comparison between your listings and those of other merchants in the shopping engine.

· Provide correct product images.

· Conduct A/B testing and track which ad works best for you. Click here for A/B testing basics.

· Figure out the ROI from each comparison engine and turn off any engine that is not performing well.

· Review and submit your shopping feed frequently.

· If there is a high CTR (click-through rate) from a CSE but no returns/sales, there might be one of the following problems with your feed and/or landing page.



Wrong placement of a product in the shopping engine will also yield tons of irrelevant clicks.

· A shopper might come to the retailer's site intending only to learn about the product.

· A slight difference in the title and description will discourage users from buying.

· Links do not take users to the correct landing page.

· Unreliable merchant (poor look and feel, not a secure website, bad information structure, invisible or wrong placement of the shopping cart, etc.)

· Product specification mismatch

· Price mismatch

· Shipping information mismatch

· Long checkout process
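Several of the checks above (required fields, a working landing-page link, removing out-of-stock products) can be run mechanically before each feed submission. A minimal sketch, assuming a feed of dicts with hypothetical field names (title, description, link, price, in_stock):

```python
# Pre-submission feed check. Field names are made up for illustration;
# map them to whatever your CSE's feed spec actually requires.

REQUIRED_FIELDS = ("title", "description", "link", "price")

def clean_feed(products):
    """Keep only products with every required field, a plausible
    landing-page URL, and stock on hand."""
    cleaned = []
    for p in products:
        if any(not p.get(f) for f in REQUIRED_FIELDS):
            continue                      # incomplete listing
        if not p["link"].startswith(("http://", "https://")):
            continue                      # broken landing-page link
        if not p.get("in_stock", False):
            continue                      # out-of-stock products are removed
        cleaned.append(p)
    return cleaned

feed = [
    {"title": "Blue Widget", "description": "A widget",
     "link": "https://shop.example.com/widget", "price": "9.99", "in_stock": True},
    {"title": "Old Gadget", "description": "Discontinued",
     "link": "https://shop.example.com/gadget", "price": "5.00", "in_stock": False},
    {"title": "No Link", "description": "Oops",
     "link": "not-a-url", "price": "1.00", "in_stock": True},
]
print([p["title"] for p in clean_feed(feed)])  # → ['Blue Widget']
```

Running a filter like this on every export keeps dead links and sold-out items from ever reaching the shopping engine.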

Site Usage: General but Important Terms of Analytics

Let's look at some of the important terms you will generally see in web analytics. Together they summarize the position of the whole site.

Visits: A visit begins when a user makes a request to the web server, so Visits represents the total number of times people have come to your website. An important point to note: when a user is inactive for more than 30 minutes, the next request is counted as a new visit.

Page Views: The total number of pages seen by users within a time frame. For example, if a website contains 100 pages and every one of an estimated 1,000 monthly visitors viewed each page, the site would receive approximately 100,000 monthly page views (100 x 1,000 = 100,000). Page views are also known as page impressions.

Pages/Visit: The average number of pages seen per visit.

% Bounce Rate: The percentage of visitors who arrive at an entry page of the site and then leave without viewing any other pages; in other words, the user leaves without going deeper into the site.

Avg. Time on Site: The estimated time a visitor spends on the site. Because of the 30-minute inactivity rule, activity after 30 idle minutes is counted as a new visit.

% New Visits: The percentage of visits that come from first-time visitors.
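The definitions above reduce to simple arithmetic. A sketch, assuming each visit is recorded as a (pages_viewed, is_new_visitor) pair, which is a made-up input shape:

```python
# Compute the site-usage metrics defined above from a list of visits.
visits = [(1, True), (5, False), (3, True), (1, False)]

total_visits = len(visits)
page_views = sum(pages for pages, _ in visits)
pages_per_visit = page_views / total_visits
# A bounce is a single-page visit.
bounce_rate = sum(1 for pages, _ in visits if pages == 1) / total_visits * 100
new_visits = sum(1 for _, new in visits if new) / total_visits * 100

print(page_views)        # 10
print(pages_per_visit)   # 2.5
print(bounce_rate)       # 50.0
print(new_visits)        # 50.0
```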

Quick security checklist for webmasters

In recent months, there's been a noticeable increase in the number of compromised websites around the web. One explanation is that people are resorting to hacking sites in order to distribute malware or attempt to spam search results. Regardless of the reason, it's a great time for all of us to review helpful webmaster security tips.

Obligatory disclaimer: While we've collected tips and pointers below, and we encourage webmasters to "please try the following at home," this is by no means an exhaustive list for your website's security. We hope it's useful, but we recommend that you conduct more thorough research as well.

* Check your server configuration.

Apache has some security configuration tips on their site and Microsoft has some tech center resources for IIS on theirs. Some of these tips include information on directory permissions, server side includes, authentication and encryption.

* Stay up-to-date with the latest software updates and patches.

A common pitfall for many webmasters is to install a forum or blog on their website and then forget about it. Much like taking your car in for a tune-up, it's important to make sure you have all the latest updates for any software program you have installed. Need some tips? Blogger Mark Blair has a few good ones, including making a list of all the software and plug-ins used for your website and keeping track of the version numbers and updates. He also suggests taking advantage of any feeds their websites may provide.

* Regularly keep an eye on your log files.

Making this a habit has many great benefits, one of which is added security. You might be surprised with what you find.

* Check your site for common vulnerabilities.

Avoid having directories with open permissions. This is almost like leaving the front door to your home wide open, with a door mat that reads "Come on in and help yourself!" Also check for any XSS (cross-site scripting) and SQL injection vulnerabilities. Finally, choose good passwords. The Gmail support center has some good guidelines to follow, which can be helpful for choosing passwords in general.
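Of the vulnerabilities mentioned above, SQL injection is usually closed by passing user input as query parameters rather than splicing it into the SQL string. A sketch using Python's built-in sqlite3 module and a throwaway in-memory table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "alice' OR '1'='1"

# Vulnerable: the input is spliced into the SQL string, so the injected
# OR clause becomes part of the query and matches every row.
vulnerable = conn.execute(
    "SELECT secret FROM users WHERE name = '%s'" % attacker_input
).fetchall()

# Safe: the driver passes the input as data, never as SQL.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()

print(vulnerable)  # [('s3cret',)] -- leaked by the injected OR clause
print(safe)        # [] -- no user is literally named "alice' OR '1'='1"
```

The same parameterized-query discipline applies in any language and any database driver.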

* Be wary of third-party content providers.

If you're considering installing an application provided by a third party, such as a widget, counter, ad network, or webstat service, be sure to exercise due diligence. While there is a lot of great third-party content on the web, it's also possible for providers to use these applications to push exploits, such as dangerous scripts, toward your visitors. Make sure the application is created by a reputable source. Do they have a legitimate website with support and contact information? Have other webmasters used the service?

* Try a Google site: search to see what's indexed.

This may seem a bit obvious, but it's commonly overlooked. It's always a good idea to do a sanity check and make sure things look normal. If you're not already familiar with the site: search operator, it's a way for you to restrict your search to a specific site. For example, the search site:googleblog.blogspot.com will only return results from the Official Google Blog.

* Use Google's Webmaster Tools.

They're free, and include all kinds of good stuff like a site status wizard and tools for managing how Googlebot crawls your site. Another nice feature is that if Google believes your site has been hacked to host malware, our webmaster console will show more detailed information, such as a sample of harmful URLs. Once you think the malware is removed, you then can request a reevaluation through Webmaster Tools.

* Use secure protocols.

SSH and SFTP should be used for data transfer, rather than plain text protocols such as telnet or FTP. SSH and SFTP use encryption and are much safer. For this and many other useful tips, check out StopBadware.org's Tips for Cleaning and Securing Your Website.

* Read the Google Online Security Blog.

Here's some great content about online security and safety with pointers to lots of useful resources. It's a good one to add to your Google Reader feeds. :)

* Contact your hosting company for support.

Most hosting companies have helpful and responsive support groups. If you think something may be wrong, or you simply want to make sure you're in the know, visit their website or give 'em a call.

We hope you find these tips helpful. If you have some of your own tips you'd like to share, feel free to leave a comment below or start a discussion in the Google Webmaster Help group. Practice safe webmastering!

7 must-read Webmaster Central blog posts

Our search quality and Webmaster Central teams love helping webmasters solve problems. But since we can't be in all places at all times answering all questions, we also try hard to show you how to help yourself. We put a lot of work into providing documentation and blog posts to answer your questions and guide you through the data and tools we provide, and we're constantly looking for ways to improve the visibility of that information.

While I always encourage people to search our Help Center and blog for answers, there are a few articles in particular to which I'm constantly referring people. Some are recent and some are buried in years' worth of archives, but each is worth a read:

1. Googlebot can't access my website
Web hosters seem to be getting more aggressive about blocking spam bots and aggressive crawlers from their servers, which is generally a good thing; however, sometimes they also block Googlebot without knowing it. If you or your hoster are "allowing" Googlebot through by whitelisting Googlebot IP addresses, you may still be blocking some of our IPs without knowing it (since our full IP list isn't public, for reasons explained in the post). In order to be sure you're allowing Googlebot access to your site, use the method in this blog post to verify whether a crawler is Googlebot.
2. URL blocked by robots.txt
Sometimes the web crawl section of Webmaster Tools reports a URL as "blocked by robots.txt", but your robots.txt file doesn't seem to block crawling of that URL. Check out this list of troubleshooting tips, especially the part about redirects. This thread from our Help Group also explains why you may see discrepancies between our web crawl error reports and our robots.txt analysis tool.
3. Why was my URL removal request denied?
(Okay, I'm cheating a little: this one is a Help Center article and not a blog post.) In order to remove a URL from Google search results you need to first put something in place that will prevent Googlebot from simply picking that URL up again the next time it crawls your site. This may be a 404 (or 410) status code, a noindex meta tag, or a robots.txt file, depending on what type of removal request you're submitting. Follow the directions in this article and you should be good to go.
4. Flash best practices
Flash continues to be a hot topic for webmasters interested in making visually complex content accessible to search engines. In this post Bergy, our resident Flash expert, outlines best practices for working with Flash.
5. The supplemental index
The "supplemental index" was a big topic of conversation in 2007, and it seems some webmasters are still worried about it. Instead of worrying, point your browser to this post on how we now search our entire index for every query.
6. Duplicate content
Duplicate content—another perennial concern of webmasters. This post talks in detail about duplicate content caused by URL parameters, and also references Adam's previous post on deftly dealing with duplicate content, which gives lots of good suggestions on how to avoid or mitigate problems caused by duplicate content.
7. Sitemaps FAQs
This post answers the most frequent questions we get about Sitemaps. And I'm not just saying it's great because I posted it. :-)
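The Googlebot verification method referenced in item 1 is a reverse DNS lookup followed by a forward lookup. A sketch using only the standard library; the DNS calls need network access, so the demo lines exercise only the hostname check:

```python
import socket

def googlebot_hostname(hostname: str) -> bool:
    # Genuine Googlebot reverse DNS resolves under these domains.
    return hostname.endswith(".googlebot.com") or hostname.endswith(".google.com")

def verify_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-resolve the
    hostname and confirm it maps back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    if not googlebot_hostname(hostname):
        return False
    return socket.gethostbyname(hostname) == ip

print(googlebot_hostname("crawl-66-249-66-1.googlebot.com"))  # True
print(googlebot_hostname("fake.googlebot.com.evil.example"))  # False
```

The forward lookup matters: anyone can set a reverse DNS record claiming to be googlebot.com, but only Google controls where googlebot.com hostnames resolve.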

Sometimes, knowing how to find existing information is the biggest barrier to getting a question answered. So try searching our blog, Help Center and Help Group next time you have a question, and please let us know if you can't find a piece of information that you think should be there!

Taking advantage of universal search

We recently unveiled exciting changes in our search results. With universal search, we've begun blending results from more than just the web in order to provide the most relevant and useful results possible. In addition to web pages, for instance, the search results may include video, news, images, maps, and books. Over time, we'll continue to enhance this blending so that searchers can get the exact information they need right from the search results.

This is great news for the searcher, but what does it mean for you, the webmaster? It's great news for you as well. Many people do their searches from web search and aren't aware of our many other tools for searching images, news, videos, maps, and books. Since more of those results may now be returned in web search, if you have content that is returned in these other searches, more potential visitors may see your results.

Want to make sure you're taking full advantage of universal search? Here are some tips:

Google News results
If your site includes news content, you can submit your site for inclusion in Google News. Once your site is included, you can let us know about your latest articles by submitting a News Sitemap. (Note: News Sitemaps are currently available for English sites only.)
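For reference, a News Sitemap entry generally has this shape; the publication name and URLs below are placeholders, and the exact schema should be checked against the current documentation:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/business/article123.html</loc>
    <news:news>
      <news:publication>
        <news:name>Example Times</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2008-03-01</news:publication_date>
      <news:title>Example article headline</news:title>
    </news:news>
  </url>
</urlset>
```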

News Archive results
If you have historical news content (available for free or by subscription), you can submit it for inclusion in News Archive Search.

Image results
If your site includes images, you can opt in to enhanced Image Search in Webmaster Tools, which will enable us to gather additional metadata about your images using our Image Labeler. This helps us return your images for the most relevant queries. Also ensure that you are fully taking advantage of the images on your site.

Local results
If your site is for a business in a particular geographic location, you can provide information to us using our Local Business Center. By providing this information, you can help us provide the best, locally relevant results to searchers both in web search and on Google Maps.

Video results
If you have video content, you can host it on Google Video, YouTube, or a number of other video hosting providers. If the video is a relevant result for the query, searchers can play the video directly from the search results page (for Google Video and YouTube) or can view a thumbnail of the video then click over to the player for other hosting providers. You can easily upload videos to Google Video or to YouTube.

Our goal with universal search is to provide the most relevant and useful results, so for those of you who want to connect with visitors via search, our best advice remains the same: create valuable, unique content that is exactly what searchers are looking for.

Taking advantage of universal search

Universal search and personalized search were two of the hot topics at SMX West last month. Many webmasters wanted to know how these evolutions in search influence the way their content appears in search results, and how they can use these features to gain more relevant search traffic. We posted several recommendations on how to take advantage of universal search last year. Here are a few additional tips:

1. Local search: Help nearby searchers find your business.
Of the various search verticals, local search was the one we heard the most questions about. Here are a few tips to help business owners get the most out of local search:
* Add your business to our Local Business Center
* Make sure your business is listed under the appropriate categories
* Make sure your business name, address and phone number appear on your website, and that they're accessible to search engines
* Add your business hours, images, coupons, or additional information to help searchers determine whether your business meets their needs
2. Video search: Enhance your video results.
Several site owners asked whether they could specify a preferred thumbnail image for videos when they appear in search results. Good news: our Video Sitemaps protocol lets you suggest a thumbnail for each video.
3. Personalized search basics
A few observations from Googler Phil McDonnell:
* Personalization of search results is usually accomplished through subtle ranking changes, rather than a drastic rearrangement of results. You shouldn't worry about personalization radically altering your site's ranking for a particular query.
* Targeting a niche, or filling a very specific need, may be a good way to stand out in personalized results. For example, rather than creating a site about "music," you could create a site about the musical history of Haiti. Or about musicians who recorded with Elton John between 1969 and 1979.
* Some personalization is based on the geographic location of the searcher; for example, a user searching for [needle] in Seattle is more likely to get search results about the Space Needle than, say, a searcher in Florida. Take advantage of features like Local Business Center and geographic targeting to let us know whether your website is especially relevant to searchers in a particular location.
* As always, create interesting, unique and compelling content or tools.
4. Image search: Increase your visibility.
One panelist presented a case study in which a client's images were being filtered out of search results by SafeSearch because they had been classified as explicit. If you find yourself in this situation and believe your site should not be filtered by SafeSearch, use this contact form to let us know. Select the Report a problem > Inappropriate or irrelevant search results option and describe your situation.
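For the Video Sitemaps thumbnail suggestion in tip 2, an entry looks roughly like this; all URLs and titles are placeholders, and the current protocol documentation should be consulted for the required tags:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/some_video_page.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Grilling steaks for summer</video:title>
      <video:description>A short clip showing how to grill steaks.</video:description>
      <video:content_loc>http://www.example.com/video123.flv</video:content_loc>
    </video:video>
  </url>
</urlset>
```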

Building SEO Momentum by Using A Consistent Site Structure

Change. It is a part of life, especially on the web. Evolve or die.

But some things need not change to be successful. In some cases change undermines your momentum, particularly in the field of search, where most of the traffic goes to the top couple ranked sites.

One of the biggest problems in the field of SEO for enterprise-level sites is content management. Product lines, editorial calendars, marketing, and content management systems often dictate that pieces and parts of a site are organized in a sub-optimal way and/or move locations.

Back in 1998 Tim Berners-Lee stated that Cool URIs don’t change:

There are no reasons at all in theory for people to change URIs (or stop maintaining documents), but millions of reasons in practice.

In theory, the domain name space owner owns the domain name space and therefore all URIs in it. Except insolvency, nothing prevents the domain name owner from keeping the name. And in theory the URI space under your domain name is totally under your control, so you can make it as stable as you like. Pretty much the only good reason for a document to disappear from the Web is that the company which owned the domain name went out of business or can no longer afford to keep the server running. Then why are there so many dangling links in the world? Part of it is just lack of forethought.

The 10-year-old document is as relevant today as the day it was published, and perhaps even more so, since search is now the primary mode of navigation on the web.

Map it Out in Advance

One of the easiest ways to avoid site decay is to plan your site out in advance. When you know where something belongs and have given it proper forethought, its half-life is much greater than that of a site built nearly at random. The following image shows an example of how you can plan out URLs, page titles, meta descriptions, on-page headings, and related keyword modifiers to include in the page.

Archiving & Static Hub Sections

Many publishing-based business models cover a seasonal topic, like Christmas or the Coachella music festival. Each year those publishers could create a new section with a filename like christmas2009, but doing so may make it harder to rank for core generic phrases like Christmas. When a person links to a year-specific URL, PageRank and anchor text are not consolidated across years. So, in a zen-like fashion, changing URLs each year throws away your old anchor text and has you starting from scratch again.

A better solution is to use a filename like christmas and update it each year to talk about the current year, while archiving the prior year's default page as something like christmas/2009/. In doing this, the core URL (yoursite.com/christmas) keeps building PageRank and anchor text each year, building on last year's success. Old content remains archived and easily findable by linking to the archives from the core page.

If you want to combine many related URLs like christmas2008 and christmas2007 into one core URL, you can do so through the use of 301 redirects. Keep in mind that you want to think through which URLs you want changed and which ones you do not, and then use a server header checker to verify the proper response codes.
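A server header checker can be as simple as a script that requests each legacy URL without following redirects and reports the status code and Location header. A self-contained sketch that stands up a throwaway local server issuing the 301, mirroring the christmas2008-to-christmas example above:

```python
import http.server
import threading
import urllib.error
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/christmas2008":
            self.send_response(301)
            self.send_header("Location", "/christmas")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_address[1]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # report the redirect instead of following it

opener = urllib.request.build_opener(NoRedirect)
try:
    opener.open(base + "/christmas2008")
    status, location = 200, None
except urllib.error.HTTPError as e:
    status, location = e.code, e.headers.get("Location")

server.shutdown()
print(status, location)  # → 301 /christmas
```

To check your own site, point the same no-redirect opener at each old URL and confirm you see a 301 with the Location you intended, not a 302 or a 404.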

Reclaiming Lost Link Equity

If someone is linking to a page that no longer exists on your site, you are wasting link equity and traffic. Some content management systems offer features or extensions that can be used to track 404s and other errors. Drupal offers an error log, and this WordPress redirection plug-in shows you pages people attempted to visit that returned a 404 status code. Your server logs should also help locate any pages returning 404 status codes.

Once you discover a well-linked-to URL that no longer exists, you have a couple of options:

* Look through your backups and/or Archive.org to see if any copies of the page's content are still available. If the content was high quality and you are uncertain why it went away, consider republishing it at the same URL.
* If the page moved or the content is no longer relevant to your site, then 301 redirect the URL to a related page.
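The server-log approach mentioned above can be automated with a short script that counts 404 hits per path and sorts by frequency, so the best-linked dead URLs surface first. A sketch over simplified, made-up Apache-style log lines:

```python
import re
from collections import Counter

# Pull the request path and status code out of a (simplified) log line.
LOG_LINE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[^"]*" (\d{3})')

log_lines = [
    '1.2.3.4 - - [01/Jan/2009] "GET /old-page HTTP/1.1" 404 512',
    '1.2.3.5 - - [01/Jan/2009] "GET /christmas HTTP/1.1" 200 4096',
    '1.2.3.6 - - [01/Jan/2009] "GET /old-page HTTP/1.1" 404 512',
    '1.2.3.7 - - [01/Jan/2009] "GET /moved-article HTTP/1.1" 404 512',
]

not_found = Counter(
    m.group(1)
    for line in log_lines
    if (m := LOG_LINE.search(line)) and m.group(2) == "404"
)

for path, hits in not_found.most_common():
    print(path, hits)
# /old-page 2
# /moved-article 1
```

The paths at the top of the list are the first candidates for a 301 redirect or republished content.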

Google offers two tools to help you reclaim misdirected traffic and link popularity that is being wasted on dead URLs. The 404 widget can be used to help site users find related content on your site, which is useful when someone makes a typo while typing out one of your filenames. Google Webmaster Central allows you to find 404 pages on your site that other webmasters are linking to.