SEO Sales Process: Overcoming Common SEO Objections

1. Search Engines Will Find Us/We Already Rank

Sure. Under what keyword terms? How much of the site are the spiders missing?
There is a big difference between arbitrary ranking in search engine listings and ranking for focused keyword terms. Demonstrate to the client the value of appearing under a wide variety of targeted keyword terms, as opposed to this being a random process. It is like the difference between advertising where few people are looking and appearing on a string of billboards in prominent locations.
You could do a side by side comparison between the client and a more established competitor using Compete.com graphs. If they already rank for valuable terms, try to get them to track the business derived from those rankings, and show them the upside potential of increasing rank.

2. We'll Have To Redesign Our Site. That Costs Money

Quite possibly.
Try to demonstrate to the client that the potential benefits outweigh the costs. One way to price organic search traffic is to use the PPC prices as a guide. It could also be argued that organic listings have a higher trust level amongst users, making the traffic potentially even more valuable.
So how much is that poor design costing them in terms of lost opportunity?

3. SEO is Expensive

A common objection, usually made because the client can't determine the amount of work required, or the value added.
Break down the work into separate tasks, and outline how long each task is likely to take. If the client knows your rate per hour, then they will be more able to determine if the cost is fair.
For example:
  • Industry analysis - research industry sector, marketing and sales trends.
  • Competition analysis - conduct review of competitor sites
  • Keyword research - research keyword terms
  • Site optimization, including title tags, meta tags, copy and internal linking
  • Link building/directory submission/social media promotion
  • Monitoring and reporting
Another aspect of this objection has to do with the value proposition. Again, try printing out the PPC bid prices for the same keyword traffic, and show how your work effectively undercuts that price. If you can, try and get information about how much the client spends on other channels, and do a side by side comparison of the relative merits, costs and benefits.

4. Upper Management Won't Support It

Perhaps you need to be talking to the decision maker ;)
Ask what upper management's objections would be. Sometimes this objection is legitimate, but it is often used to avoid having to tell you "no, thanks". The client cites an authority who isn't present, implying that any further negotiation with the client will prove fruitless.

5. Why Should We Change The Way We Write Just For Search Engines?

This objection is commonly used by copywriters and journalists.
Established writers often use methodologies that don't take SEO into account. One way to get around this objection is to request a trial run on a few test pages. Once you've demonstrated that writing effective copy can result in an increase in visitors and conversions, you'll have more sway when it comes to changing the rest of the site.
Also, appeal to the copywriter's vanity. If more people see their work, isn't that a good thing?
Cite "This Boring Headline Is Written for Google", an article about how The New York Times changed their writing practices to accommodate SEO.
"We're all struggling and experimenting with how news is presented in the future," said Larry Kramer, president of CBS Digital Media. "And there's nothing wrong with search engine optimization as long as it doesn't interfere with news judgment. It shouldn't, and it's up to us to make sure it doesn't. But it is a tool that is part of being effective in this medium."

6. SEO Doesn't Work. It's A Scam!

Ask the client why they feel this way. Has the client had dealings with SEOs in the past? Seen some bad press?
Have case studies on hand that demonstrate how you've solved search marketing problems in the past. Also provide recommendations from previous clients who were happy with your work.
Reframe the debate in terms of problems and solutions.

7. We Have A Strong Brand, So We Don't Need SEO

This is true, so long as people only search on the brand.
But what about those searchers who are searching for generic product/service names?
I once had this objection from a well-known children's clothes retailer. I ran a few search reports on generic searches, such as kids t-shirt, babywear, etc., and showed the client the traffic numbers. I then showed the client that their site wasn't appearing under any of those terms.
But her competitors were.
Why choose one or the other when you could easily have both?

8. We Like Flash. It's Cool!

Run away. Run fast..... ;)
Seriously though, such objections usually come from designers who place a lot of emphasis on site appearance, or want to play with the latest toys.
In the past, I've approached this in one of two ways. If they want to keep designing in Flash, or other technologies that make crawling and linking difficult, then suggest workarounds that don't affect the design. For example, create a print-friendly version of the site. This is the part of the site that gets crawled and seen by search engines and search visitors, while the designers can still focus on their elaborate designs. Essentially, you create a site within a site.
Show them that their competitors outrank them, in part, by using different technology. Is Flash really worth that competitive disadvantage?
From Google AdWords Blog:
"Did you know that 20% of the queries Google receives each day are ones we haven't seen in at least 90 days, if at all? With that kind of unpredictable search behavior, it's extremely difficult to create a keyword list that covers all relevant queries using only exact match."
It's even harder to capture that traffic using Flash.
BTW: Check out this example. Here is the spider's view of McDonalds.com.

9. Are SEO Services Really That Important?

Compared to.....?
It's an effort vs. reward question. Again, if you can demonstrate clear commercial benefits over and above the cost, then "hell yes!". Try to focus on the client's business problems, and be prepared to demonstrate how the SEO spend will solve those problems in cost-effective ways.
Those are a few common objections. I'm sure you've heard others. What is important to understand is that not all objections are legitimate. Most are stalling tactics used to delay making a decision. That decision is difficult to make because the client will expose themselves to risk.
Simply by being prepared for objections, you help negate that risk, and can quickly move the client towards making a decision.

Top Time-Saving Yahoo! Search Tips

When it comes to the Web, there's nothing wrong with cutting a few corners. That's why we decided to focus today's post on some time-saving tips for your next search. Some may seem obvious; others you may already know and use. But we hope a few will help you cut to the chase. You can find a full list of shortcuts and search tips here.
1. Square Brackets, "inurl," "originurlextension," and Site Restriction
To get a more targeted search, try these tricks out:

  • Words within square brackets -- adding square brackets to your search makes the keyword match order dependent. So typing in '[Jack Black]' will return results such as 'jack with black' but not 'black jack.'
  • "inurl" -- if you want to be sure that a specific term will appear in the site's URL, use the "inurl:[query]" operator. For example: 'inurl:iPod.'
  • Site restriction -- to restrict your search to pages within a specific domain, use the "site:[domain]" operator, followed by your query. For instance: 'Site:Apple.com iPod.'
  • "originurlextension" -- to search on specific file types, add 'originurlextension:[file format]' after your search query. For example: 'nanotechnology originurlextension:swf' OR 'nanotechnology originurlextension:pdf.'
2. Package Tracking
Did you know that you can track your packages right in Yahoo! Search? Here's how it works:

  • For UPS packages, simply type in your tracking number
  • For FedEx or the U.S. Postal Service, just add the name before the tracking number. For example: 'FedEx [tracking number]' or 'USPS [tracking number]'
3. Definitions & Synonyms
To look up the definition of a word, try adding "define" or "definition" to your search term. For example: 'quixotic definition,' 'definition of globalization' or 'define ergonomics.' Or, if you're looking for a synonym, try adding "synonym" to your search term. For example: 'humorous synonym.'
4. Exclude Terms, Either/Or and Exact Phrase Match
This one's been around for a while, but a few simple operators can be a huge time-saver:
  • Exclude terms -- if you want a term to be excluded from your results, use a minus sign before it. 'Simpsons -movie' returns results for "The Simpsons" TV show, books, games, etc., but not the movie.
  • Either/or -- by default, all of the words you use in a search are included in the results. If you want to be more flexible, try adding "OR" (note the capitalization) between two terms. For example: 'Sony laptops OR notebooks' gives you results containing either "Sony laptops" or "Sony notebooks."
  • Exact phrase match -- if you want results to contain an exact phrase, put quotation marks around it: "Queen Elizabeth I".
You can also combine these tricks for even more refined searches. Try: '"Sony VAIO" laptops OR notebooks.'
5. Travel
With the holidays approaching, many of us have travel on our minds. Here are a few shortcuts to get you to your destination even faster:
  • Flight tracker -- search for the airline and flight number and you'll get a shortcut to the flight's status. Try: 'American 83' or 'Lufthansa 421.'
  • Traffic -- if you're driving instead of flying, you can search for traffic before you leave. Example: 'traffic Los Angeles.' Click on the shortcut and you'll get a map with traffic alerts.
  • Maps -- try searching for the exact address: '1600 Pennsylvania Avenue, Washington, DC.' Don't have the address? No problem. Add "map" before the city: 'map San Francisco.' You can also search for the zip code by itself: '20502.'
You can go here to check out more handy travel shortcuts.

6. Yahoo! Services
If you're looking for a Yahoo! site, simply add an exclamation point after the site name and voila! Try it out with 'Mail!,' 'News!,' 'Sports!,' or 'Finance!'
7. Yahoo! Open Shortcuts
Yahoo! Open Shortcuts are the ultimate time-saving search feature. Add an exclamation point to the front of certain terms to instantly navigate to a URL, search a site, recall a favorite Yahoo! search, or start an application.
  • '!wiki queen elizabeth' takes you directly to the Wikipedia page for Queen Elizabeth.
  • '!wsf' gives you the Yahoo! Search results for "weather San Francisco."
  • '!clist' takes you to Craigslist.
  • '!ebay lamps' searches eBay for lamps.
Search for '!list' to see a bunch more. Those examples have already been set up for everyone to use, but the real power is that YOU can create your own customized shortcuts.

Have new ideas or suggestions for us? Let us know in the comments below. We're always looking for ways to make Yahoo! Search more efficient for you.

Second Page Poaching - Advanced White Hat SEO Techniques

It is time that someone put Quadzilla in his place. Don’t get me wrong, I think there is a huge amount of innovation in the black hat and gray hat industries that simply is too risky for white-hatters to discover. Nevertheless, it is simply ridiculous to claim that white hat techniques have become so uniform and ubiquitous in their application that nothing truly “advanced” continues to exist. In today’s post I am going to talk about a technique with which many of you will not be familiar. Here at Virante we call it Second Page Poaching. But, before we begin, let me start with a brief explanation of what I believe to be “advanced white-hat SEO”.
Chances are, unless your site has thousands of well-optimized pages in Google, advanced white hat techniques and many others will be useless. Advanced white hat SEO techniques tend to deal with scalable SEO solutions that bring higher search RoI for sites that seem to have reached the peak of potential search traffic. We are not talking about training a young boxer to become a contender. We are talking about turning a contender into a champion. When your PR7 eCommerce site is competing against other PR7 eCommerce sites for identical products and identical search phrases, all traditional optimization techniques (white, black, gray, blue hat, whatever) tend to fall by the wayside. Why buy links when you already have 100K natural inbounds? Why cloak when you have tons of legitimate content? This is where advanced white hat SEO kicks in. These are techniques which can bring high RoI with little to no risk when scaled properly. The example I will discuss today, “Second Page Poaching”, is highly scalable, easily implemented, and offers a high Return to Risk ratio.

What is Second Page Poaching

Second Page Poaching is the coordination of analytics (to determine high second-page rankings) with PR flow and in-site anchor text to coax minor SERP changes from Page 2 to Page 1.

Why Second Page Poaching

We need to recognize that most on-site SEO techniques, especially PageRank flow, will only increase rankings by a single position or two. If you have a well optimized site, even providing a sitewide link to one particular internal page is unlikely to push it up 5 or 6 positions. Moreover, PR-flow solutions are unlikely to move a page from position 3 to position 2 or 1, where competition is stiffer. Instead, we would like to target the pages that will see the greatest traffic increase from an increase of 1 or 2 positions in the search engines.
Looking at the released AOL search data, we can determine which positions are most prime for “poaching”. We use this term because we are hunting for pages and related keywords on your site that meet certain qualifications.
Below is a graph showing the relative percentage increase of clickthroughs based on location in the top 12 in AOL’s released data. As you will see, there are spikes at moving from position 2->1, positions 11->10 and 12->11, 3->1, 11->9 and 12->10.

If we look at the data directly, you can see that the increases are in the 500% range or greater for moving from the top of the 2nd page to the bottom of the first. More importantly, as previously discussed, it is unlikely that PR-Flow methods will help you move from #2 to #1, given the competitiveness. But will that little bit of PageRank boost help you move from 11 to 10? You betcha! Simply put, if you can move hundreds of pages on your site from ranking #11 to #10, you will see a 5 fold increase in traffic for those keywords. If you were to do the same to move them from #7 to #6, you would barely see an increase at all.

Data Provided by RedCardinal
Now, as an advanced technique, it is important to realize that this becomes incredibly valuable when a site already has tens if not hundreds of thousands of pages. If you bring in 10,000 visitors a month from Page 2 traffic, you could see your traffic increase by 40,000 fairly rapidly. If you are a mom-and-pop shop and have 10 visitors from Page 2 traffic, you might only see 20 new visitors, as your site’s internal PR will be less capable of pumping up the rankings for those Page 2 listings.

Step 1: Data Collection

For most large sites, data collection is quite easy. Simply analyze your existing log files or capture inbound traffic on the fly and record any visitor from a Page 2 listing. Identify all inbound referrers that include "google.com", "q=", and "start=10". Store the keyword and the landing page. Make sure your table stores a timestamp as well, as frequency will matter when we make future considerations of which keywords to poach. If your log files store referrer data, it may be useful to go ahead and include historical data rather than starting from scratch. A suitable site should find hundreds if not thousands of potential keyword/landing page combinations from which to choose.
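As a rough illustration of the harvesting step above, the sketch below checks whether a referrer is a Google page-2 result click and, if so, returns the keyword, landing page, and timestamp for storage. The log format, field values, and URLs here are assumptions for illustration, not a prescribed schema.

```python
# Sketch: detect Google page-2 referrers and extract the keyword.
# Page 2 of Google's results carries "start=10" in the referrer URL.
from urllib.parse import urlparse, parse_qs

def parse_page2_referrer(referrer, landing_page, timestamp):
    """Return (keyword, landing_page, timestamp) if the referrer is a
    Google page-2 result click, else None."""
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None
    params = parse_qs(parsed.query)
    if params.get("start", ["0"])[0] != "10":
        return None  # not a page-2 listing
    keyword = params.get("q", [""])[0]
    if not keyword:
        return None
    return (keyword, landing_page, timestamp)

# Hypothetical log entry:
hit = parse_page2_referrer(
    "http://www.google.com/search?q=blue+widgets&start=10",
    "/products/blue-widgets.html",
    "2008-01-15T10:22:00",
)
```

In a real deployment you would run this over each log line (or each live request) and insert the resulting tuples into the table described above.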

Step 2: Data Analysis

Because this step will be the most processor intensive, it is important to prioritize. In the Data Analysis component, we will judge the keyword/landing page combinations based on several characteristics. Because we only want to try to poach keywords for which we already rank #11 or #12, we will have to perform rank checks. In the interest of lowering what could potentially be a large computing burden, we should first consider the other metrics and then choose from that group which keywords we will check for rankings. We will consider the following characteristics:
  1. Frequency: Number of visitors driven per month, the higher the better.
  2. Conversion Rate: Why poach keywords for pages that convert poorly?
  3. Sale Potential: Why poach keywords for low RoI goods?
Now, assuming we have a set of keywords ordered by conversion, profit and frequency, we run simple rank checking software to identify those keywords for which your site currently ranks #11 or #12. Once that data point is added to the set, we use a formula to determine which keywords are most worth targeting. I will not get into it here (it is somewhat proprietary), but you will want to take into account several factors to determine the minimum number of links needed to promote a page from #11 to #10 or #12 to #10. Once complete, you should have your list of hundreds if not thousands of keyword/landing page combinations worth targeting.
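The prioritization logic above might be sketched as follows. The scoring formula here (visits × conversion rate × order value, i.e. expected monthly revenue) and all numbers are illustrative assumptions, not the proprietary formula mentioned in the text.

```python
# Sketch: score keyword/landing-page candidates, then rank-check only
# the top slice to limit the computing burden described above.
candidates = [
    # (keyword, landing_page, visits_per_month, conversion_rate, avg_order_value)
    ("blue widgets", "/widgets/blue", 120, 0.020, 45.0),
    ("widget repair", "/services/repair", 300, 0.005, 200.0),
    ("free widgets", "/blog/free", 800, 0.001, 5.0),
]

def score(candidate):
    _, _, visits, conv, value = candidate
    # Expected monthly revenue attributable to this landing page.
    return visits * conv * value

# Only the most promising candidates get the expensive rank check.
shortlist = sorted(candidates, key=score, reverse=True)[:2]
```

The rank checker then runs only over `shortlist`, keeping the combinations that currently sit at #11 or #12.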

Step 3: PR Flow Implementation

There are many creative ways to add these links across your site, the easiest of which is a simple “Other People Searched For:” section at the bottom of internal pages that lists up to 5 alternatives. Your system would then choose 5 pages from the list and add text links with the inbound keyphrase as the anchor text pointing to these landing pages. If you want to get really crafty, you can use your own internal search to identify related pages upon which to include the different landing page links that occur in your list. Ultimately, though, you will have added a large volume of links across your site which slightly increase the PageRank focused upon these high potential pages ranking #11 or #12. As Google respiders your site and finds these links, these pages will climb the 1 or 2 positions needed to quadruple or better their current inbound traffic for particular keywords.
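A minimal sketch of the “Other People Searched For:” block described above: given the list of poaching targets, emit up to five text links whose anchor text is the inbound keyphrase. The function name and the markup shape are assumptions; any templating system would do.

```python
# Sketch: render the "Other People Searched For:" link block for a page.
from html import escape

def other_searches_block(targets, limit=5):
    """targets: list of (keyword, landing_page_url) pairs worth poaching."""
    links = [
        '<a href="%s">%s</a>' % (escape(url, quote=True), escape(keyword))
        for keyword, url in targets[:limit]
    ]
    return "Other People Searched For: " + " | ".join(links)

block = other_searches_block([
    ("blue widgets", "/widgets/blue"),
    ("widget repair", "/services/repair"),
])
```

Escaping the keyword and URL keeps user-derived search phrases from breaking the page markup.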

Bear in mind that you risk very little with this technique. PR Flow tends to have little impact on your high-dollar, high-traffic keywords (where inbound links rule the day). Most importantly, because the system is automated, it will allow your internal pages to drop from position 6 to 7, where little to no real traffic is lost, but will capture and restore rankings if any pages ever drop from 9 or 10 to 11 or 12 due to the new internal linking scheme. You will lose PR on pages that have nothing to lose, and gain PR on pages that have everything to win. Magnify this across 10,000 pages and you can see the profits from a mile away.

Step 4: Churn and Burn

This is perhaps the most important part of the process. Continued analysis.
  1. The system needs to capture any successful poaches and make sure you continue to link to them internally. If the system drops pages once they move from #11 to #10, you have failed :)
  2. The system needs to identify non-movers (pages where internal linking is not improving their position) and blacklist them so you do not continue to waste extra PR flow.
  3. The system needs to replace non-movers with the next best candidates, and continue to look for more keyword/landing page options.
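The feedback loop above can be sketched as a simple triage over rank-tracking data. The thresholds follow the article (positions 1–10 are the first page); the function name and sample ranks are illustrative assumptions.

```python
# Sketch: Step 4 triage. Keep linking to successful poaches; blacklist
# non-movers so their internal links can be reassigned to new candidates.
def triage(tracked):
    """tracked: list of (keyword, starting_rank, current_rank)."""
    keep, blacklist = [], []
    for keyword, start, now in tracked:
        if now <= 10:
            keep.append(keyword)       # poached onto page 1: keep its links
        elif now >= start:
            blacklist.append(keyword)  # non-mover: stop wasting PR flow
    return keep, blacklist

keep, blacklist = triage([
    ("blue widgets", 11, 9),    # successful poach
    ("widget repair", 12, 12),  # no movement after linking
    ("free widgets", 11, 13),   # got worse
])
```

Keywords in neither list (still improving, not yet on page 1) simply stay in the rotation for the next analysis pass.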

Summing Up

I know this is a long post, but I feel like it is worth reiterating. Advanced white hat SEO techniques do exist but no one wants to talk about them for the same reason no one wants to talk about advanced black hat techniques. Second Page Poaching is just one of many different options available to large-scale websites looking to gain an edge over their competitors. Many of these techniques are fully automated, easily implemented, and highly scalable. However, most of them are kept under lock-and-key.

SEO Strategy Document

Below is an SEO Strategy Document I have used in the past. I decided to post it after speaking with a respected SEO Consultant who wasn’t sure how to craft the document. I hope this helps other consultants as well. Early this year, I also posted an Example of a PPC Strategy Document.
Feel free to add any comments, constructive criticism, etc.
Beginning of document
This is the strategy document for the SEO initiative within the overall marketing plan for [CLIENT SITE]. This document will provide detail to the strategy, analysis, and optimization recommendations.
It is our recommendation to use both an SEO & PPC search campaign, but to begin with the SEO component.
The reason the sequence should be SEO before PPC is that a highly optimized webpage will produce a much more efficient paid search campaign due to a higher quality score, resulting in lower CPCs and higher rankings.
Therefore, this document focuses on the SEO initiative for the [CLIENT SITE] main site, and the [CLIENT SITE] ecommerce site.
We will be optimizing the website in its entirety, with specific success metrics on keyword/landing page combinations.
Primary Objective:
- Increase visitor traffic to http://www.[Client Site].com
- To drive people to the specified content on [Client Site].com from pages that we optimize for specific keyword phrases.
Secondary Objective:
- To generate brand awareness for the [CLIENT SITE] brand.
Domains:
- http://www.[Client Site].com
Target Search Engine:
- Google – We will focus our optimization efforts on Google.
- Google has roughly a 60% market share of search volume.
- Once we have a good ranking on Google, it is more than likely we will have a good ranking on the other engines because the ranking factors are all very similar across the engines.
Search Engine Market Share
Budget:
- $[blank] monthly budget
Results Timeline:
- After implementation of the recommendations, there is a delay of at least a week or two before seeing results in the search engines.
- The keyword rankings should continue to improve as the strategic inbound links age, and the SEO implementations have been indexed. On average, the optimum effect from the on-site & off-site SEO effort will be experienced in approximately 12 weeks.
Launch Date:
- January 1, 2008
Keyword Research and Analysis:
- Determine which keywords will pull in qualified traffic, how many searches these terms/phrases receive, and how keywords & webpages should be used in combination to attract search traffic.
- Analyze Keyword Difficulty
Keywords:
- We will concentrate on optimizing keyword phrases that are mapped to a specific webpage.
- 1:1 keyword/landing page combinations.
Domain Analysis:
- Analysis of the website information architecture and internal linking structure
- Analysis of HTML and page layout
Content Analysis & Creation:
- Amount of Indexable Text Content
- Quality & quantity of visible HTML text on page
- SEO Copywriting
- Additional webpage/content creation if necessary
Linking Analysis:
- Overall Internal link structure
- Internal Link Popularity of Linking Page within Host Site/Domain
- Link Popularity
- Link Popularity of Site in Topical Community
- The link weight/authority of the target website amongst its topical peers in the online world.
- Link Relevance
- Topical Relevance of Inbound Links to Site.
- The subject-specific relationship between the sites/pages linking to the target page and the target keyword.
- Deep Link Analysis
- Deep link percentage is the % of all inbound links that point at pages other than the homepage.
Link Development:
- Increase link popularity
- Implement a Strategic Link Campaign by purchasing links on websites that will provide a high SEO value.
Indexing Analysis:
- Duplicate Content
- Content very similar or duplicate of existing content in the index
- Possible duplicate Title/Meta Tags on many Pages
- Robots.txt exclusions
Baseline Reporting:
- Generate a baseline Ranking Report for the keywords. The baseline, as the name implies, gives us an indication of where the webpages are ranked on Google, and will be used to measure results.
- Baseline Traffic report for http://www.[Client Site].com
- Baseline Monthly Sales report for http://www.[Client Site].com
Implementation:
- Programming our recommendations. Once all the changes have been completed, we will run an audit to verify that the recommendations were implemented correctly and identify any further changes, if needed.
Success Metrics:
- First page Search Engine rankings.
- Increase in Website Traffic to http://www.[Client Site].com
- Increase sales volume on http://www.[Client Site].com and generate a positive ROI.

Link Building Strategies

Six and a half years ago (which is ages, in Internet years), Robin Nobles, Eric Ward, and John Alexander compiled a legendary list of 131 legitimate link building strategies. Four years later, Aaron Wall and Andy Hagans published 101 link building tips to market your website, inspired by the earlier article. Considering the furiously changing face of search engine marketing, and with 2009 already upon us, I thought it was time to evaluate both lists and create an updated collection of link building strategies.

7 Internal link building strategies

1. Make sure that your navigation is spiderable. Either use (anchor text carrying) text based navigation, or an image based navigation with relevant alt attributes attached to each image link.
2. Breadcrumbs are a great internal linking tool. Use them for usability and anchor text differentiation.
3. In-content links not only tend to have a higher click through rate and perceived trust, but are also able to add more relevance to a link because of the surrounding text.
4. Use a sitemap. A good sitemap is useful for visitors, useful for search engines and, therefore, useful for you.
5. Link to topically relevant pages on important pages of your website. Link to important pages on every (or most) topically relevant page of your website.
6. Be consistent in linking behavior. If you link to homepage.com, always link to homepage.com, and not to a mix of homepage.com, homepage.com/index.php and homepage.com/index.php?id=123.
7. Identify your most linked-to pages, and make sure that the link juice flows to your most important pages from there, in a well-optimized way.

10 Easy link building starters

8. Optimize your existing links. Contact the webmasters of prominent websites that link to you and ask them to change ‘click here’ to an anchor text that contains relevant keywords, an anchor text that encourages clicking through, or -ideally- a combination of both.
9. Monitor your 404 statistics. Keep track of whoever links to old pages or misspelled URLs, which is data that Google provides as well. Contact those webmasters and provide them a good URL which they can link to.
10. Create a ‘link to us’ page, where you provide information about how people can link to you and which URL(s), logo and/or anchor text they can use. Update this page regularly in order to diversify the anchor text.
11. Contact family, friends, colleagues and other people you know and let them know about your website. Some will send you useful feedback, others -who happen to have a website of their own- might link to you.
12. Do you block search engine bots from indexing certain parts of your website via robots.txt or meta-noindex? Find out if people link to this section of your site. If so, contact the webmasters of these sites and kindly ask them to link to another page of your website.
13. Use your spell check. People are more likely to link to correctly spelled articles than to content that’s full of grammatical errors.
14. Search for websites that already mention your business name or URL, but haven’t linked to your website. This works excellently in Yahoo!.
15. Look for websites that mention your personal name, but currently don’t link to your site. Use Yahoo! for this as well.
16. Leave comments on the blogs you visit every day. Hey, you’re visiting them anyway, so why not leave a (relevant, useful!) comment?
17. Find out which websites your company owns. If you work for a small company, there may be several. If you work for a large company, the number will probably knock your socks off. Link these websites (carefully!) together, or redirect the most important and/or relevant ones to your main website.

12 Old school link building techniques

18. Search for related websites by using relevant keywords. Filter out all interesting websites and contact them. When you did this for your main keyword(s), there are still tons of other combinations possible.
19. Check which websites link to your competitors. Try to get them to link to your website as well.
20. Check which types of websites link to websites that offer the same services or products as you, but in a different country/language. This might result in a “I never thought of that…” feeling.
21. Either interview an expert from your field, or try to get interviewed by someone else. Don’t forget to mention your best content: readers of the interview might be willing to link to it.
22. Write guest posts for relevant websites in your niche. You could also write posts about your industry for websites that are slightly related to your niche.
23. Teach. Whether it’s a public workshop (local press), a class at a local college or University (.edu website) or at a business related event (industry links), teaching can result in authority links.
24. Use any search engine advertising program and advertise on keywords that linkerati might use. Try to convert the targeted traffic into links.
25. Use Google AdWords’ content network to determine which (relevant) websites generate traffic and conversions. Contact those websites directly.
26. Join an affiliate program. See #25.
27. Determine who’s linked to you before. Contact them again when you’re releasing an interesting new piece of content.
28. Trade links. There’s nothing wrong with swapping links with a few highly relevant, authority websites that can bring in extra traffic. Exchanging links with lots of irrelevant websites, however, might get you in trouble.
29. Donate to a charity. Although buying links is not allowed by Google, there are still lots of ways you can buy links (kind of) legitimately.

12 Places to submit your URL to

30. Most social media websites are only useful for promoting good content (which will get you links in return), but sites like LinkedIn still provide dofollow links with an anchor text of your choice.
31. Some general directories, such as DMOZ, the Yahoo Directory and Best of the Web are still worth submitting your website to. Make sure to submit your site to the most appropriate category.
32. High quality, niche directories can be worth considering as well. Notice the emphasis on high quality.
33. Don’t forget to submit your website to high quality, regional directories. Especially worthwhile for websites that target local markets.
34. Publish stunning, interesting, funny or beautiful images to your Flickr account, with a link back to your website.
35. Writing an article about a relevant topic that contains one or more links to your website, and submitting it to article directories such as eZineArticles, might work for you.
36. Relevant, non-spammy links in Wikipedia articles, Yahoo! Answers or Google Groups may have nofollow attributes attached, but can lead to (dofollow) links indirectly.
37. Submit your RSS feed to important RSS directories.
38. Blog directories may be willing to link to your blog. Submit your blog to the high quality ones.
39. Use PR websites to distribute your press releases, in addition to your PR agency. Make sure that your press release contains one or more (clickable) links to your website.
40. Got a great design? Submit your site to CSS directories and/or website design contests. Even well-designed parts of your website can result in links.
41. Twitter. Just published a new post or article? Mention it on Twitter; your followers might visit it and, if they find it interesting, link to it.

12 Ways to make people write about you

42. Send out Christmas gifts or birthday gifts to bloggers (or website owners) you know.
43. Offer services or a product in exchange for a review. Don’t ask the bloggers or webmasters to link to you; they most often will anyway.
44. Create something unique. Top 10s, top 250s, mash-ups, how-tos, best-ofs, surveys, studies, awards. Define the proper hook, create unique content and attract good links. The possibilities are infinite.
45. Try to start a hype, use a new word, get a meme going, or do something else you’re the first at.
46. Link to others. People, especially bloggers, will notice it if you link to them. If you do this several times and offer content that is or might be relevant to these bloggers, they might eventually link to you as well.
47. If you happen to have some breaking news, offer a blogger (or a select group) the scoop. Bloggers love to publish scoops.
48. Say something groundbreaking, shocking, confronting, stupid, weird or flattering. People tend to link to others who are different or act that way.
49. Create something with an amazing design. This does not necessarily have to be your website, just having an awesome business card can result in extra links.
50. Launch an extraordinary offline campaign. People will talk about it online. If you integrate this offline campaign with an online version in a perfect way, you may even receive some extra links from ‘this is how you should integrate offline and online’ articles as well.
51. Create a contest and offer give-aways for winners. This is not only a great way to get attention, but to get valuable input as well, for example when hosting a guest post contest.
52. Build useful tools and/or plugins that are free to use.
53. Speak at an industry conference. You’ll meet lots of interesting new people, and will probably get mentioned in several conference write-ups.

12 Common business tactics

54. Add a link to your local Chamber of Commerce profile.
55. The Better Business Bureau, and any industry related association you’re a member of are interesting link targets as well.
56. Contact your (preferred) suppliers, manufacturers and other partners. Ask for links from these websites if they have a partners page.
57. Offer to write testimonials or quotes for your suppliers, if they are willing to link back to your site in or near the testimonial.
58. Ask clients to write testimonials about your product or service that they publish on their website, in exchange for a discount, extra fast delivery or any other benefit you can provide.
59. Hire a publicist. Press agency employees usually know the right people in the right places, which can result in a higher acceptance rate for your press releases.
60. Join relevant forums. You can either link to your website on your profile page, in your signature or in your posts. Notice how this one is listed under ‘Common business tactics’ instead of ‘Places to submit your URL to’? There’s a reason why: forums are not places to drop links, but to join discussions.
61. Sponsor something. There are tons of possibilities, such as an industry conference, a sports club, a relevant forum, a local happening, or just any offline event that happens to have a website.
62. Hire an intern. You can let him or her work on a piece of research, which you can then use in your link building process. Also, don’t forget the website of the university your intern is attending.
63. Offer awesome products or services. People love talking about great stuff they’ve bought. If your products are ‘just’ good instead of awesome, make sure that your after-sales or customer care is excellent; people love talking about companies with great service as well. Of course, offering crappy products or a lousy service will also result in links, but I don’t think those are the links you’re after.
64. Look for companies that went out of business. Either acquire their website, or contact the websites that currently link to them and ask these sites to link to you instead.
65. Turn your colleagues into link developers. Each of them has his or her own specialty and group of contacts. This not only takes work off your hands, but is very efficient as well.

4 Important considerations

66. Hire a link builder or an expert. Either let somebody you trust manage (a part of) your campaign, or visit a link building workshop. Especially when you’ve been building links for your own site for several years, a fresh mind can bring new ideas.
67. Hang in there. Link building isn’t something you can do in just a few hours, or something that you only have to do during one week in a year. Building a brand can’t be done in a single day, the same goes for a solid link profile. It’s a continuing process that takes time. Lots of time.
68. Keep an eye on the news. Follow important and interesting blogs in order to keep up with the latest news, trends and tricks. I’m not just talking about link building or SEM blogs; make sure to follow general marketing blogs, slightly different creative blogs and industry-related news websites as well.
69. If you have to ask yourself ‘is this a legitimate approach’ or ’should I be doing this’, the answer is probably no. Too much, too aggressive or too shady isn’t advisable. Don’t do things you would be ashamed of when explaining them to your mother. Or Matt Cutts.

0 Advanced link building strategies

There is no such thing as advanced link building. While this list already sums up quite a few different strategies, I’m pretty sure that you can easily come up with a dozen more that are specifically suited to your company or industry.
Eric Ward once said that link building is “one part marketing, two parts public relations, and three parts common sense”. I’d say that link building is 10% basic SEO knowledge, 20% business thinking, 30% creativity and 40% perseverance. Either way, there’s nothing advanced to it.

SEO Bookmarklets Collection: On-Page SEO and Domain Stats

Last week I listed a few useful Google bookmarklets for SEOs. This time I am looking at some more helpful browser bookmarklets that can come in handy for on-page SEO research and a quick view of essential site stats (to install any of them, just drag it to your bookmarks toolbar):

Domain information:


Page SEO:

Highlight Headings

Concertina pagination - getting large scale websites indexed

As many of you will know, getting large numbers of pages into the index of search engines is a science in its own right.

image by hoveringdog
It’s clear that cross-linking from other pages of the project helps massively, not only with indexation but with the ranking of those pages as well. SEO black holes like Wikipedia are living proof of the effectiveness of what you could call excessive cross-linking.
The other obvious helper for high-volume indexation is back-link power. A nice back-link from the yahoo.com homepage will do wonders for the deep-crawl frequency of your site and the number of pages listed.
Ideally you will get back-links not only to your homepage but to every level in the hierarchy of your site.
Time is an additional factor in this equation. It takes time to get millions of pages into the index of search engines.
Now, the above is well-known common knowledge among SEO people, and even what follows is no ground-breaking, paradigm-shifting revelation. At the same time, it’s a very effective method that is easy to implement yet still rarely used on sites with huge numbers of content pages and multiple levels of categories.
The method I’m talking about is non-constant pagination, often referred to as concertina pagination.
The method is as simple as it is effective. In most cases the pagination will look like this:

1 2 3 4 5 6 7 8 9 10

It will show pages 1 to 10 on the first page, 11 to 20 on the second, and so on and so forth. This scheme stays the same even if you have tens of thousands of pages.
However, since few sites actually have back-links from the yahoo.com homepage, the search engine crawler will get “tired” after a while. This means that the crawler will only reach a certain depth of pages on your site.
To help with this, it is much more effective to implement pagination that looks like this:

1 2 3 4 5 6 7 8 9 10 20 50 100 ...

or, stretched out even further, like this:

1 2 3 ... 10 20 50 100 200 500 1000 ...

For the first couple of pages it is identical to the example shown above, but after page ten you get page 20, then 50, then 100, and so on.
By using this method you make sure that the crawler hits pages further down the hierarchy much sooner, which leads to a significant increase in the number of pages indexed.
Here are a couple of sites that use various types of concertina paging for you to have a look at.
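The jump scheme described above can be sketched in a few lines of Python. This is a minimal illustration; the particular jump points (20, 50, 100, ...) mirror the example in the text, but any widening sequence works:

```python
def concertina_pages(last_page):
    """Build a concertina pagination sequence: dense links for the
    first ten pages, then widening jumps (20, 50, 100, 200, ...)."""
    pages = list(range(1, min(10, last_page) + 1))
    step = 10
    while pages[-1] < last_page:
        for factor in (2, 5, 10):
            candidate = step * factor
            if candidate >= last_page:
                pages.append(last_page)  # always link the final page
                return pages
            pages.append(candidate)
        step *= 10  # widen the jumps by another order of magnitude
    return pages

print(concertina_pages(5000))
# [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 50, 100, 200, 500, 1000, 2000, 5000]
```

A template would then render one link per number returned, leaving the current page unlinked, so the crawler reaches deep pages in just a few hops.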

Date with Googlebot

Our date with Googlebot was so wonderful, but it's hard to tell if we, the websites, said the right thing. We returned 301 permanent redirect, but should we have responded with 302 temporary redirect (so he knows we're playing hard to get)? If we sent a few new 404s, will he ever call our site again? Should we support the header "If-Modified-Since?" These questions can be confusing, just like young love. So without further ado, let's ask the expert, Googlebot, and find out how he judged our response (code).


Supporting the "If-Modified-Since" header and returning 304 can save bandwidth.


-----------
Dearest Googlebot,
  Recently, I did some spring cleaning on my site and deleted a couple of old, orphaned pages. They now return the 404 "Page not found" code. Is this ok, or have I confused you?
Frankie O'Fore

Dear Frankie,
  404s are the standard way of telling me that a page no longer exists. I won't be upset—it's normal that old pages are pruned from websites, or updated to fresher content. Most websites will show a handful of 404s in the Crawl Diagnostics over at Webmaster Tools. It's really not a big deal. As long as you have good site architecture with links to all your indexable content, I'll be happy, because it means I can find everything I need.

  But don't forget, it's not just me who comes to your website—there may be humans seeing these pages too. If you've only got a very simple '404 page not found' message, visitors who aren't as savvy could be baffled. There are lots of ways to make your 404 page more friendly; a quick one is our 404 widget over at Webmaster Tools, which will help direct people to content which does exist. For more information, you can read the blog post. Most web hosting companies, big and small, will let you customise your 404 page (and other return codes too).

Love and kisses,
Googlebot


Hey Googlebot,
  I was just reading your reply to Frankie above, and it raised a couple of questions.
What if I have someone linking to a page that no longer exists? How can I make sure my visitors still find what they're after? Also, what if I just move some pages around? I'd like to better organise my site, but I'm worried you'll get confused. How can I help you?
Yours hopefully,
Little Jimmy


Hello Jimmy,
   Let's pretend there are no anachronisms in your letter, and get to the meat of the matter. Firstly, let's look at links coming from other sites. Obviously, these can be a great source of traffic, and you don't want visitors presented with an unfriendly 'Page not found' message. So, you can harness the power of the mighty redirect.

   There are two types of redirect—301 and 302. Actually, there are lots more, but these are the two we'll concern ourselves with now. Just like 404, 301 and 302 are different types of response codes you can send to users and search engine crawlers. They're both redirects, but a 301 is permanent and a 302 is temporary. A 301 redirect tells me that whatever this page used to be, now it lives somewhere else. This is perfect for when you're re-organising your site, and also helps with links from offsite. Whenever I see a 301, I'll update all references to that old page with the new one you've told me about. Isn't that easy?

   If you don't know where to begin with redirects, let me get you started. It depends on your webserver, but here are some searches that may be helpful:
Apache: http://www.google.com/search?q=301+redirect+apache
IIS: http://www.google.com/search?q=301+redirect+iis
You can also check your manual, or the README files that came with your server.

   As an alternative to a redirect, you can email the webmaster of the site linking to you and ask them to update their link. Not sure what sites are linking to you? Don't despair - my human co-workers have made that easy to figure out. In the "Links" portion of Webmaster Tools, you can enter a specific URL on your site to determine who's linking to it.

  My human co-workers also just released a tool which shows URLs linking to non-existent pages on your site. You can read more about that here.

Yours informationally,
Googlebot
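If your site runs behind a Python process rather than Apache or IIS, the idea Googlebot describes looks roughly like this standard-library sketch (the URL paths are hypothetical):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of retired URLs to their new homes.
MOVED = {"/old-page.html": "/new-page.html"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in MOVED:
            # 301: tell browsers and crawlers the move is permanent.
            self.send_response(301)
            self.send_header("Location", MOVED[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>Welcome</h1>")

    def log_message(self, *args):
        pass  # keep the example quiet

# To serve for real:
# HTTPServer(("127.0.0.1", 8000), RedirectHandler).serve_forever()
```

A crawler requesting /old-page.html gets a 301 with a Location header pointing at /new-page.html, which is exactly the signal Googlebot says he uses to update his references.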



Darling Googlebot,
   I have a problem—I live in a very dynamic part of the web, and I keep changing my mind about things. When you ask me questions, I never respond the same way twice—my top threads change every hour, and I get new content all the time! You seem like a straightforward guy who wants straightforward answers. How can I tell you when things change without confusing you?
Temp O'Rary


Dear Temp,
   I just told little Jimmy that 301s are the best way to tell a Googlebot about your new address, but what you're looking for is a 302.
   Once you're indexed, it's the polite way to tell your visitors that your address is still the right one, but that the content can temporarily be found elsewhere. In these situations, a 302 (or the rarer '307 Temporary Redirect') would be better. For example, orkut redirects from http://orkut.com to http://google.com/accounts/login?service=orkut, which isn't a page that humans would find particularly useful when searching for Orkut***.
It's on a different domain, for starters. So, a 302 has been used to tell me that all the content and linking properties of the URL shouldn't be updated to the target - it's just a temporary page.

  That's why when you search for orkut, you see orkut.com and not that longer URL.

  Remember: simple communication is the key to any relationship.

Your friend,
Googlebot


*** Please note, I simplified the URL to make it easier to read. It's actually much more complex than that.

Captain Googlebot,
   I am the kind of site who likes to reinvent herself. I noticed that the links to me on my friends' sites are all to URLs I got rid of several redesigns ago! I had set up 301s to my new URLs for those pages, but after that I 301'ed the newer URLs to my next version. Now I'm afraid that if you follow their directions when you come to crawl, you'll end up following a string of 301s so long that by the end you won't come calling any more.
Ethel Binky


Dear Ethel,
   It sounds like you have set up some URLs that redirect to more redirects to... well, goodness! In small amounts, these "repeat redirects" are understandable, but it may be worth considering why you're using them in the first place. If you remove the 301s in the middle and send me straight to the final destination on all of them, you'll save both of us a bunch of time and HTTP requests. But don't just think of us. Other people get tired of seeing that same old 'contacting... loading... contacting...' game in their status bar.

   Put yourself in their shoes—if your string of redirects starts to look rather long, users might fear that you have set them off into an infinite loop! Bots and humans alike can get scared by that kind of "eternal commitment." Instead, try to get rid of those chained redirects, or at least keep 'em short. Think of the humans!

Yours thoughtfully,
Googlebot
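Flattening chains like Ethel's can be automated. Here is a small sketch, with a hypothetical map of old URLs to new ones, that rewrites every entry to point straight at its final destination:

```python
def collapse_redirects(redirects):
    """Collapse chains like a -> b -> c into a -> c, so every old
    URL 301s directly to its final destination."""
    collapsed = {}
    for start in redirects:
        seen, url = set(), start
        while url in redirects:
            if url in seen:
                # a -> b -> a would loop forever; flag it instead
                raise ValueError("redirect loop at " + url)
            seen.add(url)
            url = redirects[url]
        collapsed[start] = url
    return collapsed

chain = {"/v1.html": "/v2.html", "/v2.html": "/v3.html"}
print(collapse_redirects(chain))
# {'/v1.html': '/v3.html', '/v2.html': '/v3.html'}
```

Feeding the collapsed map back into your server's redirect rules gets both visitors and crawlers to the final page in a single hop.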


Dear Googlebot,
   I know you must like me—you even ask me for unmodified files, like my college thesis that hasn't changed in 10 years. It's starting to be a real hassle! Is there anything I can do to prevent your taking up my lovely bandwidth?

Janet Crinklenose


Janet, Janet, Janet,
   It sounds like you might want to learn a new phrase—'304 Not Modified'. If I've seen a URL before, I insert an 'If-Modified-Since' in my request's header. This line also includes an HTTP-formatted date string. If you don't want to send me yet another copy of that file, stand up for yourself and send back a normal HTTP header with the status '304 Not Modified'! I like information, and this qualifies too. When you do that, there's no need to send me a copy of the file—which means you don't waste your bandwidth, and I don't feel like you're palming me off with the same old stuff.

   You'll probably notice that a lot of browsers and proxies will say 'If-Modified-Since' in their headers, too. You can be well on your way to curbing that pesky bandwidth bill.

Now go out there and save some bandwidth!
Good ol' Googlebot
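The 'If-Modified-Since' handshake Googlebot describes can be sketched as a small decision function. This is illustrative only, not a complete HTTP caching implementation:

```python
from email.utils import parsedate_to_datetime, formatdate

def status_for(request_headers, last_modified_ts):
    """Return 304 if the client's If-Modified-Since date shows it
    already has the current version; otherwise return 200."""
    ims = request_headers.get("If-Modified-Since")
    if ims:
        try:
            since = parsedate_to_datetime(ims).timestamp()
        except (TypeError, ValueError):
            return 200  # unparsable date: just send the full response
        if last_modified_ts <= since:
            return 304  # Not Modified: no body, bandwidth saved
    return 200

# A file untouched since 2001, requested with a 2008 date:
print(status_for({"If-Modified-Since": formatdate(1200000000, usegmt=True)},
                 1000000000))  # 304
```

On a 304 the server sends headers only and skips the body, which is exactly where Janet's bandwidth savings come from.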

-----------

Googlebot has been so helpful! Now we know how to best respond to users and search engines. The next time we get together, though, it's time to sit down for a good long heart-to-heart with the guy (Date with Googlebot: Part III, is coming soon!).

First date with the Googlebot: Headers and compression


googlebot with flowers
Name/User-Agent: Googlebot
IP Address: Verify it here
Looking For: Websites with unique and compelling content
Major Turn Off: Violations of the Webmaster Guidelines
Googlebot -- what a dreamboat. It's like he knows us <head>, <body>, and soul. He's probably not looking for anything exclusive; he sees billions of other sites (though we share our data with other bots as well :), but tonight we'll really get to know each other as website and crawler.

I know, it's never good to over-analyze a first date. We're going to get to know Googlebot a bit more slowly, in a series of posts:
  1. Our first date (tonight!): Headers Googlebot sends, file formats he "notices," whether it's better to compress data
  2. Judging his response: Response codes (301s, 302s), how he handles redirects and If-Modified-Since
  3. Next steps: Following links, having him crawl faster or slower (so he doesn't come on too strong)
And tonight is just the first date...

***************
Googlebot:  ACK
Website:  Googlebot, you're here!
Googlebot:  I am.

GET / HTTP/1.1
Host: example.com
Connection: Keep-alive
Accept: */*
From: googlebot(at)googlebot.com
User-Agent: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Accept-Encoding: gzip,deflate


Website:  Those headers are so flashy! Would you crawl with the same headers if my site were in the U.S., Asia or Europe? Do you ever use different headers?

Googlebot:  My headers are typically consistent world-wide. I'm trying to see what a page looks like for the default language and settings for the site. Sometimes the User-Agent is different, for instance AdSense fetches use "Mediapartners-Google":
  User-Agent: Mediapartners-Google

Or for image search:
  User-Agent: Googlebot-Image/1.0

Wireless fetches often have carrier-specific user agents, whereas Google Reader RSS fetches include extra info such as number of subscribers.

I usually avoid cookies (so no "Cookie:" header) since I don't want the content affected too much by session-specific info. And, if a server uses a session id in a dynamic URL rather than a cookie, I can usually figure this out, so that I don't end up crawling your same page a million times with a million different session ids.


Website:  I'm very complex. I have many file types. Your headers say "Accept: */*". Do you index all URLs or are certain file extensions automatically filtered?

Googlebot:  That depends on what I'm looking for.

If I'm indexing for regular web search, and I see links to MP3s and videos, I probably won't download those. Similarly, if I see a JPG, I will treat it differently than an HTML or PDF link. For instance, JPG is much less likely to change frequently than HTML, so I will check the JPG for changes less often to save bandwidth. Meanwhile, if I'm looking for links as Google Scholar, I'm going to be far more interested in the PDF article than the JPG file. Downloading doodles (like JPGs) and videos of skateboarding dogs is distracting for a scholar—do you agree?

Website:  Yes, they can be distracting. I'm in awe of your dedication. I love doodles (JPGs) and find them hard to resist.

Googlebot:  Me, too; I'm not always so scholarly. When I crawl for image search, I'm very interested in JPGs. And for news, I'm mostly looking at HTML and nearby images.

There are also plenty of extensions (exe, dll, zip, dmg...), that tend to be big and less useful for a search engine.


Website:  If you saw my URL, http://www.example.com/page1.LOL111, would you (whimper whimper) reject it just because it contains an unknown file extension?

Googlebot:  Website, let me give a bit more background. After actually downloading a file, I use the Content-Type header to check whether it really is HTML, an image, text, or something else. If it's a special data type like a PDF file, Word document, or Excel spreadsheet, I'll make sure it's in the valid format and extract the text content. Maybe it has a virus; you never know. If the document or data type is really garbled, there's usually not much to do besides discard the content.

So, if I'm crawling http://www.example.com/page1.LOL111 with an unknown file extension, it's likely that I would start to download it. If I can't figure out the content type from the header, or it's a format that we don't index (e.g. MP3), it'll be put aside. Otherwise, I'll proceed with indexing the file.
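That Content-Type check might look something like this; the allow-list of MIME types is purely illustrative:

```python
# Hypothetical set of MIME types worth indexing for web search.
INDEXABLE = {"text/html", "text/plain", "application/pdf"}

def should_index(content_type_header):
    """Strip parameters such as '; charset=UTF-8' and check the bare
    MIME type against the allow-list."""
    mime = content_type_header.split(";", 1)[0].strip().lower()
    return mime in INDEXABLE

print(should_index("text/html; charset=UTF-8"))  # True
print(should_index("audio/mpeg"))                # False
```

The point is that the decision rests on the Content-Type header of the downloaded response, not on whatever extension happens to be in the URL.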


Website:  My apologies for scrutinizing your style, Googlebot, but I noticed your Accept-Encoding headers say:
Accept-Encoding: gzip,deflate

Can you explain these headers to me?

Googlebot:  Sure. All major search engines and web browsers support gzip compression for content to save bandwidth. Other entries that you might see here include "x-gzip" (the same as "gzip"), "deflate" (which we also support), and "identity" (none).


Website:  Can you talk more about file compression and "Accept-Encoding: gzip,deflate"? Many of my URLs consist of big Flash files and stunning images, not just HTML. Would it help you to crawl faster if I compressed my larger files?

Googlebot:  There's not a simple answer to this question. First of all, many file formats, such as swf (Flash), jpg, png, gif, and pdf are already compressed (there are also specialized Flash optimizers).

Website: Perhaps I've been compressing my Flash files and I didn't even know? I'm obviously very efficient.

Googlebot:  Both Apache and IIS have options to enable gzip and deflate compression, though there's a CPU cost involved for the bandwidth saved. Typically, it's only enabled for easily compressible text HTML/CSS/PHP content. And it only gets used if the user's browser or I (a search engine crawler) allow it. Personally, I prefer "gzip" over "deflate". Gzip is a slightly more robust encoding — there is consistently a checksum and a full header, giving me less guess-work than with deflate. Otherwise they're very similar compression algorithms.

If you have some spare CPU on your servers, it might be worth experimenting with compression (links: Apache, IIS). But, if you're serving dynamic content and your servers are already heavily CPU loaded, you might want to hold off.
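How well repetitive HTML shrinks under gzip is easy to check with Python's standard library (the sample markup here is made up):

```python
import gzip

# Text/HTML content is highly repetitive, so it compresses very well.
html = b"<html><body>" + b"<p>Hello, Googlebot!</p>" * 200 + b"</body></html>"
compressed = gzip.compress(html)

print(f"{len(html)} bytes -> {len(compressed)} bytes")
# The round trip is lossless:
assert gzip.decompress(compressed) == html
```

The same trade-off Googlebot mentions applies: the compression above costs CPU on every response, which is why servers typically enable it only for text content.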


Website:  Great information. I'm really glad you came tonight — thank goodness my robots.txt allowed it. That file can be like an over-protective parent!

Googlebot:  Ah yes; meeting the parents, the robots.txt. I've met plenty of crazy ones. Some are really just HTML error pages rather than valid robots.txt. Some have infinite redirects all over the place, maybe to totally unrelated sites, while others are just huge and have thousands of different URLs listed individually. Here's one unfortunate pattern. The site is normally eager for me to crawl:
  User-Agent: *
  Allow: /


Then, during a peak time with high user traffic, the site switches the robots.txt to something restrictive:
  # Can you go away for a while? I'll let you back
  # again in the future. Really, I promise!
  User-Agent: *
  Disallow: /


The problem with the above robots.txt file-swapping is that once I see the restrictive robots.txt, I may have to start throwing away content I've already crawled in the index. And then I have to recrawl a lot of content once I'm allowed to hit the site again. At least a 503 response code would've been temporary.

I typically only re-check robots.txt once a day (otherwise on many virtual hosting sites, I'd be spending a large fraction of my fetches just getting robots.txt, and no date wants to "meet the parents" that often). For webmasters, trying to control crawl rate through robots.txt swapping usually backfires. It's better to set the rate to "slower" in Webmaster Tools.
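The effect of swapping those two robots.txt files can be seen with Python's standard-library parser:

```python
from urllib import robotparser

# The two robots.txt bodies from Googlebot's example above.
PERMISSIVE = "User-Agent: *\nAllow: /\n"
RESTRICTIVE = "User-Agent: *\nDisallow: /\n"

def can_crawl(robots_txt, url, agent="Googlebot"):
    """Parse a robots.txt body and ask whether `agent` may fetch `url`."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

print(can_crawl(PERMISSIVE, "http://example.com/page"))   # True
print(can_crawl(RESTRICTIVE, "http://example.com/page"))  # False
```

Since a crawler may cache robots.txt for a day, flipping between these two files is a blunt instrument; a 503 during peak load, or a slower crawl-rate setting in Webmaster Tools, sends a far less destructive signal.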

The Biggest Web Site Usability Mistakes You Can Make

When you built your first web site, didn’t you just want to promote it everywhere with big bold letters saying, “HEY EVERYONE! COME HERE AND LOOK AT MY WEBSITE! ISN’T IT GREAT?” Or, when you submit your web site to forums for web site reviews, what do you typically ask for? You may write, “Tell me what you think of my web site” or “Which color do you like better, blue or red?” or “Did I optimize for search engines properly?”
The worst mistake you will make as a web site owner is to ask someone to “look” at your web site. It’s like the dreaded, “Do I look fat?” question. There’s never a safe answer. For starters, someone may look slim standing up, but resemble the Buddha when sitting on a couch. You need to assign tasks to get honest answers to these tough questions.
To understand if your web site is meeting its usability requirements, ask people to take it for a spin and try it out—and more specifically, to see if they can answer the following questions:
What is the purpose of the site?
Ann Smarty wrote in Check Your Site Usability With These Fun Tools about the Five Second Test tool. It’s a fun way to explore immediate impressions by asking, “Tell me what the site is about”, to see if the site’s purpose is communicated clearly. It can’t warn you if your shopping cart is broken, though, and it won’t alert you if your sales lead form is so invasive that it turns away potential customers.
When car shopping, good sales people begin by explaining a car’s features and describe updates from previous models. They’ll walk around the car with you and demonstrate how to pack the back with groceries and squeeze in fishing poles. You won’t buy it at this point, however. For now, the sales person is spinning you a tale to help you imagine yourself inside that gorgeous expensive hunk of machine.
Sales people don’t approach you with “Do you think this car would look better in orange?”
What need does it fulfill for me?
Another area of concern that web site owners have is conversions. They’ll ask for feedback by writing, “My sales are down! Can you look at my homepage and tell me what I’m doing wrong?” Sometimes they’ll write, “We just redesigned our entire web site and our data tells us people are still leaving from the homepage. Help!”
If you’ve ever sold a home, you know that real estate agents will tell you to clean up the yard, paint the walls, empty it out so it looks roomy and place flowers around. If all we had to do was to make a house look pretty to sell it, we wouldn’t need real estate agents to show our homes. They’d sell themselves. Web sites with fresh paint on the homepage but no repairs to the information architecture, persuasive content, functionality and user experience can’t be expected to perform miracles.
A good real estate agent will bring potential buyers to a house and encourage them to turn on the water faucets, open closets, and help them visualize 50 people in the family room at Uncle Frank’s birthday party. What if you move in, get arthritis and can’t manage stairs anymore? The value proposition is not just about features and price. It’s about what benefit someone will get by purchasing your service, buying your products or experiencing your online tools. In addition, try to help visitors plan ahead and make logical choices rather than purposely pushing them into a revenue stream that will only benefit you in the long run. Why? Word of mouth marketing, the “long tail”, customer satisfaction and brand management.
Is it responsive to my emotions?
When you wrote up requirements for your online business, did you remember to include emotion? Most likely, it never crossed your mind. Do you watch how people use web sites when they’re in a hurry? Upset? Worried? Stressed? Tired? Hungry?
Google recently launched SearchWiki. Regardless of what you may think about it, what motivated them are their users. Their data shows that searchers want better ways to search, organize, save their research and quickly find favorite web sites. With user feedback, Google can create user personas to help developers understand how searchers use Google when they’ve just been informed of bad news. How do stress and exhaustion affect search behavior?
Consideration for your web site visitors’ emotional state may be vital for your web site. While testing a web site recently for a young adult rehabilitation center, I was pleased to find their content was written in a warm, caring way. The colors were soothing pastels. The pictures showed happy clients. Unfortunately, their content was all about the center and types of therapy. It was long winded, requiring time to read and digest. What wasn’t addressed with bullet point details on the homepage was proof of their claims to calm fears and concerns over ethics. Were there case studies or testimonials? Could a parent talk to other parents who sent a child there?
What if the site visitor is a father at the end of his rope? He’s tired, busy, a widower and shaking with emotion as he scrolls, clicks and reads. Was the site designed for an upset dad? If not, what could be done to create a better user experience for him? For this site visitor, the sight of a healthy-looking child beaming into the camera while holding a farm animal might be an emotional hook. Someone available to speak with him “now” may persuade him to call, because his feelings are raw at that moment.
Consider an insurance quote site with low conversions. It contains all the bells and whistles, content, easy navigation, and basically the same claims that every other insurance quote company places on their web site. Even if this site should present a reasonable competitive business value proposition, their form may not be converting. Why? In many cases, there is no information about how long the quote process will take. How many pages is the form? How much information is expected from the applicant? Is the follow up response by email or phone call? Is there a choice? Do they show competitive data and choices?
The biggest mistake is to believe that web site appearance matters the most. How it looks is only one part of the process. How it performs is another. What it can give back to site visitors and how effectively it conveys that information will matter even more.