Every search engine has a ranking algorithm, and its spiders are tuned to it. Web spiders are on a mission to identify quality content and separate it from irrelevant information: they grab the content, extract its essence on a page-by-page basis, compare content for source reputation, determine the relationship between the content and the site, and finally assign a result. The algorithm uses rules to identify the most relevant pages based on your page's text content and its context. The context is usually indicated by the number of links from other pages and sites. To get your site into search engine results, a few common factors are needed.
This is where the copy on the page is matched to the search terms entered. These factors depend on the keywords used in anchor text, page title tags, and meta tags. Make sure you do not overdo keyword matching on your site, or it will look like spamming. It is more like labeling the content on your site, with the headline and link text reflecting the editorial content.
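The idea of matching copy without over-stuffing can be sketched in code. The snippet below is a minimal illustration, not a real ranking rule: it checks whether a (single-word) target keyword appears in the title and meta description, and flags a body whose keyword density exceeds an arbitrary threshold. The 5% cutoff and the field names are assumptions for the example.

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that equal `keyword` (case-insensitive).
    Assumes a single-word keyword, for simplicity."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

def check_page(title, meta_description, body, keyword, max_density=0.05):
    """Report whether the keyword appears in the title and meta description,
    and whether the body over-uses it (possible keyword stuffing)."""
    return {
        "in_title": keyword.lower() in title.lower(),
        "in_meta": keyword.lower() in meta_description.lower(),
        "stuffed": keyword_density(body, keyword) > max_density,
    }
```

For example, a page titled "Blue Widgets for Sale" whose short body repeats "widgets" in every sentence would pass the title and meta checks but trip the stuffing flag.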
Your website requires inbound and outbound links. Search spiders follow each link to a page, both from other pages within your site and from other sites. Pages with more inbound links usually rank higher, but you need to be careful that your outbound links are aligned with the content of your site. Essentially, the quality of these links becomes a governing factor.
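The "more inbound links rank higher" intuition can be shown with a toy example. The sketch below is not any search engine's actual algorithm (real systems weight link quality, as the paragraph above notes); it simply counts inbound links across a hypothetical link graph and ranks pages by that count. The page names are made up.

```python
from collections import defaultdict

# Hypothetical link graph: each page maps to the pages it links out to.
links = {
    "home.html":   ["products.html", "about.html"],
    "about.html":  ["home.html"],
    "blog.html":   ["products.html", "home.html"],
    "partner.com": ["products.html"],
}

def inbound_counts(graph):
    """Count how many pages link in to each target page."""
    counts = defaultdict(int)
    for source, targets in graph.items():
        for target in targets:
            counts[target] += 1
    return dict(counts)

# Rank pages by inbound-link count, highest first.
ranked = sorted(inbound_counts(links).items(), key=lambda kv: -kv[1])
```

Here `products.html`, with three inbound links, outranks the pages with fewer; a real engine would also discount links from low-reputation sources.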
The Key is Content
Your site requires content that strikes a balance between information the user reads and keywords the search spiders can translate into a top position on the results page. Content is vital, and it should come from key departments in your company, such as media relations, PR, or marketing.
These individuals or departments best understand the problems your customers have, and can advise and guide your site toward the consumer's search-query result pages. Fresh content written by different people also ensures your site's overall content varies in context.
Cloaking: This is a technique where websites intercept each request, recognize those coming from search engine bots, and serve different content to the bots than the pages seen by end users. It is obvious how this can be misused: you can load the pages you serve to the bots with misinformation to get a higher ranking. Such an action can very well result in search engines blocking your website.
Link Farms: Taking part in link farms, where webmasters agree to put links to each other's sites on their pages to increase their rankings, is another quick way to get blacklisted.
Monitor and Control Bot Traffic: Modern search crawlers constantly update their caches, and hence trawl the web with remarkable regularity to detect changes. There is a good chance these bot requests will overwhelm a website, so controlling them is of high importance. One solution is to run a mirror of the site on a web server dedicated to serving bots only; a load balancer can then be configured to redirect all requests from spider bots to that particular server.
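The routing decision such a load balancer makes can be sketched as a simple User-Agent check. The crawler signatures below are common ones, but the backend host names are assumptions for the example, and real deployments should not rely on User-Agent alone, since it is trivially spoofed (Google, for instance, recommends verifying crawlers via reverse DNS).

```python
# Substrings found in common crawler User-Agent headers.
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "duckduckbot", "baiduspider")

def route_request(user_agent,
                  bot_backend="mirror.internal",    # hypothetical bot-only mirror
                  user_backend="www.internal"):     # hypothetical main web server
    """Pick a backend the way a load-balancer rule might: requests whose
    User-Agent matches a known crawler go to the dedicated mirror server."""
    ua = user_agent.lower()
    if any(sig in ua for sig in BOT_SIGNATURES):
        return bot_backend
    return user_backend
```

With this rule, a request from `Mozilla/5.0 (compatible; Googlebot/2.1; ...)` is routed to the mirror, while an ordinary browser request goes to the main server.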
No business can afford to ignore its web presence, especially its eCommerce portal. Online is increasingly the way business is done: eCommerce accounted for roughly 4.6% of the $1.05 trillion in retail sales in the third quarter of 2011 alone. And the explosion of the mobile market only adds to the buzz: purchases made on mobile devices are predicted to reach as much as $4.9 billion this year and to grow to as much as $163 billion by 2015.
And the ground is shifting rapidly, given the number of new social media outlets appearing alongside the seismic shift toward mobile browsing and buying. More than 63% of retailers are currently considering a redesign so that their sites render better on smartphone and tablet browsers, and according to Internet Retailer, over 80% said they would devote more resources to mobile in 2011, a number that has climbed in 2012 as mobile purchases rise.
What online sellers used to know separately as Google Product Search and Product Listing Ads are now being merged into one entity called Google Shopping. As you probably know, while Google Product Search was a free, organic search offering, Product Listing Ads is a paid ad format. Now all sellers participating in Google Shopping will have to use the pay-for-performance model.
So, while sellers can still benefit from free, organic results on Google.com, those who were previously using the free Google Product Search and want their products to show up in Google Shopping may have questions about how to navigate the new world of bidding for their ads most effectively.
EcommerceBytes spoke with Google's Vice President of Product Management, Sameer Samat, about how Google Shopping is different, how sellers can best take advantage of the new model, and how to get started with it.
Search engine optimization is a popular and effective Internet marketing strategy. In fact, MarketingSherpa found it is used by more marketers than any other marketing tactic. While many ecommerce merchants understand the value of SEO, they may not realize they need to optimize every single product page.
Since SEO can help your ecommerce business thrive, optimizing all your product pages for search engines is an important step toward gaining more traffic and sales. Below are 10 tips to help ensure your product pages are properly optimized.
It is generally thought essential to be in the top two or three links of the search engine results listings, or at least on the first page of results, but this isn't necessarily so!
Think about it: users who drill further down into the search engine results may be further along the purchase path, more committed to buying, and looking for something specific. For this reason, visit-to-conversion rates from the lower results may in fact be higher than those from your top listings!