Cloaking: This is the technique whereby web sites intercept each request, recognize those coming from search-engine bots, and serve the bots different content from the pages seen by end users. The potential for misuse is obvious: the pages served to the bots can be loaded with misinformation to gain a higher ranking. Doing so can very well result in the search engines blocking your web site.
Link Farms: Taking part in a link farm, where webmasters agree to place links to each other's sites on their own pages to boost their rankings, is also a quick way to get blacklisted.
Monitor and Control Bot Traffic: Modern search crawlers constantly update their caches and hence crawl the web with remarkable regularity to detect changes. There is a good chance that these bot requests will overwhelm a web site, so keeping them under control is important. One solution is to run a mirror of the site on a web server dedicated to serving bots only. A load balancer can then be configured to redirect all requests from spider bots to that server.
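The routing decision described above can be sketched in a few lines. This is a minimal illustration, not a production implementation: the bot signatures and host names are assumptions, and a real deployment would do this in the load balancer itself (and should also verify crawler identity, since User-Agent strings can be spoofed).

```python
# Sketch: route known search-engine bots to a dedicated mirror host.
# BOT_SIGNATURES and the host names below are illustrative assumptions.

BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "duckduckbot")

def backend_for(user_agent: str) -> str:
    """Return the backend host that should serve this request."""
    ua = (user_agent or "").lower()
    if any(sig in ua for sig in BOT_SIGNATURES):
        return "bots.example.com"   # mirror dedicated to crawlers
    return "www.example.com"        # regular user-facing servers

print(backend_for("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # bots.example.com
print(backend_for("Mozilla/5.0 (Windows NT 10.0)"))            # www.example.com
```

In practice the same rule would be expressed as a load-balancer condition on the User-Agent header rather than application code.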
How do websites go about collecting information about what their users are interested in buying? One approach is to ask users to answer a number of carefully drafted questions. Web sites use the answers to these questions to determine how best to customize the shopping experience. This form of information gathering is called explicit profiling. The advantage of this approach is that the user volunteers the information; no guessing is required. The disadvantage is that, in this day and age, people are over-cautious about the information they provide to online vendors, and often enough their responses to some of the questions may be false.
Explicit Profiling and the Anonymous User
If web sites personalize using only the explicit profile, personalization might not be effective and, far worse, might backfire and cost them users. Explicit profiles are created only for users willing to register with the web site, which effectively rules out a personalized shopping experience for users not registered with the site (known as anonymous users). The pitfall is a major disparity between the experience of a registered user and that of an anonymous user. Anonymous users should be treated as prospective registrants: a good shopping experience might lure them into registering with the site, at which point an explicit profile can be created for them.
Another approach to determining user interest is capturing click-stream data as the user navigates through the site. The click-stream records which pages a user viewed, what searches were performed, and which links were clicked. This information can be captured for both anonymous and registered users. For anonymous users, the click-stream can drive some amount of personalization; for registered users, it can be used to fine-tune the user's interests. Click-stream data is an implicit way of capturing user information.
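The kind of click-stream record described here can be sketched with a simple event log. The field names and the "interest" heuristic below are illustrative assumptions; a real system would persist events to a data store and run far richer analysis.

```python
# Sketch: minimal in-memory click-stream capture, usable for both
# anonymous (session ID) and registered (account ID) users.
from dataclasses import dataclass, field
import time

@dataclass
class ClickEvent:
    user_id: str       # session ID for anonymous users, account ID otherwise
    event_type: str    # "page_view", "search", or "link_click"
    detail: str        # URL viewed, query entered, or link target
    timestamp: float = field(default_factory=time.time)

class ClickStream:
    def __init__(self):
        self.events: list[ClickEvent] = []

    def record(self, user_id: str, event_type: str, detail: str) -> None:
        self.events.append(ClickEvent(user_id, event_type, detail))

    def interests(self, user_id: str) -> list[str]:
        """Crude interest signal: the search terms this user has entered."""
        return [e.detail for e in self.events
                if e.user_id == user_id and e.event_type == "search"]

stream = ClickStream()
stream.record("anon-42", "page_view", "/products/cameras")
stream.record("anon-42", "search", "mirrorless camera")
print(stream.interests("anon-42"))  # ['mirrorless camera']
```

For a registered user, the same events would be keyed by account ID and merged into the explicit profile rather than analyzed in isolation.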
The Power of Two
In the context of determining user interests, these two approaches are not mutually exclusive; they go hand in hand and complement each other, and when used together they prove to be potent tools. In explicit profile creation, the online registration form is not just any odd form for capturing information. The questions presented to the user should be few enough to minimize the time required to complete registration, yet pointed enough to gather as much useful information as possible; drafting an effective registration form takes a lot of planning. In the implicit click-stream approach, the volume of data captured becomes huge within minutes of turning capture on. Careful analysis of terabytes of data is required to glean knowledge from it, and poor analysis of click-stream data almost always leads to poor personalization.
- The RSS 2.0 specification is the dominant method of feed distribution via XML. However, it is plagued by many interoperability issues, and most developers consider the specification ambiguous. Atom 1.0, which draws on RSS 2.0, is the specification standardized by the IETF and is considered the better one.
- Using a feed validator service early and often is highly recommended.
- Ensure unique IDs for articles.
- Support autodiscovery; do not serve feeds with the text/xml content type (use application/atom+xml for Atom feeds).
- Use atom:summary for summary data, atom:content for full content.
- Embed well-formed XHTML.
- An XSL style sheet is the best method for formatting XML output.
- Using an RSS publishing service such as FeedBurner is always a good idea.
- Encourage the practice of embedding license metadata in the feeds.
- Use ping services to notify third parties about feed updates.
- For decent interoperability with reasonable security, use HTTP basic authentication over SSL.
- For excellent interoperability with low security, use obscure feed URLs.
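Several of the guidelines above (unique IDs, atom:summary, XHTML content) can be seen in a minimal Atom 1.0 entry. This sketch uses only the Python standard library; the tag URI and titles are illustrative assumptions, and a real feed would add required elements such as atom:updated and atom:author before validation.

```python
# Sketch: building a minimal Atom 1.0 feed with a stable, unique entry ID.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
ET.register_namespace("", ATOM)  # serialize Atom as the default namespace

feed = ET.Element(f"{{{ATOM}}}feed")
ET.SubElement(feed, f"{{{ATOM}}}id").text = "tag:example.com,2024:/blog"
ET.SubElement(feed, f"{{{ATOM}}}title").text = "Example Blog"

entry = ET.SubElement(feed, f"{{{ATOM}}}entry")
# Globally unique, never-reused ID per article (a tag URI here):
ET.SubElement(entry, f"{{{ATOM}}}id").text = "tag:example.com,2024:/blog/1"
ET.SubElement(entry, f"{{{ATOM}}}title").text = "First post"
ET.SubElement(entry, f"{{{ATOM}}}summary").text = "A short teaser."
content = ET.SubElement(entry, f"{{{ATOM}}}content")
content.set("type", "xhtml")  # full article body goes here as well-formed XHTML

xml_out = ET.tostring(feed, encoding="unicode")
print(xml_out)
```

Running the resulting document through a feed validator, as recommended above, will flag anything this minimal sketch leaves out.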
Wal-Mart’s ecommerce website gets a revamp with new social features like ‘Trending Now’, which is designed to match the shopping patterns of today’s online customers.
The world’s largest retailer is making extensive use of its 4,000-plus store footprint along with mobility to achieve dominance in the digital sphere.
Walmart now credits revenues to brick-and-mortar stores for online purchases that are picked up there, reducing the sense of internal inter-channel competition, according to Neil Ashe, CEO of Walmart’s global ecommerce division. He added that about half of the merchandise ordered on Walmart’s website is shipped directly to a store, according to published reports, and that Walmart is also the only e-commerce site that allows its customers to pay cash for online purchases.
If not enough users visit your website, your ecommerce business will obviously suffer. The success of an ecommerce business depends on the website’s traffic, so it is imperative to work on increasing it. Here are some tips:
Email marketing is not dead; it is still a strong marketing tool. Use a tool like MailChimp or AWeber to capture a list of subscribers from your website. Once you have a sufficient number of subscribers, start sending newsletters on a daily or weekly basis to bring users back to the website.
Share your new offers, discounts, blog posts, and virtually anything related to your business on social media. This is where you will get maximum traction. Have a proper strategy in place and work according to it.
After the proliferation of the internet and social media, you might think that offline marketing is dead. You would be wrong! It still has much to offer, and used strategically alongside online media, it can do wonders. For example, put a QR code for your website and apps on your print material rather than just the URL; this makes it easy for the target audience to access the content. Include your web and social media profiles on your business card as well.
Two years ago, when Strangeloop started tracking the load times of 2,000 top North American ecommerce sites, we had a hunch we’d spot some interesting trends over time. We did not expect, however, to see that pages are continuing to get slower rather than faster. Yet according to the Fall 2012 release of our quarterly Ecommerce Page Speed and Web Performance State of the Union, which came out today, that’s exactly what’s happening.
Not only are pages slower, they’re dramatically slower. Since November 2011, when we last tested these sites, the median home page has taken a 9% performance hit, with load time increasing from 5.94 seconds to 6.5 seconds. This flies in the face of conventional belief that, thanks to faster browsers, networks, and devices, the average end user is enjoying a premium online experience. This is clearly not the case. As user expectations continue to grow, the gap between expectations and reality continues to widen.
Maintenance of IT systems within the distribution center of a retail outlet is complex. Applications like enterprise resource planning (ERP) systems, warehouse management systems (WMS), warehouse control systems (WCS), and others like labor management systems (LMS) and transportation management systems (TMS) are just some of the integrated systems that top the maintenance checklist. A small problem in any of these areas could cause damage to the entire system.
To avoid such problems, it is imperative to understand how data is originally collected within the various applications. This understanding can uncover deficiencies in processes, controls, or even the base applications themselves. These deficiencies can then be countered by placing additional controls within the process of analyzing the base data for business intelligence applications. Software updates and maintenance releases are a must for all retailers’ systems.