Locators present a unique challenge when showcasing a brand’s sales, distribution and retail channels: they serve the customer by identifying the nearest place to buy, but they also hand competitors a tool for analyzing market penetration and distribution networks.
In practice, this can lead to a competitor crawling a locator on a regular schedule to monitor the expansion of a distribution network, movement into new markets or other changes in the competitive landscape.
Why would competitors crawl your locator?
- To build a list of your installers/dealers/distribution contacts so their sales team can push competing products directly alongside your own.
- To watch for changes in your market penetration and distribution.
Many types of anti-crawling technology are available for websites; however, a locator needs a special kind of anti-crawling protection. MetaLocator uses an AWS Web Application Firewall, which includes bot protection, as part of a larger stack of security best practices. Even so, these off-the-shelf technologies rarely detect and block the type of competitive-analysis activity discussed in this article.
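For readers curious what that off-the-shelf layer can look like, here is a minimal sketch of an AWS WAFv2 web ACL with the AWS Managed Bot Control rule group attached via boto3. It is an illustration only, not MetaLocator’s actual configuration; the resource names and the REGIONAL scope are assumptions.

```python
import boto3

# Illustrative only: a minimal AWS WAFv2 web ACL that attaches the
# AWS Managed Bot Control rule group. Names and scope are assumptions,
# not MetaLocator's real configuration.
wafv2 = boto3.client("wafv2", region_name="us-east-1")

wafv2.create_web_acl(
    Name="locator-web-acl",          # hypothetical ACL name
    Scope="REGIONAL",                # use "CLOUDFRONT" for a CloudFront distribution
    DefaultAction={"Allow": {}},     # allow traffic unless a rule blocks it
    Rules=[
        {
            "Name": "aws-bot-control",
            "Priority": 0,
            "Statement": {
                "ManagedRuleGroupStatement": {
                    "VendorName": "AWS",
                    "Name": "AWSManagedRulesBotControlRuleSet",
                }
            },
            # Let the managed rules' own block/allow actions apply.
            "OverrideAction": {"None": {}},
            "VisibilityConfig": {
                "SampledRequestsEnabled": True,
                "CloudWatchMetricsEnabled": True,
                "MetricName": "bot-control",
            },
        }
    ],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "locator-web-acl",
    },
)
```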
Common examples of competitive crawling include:
- A real human visits the locator and performs search after search, trying to collect a list of all partners
- A bot runs through a list of postal codes and searches each one, gathering a list of partners
The challenge of identifying malicious traffic without blocking real users is as old as the Internet itself. MetaLocator includes a built-in blocking layer that captures and logs malicious bots and users. Our built-in crawler protection is driven by industry experience, statistical analysis and application-level logic.
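As a rough illustration of that kind of application-level logic, the sketch below flags a client that searches an unusually large number of distinct postal codes within a short window, which looks like enumeration rather than a real shopper. The threshold, window and function names are hypothetical; this is not MetaLocator’s actual detection code.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical sliding-window heuristic: a real shopper searches one or two
# postal codes; an enumeration bot searches dozens. The threshold and window
# are illustrative values, not MetaLocator's production settings.
WINDOW = timedelta(minutes=10)
MAX_DISTINCT_POSTAL_CODES = 25

_searches: dict[str, list[tuple[datetime, str]]] = defaultdict(list)

def record_search(client_ip: str, postal_code: str) -> bool:
    """Log a locator search and return True if the client looks like a crawler."""
    now = datetime.utcnow()
    _searches[client_ip].append((now, postal_code))
    # Drop events that have fallen out of the sliding window.
    _searches[client_ip] = [(t, p) for t, p in _searches[client_ip] if now - t <= WINDOW]
    distinct_codes = {p for _, p in _searches[client_ip]}
    return len(distinct_codes) > MAX_DISTINCT_POSTAL_CODES
```

In a real deployment, a signal like this would be logged and combined with other evidence, such as user agent, request cadence and geography, before a client is actually blocked.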
The outcome is a data-mining- and flood-resistant locator that serves customers and search engines, not competitors, while protecting your application from API overages and unwanted content farming.
One thing MetaLocator’s technical team has learned over the years is that locators are (1) a unique target for content crawlers and (2) not usually protected by traditional bot blockers and firewalls. This is yet another reason to avoid unsophisticated locator plugins and choose a hosted service like MetaLocator.