
Is Your Data "Datanyzed"?

A new product by a cool young company called Datanyze is capitalizing on some well-established infocommerce best practices. Here’s how they did it.

The core business of Datanyze is identifying what SaaS software companies are using (sometimes called a company’s “technology stack”). To do this, Datanyze interrogates millions of company websites on a daily basis, looking for telltale clues as to the specific software they are employing online, and apparently a lot of categories of software can be divined this way. Datanyze aggregates and normalizes these data, then overlays company firmographic data (Alexa website rank, contact information, revenue estimates) to create a complete company profile.
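To make the idea concrete, here is a minimal sketch of signature-based technology detection. The signature table is illustrative only: the markers shown are commonly associated with the named products, but a real detection engine like Datanyze's maintains far larger, continually curated signature sets and inspects much more than raw HTML.

```python
import re

# Illustrative signature table: a marker that, when present in a page's
# HTML, implies a specific product is in use.
SIGNATURES = {
    "Google Analytics": re.compile(r"google-analytics\.com/ga\.js|gtag/js"),
    "HubSpot": re.compile(r"js\.hs-scripts\.com"),
    "Marketo": re.compile(r"munchkin\.marketo\.net"),
}

def detect_stack(html: str) -> list[str]:
    """Return the products whose telltale markers appear in the HTML."""
    return sorted(name for name, pattern in SIGNATURES.items()
                  if pattern.search(html))

sample = '<script src="https://js.hs-scripts.com/12345.js"></script>'
print(detect_stack(sample))  # ['HubSpot']
```

Run daily across millions of sites, even a simple matcher like this yields the raw technographic data that gets aggregated and normalized downstream.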

Datanyze links directly to the Salesforce accounts of its customers, so it can add and update prospects on a real-time basis. At a basic level, the use case for this product is straightforward: a marketing automation platform like Eloqua could use it to find companies using a competitor or no marketing automation at all. But wait, there’s more!

Datanyze’s new product essentially flips this service. Now, Datanyze clients can have Datanyze analyze their existing best customers, and Datanyze will build a profile of these customers that can be used to predictively rank all their prospects, current and future. Here are the best practices to note:

  • The transition of Datanyze from a data provider to an analytics provider, something that’s happening industry-wide
  • The shift from passive (we supply the data, you figure out what to do with it), to active (here are top-rated prospects we’ve identified for you), and the associated increase in value being delivered by the data provider
  • The tight integration with Salesforce means that Datanyze customers just need to say “yes” and Datanyze can get to work – no IT involvement, no data manipulation, no delays
  • Datanyze is pouring leads into critical, core systems of its customers, a strong example of workflow integration
  • The use of inferential data. Boil away the analytical nuance, and Datanyze has discovered that companies that buy expensive SaaS software are better prospects for other kinds of expensive SaaS software. Datanyze doesn’t know these companies have big budgets, but it does know they use software that implies big budgets
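The predictive-ranking idea above can be sketched in a few lines. This is not Datanyze's actual model, just a toy illustration of the principle: build a profile from the technology stacks of your best customers, then score every prospect by how closely its stack resembles that profile.

```python
from collections import Counter

def build_profile(best_customers):
    """Count how often each technology appears among best customers."""
    profile = Counter()
    for stack in best_customers:
        profile.update(set(stack))
    return profile

def score(prospect_stack, profile):
    """Score a prospect by how much its stack overlaps the profile."""
    return sum(profile[tech] for tech in set(prospect_stack))

best = [["Salesforce", "Marketo"], ["Salesforce", "HubSpot"]]
profile = build_profile(best)
prospects = {"Acme": ["Salesforce", "Marketo"], "Globex": ["Shopify"]}
ranked = sorted(prospects, key=lambda p: score(prospects[p], profile),
                reverse=True)
# ranked[0] == "Acme": its stack most resembles the best customers'
```

A production system would weight technologies, normalize for stack size, and fold in firmographic signals, but the flip from "here is data" to "here are your best prospects, ranked" is exactly this move.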

Datanyze offers a concrete example of how data companies are evolving from generating mountains of moderate-value data to delivering much more precise, filtered and valuable answers. Are you still selling data dumps, or are you selling analytics and answers?

Shine a Light on Your Hidden Data

If you watch the technology around sales and marketing closely, you’ll know that beacon technology is all the rage. Stores can purchase beacon broadcasting equipment, and when shoppers enter their stores with beacon-enabled apps, the apps will respond to the beacon signals – even when the apps are not in use. Stores see nirvana in pushing sale offers and the like to customers who are already on the premises. And of course, it is expected that some mainstream apps (Twitter is often cited, though this is unconfirmed) will become beacon-enabled as well.

Beacons represent a concrete manifestation of the larger frenzy surrounding geolocation. Everyone wants to know where consumers are at any given moment, as epitomized by big players such as Foursquare, which has evolved from its gimmicky “check ins” to become more of a location-driven discovery service.

That’s why I was so intrigued by Foursquare’s most recent product announcement, Pinpoint. Shifting focus from where people are now to where they have been, Pinpoint will mine Foursquare’s location history for valuable insights that companies can use for precise ad targeting.

Details about Pinpoint are scarce right now, but Foursquare is smart to start mining its historical data. At the lowest level, it means that Foursquare can help, say, Starbucks target lots of Starbucks customers. Useful, but not too sophisticated. If Pinpoint can roll up businesses by type (such as pet food stores), it starts to get a lot more interesting. But the real home run would be to be able to divine purchase intent. If someone visits three car dealers in a short period of time, you suddenly have an amazingly valuable sales lead. And mining insights like this is now practical with Big Data tools.
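The car-dealer example can be sketched directly. The function below (hypothetical, not anything Foursquare has published) flags users who visited a threshold number of distinct venues of one category within a short window, which is the essence of inferring purchase intent from location history:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def intent_leads(visits, category, threshold=3, window=timedelta(days=7)):
    """visits: (user, venue, category, timestamp) tuples.
    Return users who hit >= threshold distinct venues of the given
    category inside any window-long span."""
    by_user = defaultdict(list)
    for user, venue, cat, ts in visits:
        if cat == category:
            by_user[user].append((ts, venue))
    leads = set()
    for user, hits in by_user.items():
        hits.sort()
        for i, (start, _) in enumerate(hits):
            venues = {v for t, v in hits[i:] if t - start <= window}
            if len(venues) >= threshold:
                leads.add(user)
                break
    return leads

visits = [
    ("alice", "Dealer A", "car dealer", datetime(2015, 5, 1)),
    ("alice", "Dealer B", "car dealer", datetime(2015, 5, 3)),
    ("alice", "Dealer C", "car dealer", datetime(2015, 5, 5)),
    ("bob", "Dealer A", "car dealer", datetime(2015, 5, 1)),
    ("bob", "Coffee Shop", "cafe", datetime(2015, 5, 2)),
]
print(intent_leads(visits, "car dealer"))  # {'alice'}
```

At Foursquare's scale the same rollup runs over billions of check-ins, which is where Big Data tooling earns its keep.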

But the real insight here is that your history data isn’t just ancient history: it provides the multiple data points you need to find patterns and trends. Knowing that a company replaces its CEO every 18 months or so is a hugely valuable insight that you can identify simply by comparing your current data to your historical data. At a minimum, you’ve got a powerful sales lead for recruiters. But that level of volatility might be a signal of a company with problems, thus creating useful insights in a business or competitive intelligence context. We’ve all heard about the predictive power of social media sentiment analysis. You may have equally valuable insights lurking in your own data. All you need to do is shine a light on them.
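The CEO-turnover example requires nothing more exotic than comparing dated snapshots of your own records. A minimal sketch, assuming you keep a dated history of who held the role:

```python
from datetime import date

def avg_tenure_months(ceo_history):
    """ceo_history: list of (start_date, ceo_name) sorted by date.
    Average months between successive CEO changes, or None if there
    has been no change to measure."""
    if len(ceo_history) < 2:
        return None
    spans = [(b[0] - a[0]).days / 30.44  # avg days per month
             for a, b in zip(ceo_history, ceo_history[1:])]
    return sum(spans) / len(spans)

history = [(date(2018, 1, 1), "A"),
           (date(2019, 7, 1), "B"),
           (date(2021, 1, 1), "C")]
tenure = avg_tenure_months(history)
# roughly 18 months on average -- a volatility signal worth surfacing
```

The point is that the signal was already sitting in the data; the only new work is comparing today's snapshot against yesterday's.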

How Starbucks in Mall of America looks to Foursquare

The Award for Outstanding Performance Goes to Internet Movie Database

We awarded the Internet Movie Database a Model of Excellence in 2003, and it is still a standout in terms of innovation and best practices.

The Internet Movie Database (often called by its acronym IMDB) originally started in the UK as a non-profit undertaking, and it may well be the earliest and most successful example of crowdsourcing – well over a decade before the term was even coined. Very simply, the IMDB was a site for movie buffs worldwide to build an enormously detailed database of every movie ever made. And we are talking about a serious level of detail. Want to know who was the hairstylist for the co-star of an obscure French drama from the 1950s? Well, IMDB was the go-to source. What also made IMDB interesting was that from its inception it was a true database, and despite the inherently unruly nature of crowdsourcing, there were enough committed volunteers to take on the unsexy work of removing duplicate entries and normalizing the data.

In 1998, IMDB was quietly acquired by Amazon and turned into a for-profit company. There are some great best practices to be observed here. Taking over and commercializing a site built by tens of thousands of unpaid, die-hard movie fans was a risky proposition. The backlash could have killed the business in short order. But Amazon left IMDB alone, infusing it with editorial resources so the database got bigger and better every year. Better data, less work and all free. Not much here to get upset about!

But Amazon (surprise!) wasn’t in this to be charitable. First, it started marketing to the substantial audience of IMDB users with links to its site. Like the movie? Great. Amazon can sell you a copy.
Amazon’s next move was to sell sponsorships to movie studios eager to promote upcoming releases. From there, Amazon launched a subscription-based Pro version of the database that offered enhanced searching and even deeper content to movie industry professionals for research purposes. The core site remained free, meaning Amazon was a pioneer with the freemium model, well before that term had become popular.

Is Amazon now resting on its laurels? Absolutely not. To support both its Kindle and Amazon Prime offerings, Amazon has launched a service called X-Ray, powered by IMDB. Amazon also selectively licenses this new data capability. What X-Ray does is link movies to the IMDB database, so users can visually identify actors in the film, find movie trivia, explore the movie soundtrack and much more, right while watching the movie. It’s not all software magic, by the way. Amazon is doing a lot of the necessary linkages manually, but it already has thousands of movies coded. Also of interest: Amazon is touting its “X-Ray Enabled” badge which, if it plays its cards right, could someday become a differentiator for new movie releases.

Endless innovation. Strong support of its core e-commerce platform. Deft handling of an often prickly enthusiast community. Endless monetization. This is where data is going!

Does Co-Dominance Spur Disruption?

Outspoken Zillow CEO Spencer Rascoff made headlines this week by using an industry event to publicly describe his arch-rival, Murdoch-owned Move Inc., as “a crappy company.”

There’s no love lost on the Move side either. Move, which operates the Realtor.com real estate listings site, previously cut off listing feeds to Trulia right after Trulia was acquired by Zillow, and now has Zillow in court over its merger with Trulia.

Certainly, the stakes in the real estate listings data business are huge, so bare-knuckle competition isn’t surprising. What is surprising is that both companies are finding success with radically different business models.

Realtor.com has what I view as a very conventional “just the facts ma’am” user interface. It offers basic parametric search, with results displayed as summary listings, each offering fast access to listing detail. Real estate agents can pay to advertise themselves or highlight specific listings, and are provided with sales leads as well.

Zillow, as you may recall, burst onto the scene with its “Zestimates,” its estimate of the value of every home in the country. This got Zillow immediate interest and tons of traffic, and it quickly became a major player in the market. Zillow also distinguishes itself with a map-based user interface and somewhat different listing detail than Realtor.com. But the “Zestimates” that helped Zillow rocket to the big time are a double-edged sword. Sellers almost always feel their Zestimates should be higher, and buyers tend to assume they are much more authoritative than they really are. Zillow also sells advertising to real estate agents with essentially the same suite of offerings as Realtor.com.

Does it make sense that both can thrive? Certainly, we see examples of “co-dominance” in many very large B2C markets simply because they are so large. But while more subtle, it appears that the biggest weakness of both sites – neither has 100% of all listings – may be a strength. That’s because lots of people use both products, leaving real estate agents uncertain about where to place their advertising dollars.

It’s the same situation we saw play out in the heyday of the yellow pages industry. Independent yellow pages directories sprang up everywhere as lower-cost competitors to big, established telephone company directories. But advertisers, rather than cheering and running to advertise in the new, cheaper upstarts, found themselves confused and fearful. Which directory did their customers use? Did they use both? Well, the safest course for many advertisers was to advertise in both directories, meaning their cost to reach the same market went up significantly. Not surprisingly, advertisers were not happy with this outcome.

There are rumblings of discontent in the real estate market as well. Indeed, a new initiative called National Broker Portal Project, meant to be run by and for real estate agents and brokers, is gaining steam. It wants to create a major site that will be both dues-funded and run according to rules developed by the brokers themselves. It’s a long shot to be sure, but it shows once again that being the dominant player in a market is tricky, and sharing that dominance is even trickier. We must all remember that disruption in any industry is not inherently a one-time event.

Getting Your Data Into the (Work)Flow

In a fascinating move this week, Salesforce announced a new plug-in offering tight integration with Microsoft Outlook. This new capability, still in beta release, is offered free to Salesforce customers with its Enterprise plan or higher.

Why is this fascinating? First of all, Salesforce competes head-to-head with Microsoft in the CRM space, so this is arguably a shot across Microsoft’s bow on that front. But more importantly, it shows the growing importance of both applications and tight integration to data publishers.

One of the great weak spots of most CRMs has long been email integration. Getting email from one’s email client to the CRM has tended to be clunky and far from automatic. Getting salespeople to send email from the CRM tended not to be practical, and nobody wanted email messages spread across two systems.

This new integration from Salesforce doesn’t magically solve all these issues, but it’s a big step toward making the user’s preferred systems and processes more powerful. And that’s exactly the mantra we’ve been preaching to the data industry for many years now. Get into the user’s work environment and get in deep. This is a great example of this principle at work. And even better, the many data publishers already building on the Salesforce platform can adopt it immediately and for free.

This brings up a larger discussion about data publishers becoming dependent on third-party platforms. Sometimes it makes sense and sometimes it doesn’t. And part of that decision process involves an honest assessment of whether or not you could reasonably achieve this kind of deep integration yourself.

Another useful point this new plug-in highlights: for all its issues and flaws, email isn’t going away anytime soon. And because email remains the core workflow tool at most companies, it is arguably the most important place to seek to embed your data … provided your data makes sense in this context.