Inferring Intent

Today’s Gartner blogpost points to some interesting limitations and opportunities surrounding intent data. Let’s start at the beginning by defining what it is.

Simply put, intent data is an indication that an individual or organization is actively interested in purchasing a specific product or service. You may already be familiar with sales triggers. One classic sales trigger is so-called “new move” data. It’s valuable to know when a company moves offices because it is highly likely that the company will make lots of new purchases, such as office furniture and the like. Think of intent data as a more sophisticated cousin of the sales trigger.

Media companies are in a great position to generate sales intent data, because much intent data is generated by watching what a person reads and does online. If a reader looks at five articles on 3-D printers in a short period of time, those actions can be viewed as indicating an intention to purchase a 3-D printer. Intent data can get a lot more sophisticated than that, but this gives you the general idea.
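
To make that concrete, here is a minimal sketch of how page-view events might be turned into an intent signal. The event fields, the seven-day window and the five-article threshold are all illustrative assumptions, not a description of how any particular intent vendor actually works.

    # A minimal sketch of turning raw page-view events into an intent signal.
    # The event fields, topic taxonomy and thresholds are illustrative assumptions.
    from collections import defaultdict
    from datetime import datetime, timedelta

    # Each event: (reader_id, topic, timestamp)
    events = [
        ("reader-42", "3d-printers", datetime(2024, 5, 1, 9, 15)),
        ("reader-42", "3d-printers", datetime(2024, 5, 1, 10, 2)),
        ("reader-42", "3d-printers", datetime(2024, 5, 2, 14, 40)),
        ("reader-42", "3d-printers", datetime(2024, 5, 3, 8, 5)),
        ("reader-42", "3d-printers", datetime(2024, 5, 3, 16, 30)),
        ("reader-42", "office-chairs", datetime(2024, 5, 3, 16, 45)),
    ]

    WINDOW = timedelta(days=7)   # "a short period of time"
    THRESHOLD = 5                # e.g., five articles on one topic

    def intent_signals(events, window=WINDOW, threshold=THRESHOLD):
        """Return (reader, topic, count) where views inside the window meet
        the threshold -- a crude proxy for purchase intent."""
        views = defaultdict(list)
        for reader, topic, ts in events:
            views[(reader, topic)].append(ts)

        signals = []
        for (reader, topic), stamps in views.items():
            stamps.sort()
            for i, start in enumerate(stamps):
                in_window = [t for t in stamps[i:] if t - start <= window]
                if len(in_window) >= threshold:
                    signals.append((reader, topic, len(in_window)))
                    break
        return signals

    print(intent_signals(events))
    # [('reader-42', '3d-printers', 5)]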

You might think that if a sales organization has intent data available to it, that’s probably all the data it needs. After all, intent data is like mind-reading: it’s identifying people who are likely to be purchasing a product before they purchase it. What could be better?

Well, as the Gartner blogpost points out, many companies are filtering sales leads based on intent data with something called “fit analysis.” This is an automated attempt to evaluate whether the company is a likely buyer. If your company typically sells to larger, multi-office organizations, a fit analysis will filter out smaller, single-location companies because they represent lower-grade prospects.
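
As a rough illustration, a fit analysis can be as simple as a firmographic filter applied on top of intent-flagged leads. The company fields and cutoffs below are assumptions chosen for the example, not anyone’s actual scoring model.

    # A hedged sketch of a simple "fit analysis" pass over intent-flagged leads.
    # The firmographic fields and cutoffs are assumptions for illustration only.
    leads = [
        {"company": "Acme Corp", "employees": 2500, "offices": 12, "intent_score": 0.90},
        {"company": "Solo Studio", "employees": 8, "offices": 1, "intent_score": 0.95},
        {"company": "Midway Manufacturing", "employees": 400, "offices": 3, "intent_score": 0.70},
    ]

    def fits(lead, min_employees=100, min_offices=2):
        """True when the firmographics match the seller's ideal customer profile."""
        return lead["employees"] >= min_employees and lead["offices"] >= min_offices

    qualified = [lead for lead in leads if fits(lead)]
    for lead in qualified:
        print(lead["company"], lead["intent_score"])
    # Acme Corp 0.9
    # Midway Manufacturing 0.7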

Further, the Gartner blogpost notes that companies selling highly specialized products or brand-new technologies often can’t get enough intent-based sales leads, or the leads they do get are weak because the intent indicators aren’t sufficiently granular. Finally, some sales departments don’t like intent-based sales leads because they identify prospects too early in the sales process. As you can see, sales leads based on intention are still fairly rudimentary, and there is lots of opportunity to refine them.

But what’s most worthy of note is that Gartner believes most intent-based sales lead data is focused on the technology industry. There is no reason it should be; technology sellers just happen to be free-spending early adopters. I have long preached the virtues of what I call “inferential data,” a term that includes both intent and sales trigger data. I firmly believe that many data publishers have opportunities in this area, and if they happen to be part of larger media companies, those opportunities are even greater. In fact, data publishers are natural providers of fit analytics as well. If you look at your data creatively and read between the lines, you can make some very lucrative connections.

Data Democratization: A Timely Trend That Empowers Users

“Democratization” is the latest trend in data. While it is rapidly acquiring multiple definitions, the one I find most useful suggests that there is a growing opportunity to open up complex datasets to people who could benefit from them, but haven’t traditionally used them.

With this definition, data democratization usually involves some combination of pricing and user interface design. Reduced pricing is meant to make a data product more broadly accessible, and user interface design is about making the data incredibly easy to use. Putting these two together, those employing a data democratization strategy believe they can significantly expand their markets. In addition, a powerfully simple user interface should result in reduced support costs by enabling less sophisticated data users to start getting the answers they need directly, by themselves.

The best opportunities for data democratization? Look for data silos. The data provider combines several datasets, doing all the complex normalization and matching that is required. The user interface then lets users painlessly do what amounts to cross-tabulation and filtering, with all the complexity carefully hidden. Results usually take the form of highly visual data presentations.
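
For a sense of what that interface is hiding, here is a small sketch of the combine-then-cross-tabulate step, written with the pandas library. The datasets and column names are invented for illustration; a real product would do the matching at far greater scale and bury all of this behind a few clicks.

    # A minimal sketch of the join + cross-tab a democratized product hides.
    # The datasets and column names are invented for illustration.
    import pandas as pd

    facilities = pd.DataFrame({
        "facility_id": [1, 2, 3, 4],
        "region": ["Northeast", "Northeast", "South", "South"],
        "type": ["hospital", "clinic", "hospital", "clinic"],
    })

    ratings = pd.DataFrame({
        "facility_id": [1, 2, 3, 4],
        "patient_score": [84, 91, 77, 88],
    })

    # The provider does the normalization and matching once...
    combined = facilities.merge(ratings, on="facility_id")

    # ...and the user's "analysis" reduces to filtering and cross-tabulation.
    northeast = combined[combined["region"] == "Northeast"]
    print(northeast[["facility_id", "type", "patient_score"]])
    print(pd.pivot_table(combined, values="patient_score",
                         index="region", columns="type", aggfunc="mean"))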

Data democratization is not “dumbing down” data. Indeed, a democratized data product often has all the power of much more complex and expensive business intelligence (BI) software. The nuance is making the user interface more accessible and less scary, and reducing the price point so that the product isn’t a major purchase decision.

You can see an analogy of sorts with what happened with computers, moving from centralized, expensive installations operated by a few with specialized skills to the amazing desktop computing capabilities we all enjoy today. Whether data democratization is an opportunity of the same scale and profundity as the computer revolution is unclear, but it certainly bears close watching because this is a strategy with a powerful first-mover advantage.

To see a great example of data democratization, check out one of this year’s Models of Excellence, Franklin Trust Ratings.

Better yet, meet the founder behind it, John Morrow, at this year’s Business Information and Media Summit, Nov. 13 – 15 in Ft. Lauderdale. There will be lots of other data trendsetters there too!

Monetizing the Middle

Regular readers know that I am focused (fixated?) on a concept I call “central market position.” I use this term to describe companies (typically media and data companies) that occupy a trusted, established and neutral position in the markets they serve. Central market position is important because it can be monetized.

Traditional data publishers collect data themselves, whether via manual or automated means. They scrub it, organize it and otherwise add value to it, then turn around and sell it. This is a solid, established and successful model, but companies with central market position have a much larger opportunity.

With central market position, you have the potential to do things that nobody else can, things that would otherwise be viewed as impossible. You can, for example, ask all the companies in your industry to share with you their customer lists, their sales data, their employee information, their prospect lists – practically anything. How is this possible?

Well, two conditions must exist. First, this privileged information will only be provided if it is directly used to solve a major need or problem in the marketplace. Second, the intermediary who will be handling the information has to be established, trusted and neutral. The natural intermediary is a company that has a central market position.

Consider PeerMonitor, a product from Thomson Reuters. PeerMonitor hooks directly into the accounting software of participating law firms and sucks out all their billing information, right down to line-item detail. Why would any law firm allow this? Because the need to know the going market rate for, say, a bankruptcy attorney in Atlanta far outweighs the reflexive need to protect information like this.

Consider also a company called SQAD in the media world. Advertising agencies electronically submit their purchase orders to SQAD. Are they giving away key company secrets by doing this? Yes, but the data that comes back to these agencies – namely the real prices being paid for television and radio advertising – is more than worth it. SQAD, a central, trusted and neutral player, normalizes, de-identifies and aggregates the data in such a way that companies can give away their secrets without giving away their secrets. It works for everyone involved.

Another interesting company is MDBuyline. Here, participating hospitals submit price quotes for medical devices and other hospital equipment. MDBuyline aggregates the data so that all participating hospitals can see the true going rates for medical equipment, not the meaningless list price. Again, the benefit is sufficiently large to justify supplying confidential information to a third party.
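
Here is a rough sketch of the aggregate-and-de-identify step that makes this kind of sharing palatable, using hourly billing rates as the example. The field names, the three-contributor minimum and the data itself are assumptions for illustration, not how PeerMonitor, SQAD or MDBuyline actually process submissions.

    # A rough sketch of de-identification plus aggregation: contributors share
    # sensitive line items, and only benchmark figures come back out.
    # Field names, the minimum-contributor rule and the data are assumptions.
    from statistics import median

    line_items = [
        {"firm": "A", "practice": "bankruptcy", "city": "Atlanta", "rate": 525},
        {"firm": "B", "practice": "bankruptcy", "city": "Atlanta", "rate": 610},
        {"firm": "C", "practice": "bankruptcy", "city": "Atlanta", "rate": 575},
        {"firm": "A", "practice": "litigation", "city": "Atlanta", "rate": 700},
    ]

    MIN_CONTRIBUTORS = 3  # suppress any cell built from too few firms

    def benchmark(items):
        """Return median rates by (practice, city), dropping firm identities and
        suppressing thin cells so no single contributor can be backed out."""
        cells = {}
        for it in items:
            cells.setdefault((it["practice"], it["city"]), []).append((it["firm"], it["rate"]))

        out = {}
        for key, rows in cells.items():
            firms = {firm for firm, _ in rows}
            if len(firms) >= MIN_CONTRIBUTORS:
                out[key] = median(rate for _, rate in rows)
        return out

    print(benchmark(line_items))
    # {('bankruptcy', 'Atlanta'): 575}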

What you need to do is recognize your central market position, and start identifying market needs you can address as the central collector and aggregator of critical industry data that would otherwise never be shared. Trust me, the opportunities are endless. 

Getting Inside the Head of a Sales Prospect

B2B prospect identification and targeting has come a long way in the last few years. Things that once seemed impossible are now taken for granted. We can now identify with some precision when someone in a company is actively in the market for a new product. We can take this purchase interest information and bump it against company firmographic data to help qualify and score this individual as a lead. We can easily review the business contacts of this person to see if we know people in common. We can view the work history of this person, and even order a deep background report based on public domain data. We can order an organization chart for this person’s company to understand where he or she fits in the hierarchy of the business, as well as to identify other possible purchase influencers. Pretty impressive, right?
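
As a hedged illustration of that “bump it against firmographic data” step, a composite lead score might blend intent, fit and relationship signals with simple weights. The weights and inputs below are assumptions for the example, not any vendor’s scoring model.

    # A hedged sketch of blending intent, firmographic fit and network signals
    # into one lead score. Weights and proxies are illustrative assumptions.
    def score_lead(intent_score, employees, shared_contacts,
                   w_intent=0.6, w_fit=0.3, w_network=0.1):
        """Blend purchase-interest, firmographic-fit and relationship signals
        into a single 0-1 lead score."""
        fit = min(employees / 1000, 1.0)         # crude size-based fit proxy
        network = min(shared_contacts / 5, 1.0)  # mutual-contact signal
        return w_intent * intent_score + w_fit * fit + w_network * network

    print(round(score_lead(intent_score=0.9, employees=2500, shared_contacts=2), 2))
    # 0.88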

But what if we could go further? What if we could get something close to a psychological profile of the prospect to better understand how to interact with that person and advance the sales conversation? You probably won’t be surprised to hear that there is a company working on it.

The company is called CaliberMind. By mining public domain data, email exchanges with the prospect and even recorded telephone conversations with the prospect (prior consent to recording is required by CaliberMind), CaliberMind can provide a salesperson with deep and unique insights into the personality and motivations of the prospect, along with recommendations on how to engage them most productively.

Yes, there is an inherently creepy aspect to this, but CaliberMind stresses that it works only with public information and information freely shared between the salesperson and prospect. What it does, beyond mining these information nuggets, is interpret them to build a deep profile of the prospect, along with specific tips on how to accelerate the sales process. Not surprisingly, the company was founded by former intelligence agents.

This is cutting-edge stuff from a young company, but in many respects it seems to be the logical culmination of the various selling tools that have been introduced over the past few years. CaliberMind is leveraging both increased computing power and the explosion of public domain information to help inform and accelerate the B2B selling process. CaliberMind also represents just one more piece of evidence that data opportunities are everywhere – and that the tools needed to collect, process and apply data continue to get more and more powerful.

 

Can You Over-Monetize?

To avoid accusations of commercial blasphemy, I am going to pose this as a question, not a statement: can you over-monetize a data product?

Consider the online real estate listings databases. There are lots of them, all engaged in a fierce battle to the death. They make their money selling listing upgrades to real estate agents, a hotly competitive and demanding group. The homes being listed can easily cost $1 million or more, with very sizable commission dollars at stake. In such a high-ticket and fiercely competitive market, would you want to junk up the user experience with irrelevant advertising, and annoy your real estate agent customers by distracting users from the listings they are paying to enhance? The answer appears to be yes.

Several of these sites have now been designed to display programmatic ads. With all it takes to attract a live buyer to your site, do you really want to risk that buyer clicking on an ad for a local car dealer and leaving your site entirely? Do you want to intersperse listings of homes with ads for mortgages when your primary source of revenue is real estate agents who badly want your site visitors to look at their listings?

You know the saying: real estate is all about “location, location, location.” Does it make sense then that when a potential buyer clicks on a map icon to see where a home is located, she is presented with a map cluttered with logos indicating the location of nearby State Farm insurance offices? Does anyone buy a house based on proximity to an insurance agent? Doubtless someone thought this was a clever marketing gambit, but it distracts, confuses and possibly annoys the potential buyer.

The photo slideshows that are the critical core of each home listing are now increasingly cluttered with advertising. If I were a real estate agent paying to upgrade a listing only to find it chock full of ads, I’d be furious. I want prospects looking at pictures of the home I am selling, not distracted or annoyed by irrelevant advertising from third parties.

A lot of this comes down to the degradation of the user experience. But sometimes it’s an even bigger issue: the data publisher has forgotten who it is serving and, in some cases, why it is even in business. A little bit of incremental revenue can sometimes carry a very high cost. And the guiding rule of all things online remains the same: just because you can, doesn’t mean you should.