AdWords Now Promotes Tele-Measurability

Google has just launched a new AdWords feature that offers lots of useful applications for online marketers. It’s so slick, I’m a little surprised it hasn’t received more press. In a nutshell, Google AdWords can now track and report not only how many people click on an ad, but how many people call because of an ad.

Yes, you’ve doubtless heard claims like this before, but this one seems pretty solid and pretty powerful as well. Here’s how it works.

Google AdWords dynamically inserts a key telephone number (supplied by Google) in all the places you specify on your landing pages or even your entire website. Put another way, anyone who gets to your website via AdWords will see a Google-supplied phone number, and everyone who gets to your website any other way will see your regular phone number. You can format the Google numbers to match your website’s look and feel, and the phone numbers will be dynamically supplied for up to 90 days.
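To make the mechanics concrete, here’s a minimal sketch of how this kind of dynamic number insertion could work on a landing page. It’s purely illustrative: the endpoint name and the gclid-based check are my own assumptions, not Google’s actual implementation.

```typescript
// Hypothetical sketch of dynamic number insertion on a landing page.
// Assumes a /api/forwarding-number endpoint (not Google's actual API)
// that returns a tracking number when the visitor arrived via an ad.

async function fetchForwardingNumber(): Promise<string | null> {
  // Only ad-driven visits (e.g. a gclid in the URL) get a tracking number.
  const params = new URLSearchParams(window.location.search);
  const gclid = params.get("gclid");
  if (!gclid) return null; // organic visitor: keep the real number
  const res = await fetch(
    `/api/forwarding-number?gclid=${encodeURIComponent(gclid)}`
  );
  return res.ok ? (await res.json()).number : null;
}

async function swapPhoneNumbers(): Promise<void> {
  const tracked = await fetchForwardingNumber();
  if (!tracked) return; // leave the regular number in place
  // Every element tagged with this class shows the Google-supplied number,
  // styled exactly like the rest of the page.
  document.querySelectorAll<HTMLElement>(".phone-number").forEach((el) => {
    el.textContent = tracked;
  });
}

document.addEventListener("DOMContentLoaded", swapPhoneNumbers);
```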

While Google doesn’t mention B2B applications specifically, you can immediately see the benefits. Few B2B buyers click through from an ad and immediately place an order. More likely than not, there’s a phone call involved, and even then, the phone call might not happen immediately. With this new feature, it’s now possible to track this aspect of B2B buying, one that is well understood but has been devilishly difficult to measure.

The Value of Volume

A fascinating story in the New York Times takes us inside a planning session between Facebook and a major consumer products marketer, Reckitt Benckiser (RB). RB wants to market its fish oil nutritional supplement via Facebook. After listening to Facebook executives pitch a large and broad marketing campaign, the RB marketing manager stops the conversation to note that he’s interested in using Facebook because of its targeting capabilities, not because of its massive reach. He suggests that perhaps RB would be better served by targeting known fish oil buyers as well as buyers of other products that are suggestive of the consumer having an interest in heart health – a key benefit of taking fish oil.

The representatives from Facebook, rather than embracing a targeted approach, pushed back. It wasn’t that Facebook couldn’t deliver this highly targeted audience – it could. The primary objection of the Facebook team was that a highly targeted marketing campaign like this would be “too expensive.” And keep in mind that RB is a major global marketer, with annual sales pushing $15 billion.

If you find it odd that Facebook is pushing marketers toward broad-based marketing campaigns, the underlying logic is even more intriguing. The Facebook “advertising strategist” in the meeting explains the company’s view by saying that advertising on Facebook is like firing a shotgun. My early education in the world of direct marketing taught me that the goal was to use information to move away from sloppy, imprecise shotgun marketing toward precise, rifle-like marketing. Perhaps this strategist simply butchered this well-known metaphor. But as you read deeper into the article, you start to see that Facebook really wants to sell big, volume campaigns with only minimal targeting.

You probably see where I am going with this. Facebook develops this truly massive and engaged audience. On top of this, it has unbelievably detailed and accurate information about this audience, and can target against these data. Yet when a large and sophisticated consumer marketer wants to take full advantage of this deep targeting capability, Facebook proposes a mass media solution instead. Sure, Facebook is happy to tweak this huge campaign once it has rolled out, but Facebook clearly wants you to start big and whittle your campaign down over time.

Why is Facebook seemingly disavowing the precision targeting that is ostensibly its greatest asset and differentiator? While the article doesn’t provide any numbers, the likely answer is this: money. You make more money selling volume than precision.

I saw this many years ago in the direct marketing world. Everyone there preached the gospel of targeting – rifle precision over shotgun sloppiness. I listened to endless presentations by clever people showing how you could use data to identify and reach the exact best prospects for your product. But when you tried to actually purchase these highly targeted names, you got the exact same tap dance now being performed by Facebook. The reality is that nobody with a ten million name database wanted to sell you just 500 of those names – your absolute best prospects. Direct marketing was a volume-based business: you made your money selling big lists, not precisely targeted lists (which, truth be told, were kind of a hassle to produce anyway). It was a business where you talked targeting and sold saturation.
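The economics behind that tap dance are easy to illustrate. The numbers below are entirely hypothetical, but they show why a list owner would rather sell the whole file at a modest rate than a tiny, precisely targeted slice at a steep premium.

```typescript
// Hypothetical numbers only: why a list owner prefers volume over precision.
// Selling the whole file at a modest price-per-thousand dwarfs the revenue
// from even a 10x premium on a tiny, precisely targeted segment.

const fullList = { names: 10_000_000, pricePerThousand: 100 }; // $100/M, whole file
const targeted = { names: 500, pricePerThousand: 1_000 };      // 10x premium, best prospects

const fullListRevenue = (fullList.names / 1000) * fullList.pricePerThousand; // $1,000,000
const targetedRevenue = (targeted.names / 1000) * targeted.pricePerThousand; // $500

console.log(`Full file:      $${fullListRevenue.toLocaleString()}`);
console.log(`Targeted slice: $${targetedRevenue.toLocaleString()}`);
```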

And lest you think I am pointing a finger only at media companies, be assured that marketers need to sign up for their fair share of the blame. For example, I see B2B media companies working so hard to deliver fresh, hot, highly qualified sales leads to marketers who in many cases won’t buy them unless the media company can guarantee a minimum volume each month. Quantity trumps quality – again.

Like it or not, there’s an important lesson here for those of us who run media and data companies. Precision and targeting are great – but only to a point. Marketing and sales are inherently sloppy and imprecise activities, and that’s why they depend on volume. And we must now acknowledge that in a digital world, mass marketing is cheap and can often yield powerful market research insights. That’s why, despite all the advances that have been made in increasing precision, there will be continuing value in volume.

Deals of Excellence

It’s been a banner few weeks for deal making for companies honored as Models of Excellence by InfoCommerce Group.

In the mega-deal category, we have 2006 honoree and real estate data powerhouse Zillow entering into a $3.5 billion deal to acquire arch-rival Trulia. This will of course put pressure on Move, Inc., which operates the Realtor.com website. The whole real estate vertical has been one to watch from a data perspective. Zillow was an early innovator in map-based user interfaces, and it blew more than a few minds by not only aggregating property data on almost every home in the country, but creating a home price estimate for every home as well. If this merger goes through, expect even more extreme innovation as these two giants battle it out for audience and advertising.

In the smaller (but hardly small) category, we have the $175 million acquisition of 2009 Model of Excellence honoree Bizo by 2004 Model of Excellence honoree LinkedIn. From a strategic standpoint, I’d rate this acquisition as nothing short of brilliant. At a high level, you are putting together “who” (the LinkedIn database) with “where” (the Bizo B2B ad network). The potential opportunities are endless.

And while we’re in the world of high finance, a shout-out to 2010 Model of Excellence honoree SmartZip also seems in order, as it has just closed a new $12 million financing round.

Where do these and other Models of Excellence companies meet each year to get their deals on? InfoCommerce Group’s DataContent gathering, now part of an even bigger show, the Business Information & Media Summit. See you in Miami!

Transforming Data the Humin Way

Imagine launching a start-up that is touted as a pioneering “social operating system,” a key player in the burgeoning area of “contextual computing,” and even a “digital butler.” Let’s go even further, and imagine the burden of having to live up to the goal of “organizing the world” and, most intriguing of all, building “a master contacts database for pretty much the entire world.” Well, if you can in fact imagine living up to expectations like this, you’ll probably want to apply for a job at a company called Humin.

On a more practical level, Humin (at least for now) is an app that grabs your contact list, calendar entries and social networks to build a master list. It then automatically contacts everyone on the list and asks them to confirm their details and provide additional information. Once all these data are confirmed and deduplicated, you get a contact list that can be searched by location, by connections (who knows who) and in a lot of other ways that go far beyond the typical address book.

To live up to its contextual computing hype, Humin wants to move into push mode. Fly into Cincinnati, for example, and it will present you with a list of your contacts there. Humin will of course get smarter as it begins to find deeper meaning in both the data itself and how you use it. Privacy concerns? Not to worry. Humin hangs onto only the minimum amount of data needed to do its magic – all the most valuable data stays right on your phone.
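For the data-minded, here’s a rough sketch of the kind of merge-and-query machinery an app like this might use. The structures and matching logic are my own simplifications, not Humin’s actual code.

```typescript
// Illustrative sketch (not Humin's actual code) of merging contacts pulled
// from multiple sources and then querying them by city, Humin-style.

interface Contact {
  name: string;
  email?: string;
  phone?: string;
  city?: string;
  source: "addressBook" | "calendar" | "social";
}

// Deduplicate by email or phone, merging fields contributed by each source.
function mergeContacts(raw: Contact[]): Contact[] {
  const byKey = new Map<string, Contact>();
  for (const c of raw) {
    const key = c.email ?? c.phone ?? c.name; // crude matching key
    const existing = byKey.get(key);
    byKey.set(key, existing ? { ...existing, ...c } : c);
  }
  return [...byKey.values()];
}

// The "fly into Cincinnati" query: surface everyone you know in a given city.
function contactsIn(city: string, contacts: Contact[]): Contact[] {
  return contacts.filter((c) => c.city?.toLowerCase() === city.toLowerCase());
}

const merged = mergeContacts([
  { name: "Ada Lovelace", email: "ada@example.com", city: "Cincinnati", source: "addressBook" },
  { name: "Ada Lovelace", email: "ada@example.com", phone: "555-0100", source: "social" },
]);
console.log(contactsIn("Cincinnati", merged)); // one merged record, with phone and city
```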

Those of you who are students of data may see shocking similarities to an earlier service called Plaxo. In its original incarnation, Plaxo grabbed your address book and would periodically query everyone in it via automated emails to confirm that their details were current. Even cooler, if you updated your own information, Plaxo pushed it out to all your contacts automatically. It was the original globally synchronized contact list. Ultimately, Plaxo went astray, jumping on the social media bandwagon in a failed attempt to challenge Facebook.

The lesson of Humin (beyond possible confirmation that all great data publishing ideas are derivative) is that while Humin may be loosely based on the Plaxo concept, it is moving aggressively to surround data with tools. Humin isn’t just organizing and tidying a giant pile of data and then asking the user to find value in it – it is innovating in multiple ways to do that thinking for the user, and to deliver the right data in the right format at the right time to offer maximum value. We at InfoCommerce Group call it “data that does stuff.” Surround good data with good tools, and you, too, can become master of the data publishing universe.

Deriving Data from Images

Google recently acquired a company called Skybox Imaging for $500 million. While admittedly a small deal by the standards of Google, the potential of this acquisition is so mind-boggling it is amazing it has not received greater attention. You see, Skybox makes satellites. And it’s just a year or so away from having six satellites orbiting the earth that will be able to photograph, in high resolution, every spot on Earth, and do it twice a day.

In many respects, the innovation here is speed of refresh, not resolution of the images. There is nothing particularly special about the optics used by Skybox. The excitement comes from the frequency of update. That’s because if you check on things frequently, you can see changes easily. And from those changes you can infer meaning.

Already, we know that satellite photos can be used to calculate the square footage of a building (based on the roof surface), a potential boon to roofing contractors, who can now do estimates from their desks. But you can start to see angles for data providers as well: the size of a company’s facility can let you infer a lot of valuable things about that company. And Skybox will potentially take this concept to the next level.
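That roof measurement is straightforward geometry. Here’s an illustrative sketch: assuming the building footprint has already been traced from an image and projected to flat coordinates in meters, the standard shoelace formula gives the area.

```typescript
// Sketch: estimate roof square footage from a building footprint traced on
// a satellite image. Assumes vertices are already projected to flat x/y
// coordinates in meters; applies the standard shoelace formula.

type Point = { x: number; y: number };

function polygonAreaSqMeters(vertices: Point[]): number {
  let sum = 0;
  for (let i = 0; i < vertices.length; i++) {
    const a = vertices[i];
    const b = vertices[(i + 1) % vertices.length]; // wrap around to close the polygon
    sum += a.x * b.y - b.x * a.y;
  }
  return Math.abs(sum) / 2;
}

const SQFT_PER_SQM = 10.7639;

// A hypothetical 20m x 15m rectangular roof.
const roof: Point[] = [
  { x: 0, y: 0 }, { x: 20, y: 0 }, { x: 20, y: 15 }, { x: 0, y: 15 },
];
console.log(`~${Math.round(polygonAreaSqMeters(roof) * SQFT_PER_SQM)} sq ft`); // ~3229 sq ft
```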

Overlay satellite photos with map data (something Google routinely does now), and you now know who owns the property you are looking at. Checking the number of cars in the parking lot twice a day could allow you to infer the number of employees. Over time, you could infer whether the company is growing or shrinking.
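A toy illustration of that last inference, with invented counts: fit a simple least-squares trend line to a series of parking-lot car counts, and the sign of the slope becomes the growth signal.

```typescript
// Toy illustration: infer whether a facility is growing or shrinking from a
// series of parking-lot car counts (the counts here are invented). Fits a
// simple least-squares line; the slope's sign is the signal.

function trendSlope(counts: number[]): number {
  const n = counts.length;
  const meanX = (n - 1) / 2; // observations are indexed 0..n-1
  const meanY = counts.reduce((s, y) => s + y, 0) / n;
  let num = 0;
  let den = 0;
  counts.forEach((y, x) => {
    num += (x - meanX) * (y - meanY);
    den += (x - meanX) ** 2;
  });
  return num / den; // cars gained (or lost) per observation
}

const dailyCarCounts = [112, 118, 115, 124, 129, 127, 135, 140];
const slope = trendSlope(dailyCarCounts);
console.log(slope > 0 ? "likely growing" : "likely shrinking", slope.toFixed(1));
```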

One hedge fund (allegedly) now uses satellite photography to check the number of cars in lots at big box retailers to infer sales. It’s suggested that Skybox can assess the quality and yield of crops while still in the ground, as well as the amount of oil being pumped around the world (by analyzing storage tanks). Consider construction data, where new home starts and completion rates could be accurately measured on a daily basis. Consider measuring the truck and rail traffic into manufacturing plants over time to assess financial conditions. Let your mind roam, because that’s what Skybox is all about.

And lest you think I am alone in this geeky view of things, consider this statement by Skybox co-founder Dan Berkenstock: “We think we are going to fundamentally change humanity’s understanding of the economic landscape on a daily basis.”

The key to all this magic is software that is smart enough to interpret photographic images. This is where images get turned into data. And once those data are overlaid on maps, giving them context and linking them to other data such as ownership, you quickly move to actionable data.
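That final overlay step is essentially a join. A simplified sketch, with hypothetical records: attach ownership from a parcel database to a metric extracted from imagery, and the observation becomes something an analyst can act on.

```typescript
// Sketch of the image-to-actionable-data step: join a metric extracted from
// imagery to a (hypothetical) parcel database to attach ownership.

interface Observation { parcelId: string; metric: string; value: number }
interface Parcel { parcelId: string; owner: string; address: string }

function enrich(obs: Observation[], parcels: Parcel[]) {
  const byId = new Map(parcels.map((p) => [p.parcelId, p] as const));
  return obs.map((o) => ({ ...o, owner: byId.get(o.parcelId)?.owner ?? "unknown" }));
}

const observations = [{ parcelId: "P-1042", metric: "carsInLot", value: 135 }];
const parcels = [{ parcelId: "P-1042", owner: "Acme Manufacturing", address: "1 Plant Rd" }];
console.log(enrich(observations, parcels));
// -> [{ parcelId: "P-1042", metric: "carsInLot", value: 135, owner: "Acme Manufacturing" }]
```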

I’m focused on commercial applications for Skybox. For those considering the consumer implications, privacy concerns abound. For the moment, it seems, we have to rely on Google not to be evil. And in the interim, there’s still a lot of work to be done to get this infrastructure fully in place, and to determine what can be measured as well as what is worth measuring. But as a potential new source of high-value business intelligence in structured form, Skybox is painting a very pretty picture of the future.
