
Categorical Denial

Just recently, industry visionary Esther Dyson published an open letter to Yahoo suggesting a new strategic direction. The article, titled "Release 0.9: What Should Yahoo! Do?", offers two separate ideas, but it's the first one that caught my eye.

Dyson suggests that Yahoo can truly distinguish itself from Google by returning to its roots and becoming a directory again. In this context, Dyson means classifying websites against a taxonomy. She likens a Google search to using a searchlight in a darkened room. It highlights specific things, but you have no idea what you're not seeing. A classification system for websites removes this constraint by grouping similar websites together, through some human or automated effort.

This immediately brought back memories of a 1996 article in Wired magazine that discussed Yahoo's categorization activities in fascinating detail. Although Yahoo co-founder Jerry Yang admitted that Yahoo was already falling behind in the job of categorizing the web (imagine a staff of 20 indexers trying to categorize every website), both Yang and the article's author seemed to agree that the real challenge lay in creating an ontology: a classification scheme for all human knowledge. Ultimately, Yahoo abandoned its focus on site classification in favor of an automated keyword index. Another company (starts with "G") came along with an indexing approach that produced more relevant search results, and the rest is history.

There are two important insights here for data publishers:

First, while many data publishers view Google as an overwhelming and unstoppable competitor, its one-size-fits-all offering is both its great strength and its great weakness. You search on Google's terms, and while it delivers a lot of value quickly, reliably, and for free, you never know what you are not seeing. There continues to be significant opportunity for those who can refine what Google does by improving discoverability, by expediting the research process, by returning more relevant results in a more convenient format, and by coupling content with tools. And for most of us, it's best not to confront Google broadly; better to pick a specific niche and mine it deeper and better.

Second, by doing away with human indexers in favor of automated keyword indexing, Yahoo was able to vastly expand the number of websites it indexed, but ultimately this high level of automation left it exposed to competitors with better technology. Had Yahoo continued to manually categorize websites (perhaps coupled with some level of automation and maybe a user-generated assist of some sort), I believe that today it would be an important alternative to Google, not an also-ran. Further, investment in people to categorize websites, while expensive, would have yielded not just significant competitive differentiation, but also significant competitive defense.

This game's not over; it's just beginning. And those who are smart about how to gather and organize information are on the winning team.

Model of Excellence Awards

We are pleased to announce that Panjiva Inc. is a finalist for an InfoCommerce 2009 Model of Excellence award.

Review Panjiva's Model of Excellence profile here. Hear Panjiva CEO Josh Green at DataContent 09: All Roads Lead to Data. Full program here.


What You Design is What You Get

I just finished looking through a fascinating presentation from online design guru Joshua Porter. While the presentation is focused on social media sites, I think there are some good and useful lessons here for data publishers as well.

Joshua kicks off the presentation with a challenging comment: the online behavior you are seeing is the online behavior you designed for (intentionally or not). Stated another way, your results can't be better than the tools you use to achieve them. It's hard enough to get visitor sign-ups, even for free products and services, so make sure you're not erecting inadvertent hurdles that discourage people from signing up. When I talk to publishers who are having trouble getting sign-ups for free trials, for example, there is usually an implicit "What's wrong with people?" hiding in the conversation. Come at the problem this way, and you're not likely to ask the more productive question: "Am I doing everything I can on my site to encourage sign-ups?"

And just how do you optimize your site to maximize sign-ups? Joshua's presentation provides some great, concrete ideas:

- Recognize that there are usually multiple user mindsets, and make sure your site provides for them -- one registration page does not necessarily fit all.

- Put the lengthiest and most onerous parts of your sign-up process at the back end, not the front. In other words, get users online fast and collect the rest of your sign-up information later.

- Do A/B tests of the copy on your site. In many respects, copy matters more now than ever, right down to the buttons that users click.

- The real work starts after the sign-up. True user engagement doesn't just happen.

There's lots more in this presentation, and it's well worth taking the time to spin through it. As with everything, the devil is in the details.
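
To make the A/B testing point concrete, here is a minimal sketch of how you might evaluate a two-variant test of button copy. It assumes you have already split traffic and counted sign-ups per variant; the labels and numbers below are made up purely for illustration, and it uses Python with the scipy library rather than any particular analytics tool.

```python
from scipy.stats import chi2_contingency  # assumes scipy is installed

# Hypothetical results from an A/B test of two button labels.
# The counts below are made up purely for illustration.
results = {
    "Sign Up": (48, 1000),      # (sign-ups, visitors shown this variant)
    "Get Started": (73, 1000),
}

# Build a 2x2 contingency table: [converted, did not convert] per variant.
table = [[conv, total - conv] for conv, total in results.values()]

chi2, p_value, _, _ = chi2_contingency(table)

for label, (conv, total) in results.items():
    print(f"{label:>12}: {conv / total:.1%} conversion")
print(f"p-value: {p_value:.4f} (below 0.05 suggests the difference is unlikely to be chance)")
```

However you run the numbers, the underlying discipline is the same: change one piece of copy at a time, and let the data, not taste, decide which version stays.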


The Need for Speed

Kudos to Dave Jung of the always-interesting B2Blog.com for bringing up a topic that might seem tired and even quaint in this broadband era: optimizing your site for fast loading.

We live in a world of nanosecond attention spans. That means the speed with which your web pages (particularly your home page) load can mean the difference between a visit and a bounce. As web page design tools get more and more sophisticated, so too do websites, often with a corresponding increase in load time. Yes, I admit there is value in having an attractive, engaging site, but if that result comes at the cost of speed, you're probably shooting yourself in the foot. And please keep in mind I am talking about basic web pages here, not the self-indulgent Flash monstrosities that take 30 seconds to load in order to deliver another 30 seconds of information-free graphics and sound. And since I mentioned sound, I'd also like to suggest that you carefully consider the value of loud music and audio sales pitches on a B2B site, particularly on a home page. B2B sites are often accessed in open spaces and during meetings. What are the trade-offs of potentially embarrassing your visitors the moment they hit your site?

I also see issues with ad networks that serve ads from their own servers. I've had more than a few home pages essentially freeze on me as the page desperately tries to download a display ad from an ad network server. Advertising is supposed to engage users, not frustrate them.

In short, rapidly increasing download speeds for more users haven't actually erased the issues relating to speed, because too many publishers have now constructed sites so elaborate that they negate the value of all that additional bandwidth. Yes, in an era of seemingly "no limits," there are still some lines that should not be crossed.

The solution? First and foremost, make sure you are logging into your own site regularly, and not just from your office, where you might not get a true picture of what the rest of the world is experiencing. Also, Dave helpfully suggested two tools, Page Speed from Google and YSlow from Yahoo, that both test site download speed and suggest ways to improve it.
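
Those tools do the heavy analysis, but even a few lines of scripting can give you a rough baseline for how quickly your pages come down the wire. Here is a minimal sketch in Python using the requests library; the URL is a placeholder, and it times only the base HTML, not images, scripts, or ad-network calls, so treat the numbers as a floor on what visitors actually experience.

```python
import time
import requests  # assumes the requests library is installed

URL = "https://www.example.com/"  # substitute your own home page

# Fetch the page a few times and time each request.
samples = []
for _ in range(5):
    start = time.perf_counter()
    response = requests.get(URL, timeout=30)
    elapsed = time.perf_counter() - start
    samples.append(elapsed)
    print(f"status {response.status_code}  {len(response.content):>8} bytes  {elapsed:.2f}s")

print(f"average: {sum(samples) / len(samples):.2f}s over {len(samples)} fetches")
```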


Roadmap to Success

It's now almost de rigueur for data publishers to offer some sort of mapping option within their online products. Nine times out of ten, this is a simple link to Google Maps based on the street address of the company whose listing information is being viewed. There's absolutely nothing wrong with this, provided that publishers don't fool themselves about how much added value this provides to users, which is ... not much. At best, you're saving the user a few clicks. That's because basic street maps, like so many things online, are widely available for free.

But there is more to mapping than basic street maps, and there are real opportunities for data publishers to take their mapping capabilities beyond an afterthought. What got me thinking about this was MapQuest's announcement of a new mapping product, complete with an API to make it easy to integrate into your own data product. What this premium product offers is several location-based datasets. Suddenly, mapping is more than where you are. It's now what's around you.

Consider the possibilities. With this MapQuest product, you can overlay Census demographics, Congressional district boundaries, the name, SIC and location of over 13 million businesses, the location of every U.S. public school, areas of traffic congestion and accidents, and a whole lot more. While every data product serves a different market and user need, I think the concept is clear: it's becoming a lot easier to integrate your data with third-party data to develop very powerful and very valuable mapping applications. Don't underestimate the power of data visualization to set your data product apart.

Of course, the next logical step is to allow your users to upload their own data, so their stores, sales offices, factories, or whatever else can be plotted alongside your data as well as third-party data. There are lots of different angles here, but they all start with a firm understanding of what kinds of data and applications your customers need.
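
As a rough illustration of what such an overlay might look like in practice, here is a minimal sketch that plots a publisher's listings alongside a customer-uploaded file of locations. It deliberately does not use MapQuest's API, whose details I won't presume here; instead it uses the freely available folium library in Python, and the file and column names are hypothetical.

```python
import folium      # assumes the folium mapping library is installed
import pandas as pd

# Hypothetical input files and column names, purely for illustration:
# your published listings plus a customer-uploaded file of their locations.
listings = pd.read_csv("listings.csv")        # columns: company, latitude, longitude
customer_sites = pd.read_csv("customer.csv")  # columns: site, latitude, longitude

# Center the map roughly on the published listings.
m = folium.Map(
    location=[listings["latitude"].mean(), listings["longitude"].mean()],
    zoom_start=10,
)

# One toggleable layer for your data...
listings_layer = folium.FeatureGroup(name="Published listings")
for row in listings.itertuples():
    folium.Marker([row.latitude, row.longitude], popup=row.company).add_to(listings_layer)
listings_layer.add_to(m)

# ...and one for the customer's own locations, in a contrasting color.
customer_layer = folium.FeatureGroup(name="Customer locations")
for row in customer_sites.itertuples():
    folium.Marker(
        [row.latitude, row.longitude],
        popup=row.site,
        icon=folium.Icon(color="red"),
    ).add_to(customer_layer)
customer_layer.add_to(m)

folium.LayerControl().add_to(m)
m.save("overlay_map.html")  # open the resulting file in a browser
```

The same layering pattern extends naturally to third-party datasets such as demographics or school locations, whatever your customers actually need to see around their own points on the map.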

Finally, because one of the beauties of the data business is that it's always a two-way street, you might want to consider whether your own location-based data is something MapQuest would want to license from you.

Right now, sophisticated mapping applications are the road less traveled by most data publishers, but for many, now's the time to chart a course to greater profits.


Data Recovery

You probably know about the big initiative in Washington to put more government data online, more quickly and more accessibly, in order to promote government transparency. A much-touted early pilot for this new initiative is Recovery.gov, designed to provide total transparency on the spending of $787 billion in federal stimulus money.

I'll admit right away that the Recovery.gov website is head and shoulders above the other federal agency websites I have used, in terms of interface, navigation, and ease of use. The problem with Recovery.gov, however, is that, according to an article in the Philadelphia Inquirer, it is woefully behind in reporting contract awards.

If you're frustrated that you can't get access to these contract awards, don't be. You simply need to go to a different site, Recovery.org. This site offers vastly more contract award data under the stimulus program, but it's not a government website. In fact, it's operated by 2008 InfoCommerce Model of Excellence award winner Onvia.

Onvia, which reports contract awards at the federal, state and local levels, has spent years developing data feeds from government agencies, supplemented by its own compilation efforts. It has also developed sophisticated software to normalize and process all these disparate data feeds. Result: Onvia can post contract award data faster than the U.S. Government, which is writing all the checks.

Onvia isn't entirely selfless in its launch of Recovery.org, which is free to use. The site also shows users how much additional contract information Onvia can make available on a paid subscription basis. In short, Recovery.org is a clever and powerful sales promotion tool.

Is the government committed to improving its Recovery.gov site? You bet. In fact, it's just let an $18 million contract to that end. Did this contract end up in the hands of Onvia, which not only has demonstrated its skills, but controls high-value data feeds to speed receipt of contract data? What do you think? According to eWeek.com, the contract went to a company called Smartronix, which is going to build all this functionality from scratch. A quick peek at the company's own website tells me it fairly earns the moniker "Beltway bandit," with its long list of government clients, and no evidence it has ever built a B2B website before, much less one designed primarily for consumer use.

Sometimes transparency just confirms what you could have guessed.
