When it comes to information, is faster always better? As information users, we all want the freshest possible information on which to base our decisions. But, as many data publishers have learned the hard way, while everyone wants up-to-the-second accuracy and currency in their data, not everyone is willing to pay for it. Indeed, we’ve noted with concern the growing trend towards “good enough data,” where users are willing to sacrifice some amount of accuracy and currency in exchange for a significantly reduced price. So, on a practical basis, a data publisher could be excused for concluding that the most accurate and current data shouldn’t be a top priority.

Things, however, are a bit more complicated than that. The speed of information updates does matter, a lot, in specific applications and markets, and people in those markets will happily pay a stiff premium to get hold of such data. The obvious place to look for proof of this is the world of finance. If you have information that can move the price of a stock or the entire market, speed matters. Consider that Thomson Reuters used to charge a premium to those who wanted access to an important consumer sentiment survey just two seconds before everyone else.

There are more mundane examples of this in non-financial markets. Consider sales leads. While every second may not matter when it comes to sales leads, there is added value in delivering them quickly, particularly if they are based on a real-time assessment of a prospect’s online browsing patterns.

Given all this, it would seem that a new service called Now-Cast Data has a winner on its hands. That’s because the company, run by economists, is preparing to offer a real-time economic forecasting service. Real-time delivery is actually something of a breakthrough in the world of economic forecasting, which is accustomed to monthly, quarterly and even annual reporting. Clearly, by accelerating forecasting, the financial types will gain an information advantage for which they will pay handsomely. Or so it would seem.

But as an article in the Wall Street Journal notes, Now-Cast Data has some convincing to do. The core issue is that while Now-Cast Data is certainly accelerating forecasting, at the end of the day, it is still offering forecasts. It can’t be sure what will or won’t happen, or whether specific events (e.g., inflation) will persist. As an economist in the article notes, “When a big outside event disrupts the economy, those are hard things to forecast. By definition you can’t build them into your forecasting model because they haven’t happened yet.” In short, we’re guessing faster, but we’re still guessing.

So where do I come out on speed? Is faster data always better? At least for now, I don’t think it is. Right now, it’s only really valuable in a specific, limited set of applications. Keep in mind too that we’re already drowning most of our customers in data. Getting the fire hose to pump faster just makes things more unmanageable for them. Speed is also a relative concept. If a company changes its address, that’s a valuable, time-sensitive piece of actionable information. But if you already pass that information to your customers the same day you learn about it – say 8 hours at most – accelerating that to 8 minutes won’t improve either your customers’ sales results or your bottom line. As data publishers, we should be continually looking for ways to obtain and move information faster, but speed is something that’s ultimately defined by your customers and your competitors.