An article in the current issue of Wired discussing a new product from Dow Jones called Lexicon offers up this irresistible line:

"But many of the professional investors subscribing to Lexicon aren't human -- they're algorithms."

Okay, algorithms don't actually call and order up fresh, hot data for delivery like pizza ... but the people in charge of those algorithms do, and that's the real point.

Let me step back and explain Lexicon. It's an XML feed of breaking news stories with an intriguing twist: Lexicon pre-processes each news story to add fielded sentiment analysis, expressed quantitatively. In other words, the tone of the article is reduced to a number. That means that the customers of Lexicon -- institutional traders for the most part -- can more easily feed news content into the computerized models they use to drive stock trading. Imagine, for example, a story about copper prices with a strongly negative numeric value associated with it. Traders feed that into their software, which is likely also watching real-time copper prices and who knows what else, and the software formulates a buy/sell decision.
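
To make the mechanics concrete, here is a minimal sketch of how a trading system might consume such a feed. The XML structure, field names and the -1 to +1 sentiment scale here are hypothetical illustrations of the general idea, not Dow Jones's actual Lexicon schema, and the decision rule is deliberately simplistic.

```python
import xml.etree.ElementTree as ET

# Hypothetical pre-processed news items; tag names and the -1.0..1.0
# sentiment scale are illustrative only, not Lexicon's real format.
SAMPLE_FEED = """
<newsfeed>
  <story id="1">
    <headline>Copper inventories surge as demand forecasts cut</headline>
    <topic>copper</topic>
    <sentiment>-0.72</sentiment>
  </story>
  <story id="2">
    <headline>Mining strike resolved ahead of schedule</headline>
    <topic>copper</topic>
    <sentiment>0.35</sentiment>
  </story>
</newsfeed>
"""

def decide(sentiment: float, buy_at: float = 0.5, sell_at: float = -0.5) -> str:
    """Toy rule: strongly negative tone -> sell, strongly positive -> buy."""
    if sentiment <= sell_at:
        return "SELL"
    if sentiment >= buy_at:
        return "BUY"
    return "HOLD"

root = ET.fromstring(SAMPLE_FEED)
for story in root.findall("story"):
    topic = story.findtext("topic")
    score = float(story.findtext("sentiment"))
    print(f"{topic}: sentiment={score:+.2f} -> {decide(score)}")
```

A real model would of course combine that score with live prices and many other signals, but the point is that the fielded number lets the news flow straight into code.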

There are two elements to this product that really get me excited. First, we have a perfect example of how publishers, by pre-processing data to impute, infer or summarize, can add tremendous value to their content. Second, we have a wonderful example of the blurring lines between content formats. News and data used to live in distinct worlds, and Lexicon illustrates how they are coming together, by analyzing the news and assigning a structured numerical summary to it. Just as importantly, Lexicon makes the news more amenable to machine processing, and that's at the heart of the value proposition for data products.

Lexicon stands as a great illustration of the increasingly rapid evolution of data-text integration, an InfoCommerce Group "mega-trend" we've been advancing since 2001 (better a little early than a little late!).