
Panel with Admiral Poindexter on the Future of Analytics

Predictive Analytics

Last Thursday I was on a panel with Admiral Poindexter on the Future of Analytics, by invitation of TiE in North Carolina. TiE is the world’s largest non-profit technology entrepreneurship association.


Manny Aparicio, founder and president of Saffron Technologies, opened the panel as moderator, stating that data analytics is undergoing a huge transformation. Manny said that not only is there a growing demand for analytics to extract more value from data, but analytics must also be easier, faster and more accurate than ever before.

In his opening statement Dr. Poindexter pointed out that the government, much like industry, lacks the capability to bring data from different databases together and make quantitative statements about future risks. Machine learning technologies like associative memory systems could offer a unified view of disparate data from different databases and text sources.

So far, analytics has been dominated by rear-view analysis of corporate or intelligence data; sense-making and predictive capabilities have rarely been deployed.

As Manny had asked me to bring in a Silicon Valley point of view, I decided to make my case for the consumerization of machine learning by talking about two of the companies that have received the most press coverage in Silicon Valley in recent weeks: Groupon and Facebook.

According to Fortune, Groupon is the company that has reached $1B in revenue faster than any other. At about two years old, its market cap was $1.35B in April 2010; it turned down an offer from Google of some $6B, and today its market cap is estimated at $15B. Why? Groupon bridges the gap between seller and buyer. The seller knows all about his product, e.g. when it can be produced at the lowest price, what the detailed specifications are, etc. The buyer, on the other hand, has little knowledge about the product.

Understanding the buyers’ needs and desires, Groupon offers coupons that have to be purchased upfront and finds a good local or global producer that meets the customers’ demand. Groupon’s ads, witty and captivating, save buyers the time of reading through product specifications; they are fun to read, too. If enough coupons are bought for the product of the day, the seller has to produce and deliver it at the promised discounted price; a classic win-win.

Groupon claims to have saved its online customers $1B. In my opinion, this is the first real business model of the internet economy. Groupon offers global and local producers upfront buyer buy-in; the traditional coupon model lacks this risk mitigation for producers. Analytics with strong predictive capabilities are core to Groupon’s ability to preselect the right stores and producers and match them to buyer demand. Of course, human beings make the final decision about which product to promote.

The consumerization of AI techniques like machine learning and associative memory systems puts analytics at our fingertips, supporting information workers from salespeople to executives, whether deployed as associative memory systems, as in-memory databases, or on mobile devices.

Predictive analytics are essential for the productivity of enterprises. Making sense of events by connecting seemingly unconnected information, like

  • Warren Buffett’s failed investment in Lehman with

  • the reluctance of some European banks to trade Lehman credit swaps with

  • the near collapse of Bear Stearns, which caused all the short sellers to turn their attention to Lehman and triggered a run on Lehman’s stock, with

  • CEO Fuld trying to put a deal together to merge Lehman and Barclays, etc.

could have predicted the bankruptcy of Lehman ahead of time. A certain European bank needed three weeks to formulate the right query to answer the question, “how many and which of our customers had an exposure bigger than X hundred thousand dollars to Lehman?” This European bank realized the imminent Lehman bankruptcy only on September 13, 2008, when Timothy F. Geithner, then president of the Federal Reserve Bank of New York, called a meeting on the future of Lehman, two days before the actual bankruptcy filing. If this European bank had had predictive analytics, it would have known about its customers’ exposure before that famous Friday and could have taken appropriate action.
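To illustrate the point: once counterparty positions from disparate systems are consolidated into one place, the question that took three weeks to formulate becomes a one-line query. This is a minimal sketch with an in-memory SQLite table; the table layout, customer names and amounts are purely illustrative assumptions, not the bank’s actual data model.

```python
import sqlite3

# Hypothetical consolidated exposure table; in reality this data
# sat scattered across many databases, which was the whole problem.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE exposures (
        customer     TEXT,
        counterparty TEXT,
        exposure_usd REAL
    )
""")
conn.executemany(
    "INSERT INTO exposures VALUES (?, ?, ?)",
    [
        ("Fund A", "Lehman",   850_000),
        ("Fund B", "Lehman",   120_000),
        ("Fund C", "Barclays", 900_000),
        ("Fund D", "Lehman", 2_400_000),
    ],
)

THRESHOLD = 500_000  # the "X hundred thousand dollars" of the question

# "How many and which of our customers had an exposure
#  bigger than X to Lehman?" -- trivial once the data is unified.
rows = conn.execute(
    """
    SELECT customer, exposure_usd
    FROM exposures
    WHERE counterparty = 'Lehman' AND exposure_usd > ?
    ORDER BY exposure_usd DESC
    """,
    (THRESHOLD,),
).fetchall()

print(len(rows), "customers over threshold:", rows)
```

The query itself is trivial; the three weeks went into locating and joining the data, which is exactly the integration gap Dr. Poindexter described.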

SAP and Saffron Technologies showed a prototype at SAP’s SAPPHIRE 2010 that could have predicted the imminent bankruptcy of Lehman well ahead of time using only public documents. These documents were ingested by SAP’s ThingFinder and analyzed and triaged by Saffron’s memory base.

An associative memory base can be used for sense-making because it predicts data trends, connects entities, and ranks them not just at the document level (like Google) but at the sentence level. The Saffron memory base is like an RDF triple store, but it is schema-less and much faster.
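The core co-occurrence idea can be sketched in a few lines: count how often entity pairs appear in the same sentence, then rank the entities most strongly associated with a query entity. This is only a toy illustration of the principle, assuming entities are already extracted; a real system like Saffron’s does far more (schema evolution, triage, trend prediction), and the corpus here is made up.

```python
from collections import Counter
from itertools import combinations

# Toy corpus: each inner list holds the entities found in one sentence.
# A real pipeline would extract these with named-entity recognition.
sentences = [
    ["Lehman", "Barclays", "Fuld"],
    ["Lehman", "Bear Stearns"],
    ["Lehman", "Barclays"],
    ["Geithner", "Lehman"],
    ["Barclays", "Geithner"],
]

# Sentence-level co-occurrence counts: the "memory".
memory = Counter()
for ents in sentences:
    for a, b in combinations(sorted(set(ents)), 2):
        memory[(a, b)] += 1

def associates(entity, top=3):
    """Rank the entities most strongly associated with `entity`."""
    scores = Counter()
    for (a, b), n in memory.items():
        if a == entity:
            scores[b] += n
        elif b == entity:
            scores[a] += n
    return scores.most_common(top)

print(associates("Lehman"))
```

Because counting happens per sentence rather than per document, “Barclays” ranks highest for “Lehman” here; a whole-document approach would blur such distinctions.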

I feel honored to have been on this great panel with the admirable Admiral and fellow physicist.


4 Comments
  • It sounds like a great event to participate in. Very exciting.

    I’ve got to say though, things like the Lehman Brothers example you give here consistently pop up when vendors want to show how great their analytics solution is. In reality, this kind of post-hoc analytics demonstration doesn’t show anything except maybe how to use the tool. In order for predictive analytics on a historical data pool to be a convincing demonstration, the predictive model must be applied to a large(ish) pool of data and supply an analysis with small enough rates of false positives and false negatives to be useful. Even this doesn’t truly demonstrate predictive efficacy (for that you need to actually predict stuff, so we know that the pool of data wasn’t cherry-picked), but at least it shows that the potential is there.

    When analytics vendors make these kinds of misleading demonstrations, it makes me wonder if the vendor’s system doesn’t really work that well after all.

    • You are right; there are no truly predictive algorithms. We cannot predict the future. The algorithms give hints and show which entities (persons, locations, organizations, etc.) are logically closer to each other by analyzing co-occurrence. A human being will always make the final decision. I mention that in the blog with respect to the Groupon example. “Predictive analytics” is a marketing term used by everyone.

      The point is that “predictive” algorithms are a big leap forward compared to a “flat” Google search, which can only connect keywords to a URL. Machine learning algorithms like associative memories can connect many entities in one document and triage between them, i.e. they can look into the document and analyze at the sentence level, understanding predicate, subject and object (like RDF). Further, associative memories can evolve their schemas over time and thus understand the transition from a customer to a lost customer, from a sleeper to a terrorist, and from a top bank to a bank in trouble.

      The data pool would not be a problem for associative memory systems; they scale very well nowadays.

      • I think we are in strong agreement 🙂

        I agree with you that there are a lot of new techniques that are very applicable to analytics. It’s just the examples that drive me nuts. In my opinion, a vendor (and we’re not talking about SAP here, thankfully) giving a single example of the system “getting it right” gives us almost no information about how good the system actually is.