RETAIL FUTURES: The Last Detail
The Last Detail
Playing Navy Signalman First Class Billy “Badass” Buddusky in Hal Ashby’s The Last Detail, Jack Nicholson won the Best Actor award at the 1974 Cannes Film Festival. I first saw this film while in college, at the Electric Cinema on Portobello Road, London, as part of an all-night screening of “Films That Made Jack Nicholson Famous”1.
In those days, when not slumped in the back row of the cinema transfixed by the silver screen, I was trying to master Computer Programming 101, a required course for my Chem Eng degree. For the nerds of that time, computing was all batch processing with 80-column punch cards. Interactive computing, or timesharing as it would initially be called, was still a few years away. Real-time computing did exist, but in our world it was reserved for direct digital control of chemical plants and processes (and the occasional game of Colossal Cave).
Retail is Detail
Fast forward 10 years2… One of my first jobs as a budding consultant was to code allocation and replenishment routines for a retailer, to automate what they had been doing manually (not spreadsheets but pencil and paper!). Fortunately, we had the services of a wise and wizened retail consultant who had been parachuted in from the US and who taught us all we needed to know about exponential moving averages and Poisson distributions. More importantly, he told us that we needed to perform our retail calculations efficiently – you see, he warned, “retail is detail”. We were dealing with 168 stores, 2,000 SKUs and 52 weeks of history, which generated over 40 million data points. With an overnight batch window (from store closing to store opening) of 6 hours, and executing RPG code on a System/38, our allocation and replenishment routines needed to run fast. (Another retail project we had at that time, and which our US consultant helped us on, was to build a Merchandise Planning application (top-down, bottom-up) on a PC using spreadsheets – we usually went to get a cup of tea – sorry, no Starbucks then – when we hit F9 to recalculate.)
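For readers who never met these techniques, here is a minimal sketch of the kind of arithmetic those routines performed – not our actual RPG code, and the alpha and service-level values are illustrative assumptions, not the ones we used:

```python
import math

def ema_forecast(weekly_sales, alpha=0.2):
    """Exponentially weighted moving average over a SKU's weekly sales history.

    Recent weeks count for more: each step blends the new observation
    with the running forecast using smoothing factor alpha.
    """
    forecast = weekly_sales[0]
    for x in weekly_sales[1:]:
        forecast = alpha * x + (1 - alpha) * forecast
    return forecast

def reorder_point(mean_demand, service_level=0.95):
    """Smallest stock level k whose Poisson CDF reaches the target service level.

    Treating weekly demand as Poisson with the forecast as its mean, this is
    the stock needed to cover demand with the given probability.
    """
    k = 0
    term = math.exp(-mean_demand)  # P(demand = 0)
    cdf = term
    while cdf < service_level:
        k += 1
        term *= mean_demand / k    # P(demand = k) from P(demand = k-1)
        cdf += term
    return k
```

Run once per store/SKU combination – 168 × 2,000 of them in our case – and the need for speed inside a fixed batch window becomes obvious.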
Today, retailers have thousands of stores, tens of thousands of SKUs and stores that stay open 24 hours a day, 365 days a year. There are also more channels, customers, competitors, e-mails, tweets, RFID, the Internet of Things (objects attached to the Internet) – the list goes on – all generating even more data. This adds up to a mountain of data with dual challenges – it is harder to find what you want and it takes longer to process it. As Eric Schmidt, CEO of Google, said at the recent Techonomy conference: “There were 5 exabytes of information created between the dawn of civilization through 2003, but that much information is now created every 2 days, and the pace is increasing”.
Of course, thanks to Moore’s Law we have had, and will have in the future (at least till 2015, the experts say, when we will reach molecular limits), greater and greater computing capacity. However, as CPU speeds and memory capacities have increased, other aspects of computing performance such as disk access speeds have failed to keep up. As a result of these disparities in speed, access latencies are more and more often the bottleneck in system performance (and not the efficiency of code… )3.
While processing in memory has been around since the 1990s, it has only become commercially viable in the last few years. The recent introduction of SAP’s HANA (High Performance Analytic Appliance) means that Real Real-Time Computing is Here! Although SAP HANA is initially positioned as an analytics engine, the roadmap calls for an extension of in-memory computing to on-line transaction processing. This will present enormous opportunities for retailers to leverage the next generation of computing power. Of particular relevance to retail would be applications that involve:
- Optimizing large, multi-dimensional matrices such as Merchandise and Assortment Planning, Allocation and Replenishment
- Scanning large numbers of rows (records) or objects (media – image, audio, movie) for forecasting, purchase behavior, customer segmentation, loss prevention, etc.
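One reason in-memory engines scan so quickly is column-oriented storage: each attribute lives in its own contiguous array, so an aggregate touches only the bytes it needs instead of reading whole rows off disk. A toy illustration of the idea – a hypothetical sales table, not HANA’s actual storage format:

```python
from array import array

# Hypothetical sales table stored column-wise:
# one contiguous, typed array per attribute.
store_id = array('i', [1, 1, 2, 2, 3])
sku_id   = array('i', [10, 11, 10, 12, 11])
units    = array('i', [5, 3, 7, 2, 4])

def units_sold(sku):
    """Total units sold for one SKU.

    The scan reads only the sku_id and units columns; the store_id
    column (and any other attributes) are never touched.
    """
    return sum(u for s, u in zip(sku_id, units) if s == sku)
```

In a row store, the same query would drag every attribute of every record through memory; at billions of rows, skipping the unneeded columns is much of the speed-up.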
(I am tempted to go back to being a programmer to see just really how fast is fast.)
The Devil is in the Details
So, show me the money (or, if you are so inclined, where’s the beef?). Does it really matter that we can process 460 billion records in 0.04 seconds rather than 20 minutes?
In 1990, amidst the business process reengineering frenzy of the early days of ERP, George Stalk, Jr., of the Boston Consulting Group published the seminal work Competing Against Time: How Time-based Strategies Deliver Superior Performance and gave birth to the concept of Time-Based Competition. Time is a resource, and a firm that makes better use of time (in responding to changing market situations and other environmental conditions) acquires a competitive advantage. One of his prescient observations (note the size of Walmart at that time, and that it was not the largest retailer but number three) was:
“Wal-Mart is one of the fastest growing retailers in the United States. Its stores move nearly $20 billion of merchandise a year. Only K Mart and floundering giant, Sears, are larger. Wal-Mart’s success is due to many factors not least of which is responsiveness. “
The main premise of Stalk’s book is that costs do not increase when lead times are reduced; they decline, and costs do not increase with greater investment in quality; they decrease. For retailers this is especially true today, as we enter the ever more interconnected world of mobility and multi-channel retailing, where the online shopper’s need for speed can mean lost sales in a matter of seconds.
With some trepidation, we’ll leave the last word to “Badass” Buddusky: “***** ******! That’s what I call quick.”
- The other films were Easy Rider, Five Easy Pieces, Carnal Knowledge, Chinatown and One Flew Over the Cuckoo’s Nest.
- How you go from chemical engineering to retailing in 10 years is another story – stick with this blog and you may find out.
- Unfortunately, according to Gates’ Law, the speed of software generally halves every 18 months, thereby negating all the benefits of Moore’s Law. This can occur for a variety of reasons, including “featuritis” and bloatware.