
Memory is the process by which information is encoded, stored, and retrieved. Memory, both biological and physical, is fundamental to life: we tend to record every significant event, and to duplicate or multiply data when it is crucial. In business, beyond the daily transactional data, companies want to store as much customer detail as possible and use it to understand customers and their behaviours, and to find patterns that might improve the business. Now, with devices and apps that track almost every movement of anything, the demand for data storage is definitely on the rise.


When we look back at the history of data storage, we realise it is a story of shrinking the physical size of storage while increasing capacity and performance at the same time. The history goes back to 1837, when Charles Babbage first proposed the Analytical Engine, the first computer design to use punch cards as memory. In the 1930s Gustav Tauschek developed drum memory, and data was stored on magnetic tapes and magnetic drums. Cathode ray tubes and Selectron tubes followed in the 1940s, and floppy disks in the 1970s. In 1970 Intel released its first commercially available DRAM (dynamic random-access memory) chip, the 1103, capable of storing 1,024 bits (1 Kb). CDs and DVDs arrived in the 1980s and 1990s, and in the early 2000s removable media such as USB flash drives and SD cards came into everyday use.

[Image: the evolution of memory storage, from hard drives onwards]

Today we have DDR SDRAM, solid-state drives (SSDs) and hybrid memory devices that pack more storage into a given space and process data faster in a given time. Data storage has also moved from on-premises hardware to the cloud, where storage is offered as a service. All of these storage technologies have helped businesses, and people in general, manage their daily workloads and data.

As the demand for storage performance and capacity expands every day, the wave of new applications and data flowing from systems of engagement creates a data management challenge. Transforming a data centre can therefore transform a business, and storage is key to accelerating the whole process, including databases and analytics, business applications and mixed workloads. The improvement rests on one idea: store more data in the smallest possible footprint at the highest possible performance. Even the advent of in-memory technology such as SAP HANA can be traced to the evolution of fast storage devices. SAP HANA is an in-memory, relational database management system designed to handle both high transaction volumes and complex queries on the same platform. It was developed by SAP in collaboration with the Hasso Plattner Institute and Stanford University.

Many big names in the market are competing to bring out new products that improve data storage. EMC promises true agility for business workloads and processes with XtremIO, a purpose-built, scale-out, all-flash array powered by Intel processors; with it, companies can react and respond to workload demands in ways that used to be impossible. Ori Bauer, Director of WW Development at the IBM Systems Israel Development Centre, argues that IBM has the best answer to the demand for smaller, faster storage: IBM Real-Time Compression, which compresses data in-line and can store up to five times more data in the same physical space, and which can be used for enterprise and cloud workloads. In addition, Violin Memory has changed the game with its Violin flash storage platform, on which enterprises can run their primary storage, and their entire business, in flash; its vertically integrated design of software, firmware and hardware enables the transition of primary storage from legacy solutions to all-flash.
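IBM Real-Time Compression's internals are proprietary, but the general idea of compressing blocks in-line, before they are written to disk, can be sketched with Python's standard zlib module. This is only a stand-in for illustration; the real product uses its own algorithm, and the achievable ratio depends entirely on how repetitive the data is.

```python
import zlib

def compress_inline(block: bytes, level: int = 6) -> bytes:
    """Compress a data block in-line, before it reaches storage."""
    return zlib.compress(block, level)

# Repetitive data (e.g. database pages sharing common field patterns)
# compresses well; a 5:1 ratio is plausible for such data, but results
# vary widely in practice.
block = b"customer_record;" * 4096
compressed = compress_inline(block)
ratio = len(block) / len(compressed)
print(f"{len(block)} bytes -> {len(compressed)} bytes ({ratio:.1f}:1)")
```

Reading the data back simply reverses the step (`zlib.decompress`), which is why in-line compression is transparent to the application above it.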

Let's consider a business case where flash memory could improve a company's testing process. SAP Innovation Discovery lists about 300 applications (which do not require additional licenses) that a company could implement. Testing these innovative applications would be a tedious and time-consuming process for any company. Flash memory's fast processing capacity could ease it: production data could be copied to multiple testing platforms in parallel and in less time, enabling parallel automated testing of the innovative applications across those platforms and thus reducing the turnaround time for implementing them. Multiple test environments would even enable faster rollout of new software patches.
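As a rough sketch of this idea (not a real SAP landscape refresh; the directory names and sample data below are made up), the copies to each test environment can be kicked off concurrently, since on fast flash storage the work is I/O-bound:

```python
import shutil
import tempfile
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def clone_production(source: Path, targets: list[Path]) -> None:
    """Copy one production dataset into several test environments in
    parallel; concurrent copies shorten the overall refresh window."""
    with ThreadPoolExecutor(max_workers=len(targets)) as pool:
        futures = [pool.submit(shutil.copytree, source, t, dirs_exist_ok=True)
                   for t in targets]
        for f in futures:
            f.result()  # surface any copy errors

# Demo with throwaway directories; a real landscape would refresh from
# storage-level snapshots rather than plain file copies.
base = Path(tempfile.mkdtemp())
prod = base / "prod_data"
prod.mkdir()
(prod / "orders.csv").write_text("order_id,amount\n1,99.50\n")

targets = [base / f"test_env_{i}" for i in range(1, 4)]
clone_production(prod, targets)
```

Each test environment then holds an identical copy of the production data, so the automated test suites can run against all of them at once.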

[Image: SAP Innovation Discovery]

In future, computer memory may reside in holographic technology. Holographic data storage is an emerging high-capacity technology in an area currently dominated by optical and magnetic storage devices. Those devices rely on individual bits being stored as distinct magnetic or optical changes on the surface of the recording medium. Holographic storage, by contrast, can record multiple images in the same area by using light at different angles, and it records information throughout the volume of the medium. In addition, optical and magnetic storage record information in a linear fashion, one bit at a time, whereas holographic storage can record and read millions of bits in parallel, enabling data transfer rates greater than traditional optical storage. There are still concerns about its storage capacity, durability and sensitivity.
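A back-of-envelope calculation shows why page-parallel readout matters. The numbers below are illustrative assumptions only, not measurements of any real device:

```python
# Assumed figures for a serial (bit-at-a-time) reader vs. a holographic
# reader that recovers a whole "page" of bits in one exposure.
bit_time_ns = 1          # assumed time to read one bit serially
page_bits = 1_000_000    # assumed bits recovered per holographic page
page_time_ns = 1_000     # assumed time to read out one whole page

serial_rate = 1 / (bit_time_ns * 1e-9)             # bits per second
parallel_rate = page_bits / (page_time_ns * 1e-9)  # bits per second

print(f"serial:   {serial_rate / 1e9:.0f} Gbit/s")
print(f"parallel: {parallel_rate / 1e9:.0f} Gbit/s")
```

Under these made-up figures, reading a million bits per exposure yields a thousandfold throughput gain even though each exposure takes far longer than a single bit read, which is the core appeal of parallel readout.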


Holographic data storage might be the next big thing. With more research and technological advancement, it might replace traditional optical and magnetic storage devices, and perhaps future in-memory technology will reside on holographic memory devices. Finally, we would like to thank our BCO6181 lecturer, Tony De Thomasis, who shared his valuable knowledge throughout the semester at Victoria University and provided innovative thoughts and inputs for this blog.

A special mention also goes to the guest speakers for the BCO6181 class, Alfonzo Venturi, Judy Cole and Leanne O’Connor, and to SAP Mentors Matthias Steiner, Graham Robinson and Paul Hawking. Thanks for spending your valuable time with us and sharing your SAP knowledge and experiences.

Thank you, visitors, for taking the time to read our blog. Please post your comments 🙂

-Krishna Mattaparti
Araz Albeg
Vamsi Krishna Bathula
Radhika Tammireddy
Khalid Jameel
Ravi Kumar

8 Comments

  1. Tammy Powlas

    Nice work and nice blog

    I wasn’t aware of holographic data storage or flash memory; these are nice use cases

    I look forward to seeing future contributions from you here on SCN

    Tammy

    1. Krishna Mattaparti Post author

      Thanks for the comment Tammy 🙂 . Tony enlightened us with these concepts.

      I would have more contributions on SCN in future 🙂

      Kind regards,

      Krishna

  2. Eduardo Sato

    Congrats Krishna. Nice blog.

    Maybe you would consider transfer rates/bandwidth as future subject.

    I guess this could help people understand why increasing storage IO is totally different from processing things in DRAM directly. Best regards,
