The Big Data Jolt
Few would argue that the age of Big Data has arrived. The promise of previously unavailable insight and analysis is driving Big Data’s proliferation, and with good reason. When was the last time senior decision-makers from across a utility’s business units – from finance and marketing to product development and sales, all the way up to the C-suite – were so universally eager to leverage what has been, until recent years, an intangible resource?
The energy industry is among those that have embraced Big Data with gusto – and why not? Virtually every aspect of the operation creates data that can be saved, culled, and analyzed to mine business intelligence that informs future decision-making by identifying business opportunities and locating expensive inefficiencies. Big Data can help break down sales trends, find patterns in customer service interactions, weed out waste from the supply chain, and even improve cash flow by optimizing important business processes like procure-to-pay and order-to-cash.
However, when utilities approach the task of effectively managing Big Data, a substantial challenge quickly arises: there is simply too much of it. Every transaction, interaction, and electronic decision produces data. Every moment a customer’s electric meter is ticking, data is being produced. Some data is far more valuable to energy providers than other data, and problems emerge when too much of it sits in an active database. Data overload can choke the processing speeds of systems and networks, creating performance bottlenecks that limit access to important information. Another issue is the speed and depth at which analytics can be performed. More data to process means a bigger-picture view and, in theory, better insights for executives and decision-makers throughout the organization. Yet despite rapid advances in analytics in recent years, technology is only now reaching the point where truly massive data caches can be analyzed in real time.
SAP HANA: Next Generation Insight
Now there is a new in-memory database platform available to energy providers and others who operate in an SAP environment, and it boldly promises to help fulfill the potential of Big Data at a level unattainable with existing data management technologies. That platform is SAP HANA, an in-memory database designed to perform real-time analytics and to develop and deploy the real-time applications large organizations use to run the business. With such an emphasis on real time, one promise of SAP HANA lies in its speed: SAP notes that early adopters have seen query performance improve by as much as 100,000 times compared with disk-based databases. Such computing horsepower has more than a few data-obsessed executives eager to make the transition to SAP HANA in order to gain what they perceive as a competitive edge over industry rivals.
A primary point that differentiates SAP HANA from other databases is its use of in-memory blades to store information. This is the piece of the system that allows for the dramatically improved processing times that enhance analytical intelligence: by processing more information faster, a more accurate picture can be created. It also allows for faster information retrieval. The applications are wide-ranging, from expediting business processes to energy-usage queries that give customers an up-to-date picture of their consumption habits so they can make smarter decisions. Crunching all of this data quickly ultimately allows for the creation of newer, more effective business models.
Despite the promise of SAP HANA, there is one sobering aspect of this technology to consider: cost. It is not practical to simply operate the database at near full capacity, as the system becomes subject to the same performance bottlenecks as current enterprise resource planning (ERP) systems when available memory runs low. Yet SAP HANA’s in-memory blades are expensive, which makes it impractical to routinely add memory on a ‘pay-as-you-grow’ basis; doing so can quickly cause IT budgets to spiral out of control, growing at an unsustainable rate.
For this reason, keeping databases as lean as possible is a major priority for businesses pursuing an SAP HANA strategy. This is accomplished through an aggressive, proactive approach to data management and archiving. Traditionally, the word ‘archiving’ conjures images of data stored away in an offline database that is difficult, if not impossible, to access or have active visibility into. That simply is no longer the case.
Dolphin, an SAP software partner, for example, uses an approach that moves large amounts of static data to a lower-cost, high performance nearline storage environment that complements the in-memory SAP HANA architecture. Nearline storage, or NLS, is an inexpensive, scalable option for storing large volumes of data which also allows for near real-time access, ensuring that visibility of this data is not limited.
NLS succeeds when static data that has lower business value but is accessed occasionally is separated from frequently used, ‘high-value’ information. The result is a balance between performance and storage costs. It stabilizes database growth, allowing for a predictable Total Cost of Ownership, while keeping data available for business needs such as analytics and business processes.
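The hot/cold separation behind nearline storage can be sketched in a few lines. The snippet below is an illustrative model only, not Dolphin’s or SAP’s actual implementation; the 90-day retention threshold and the record layout are assumptions chosen for the example.

```python
from datetime import date, timedelta

HOT_RETENTION_DAYS = 90  # illustrative threshold, not a Dolphin/SAP default


def partition_by_activity(records, today=None):
    """Split records into a 'hot' tier (kept in memory) and a
    'nearline' tier (moved to cheaper storage) by last-access date."""
    today = today or date.today()
    cutoff = today - timedelta(days=HOT_RETENTION_DAYS)
    hot, nearline = [], []
    for rec in records:
        (hot if rec["last_accessed"] >= cutoff else nearline).append(rec)
    return hot, nearline


def lookup(key, hot, nearline):
    """Search the hot tier first, falling back to nearline storage,
    so visibility into archived data is preserved."""
    for tier in (hot, nearline):
        for rec in tier:
            if rec["id"] == key:
                return rec
    return None
```

The key design point is the fallback in `lookup`: archived data costs one extra tier of search rather than becoming invisible, which mirrors the near real-time access the article describes.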
Early Adopter: Southern California Edison
To date, approximately 2,000 SAP HANA platforms have been deployed, in various stages, throughout the world. One regional utility which has opted to implement SAP HANA is Southern California Edison (SCE), based in Rosemead, California. SCE is the largest subsidiary of Edison International, and is one of the nation’s largest electric utilities, serving nearly 14 million people in 180 cities across 50,000 square miles in Central, Coastal, and Southern California.
Ron Grabyan, Manager of Data Warehousing Services at SCE, notes that the electricity provider is utilizing SAP BW powered by SAP HANA for its Enterprise Data Warehouse and native SAP HANA for specific use cases and analytical calculation engines. “In-memory computing intrigued SCE, as it appeared to provide the significant improvement of performance for reporting, analytics, and data loading we required,” said Grabyan, who has managed SCE’s business intelligence development effort on SAP BW since its inception in 2005.
Grabyan said the promise of faster reporting, faster data loading times, lower Total Cost of Ownership, reduced maintenance costs and decreased development costs all influenced SCE’s decision to transition to SAP HANA. “Our data management strategy going into SAP HANA was to migrate the entire BW platform to BW HANA, producing capabilities on native SAP HANA for analytics that we could not achieve prior,” Grabyan explained.
An important piece of the SAP HANA puzzle for SCE was to develop a strategy to keep database growth in check. According to Grabyan, SCE’s Enterprise Data Warehouse was growing at 34 gigabytes per month prior to BW HANA. By utilizing SAP HANA, SCE was able to compress data by five times in its BW and up to 15 times on native SAP HANA. The next step was to utilize nearline storage. SCE worked with SAP to identify SAP-certified partners, and ultimately selected Dolphin, which architected a solution that included PBS Software’s nearline infrastructure. The end result is that database growth has been reduced to 17 gigabytes per month – a full 50 percent reduction.
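The growth figures above can be checked with simple arithmetic. The sketch below uses only the numbers cited in the article; the 100 GB raw-data figure is illustrative, not an SCE number.

```python
# Back-of-envelope check of the growth figures cited for SCE.
monthly_growth_before_gb = 34.0   # growth before BW on HANA
monthly_growth_after_gb = 17.0    # growth after adding nearline storage
reduction = 1 - monthly_growth_after_gb / monthly_growth_before_gb
# reduction == 0.5, i.e. the 50 percent figure quoted above

# Effect of the cited compression ratios on a hypothetical 100 GB
# of raw data (the 100 GB is illustrative, not from SCE):
raw_gb = 100.0
bw_on_hana_gb = raw_gb / 5        # 20 GB at 5x compression
native_hana_gb = raw_gb / 15      # ~6.7 GB at 15x compression
```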
“The savings from adding nearline storage is directly proportional to the memory reduction on BW HANA,” observed Grabyan. “We can still execute queries seamlessly against BW HANA, which retrieves data from both SAP HANA and NLS as appropriate. It’s relatively easy to maintain and to keep BW HANA objects up-to-date in the NLS system.”
Grabyan points out that the benefits of SAP HANA and NLS include improved turnaround time for data retrieval and for providing governance data to regulators. Additionally, the combination delivered a lower Total Cost of Ownership and faster project payback.
The Road to SAP HANA Starts Today
For IT teams considering the SAP HANA journey, there is a practical strategy to pursue that will ease the transition to the platform whether it is taking place months or years down the road. Even better – it will also improve existing SAP ERP performance.
To begin, evaluate current needs for streamlining infrastructure and accessing data. Identify key performance indicators (KPIs) for system performance as well as specific areas for cost reduction, management, and avoidance. This will also allow CIOs to more fully understand the promise of their data environments.
Another important consideration when migrating to SAP HANA is a comprehensive database assessment or health check. Dolphin’s HealthCheck, for example, is a proactive audit that will help safeguard against costly system down time and ensure that the in-memory infrastructure remains lean and stable by analyzing system performance, size and growth of production environments, and cost reduction/containment.
The resulting report will provide an overview of the health of the database on a monthly basis. What can businesses hope to achieve from this audit process? In this context, the HealthCheck for SAP archiving provides insight into realizing significant opportunities for SAP database improvement and a clearer path to SAP HANA.
Companies preparing for the transition to SAP HANA from their current SAP NetWeaver BW or ERP platforms will also benefit substantially from utilizing nearline storage. By developing a strategy for archiving static, less critical data using a NLS infrastructure today, organizations will experience improved performance, reduced Total Cost of Ownership, and a faster payback while also creating the most navigable road to SAP HANA.
What the Future Holds
Many utilities today are challenged in ways they’d never imagined just a few years ago. The amount of Big Data available for consumption and analysis is simply staggering. It’s no longer the sole concern of the CIO, either. Senior leadership at all levels has increasingly come to understand the value offered by Big Data and is now starved for insights. Advancements in ERPs, such as the leap forward offered by SAP HANA, promise to deliver processes, analytics, and new uses for data at a level never before achievable. Early adopters of SAP HANA, especially those who manage database growth to keep costs in check, will be the first to realize this competitive edge as they unlock new, wide-ranging data applications that benefit both the business and the customer.
About the Author
Dr. Werner Hopf, CEO, is responsible for setting the company’s strategic corporate direction and is the Archiving Principal at Dolphin. With more than 20 years of experience in the information technology industry, 16 of them focused on SAP, Dr. Hopf specializes in SAP Information Lifecycle Management initiatives including data and document archiving, SAP ArchiveLink storage solutions, and business process solutions. His experience spans both large and mid-sized companies across all major SAP modules. Having worked on SAP projects across North America and Europe, he has extensive experience in global markets and is well known for his expertise. Dr. Hopf earned a Master’s degree in Computer Science and a PhD in Business Administration from Regensburg University, Germany.