Archive

Author Archive

Protiviti Authors Set to Launch New Book, Data Provisioning for SAP HANA®

Five SAP experts from Protiviti’s Data Management and Advanced Analytics practice have come together to write a 375-page guide, Data Provisioning for SAP HANA. Before data can be made available in SAP HANA, it must be standardized, integrated and secured. This book details the options for accomplishing that data provisioning, introducing readers to the various tools available and providing detailed case studies that demonstrate those tools in action.

SAP provides several options to extract, transform and load (ETL) data into SAP HANA. Data Provisioning for SAP HANA examines each tool independently to understand its strengths and weaknesses and where it typically sits in the IT landscape. Those tools include:

  • SAP HANA smart data integration (SDI)
  • SAP HANA smart data quality (SDQ)
  • SAP HANA smart data access (SDA)
  • SAP Agile Data Preparation
  • SAP Data Services
  • SAP Landscape Transformation Replication (SLT)
  • SAP Data Quality Management (DQM)
  • SAP HANA data in the cloud

Each chapter demonstrates how to install, configure and develop in these tools, and case studies show how they have been implemented at a number of organizations.
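
For readers new to the space, the simplest form of data provisioning is a plain batch load over SQL. The minimal sketch below is not taken from the book; it uses SAP’s hdbcli Python driver to insert already-standardized records into a HANA staging table, and the host, credentials and the STAGING.CUSTOMERS table are illustrative placeholders.

```python
# Minimal sketch of the "load" step of an ETL flow into SAP HANA using the
# hdbcli Python driver (pip install hdbcli). Connection details and the
# STAGING.CUSTOMERS table are illustrative placeholders.
from hdbcli import dbapi

# Connect to the SAP HANA SQL port (30015 is a typical single-container port).
conn = dbapi.connect(
    address="hana-host.example.com",
    port=30015,
    user="ETL_USER",
    password="********",
)

# Records that have already been standardized and integrated upstream.
rows = [
    (1, "Acme Corp", "US"),
    (2, "Globex", "DE"),
]

cursor = conn.cursor()
# Bulk-insert the records into a staging table for further modeling in HANA.
cursor.executemany(
    "INSERT INTO STAGING.CUSTOMERS (ID, NAME, COUNTRY) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
cursor.close()
conn.close()
```

Tools such as SDI, SLT and Data Services wrap this same basic movement of data with scheduling, transformation and replication capabilities, which is where the trade-offs the book explores come into play.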

This book will be helpful for decision makers who want to understand the different options available to load data into HANA for new implementations, as well as for companies looking to simplify their IT landscape by using some of the newer tools to ETL data into HANA. Architects, developers and system administrators will also appreciate the book’s step-by-step instructions for configuration, development and administration of these data provisioning tools.

Congratulations to our authors: Don Loden, Managing Director; Managers Russ Lamb and Vinay Suneja; Senior Consultant Vernon Gomes and Consultant Megan Cundiff.

Data Provisioning for SAP HANA is expected to be available in June in hardcover and ebook format. Pre-orders are being accepted here: https://www.sap-press.com/data-provisioning-for-sap-hana_4588/

Categories: HANA, Industry Trends

Demystifying SAP® HANA: Understanding Options, Determining the Best Path

Although SAP HANA products have been around for some time, they continue to evolve, and we continue to find that many clients remain unsure of how best to unlock the potential of HANA solutions within their organizations. Most know that HANA is SAP’s high-powered, in-memory column and row store database. Yet many overlook the fact that SAP HANA is much more than a very fast database. It offers a wide range of functionality: a database that also serves as the foundation for many of SAP’s NetWeaver-based solutions, a development platform and a data warehouse solution. In addition, SAP HANA can function as the foundation for the BI platform, giving users the ability to model business scenarios in real time and adding tremendous value. Yet not all versions of SAP HANA are equal.

At the recent 2018 BI/HANA conference, Protiviti Managing Director (and co-author of this post) Don Loden presented a breakout session on understanding and demystifying the SAP HANA options available. During that session, Loden discussed real-world use cases and used product demonstrations to help attendees understand when and where HANA options make sense and the impacts these solutions have on complex global organizations.

During the conference session, Loden outlined the most common options for SAP HANA. First is SAP Business Suite on HANA or SAP S/4HANA. With both, HANA becomes the data platform that runs the business suite applications. But in a true S/4HANA world, users see another lift in performance because S/4 has been purposely designed for HANA.

Click here to read more.

Don Loden

Russ Cohen

Categories: HANA

Developing a High Performing Data Management Organization

These days, the hot analogy in the analytics industry is that “data is the new oil.” Like oil, data must be found, extracted, refined and distributed. More companies are investing heavily in cutting-edge technologies like machine learning, artificial intelligence and big data processing to help them harvest, groom and apply data to the correct use case for maximum value. So why, then, in the midst of this prospectors’ rush, do studies of business intelligence (BI) implementations repeatedly indicate that 60 to 85 percent of BI projects fail?

While tech is changing rapidly, the nature of most data management efforts has stagnated. Traditionally, the IT team has been seen as an all-knowing and all-capable “data priest,” producing the exact report requested by the business. We’ve seen businesses put a lot of focus on acquiring and storing data as cheaply as possible, while neglecting the equally important business use case and governance aspects. Because of this, we often see that data management organizations (DMOs) are not able to withstand the waves of change from sources such as new technology, organizational drivers and government regulations like the General Data Protection Regulation (GDPR).

Armed with that historical knowledge, I want to offer a few considerations for organizations to take into account when analyzing their DMOs.

Click here to read the full blog post.

Don Loden

Three Fundamentals for Building a Solid Data Governance Program

Time and again, we talk with clients who are neglecting perhaps the most important feature in a solid data strategy: data governance. With the explosion of data resulting from an increasing adoption of digital initiatives and the undeniable fact that we are now living in a data-driven world, it is more important than ever for organizations to recognize the importance of protecting data as a key asset. From regulatory challenges in the U.S. driving a need for better data governance programs and a trend in hiring chief data officers to the imminent General Data Protection Regulation (GDPR) in the European Union, the pressure is growing on organizations across all industries to recognize the need for better maturity in managing and governing data assets.

Data governance as a practice has been around for some time, but many organizations continue to struggle to incorporate basic data governance processes into their overarching data strategies. Those who fail do not always do so from a lack of effort. Where to start and how to build a data governance plan is still a significant issue for most companies, and we have seen many firms have multiple false starts before they are able to gain the needed traction.

During a recent webinar we hosted, we asked the audience – primarily IT, audit, finance, and risk and compliance professionals – to weigh in on how well their organizations are doing with data governance. A full 39 percent of this group told us they have no idea whether their data governance programs are effective. Even more startling, just short of 20 percent admitted their enterprise has no data governance program in place.

These numbers may appear surprising, but they are typical of what we see across all industries. Certain groups, such as financial services, do have higher data governance maturity due to specific regulatory and compliance requirements, including anti-money laundering (AML) and Dodd-Frank regulations, and because many banks have a global presence that makes them subject to GDPR for their EU clients. Many organizations recognize the need for strong governance but often find it takes years to work through the complexities involved in establishing workable governance functions.

We understand the situation. We also know there is a way for organizations to build an outstanding data governance program that fits their needs, without the frustration. Here are just three tips to help get a data governance program started:

  1. Begin with an assessment of the organization’s current state. At Protiviti, we leverage multiple assessment models, including the Enterprise Data Management (EDM) Council’s Data Management Capability Assessment Model (DCAM) for financial services companies, and the Data Management Association (DAMA) International’s Guide to the Data Management Body of Knowledge (DMBOK®) across other industries. The DCAM framework includes eight core components ranging from data management strategy, data and technology architecture, and data quality to the rules of engagement for data governance programs. Whatever the model used, it should be matched to the organization’s needs and not applied generically.
  2. Establish a pragmatic operating model. Data governance programs must combine functional expertise, industry knowledge and technology in a well-organized and coordinated way that is planned, holistic, actionable, simple and efficient. We call that our PHASE approach, and it sets a solid foundation for future data governance by bringing together these three key components and identifying tactical steps to execute and operationalize data governance.
  3. Have simple guiding principles. We recommend that organizations:
    • Establish clear goals and purpose
    • Only put governance where needed
    • Keep the plan simple
    • Design from the top down, but implement from the bottom up
    • Be flexible
    • Communicate, communicate, communicate.

One of the most critical success factors in establishing a data governance program is to identify the value it will deliver to the organization. There is a risk this focus on value may get lost in compliance situations, where meeting a specific requirement is unquestionably the goal. Therefore, it is important for organizations to also ask: What real business problem are we addressing through our governance strategy? How will the organization be better off tomorrow than today as a result of our governance work? What are our data problems costing us, both in opportunity costs (not being able to pursue something) and in real monetary costs? And how can we do all of this with a smaller spend, showing quick value?

As chief data officers join the executive suite in increasing numbers, the importance of maturing data governance is confirmed. Ensuring that the data governance team has a seat at the table for all major business decisions and key projects – both business and technology – is proving to be a best practice and a critical success factor for the future of the organization’s data strategy. Data governance is a process, not a project. By making it a core competency, organizations will be ready to take on the data-driven future.

Matt McGivern

Josh Hewitt

Categories: Data Governance

What’s New in SAP S/4HANA Implementations? A Report from GRC 2018

Note: Several of our colleagues from Protiviti’s Technology Consulting practice attended the SAPInsider 2018 GRC and Financials Conferences. Their blogs on SAP-related topics are shared here. Mithilesh Kotwal, Director, discusses the importance of proactively addressing implementation risks during S/4HANA migrations.

Ronan O’Shea, our ERP Solutions global lead, delivered an insightful session reviewing the different responsibilities of the business during a system implementation. As he pointed out, systems must be designed from the outset to support the business. Organizations cannot expect system integrators (SIs) to develop these designs alone, as SIs are technical experts, not business process experts. This is why the business should be responsible for defining the vision and operational expectations for the future state of each business process that the new system will impact.

During his session, Ronan shared key system implementation statistics, including:

  • 74.1% of ERP projects exceed budget
  • 40% report major operational disruptions after go-live

What can you do to ensure your implementation does not become part of statistics like these?

The role that the business plays in an ERP system implementation is at least as critical as those played by IT and the system integrator (SI). The business owns the top four risks on an ERP implementation:

  • Program Governance
    • Misconception: The SI will manage the governance of the entire ERP implementation.
    • The truth: Typically, it is beyond the scope of the SI to provide the level of management needed to oversee the implementation end-to-end.
    • What should companies do? Establish a comprehensive PMO structure that manages the program beyond just the SI deliverables, including:
      • Oversight of business and IT resources
      • Management of other vendors
      • Open engagement with company leadership on the risks and issues within the program
      • Unrelenting commitment to the transformation goals of the program.

These implementations are complex and have impact across many functions; the incentives of different parties must be checked and balanced.

  • Business Process Design
    • Misconception: The SI will guide us to adopt leading design practices baked into the software.
    • The truth: The requirements and design of the future solution emerge over time (if at all), leading to rework, changes, delays and missed user expectations both pre- and post-go-live. The SI is primarily a technical expert, not a business process expert.
    • What should companies do? The business retains the responsibility to define the vision for what to expect operationally from the new system for each business process. This vision can take the form of:
      • Future-state end-to-end process flows that outline the automation level expected
      • Governing business rules (e.g., customer price calculations, cost allocations, tax computations)
      • Data requirements and event triggers for integrations to other systems
      • Controls and contingency or exception workflows
      • Who takes action

Take the time to define this vision so that you have a baseline against which to evaluate the technical solution delivered by the SI and to make sure you are meeting your transformation objectives. Assess process owners’ awareness and understanding of the expected outcomes of key design decisions.

  • Data Conversion
    • Misconception: Data conversion is a technical task with no business involvement, and we can simply move the data from the legacy system to the new system.
    • The truth: Companies leave this activity until too late and without business involvement, resulting in incorrect data mapping and poor data quality, which cause delays in implementation and impact the operational effectiveness of the new system.
    • What should companies do?
      • Review the plans and design for the overall information strategy, data governance and data conversions, and their ability to ensure complete and accurate data will be available at go-live
      • Perform project-specific quality assurance procedures
      • Provide recommendations for longer-term initiatives to maintain data quality

Data is key: the business should treat data conversion design and data cleansing as a top-priority work stream and take operational and audit considerations into account (a simple example of a conversion quality check is sketched below). The business must establish strong data governance that extends beyond the successful rollout of the new system.
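
As one illustration of what a project-specific quality assurance procedure might look like, the sketch below reconciles record counts and a simple control total between a legacy extract and the converted extract. It is an assumption for illustration only, not something presented at the conference; the file names and column names are hypothetical.

```python
# Hypothetical data conversion QA check: reconcile row counts and a control
# total between the legacy extract and the converted extract before go-live.
# File names and column names are illustrative placeholders.
import csv

def profile(path: str, amount_column: str):
    """Return (row_count, control_total) for a CSV extract."""
    count, total = 0, 0.0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            count += 1
            total += float(row[amount_column] or 0)
    return count, round(total, 2)

legacy = profile("legacy_customers.csv", "CREDIT_LIMIT")
converted = profile("converted_customers.csv", "CREDIT_LIMIT")

# Flag any gap for the business data owners to investigate before cutover.
if legacy != converted:
    print(f"Mismatch: legacy={legacy}, converted={converted}")
else:
    print(f"Reconciled: {legacy[0]} rows, control total {legacy[1]}")
```

Checks of this kind only work if the business has already decided which fields matter and what the acceptable tolerances are, which is exactly why data conversion cannot be treated as a purely technical task.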

  • Organizational Change
    • Misconception: Organizational change is training, right?
    • The truth: Users and business process owners are unprepared to participate effectively in the project: business requirements, design, testing, training and adoption. A lack of focus on building user and management support, adoption and readiness leads to ineffective and inefficient processes and post-go-live disruptions, regardless of the quality of the system implemented.
    • What should companies do?
      • Examine user adoption/enablement plans for the system and processes, including ongoing user support and training processes, process organization change, and process performance measurement.
      • The business must plan to develop policies and procedures and define new roles and responsibilities, as well as deliver practical training.

Prepare the organization well for the transformation project you are undertaking. Engage users frequently to prepare them for the change and to increase adoption.

These four key risk areas, along with others, are explored in detail in this white paper.

Mithilesh Kotwal, Director
Technology Consulting
mithilesh.kotwal@protiviti.com

Categories: S/4HANA

ICYMI: Protiviti’s Brian Jordan Talks Data Mining

In case you missed it, click here to listen to a recent episode of the “Coffee Break with Game Changers” radio show, presented by SAP.

In this episode, Protiviti Managing Director Brian Jordan joined Marc Kinast from Celonis and SAP’s John Santic to discuss “Digital Footprints: Mining the Data in Your Operations.” Tune in to learn why Brian’s favorite movie quote is from Clint Eastwood: “A man’s got to know his limitations.”

You’ll also learn why process mining is one of the hot trends in business intelligence today.

Brian Jordan

Tech Trends at BI/HANA 2018

Protiviti’s Narjit Aujla was a first-time attendee at the 2018 BI/HANA conference. He shares his observations here.

The morning begins with Spanish guitars before the keynote session at the BI & HANA 2018 conference in Las Vegas. Taking the stage is Ivo Bauermann, SAP’s Global Vice President, SAP Analytics, Head of Business Development & Global Center of Excellence. He tells an amusing story about how people once shunned the newly created automobile in favor of the horse-drawn carriage—a decision in favor of drinking more alcohol. To Ivo’s point, people are resistant to change. What may seem farfetched now can become the de facto standard tomorrow; consider how we solve the same transportation problem today: Uber.

Today’s analytics market is growing rapidly. SAP’s data warehousing solution, SAP BW, is still commonly used, with numerous conference sessions supporting it. Businesses are seeking enhancements to their data solutions, such as SAP HANA® in-memory computing and cloud agility. The good news is that there is a wide range of solutions for SAP users, such as BW on HANA, BW/4HANA and S/4HANA®. However, choosing the right system architecture to meet business needs will be critical to managing data effectively going forward.

Analytical toolsets are more varied than ever. While products such as Tableau and Microsoft Power BI vie for the self-service spotlight, SAP Analytics Cloud (SAC) is also a strong contender. Riding the bleeding edge of real-time and predictive analytics, SAC is looking to reinvent the way we approach analytics. Simple workflows and powerful natural language queries might be the difference between a successful business intelligence (BI) strategy and an outdated software graveyard. Regardless of tool selection, what we continue to see is the criticality of established data foundations and governance practices.

Read more

About Narjit Aujla

Narjit Aujla is a Manager in the Data & Analytics practice at Protiviti. Working with companies across all industries, he seeks to understand each business’s primary concerns and delivers end-to-end business intelligence solutions.

Narjit specializes in Data and Analytics with a focus on Enterprise Information Management (EIM), front-end dashboard development, data modeling, and architecture strategy. At Protiviti, he supports the SAP BusinessObjects Enterprise suite of tools, including SAP Design Studio and SAP Web Intelligence, in addition to other analytical tools such as SAP HANA Studio and SAP Lumira. He has also helped businesses refine their Data Governance strategy and identify gaps in business processes using data profiling tools such as SAP Information Steward.

Categories: Industry Trends