Managing Data Governance in a Cloud-Focused World

The rate at which companies are amassing data is staggering. More than half of organizations today (57%) have production workloads running in the cloud, and with the number of new devices being introduced that create, consume and transmit data to the cloud, it has become critical to have some type of cloud governance program in place. However, one of the most challenging elements of such a program is managing an organization’s sensitive data, which could encompass anything from bank account and credit card numbers to HR payroll data. Misuse or negligent handling of this information could cost companies tens of thousands of dollars per record lost in a potential data breach. Beyond the monetary consequences, we’ve also seen how disastrous a data breach can be to customer confidence. Cloud governance is nothing to scoff at!

When the stakes are this high, it is understandable that companies are reluctant to trust the cloud. Gartner predicts that “through 2020, 95% of cloud security failures will be the customer’s fault.” However, cloud providers have made significant improvements to their security offerings over the last five years. This means that with proper planning and preparation, you can still reap the benefits of cloud efficiency and agility while maintaining appropriate levels of security.

Read more

Narjit Aujla

Getting to Excellence in Business Intelligence Webinar Series

We invite you to join experts from Protiviti’s Data Management and Advanced Analytics team for this three-part webinar series. Join us on consecutive Wednesdays in September at 2:00 pm Eastern to learn how to take your BI programs to the next level.

Register for one or all three! 

A How-To Guide for Creating a Global Analytics Hub
Speaker: Marshall Kelley, Manager
Wednesday, September 12 @ 2:00 pm

Imagine bringing 15 different ERP systems into one unified analytics data mart using SAP Data Services and SAP HANA®. In this webinar, attendees will learn how to create a global analytics hub using several SAP® solutions in a complex technological environment. This session will also cover proven methodologies for BI success and share tips with attendees on how to gain valuable insights from real-world projects.

Key Learning Points:

  • Identify systematic and repeatable best practices for driving success and increasing user adoption
  • Review real-world projects, discussing the technology, strategy and methodologies
  • Hear tips about staying agile in a global analytic environment


Hybrid Analytics: Bridging the Gap between Cloud and On-Premise
Speaker: Patrick NeSmith, Director
Wednesday, September 19 @ 2:00 pm

It seems everyone has a cloud analytics solution these days, and there are many valid use cases for moving analytics to the cloud. But that doesn’t mean abandoning an existing SAP® BusinessObjects™ on-premise solution in favor of the cloud. This webinar will review use cases for both on-premise deployments and SAP Analytics Cloud and explore how a hybrid approach can help organizations quickly and affordably provide information from a greater number of sources to a wider user base – ultimately driving adoption and empowering users to make better decisions.

Key Learning Points:

  • Learn how to leverage your on-premise investment to get the most out of cloud analytics
  • Discover when to use on-premise, cloud, or hybrid
  • Learn how to add cloud analytics without complicated licensing


Creating a Business Intelligence Center of Excellence – A Step-by-Step Approach for Success
Speaker: Chris Hickman, Associate Director
Wednesday, September 26 @ 2:00 pm

Establishing a Business Intelligence Center of Excellence (COE) is a proven approach to achieving a strategic, cohesive, enterprise-wide BI environment. This session will walk attendees through the process of establishing a COE, an internal group that provides services and oversight to the various development groups within an organization. Attendees will learn how establishing a COE will allow them to guide BI initiatives within their firm (regardless of size or maturity) to achieve common goals. Attendees will also explore the challenges and benefits inherent in this process, along with the bottom-line results that can be expected when a successful COE is in place.

Key Learning Points: 

  • Understand how to evaluate your organization’s business intelligence maturity model
  • Assess the value opportunities of implementing a center of excellence
  • Assess the risks associated with maintaining the status quo
  • Define the typical properties of a business intelligence center of excellence

Categories: Events

SAP HANA 2.0 supports LDAP!

One of the great new features available in SAP HANA 2.0 SPS 00 is its support for LDAP authorization. SAP takes that a step further in SAP HANA 2.0 SPS 03 by adding support for LDAP authentication with automated user provisioning. With this in mind, one could now say that SAP HANA supports LDAP in the enterprise environment. However, a closer inspection of the evolution of its LDAP capabilities is required. Because SAP HANA’s LDAP support evolved from authorization to authentication and provisioning in later versions, the setup can get a little confusing. In many ways the authorization and authentication components can each operate independently, but using both together is the most practical approach. With that in mind, let’s look at authorization, authentication and user provisioning each in more detail. I will also conclude with an example setup using SQL commands.
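To make the distinction concrete, a setup along these lines is possible. This is an illustrative sketch only: the provider name, host, DNs and user names below are placeholders, and the exact clauses should be verified against the SAP HANA SQL reference for your SPS level.

```sql
-- Illustrative sketch; names, hosts and DNs are placeholders.
-- Verify exact syntax against the SAP HANA SQL reference for your SPS level.

-- 1. Define the LDAP provider HANA will query.
CREATE LDAP PROVIDER corp_ldap
    CREDENTIAL TYPE 'PASSWORD' USING 'user=cn=hana_lookup,ou=svc,dc=corp,dc=com;password=<secret>'
    USER LOOKUP URL 'ldap://ldap.corp.com:389/ou=users,dc=corp,dc=com??sub?(sAMAccountName=*)'
    ENABLE PROVIDER;

-- 2. Authorization (SPS 00+): map a HANA role to an LDAP group so that
--    directory group membership grants the role, and switch a user to
--    LDAP-based authorization.
CREATE ROLE reporting_user LDAP GROUP 'cn=hana_reporting,ou=groups,dc=corp,dc=com';
ALTER USER jdoe AUTHORIZATION LDAP;

-- 3. Authentication (SPS 03+): let the user authenticate against the
--    directory instead of a local HANA password.
ALTER USER jdoe ENABLE LDAP;

-- 4. Automated provisioning (SPS 03+): allow HANA to create accounts for
--    users who pass LDAP authentication but do not yet exist locally.
ALTER LDAP PROVIDER corp_ldap ENABLE USER CREATION FOR LDAP;
```

Steps 2 and 3 correspond to the independent authorization and authentication components discussed above; step 4 is what makes the combined setup practical at enterprise scale.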

Categories: HANA

Protiviti Authors Set to Launch New Book, Data Provisioning for SAP HANA®

Five SAP experts from Protiviti’s Data Management and Advanced Analytics practice have come together to write a 375-page guide, Data Provisioning for SAP HANA. Before data can be made available in SAP HANA, it must be standardized, integrated and secured. This book details the options for accomplishing that data provisioning, introducing readers to the various tools available and providing detailed case studies that demonstrate those tools in action.

SAP provides several options to extract, transform and load (ETL) data into SAP HANA. Data Provisioning for SAP HANA looks at each tool independently to understand the strengths and weaknesses of each, and where each typically sits in the IT landscape. Those tools include:

  • SAP HANA smart data integration (SDI)
  • SAP HANA smart data quality (SDQ)
  • SAP HANA smart data access (SDA)
  • SAP Agile Data Preparation
  • SAP Data Services
  • SAP Landscape Transformation Replication Server (SLT)
  • SAP Data Quality Management (DQM)
  • SAP HANA data in the cloud

Each chapter demonstrates how to install, configure and develop in these tools and case studies show how these tools were implemented in a number of organizations.

This book will be helpful for decision makers who want to understand the different options available to load data into HANA for new implementations, or for companies looking to simplify the IT landscape by utilizing some of the new tools to ETL data into HANA. Architects, developers and system administrators will also appreciate the book’s step-by-step instructions for configuration, development and administration of these data provisioning tools.

Congratulations to our authors: Don Loden, Managing Director; Managers Russ Lamb and Vinay Suneja; Senior Consultant Vernon Gomes and Consultant Megan Cundiff.

Data Provisioning for SAP HANA is expected to be available in June in hardcover and ebook format. Pre-orders are being accepted here:

Categories: HANA, Industry Trends

Demystifying SAP® HANA: Understanding Options, Determining the Best Path

Although SAP HANA products have been around for some time, they continue to evolve, and we continue to find that many clients remain unsure of how best to unlock the potential of HANA solutions within their organizations. Most know that HANA is SAP’s high-powered, in-memory column and row store database. Yet many overlook the fact that SAP HANA is much more than a really quick database. It offers a multitude of functionality: a database that also serves as the foundation for many of SAP’s NetWeaver-based solutions, a development platform and a data warehouse solution. In addition, SAP HANA can function as the foundation for the BI platform, giving users the ability to model business scenarios in real time and adding tremendous value. Yet not all versions of SAP HANA are equal.

At the recent 2018 BI/HANA conference, Protiviti Managing Director (and co-author of this post) Don Loden presented a breakout session on understanding and demystifying the SAP HANA options available. During that session, Loden discussed real-world use cases and used product demonstrations to help attendees understand when and where HANA options make sense and the impacts these solutions have on complex global organizations.

During the conference session, Loden outlined the most common options for SAP HANA. First is SAP Suite on HANA or SAP S/4HANA. With both, HANA becomes the data platform that runs business suite applications. But in a true S/4HANA world, users see another lift in performance because S/4 has been purposely designed for HANA.

Click here to read more.

Don Loden

Russ Cohen
Categories: HANA

Developing a High Performing Data Management Organization

These days, the hot analogy in the analytics industry is that “data is the new oil.” Like oil, data must be found, extracted, refined and distributed. More companies are investing heavily in cutting-edge technologies like machine learning, artificial intelligence and big data processing to help them harvest, groom and apply data to the correct use case for maximum value. So why, then, in the midst of this prospectors’ rush, do studies of business intelligence (BI) implementations repeatedly indicate that 60 to 85 percent of BI projects fail?

While tech is changing rapidly, the nature of most data management efforts has stagnated. Traditionally, the IT team has been seen as an all-knowing and all-capable “data priest,” producing the exact report requested by the business. We’ve seen businesses put a lot of focus on acquiring and storing data as cheaply as possible, while neglecting the equally important business use case and governance aspects. Because of this, we often see that data management organizations (DMOs) are not able to withstand the waves of change from sources such as new technology, organizational drivers and government regulations like the General Data Protection Regulation (GDPR).

Armed with that historical knowledge, I want to offer a few considerations for organizations to take into account when analyzing their DMOs.

Click here to read the full blog post.

Don Loden

Three Fundamentals for Building a Solid Data Governance Program

Time and again, we talk with clients who are neglecting perhaps the most important feature in a solid data strategy: data governance. With the explosion of data resulting from an increasing adoption of digital initiatives and the undeniable fact that we are now living in a data-driven world, it is more important than ever for organizations to recognize the importance of protecting data as a key asset. From regulatory challenges in the U.S. driving a need for better data governance programs and a trend in hiring chief data officers to the imminent General Data Protection Regulation (GDPR) in the European Union, the pressure is growing on organizations across all industries to recognize the need for better maturity in managing and governing data assets.

Data governance as a practice has been around for some time, but many organizations continue to struggle to incorporate basic data governance processes into their overarching data strategies. Those who fail do not always do so from a lack of effort. Where to start and how to build a data governance plan is still a significant issue for most companies, and we have seen many firms have multiple false starts before they are able to gain the needed traction.

During a recent webinar we hosted, we asked the audience – primarily IT, audit, finance, and risk and compliance professionals – to weigh in on how well their organizations are doing with data governance. A full 39 percent of this group told us they have no idea whether their data governance programs are effective. Even more startling, just short of 20 percent admitted their enterprise has no data governance program in place.

These numbers may appear surprising, but they are typical of what we see across all industries – although certain groups, such as financial services, do have a higher maturity when it comes to data governance due to specific regulatory and compliance requirements that include anti-money laundering (AML) and Dodd-Frank regulations, and the fact that many banks have a global presence, making them subject to GDPR for their EU clients. Many organizations recognize the need for strong governance but often find it takes years to work through the complexities involved in establishing workable governance functions.

We understand the situation. We also know there is a way for organizations to build an outstanding data governance program that fits their needs, without the frustration. Here are just three tips to help get a data governance program started:

  1. Begin with an assessment of the organization’s current state. At Protiviti, we leverage multiple assessment models, including the Enterprise Data Management (EDM) Council’s Data Management Capability Assessment Model (DCAM) for financial services companies, and the Data Management Association (DAMA) International’s Guide to the Data Management Body of Knowledge (DMBOK®) across other industries. The DCAM framework includes eight core components ranging from data management strategy, data and technology architecture, and data quality to the rules of engagement for data governance programs. Whatever the model used, it should be matched to the organization’s needs and not applied generically.
  2. Establish a pragmatic operating model. Data governance programs must combine functional expertise, industry knowledge and technology in a well-organized and coordinated way that is planned, holistic, actionable, simple and efficient. We call that our PHASE approach, and it sets a solid foundation for future data governance by bringing together these three key components and identifying tactical steps to execute and operationalize data governance.
  3. Have simple guiding principles. We recommend that organizations:
    • Establish clear goals and purpose
    • Only put governance where needed
    • Keep the plan simple
    • Design from the top down, but implement from the bottom up
    • Be flexible
    • Communicate, communicate, communicate.

One of the most critical success factors in establishing a data governance program is to identify the value it will deliver to the organization. There is a risk this focus on value may get lost in compliance situations, where meeting a specific requirement is unquestionably the goal. Therefore, it is important for organizations to also ask: What real business problem are we addressing through our governance strategy? How will the organization be better off tomorrow than today as a result of our governance work? What are our data problems costing us – both in opportunity costs (not being able to pursue something) and in real monetary costs? And how can we do all of this with a smaller spend, showing quick value?

As chief data officers join the executive suite in increasing numbers, the importance of maturing data governance is confirmed. Ensuring that the data governance team has a seat at the table for all major business decisions and key projects – both business and technology – is proving to be a best practice and a critical success factor for the future of the organization’s data strategy. Data governance is a process, not a project. By making it a core competency, organizations will be ready to take on the data-driven future.

Matt McGivern

Josh Hewitt
Categories: Data Governance