Saving Analytical Data Without Violating GDPR – Part 1: Data Minimization and Masking

With an effective date (May 25, 2018) less than four months away, the General Data Protection Regulation (GDPR), known officially as “REGULATION (EU) 2016/679 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 27 April 2016,” is becoming a pressing concern for companies inside and outside the European Union (EU). Broadly, the regulation specifies that the protection of personal data of natural persons residing in the EU (aka EU data subjects) is a fundamental right. Personal data has a broad definition in the EU, applying to typical personal identifiers (national identification number, passport number, etc.) as well as broader categories like location data and online identifiers (IP address, cookies). GDPR goes on to outline severe measures for non-compliance, including fines up to the greater of 20 million euros or 4 percent of total worldwide annual revenue for the preceding financial year.

The GDPR spells out a number of restrictions on the use, storage, removal of, and access to personal data. These restrictions can have potentially significant effects on analytical data (enterprise data warehouses, data marts, data lakes, reporting systems, etc.), as data removal and rectification requests can change historical reporting, introduce data gaps and complicate backup and ETL processes (“ETL” refers to three database functions – extract, transform and load – that are combined into a single tool designed to pull data out of one database and place it into another).

Mitigation Techniques

There are several possible strategies for reducing the impact of GDPR on a company’s analytical data. Since compliance will be required of a large number of companies by May 25, 2018, the best methods are those that can either utilize processes already in place or be implemented with as small an effort as possible. Each company will need to look at the strategies below and decide which strategy to apply and to which data elements to apply it. Below, we discuss two of those techniques – minimization and masking.

Minimization

The simplest way to comply with GDPR is to remove any non-essential personal data from analytical systems. The fewer data elements that identify a unique individual, the easier it is to deal with any remaining elements. The viability of this strategy will vary widely, but in many cases companies have taken the approach that it is better to have data and not need it than to need it and not have it. GDPR turns this axiom on its head, but it also provides an opportunity to take a hard look at what the company is storing and what the use case is for keeping it in an increasingly privacy-centric international environment.

Minimization will likely not be a standalone solution. Most companies cannot simply remove all personal data and still use the data for the business purposes it was originally designed to satisfy. However, minimization will reduce the number of data elements that need to be addressed by other strategies and thus should be strongly considered as a first priority.
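
To make this concrete, minimization can often be enforced at the ETL boundary by selecting only the columns a reporting use case actually requires, so that direct identifiers never reach the warehouse. Below is a minimal sketch in Python with pandas; the file and column names are hypothetical:

    import pandas as pd

    # Hypothetical source extract containing more personal data than
    # the analytical use case requires
    orders = pd.read_csv("orders_extract.csv")

    # Keep only the columns the reports need; direct identifiers
    # (name, email, phone) are dropped before the data is loaded
    needed = ["order_id", "order_date", "product_id", "amount", "country"]
    minimized = orders[needed]

    minimized.to_csv("orders_minimized.csv", index=False)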

Masking

Masking is replacing some or all of the characters in a data field with data that is not tied to the original string. The replacement characters can be random or static, depending on the situation (e.g., a Social Security number masked as 999-99-2479), but masking should always remove the ability to uniquely identify the record, even when combined with other elements from the company’s records.
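
As an illustration, a static mask like the example above can be produced with a small helper that overwrites all but the last few digits while preserving separators. This is only a sketch; keeping the last four digits and using “9” as the mask character are assumptions taken from the example, not a prescribed approach:

    def mask_value(value: str, keep_last: int = 4, mask_char: str = "9") -> str:
        """Statically mask all but the last `keep_last` digits of a field,
        preserving formatting characters such as dashes."""
        total_digits = sum(ch.isdigit() for ch in value)
        digits_seen = 0
        out = []
        for ch in value:
            if ch.isdigit():
                digits_seen += 1
                # Keep only the final `keep_last` digits unmasked
                out.append(ch if digits_seen > total_digits - keep_last else mask_char)
            else:
                out.append(ch)  # keep separators like "-" for readability
        return "".join(out)

    print(mask_value("123-45-2479"))  # -> 999-99-2479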

Masking is probably the least desirable solution from a security standpoint, since in many cases it does not sufficiently de-identify the record. If a phone number, for example, has its area or city code digits masked but is associated with a person’s place of residence, one would only need to know the area or city code(s) of the place of residence to unmask the identity of the person. Even if the entity masks some of the non-area digits, the number of possible combinations may still be low enough for an automated attack to uncover the number: masking three digits, for instance, leaves at most 1,000 candidates to test.

There are some cases when masking can still be useful or can augment other strategies. If the company has transactional data sets that must be retained for statutory, business or other exception cases, masking can help control data access by limiting the data shown based on existing access control mechanisms. In other cases with more possible combinations (credit card number, street address, etc.), masking can be used situationally to satisfy GDPR requirements.

In Part 2 of this topic, we will review data aggregation and anonymization.

Authors

Don Loden

Ernie Phelps

Categories: GDPR

Want to Increase User Adoption? Try This Simple FRA2MEwork

For as long as Business Intelligence (BI) has existed, organizations have made significant investments in high-performing platforms – only to find that no one will use the solution. Why? For one, end users cannot find information quickly, or at all. Two, the information they do find isn’t relevant. Three, they expect their BI systems to work as effortlessly as popular search engines and social media, which yield results within seconds of a query – and their BI systems often don’t. So users drift away, the system goes stale, and the effort the organization has put into building the system goes to waste.

Getting the right information to the right people at the right time is intrinsically valuable to any organization. The ROI is not in how you drive your BI program, but in how effortlessly your organization can achieve a “nirvana-like” state where collaboration really happens.

To alleviate the user adoption issue, the Protiviti Data and Analytics group has devised a simple, six-step process that can be easily put in place to ensure organizations can maximize use of their data. The FRA2ME methodology adds the foundational elements organizations need to ensure that end-user adoption is not lost in the hubbub of building a state-of-the-art BI solution.

The FRA2ME Methodology

FRA2ME focuses on the importance of understanding end-user workflow and use cases to drive relevance, in turn ensuring usefulness and adoption. To understand the methodology, it helps to explain what the FRA2ME acronym represents:

Foundation

  • Creating a BI program that is trustworthy, performs well and is accessible when and where the end user needs it is essential to user adoption. Strong foundational elements, such as governance, speed, security and reliability, create user trust in the data.

Relevancy

  • A BI solution should focus on the business user, the use case and the desired outcome. The final solution put in place must be relevant for the purpose it was built to serve, or it will fall out of use.

Agility

  • We have learned that organizations need to build, adapt and perform outreach to achieve that nirvana state of collaboration. With an eye to the cadence of change, continuous improvement should be delivered incrementally to support end-user engagement. And the technology required to support an agile BI team must be agile, too.

Advocacy

  • Gaining and promoting advocacy is a very important step in the FRA2ME methodology, accomplished through creative, well-defined efforts. One client internally branded their BI program to gain visibility, in turn generating advocates while growing adoption of the system. The client’s success was solidified by activities such as social media posts, competitions among users, and other promotions that encouraged users to try the new system. The goal: A scenario in which users say, “I can’t imagine my life before this solution” or “I can’t imagine living without this solution.”

Monitoring/Measuring

  • Keeping an eye on user activity and data usage is essential to establishing a positive track record for reliable data, in turn building the trust of business users.

Education

  • Training on new solutions should be situational, contextual and personal, which means using the kinds of training tools users relate to best.

At the end of the day, what organizations need is to delineate between information and insights. Information is, by its very definition, informative, and some information might be useful. But insights are actionable, adaptive and help achieve the desired objectives.

There are many areas where a methodology like FRA2ME can help organizations achieve insight, including:

  • Process optimization (“How will we anticipate and reduce costs?”)
  • Operational efficiency (“How can we increase sales and improve customer satisfaction?”)
  • Financial visibility (“How can we better understand and improve profitability?”)
  • Sales effectiveness (“What steps are needed to increase sales and improve supplier service level agreements?”)
  • Consumer behavior (“How can we engage our customers more effectively? What consumer trends are developing in our industry?”)

One Size Does Not Fit All

The BI solution that is right for one organization may not be appropriate for another, and that’s where the FRA2ME methodology is particularly useful, as it helps pinpoint where to focus. One of our clients, for example, used the methodology to cut through the distractions of an upcoming IPO to quickly implement a real-time, interactive and highly intuitive dashboard providing visibility across 50 metrics and their related tolerances, all while launching a new manufacturing facility. The client saw 100 percent effectiveness in its first 90 days of operation at the new plant.

Another client, the fastest growing optical retailer in the U.S., needed to understand how to best segment and target customers while also determining when and where to open new markets. The FRA2ME methodology allowed us to identify how this client could effectively build a trusted data platform and implement customer analysis models that provided greater visibility into customer behavior for targeted sales and marketing campaigns, improved customer retention and optimized site selection for new stores.

A healthy, profitable company is in a constant state of change. And the cadence of change, at least from a BI perspective, is this: Build, adapt, outreach. Build the solution that is best for current needs and resources. Adapt the solution and the organization, as monitoring and measuring will define how well the solution is working and how the organization is responding to it. Outreach, by developing those advocates or “raving fans” who drive user adoption at the grassroots level.

To learn more about FRA2ME, download our white paper.

About Steve Freeman

Steve is a Managing Director in Protiviti’s Data Management and Advanced Analytics practice. He developed the FRA2ME methodology to help clients generate “raving fans” among end users. Steve is also responsible for Protiviti’s SAP Analytics practice and serves on the firm’s Financial Services practice leadership team. He has held numerous roles in sales and executive management in the Business Intelligence and Analytics space, including at SAP BusinessObjects, Oracle, Verint and Cognos. A thought leader in analytics and end-user adoption, Steve also specializes in Customer Insight Analytics, Sales & Financial Forecasting, and Organizational Optimization.

Categories: Data Strategy

Data Strategy Webinar Series

Maximizing Data to Drive Your Organization Forward 

Get a jump-start on your 2018 professional development. We invite you to join subject-matter experts from Protiviti’s Data Management & Advanced Analytics team for this informative five-part webinar series. Join us on Thursdays, beginning February 1 and continuing through March 8, at 2:00 p.m. Eastern to learn what’s new in Business Intelligence and Data Governance.

Register for one or all five!


Building a Comprehensive Data Strategy
Speaker: Jeremy Stierwalt, Director
Thursday, February 1 @ 2:00 p.m.

Data is arguably the most important asset a company possesses, and a data strategy allows the company to recognize that its data is a structured, comprehensive, cross-domain asset. The organization’s volume of data, where its data is stored, who in the organization is responsible for the data and where future data investments will be made are all components of this comprehensive strategic approach. Jeremy will analyze the anatomy of a data strategy and will also share success stories from Protiviti clients.

Key Learning Points:
– Learn what to include when defining the scope of a data strategy
– Discover how to develop a road map for the organization by defining key gap-closing initiatives
– Recognize how strong, experienced leaders can ensure success
– See what a Future State recommendation would look like
– Learn how to formalize a Data Governance process as part of the strategy development

Speaker:
Jeremy Stierwalt brings more than 20 years’ experience in all aspects of data management and governance, business intelligence, and advanced and predictive analytics solutions. This includes vertical expertise in Financial Services and Insurance, Manufacturing, Consumer Products, Retail, Sports and Entertainment, and Professional Services, coupled with line-of-business expertise in Finance, HR, Supply Chain, Sales and Marketing, and Procurement functions. Jeremy has held various leadership positions, including membership on the Executive Leadership Team, and is a frequent speaker at national and regional conferences on topics related to data governance and management, big data, and analytical solutions.


6 Simple Steps to Solve Your BI User Adoption Challenges
Speaker: Steve Freeman, Managing Director
Thursday, February 8 @ 2:00 p.m.

In spite of significant Business Intelligence (BI) investments, just a relatively small percentage of companies consider their BI programs a success. Many continue to struggle with how to best adapt to changes in the business and evolving systems needs. Join Steve as he walks through the six components that make up the FRA2ME methodology, which allows every client to maximize their BI investment, turning business users into advocates and dramatically growing BI adoption throughout their organizations.

Key Learning Points:
– Explore the six steps to building the FRA2ME methodology in your organization
– Learn how FRA2ME helps organizations delineate between information and insights
– Understand how a “Build, Adapt, Outreach” approach helps users adjust to change
– Discover how to have your end users saying “I don’t know how I’d do my job without this”

Speaker:
Steve Freeman, a Managing Director in Protiviti’s Data Management and Advanced Analytics practice, developed the FRA2ME methodology to help clients generate “raving fans” among end users. Responsible for Protiviti’s SAP Analytics practice, Steve also serves on the firm’s Financial Services practice leadership team. He has held numerous roles in sales and executive management in the Business Intelligence and Analytics space, including at SAP BusinessObjects, Oracle, Verint and Cognos. A thought leader in analytics and end-user adoption, Steve also specializes in Customer Insight Analytics, Sales & Financial Forecasting, and Organizational Optimization.


Managing the New Currency: Developing a High-Performing Data Management Organization
Speaker: Don Loden, Managing Director
Thursday, February 15 @ 2:00 p.m.

Data has become so critical for success at any organization that it’s now being described as the new currency – impossible for a business to grow and thrive without it. Attendees will learn what successful companies are achieving with strong data management in place and will learn how to avoid common pitfalls that jeopardize solution adoption. Don will also outline why quality data and a trusted data platform are key components for both data and analytics success.

Key Learning Points:
– Learn why many data management efforts fail and how to align those efforts with the business
– See how to follow the crumbs to find the reward of user adoption
– Learn why a trusted data platform is critical for success

Speaker:
Don Loden is a thought leader in the data warehousing and governance space. With more than 15 years’ experience in ETL architecture, development, and tuning; data modeling; data warehouse design; analytics architectures; business analysis and more, he also speaks regularly at industry functions and has written numerous articles for publications focused on data warehousing, master data management, governance, SAP HANA, and data quality best practices. Don is also the author of two books on SAP technologies and strategic use cases. His sales acumen coupled with his expertise has helped the Data and Analysis solution grow substantially over the last several years.


Avoid an Epic Fail: Why Data Governance is a “Must Do”
Speaker: Josh Hewitt, Director
Thursday, February 22 @ 2:00 p.m.

Do you trust the data your organization uses for strategic, financial or regulatory reporting? Is there clear ownership and understanding of key data elements? Does identifying root causes of data quality issues often leave you scratching your head and scrambling for help? Data Governance needs to be viewed as a program, not a one-time project. Josh will share real-life lessons learned from our clients at varying levels of Data Governance maturity and will provide tips to help you prepare for and improve your Data Governance capabilities along your journey.

Key Learning Points:
– Understand scoping and key functions of Data Governance programs
– Review different operating models
– Examine components of achieving higher maturity within people, processes and technology
– Learn how to build a business case and drive value through Governance
– Explore how Data Governance supports an overall Data Strategy and Advanced Analytics

Speaker:
Josh Hewitt, a Director, has more than 13 years’ experience in Financial Services and a wealth of knowledge surrounding data governance/management, information technology, regulatory/compliance risk management and program/project management. Having been part of multiple data governance program planning efforts and implementations, Josh offers a unique perspective on what has worked in other organizations and the best approach to implementing successful programs. He also has significant experience in data governance/quality tool selection and evaluation, a strong understanding of the benefits and pitfalls of many different solutions, and the ability to help financial institutions evaluate, select and implement them.


Managing Data Governance in a Cloud-Focused World
Speaker: Narjit Aujla, Manager
Thursday, March 8 @ 2:00 p.m.

Companies like Amazon and Microsoft are opening the door for anyone with internet access to stand up rogue environments outside of corporate guidelines. This produces a unique set of Data Governance challenges, such as standardizing KPI definitions, Master Data Management, and Data Security. Narjit will take attendees through the steps necessary to integrate the cloud movement into a Data Governance approach, while still enabling the usefulness and practicality of cloud applications.

Key Learning Points:
– Learn how to ensure that metrics used by shadow cloud applications adhere to the Enterprise Data Dictionary
– Learn how to assess the risk of a cloud service to better protect its sensitive data
– Examine how IT manages authentication and data flow within a cloud application
– Learn how to create concrete and reasonable cloud data governance controls
– Understand what steps to take to ensure the cloud provider adheres to vendor requirements

Speaker:
As a manager in Protiviti’s Data Management and Advanced Analytics practice, Narjit Aujla has worked with companies spanning a variety of industries. Narjit specializes in Data and Analytics with a focus on Enterprise Information Management (EIM), front-end dashboard development, data modeling, and architecture strategy. He works closely with the SAP BusinessObjects Enterprise suite of tools, including SAP Design Studio and SAP Web Intelligence, in addition to other analytical tools such as SAP HANA Studio and SAP Lumira. Narjit has experience working with various database solutions, including Oracle, DB2 and Microsoft SQL Server. He has also helped businesses refine their Data Governance strategies, identifying gaps in business processes using data profiling tools such as SAP Information Steward.

Lumira 2.0 Discovery

During some downtime recently, I decided to play around with the new Lumira 2.0. I had not used Lumira 1.0 in about eight months. I was excited to see what improvements were made to the tool from the first iteration.

I knew that 2.0 is similar to 1.0 in that users have to deploy an add-on to the BI server. Lumira 2.0 renamed Lumira Desktop to Lumira Discovery and integrated Design Studio’s functionality into another tool called Lumira Designer. This blog focuses on Lumira Discovery and, in particular, on connecting to a universe.

What I Like Best about Lumira Discovery:

  1. Previously, Lumira Desktop had three screens: Prepare, Visualize, and Compose. In Discovery, these are integrated into a single canvas page. This removes what I thought was an excessive number of steps, especially since this solution is supposed to be user friendly and provide quick access to data.
  2. After a short period spent learning where all the available options are, navigation became easier and intuitive. As in Web Intelligence, the many chart options can be accessed either by right-clicking or from the contextual menu icons.
  3. While I didn’t do an in-depth analysis on this, it was nice to see there is an ability to consume anything created in Lumira Discovery with Lumira Designer. This will allow for reusability and collaboration between IT and business users.
  4. Connecting to a live BW source is easy, and Discovery supports BEx hierarchies and variables.
  5. I was able to connect to a universe from my BI Launchpad. The query builder screen is the same as the one in Web Intelligence, which will make any WebI user comfortable. There are a couple of caveats to using a UNX, which I detail in the “Needs Improvement” section below.
  6. While users cannot create a LUMX file from BI Launchpad, it is possible to edit these kinds of files. This is a great feature if you have published something from the Discovery application but then notice a quick fix is needed. Hopefully, future versions will include a create option from within BI Launchpad.
  7. Merging, appending and linking datasets: I remember glitches with these in Lumira 1.0. In 2.0, once you understand which one to use based on your use case, it’s fairly easy (see the sketch after this list).
  • MERGE DATASET – While Append adds the records of dataset B to dataset A, Merge adds the columns of dataset B to dataset A. The prerequisite is a common key column of the same data type.
  • APPEND DATASET – Appending two datasets is similar to performing a union between them. The mandatory prerequisite is that both datasets have an equal number of columns with matching data types.
  • LINK DATASET – Linking can be done when there is no common key but there is a common column (e.g., Country) on which the data can be blended.
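
For readers who think in dataframe terms, the three operations map loosely onto familiar pandas calls. This is an illustrative analogy only, since Lumira performs these operations in its own engine, and the dataset and column names below are hypothetical:

    import pandas as pd

    # MERGE: add columns from dataset B to dataset A via a common key
    # column of the same data type
    a = pd.DataFrame({"OrderID": [1, 2], "Amount": [100, 250]})
    b = pd.DataFrame({"OrderID": [1, 2], "Region": ["East", "West"]})
    merged = a.merge(b, on="OrderID")

    # APPEND: union-like stacking; both datasets need the same number
    # of columns with matching data types
    c = pd.DataFrame({"OrderID": [3, 4], "Amount": [75, 90]})
    appended = pd.concat([a, c], ignore_index=True)

    # LINK: no common key, but a common column (e.g., Country) on which
    # two independent datasets can be blended for side-by-side analysis
    sales = pd.DataFrame({"Country": ["US", "DE"], "Sales": [500, 300]})
    census = pd.DataFrame({"Country": ["US", "DE"], "Population": [330, 83]})
    linked = sales.merge(census, on="Country")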

About Mitesh Shah

Mitesh is a manager in Protiviti’s Data Management and Advanced Analytics solution. He specializes in data services, front-end reporting and analysis, and Information Governance for numerous enterprise customers across the country. Specifically, he works with SAP Data Services, SAP Information Steward, SAP BusinessObjects and SAP HANA.

Categories: Lumira

We Asked: What’s Your BW Strategy? The Answers Surprised Us

Recently, as part of Protiviti’s Data Management & Advanced Analytics practice’s popular Expert Webinar Series, my colleague Vinay Suneja and I reviewed the various options available to BW customers migrating to HANA. Our discussion included overviews of SAP HANA Enterprise, SAP BW on HANA and SAP BW/4HANA, clearly differentiating the business value each of these technologies brings to an organization’s landscape. Several hundred people attended, and many asked questions both during and after the presentation. What we learned from those questions confirms much of what I see in my day-to-day work with current clients: there is still a great deal of confusion in the marketplace about how organizations can optimize their investment in data tools to deliver the best return in terms of architectural simplicity and business value for overall organizational growth.

During the webinar, we took a look at what we believe are the top three options for companies looking to refine, or define, a holistic Business Warehouse (BW) strategy using the SAP solutions best suited to their needs. Every day, we see organizations investing in increasingly complex architectures with many building blocks, resulting in complex data lineage that causes reconciliation issues, longer nightly batch data management and a lack of real-time data due to inflexible data loads in the resident BW system, among other issues.

The first, SAP HANA Enterprise with a sidecar, is an independent system which can extract data directly from ECC and other source systems, providing a multitude of analytical capabilities with text, spatial, big data, and predictive algorithms. The second would be a migration to SAP BW on HANA, bringing agility and versatility to data models. Finally, we covered the new release of SAP BW/4HANA, a migration and upgrade that brings redesigned modeling and enterprise-wide data warehouse capabilities, integrating data from operational and digital business.

About Sri Velicheti

Sri Velicheti is a Director in Protiviti’s Data & Analytics practice, advising clients on information management strategy, business intelligence, data warehousing, data governance, and master data management. He has 18+ years’ experience implementing SAP solutions for diverse clients in the Manufacturing, Utilities, Public Sector and Banking industries. He has been a thought leader and a frequent speaker at SAP Analytics conferences and other regional conferences.

Categories: BW, HANA

The End is Near! Planning Your SAP BusinessObjects 4.2 Upgrade

Unsupported technology is everyone’s nightmare scenario. But it’s inevitable that every good software program will evolve and ultimately reach end of life, requiring upgrades. Both SAP BusinessObjects 4.0 and 3.1 are winding through their respective final months of tech support, with 4.0 ending Priority One Support in December 2017 and 3.1 following just 12 short months later.

Highlighting New Features

I recently co-moderated a webinar outlining all the reasons why BusinessObjects 4.0 and 3.1 users should be planning now to make the switch to BI 4.2 SP4. The list of features is impressive, as BusinessObjects 4.2 SP4 is focused on strengthening how you engage your business users with better features and tools. Those features are highlighted by a new, interactive Web Intelligence (WebI) interface with new charts that help explain data and provide better insight. There is also a geo-coding feature, displaying information on interactive maps. Additionally, with the new version of BI 4.2, your team can leverage SAP BusinessObjects Lumira 2.0 Discovery and Designer. Lumira storyboards can consume personal data, universe content, SAP HANA® and more to present data with amazing insight and design. Lumira is the only visualization tool that allows you to start with an engaging end-user interface and then enhance the developed content into a corporate-standard dashboard – it’s easy and saves time and effort.

About Bob Zenker
Bob Zenker is a long-time member of Protiviti’s Data Management & Analytics team, and brings extensive experience in data management and visual insight to his current role as a Director. He has worked with more than 600 U.S. and international companies of varying sizes, providing integrated solutions that reduce complexity and focus on the “elegant interaction” of business needs and technology. His areas of specialization include: SAP BusinessObjects, SAP Lumira, SAP Cloud Analytics, AWS, Tableau, data modeling, server architecture, BusinessObjects SDK, portal integration, dashboard and report design.

Categories: BusinessObjects

Thinking of upgrading to SAP BusinessObjects 4.2 SP4? – Not so fast!

SAP BI Blog

SAP BusinessObjects 4.2 SP4 offers numerous new features, many of which have been anticipated for years. Below is just a sampling of the great new enhancements found in SAP BusinessObjects 4.2 SP4.

  1. The Web Intelligence DHTML designer can now realistically replace the Java applet designer. You can again create WebI reports with Google Chrome and without the need to deploy an Oracle JRE.
  2. There is a new optional HTML5 (Fiori-style) BI Launchpad and Web Intelligence viewer.
  3. There is a new CMS Database Driver for metadata reporting against the CMS database. I always thought it was odd that you could not generate reports based on metadata generated by a BI reporting platform.
  4. You can create Run-Time Server groups, and they can be assigned to folders, users and security groups. This allows you to set up dedicated “server nodes” to process workloads.
  5. New Virus Scan integrated support with…

Categories: BusinessObjects