As more and more organizations implement SAP HANA native or standalone, the need to understand how to provide access to and secure information views has emerged. The intent of this article is to provid…
Starting with SAP HANA 2.0, we can now partition a single table between in-memory storage and SAP HANA Extended Storage (also known as Dynamic Tiering). This is an excellent feature because it simplifies the…
In the traditional EIM process, typically there are separate ETL, Replication and SDA processes and functions. Traditional processes introduce latency to the data in the data warehouse, as well as other challenges to getting data into one location seamlessly. Today’s world is moving faster, and the need for clean, real-time data is imperative.
Chances are, if you’re reading this blog, you’re no stranger to the power of SAP HANA. But to recap, the SAP HANA platform removes the burden of maintaining separate legacy systems and siloed data. It holds capabilities to transform and cleanse data in real time from multiple sources.
SAP HANA also has features that enable you to both manage data and improve your data quality. These options offer real-time transformational possibilities that were unthinkable until recently. These features allow you to transform data in real time with Smart Data Integration (SDI) or conform and cleanse data with Smart Data Quality (SDQ).
Don Loden is an information management and information governance professional with experience in multiple verticals. He is an SAP-certified applications associate on SAP EIM products. He has more than 15 years of information technology experience in the following areas: ETL architecture, development, and tuning; logical and physical data modeling; and mentoring on data warehouse, data quality, information governance, and ETL concepts. Don speaks globally and mentors on information management, governance, and quality. He authored the book SAP Information Steward: Monitoring Data in Real Time and is the co-author of two books: Implementing SAP HANA and Creating SAP HANA Information Views. Don has also authored numerous articles for publications such as SAPinsider magazine, TechTarget, and Information Management magazine.
What are Repository Roles?
Repository roles are roles that are created as development artifacts within the SAP HANA system. They start as design-time objects and become runtime objects upon activation. They can be utilized within a security model just like database roles. However, they offer numerous advantages over catalog or database roles.
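To make this concrete, a repository role is defined as a design-time text artifact (an .hdbrole file) inside a repository package. The sketch below is illustrative only; the package path acme.security.roles, the schema SALES_DW, and the package acme.models are assumptions for this example:

```sql
-- Design-time definition in the file acme.security.roles/analyst.hdbrole
-- (package path, schema name, and package name are examples only)
role acme.security.roles::analyst {
    -- SQL privileges on catalog objects
    catalog schema "SALES_DW": SELECT;
    -- Package privileges on repository content
    package acme.models: REPO.READ;
}
```

Upon activation, this design-time artifact becomes a runtime role owned by _SYS_REPO that can be granted to users.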
Why Should I Use Repository Roles?
- The definition of the repository role is stored in the SAP HANA system repository. Using package security, we can control access to individual repository roles for a diverse group of security administrators. For example, super administrators can be given access to change the definition of all repository roles while a department level administrator can be given limited access to roles that affect only his/her users.
- They are owned and granted by the system user _SYS_REPO, which cannot be deleted from your system. This prevents assigned privileges from being removed when a grantor’s user account is deleted, unlike database roles, which can be removed from the system when their creator’s database user account is deleted.
- Repository roles can be transported from one HANA system to another using HANA Application Lifecycle Management (HALM) or using the Delivery Units (DU) Export and Import process.
- Through auditing, we can track the true grantor of the repository roles.
- When a user is copied in SAP HANA Studio, all of the repository roles are also copied.
- Because they are development artifacts, the standard SAP HANA Version management tools are supported.
- They are granted and revoked via special stored procedures. Users only need EXECUTE access to these procedures to grant and revoke rights, unlike database roles, which can only be revoked by the original grantor.
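The stored-procedure mechanism in the last bullet can be sketched as follows; the role and user names are hypothetical:

```sql
-- Grant an activated repository role to a user.
-- The grant is recorded under _SYS_REPO; the caller only needs EXECUTE
-- on the procedure. Role and user names are examples.
CALL "_SYS_REPO"."GRANT_ACTIVATED_ROLE"('acme.security.roles::analyst', 'JSMITH');

-- Revoke works regardless of who performed the original grant,
-- unlike plain database roles.
CALL "_SYS_REPO"."REVOKE_ACTIVATED_ROLE"('acme.security.roles::analyst', 'JSMITH');
```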
Jonathan Haun is a Director in the Data & Analytics solution at Protiviti. He has over 15 years of BI and IT experience. He currently focuses exclusively on Business Intelligence tools, technologies and EIM processes. He has helped hundreds of companies implement BI tools and strategies over the past 10 years.
End-to-End Security for Your SAP Environment
From designing and configuring your SAP application security to implementing SAP BI and HANA security, our Webinar Series gives you tips, tricks and best practices from our top SAP experts.
Join us on Thursdays @ 2:00 pm Eastern beginning April 21st. This series gives you real life examples on how to properly secure and manage your SAP landscape.
Register for one or all three!
Securing Your SAP Application
Speaker: Michele Makuch
Thursday, April 21 @ 11:00 Pacific | 1:00 Central | 2:00 pm Eastern
Designing and maintaining a best-in-class SAP security environment is a challenging endeavor given the complexity of parameters and general IT controls that need to be properly configured and monitored.
Join Michele Makuch, SAP security expert, to understand the key steps to design and maintain an optimized SAP security environment, helping you avoid the need for a redesign later.
Key Learning Points
- Improve your SAP role design strategy
- Learn key general IT controls to implement and monitor
- See how to establish an adequate SoD management framework
- Understand how to institute a comprehensive SAP security program, including cyber-security risks
Implementing SAP BI Security Best Practices
Speaker: Bob Zenker
Thursday, April 28 @ 11:00 Pacific | 1:00 Central | 2:00 pm Eastern
BI has become pervasive for decision-making at all levels. With that pervasiveness come risks due to a centralized architecture that houses potentially sensitive data in one place that can be used by many people.
Join Bob Zenker, BI expert, to learn how to properly implement SAP BI security based on practical experiences and real-world case studies. This session will provide detailed tips and tricks on implementing security for ad hoc users and discuss the challenges of securing data and BI content.
Key Learning Points
- Learn best practices for implementing third-party authentication methods such as SAP, Windows AD, and LDAP
- Gain an overview on data security in Universes, Crystal Reports and SSO to databases
- Review deploying mobile access to BI content utilizing SSL or VPN
- Examine mobile application security setup on devices within the BI4.1 architecture
Securing Your SAP HANA Environment
Speaker: Jonathan Haun
Thursday, May 5 @ 11:00 Pacific | 1:00 Central | 2:00 pm Eastern
Although all systems have to deal with authentication, authorization and user provisioning, SAP HANA deviates from typical database platforms in the amount of security configuration that is done inside the database.
Join Jonathan Haun, co-author of “Implementing SAP HANA”, for expert recommendations on configuring SAP HANA and setting up the proper security models for your SAP HANA system.
Key Learning Points
- Get best practice strategies to properly provision users, manage repository roles, and implement a manageable security model in SAP HANA
- Gain an overview of the 4 different types of privileges in SAP HANA
- Explore how third-party authentication can integrate with SAP HANA
- Find out how to provide selective access to data using analytic privileges in SAP HANA
In Part 1 of this blog series, I discussed Information Views and provided some tips on how to use them more effectively. In this part, I will be discussing two design strategies and the pros and cons of each.
We’ll look at two methodologies:
- A “real-time centric” design will excel at operational reporting, but performance may be a challenge with increasing data volumes and significant transformation.
- A “storage centric” design will scale well and offer exceptional performance, at the cost of the data latency introduced by ETL.
Real-time Centric Design
In terms of SAP HANA, this design style is one where natural OLTP tables are utilized as the basis for creating real-time OLAP multidimensional models. When using these tables as the source, all join, calculation, and transformation logic has to be implemented in the SAP HANA information views. These operations are conducted at runtime each time a query is executed. Generally, this is a very resource-intensive way of designing a solution because the work has to be repeated over and over again. It also tends to consume a lot of memory on the SAP HANA system; in some cases, more memory is required to compute the model than is actually required to store the data. This approach is sometimes called ELT (Extract, Load, Transform), and it is a required design technique to achieve true zero-latency real-time analytics.
- Products like SAP HANA Live ERP reporting utilize this concept heavily
- Effective for real-time as you can report directly against normalized source tables or replicated source tables in an SAP HANA ‘Sidecar’ format
- No physical ETL is required, but all transformations and processing occur at each query’s runtime
- Great for moderately complex operational reporting
- Excels for reporting scenarios where real-time is the highest concern and you are facing only one transaction table joined to several descriptive tables
- Use with care as the Calculation Engine is the slowest engine in SAP HANA
- Extreme table sizes + heavy transformations = slow performance
- May not scale well
- More difficult to do true business intelligence reporting using this design paradigm
- Many BI concepts require components and concepts that are difficult to model using only Calculation Views. This is due mostly to the number of projections that are required to manipulate multiple datasets for complex comparison activities
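As a rough illustration of this paradigm, the query below joins a normalized transaction table directly to its descriptive tables, with all derivations and aggregations evaluated at query runtime. All schema, table, and column names are invented for this example:

```sql
-- Real-time centric: report directly against OLTP (or replicated) tables.
-- All joins, derived measures, and aggregations run at each query's runtime.
SELECT
    c."REGION",
    p."CATEGORY",
    SUM(o."QUANTITY" * o."UNIT_PRICE")                       AS "GROSS_REVENUE",
    SUM(o."QUANTITY" * o."UNIT_PRICE" * (1 - o."DISCOUNT"))  AS "NET_REVENUE"
FROM "OLTP"."ORDER_ITEMS" o
JOIN "OLTP"."CUSTOMERS"   c ON o."CUSTOMER_ID" = c."CUSTOMER_ID"
JOIN "OLTP"."PRODUCTS"    p ON o."PRODUCT_ID"  = p."PRODUCT_ID"
WHERE o."ORDER_DATE" >= ADD_DAYS(CURRENT_DATE, -30)
GROUP BY c."REGION", p."CATEGORY";
```

Every execution of this statement repeats the same join and transformation work, which is exactly the cost the bullets above describe.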
Storage Centric Design
In terms of SAP HANA, this design style is one where the natural OLTP tables are converted into conformed dimension and fact tables. To produce these new tables, an ETL process is executed, and most of the calculation and transformation logic is conducted in that ETL process. The SAP HANA OLAP models (or Information Views) are then only responsible for aggregating and joining the dimensions to the facts. Because most of the calculation and transformation logic is executed only a few times each day within the ETL process, the SAP HANA models are not as expensive to run; generally, they use considerably less memory and CPU power when executed. Because the OLTP tables have to be copied into new structures, we call this a storage-heavy model: it re-persists some of the more complex transformations into new tables.
- Exceptional performance
- Scales incredibly well
- The majority of processing is in the OLAP engine; this is the fastest engine in SAP HANA
- ETL/ELT is required for physical transformations
- Data is physically modeled into a star schema design using ETL/ELT processing
- Analytic Views are used extensively
- Real-time is more difficult and complex but still possible with SAP HANA Smart Data Integration
- Multiple sources will likely need a latency-based ETL approach, which renders some sources batch-based in nature. Many customers simply run batches every 5–15 minutes if near real-time is needed
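A minimal sketch of the storage-centric pattern: the ETL process persists conformed dimension and fact tables, and the information view layer only has to aggregate and join them. All schema, table, and column names here are illustrative assumptions:

```sql
-- Storage centric: ETL has already conformed the data into a star schema.
CREATE COLUMN TABLE "DW"."DIM_CUSTOMER" (
    "CUSTOMER_KEY" INTEGER PRIMARY KEY,
    "REGION"       NVARCHAR(40)
);
CREATE COLUMN TABLE "DW"."FACT_SALES" (
    "CUSTOMER_KEY" INTEGER,
    "DATE_KEY"     INTEGER,
    "NET_REVENUE"  DECIMAL(15,2)   -- derived once, during ETL, not per query
);

-- The model layer is now a cheap aggregate/join that stays in the OLAP engine.
SELECT d."REGION", SUM(f."NET_REVENUE") AS "NET_REVENUE"
FROM "DW"."FACT_SALES" f
JOIN "DW"."DIM_CUSTOMER" d ON f."CUSTOMER_KEY" = d."CUSTOMER_KEY"
GROUP BY d."REGION";
```

Because the expensive transformations were paid for once in ETL, each query only performs a simple aggregation over the star schema.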
For pure performance and scalability, stay in the OLAP engine. A storage heavy design paradigm ensures this is maintained and scaling is very linear.
ETL/ELT processing and physically moving data is required for a storage heavy design paradigm. This is technically a drawback, but keep in mind the ETL/ELT becomes simpler as you are really only transforming basic structures. Use SAP HANA to transform what is not required to support a basic star schema.
You can read more about modeling techniques in my article “Mastering SAP HANA Data Modeling for Maximum Performance.”
Part 1 – SAP HANA Information Views
In this two-part blog series, I’ll discuss some key tips for building successful analytics models in SAP HANA. These considerations are specific to Information Views and Model Design.
In this post, I’ll describe SAP HANA Information Views and provide tips how to maximize their effectiveness.
What are SAP HANA Information Views?
SAP HANA provides information views to construct basic logical multidimensional models to produce advanced data calculations. SAP HANA Information Views are designed to work with the different query engines available in SAP HANA.
There are three main types of Information Views: attribute, analytic, and calculation. All three types are non-materialized views. Any design produced in an SAP HANA Information View is entirely logical, and the result set is materialized at runtime. This creates agility through the rapid deployment of changes, since there is no latency when the underlying data changes.
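Once activated, an information view is exposed as a runtime column view under the _SYS_BIC schema and can be queried like any table. The package path and view name below are hypothetical:

```sql
-- Query an activated information view; the result set is materialized
-- only at runtime. "acme.models/CV_SALES" is an example package/view name.
SELECT "REGION", SUM("NET_REVENUE") AS "NET_REVENUE"
FROM "_SYS_BIC"."acme.models/CV_SALES"
GROUP BY "REGION";
```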