
Benefit from Pre-TechEd Workshop about SAP Master Data Governance

Seminar Topics

Now that you've hopefully caught a glimpse of SAP Master Data Governance (MDG), for example by watching the quick demos featured on SCN, take the chance and attend a dedicated MDG pre-SAP TechEd workshop, scheduled for Sept. 12, 2011 in Las Vegas. There are still seats available.

In this seminar, you can learn how master data governance approaches the data management problem within the SAP software landscape with utmost ease and efficiency. Whiteboard-led sessions provide insight into all aspects of the solution including data model, security, workflow, data replication, extensibility, and customization.

For seminar details and registration information, see the ASUG Pre-Conference Seminars site.

Enjoy the session.

Regards,

Markus

Markus Ganser is a solution manager in SAP Master Data Management (MDM).

Benefit from SAP NetWeaver MDM global data synchronization - GDS 2.1

SAP NetWeaver Master Data Management (MDM) global data synchronization - GDS 2.1 has successfully completed the Ramp-Up phase and is now generally available to customers.

Highlights of the new release include:

  • Additional connection to the SA2 Worldsync data pool
  • Ability to exchange price data in the GS1 format
  • Exchange of catalogue selections via peer-to-peer
  • Automated initial publication of trade items
  • Display of images assigned to trade items
  • Propagation of data through a hierarchy based on material number
  • Limiting of user visibility by target market
  • Creation of trade items for multiple target markets

For more information about global data synchronization, see the SAP NetWeaver MDM global data synchronization site.

An Eye Opener for Transporting MDM Repositories

It's been a long time since I last wrote a blog about SAP MDM. With SAP MDM being part of EIM, I was busy working in SAP BW, and there were some personal reasons as well. Today, I am going to talk about the transport of MDM objects, especially MDM repositories, across different environments (Development, Quality, and Production).
Once a developer is done with the repository design, maps, and everything else, the repository needs to be transported to the Quality/Production environment. What is the right approach?
I would always suggest moving the repository schema with no records, or using Export/Import Repository Schema, to the Quality environment first and then finally to Production.
Strangely, some people keep asking how to move the values of all lookup tables (LUTs) to the new environment without the main table records. So what do they usually do? They take the archive (.a2a) file of the Development repository, unarchive it in the Quality environment, keep the LUT records, and manually delete the main table data for the repository, which sometimes causes a lot of issues.

Potential Issues:

1) Since you have used the archive file of the Development repository with all its records, the records may still exist in the database even after you delete the main table records using MDM Data Manager, because the database also keeps track of MDM records using an internal ID key. So although you have deleted the records in MDM Data Manager, you may still face issues while creating records there, such as "The Requested Record could not be found". After deleting main table records, you should ideally load the repository with Update Indices to refresh the record links in the database as well, but even after doing that, there is still a probability of facing other issues.
2) By keeping the LUT values during transport, you tend to skip integrating the ECC system with MDM for populating reference table values, which is not correct. If you later add values to lookup tables in a particular ECC environment, they will not be reflected in MDM, because you took the LUT values from the ECC Development environment only, not from Quality. Likewise, it can become one of the biggest threats if you don't stick to the basics and forget the same thing for the Production environment. You should configure MDMGX extraction for each environment to populate reference table values in MDM from ECC for smooth business operations.
3) Since in each environment your remote system may have a different name and different LUT key values, it may sometimes not allow you to save main table records along with LUT values.
For example: in the lookup table Countries, the country India has the key IND in the ECC Development environment with remote system ECC DEV. It may or may not be different in the ECC Quality environment. If the key for India is maintained as IN in Quality ECC, importing will lead to data discrepancies. You can then face issues such as "Error 5611520 - Error Saving Key Mapping" and "Key mapping value must be unique. You cannot overwrite key." (a small sketch illustrating this check follows after this list).
4) If you have an Auto ID field maintained in the main table of the repository, you have to find a workaround to make the Auto ID count start again from 0 (zero): if you delete the existing records from an MDM repository that you took as a complete .a2a file from the Development environment, the counter does not restart from 0 (zero) automatically.
There may be other issues you can face as well. So I would always suggest transporting a repository from one environment to another with no records, either as a schema-only .a2a file or with the help of Export/Import Repository Schema, in order to achieve a smooth implementation. One should not bother with transporting all lookup table values from one environment to another via MDM repositories, as it can lead to data discrepancies.
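To make the key-mapping threat from point 3 concrete, here is a minimal sketch in plain Python (not the MDM API; the data is hypothetical) that compares the lookup-table keys extracted from two ECC environments and flags the values that would clash after a naive .a2a transport:

    def find_key_mapping_conflicts(dev_mappings, qa_mappings):
        """Return lookup values whose remote-system keys differ between environments."""
        conflicts = {}
        for value, dev_key in dev_mappings.items():
            qa_key = qa_mappings.get(value)
            if qa_key is not None and qa_key != dev_key:
                conflicts[value] = (dev_key, qa_key)
        return conflicts

    # Country keys as extracted from ECC Dev vs. ECC Quality (hypothetical data)
    dev = {"India": "IND", "Germany": "DE"}
    qa = {"India": "IN", "Germany": "DE"}

    print(find_key_mapping_conflicts(dev, qa))  # {'India': ('IND', 'IN')}

Running such a check before importing would surface the IND vs. IN mismatch early, instead of hitting "Key mapping value must be unique" at save time.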
Note: All of the above are my personal views; I cannot furnish strong evidence to support them, nor predict every other issue you may face if you transport a complete MDM repository .a2a file (with both main and lookup table records) from one environment to another and later delete the main table records to work on the repository.

Update on MDM Info Collector

MDM Info Collector Updates

After the initial version of the MDM Info Collector (Klaus David blogged about it), which proved quite successful in message handling by delivering the contextual information required for thorough analysis and streamlined processing, the tool now comes with additional features.

The MDM Info Collector creates a comprehensive snapshot (zip file) of MDM system info that can be used later by SAP Support for offline failure analysis. This accelerates the analysis and resolution of the reported failure.
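The sketch below is purely illustrative of the snapshot idea and is not the Info Collector itself; the directory paths and file extensions are hypothetical. It shows the general pattern of bundling logs and configuration into a single zip for offline analysis:

    import zipfile
    from pathlib import Path

    def collect_snapshot(source_dirs, target_zip):
        """Bundle log and configuration files into one zip for offline analysis."""
        with zipfile.ZipFile(target_zip, "w", zipfile.ZIP_DEFLATED) as zf:
            for src in source_dirs:
                for path in Path(src).rglob("*"):
                    # Collect only diagnostic file types (hypothetical selection)
                    if path.is_file() and path.suffix in {".log", ".ini", ".xml"}:
                        zf.write(path, path.relative_to(src))

    # Hypothetical MDM server directories
    collect_snapshot(["/opt/mdm/logs", "/opt/mdm/config"], "mdm_snapshot.zip")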

What's New in Version 2.0 of MDM Info Collector

New features include:

  • Remote execution of the MDM Info Collector tool via SAP MMC (uses the "Server Snapshot" functionality)
    If you do not have access to the MDM Server machine (e.g., it is managed by the IT department) but still need to provide the MDM system info to SAP Support, remote execution is the way to go.
    From your local Windows PC, run SAP MMC to collect info from a remote UNIX/Windows MDM machine, then download the snapshot (containing the info package) to your local machine. This requires the new SAP framework (720 patch 90) and an updated MDM profile (available as of MDM 7.1 SP7 Patch 7).
  • Support for the Master Data Import Server (MDIS) and the Master Data Syndication Server (MDSS), in addition to the Master Data Server (MDS)
  • Collection of much more information than before
    This additional information accelerates SAP Support's analysis of customer messages.
  • SSL support
    As of MDM 7.1 SP7

For more info, see the how-to information (PDF) attached to SAP Note 1522125.

Hope this info is useful for you.

SAP Inside Track 2011: EIM sessions

SAP Inside Track 2011 Midwest is jointly held in two locations: St. Louis and Chicago. I'll be speaking at the Chicago meeting. Hey, did I mention that it's FREE?

Of course, my topic is Information Governance. I'm calling the session "Faster and Cheaper Information Governance. Seriously." If implementing information governance sounds like a 5-year plan involving more meetings than you can stand, then this is the session for you. Whether you are just getting started with your governance initiatives or your company is struggling to reach success with governance, we can help. In this session, we'll discuss common pitfalls of information governance programs. Most of our time, however, will be spent on how to start small to ensure future success. You'll learn how to:

  • Identify good target projects
  • Shape lean teams to accomplish the work
  • Identify which technologies will accelerate your project
  • Prove the value of the initiative to your organization

The stellar SAP Mentor Ginger Gatling will also be there. That alone is worth the trip. She wrote the book on workflow, you know. :-) Ginger has three sessions, spanning workflow, data migration, and EIM in support of workflow.

Understanding SAP Business Workflow and how it fits in with SAP's BPM Strategy: Sharpen your knowledge of SAP Business Workflow, what it can and can’t do, and how it fits into your SAP architecture. Find out how other companies are leveraging workflows to improve and streamline processes, and the business drivers that make SAP Business Workflow a critical piece of the SAP infrastructure.

Comprehensive Guide to Data Migration: This session takes a deep dive into SAP BusinessObjects Data Services and offers best practices for using both the integration and the data quality management capabilities it provides. Get step-by-step instruction for getting started with SAP BusinessObjects Data Services, such as leveraging the pre-delivered migration content and understanding the various connectivity options. Walk through a demonstration of the SAP BusinessObjects Data Services migration content for SAP ERP and SAP CRM data migrations, showing the connectivity to SAP configuration tables for data validation and how to map common structures such as customer basic data. Explore the jobs, workflows, and transformation capabilities within SAP BusinessObjects Data Services, and learn how to ensure SAP application configuration is referenced for data validation. Discover the major data quality capabilities that can ensure your data migration project enables data management and data governance. Learn the pitfalls to avoid when using SAP BusinessObjects Data Services for data migration, such as not taking into consideration the customizations (such as Z tables) within your target SAP systems. Take home links to the data migration content within SAP BusinessObjects Data Services.

Enterprise Information Management and how it relates to business processing: This session will first answer "What is Enterprise Information Management?" and then discuss what this means for the SAP Business Suite, business processing, and business process experts. The goal of this session is to provide insight into Enterprise Information Management: what it is, why it's important to business, how it fits into SAP's strategy, and where you can go to get more information.

Tammy Powlas will also be there. Join us at the SAP Inside Track 2011 Midwest meeting.

A bucket of smart ideas - Master Data Governance Framework for Enterprises

In May 2011, the SAP Master Data Governance solution, aka the embedded MDM solution, became generally available. Some brief information about the value proposition of this framework for your organisation can be read here.

Before I dive into describing some of the features of MDG that I find really useful, let me describe the primary need that this solution targets.

The need for an embedded MDM solution (embedded on the ECC system): Managing a solution built on the same or a similar system where the master data resides is highly desirable from an application architecture point of view. It reduces the complexity of the solution to be built and thus also helps reduce long-term maintenance cost, and it largely reduces the application's footprint. Reuse of components (such as out-of-the-box validations) also creates uniformity between what the user experiences when creating data conventionally and via the MDG change request framework. This in turn makes change management activities, such as end-user training, easier to handle.

Now, in my view, some of the smart features in the MDG application framework that can help you design an optimal master data governance solution are:

1. Change-request-based processes: A uniform user experience across different data domains such as suppliers, materials, etc. The framework also exposes historical data changes in the form of change documents (both from within change requests and outside of them).

2. Flexible processes: The processes (change requests for create/update, etc.) can be designed on a generic, out-of-the-box business workflow. In broader terms, this workflow solution has two components: 1. the business workflow and 2. a BRFplus application. Tied together, these two create a basic but effective BPM environment where, via quick and easy configuration, a process can be set up or modified to meet business requirements. This again helps expedite application building and also reduces process maintenance turnaround times.
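As a toy analogy for this decision-table style of routing (plain Python, not BRFplus; the request types and steps are made up), consider how a small table of rules can determine the next workflow step for a change request:

    # First matching row wins; "*" acts as a wildcard.
    DECISION_TABLE = [
        {"cr_type": "CREATE_MATERIAL", "priority": "HIGH", "next_step": "SENIOR_STEWARD"},
        {"cr_type": "CREATE_MATERIAL", "priority": "*",    "next_step": "DATA_STEWARD"},
        {"cr_type": "*",               "priority": "*",    "next_step": "REQUESTER_REVIEW"},
    ]

    def route(cr_type, priority):
        """Return the next workflow step for a change request."""
        for row in DECISION_TABLE:
            if row["cr_type"] in (cr_type, "*") and row["priority"] in (priority, "*"):
                return row["next_step"]

    print(route("CREATE_MATERIAL", "LOW"))  # DATA_STEWARD

Changing the process then means editing table rows rather than code, which is exactly why such a setup keeps maintenance turnaround short.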

3. Data model and related UI views and validations: The MDG solution needs a data model that mirrors the relational model of the master data in question for creating the change request UI, the staging area (explained below), and validations (both out-of-the-box and custom). The smart thing here is that the application inherits the master data validations directly into the change requests, and also gives the developer the option to code validations via BAdIs or through BRFplus rule sets. The framework can also prevent data duplication via configuration.
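Here is a minimal sketch of the two validation paths just described (plain Python, not MDG code; the field names are invented): a rule-set-style mandatory-field check plus a duplicate check against already approved records:

    def validate_material(record, existing_names):
        """Return a list of validation messages; an empty list means the record passes."""
        errors = []
        if not record.get("description"):
            errors.append("Description is mandatory.")  # rule-set style field check
        if record.get("name") in existing_names:
            errors.append("Duplicate: material '%s' already exists." % record["name"])
        return errors

    approved = {"PUMP-100", "VALVE-200"}
    print(validate_material({"name": "PUMP-100", "description": ""}, approved))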

4. Staging area: Work-in-progress data is not stored in the master data tables but in a temporary database until it is completely approved by the responsible user. This helps control the availability of data in transactions.
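To illustrate the staging pattern (again plain Python, not MDG's actual persistence): work-in-progress records live in a staging store and only reach the active master data on final approval:

    staging, active = {}, {}

    def submit(cr_id, record):
        staging[cr_id] = record        # visible only within the change request

    def approve(cr_id):
        record = staging.pop(cr_id)
        active[record["id"]] = record  # now visible to business transactions

    submit("CR-001", {"id": "MAT-42", "description": "Hex bolt M8"})
    approve("CR-001")
    print(active)  # {'MAT-42': {'id': 'MAT-42', 'description': 'Hex bolt M8'}}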

5. Smart UI: The UI for the embedded MDM solution is primarily ABAP Web Dynpro based and can easily be configured using FPM configurations. The resulting views can be the same as the conventional master data maintenance transactions (SAP GUI) or can be modified per business requirements.

6. Integration with Enterprise Search: Easy information search options.

7. EHP5-added functionality like the Business Context Viewer (BCV): The BCV can be used as a sidebar add-on in the NetWeaver Business Client (NWBC) for linking key master data elements to other relevant data and viewing them in the different available options. For example, data relevant to a given material can be obtained from a PLM or CRM system. With this, information integration becomes seamless and the representation of data is better.

8. Deployment options: MDG can run standalone on one system (ECC 6.0 with EHP5) or on top of an existing ECC 6.0 system by updating it with EHP5. The MDG system can also integrate with non-SAP systems to create one enterprise solution. More about this can be read here.

With these features, SAP has given customers an option to create solutions that are robust and do not compromise on data quality, but at the same time are flexible and have a low cost of development and maintenance.

Master Data Management Webinar - Part 4
