SAP Master Data Management - Info Update

SAP Master Data Management has evolved from a single product to a holistic solution suite facilitating all master data scenarios and use cases, both in SAP Business Suite-centric environments and heterogeneous multi-system landscapes.

In this Webinar, SAP MDM Vice President, Gerd Danner, provides you with an up-to-date picture of SAP's MDM solution portfolio and informs you about what is planned in the near future. In particular, the Webinar will cover the following topics:

SDN Webinar - Agenda

SAP Solutions for Master Data Management - Info Update

October 5, 2010
15:00 - 16:00 GMT
17:00 - 18:00 CEST (GMT + 02:00)
08:00 - 09:00 Pacific Time (US & Canada)
1. Master Data Management - Brief Introduction
2. SAP Solutions for Master Data Management - Overview
3. Release-specific Information
4. Summary
5. Q & A Session

To join the 1-hour SAPConnect session, please find the following dial-in information:

Session Link:

https://sap.emea.pgiconnect.com/mdmupdate/

Participant Code:

7480784957

Telephony Information:

  • Germany, Frankfurt: +49 69 71044 5497
  • Switzerland, Zurich: +41 43 456 9604
  • UK, London: +44 20 7153 9921
  • France, Paris: +33 1 70 99 43 40
  • Italy, Milan: +39 02 3041 0328
  • USA, New York: +1 212 999 6675
  • Brazil, Sao Paulo: +55 11 3351 7063
  • China: +86 400 810 2682
  • India, Bangalore: +91 80 6127 5001
  • Japan, Tokyo: +81 3 5767 9356
  • Singapore, Singapore: +65 6622 1189
The Webinar will be held in English.

SAP MDM Tutorials | SAP MDM Training | SAP MDM Interview Questions |SAP MDM Books

How important is Sarbanes Oxley (SOX) to the Procurement function: Is Compliance really an Opportunity ?

SOX, better known as Sarbanes-Oxley, is as dry as "the desert"; I've seen people shy away from any training provided in this genre around compliance.

…Don't laugh at me; that's how you'd feel sitting through seminars or sessions related to it. That's the public opinion, at least.

Well, for me it was positive: the follow-on buzz of the seminar pumped my enthusiasm to begin research on the SEC's (Securities and Exchange Commission) strategy and requirements for Sarbanes-Oxley and the procurement function.

I don't want to rehash the shady state of affairs that affected Enron and others; that's a stale tale.

Nevertheless, I went the extra mile on the "walnut" and tried looking at it from various angles; trust me, the first question in my current SRM implementation at an oilfield services major was around SOX controls in procurement.

Well, there lay a target audience that I could now hold forth to :)

I gave them only 4 quadrants of gyaan that they had to know to "know it all". All I had to communicate was this: please view "compliance as an opportunity" and not as a threat or an axed burden.

What’s the Dope

  • A 101 on how procurement controls impact Sarbanes-Oxley

  • The sections in the Sarbanes-Oxley act that are a focus area for procurement

  • Whether the SRM solution they've chosen caters to the audit points; customers want prescriptions that are implementable, reportable and, biggest of all, "auditable"

…A little more on the length and breadth of these artifacts:

1) A 101 on how procurement controls impact Sarbanes-Oxley

Procurement processes create hundreds, if not thousands, of financial transactions every day.

What are the procurement audit points and the business objects that are impacted? See the illustration below.

Dun & Bradstreet integration for supplier relationships is now a necessity, more than a "nice-to-have" feature, in SRM solutions.

image

2) The sections in the Sarbanes-Oxley act that are a focus area for procurement

The answer is very simple: it's about satisfying the 4 sections, the 4 critical compliance quadrants.

The fact that these controls and their cause and effect have become a CPO and CFO agenda item very clearly reveals that all elements of compliance are under the CCTV now; there's no hiding.

Trust me, these days there are trained auditors for enterprise SOX audits who can ask you very uncomfortable questions, and you need to be prepared with an answer.

image

3) Does the SRM solution they've chosen cater to the audit points? Customers want prescriptions that are implementable, reportable and, biggest of all, "auditable".

After giving them all the dope, customers new to Sarbanes-Oxley ask you very simple questions:

  • Are we SOX compliant with the current package that we're implementing?

  • What are the key questions that auditors will ask us?

  • How prepared are we? Is our readiness factor healthy? Do we need more time, more resources, more money? What do you suggest?

The answer to this is easy these days, with almost all new releases or product lines across various package vendors bundling the features canned and auditable. To new customers, all of this looks very jazzy in the beginning; they will give 99% of the credit to themselves for having been consulted in the very first place.

If it's a veteran oldie to SOX: they will bombard you with SOX reporting requirements across sections 302, 401(a), 404 and 409, to be delivered either out of the box or via some custom development, because they need a leading edge to address auditability. What they fail to understand till date is that it's not about getting a heavy-duty compliance framework; it's actually about understanding what's expected of basic procurement control, building traceability into processes and, more importantly, following them end to end without break-points to derive the maximum!

Please do visit the SEC website to see sample audit questionnaires on SOX; you will also get to read a recent whitepaper published by SAP on automating SOX audit testing.

I know some of you would definitely read through till here, hence the last but not the least, or let's say the most "interesting", statement that I've ever heard about Sarbanes-Oxley:

Compliance and Good Internal Controls is no longer Best Practice… It is the Law!

…If SOX interests you and you want to know more, do collaborate on any follow-on; I'd be glad to help.

Do write to me @ tridip.chakraborthy@cognizant.com

Better preparation will get you past the auditors and, believe me, it will translate into benefits sooner or later; it continues to benefit my clients.

System Landscape Governance @ TechEd 2010

Concerned about increasing landscape complexity?
Facing challenges when implementing new innovations within your SAP Business Suite system landscape?
Uncertain about version dependencies when updating parts of your system landscape?

If your answer to one of these questions is 'yes', TechEd session ALM109 (Berlin, Las Vegas) might be able to dispel some of your doubts. Just have a look at how system landscape governance by SAP will ease the use of SAP products.

The session outlines SAP's approach and current offerings to reduce complexity in system landscapes covering multiple SAP applications. Learn how SAP is tackling the most important challenges customers face when designing and managing their system landscapes. Understand how landscape governance will help to standardize and automate the management of your system landscape. Furthermore, we will provide an update on the SAP System Landscape Governance Board, covering the newest deployment recommendations on how to design system landscapes for certain SAP products.


SAP Mentor Podcast: Interview with Maria Villar, SAP’s Line of Business Executive for Data Management (Part I)

"What I've come to learn is that the way to get really good data management in companies is to build it into the business process." - Maria Villar, Global VP of Data Management at SAP

When I wrote about the schism between data and business process, I asserted that data management was a business process, actually a sub-process of larger business processes. In that blog, I also shared my observations of how SAP customers were doing a good job, or a not-so-good job, of managing EIM processes. But I failed to pose or answer a very important question: how do companies get past this schism and develop a strategy for managing their information in business processes?

The answer is simple:  they organize for it.

Maria Villar, Global VP for Data Management at SAP

And just recently, I learned how SAP organized for EIM when I met Maria Villar, SAP's Global Vice President for Data Management, reporting to SAP's Global Field Organization. Despite her geeky-sounding title, Maria is not in SAP's IT organization. She's the ranking executive within SAP's sales and service line of business in charge of maintaining and improving SAP's information assets.

In BPM parlance, this makes Maria SAP's "Business Process Expert" for SAP's EIM business processes.

So naturally, within 15 minutes of meeting her, I was already asking to interview her in a podcast.

So I'm happy to present Part 1 of my interview with Maria Villar as she describes her role within SAP, her discussions with SAP customers and their EIM strategies, and the importance of managing information within business processes.

Listen to the podcast here (12 minutes):
Podcast Summary

Here is a short summary of the questions I ask Maria in this podcast:

0:00        Introductions

0:40        Explain the role of your team in SAP

1:40        What are the data issues that SAP is focusing on as a priority?

2:04        How are we measuring the impact and improvement of data quality programs in relation to these priorities?

2:40        So what are the kinds of things that SAP needs to improve in terms of managing our own data quality?

3:27        I recently profiled a data migration project in SAP IT utilizing SAP BusinessObjects Data Services and SAP NetWeaver BPM related to the acquisitions of Technidata and Sybase.  What role is your team playing in these projects?

5:43        I frequently write about the need for alignment between business and IT in BPM.  How important is this for an EIM strategy?

6:56        Tell us about the kinds of discussions you have with SAP customers in terms of developing their strategies for enterprise information management?

8:38        What SAP customers have you talked to that you feel have a high degree of maturity in their information management strategies?

9:21        What, in your opinion, is the connection between business process and enterprise information management?

11:49     End of podcast


Survey on recommended architecture types for Enterprise Information Management

Information and data management are the key enablers of insightful business decisions in today's business across practically every industry. Even setting data storage aside as a separate topic, the information user today still faces a myriad of products, each promoting a technology platform of its own, in addition to homegrown data connectivity and data mining solutions. What is the point of all these choices when a customer is not sure which works best for his organization?

In our continuing efforts to help our customers standardize and better manage their enterprise information, we have launched an architecture recommendation program: the Enterprise Information Management Architecture Program. This program is a platform for us at SAP to interact with our customers, to understand their top information management needs and to help them build these needs into their environment through careful and practical suggestions of enterprise information architecture patterns and supporting tools.
As a founding activity of this project, we wish to collect existing architectures from you, the EIM users and customers, to get insights into your
• current BI / EIM architecture
• key requirements
• interests on enhancement and future invests in BI and/or EIM area

We encourage you to take part in this short survey, which takes no longer than 5 minutes. Your response will help us understand your needs and expectations, and with that we can optimize our product strategy.
Please click here to take part in the survey.
If you decide to receive updates from us, please provide your contact details. We are looking forward to cooperating with you and to sharing the outcome of this initiative with you.
Many thanks for your participation!


MDM Expressions: Use of Callable Function in Search Capabilities (Part 2)

In continuation of the previous blog, MDM Expressions: Use of Callable Function in Search Capabilities (Part 1):

Here I elaborate a scenario with complete step-by-step screen-shots for your understanding.

I have four records in my Main table, each linked to the Category Bottle. Bottle has two Attributes, named Color and Material Type. I want to search for a record fulfilling the criteria that Color has the multi-values RED;BLUE;YELLOW and Material Type has the two values Copper;PVC, as shown in the below screen-shot for your reference: LEXAN ID 300.

image

Now have a look at the other three records as well, with their linked Category and Attribute Values: LEXAN ID 900.

image

LEXAN IP 300

image

LEXAN IP 600

image

Now, searching on these criteria is not possible using either Free Form Search or Drill Down Search. This is where the Callable Function becomes important and adds significant search capability via Expressions under Free Form Search.

Now create a validation named Callable Function, as shown in the below screen-shot.

image

Now right-click on the validation Callable Function->Add Branch and select the Property Branch Value Bottle, as shown in the below screen-shot.

image

In this validation, Callable Function [Bottle], write the expression as shown below:

image

Now, for searching, you can use this callable function in the Expressions of Free Form Search. To search for a record, go to Free Form Search->Click on Expression, as shown below.

image

An Expression pop-up window will appear, and under the Function drop-down list you can select the Callable Function, as shown in the below screen-shot.

image

Once that is done, you will get the desired result as shown below.

image

This is one example of where a Callable Function can be used to search records; Callable Functions are very handy for complex record-search requirements. Similarly, you can make use of Callable Functions in other ways as per your requirement.


MDM Expressions: Use of Callable Function in Search Capabilities (Part 1)

Around the corner, we have this property available in the Validations tab of MDM Data Manager while defining a validation. As the name "Callable" suggests, it is a reusable function which can be used in validations, assignments and more.

Callable Function: Whenever you set the property Callable = Yes for any validation, as shown in the below screen-shot, you can then call this function in validations/assignments.

image

Now this Callable Function will be available in the drop-down list of functions, as shown in the below screen-shot.

image

Now everyone is wondering: is this the only information I wanted to share? Obviously not. I wanted to tell everyone that, beyond using this Callable Function for validations and assignments, there are other things you can do with it, like searching.

Multiple Searches: Sometimes there is a requirement for filtering records with multiple search criteria. Callable Functions give you quite a good capability for searching on multiple criteria without too much headache.

E.g. I want to syndicate or search a record with the following criteria: it has the linked Taxonomy value "Bottle", and this Bottle is linked with two Attributes, Color and Material Type. I want to search for the record "Lexan ID 300" in which the Attribute Color has the values RED; BLUE; YELLOW and Material Type has the values PVC;Copper, as shown in the below screen-shot.

image

This is not at all possible using Drill Down Search or Free Form Search, as it is a multi-value search with an AND operator, but it is quite possible using Callable Functions. This is just an example; you can also do a lot of other things with Callable Functions.

Let's see how to make use of it through this blog.


Enter Semantics: An Expert System for Automated Monitoring – Part 1

In this weblog series I'll show how to create an expert system for automated error analysis using semantic technology. The aim is to store information about an error situation in a wiki. An expert system reads information about an error from an SAP system and classifies the error condition. The result is a number of wiki pages (in the best case there is only one) that contain a more detailed description of the error situation, and hopefully a guideline on what to do. So the wiki is both the knowledge database of an expert system and a knowledge management tool for administrators that can be used in a collaborative way.

So what are the functional requirements of an expert system?

  • An expert system must have the possibility to "look inside an SAP system". This is easy, because protocols can be exposed using REST web services.
  • An expert system has to be able to extract the error information from a wiki.
  • And last but not least it has to be able to classify the protocol using the knowledge base from the wiki.
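As an illustration of the first requirement, here is a minimal Python sketch of the kind of JSON payload such a REST service might return for a BAL protocol, and how the expert system could consume it. The field names are assumptions for illustration, not an actual SAP API.

```python
import json

# What the SAP-side service might serialize for one BAL protocol.
bal_protocol = [
    {"object": "FICA", "subobject": "FPG1", "msg_class": ">U",
     "msg_number": "425", "type": "E"},
    {"object": "FICA", "subobject": "FPG1", "msg_class": "F5",
     "msg_number": "808", "type": "E"},
]

payload = json.dumps(bal_protocol)  # the REST response body

# Expert-system side: keep only error messages ("E") and the keys
# that identify them, i.e. (message class, message number) pairs.
errors = {(m["msg_class"], m["msg_number"])
          for m in json.loads(payload) if m["type"] == "E"}
print(sorted(errors))  # [('>U', '425'), ('F5', '808')]
```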
The use case

SAP systems possess very powerful monitoring capabilities. The Solution Manager is a central hub for error diagnosis in a system landscape. The systems can communicate using different protocols.

Within an SAP system, information about error situations can be found in different places: dumps can be found in transaction ST22, and error messages can be found in the Business Application Log (BAL for short) using transaction SLG1:

BAL protocol

Here we have a lot of red error messages, which indicate a problem. Each message can be identified using a message class and a message number:

Error message

As seen above, a message is contained in a protocol that is associated to a BAL object resp. subobject.

An error protocol can be quite long because of repetitions, but usually a relatively small number of error messages is significant for an error. Here these are the following two error numbers:

BAL Object | BAL Subobject | Message Class | Message Number | Type
FICA       | FPG1          | >U            | 425            | E
FICA       | FPG1          | F5            | 808            | E

These are stored as a table in a wiki as you can see in the following picture:

Semantic MediaWiki

Here we can use any MediaWiki or an add-on. Using categorization, we can organize the knowledge database. Here I am using a semantic wiki because of its superior classification techniques.

The expert system

The expert system I'll introduce within this blog is a Groovy application that reads a BAL protocol from an SAP system and extracts the knowledge database from a wiki, i.e. a MediaWiki. From these data it constructs an ontology. This ontology contains a set of classes, and each class corresponds to an error situation. Using a reasoner like Pellet, we classify the error messages.
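For intuition, the classification step can be approximated in a few lines of Python: treat each wiki page as a signature, i.e. a set of (message class, message number) pairs, and report the pages whose signature is fully contained in the protocol. This is only a sketch of the idea, with hypothetical page names; the actual implementation described here uses an OWL DL ontology and the Pellet reasoner.

```python
# Knowledge base extracted from the wiki: page title -> message signature.
knowledge_base = {
    "Payment run failed": {(">U", "425"), ("F5", "808")},  # from the table above
    "Posting blocked":    {("F5", "999")},                  # hypothetical page
}

def classify(protocol_messages, kb):
    """Return the wiki pages whose full signature occurs in the protocol."""
    found = set(protocol_messages)
    return [page for page, signature in kb.items() if signature <= found]

# A protocol with repetitions, as BAL protocols typically have.
protocol = [(">U", "425"), ("F5", "808"), (">U", "425")]
print(classify(protocol, knowledge_base))  # ['Payment run failed']
```

The ontology-based version expresses the same containment test as class subsumption, which lets the reasoner handle richer conditions (disjunctions, cardinalities) than this set comparison can.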

Prototype

In the next part I will explain how to design a knowledge base using OWL DL. Later I discuss how to extract data from a wiki, build the ontology and perform reasoning.

The vision: automated monitoring of an IT landscape

Of course this is only a prototype, but you don't need much fantasy to imagine what a product would look like: an expert system constantly reads data from SAP and non-SAP systems using various protocols (think of SNMP, for example), classifies those messages and helps an administrator do his job.

product

Outline of this blog series

In the next part I present an introduction to ontologies. Then I discuss data extraction from a wiki and how systems can expose data about their status. In the last part of this series I show how to perform reasoning and put everything together into a prototype.


Improving your Data Quality by Integrating SAP NetWeaver MDM and SAP Business Objects Data Services

In this Webinar, "Improving your Data Quality by integrating SAP NetWeaver MDM and SAP BusinessObjects Data Services", Klaus David talks about several scenarios combining SAP NetWeaver Master Data Management and SAP BusinessObjects Data Services that result in improved data quality. In addition, there is an outlook on what's planned for the SAP NetWeaver MDM 7.3 release in regard to these capabilities.

In brief, there is a DB View Creation package provided by SAP which needs to be installed on the MDM Server machine and which utilizes the Java APIs. With this package you can create DB views of any MDM slave repository. Once this creation step has been done, you can access MDM data from BusinessObjects Data Services. Using BusinessObjects Data Services you can cleanse data, check for duplicate records by running matching, and much more.

Limitation: So far, you cannot use DB views with a master repository.

Note: The DB View package is not available on the Service Marketplace. For this, you need to get in touch with SAP.

With MDM 7.3, there is a plan for pre-built integration of MDM with SBOP DS XI 4.0.

Please check this Webinar by Klaus David for complete details.

I found this Webinar very useful and informative, so I just thought of sharing it with all of you.



If We Can Put a Man on the Moon, We Can Have Good Asset Information

Information underlies performance in every modern organization.  Customer information is the lifeblood for consumer-facing businesses, as it enables effective marketing and service programs.  Product information plays a similar role in the discrete manufacturing world and enables coordination of complex networks of suppliers and distributors.  Asset information represents the "family jewels" for asset-intensive industries, like process manufacturing, infrastructure, and energy.  It enables them to optimize returns from their enormous investments in equipment and facilities.

image

To drive value from information, organizations need to ensure it is always accurate and up to date.  Recognizing this, leading consumer and manufacturing organizations have established customer information management and/or product information management programs.  Solution providers also support this with complete information management suites that address the unique challenges in these areas.   

While the need is analogous, asset information management (AIM) has not received the same attention from owner/operators or solution providers.  It's not that they do not appreciate the importance of good asset information, just that they have not yet recognized the breadth of the issue and the significant benefits that can be reaped through information sharing across asset lifecycle management (ALM) groups.  So, in place of an effective set of common policies, practices and technologies, the typical asset-intensive organization struggles with a motley collection of inconsistent, siloed information management programs that create unnecessary risks for the organization and frustrate everyone's efforts to improve performance.

Today's AIM situation is similar to how organizations originally managed their customer and product information.  Not surprisingly, we see the same results - excessive errors, inefficiencies, and customer complaints.  So, it is reasonable to expect that the problems with AIM can be solved by applying the lessons learned in these other areas.  In most cases, organizations can also leverage their investments in people, processes and technology for other enterprise information management programs to kickstart an effective AIM program.

While there are many similarities, asset information still differs from customer and product information.  So, an AIM program requires certain AIM-specific practices and technologies.  Most asset-intensive organizations already have what's needed, they just have to refine and reconcile multiple versions to remove conflicts and ensure consistency and compatibility across all ALM groups.  Some new solutions may also be required, but they can generally be justified on the basis of enabling the organization to introduce more efficient, industry best practices into their ALM program.   

AIM has been a recurring theme for ARC in our reports and annual forums.  We have shown how poor AIM can cost the typical asset-intensive organization 1.5 percent of its revenues and unnecessarily increase organizational risks.  We have outlined the elements of a good AIM program and the steps that organizations can take to unleash the tremendous value that lies hidden in their asset information.  The analogy between AIM and other enterprise-wide information programs shows why AIM deserves top management support and that it can be achieved through an extension of existing programs, rather than a new "big bang" IT initiative.

We believe that the case for action is clear and justifiable.  It's the will to act that is lacking.   For that, I offer some George Bernard Shaw quotes that have inspired me:

"People are always blaming their circumstances for what they are. I don't believe in circumstances. The people who get on in this world are the people who get up and look for the circumstances they want, and, if they can't find them, make them."

"The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends upon the unreasonable man." 

"You see things; and you say, 'Why?' But I dream things that never were; and I say, 'Why not?'"


Special Characters Removal Using MDM Import Manager

With the latest market trend, MDM-BO integration is emerging, as it results in cleansed data and a lot of other benefits. On the other hand, we don't have any function available under MDM Expressions in MDM Data Manager for removing special characters. So, other than MDM-BO, do we have any other alternative? Everyone will have different views on it. But I still feel that MDM Import Manager is a very impressive tool for removing special characters effectively and doing other conversions before data enters MDM.

So I want to describe here some of the capabilities and functionalities of MDM Import Manager for cleansing data before it enters MDM.

When you right-click on a source field, you have the property Set Value Conversion Filter->Multiple, with the following filters:
    1. Apply Operator
    2. Normalize
    3. Change Case
    4. Replace
    5. Accept Truncated Value

Points to be noted:
1. One of the most important aspects of working with these properties is the type of the field defined in MDM.
2. You can set these properties only after mapping your source field with the target field.

Now, if you map your source field with an MDM field of type Integer:
After right-clicking on the source field->Set Value Conversion Filter->Multiple, you can work with Apply Operator only.
You can set an operator for your source values: +, -, x, /, Equals, Null Equals, Round, Ceiling, Truncate.

These functions are very useful if you want your source field values to be, say, multiplied by a certain number before coming into MDM; there we can use the multiply operator "x". Similarly, the other operators can be used as per requirement.

Now, if you map your source field with an MDM field of type Text:
After right-clicking on the source field->Set Value Conversion Filter->Multiple, you can work with Apply Operator (Append and Prepend only) as well as the other filters: Normalize, Change Case, Replace and Accept Truncated Value.

Here I will discuss only Normalize, Change Case and Replace functionalities.

Sometimes we have a requirement to remove special characters like @, &, #, $ etc. So the question is when to go for the Normalize conversion filter and when to go for Replace.

If the requirement is to remove everything, including spaces, then one should go for Normalize.
E.g. if the field has the value Lexan$@ ID& 300,
the outcome after normalizing would be LEXANID300.

You can also make use of Change Case (UPPER CASE, lower case, Title Case etc.) along with Normalize, as per your requirement. If you set Change Case = Sentence case along with Normalize, you will get the output Lexanid300.
But if you want to keep spaces between tokens and only want to get rid of special characters, you should go with Replace. Another benefit: you can remove specific special characters if required; in other words, custom special-character removal.
E.g. if the field has the value Lexan$@ ID& 300 and the requirement is just to remove the special characters @ and &, you retain $ as well as the spaces between tokens. Set Replace as shown in the below screen-shot.

image

You will get the desired result, as shown below, where the symbol $ and the spaces between the tokens are maintained.

image
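To make the difference between the two filters concrete, here is a rough Python approximation of the behavior described above; it only mimics the observed results and is in no way the Import Manager's implementation.

```python
import re

value = "Lexan$@ ID& 300"

# Normalize: strip everything that is not a letter or digit (spaces included);
# combined with Change Case it also folds the case.
normalized = re.sub(r"[^A-Za-z0-9]", "", value).upper()
print(normalized)  # LEXANID300

# Replace: remove only the specific characters @ and &, keeping $ and spaces.
replaced = value.replace("@", "").replace("&", "")
print(replaced)    # Lexan$ ID 300
```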

I am not sidelining the benefits of MDM-BO integration; I just wanted to make everyone aware that MDM Import Manager is also proficient, to a certain extent, in cleansing data. I hope this blog will be useful and informative for everyone, especially MDM beginners.

 


Data Migration: Installing Data Services and Best Practice Content

In a previous blog I discussed the options for data migration and the technologies used. We stated that SAP delivers software and content for data migration projects. In this blog I'll discuss the installation of the Data Services software and the loading of the best practice content.

In order to get started you can use online help or the bp-migration alias on SMP. In this blog I used the best practices link in online help. There is a lot of great collateral here to take advantage of. The combination of the content download (bp-migration) and the online help gives you everything you need to get started! It will take you to the ERP quick guide that literally walks through what you need, including a silent installation. I tried the silent installation with no luck, so I went to the appendix and did the steps for the manual installation. I'm sure the silent installation works if everything is set up correctly, but I wasn't too worried about it, since I wanted to see the required steps for the manual installation.

Experience with the manual installation

With the manual installation you install Microsoft SQL Express and Data Services, and do things like create specific repositories, load best practice content into Data Services, and ensure everything is ready to run. I had to do the manual installation between meetings, so it took me a couple of days, but in man-hours it probably only took half a day, and a lot of that was me reading the guide to ensure it was done correctly. There are a lot of scripts that you need to run, so I'd suggest setting up the passwords to be the same as what would be set in the silent installation.

The installation of Microsoft SQL Express and Data Services is very straightforward. I did them on my laptop, which has other SAP software, and didn't run into any problems. I couldn't find a trial download of Data Services on SDN; hopefully your company has already purchased the product. Someone mentioned you can download it and then just get a temporary key from http://service.sap.com/licensekey, but I'm not sure. I also think there is a version of BI-Edge that includes it, but it's not in the basic BI-Edge; you at least need the "BI Edge with Integrator Kit". If you have experience with using Data Services as a trial, please post a response and share!

Once the basic installation is done, you start to import the content into Data Services. The content is grouped into RAR files that you download and unzip. The import happens into a couple of Data Services repositories. The first repository holds all the lookups that are required during the migration. The first content I unzipped had spreadsheets for IDoc mapping and files that will be used for doing lookups to validate data; all of this goes into the lookup repository. One example is the following screenshot. It is for HCM data, and it shows a description of each field, whether a lookup is required, and whether the field is required, along with other information.

Spreadsheet with HR IDOC information

At this point I wanted to know what else is delivered to help understand the IDocs, so I explored a couple of things. The first is the documentation with the best practices in online help. In online help you'll find sample project plans and links to the ASAP methodology that has data migration content. I haven't explored the project management content in detail; I'll do that and blog about it soon. You can also download the documentation from the Software Download Center (SWDC) on the Service Marketplace. The documentation is the same in online help and in the download area on SWDC. It includes a Word document for each of the objects. For example, in the HCM case mentioned above, the Word document is 32 pages and includes descriptions of the IDoc segments, number ranges, and process steps.

After the import I had datastores, jobs, and project content in Data Services. Examples are below:

Project and jobs were created for building lookup tables in the staging area for the relevant migration objects. These lookup tables will be used to validate the data when the migration jobs execute.

Delivered project

Data flows created in data services for the lookup table creation jobs:

image

Datastores were also created. The datastores provide local storage for temporary data for the migration, as well as linking to the SAP target system. I haven't yet created a datastore for the source system. The next step will be to update the DS_SAP datastore with the actual SAP system to be the target system for the migration.

Data stores

The second repository holds all the migration jobs that produce the IDocs going to the SAP system. Once the import for this repository was completed, I had a project with jobs related to the IDoc structures:

Project and jobs for migrating data

From this screenshot you can see the jobs for some of the content objects, such as accounts payable, bill of material, cost elements, etc.

That was pretty much it for the manual installation. The last piece was to add the repositories to the Data Services Management Console, a web-based way to schedule and monitor the jobs. OK - now I'm ready for the post-installation work! Look for the next installment to discuss the post-installation steps and start using the content!


SAP Netweaver MDM Trends and changing role of MDM consultants Part1.

There have been multiple queries about the newly added features in MDM after the recent activities: SAP acquiring Business Objects, the use of BPM for cMDM and collaborative master data processes, and enhanced features such as the WS generator and the Web Dynpro configurator, all emerging trends in MDM 7.1.

In this series of two blogs I will try to list all the newly added technologies and scenarios, and we will also see how they change the MDM consultant's role in a project, what new skills and knowledge are required, and so on.

Until 2007-08 the focus was more on the MDM core services, such as improving the performance of the MDM servers and the import and syndication processes. A year later we saw MDM 7.1 released, and within a short span of time we now have 5 patches. The top differentiators between MDM 5.5 and 7.1 are:

1. Flexibility in data modeling - By introducing the Lookup (Main) and Tuple data types, we can now model almost anything in an MDM repository. Deeply nested structures, cross-linkages between multiple entities, and relationship data, all of which we could only dream about in 5.5, are now easily achieved in 7.1.

2. CTS+ ensures improved administration and life cycle management for MDM projects. It also increases Project Life Cycle auditing compliance and monitoring.

3. Others worth mentioning are enhanced security measures (user passwords, activity logs), change tracking support for qualified tables, and the use of DB views on slave repositories for specific enterprise reporting.

4. MDM 7.1 offers new functionality via the MDM PI Adapter, which enables direct communication between MDM and PI. The adapter delivers data to and pulls data from MDM using the MDM Java API port functions. This enables tighter MDM-PI integration as well as message monitoring. It also supports asynchronous data flow (exactly-once model) and acknowledgments from the target systems.

In part 2 of this blog we will focus on the changes in the process layer and the UI layer that are required for master data management: managing the CRUD operations, data migration from SAP and non-SAP sources, and enhanced data quality. We will also see how this impacts MDM consultants working on MDM programs.


Matching & Merging: Equals Vs Token Equals (Part 2)

In continuation of the previous blog: Matching & Merging: Equals Vs Token Equals (Part 1)

Through this blog, I will try my best to make the difference between Equals and Token Equals easily understood by everyone, and to show how the scores can come out different from what we set when Function = Token Equals.

Working with function "Token Equals"

By now you know that for executing matching, you need to define a Transformation, a Rule and a Strategy.

Open MDM Data Manager, Go to Matching Mode: 

1. Define a Transformation :

image

2. Define a Rule: In this Rule, "Matching Material Description", I have included the Transformation just created above and set Function = Token Equals and the other parameters as shown in the screenshot below.

image

3. Now define a Strategy: In this Strategy, I have included the Rule "Matching Material Description" created above and set the other parameters as shown in the screenshot below.

image

Now coming to Data Manager: Record mode.
I have four records with values for the field Material Description, as shown below:

image

Since we have set the Rule property Function = Token Equals, it will treat these 3 tokens as separate (distinct) tokens. Let's see what scores it shows when we execute the Strategy.

image

After executing the Strategy, in Matching mode we get the following scores:

image

Now you may be wondering: for Material Description "Lexan IP 300" we get the expected score of 100, which we defined in the Rule, but what about the other two records? How does the score of 20 arise for Material Description "Lexan ID 900", and why is the score 50 for Material Description "Lexan IP 600", since we never set these scores when defining the Rule?

The logic is pretty simple:

First, in the Token Equals function each token is considered a separate identity, unlike the Equals function, where the 3 tokens are treated as one single token.

Second, the score is based on the following formula:
Score = Success * Number of Unique Matching Tokens / Total Number of Unique Tokens

Now let's see the score for each of these Material Descriptions:

For Material Description : "Lexan IP 300" with "Lexan IP 300"
Number of Unique Matching Tokens: 3 (Lexan, IP and 300)
Total Number of Unique Tokens: 3 (Lexan, IP and 300)

So Score: 100* 3 / 3 = 100.

For Material Description: "Lexan ID 900" with "Lexan IP 300"
Number of Unique Matching Tokens: 1 (Lexan)
Total Number of Unique Tokens: 5 (Lexan, ID, IP, 300 and 900)

So score: 100* 1 / 5 = 20

For Material Description: "Lexan IP 600" with "Lexan IP 300"
Number of Unique Matching Tokens: 2 (Lexan and IP)
Total Number of Unique Tokens: 4 (Lexan, IP, 600 and 300)

So score: 100* 2 / 4 = 50

My primary objective in these blogs is to show the score calculation. So I hope you will no longer wonder if you find a matching score different from the one you defined in the Rule when using the property Function = Token Equals.
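The calculation above can be sketched in a few lines of Python. This is my own illustration of the formula, not MDM code; the function name and the use of whitespace tokenization are my assumptions:

```python
def token_equals_score(success, candidate, reference):
    """Score two values the way the Token Equals formula above describes:
    Success * unique matching tokens / total unique tokens."""
    cand_tokens = set(candidate.split())
    ref_tokens = set(reference.split())
    matching = cand_tokens & ref_tokens   # unique matching tokens
    total = cand_tokens | ref_tokens      # total unique tokens across both
    return success * len(matching) / len(total)

print(token_equals_score(100, "Lexan IP 300", "Lexan IP 300"))  # 100.0
print(token_equals_score(100, "Lexan ID 900", "Lexan IP 300"))  # 20.0
print(token_equals_score(100, "Lexan IP 600", "Lexan IP 300"))  # 50.0
```

Running it reproduces the three scores from the screenshots above.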

References:

MDM Data Manager Reference Guide

 


Matching & Merging: Equals Vs Token Equals (Part 1)

I came across a thread where the author was concerned about the scores of matching duplicates. He was getting matching scores different from what he had set when executing Strategies. He just wanted to know why this happens, and so did other SDN MDM members.

I divide the difference between "Equals" and "Token Equals" into two parts. This blog contains information about executing Strategies with Function = Equals. I will try my best to make the difference easily understood by everyone.

Working with Function "Equals"

For executing matching, you need to define a Transformation, a Rule and a Strategy.

Open MDM Data Manager, Go to Matching Mode:

1. Define a Transformation :

image

2. Define a Rule: In this Rule, "Matching Material Description", I have included the Transformation just created above and set Function = Equals and the other parameters as shown in the screenshot below.

image

3. Now define a Strategy: In this Strategy, I have included the Rule "Matching Material Description" created above and set the other parameters as shown in the screenshot below.

image

Now coming to Data Manager: Record mode.
I have four records with values for the field Material Description, as shown below:

image

Since we have set the Rule property Function = Equals, it will treat these 3 tokens as a single whole. Let's see what score it shows when we execute the Strategy.

image

After executing the Strategy, in Matching mode we get the following scores:

image

Since Function = Equals treats all the tokens as one individual token:

It tells us that there are 2 duplicate records with Material Description "Lexan IP 300", so their score is the Success value of 100 defined in the Rule above.
"Lexan IP 600" is different from "Lexan IP 300", so it shows the Failure score of 10.
"Lexan ID 900" is different from "Lexan IP 300", so it shows the Failure score of 10.
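The Equals behavior is simply a whole-string comparison. A minimal Python sketch of my own (the function name is an assumption, not MDM's API):

```python
def equals_score(success, failure, candidate, reference):
    """Function = Equals as described above: the whole value is one token,
    so the comparison is a plain string match."""
    return success if candidate == reference else failure

print(equals_score(100, 10, "Lexan IP 300", "Lexan IP 300"))  # 100
print(equals_score(100, 10, "Lexan IP 600", "Lexan IP 300"))  # 10
print(equals_score(100, 10, "Lexan ID 900", "Lexan IP 300"))  # 10
```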

Now it's the turn of the Token Equals example, and here it is:

References:

MDM Data Manager Reference Guide


SAP TechEd 2010 is Near - See the EIM Tracklist

Benefit from the broad spectrum of lectures and hands-on sessions in the Enterprise Information Management area that will be given at this year's TechEd in Berlin (October 12 - 14, 2010) and Las Vegas (October 18 - 22, 2010). Select the EIM sessions (Berlin/Las Vegas) that suit you best regarding content and the level of expertise required.

As a teaser, I'd like to pick just a few sessions per EIM discipline, hoping you'll get hungry for more from the broad offering:

Overview
Enterprise Data Warehousing and SAP NetWeaver Business Warehouse Accelerator:
  • EIM 201 Enterprise Information Management and Metadata Management Around SAP NetWeaver BW 7.3 (Berlin/Las Vegas)
  • EIM 261 Utilize SAP NetWeaver BW Accelerator and SAP BusinessObjects Explorer to Analyze Huge Amounts of Non-SAP Data (Berlin/Las Vegas)
  • EIM 300  SAP NetWeaver Business Warehouse 7.3 Feature List Overview and Roadmap (Berlin/Las Vegas)
Data Integration and Quality Management
  • EIM105  Product Preview: The Next Generation of Data Quality Assessment and Analysis (Berlin/Las Vegas)
  • EIM106  What's New for SAP BusinessObjects Data Services 4.0: Vision and Roadmap (Berlin/Las Vegas)
  • EIM160  Getting Started with SAP BusinessObjects Data Services (Berlin/Las Vegas)
Master Data Management and Data Governance
  • EIM162  How to Jumpstart Your SAP NetWeaver MDM and SAP NetWeaver BPM Projects with SAP NetWeaver MDM Quick Starter Packages (Berlin/Las Vegas)
  • EIM200  Master Data Management: SAP's Strategy and Roadmap (Berlin/Las Vegas)
  • EIM263  SAP NetWeaver MDM and SAP BusinessObjects Data Services Integration  (Berlin/Las Vegas)

I hope you'll find what you need for your expertise in the individual EIM disciplines. Please note that this personal track list is just an excerpt of all EIM-related sessions. For a complete overview, see the links above.

Kind regards,

Markus Ganser


MDM Expressions: Part 2

In continuation with previous blog: MDM Expressions: Part 1

So far I have discussed some of the functions in the previous blog. Let's continue with the use of some other functions, with suitable examples:

7. MID( ) : Returns a specified section of a string. Please have a look at how it is used to meet a project requirement in this thread.

8. IS_NULL( ) : Please refer the usage of this function in this thread

9. Qualified table validations : The functionality of Expressions is limited for qualified tables, so I just want to make you aware of validating non-qualifiers inside a qualified table via this thread.

10. Lookup table Expressions : Refer this thread for details

11. Field population from Tuple table fields : This is possible only if your Tuple table is single-valued. For more details please refer to this thread.

12. Assignments to a field of type Lookup : This is not possible, as you get the error below:

"Assignment operation failed: A runtime error occurred while evaluating an assignment, validation or calculated field." For more details please refer to this thread.


13. Difference between a Calculated field and an Assignment : Refer to this thread for details.

14. HAS_ALL_CHARS( ) : Refer this thread for its usage.

15. Validation that a field does not contain 1) alphanumeric values, 2) negative numbers, 3) all zeros, 4) strings, or 5) special characters, and holds only numeric values : Thread ID

16. Use of the OR operator : This is useful when there is more than one condition. If any of the conditions is TRUE, it returns TRUE; it returns FALSE if none of the conditions is satisfied. Please refer to this thread for more details.

I hope this gives everyone a fair idea of how to understand and work with Expressions after going through all of the above-mentioned threads.

References:

MDM Data Manager Reference Guide


MDM Expressions: Part 1

I am not here to explain the basic functionality of Functions and Operations, as you can read and understand that easily from the MDM Data Manager Reference Guide. I have a keen interest in writing expressions and have already answered a lot of threads in the SAP Master Data Management forum, so I have consolidated some of these good threads for your reference. I think this blog will be really helpful for the SDN MDM community in understanding and getting familiar with the use of Operations, Functions and more under Expressions.

What is an MDM Expression? MDM expressions are Excel-like formulas that are evaluated by MDM and return a distinct value for each record.

Let's start with the use of some Functions with suitable examples:

1. SYSTIME( ) : If you write SYSTIME(0) in an Expression, it gives you the current date and time, and you can use offsets accordingly.
e.g. SYSTIME(1) adds 1 day to the current date, i.e. tomorrow.
SYSTIME(NEGATE(1)) subtracts 1 day from the current date, i.e. yesterday.

   Business Requirement where you can use this Function:

Comparing System Date with a Date field

Assign constant time to field in DM

2. TRIM ALL, TRIM LEFT and TRIM RIGHT : These functions trim only the outer spaces, not special characters. If SAP could provide a function for removing special characters in the future, that would be a real help when writing expressions for Assignments.

Special Characters Removal Using MDM Import Manager

3. Callable function : You can write an Expression in the Validation tab, and if you have set the property Callable = Yes, you can use that expression in any other validation/assignment expression. You will see the validation name in the Functions tab of the Expression editor. Please refer to this thread for more details.

4. Concatenate : The Concatenate function always puts a comma (,) delimiter between two field values. You can also use the & operator, which joins the values without any delimiter. Thread ID

5. Concatenation of attribute values : This is possible using an Add Branch assignment. Please refer to this thread for more details.

6. Language field validation : Multilingual validations cannot validate the inherited language when Data Manager is opened in the primary language, and vice versa. For more details please refer to this thread.
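Points 1 and 4 above can be mirrored in a few lines of Python. This is purely illustrative: `systime` is my own stand-in for MDM's SYSTIME, and the join calls imitate the Concatenate and & behavior described above:

```python
from datetime import date, timedelta

def systime(offset_days):
    """Rough analogue of SYSTIME(n): today's date shifted by n days."""
    return date.today() + timedelta(days=offset_days)

today = systime(0)
tomorrow = systime(1)       # SYSTIME(1)
yesterday = systime(-1)     # SYSTIME(NEGATE(1))
print(tomorrow - yesterday == timedelta(days=2))  # True

# Concatenate vs the & operator, as described in point 4:
fields = ["Lexan", "IP", "300"]
print(",".join(fields))  # Lexan,IP,300  (Concatenate adds a comma delimiter)
print("".join(fields))   # LexanIP300   (& joins with no delimiter)
```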

MDM Expressions: Part 2

References:

MDM Data Manager Reference Guide


Import of PDF files using MDM GUI Clients

I have been watching threads on SAP Master Data Management for the last 5 to 6 months. I noticed that SDN members keep asking about the import of PDF files and are often confused about this concept. So I think this will be useful for the whole SDN community whenever there is a requirement to load PDF files.

Pre-requisite: In order to load PDF files, Adobe Acrobat® Standard or Professional is required, and it needs to be installed on the machine that is loading the PDF (i.e. the machine where you are using MDM Data Manager), not on the server machine. It is also required to generate thumbnails. If you don't install Adobe Acrobat Standard/Professional, you will get an error like the one below:
"Importing [English US] 'xyz.pdf'...failed.
Unable to open Adobe Acrobat.
See your administrator about installing Adobe Acrobat."

Procedure: PDF files cannot be imported through Import Manager. Instead, you need to upload the PDF files with the help of MDM Data Manager using a Data Group. While uploading a file through a Data Group, you can either link to the original file only or store it in the repository. Once you have uploaded all the files into the Data Group, you can then use Import Manager to link main table records with the corresponding PDF files using the PDF file names.

Some SDN members are often confused about the Allow Invalid PDFs property, which can be set in MDM Console. I want to throw some light on it.

Note: MDM considers a PDF invalid only if it is unable to generate a thumbnail for it. If you have installed Adobe Acrobat Professional on the machine that is loading the PDF, you don't need to set the Allow Invalid PDFs property in MDM Console. That is, in the Console, right-clicking on the repository --> Properties --> Allow Invalid PDFs = Yes is absolutely not required.

References:
SAP Note: 1258537
MDM Console Reference Guide
MDM Import Manager Reference Guide
MDM Data Manager Reference Guide

Lexmark on Using SAP NetWeaver MDM and SAP BusinessObjects Data Services

SAP BusinessObjects Data Services and SAP NetWeaver MDM are a perfect match when it comes to integrating and consolidating customer data spread across the enterprise. At Lexmark, a global provider of business and consumer printing solutions, this combination is applied in their MDM initiative for streamlining enterprise analytics. Listen to what Lexmark's Information Delivery Manager, Joe Young, says in detail in this 4-minute testimonial.

For more details on combining SAP NetWeaver MDM and SAP BusinessObjects Data Services you may watch this SDN Wiki space


Data Standardization and Enrichment an integral part of Spend Performance Management

In every SPM project I invariably get asked the same questions: what is DSE? Is it part of the product? I will try to answer these questions here.

So what is DSE?

Data Standardization and Enrichment ensures the reliability of your spend analyses. DSE compiles global market data from external sources, including diversity and risk data, in order to validate and enrich your suppliers and classify your transactions.

1. Validation & Enrichment

Supplier identities are often vague due to poor data quality and lack of content. It's tough for an enterprise to maintain consistency across multiple systems involving supplier data, let alone maintain parent/child relationships across the entire list of suppliers they work with. Because of this, procurement organizations are not able to analyze their spend by supplier, and they cannot leverage total spend with a supplier parent.

Supplier validation and enrichment is the process of transforming unstructured and incomplete supplier data into valid, enriched information for reliable analyses and greater leverage.   DSE standardizes the format and content of your suppliers, removes duplicates and then validates them against a global business directory.  Once validated, the supplier records are enriched with information such as legal name, trade names, corporate ownership, risk indicators, and so on.

Before validation & enrichment

  Supplier ID   Description        Spend
  SAP001        SAP America        $5500.00
  BOBJ01        Business Objects   $3500.00
  SAP002        SAP                $2200.00

After validation & enrichment

  Supplier ID   Description        Cleansed Supplier   Spend
  SAP001        SAP America        SAP Inc             $5500.00
  BOBJ01        Business Objects   SAP Inc             $3500.00
  SAP002        SAP                SAP Inc             $2200.00

By creating linkages between supplier corporate parents and their children, procurement organizations can see that all of this spend is with the same supplier.

Additional enrichment facilitates strategic decision-making such as diversity status (e.g. Minority-owned, Women-owned, Veteran-owned), credit/financial risk, and other attributes supporting corporate initiatives.
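Once records carry a cleansed parent, rolling spend up to that parent is a simple aggregation. A sketch of my own using the (hypothetical) enriched records from the table above:

```python
from collections import defaultdict

# Hypothetical enriched records mirroring the table above:
# (supplier ID, cleansed supplier parent, spend).
records = [
    ("SAP001", "SAP Inc", 5500.00),
    ("BOBJ01", "SAP Inc", 3500.00),
    ("SAP002", "SAP Inc", 2200.00),
]

# Sum spend per cleansed parent to see total leverage with one supplier.
spend_by_parent = defaultdict(float)
for supplier_id, parent, spend in records:
    spend_by_parent[parent] += spend

print(spend_by_parent["SAP Inc"])  # 11200.0
```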

2. Data Classification

Classification is the process of assigning purchases to a standard structure such as UNSPSC, eCLASS or a custom structure. The issue at most organizations is that a single classification structure is not used across all procurement and financial transaction systems; hence procurement organizations are unable to effectively analyze their category spend across the entire enterprise. The benefits of using a single classification structure are:

  • A standard view of all goods and services purchased across the enterprise
  • The ability to leverage total spend to drive cost savings, monitor compliance, and rationalize suppliers

Below is a good example of data classification:

Unclassified Data

  Product ID   Description   Supplier   ERP Category   Spend
  X08092010    Shoes         Athletic   Clothing       $5000.00
  AJRD2009B    Shoes         Balance    Footwear       $2500.00
  TP2008CRV    Blackberry    A-Mobile   Fruit          $3500.00
  PX2009BD     Blackberry    B-Mobile   Telephone      $2300.00

Classified Data

  Product ID   Description   Supplier   ERP Category   Standard Category   Spend
  X08092010    Shoes         Athletic   Clothing       Footwear            $5000.00
  AJRD2009B    Shoes         Balance    Footwear       Footwear            $2500.00
  TP2008CRV    Blackberry    A-Mobile   Fruit          Telephone           $3500.00
  PX2009BD     Blackberry    B-Mobile   Telephone      Telephone           $2300.00

Note: The supplier description is not the only field used for categorization. Depending on the vendor and the engine they use, many more fields are involved in correctly categorizing your data.

Looking at the unclassified data, the business is not in a position to find the total spend in the Footwear or Telephone category, but once the data is classified this question can be answered correctly and accurately.
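The point can be illustrated with a toy lookup. This is a deliberate oversimplification: a real DSE engine uses many fields and learned rules, while here I hard-code a hypothetical mapping that mirrors the before/after tables above:

```python
# Hypothetical mapping from (description, supplier) to a standard category,
# mirroring the before/after tables in this post.
standard_category = {
    ("Shoes", "Athletic"): "Footwear",
    ("Shoes", "Balance"): "Footwear",
    ("Blackberry", "A-Mobile"): "Telephone",
    ("Blackberry", "B-Mobile"): "Telephone",
}

purchases = [
    ("X08092010", "Shoes", "Athletic", 5000.00),
    ("AJRD2009B", "Shoes", "Balance", 2500.00),
    ("TP2008CRV", "Blackberry", "A-Mobile", 3500.00),
    ("PX2009BD", "Blackberry", "B-Mobile", 2300.00),
]

# Only after classification can spend be totaled per standard category.
totals = {}
for pid, desc, supplier, spend in purchases:
    cat = standard_category[(desc, supplier)]
    totals[cat] = totals.get(cat, 0.0) + spend

print(totals["Footwear"])   # 7500.0
print(totals["Telephone"])  # 5800.0
```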

Although not mandatory, it's very beneficial to use the DSE process. So it's very important to talk to your business users and explain to them how DSE improves leverage. This is one of the important decisions that will drive project timelines. The next logical step is to decide which vendor you would like to work with.

As you have realized by now, the DSE process is not part of the SPM application. It is an additional service that the data goes through, which means (in most cases) you will need to send your unclassified data out of the SPM application to your third-party vendor for the DSE process. Each vendor has its own way of processing data. Following are some questions you might want to ask yourself and your DSE partner to ensure the process is smooth:

  • How frequently do you want to send your data for the DSE process?
  • How extensive is your DSE vendor's business directory (number of businesses, geographic distribution, type of content)?
  • What supplier enrichment attributes are needed by your business users (diversity, risk, sustainability, clinical, parent/child linkages, franchises, sales, geocodes, etc.)?
  • What categorization structure (UNSPSC, eCLASS, custom categorization) would the business like to use?
  • How long does it take your DSE vendor to pass your data through their engine for the first iteration, and how long for subsequent iterations?
  • Last but not least, what do they charge for this process?

Note: DSE partners usually take more time during the first iteration, as their engine is still learning your data. Subsequent iterations generally take less time.

In my next blog I will talk about how the SPM application handles all this DSE in its data model.


MDM Auto ID: Common Questions Asked

I was always thrilled by the idea of writing a blog. When I looked into the MDM Blogs and Articles sections, I found that most topics have already been covered, so I thought of writing about the MDM Auto ID. I feel this is useful and informative for everyone, especially MDM beginners. This is my first blog, so I will try my best to make this information easily understood by everyone.
What is the MDM Auto ID field? In simple words, it is a field of type Integer which automatically gets incremented by one for every new record created in MDM. It is also known as a unique identifier.
E.g. if you have 5 records in MDM with Auto IDs 1 to 5 and you delete the record with Auto ID 5, the next new record created in MDM will still get Auto ID 6, not Auto ID 5. That's why it is a unique identifier: you can easily tell that somebody has deleted the record with Auto ID 5 and that it is no longer present.
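The behavior described above can be sketched as a tiny in-memory table. This is my own illustration of the never-reuse-IDs rule, not MDM code:

```python
class AutoIdTable:
    """Sketch of the Auto ID behavior described above: IDs increment
    monotonically and are never reused, even after a delete."""

    def __init__(self):
        self._next_id = 1
        self.records = {}

    def create(self, data):
        rid = self._next_id      # assign the next Auto ID
        self._next_id += 1       # the counter never goes backwards
        self.records[rid] = data
        return rid

    def delete(self, rid):
        del self.records[rid]    # the ID itself is retired, not recycled

t = AutoIdTable()
ids = [t.create(f"rec{i}") for i in range(1, 6)]  # Auto IDs 1..5
t.delete(5)
print(t.create("rec6"))  # 6, not 5: the deleted ID is not reused
```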
Commonly asked Questions:
1. Can we load our own value into the MDM Auto ID field, manually or using Import Manager?
No. The Auto ID field gets populated automatically for every new record and is incremented by one. The Auto ID is always read-only, so you can't set its value from a source file through Import Manager.
2. Can we syndicate the MDM Auto ID field to a target system?
Yes, you can easily map the Auto ID field to a target field and then syndicate it.
3. In a CMDM scenario, if I send data from MDM to distributing systems, can the MDM Auto ID field be used as a Global ID?
No, this is not feasible, because when records are syndicated to the target systems and come back to MDM with more field information, such as the remote system's local ID, you cannot make use of the remote key concept, since you cannot map a source field to the MDM Auto ID during import.
What to do in this case: The solution is to create one more field of type integer/text in the main table, say MDM Global ID, and populate it using an assignment expression from the Auto ID, so that it holds the same values. When the data comes back to MDM with more field information from a remote system, you can map the source Global ID field to this MDM Global ID field, and map the source Global ID clone field to the Remote Key during import. In this way you can maintain the remote keys of the respective remote systems, which you can see in MDM Data Manager by right-clicking on a record and choosing Edit Key Mappings.
I assume that you are familiar with the concept of key mapping in MDM, so I won't discuss it here. I hope this will give everybody a fair idea about the MDM Auto ID field.


What is Master Data?

Authored by:  Jim Whyte, Director of Solutions Consulting, Utopia, Inc.

Definition: Master data is reference data that is shared or exchanged across strategic applications, work processes, lines of business, or enterprises.

My definition has been personally field-tested for the better part of 20 years and has served a number of Fortune 500 organizations very well. For me, the focus is on sharing and leveraging reference data across the various organizational boundaries that master data can and must cross.

The definition is based on the goals and objectives my former employer was trying to achieve back in the late 80's: ERP consolidation, moving from a decentralized, LOB/regional multi-national company to a global, top-down, centrally managed one, and standardizing applications, infrastructure, business processes, roles and responsibilities, and reporting. To achieve that, we needed common reference data across our solution landscape. Does that sound familiar to you?

We were reducing the number of ERP systems, moving to four regional instances, based on SAP software (common applications). This required a broader view of master data than most are willing to adopt today. We wanted to ensure that our four SAP instances were configured identically. It was the start of our implementation of enterprise data governance, data standards, and standardized master data content. We knew we couldn't achieve one, without the other.

I often wonder if I would have the same deep belief in my MDM definition and support of centralized data governance if it had been possible to run our entire enterprise on a single instance back then. I might not have been as concerned with managing configuration table changes across four production instances (e.g. order types, pricing conditions, units of measure, currencies and exchange rates, in addition to customers, vendors and materials), or with a system development landscape with common data content across development, user acceptance, training and production environments.

We successfully implemented a global data warehouse a few years after our ERP consolidation, and we recognized that all fifty (yes, 50) master data objects and hierarchies we maintained in our proprietary MDM application were either enterprise KPIs or global reporting dimensions in our data warehouse. That wasn't by design up front, but it became a benchmark for us when determining what additional reference data objects we should consider centrally managing on behalf of our enterprise.

So what is master data? It is somewhat in the eye of the beholder, but I firmly believe it is those data objects shared and exchanged across your enterprise that are core to your business operations. Additionally, they are the data objects that you will pick up and migrate from one strategic application to the next, as they are timeless. They are your enterprise's "family jewels" and will be handed down from one generation to the next.


Parametric Data - Don't let the "metric" scare you.

Authored by Peter Dahl, Senior Consultant, Utopia, Inc.

taxonomy

It's funny how looking at different parts of a compound word gives different connotations. In our taxonomy world, parametric has nothing to do with the metric measurement system... it's all about parameters. Parameters are the "old" version of our current terms attributes or characteristics. However, parametric also implies that there is order to the characteristics.

Why would we care about order? Ok, that's a tongue-in-cheek question; order is invariably a good thing. Seriously though, nowadays we seldom manage to run any data system - whether it is a mom-and-pop shop with a simple database of retail products, or a full-fledged ERP - without considering the value of an excellent taxonomy. And, until someone creates and donates a worthy full-fledged taxonomy to the public domain, it's usually developed in-house.

A good taxonomy will include all the elements that allow the advantages of database tools - a noun and modifier (or class), attributes (or characteristics), a short description and a longer, fuller description. Ideally, the attributes are ordered. Why ideally? Well, let's think back to my parametric term. The implied order in the list of attributes allows the use of the noun and modifier (class) and the attributes (characteristics) to automatically generate the full description. The order dictates, then, that the description will contain the relevant information about the object from most to least relevant. Why? Well, again, we can use database tools to make comparisons, and allow the use of fuzzy logic in searches. If the data in a text block flows from most to least relevant, the most relevant attributes can be exact matches, while the least relevant can be fuzzy matches ... which allows one to find something "close enough" in an emergency. Parametric data also allows for the most effective reviews for potential duplications.
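To make the idea concrete, here is a minimal sketch of generating a full description from a noun, a modifier, and an ordered attribute list. The function name and the bearing values are illustrative assumptions, not part of any SAP MDM API.

```python
def build_description(noun, modifier, attributes):
    """Concatenate noun, modifier (class), and attributes in taxonomy order.

    attributes: list of (name, value) pairs, most relevant first, so the
    generated text flows from most to least relevant.
    """
    parts = [noun, modifier] + [f"{name}:{value}" for name, value in attributes]
    return ", ".join(parts)


# Attributes listed most-relevant first, as the taxonomy dictates.
bearing = build_description(
    "BEARING", "ROLLER",
    [("OD", "52MM"), ("ID", "25MM"), ("WIDTH", "15MM"), ("MATERIAL", "STEEL")],
)
print(bearing)
# BEARING, ROLLER, OD:52MM, ID:25MM, WIDTH:15MM, MATERIAL:STEEL
```

Because the leading tokens are the most discriminating, an exact match on the first few parts combined with a fuzzy match on the tail gives the "close enough" search behavior described above.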

The critical thing, of course, is the order ... and it is quite difficult to arrive at a perfect order. Five taxonomy experts will likely have five opinions on the order of any specific item.  To confirm that, check out product catalogs for five different bearing companies and you'll very likely find that none agree on all the attributes, or on the order of relevance.  My rule of thumb for dictating the order of attributes is that each successive attribute should narrow the set of possible matches the most. For example, the noun bearing will contain all possible bearings.  The class roller bearing is an immediate huge reduction in the bearing set. One then needs to look at OD, ID, width, material of construction and other such relevant attributes to determine which will reduce the set of potentials most.
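The rule of thumb above - each successive attribute should narrow the candidate set the most - can be approximated mechanically. The sketch below ranks attributes by the expected number of records remaining after filtering on one attribute value, weighted by how often each value occurs. The tiny catalog is invented for demonstration only.

```python
from collections import Counter

# Invented sample catalog; any list of records with common attributes works.
catalog = [
    {"type": "roller", "od": 52, "material": "steel"},
    {"type": "roller", "od": 47, "material": "steel"},
    {"type": "ball",   "od": 52, "material": "steel"},
    {"type": "ball",   "od": 62, "material": "ceramic"},
]


def expected_remaining(records, attr):
    """Expected size of the candidate set after filtering on `attr`.

    If a value occurs c times out of n records, filtering on it leaves c
    candidates with probability c/n, so the expectation is sum(c*c)/n.
    Lower means the attribute narrows the set more.
    """
    counts = Counter(r[attr] for r in records)
    n = len(records)
    return sum(c * c for c in counts.values()) / n


attrs = ["type", "od", "material"]
ordered = sorted(attrs, key=lambda a: expected_remaining(catalog, a))
print(ordered)
# ['od', 'type', 'material']
```

This is only a first-pass heuristic: a real taxonomy effort would still weigh domain judgment (an OD of 52 mm matters more to a buyer than the steel grade), but it gives five disagreeing experts a neutral starting point.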

While one can only dictate the ordered attribute set and the resulting parametric description for one's own business, a well-defined and controlled taxonomy allows the use of standardized tools to reduce the variability of data and/or goods, as well as helping ensure the data/items can easily be found and are harder to inadvertently duplicate.


