SRM-MDM UI configuration User role

After connecting to the server, what user ID and role does the user need to log in to the MDM repository for configuring the SRM-MDM UI? Does the UICONFIG role have to be assigned to the user before login?

Is there any standard user ID and role available for configuring the UI?
Hi Steve,

For configuring the SRM-MDM UI, the user should be assigned the UI Configuration Manager role. This role is available in the standard shipped SRM-MDM repository, and it is assigned to the user Master by default. You can add this role to any user you want to use for configuring the UI.

Jitesh Talreja

How to retrieve Tuple Lookup values using MDM Java API

Instructions for using the Java API with tuples can be found here - link.
The way to retrieve lookup fields within a tuple is the same as for other fields inside the tuple. The only difference is that other fields return the value directly when you access them, whereas a lookup field returns the record Id of the value. From this record Id you need to iterate over the lookup table itself to find out what the actual value is.
You need to add the lookup fields to the result definition of the command; only then do they appear in the tuple value. From my investigation, if the lookup field is not a main table field, you don't get its value in the tuple either.
Just check it out.
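The record-Id indirection described above can be sketched in plain Java. This is only an illustration of the lookup-resolution idea; the table contents, class and method names here are invented, and the real code would iterate the lookup table's records via the MDM Java API:

```java
import java.util.HashMap;
import java.util.Map;

public class LookupResolution {
    // Simulated lookup table: record Id -> display value. In the real API
    // you would iterate the lookup table's records to build this mapping.
    static final Map<Integer, String> COUNTRY_LOOKUP = new HashMap<>();
    static {
        COUNTRY_LOOKUP.put(1, "Germany");
        COUNTRY_LOOKUP.put(2, "India");
    }

    // A tuple's lookup field hands back only a record Id; this resolves
    // that Id to the actual value stored in the lookup table.
    static String resolveLookup(int recordId) {
        String value = COUNTRY_LOOKUP.get(recordId);
        if (value == null) {
            throw new IllegalArgumentException("No lookup record with Id " + recordId);
        }
        return value;
    }

    public static void main(String[] args) {
        int idFromTuple = 2; // what the tuple's lookup field would return
        System.out.println(resolveLookup(idFromTuple)); // prints "India"
    }
}
```

The point is simply that the tuple gives you a key, not a value; one extra lookup step is always required.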

MDM Tables

Q: I want to know the backend database and table names where MDM repository data is stored, and how to read them.
My intention is to write an MS Access query to search for and retrieve material parts, as it is very difficult to do so through the front end (Data Manager).

In the front end I need to search one part at a time through free-form search.

I am on MDM 5.5 and I use SQL Server as my database.

Ans1: The backend database name for an MDM repository is derived from the repository name (with non-alphanumeric characters removed). A repository named "products" would have a database called "products_M000".

Each repository on your DB server is stored under the Databases folder and can be identified as RepositoryName_M000.

Under this tree, two tables, A2i_CM_Tables and A2i_CM_Fields, store the table and field names.

You can read these DB tables using either of the following options:
1. MS Access > Connect to SQL Source > Define DB Server name > Select Repository name > Choose the table
2. SQL Server Enterprise Manager > Navigate to the required table (table names mentioned above)

It is very difficult to get the entire table data, because several database tables can correspond to one MDM table. I would therefore suggest trying the Expression feature (part of Free-Form Search), which allows you to write expressions similar to validations; in addition, you can use the other free-form search parameters as well.

If you want, you can share what kind of search you are looking for, so that we can try it with Data Manager only.
My requirement is that I want to search for multiple material numbers.
The material number in my data model is a text field, so I am not sure how to run a query to find out, say, whether 100 materials exist in MDM.

If you can think of a feasible expression, please help.
As per you, I should not face any major problem using Access to run this query at the database level.
If so, I'll be happy to use your suggestion, but please tell me which database table I should query and how to get the above result.
If I query A2i_CM_Tables, will I get the extract of the 100 materials I want to search?
Say you want to search whether Material Numbers 100, 300, 500 and 900 exist or not; your expression would then be:

Material Number = 100 OR Material Number = 300 OR Material Number = 500 OR Material Number = 900

Click the Browse button of the Expression field under Free-Form Search and select the Material Number field from the Fields dropdown (don't type it).
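For a longer list of material numbers, the OR expression can be generated rather than typed by hand. A minimal sketch; the field name "Material Number" is taken from the example above, and the class and method names are invented for illustration:

```java
import java.util.List;
import java.util.stream.Collectors;

public class ExpressionBuilder {
    // Builds a Free-Form Search expression of the form
    // Material Number = 100 OR Material Number = 300 OR ...
    static String buildOrExpression(String field, List<Integer> values) {
        return values.stream()
                .map(v -> field + " = " + v)
                .collect(Collectors.joining(" OR "));
    }

    public static void main(String[] args) {
        System.out.println(buildOrExpression("Material Number", List.of(100, 300, 500, 900)));
    }
}
```

The generated string can then be pasted into the Expression field, which keeps a 100-value search manageable.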

Thanks for the reply.

I'll definitely try this option, but I am really interested in knowing how to query this from the database,

as it is really convenient to use MS Access.
Please clarify:

Will I be able to see the main table (Products) and its records under A2i_CM_Tables in the backend database?

If yes, can I query this table directly from MS Access, and what are the complexities in doing so, if any?
A2i_CM_Tables contains all the tables in the repository. Note the table name or table Id, e.g. main table name Products.

A2i_CM_Fields contains all the fields in a given table. Select the table name/Id to get the list of all fields in that table,
and note the field name or field Id that you are looking for, e.g. Product Name.

Write a simple query using the above table and field names to search field values.
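As a sketch of that last step, the SELECT statement can be assembled once the physical table and field names have been noted. The names "A2i_6" and "W1" below are invented placeholders; use the ones you actually find in A2i_CM_Tables and A2i_CM_Fields:

```java
public class RepositoryQuery {
    // Physical table and column names (e.g. "A2i_6", "W1") are
    // repository-specific; look them up in A2i_CM_Tables and
    // A2i_CM_Fields first, then run the statement via MS Access
    // or SQL Server tools.
    static String buildQuery(String physicalTable, String codeField, String value) {
        return "SELECT * FROM " + physicalTable
             + " WHERE " + codeField + " = '" + value.replace("'", "''") + "'";
    }

    public static void main(String[] args) {
        System.out.println(buildQuery("A2i_6", "W1", "100"));
        // SELECT * FROM A2i_6 WHERE W1 = '100'
    }
}
```

This only illustrates the two-step lookup (find the physical names, then query); the actual column mapping varies per repository.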

Hope this helps. 

MDM Repositories not getting loaded in Data Manager

Q: We have two MDM servers, one for Development and another for Demo. We were able to access the development repositories through Data Manager. After we copied the repository from the development server and created it on the demo server, we could no longer load the repositories of the development server.

When we mount the server, the MDM Server shows one version while the Auxiliary server shows another for the same server. I have tested loading the repository with both versions of MDM Data Manager, but I get the message that the repository is not loaded.

Has anyone faced this issue before? Please share how to proceed further.

Ans: Open MDM Console and mount the development server first. Mount the repository you are trying to access from Data Manager (if it is not present), log in to the repository, and check whether it is loaded (a green triangle next to the repository name shows that it is loaded). If not, right-click the repository and select Load Repository -> Immediate.

Repeat the same for Demo server.

Ans1: You should also check that all three components (MDM Server, Console and Data Manager) are on the same 7.1 SP (for example, all on 7.1.03.xx). If not, you should install all three (and in general all MDM components) on the same SP level; otherwise it will not work.

Enhanced fields in the XML schema?

Q: We are on SRM-MDM Catalog 3.0 SP02. We enhanced two fields in the contract data which were transferred from SRM via XML. But when MDM Import Manager does the mapping between the XML file and the MDM catalog items automatically, it raises a mapping error. The error message is: "Logon Error: Source file does not conform to XML schema. Element not found in the xml schema."

How could I add these fields to xml schema?

Ans: Upload the latest XSD file to the XML Schemas table under the Admin node of your repository. At import time, the XML file is validated against the definition stored in this XML Schemas table.

How to add Field to Search in Popup ---- Create iView

We have integrated MDM with EP, using the standard iViews.

I have a create iView (i.e. the Item Details iView) where I am creating a new record.

While creating a new record, there are certain lookup fields which provide a popup to search for a value in the lookup table.

As per my understanding, in the popup screen the user should be able to search by the fields which are marked as display fields in the lookup table, but it does not seem to work that way. Can anybody help me add fields to the search in the popup screen? We are using SAP MDM 7.1.

The fields available for searching are not customizable in the MDM standard Item Details iView. Only the fields configured as display fields in the Console will appear in the search grid of the popup.
I have one more question: if there are three display fields in a table in MDM, is it possible to make only two of them searchable in the popup screen in the Portal?


All display fields will be searchable; this is according to the documentation.

a2a file MDM 5.5 SP06

Please guide me on where I can download a sample a2a file for MDM 5.5 SP06.

Automap Values feature for Manual Import


We have an import map which the end users use to do mass inserts into the main table as and when required. The end user runs the import manually from the Import Manager.

We have about 20-25 columns of type lookup flat. Hence, every time new values are added to these columns in the source, the user has to manually click Automap values for each of the affected fields.

Is there any property we can set so that the import map automatically performs a value automap for all the lookup fields when it is loaded?

The source file format is an Excel template provided to the user.

We are currently using SAP MDM 5.5 SP06 (Version

Refresh all the lookup tables in MDM with remote key mappings.

Then perform a manual import to the main table. All the lookup field values will automap.

Under the same configuration options, choose appropriate values for
Default MDIS Handling > a. Automap Unmapped Handling > Yes
b. Unmapped Value Handling > Add

Just check whether this happens for all the lookup tables or only the ones used for matching records. In the second case, use some different matching keys and try the import. Note that the Automap feature is available for MDIS only.

Resetting AUTOID field


How do I reset a field of type AutoID to start again from 1 after deleting all the records? Currently it continues from the next value in the sequence.

An AutoID field cannot be reset in a repository; it is an integer generated internally by MDM.
You will have to create a new repository using the schema of the original repository to get a new set of Auto IDs starting from 1.

Ans2: Instead of creating a new repository, just recreate the AutoID field. This will serve your purpose; it is not required to create the repository again.

Session time out for Create iView


We have integrated SAP MDM with the Portal using the Standard iViews provided by SAP.

We have Item Details iViews, which we also use to create a new record in the table. There are some popup buttons for the lookup fields to search for values.

Currently we have set the session timeout for the Portal to 30 minutes. If the session stays open for a long time, i.e. after it has timed out, we get a {null} error in the Portal.
Can we change this and provide a meaningful message? We are using MDM 7.1 SP03.

The message you are seeing comes from the MDM connector and indicates that the connection was already released.
It would be nice to have a friendlier error message, or to be able to change it at design time, but that would be a feature request, as it is currently not supported in the MDM standard iViews.

Hope this information helps,

I need some help on validation.

I have 3 fields in the main table: Field1, Field2 and Field3, all lookups into some table.

Say the user enters 001 in Field1 and 002 in Field2 from the dropdowns.
Field3 is also a lookup, into a flat table Tab1. That flat table contains two fields, both display fields.
That means that while selecting data from the dropdown we see pairs like 001,002 ... 001,003 ... 003,005, etc.
So after entering data in Field1 and Field2, the user should check that in Field3 the pair 001,002 is selected.

In this example the user selected 001 in Field1 and 002 in Field2, so in Field3 the user should select 001,002 from the dropdown.
If he selects a different link, such as 001,003, then a validation should fire saying that the wrong link was entered.

In short, we need to verify from Field3 that the link selected from the dropdown matches the values in Field1 and Field2:

Field1 - 001
Field2 - 002
Field3 - 001,002 (must be)

If Field3 is 001,003 or 001,004 or any other combination not in sync with Field1 and Field2, a validation should fire.

I need to know how I can do this in a validation.

Please use this expression - HAS_ALL_VALUES((Field3.Name1&Field3.Name2),(Field1.Name&Field2.Name))

Field1 is a field with its lookup field as 'Name'
Field2 is a field with its lookup field as 'Name'
Field3 is a field with its lookup fields as 'Name1' and 'Name2'
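The intent of the expression can be mimicked in plain Java to see exactly what the validation checks. This is a simplified simulation only (field values are plain strings here; the real check runs inside MDM's expression engine, and the names are invented):

```java
public class LinkValidation {
    // Returns true when Field3's pair of lookup values matches the
    // values chosen in Field1 and Field2; false means the validation
    // should fire, as in the 001,003 case above.
    static boolean isConsistent(String field1, String field2,
                                String field3Name1, String field3Name2) {
        return field1.equals(field3Name1) && field2.equals(field3Name2);
    }

    public static void main(String[] args) {
        System.out.println(isConsistent("001", "002", "001", "002")); // valid link
        System.out.println(isConsistent("001", "002", "001", "003")); // wrong link
    }
}
```

Note that HAS_ALL_VALUES checks that all the listed values are present; this sketch pins each value to a position, which is a stricter (and simpler) reading of the requirement.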

Gartner MDM Excellence Award 2009 Goes to Kraft Foods Inc


Dear MDM community,
This is good news for SAP's master data management solution!
At this year's Gartner MDM Summit, the premier industry event for Master Data Management, held 5-7 October 2009 in Los Angeles, Kraft Foods Inc. won the Award for MDM Excellence. The winner runs a central, multi-domain MDM scenario on the basis of SAP NetWeaver MDM. Read more in the Gartner announcement (linked above).
For more details on the scope of their MDM initiative, please see the SAP Press Release (dated September 2008) and the included video (also dated 2008) that shows how Kraft Foods uses SAP technology to track and analyze its master data around the world.
Number two in the race for the Gartner MDM Excellence Award is BP, which also runs its MDM strategy on the basis of SAP NetWeaver MDM.
For me, this is a clear indication that SAP is fully on track with its master data management strategy.
Kind regards,
BTW - For those of you who attend SAP TechEd, see the MDM tracklist.


ASUG Annual Conference Session: Managing Transports Centrally in an SAP Landscape Using Change and Transport System (CTS+)


Please join Storm Archer at the 2009 SAPPHIRE/ASUG Annual Conference in Orlando, FL, May 11-14. Storm will be presenting a session on the SAP enhanced Change and Transport System (CTS+). Below is the session information:
In an SAP landscape, there are different development, content, and configuration objects that require a transport from a development system to a testing environment and finally into productive systems. Learn how to leverage the ABAP-based enhanced Change and Transport System (CTS+) for transporting changes across the entire SAP landscape, for both ABAP and Java objects. In addition, attendees will receive answers to questions such as "What is new in the area of CTS+?" and "What is planned for the future?"
The Session will be offered:
May 14, 2009
Room 220E
Hope to see you there!
Warmest Regards,
Storm Archer
Senior Product Manager
SAP Labs, LLC.


"Driven to Perform" Webcast: Managing Performance and Risk in a Down Economy – A Practical Approach to Optimizing Business Performance for Everyone

"Driven to Perform" Webcast: Managing Performance and Risk in a Down Economy – A Practical Approach to Optimizing Business Performance for Everyone



Managing Performance and Risk in a Down Economy – A Practical Approach to Optimizing Business Performance for Everyone
What differentiates you from your competitors? How can you improve performance in difficult times by finding and executing the most effective and efficient strategy, and then tuning it to achieve even better results? How can you achieve alignment across your organization and partner network? Don’t miss this important webcast, where you can get answers to all these questions. Come hear directly from the authors of Driven To Perform – Risk Aware Performance Management from Strategy Through Execution as they describe how to strategize, plan, execute, monitor, analyze, and optimize performance, all within the context of the myriad risks and compliance requirements that companies face today.
Driven To Perform provides a unified approach to performance, risk, and compliance management that can help you effectively manage performance across a global business network. Driven to Perform shows how to apply the principles of risk-aware performance management to your area of expertise, whether it is finance, HR, sales, marketing, service, supply chain, procurement or product development. You will receive valuable context for all of these business areas as it relates to managing performance and risk, and learn the collaboration points between them, enabling you to optimize performance beyond your four walls.
The concepts of Driven To Perform are essential for those who are serious about driving transformation and who are looking to create a high performance and data driven culture beyond just concepts. Driven to Perform will take you from strategy through execution, with a focus on end-to-end business processes. A detailed case study shows how a corporate strategy is executed across an organization, fleshing out the meaning of risk-aware performance management so you see a practical use case for how to apply this to your business.



SAP NetWeaver MDM 7.1 Entered Unrestricted Shipment


Dear Community,
SAP NetWeaver MDM 7.1 has successfully completed the ramp-up phase and has now entered unrestricted shipment.

 The highlights of this release are: 

  • High flexibility in data modeling and support for complex data structures
  • Optimized inbound and outbound processing
  • Integration into SAP NetWeaver’s standard administration and life-cycle components
  • Pristine data quality (for example, through connectivity to SAP BusinessObjects Data Services)
  • Acceleration of service-based implementations
Our customers already have SAP NetWeaver MDM 7.1 in productive operation today and are enjoying the benefits: they have confirmed the optimized performance of the platform and reinforced that new features, such as the data modeling enhancements, are exactly what they were looking for.

If you want to learn more about SAP NetWeaver MDM 7.1, you can access our collection of informative material stored on SDN:

SAP NetWeaver MDM Capability Page 
SAP NetWeaver MDM 7.1 Overview Page
Opinions about SAP NetWeaver Master Data Management 7.1 (Blog)
SAP NetWeaver MDM 7.1 Features Overview (eBook)
Thread on SAP NetWeaver MDM 7.1 Documentation Updates

Kind regards,

Thomas Ruhl

We are Definitely faster...


In continuation of Oliver's blog "If it were a hundred times faster...":
Just a small thought: if the lifts in our building worked at the speed of ten years ago, would they make a difference in how quickly we reach the office? All of us hate waiting for the lift to arrive at our floor. Hence I am with the users on their requirements.
However, as Vijay pointed out in Oliver's blog, everybody loves to finish work as soon as possible, but what is the limit?

For example, I remember one of my projects where the MDM repository used to take 3-4 hours to reload, which was required as a nightly activity (there was a reason why we needed it, and it was later solved). Because of this, the MDM server was unavailable for a long time, albeit at night. If we had needed this system across multiple geographies, we would have had to plan something else, as the current system took too long to finish the job.

Another example: if an organization's system or process is stuck waiting on the customer master creation process while the customer is waiting for the product, then it is not solving the problem.

The work we used to do 10-20 years ago in 8 hours - today we are able to do much more than that in the same time.
Maybe we became faster, or the systems made us perform faster!

Ankur Ramkumar Goel has 8+ years of SAP experience in ABAP, BI and MDM with implementations, rollouts, maintenance, due diligence and strategy projects across Europe, the USA and APJ, and has been instrumental in MDM for 3 years. The blog contents don't necessarily represent my organization's positions, strategies or opinions.

Perspective on Improving Matching Performance of Large SAP MDM Repositories


Some immovable requirements result in replicating the data model of SAP ECC in SAP MDM. The result is a large repository with over 250 fields. Once this repository is loaded with over a million records and you start performing matching or searching through SAP Enterprise Portal, you will encounter several performance challenges.
A slow MDM system will never be accepted by business users. To address this eventuality, it is essential to proactively identify and mitigate these risks. Clients have to be aware that replicating the data model in SAP MDM might be the easiest of the options, compared with building a solution on eSOA, but this approach runs the risk of poor real-time performance. For accurate matching results we use the Token Equals feature, but this results in huge lead times, especially while processing a million records with over 250 fields.
The following SAP document is a good starting point for improving performance: "How to Optimize an MDM Matching Process".
To enable faster matching and searching, one option is to use two separate repositories: one dedicated to matching and searching tasks, and the other the parent repository holding all fields. The dedicated matching repository must contain only the crucial fields, such as the matching fields and the primary and foreign keys. The Portal then connects to this smaller repository for matching. Once the results are displayed in SAP Enterprise Portal, the user can choose to add, delete or update, and the resulting action then goes against the main repository.
Keeping a smaller dedicated repository for matching also reduces loading time. You cannot use the slave feature in the Console, as a slave repository must have the same fields as its parent. As per the SAP document, another good practice is to improve speed by using calculated fields in the smaller repository. These calculated fields hold trimmed values of the matching criteria; for example, the first three characters of First Name can be stored in one calculated field, the first two characters in another, and so on. Using the Equals feature on these fields, matching performance can be extremely fast, but per our analysis the results might not be as accurate as with Token Equals.
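The trimmed-calculated-field idea can be sketched in plain Java. A minimal illustration, assuming the trim lengths mentioned above (first three and first two characters); the class and method names are invented:

```java
public class CalculatedFields {
    // Derives a trimmed, normalized match key from a name field,
    // as the calculated-field approach suggests.
    static String trimKey(String value, int length) {
        String v = value.trim().toUpperCase();
        return v.length() <= length ? v : v.substring(0, length);
    }

    public static void main(String[] args) {
        // Two calculated fields derived from First Name = "Johnathan"
        System.out.println(trimKey("Johnathan", 3)); // JOH
        System.out.println(trimKey("Johnathan", 2)); // JO
    }
}
```

Matching with Equals on such precomputed keys is fast because it compares short fixed strings, at the cost of coarser precision than Token Equals.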
Handling the dilemma of choosing high accuracy with long delays versus faster results with average accuracy was a good learning experience for us. A lot of what-if scenarios need to be run, recording multiple options and the time taken for each: trying out different calculated fields, different matching strategies and scores, and the choice of Equals or Token Equals for each matching field. Analysis of these performance measurements gives real insight into the right approach, supported by quantified data. With this study of matching behavior, one should be able to identify the approach that gives accurate results in the shortest time.
A good practice during Blueprint workshops is to present all the results along with the choice of matching strategy, scores and threshold limits. If there is disagreement amongst client stakeholders while identifying the best matching criteria, statistical techniques such as average ranking, Spearman's rank, etc. can be used. Since each project is unique, generalizing the approach is difficult. For business partners, for example, using the calculated-fields approach for both First Name and Last Name increases accuracy more than using calculated fields on First Name alone.
Insight into the behavior of business users helps decide between Token Equals and Equals with calculated fields. You could choose matching criteria with Equals on calculated fields for everyday use by business users, purely for the high-speed results, while matching with Token Equals is run on a periodic basis, say weekly or fortnightly, by the data administrator to identify possible duplicates. This dual approach might involve redundant activities, but it ensures healthy data.
Data analysis using random sampling gives insight into the spread of master data across categories such as country, organization, etc. Depending on the pattern of classification, you could filter records based on country, region, or category (in retail, for example, apparel, food, etc.). Filtering enables faster matching performance.
The best practice is to stick to the line of thought that global master data should be stored in SAP MDM and the transactional fields in the respective systems, such as SAP ECC. This enables a standardized data model and attributes for global use, rather than replicating the legacy or SAP ECC data model in SAP MDM.
Navendu Shirali is a consultant in the SAP MDM Center of Excellence at Infosys. His areas of work include building new solutions, converting opportunities and master data consulting.


Using MDM Java APIs to retrieve and execute Matching strategies in MDM


Taking forward the build of my customised Data Manager using the MDM Java APIs, please consider this blog the latest addition to the series:
 Using MDM Java APIs to retrieve Taxonomy attribute Values
 Using MDM Java APIs to add Text(type) taxonomy attribute allowable values
Before I demonstrate using the MDM Java APIs to retrieve matching strategies, let us understand what matching strategies are and how we create or add them in MDM.
A matching strategy comprises one or more matching rules and a pair of numeric thresholds. Each strategy can be executed against a set of one or more records, against the search results, or against all of the records in the repository.
Matching strategies identify potential matches for each record based on the matching scores of the individual rules that comprise the strategy and on the thresholds that determine which records count as potential matches.
1. As a first step, we create a matching strategy from within the Matching mode of Data Manager. To add a new strategy to the list of strategies:

  • If necessary, click on the Strategies tab to make it the active tab.
  • Right-click in the Strategies pane and choose Add Strategy from the context menu, or choose Records > Matching > Strategies > Add Item from the main menu.
  • MDM adds a new matching strategy named "New Strategy" to the list of strategies, and highlights it for editing.
  • Include the columns against which target records are to be matched, say Material_Number in the current scenario.
  • Type the name 'Material Strategy' for the matching strategy and press Enter.
2. Now that we have created the matching strategy named 'Material Strategy', let us look at the MDM Java API code snippets required to execute it. Using the Java API, we first retrieve the matching strategies available in the MDM repository:

RetrieveMatchingStrategiesCommand retMatStr = new RetrieveMatchingStrategiesCommand(connection);
try {
    retMatStr.execute();
} catch (CommandException e) {
    e.printStackTrace();
}
// we have only one matching strategy; retrieve its id (array index 0)
matchStID = retMatStr.getMatchingStrategies()[0].getId();
The connection and authUserSession variables hold the MDM connection and the authenticated user session, respectively.
After fetching the matching strategy id, we are left with two following steps:
3. For any new record being created, we use the matching strategy id to execute the matching strategy and find matching records:

ExecuteMatchingStrategyForNewRecordValuesCommand exeMatstr = new ExecuteMatchingStrategyForNewRecordValuesCommand(connection);
exeMatstr.setTarget(new Search(new TableId(1)));
try {
    exeMatstr.execute();
} catch (CommandException e) {
    e.printStackTrace();
}
recs is the array of source records, i.e. the records to find matches for; currently only one record is supported.
rids is the array of target record ids, i.e. the records to match to.
setTarget(new Search(new TableId(1))) sets the target records in the form of a search object; these are the records to match to, i.e. all records in the main table.
4. After executing the matching strategy, we execute RetrieveMatchedRecordsCommand to retrieve the records that matched the source record for the strategy that was executed:

RetrieveMatchedRecordsCommand retMR = new RetrieveMatchedRecordsCommand(connection);
retMR.setMatchingTaskId(exeMatstr.getMatchingTaskId());
retMR.setRecordId(new RecordId(-1));
retMR.setResultDefinition(new ResultDefinition(new TableId(1)));
try {
    retMR.execute();
} catch (CommandException e) {
    e.printStackTrace();
}
setMatchingTaskId(exeMatstr.getMatchingTaskId()) sets the matching task id fetched from the ExecuteMatchingStrategy command object in the previous step; this identifier is a handle to the matching strategy that was executed.
setRecordId(new RecordId(-1)) sets the source record id; the matched records are based on this source record. For external record matching, the record id is -1.

As discussed in this blog, one can use these steps to execute matching strategies that already exist in MDM in a different way. One might argue for running parallel searches in MDM using the API to accomplish certain kinds of matching between records instead of executing matching strategies this way. I would say this is a feature provided by the Java API that empowers developers to replicate Data Manager features over the web, and I personally feel that using it this way saves a lot of the development time required to explicitly code such a matching strategy with the APIs.


"Driven to Perform" Podcast from SAPPHIRE Orlando 2009 on Are you ready for corporate performance management?

"Driven to Perform" Podcast from SAPPHIRE Orlando 2009 on Are you ready for corporate performance management?

Stephanie Buscemi, co-author of Driven to Perform – Risk-Aware Performance Management From Strategy Through Execution, did a podcast at SAPPHIRE 2009 discussing the key concepts of the book. This is a great way to get a quick overview!

Are you ready for corporate performance management?

Corporate performance management (CPM) is one of those hazy terms that means different things to different people, but the basic concept almost everyone can agree on is that CPM is the discipline of tracking progress against preset goals to make sure those goals are being met in as efficient a way as possible.

How to actually achieve effective CPM, which involves aspects of business intelligence (BI) and risk management, is where things get tricky. There are any number of performance management methods that companies can follow, but choosing one over another is often an exercise in guesswork.

To help you better understand the concept of CPM and develop specific steps to successfully implement and maintain a performance management framework, we're speaking with Stephanie Buscemi, vice president of marketing for enterprise performance management and governance, risk and compliance at SAP and the coauthor of the new book Driven to Perform.

In this 30-minute podcast, appropriate for both business and IT professionals, listeners will:

  • Learn just what elements make up CPM and how the discipline has evolved over the last several years (1:45).
  • Get an understanding of the current state of CPM adoption and the major stumbling blocks most companies encounter (5:30).
  • Find out how CPM demands a balance among people and process issues and technical demands (10:15).
  • Get advice on the sometimes overwhelming task of starting a CPM initiative and managing the CPM lifecycle (14:20).
  • Get tips for evaluating CPM technologies and how to make buying decisions that match your CPM needs (21:50).
  • Find out what you risk if you fail to develop a comprehensive CPM framework (25:00).
You can also download the podcast directly.
About the speaker: Stephanie Buscemi is vice president of marketing for enterprise performance management and governance, risk and compliance at SAP, where she is responsible for go-to-market plans, cross-product solutions, and product strategy. She has served in leadership roles within performance management and business intelligence over the past 14 years. Stephanie joined SAP from Hyperion/Oracle, where she was most recently senior director of global marketing. Prior to Hyperion/Oracle, Stephanie was at Business Objects, where she led in building the company's U.S. presence. She holds a B.A. from UCLA.

Changes in life...of all !

Changes in life...of all !


For most of us, there was someone who changed our path significantly.
For me it was my cousin, because of whom I am in the SAP world. He gave me the idea, and the support, to enter it. Without him I would not be where I am today. Once in the SAP world, it became my dream to join the parent organization, SAP itself.
Then there was a manager at a previous organization who, over lunch, asked me to take on the new challenge of MDM, and that is how I entered the MDM world. There was certainly a history behind why he asked me, but the point is that this decision paved the way for me to join SAP.
I still remember when I first touched a computer during my engineering course, and I soon realized that my logical reasoning for programming was far better than my theoretical understanding of the engineering subjects. It was clear that I had chosen the wrong stream for my engineering degree. However, someone had different plans, and I finally landed in the computing world after all.

One thing I learned for sure is that finally we all land up where we have to be; where we deserve to be…


Producing Printed Catalogs with SAP NetWeaver MDM

Producing Printed Catalogs with SAP NetWeaver MDM

SAP NetWeaver Master Data Management is a multi-domain solution for master data consolidation and harmonization scenarios applicable to customer, vendor, product, employee or custom-built data objects. When it comes to product data, there is a native SAP NetWeaver MDM feature to which I'd like to draw your attention: the definition and publication of printed product catalogs. For this scenario, SAP NetWeaver MDM provides a specific UI client application enabling you to expose MDM repository data to a catalog publication and make the required layout settings in a DTP-like environment. A new how-to guide provides you with insight into this client application (MDM Publisher), walking through a basic publication scenario with the key process steps involved.
I hope this guide gives you an overall understanding of the print publishing capabilities of SAP NetWeaver MDM.


Standards and role of Data Quality with Masterdata

Standards and role of Data Quality with Masterdata

Let me quote what Wikipedia states about ISO 8000-110: "Data quality - Part 110: Master data: Exchange of characteristic data: Syntax, semantic encoding, and conformance to data specification (revision of ISO/TS 8000-110:2008)". The standard essentially underlines the importance of standardization and the advantages of data quality in the master data space.
What do we gain from this certification?
Most importantly, the MDM implementation is assured of the advantages of a data quality program already available as part of the system, leading to the highest intangible benefit: customer and vendor confidence that data requirements are accurate.

I think we may all agree on three main master data benefits:

1. Application-interpretable data requirement statements
2. Unambiguously labeled data
3. Automated gap analysis
The data quality program we propose as part of due diligence with most clients may be a separate initiative the organization needs to pursue, and it should not be tightly coupled to the MDM program itself. Newer cutting-edge technologies and processes in today's industries have emphasized the growing complexity of information sharing; Business Suite 7, launched by SAP, is another view of the same trend. The spread of the Internet and advances in interface design have resulted in companies being closely intertwined with each other and with the consumer (GDS for GS1 could be an apt analogy in the MDM world), and as consultants we should be able to leverage this. With information constantly being transferred, it is more important than ever for clients to share and display quality data, and MDM can ideally be the hub for that master data. Duplicates and obsolete records creep in, swelling the data volume beyond expected levels; data quality is the key message here.
A QAS survey indicated that only 46% of companies have their own documented data quality strategy, which sets an important precedent for the MDM program. It also shows why we as MDM consultants should focus on data quality functionality, and on a due diligence study that supports the MDM implementation.
I welcome thoughts on the same.

Rajesh Iyer Rajesh is involved in the strategic management of complex Proofs of Concept, Pilots and Custom Demonstrations to clients, COIL engagements with SAP and various prospects in the SAP MDM space. He has earned several accolades from clients and customers. He has also been a key contributor in a technical pre-sales role, where he often had not only to prove out the capabilities of the SAP product in extremely demanding and challenging environments, but also to craft the correct messaging for the groups evaluating the software against competitive products such as IBM MDM, Hyperion and Kalido.


SAP MDM Business Content and What Is In It For You

SAP MDM Business Content and What Is In It For You 

SAP NetWeaver MDM provides a generic infrastructure for multi-domain master data management. This is all perfectly described on the MDM home and the MDM 7.1 pages at SDN. However, bundled information about the business content delivered with the software and its benefits in implementation projects was somehow missing there, or at least less obvious to find.
Therefore, a new SDN site in the MDM space is explicitly dedicated to providing you with the required information at a glance. Check the new MDM Business Content page.
You may be aware that organizational intricacies are often an issue when it comes to designing the perfect data model to manage a company’s master data. Reiterated discussions and redesign cycles cost worthy time and effort. And once you are through with it, you’ll probably spend even more time integrating it into your IT landscape! In such a situation, predefined business content for SAP NetWeaver Master Data Management can help to reduce the implementation effort, and save you time and money.
For a quick overview, watch this SDN presentation to get informed about the existing MDM business content and planned content enhancements.


Why Solaris for SAP?

Why Solaris for SAP?

I’m happy to announce the release of a new whitepaper:  "Platform Design, TCO, and Cost-Effective Flexibility: The Role of Solaris in Support of SAP Enterprise Applications."   What excites me about this new paper is that it really outlines the imperatives facing the CIO today and then  provides answers to those challenges:

  • CIOs are being asked to cut costs for computing platforms, but must also increase flexibility to better meet business needs.
  • New versions of enterprise applications, such as SAP Business Suite 7, are driving companies to upgrade and consolidate.
  • CIOs are also under pressure to implement Green IT: that means less power demand and less heat generation in the data center.
  • Server consolidation and virtualization are increasing trends that require deft management of increasingly complex and diverse machine arrays across the enterprise.
  • Platform as a Service and Infrastructure as a Service (Cloud Computing for short) are driving decisions such as when it makes sense to move applications and infrastructure to the cloud to lower costs and provide faster provisioning.

Under these conditions, scalability is necessary. You need that scalability across heterogeneous existing hardware, and you need a truly top-tier operating system to orchestrate the pieces effectively. In Hasso's keynote speech - The Power of Speed - at Sapphire last month, he used a few vivid examples of how the data retrieval paradigms used by most enterprises' operating systems vastly underutilize the capacity of the hardware you already have. He compares the cycle time of one processor to the number of cycles needed to retrieve data for business applications to "going to Mars to get a sip of water." It's quite astounding when you think of it that way, and I encourage you to watch the whole thing.

This paper tells the story of how the Solaris and SAP combination is ideal for meeting the challenges I have described above. It will make the SAP community more aware of how Solaris solves a lot of problems that people often address with many separate technologies, which can run you into hazardous and expensive territory when support and integration issues arise. In addition to racking up some of the most impressive SAP performance benchmarks, Sun has facilities based in Walldorf - the Solaris Lab for SAP - which allows other hardware vendors to certify their platforms for SAP on Solaris and also helps partners and customers with any Sun and SAP question related to Solaris. (Of course you can always visit our wiki too!)

Many people know Solaris is the leading enterprise software operating system. What many people don't realize is that Solaris is supported on over 1000 x86 and SPARC platforms, and delivers the performance, stability and security that users and customers demand. Solaris offers a platform for virtualization, for security, and a related file system, all in one package engineered to meet the needs of the enterprise. Solaris is a one-stop shop for meeting all of these challenges. The recent blog by Andre Bogelsack gives a great overview of how the University of Munich uses Solaris and its built-in virtualization capabilities to serve 82,000+ students. Pretty impressive!

You can get the whole paper online, but I'd like to summarize some of the highlights.

This paper demonstrates the big trends I’ve outlined here: cloud computing, virtualization, server sprawl, and the need for Green IT. It then demonstrates how Solaris delivers a stable, reliable, scalable and innovative solution for all of these demands.

Customers benefit from the singular vision and engineering muscle that created Solaris, while, through OpenSolaris, they can also capitalize on the flexibility of open source.

The paper describes how easy it is to migrate your SAP installation to Solaris (which has binary compatibility, by the way, which many other operating systems don’t – so if you need to run a Solaris 8 application on Solaris 10, you can).

Then you can see how Solaris is optimized for virtualization and high-availability computing, even on ordinary commodity hardware. Customers who switch to Solaris have recorded an average 30 to 40% decrease in downtime.

Solaris is free. You can get started today.  Solaris can run on the full range of hardware available.   Additional details about Hardware Certification for SAP on Solaris x64 can be found at

Again, I encourage you to download the paper and see for yourself. It’s a succinct document that illustrates clearly why the Solaris platform is ideal for running your SAP installation, and indeed your entire enterprise.


Information complements Intelligence...

Information complements Intelligence... 

Information brings, and facilitates, intelligence; however, correct information is the key to intelligence.

I read an article reporting that Coke is using RFID technology to collect information from the customer's perspective (Coke's RFID-Based Dispensers Redefine Business Intelligence). It is a wonderful use of technology. The collected information will be used to formulate a correct and effective strategy, and processes to meet specific demands.

Intelligence is always required. Data mining and analytics open up a completely new world of information. This experiment again emphasizes and proves that the right information is vital. I am not sure whether the dispenser already captures the climate conditions and the type of outlet at the time of the drink order. This kind of information, and these trends, are very important in establishing or redefining strategy and in rectifying mistakes (if any) to achieve more success.

The type of outlet is important for any company, since only right and complete information leads to right decisions. For example, in a burger or pizza joint cold drinks will sell more, whereas at a pub-style eating joint beer will sell more. Moreover, time matters: in the evening tea might sell more, and in the morning juices will win.

The right information puts the right perspective on the reports clients regularly see or want to see. This simplifies the job of management, or perhaps "rectifies" would be the right word.

Ankur Ramkumar Goel has 8+ years of SAP experience in ABAP, BI and MDM, with implementation, rollout, maintenance, due diligence and strategy projects across Europe, the USA and APJ, and has been instrumental in MDM for 3 years. The blog contents don't necessarily represent my organization's positions, strategies or opinions.


What's new around the world of MDM - SAP guys should know

What's new around the world of MDM - SAP guys should know


I see large SAP shops also running competitive products, and a partner has the difficult, cross-bearing job of offering the best-of-breed solution, especially when he or she is most comfortable within the SAP ecospace.
I find the constant updates from other product vendors in the MDM space a good occasion to look at SAP MDM from a different angle. While the information about the updates was sought from different sources, the inferences and observations come from my experience with the products and interactions with key users and experts in the field.
Let us take, for example, a couple of updates that have been published about the new suite of MDM products:
A) Kalido, an old horse with a niche following, has brought a matching capability into its matching engine. This uses bipartite graphs, which is a good feature. How does this compare with SAP MDM? Personally, I think SAP MDM wins hands down; however, bipartite matching is best run by Netrics, whose matching has DQ global-ID functionality built in. Food for thought for SAP folks.
B) Oracle, probably perceived as the least threatening of the MDM players, has decided to sell a product data cleansing solution in conjunction with its product data hub through one of its partners in the market. For SAP MDM this should not be a problem, as SAP has a clear leaderboard advantage over Oracle on the MDM front.
C) Last, but surely not least, SAP's biggest threat: IBM MDM. Exeros is to be acquired by IBM. The impact: IBM WPC is now tougher and stronger, having profiling capabilities during the data discovery phase of an MDM program. My view from within the SAP family and fan club: the BusinessObjects suite integration in MDM 7.0, and the further roadmap, will be the real key to breaking free of stringent competition in this space. Siperian and DataFlux, who are players in this segment, will also see this as an opportunity to share the pie. MDM implementations will hence become more elaborate, to make sure data profiling is covered, and the diligence time required will be extended. Consultants and partners need to be aware of more integration techniques and prerequisites required during an implementation.
While I feel these are early days for slow-economy acquisitions that restate the pie under a new banner, it is surely challenging to see more products in this domain. If you felt implementation was tough, now you also have the changes and buy-outs to keep an eye on.
I would welcome thoughts and more insights on the products my fellow SDN bloggers have come across. I am sure these points are quite arguable; hence the idea of a blog: to state opinions and substantiate them with demos.
Rajesh Iyer Rajesh is involved in the strategic management of complex Proofs of Concept, Pilots and Custom Demonstrations to clients, COIL engagements with SAP and various prospects in the SAP MDM space. He has earned several accolades from clients and customers. He has also been a key contributor in a technical pre-sales role, where he often had not only to prove out the capabilities of the SAP product in extremely demanding and challenging environments, but also to craft the correct messaging for the groups evaluating the software against competitive products such as IBM MDM, Hyperion and Kalido.


Managing Product Hierarchies using SAP MDM

Managing global product hierarchies for an organization can be a huge challenge, especially when there are multiple systems and no ownership of managing the hierarchies. This blog does not aim to provide more information on importing or syndicating hierarchies; instead, I have tried to explain how the management of complex hierarchies can be a strong business case when implementing MDM.
The need to manage Product Hierarchy
Any organization offering products or services for sale needs to categorize its offerings systematically. A simplistic way to group different products is to place them in distinct Product Lines. But many times this high-level grouping does not suffice, especially for organizations with a large range of exclusive product offerings. Hence, it is necessary to drill down to a logical level of hierarchy to which a finished good can be linked. This can be achieved by creating a logical hierarchy, which can help not only in categorizing the products but also in analytics and forecasting.
Building a Product Hierarchy
A few points to keep in mind while building a product hierarchy:
  1. It should be applicable globally for the organization
  2. It should cover all products, each categorized uniquely
  3. It should stop at a logical level; no more than 5 levels is recommended
A product hierarchy can be built by grouping products by Line, Type, Family, Group, Series and so on. The order and the levels of these categories must be identified carefully so as to avoid expansion or repetition of nodes.
Strong capabilities of SAP MDM
SAP MDM comes with a strong mechanism to build and manage hierarchies, and this is one of the strong business cases for implementing MDM in an organization. MDM allows you to easily manage complex product hierarchies, import them from Excel files, and automatically create a tree structure.
SAP MDM provides features to add siblings and children, sort the tree and so on, which makes it a strong tool. Additional functionality such as Taxonomy and Images adds to its strength. The capability of SAP MDM to trigger workflows on the hierarchy table helps in better management of hierarchies.
Though a product hierarchy is lookup reference data for master data objects such as Material and Product, it can still be managed centrally using SAP MDM. Syndicating changes and additions to the hierarchy is easily managed with SAP MDM and helps maintain a synchronized, unified hierarchical product classification. Further, it can also be leveraged in Catalog Management and SRM, and with BI to perform analytics.
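As an illustration only (this is not the MDM API, and the product names and levels below are hypothetical), the kind of hierarchy tree described above can be sketched in a few lines:

```python
# Minimal sketch of a product hierarchy tree with levels such as
# Line > Type > Family > Group > Series (only three levels shown here).
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    children: list["Node"] = field(default_factory=list)

    def add_child(self, name: str) -> "Node":
        """Add and return a child node (a sibling of existing children)."""
        child = Node(name)
        self.children.append(child)
        return child

    def path_to(self, name: str, trail=()):
        """Return the full path to a node, mimicking a hierarchy lookup."""
        trail = trail + (self.name,)
        if self.name == name:
            return trail
        for c in self.children:
            found = c.path_to(name, trail)
            if found:
                return found
        return None

root = Node("Products")
line = root.add_child("Power Tools")        # level 1: Line
family = line.add_child("Drills")           # level 2: Family
series = family.add_child("Cordless 18V")   # level 3: Series
print(" > ".join(root.path_to("Cordless 18V")))
# Products > Power Tools > Drills > Cordless 18V
```

Keeping each finished good linked to exactly one leaf of such a tree is what makes the hierarchy usable for analytics and forecasting downstream.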


SAP UK and Ireland World Tour 2009

SAP UK and Ireland World Tour 2009  :  Your complete guide to Business Intelligence , all under one roof. 

Join us on the UK leg of the SAP World Tour on the 15th July in Birmingham  and discover how the latest tools and technologies from SAP and our partner companies worldwide deliver clarity, accountability and insight to businesses of all sizes.
What can I expect to gain from attending?
This event offers a unique showcase for the latest business strategies, industry best practices and SAP solutions, with particular emphasis on business intelligence and enterprise performance management: 

Find out how you can manage and mitigate risks and optimise business performance while closing the gap between strategy and execution

Tame information chaos by turning structured and unstructured data from disparate sources into high quality, actionable insight that can drive business process improvement

Get the inside track on our range of affordable, quick-win solutions that can deliver rapid return on investment in a changeable climate and help you identify new opportunities by putting your existing data to work.

What if my business doesn’t run SAP?

SAP BusinessObjects’ tools are heterogeneous – you don’t need to run SAP platforms, as our applications are designed to run across any and all data sources – making this the ideal event to get up to speed with our solutions to your business challenges and talk to partners and customers who can share their first-hand experience.

Affordable SAP e-Learning

Virtual SAP TechEd (VSTE) provides access to valuable education, with over 500 sessions from SAP TechEd events held from 2006 to 2008, and is used by SAP practitioners worldwide to learn the latest SAP products and technologies and to develop new skills.
For SAP practitioners, staying current with the latest and greatest in SAP technologies and solutions is key to professional success. In the current economic climate, keeping your team's SAP skills current has become more affordable: ASUG members and members of other SAP professional groups are now eligible for a 30% discount.
To buy, preview, or learn more, please visit the Virtual SAP TechEd page. If you are an ASUG member (or belong to another SAP user group), please send us an email with your full contact information and the association you belong to, for instructions on purchasing Virtual SAP TechEd at the special price.


Producing Printed Catalogs with SAP NetWeaver MDM - Part 2

Producing Printed Catalogs with SAP NetWeaver MDM - Part 2 

The recent How to Create Publications with SAP NetWeaver MDM Using MDM Publisher guide introduced the MDM Publisher as a UI client application enabling you to expose MDM repository data to a catalog publication and define the desired layout in a DTP-like environment. While the focus of that beginner's guide was to draw a basic publication scenario with the key process steps involved, a follow-up guide provides you with information on how to optimize table layouts for more advanced publications.
I hope you'll find the new How To Create Publications with SAP NetWeaver MDM Using MDM Publisher - Advanced Topics guide helpful for setting up more sophisticated print publications.


Removing Complexity from SAP EDI and B2B Processes

Removing Complexity from SAP EDI and B2B Processes 

I had a conversation with a large Oil and Gas company today about complexity in the areas of EDI and B2B processes. They have, like many large companies, acquired different subsidiaries which run different ERPs and have different EDI translators. A quick review of their IT infrastructure easily uncovered huge amounts of complexity and costs associated with operating, supporting and maintaining these multiple disparate systems.

Who can fix this complexity? Is it the EDI department? No, the EDI department usually is task oriented and rarely can say NO to the business. So where is the complexity, who owns it and who can fix it? The complexity is in the integrations of data with the multitude of database applications and ERPs, and supporting the various file formats that the business units agree to support. The challenge is in developing, managing and documenting all of these integration scripts. This is often outside of the responsibility of the EDI department, but never really embraced by the software development teams in IT either. It is a kind of NO-MANs land. No one wants to own it. Software developers will grudgingly develop an integration script, but only if they are promised it is a short term project. The second it mostly works, the developer moves on to sexier development projects - leaving an undocumented, and poorly supported integration script to be found and deciphered next year by the next unlucky soul. These undocumented integration scripts grow like weeds in July. After a few years, there are hundreds and thousands of these scripts just waiting to be broken by an upgrade or process improvement somewhere which will bring IT to its knees fast.

The complexity involved in supporting all the various business processes and data requirements of dozens of different database applications and ERPs can be enormous. Companies need to either develop specialized database applications to help manage all of the integration scripts or buy some specialized application (I have never seen an application that does exactly this, except one I wrote myself many years ago).

Added to the complexity of managing hundreds and even thousands of internal integration scripts is the data and file format requirements that the business units agree to support with customers and suppliers. One simple electronic purchase order which can easily be mapped and supported by an EDI expert, suddenly becomes a nightmare when the business unit agrees to support a different file format for every customers' purchase order process. These kinds of complexities have a name - "combinatorial explosion!!!!" Multiply all of the different database systems and ERPs that all require different data and processes, with an understaffed EDI department and business units agreeing to support hundreds of different file formats for simple processes. HELP! No wonder EDI departments are often considered slow and unresponsive. They are shell shocked and buried alive in complexity!

SAP's Business Network Transformation strategy and recent investment in an automated business exchange is designed, for the first time, to solve these problems - at least for SAP customers. It is a new and much simplified and standardized paradigm for SAP EDI and B2B.

SAP's Netweaver PI can facilitate the internal enterprise integration of all of the various components of SAP and aggregate the integration of data into one set of standardized interfaces that can be pre-developed and stored in the ESR (enterprise services repository). These standardized sets of business processes and integration points, connected to the automated business exchange, eliminate the need for internal integration scripts and their development, maintenance and support. This greatly simplifies EDI and B2B projects and significantly reduces the costs of implementing EDI.

The automated business exchange runs an SAP-centric hub for SAP users. The automated business exchange utilizes SAP's Netweaver and other solutions co-developed by SAP on its network, which enables it to integrate with all other SAP customers through a Netweaver PI-to-PI connection. Once an SAP customer is connected to the automated business exchange, they gain access to the 40,000-plus companies that are already connected and supporting a huge library of B2B business processes for various SAP applications. The 40,000 are growing exponentially now as the network effect kicks in.

SAP's new paradigm for EDI and B2B removes the complexity of integrations, standardizes processes, reduces costs, enables rapid on-boarding of large numbers of trading partners and addresses the remaining issue that causes uncontrolled complexity which is supporting large varieties of different EDI standards and trading partner required custom file formats. In SAP's new strategy, once an SAP customer connects to the automated business exchange through Netweaver PI, IDocs or tRFC, the exchange as a managed EDI/B2B service provider takes over the management of the infinite number of different file formats and communication protocols required by trading partner communities.

Let's review - Integration complexity is now resolved as there is a simple and standardized way of integrating with an EDI/B2B system (operated by the automated business exchange co-owned by SAP) via Netweaver PI, IDocs or tRFC. Netweaver PI can also integrate all the back-office applications into an enterprise service bus architecture (eSOA) so data can be shared as well. The automated business exchange (operating in a cloud computing environment) already has over 40,000 connected companies that can be accessed by connecting once to the exchange.

Every new company that connects to the automated business exchange can be available, with permission, to exchange data with all other connected members. This SAP-centric network effect means that for the first time SAP EDI and B2B data exchanges are guaranteed to get easier, faster and simpler over time as the network expands and the library of pre-developed and pre-integrated business processes and data exchanges grows.

Investing in SDN Blogging

Investing in SDN Blogging

Many of my colleagues and friends that work in and around SAP solutions enjoy reading other people's blog postings, but have never contributed a blog article themselves.  Often they say they either don't have the time, or are uncomfortable writing for public consumption.  Several of my friends that have published blog articles on SDN took literally 2-4 weeks to write one article as they wanted it perfect before publication.  They misunderstood the purpose of blogs.
Blogging on SDN takes time, but it does not have to be perfect.  Blogs are the sharing of ideas and experiences in real time.  These are your ideas and experiences. I don't think I have ever re-read a blog article I wrote in the past without wanting to edit it again (what was I thinking?).  This is a dynamic media where the free flowing stream of ideas and experiences can be shared.  Sometimes you may even change your mind about a subject, perfect!  Write another blog article explaining why your ideas have changed!
Blogging on SDN is about sharing.  It is about community learning and the distribution of ideas and best practices across a community with a shared interest.  I often have emails and comments sent to me after posting a blog.  These are from readers offering personal experiences, asking questions or even, can you believe it, disagreeing.  Wonderful!  That is how learning and progress happens.  If along the path you can build your personal reputation as a subject matter expert, then even better.
I write on SAP Netweaver, EDI and B2B business processes and strategies.  I have posted over 50 articles on these subjects in the last 12 months that can be found by searching on my profile or the words EDI and B2B.  These articles are posted in real-time, but also have longevity.  My experiences are archived for others that want to learn in the future.  Blogging on SDN is helping create a knowledge base of experiences, an intellectual asset that the entire SAP community can share.
I encourage you to share your experiences, ideas and strategies with the SDN community!
Kevin Benedict Kevin Benedict is an independent consultant on Mobile and EDI/B2B Strategies.


Exchange ECC customer master standard field with the CRM Z- fields (ECC -> CRM)

Exchange ECC customer master standard field with the CRM Z- fields (ECC -> CRM) 

This blog is a continuation of my previous blog CRM 7.0 How to --4  .

In this blog I will cover the Scenario 1 : Exchange ECC customer master standard field with the CRM Z- fields (ECC->CRM).

Since you have used AET/EEWB to enhance the BP master, you don't have to perform any task on the CRM side, as the tool (AET/EEWB) has already taken care of all the necessary tasks for you.

What you need to do is map the ECC fields to the CRM custom fields. For that, perform the few simple steps mentioned below and you are done.

Step 1: In ECC transaction SE11, look for the structure BSS_CENTI, double-click on CI_CUST and create the structure CI_CUST. Add all the fields that were added to the BUT000 table in structure CI_EEW_BUT000. Make sure that you add the fields in the same sequence (this is very important).


Step 2: Do the same for the structure BSS_CENTIX: double-click on CI_CUST_X and create the structure CI_CUST_X. In CI_CUST_X follow the same sequence of fields, but use the component type GB_BAPIUPD (a flag to indicate a change in the field).


Step 3: Copy the FM SAMPLE_FCTMODULE_DE_EIOUT to a Z function module and add the code below to map the standard fields to the custom fields.

  DATA: LS_XKNA1    TYPE KNA1,
        LS_CRMKUNNR TYPE CRMKUNNR.

  IF SY-SUBRC = 0.
    " Prefix each path with the output work area of the function module;
    " replace XXXX with the standard KNA1 field to be mapped.
    -CENTRAL_DATA-COMMON-DATA-CI_INCLUDE-ZZAFLD000003 = LS_XKNA1-XXXX.
    -CENTRAL_DATA-COMMON-DATA-CI_INCLUDE-ZZAFLD000004 = LS_XKNA1-XXXX.
    -CENTRAL_DATA-COMMON-DATA-CI_INCLUDE-ZZAFLD000005 = LS_XKNA1-XXXX.
    -CENTRAL_DATA-COMMON-DATA-CI_INCLUDE-ZZAFLD000006 = LS_XKNA1-XXXX.
  ENDIF.

The next two steps are required so that the function module created above is called when data is sent from ECC to CRM.
Step 4: Go to tr. SM30 and maintain table TBE24. Create a product and mark it active.


Step 5: Go to tr. SM30 and maintain table TBE34 for the event DE_EIOUT. This event is triggered when data flows from ECC to CRM.
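Taken together, Steps 4 and 5 amount to two customizing entries. A sketch (the product name ZBPOUT and the Z function module name are made-up examples; only the event DE_EIOUT is fixed):

```abap
* TBE24 (via SM30): define a customer product and activate it
*   Product = ZBPOUT     Text = BP field mapping     Active = X
*
* TBE34 (via SM30): attach the copied FM to the outbound event
*   Event = DE_EIOUT   Product = ZBPOUT   Function = ZSAMPLE_FCTMODULE_DE_EIOUT
```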


That is it. Your standard fields are mapped to the CRM custom fields and are ready for exchange from ECC to CRM.

Reference note: Note 736595 - Exchange of EEW fields with R/3 customer master


Master Data Management Represented in BPX Community Space

Master Data Management Represented in BPX Community Space

In the SDN Community space there have been repeated discussions about which information is actually needed. Should it be merely IT-related content, exclusively treating the technical aspects relevant to software developers and technical consultants, or should this community channel also provide a broader picture that integrates information about the underlying business dimension of the software? There is no once-and-for-all answer to this, since SDN covers areas with a predominantly IT spin as well as others where the IT and business aspects are very close.
When it comes to master data management, it becomes very clear that it is about IT and business matters alike: the impact of applied MDM on business execution in diversified IT landscapes is dramatic (which, after all, is the main reason why there is such a thing as MDM), and therefore I'd always consider the business context an intrinsic part of MDM. To get an overview of the MDM business impact, you may watch this animated 3-minute demo.
Finally, this means nothing more than that comprehensive MDM information for the community needs to cover the What, How and the Why adjusted to the specific target group.

This figure illustrates the bottom-up impact of master data quality on related business transactions and analytical processes: if, in a specific business situation (e.g. during a merger & acquisition phase), the quality of master data decreases, it is very likely that the quality of related business and analytical processes deteriorates as well.

Given this dual relevance of MDM to both the IT and the business world, it is clear that MDM needs its due space on BPX, where the focus is on the MDM business dimension.
As a consequence, Master Data Management is going to be featured in the BPX Community space under Business Themes starting today.
I hope you'll find this "new" BPX topic interesting and the information therein beneficial for your business.
Kind regards,


Master Data Quality, Governance and Management

Master Data Quality, Governance and Management 


During one of my client implementations, I met some very good salespeople to whom I had to sell the idea of Data Standards, Quality and Governance.

This was because, in their current landscape, data was entered in whatever format the user wished. New customers and materials were created all the time without checking whether such a customer or material already existed in the system. The first reason I was given was that the search functionality of SAP ECC is not very good. The second was that it is very hard to find the right customer or material, since there are already so many customers and materials with the same parameters. Over time, so many duplicates had accumulated that it was hard to choose the right record. Looking into their system, I was shocked to see that Name 2, Name 3 and Name 4 held address details while fields like Street were empty. Where Street was filled, it held the city details; the City field held the district, region or even the country. For some customers, though rarely, the data was properly maintained. One good thing was that one of their systems did follow standard procedures and had a governance mechanism; the data there was about 90% correct, although it too contained duplicates and some wrong details.

One of the challenges was definitely Data Standards: while creating customer master data, users should, as far as possible, enter the data in the respective fields only and in the correct format. One of the most interesting questions I was asked was why MDM is required at all, since day-to-day work was being carried out as usual. He was right in that the organization was able to run its business; however, there were challenges and gaps that I had to highlight in that meeting.

The other challenge was that management could not get correct information about its customers and was unable to leverage the customer base to expand its product profile through cross-selling.

One of their pain points was that if someone entered the wrong pin code but the right city (because they did not know the correct pin code), the shipment reached the wrong location, since courier delivery is based on pin codes. After the shipment came back, the correct details had to be found, either by calling the customer directly or by getting the information from the salesperson. The correct pin code was then passed to the courier company again, and the shipment was finally delivered. This whole exercise led to increased shipment cost, increased delivery time and, most importantly, customer dissatisfaction.

For the duplicates, I took a very simple approach. I asked one of their salespeople for his mobile phone and went to his contacts folder. As expected, he was using the company name as a prefix or suffix. I asked him a simple question: if you have many contacts with a common name, how do you make sure you always call the right one? He answered that he uses the company name along with the contact name to distinguish between contacts. My next question was: without adding the company name, how would he recognize the right contact, unless he remembered the phone number (which is not realistic)? By then he understood that without the company name he might have to call every contact with the same name to reach the right person, wasting a lot of time and money.

On data quality, I was both confused and amazed by the scattered data. I explained that if the right data had been entered in the right fields, with the help of standards, validation checks, processes and a little governance, the system would have been in much better shape, and perhaps we would not have had a reason to meet.

I was sincerely thankful to be part of such an engagement and learning experience, and finally to be able to solve their problems. It is an amazing feeling to be part of such challenging assignments and to help customers with the right solutions. This is what fuels the desire to take on such work.


SAP Developer Network Latest Updates