SAP NetWeaver MDM 5.5 Support Package 02 on the Way

Support Package 02 for SAP NetWeaver MDM 5.5 has been available to customers since August 15, 2005, through the SAP Software Distribution Center. It enhances MDM 5.5 with new and improved capabilities, including:

  • Enterprise Portal enablement: The MDM iView Library provides a set of collaborative iViews to search (e.g. pick list, hierarchical, attribute, text) and display (e.g. search state, result set, item details) data stored in an MDM repository. iViews can be configured and assembled into portal pages that are searchable in multiple dimensions without writing a single line of code. Finally, each iView is search-state aware and fully collaborative with other MDM iViews on the page.

  • Improved communication with R/3 client systems: Mass IDocs for Material (MATMDM), Customer (DEBMDM), and Vendor (CREMDM) help transfer object data more efficiently from ERP to MDM. Refer to SAP Note 836985 to find the required SAP plug-ins. You can start the local extraction (initial load, delta load) from an ERP system (PUSH scenario). The substructures for Customer and Vendor have been enhanced. New endpoints for the collection and distribution of data, called ports, simplify the manual and automated exchange of data between MDM and external data providers (client systems). Ports also serve to bundle the relevant configuration information for importing and exporting data, including client system, map, and data location.

  • Key generation (key creation in the outbound process): If an object does not yet exist in the client system, the Syndicator draws a distinct key for the target system from a range or a qualified range. When an object is distributed to a client system in which it does not yet exist, a new key has to be determined for that client system. The key generation settings define whether key generation is controlled and what the key ranges for the client system are.
  • Enabling BI reporting: In conjunction with SAP BI the following scenarios are supported:

      You import data from your BI system, consolidate it, and send it back to your BI system. This scenario can be used, for example, as the basis for cross-company reporting.

      You import data from any client system (e.g. ERP), consolidate it, and send it to your BI system. This scenario can be used, for example, to find duplicates and store key mapping information in your BI system.

  • Extended platform support including 64 bit versions: The platform availability is extended to include 64-bit Linux, Solaris and HP-UX. For details refer to the Product Availability Matrix (PAM) on the SAP Service Marketplace.
  • Transora support (GDS-CP): Transora Data Pool 5.1 is now supported in addition to UCCnet 3.0.

  • Data management workflow: MDM Data Management Workflow, integrated into the MDM Data Manager, enables the orchestration of parallel and sequential data management activities on groups of objects, including user tasks, validations, and approvals. It includes a workflow designer and an integrated worklist. You can design your own workflows in a graphical environment using elements, called workflow steps, representing user activities, validations, approvals, and notifications. Each workflow step can be configured and assigned to users or roles to suit your business processes. The workflow is started and executed in the MDM Data Manager.
  • Enhanced Data Manager functionality: A Web browser is embedded in the MDM Data Manager, enabling “punch-out” integration.

  • Additional language support (Japanese)
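The controlled key generation described in the feature list above (a new remote key is drawn from a per-client-system range only when the object is not yet known in that system) can be pictured with a small sketch. This is not the MDM Syndicator API; the class and method names below are invented purely to illustrate the idea.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of controlled key generation during syndication:
// each client system owns a key range; a new key is drawn only when the
// object has no key mapped for that client system yet.
public class KeyGenerator {
    // next free key per client system (seeded from the configured range start)
    private final Map<String, Long> nextKey = new HashMap<>();
    private final Map<String, Long> rangeEnd = new HashMap<>();
    // existing key mappings: (clientSystem, objectId) -> remote key
    private final Map<String, Long> mapping = new HashMap<>();

    public void defineRange(String clientSystem, long start, long end) {
        nextKey.put(clientSystem, start);
        rangeEnd.put(clientSystem, end);
    }

    public long keyFor(String clientSystem, String objectId) {
        String mapKey = clientSystem + "/" + objectId;
        // reuse the key if the object is already known in the client system
        Long existing = mapping.get(mapKey);
        if (existing != null) return existing;
        long key = nextKey.get(clientSystem);
        if (key > rangeEnd.get(clientSystem))
            throw new IllegalStateException("key range exhausted for " + clientSystem);
        nextKey.put(clientSystem, key + 1);
        mapping.put(mapKey, key);
        return key;
    }
}
```

Syndicating the same object to the same client system twice yields the same key, while each client system draws from its own configured range.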

Master Data Management - Why Now? Part II



In this two-part weblog series, I have put together 10 questions that can help customers understand the importance of MDM within their enterprise. In Part I, I focused on the business aspects of master data management; in this part my focus is on the IT aspects of MDM. I hope you find it useful. Please provide your comments and feedback. Also, let me know if there are important questions that need to be answered during an MDM implementation.

What has IT done to maintain the accuracy and reliability of master data?

There is no way that IT has done nothing and simply left master data to business users in a heterogeneous IT environment. If you are a big SAP shop, you are probably using ALE integration or XI to transfer master data across all instances. If your landscape is mixed, you may have custom solutions or EAI tools doing the work for you. But take a closer look at the question and try to find out what IT has done so far to ensure the reliability and accuracy of master data. The answer is most likely going to be 'not much', simply because sophisticated tools to solve this problem were not available until now.

How does SAP NetWeaver MDM fit into your new IT Architecture?

Yes, the 'new architecture' is important here. If you and your IT are still talking the old way of point-to-point integrations and custom interfaces, and not thinking of new ways of solving IT problems, then stop. Even before you take up the topic of MDM, you may want to focus on a new IT architecture that is more flexible and agile in adapting to new business requirements. Today's service-oriented architecture promises exactly these benefits, and one should think about introducing MDM within this new architecture. Master data is the foundation of the business, and master data management is at the core of the new architecture as well.

Where should I start?

OK, I have all my pieces right, now what? Where should I start, what should I do first, and what does my MDM roadmap look like? If you not only have answers to all the questions above but have started implementing them, then you are on the right track. You always want to start by defining a master data management strategy for your enterprise and then implement it. Remember, SAP NetWeaver MDM is an important piece of your puzzle, but not the biggest one. You need answers to all the questions above before implementing the MDM tool in order to achieve the results you expect.

Why NOW?

Many customers I have talked to have either a major ERP project or technology enhancement projects like RFID going on. No one wants to carry their master data problems forward. It is important to think about the bigger picture from your IT roadmap perspective, and having MDM as an important piece of it is critical for achieving better results. You may need to work out the pros and cons of MDM along with its costs and benefits. But don't just calculate the cost of implementing MDM; find out the cost of not having MDM and you will know the answer to 'why NOW'.


Master Data Management - Why Now? Part I


In this two-part weblog series, I have put together 10 questions that have helped customers understand the importance of SAP MDM within their enterprise. In the first part, I focus on the business aspects of master data management; in the second part I will focus on the IT aspects of MDM. I hope you find it useful. Please provide your comments and feedback. Also, let me know if there are important questions that need to be answered during an MDM implementation.

Is management of master data strategic to your enterprise?

Master data has always been of high importance to organizations, but people often don't pay much attention to maintaining and managing it properly. One needs to ensure that the management and maintenance of master data is not only of high importance to the enterprise but also strategic in terms of reliable and accurate information management. A master data management strategy should encompass, but not be limited to, achieving accurate and reliable master data, governance, maintaining master data quality, and ultimately achieving better results.

What is master data for your enterprise?

Generally, master data comprises customers, suppliers, products, employees, and so on, but is not limited to these objects. For the oil and gas industry, a master data object can be an oil well, while for the food and drug industry it could be a chemical composition. One needs to define master data for the enterprise and explicitly specify why each object qualifies.

Where and how is master data being used in your enterprise?

Once the definition of master data is clear, it is important to find out where and how master data is being used in the enterprise. One cannot restrict the view to applications only, because the use of master data can go beyond application boundaries. For example, a product that originates in one application can be used in a marketing campaign and may have totally different attributes there than the application holds. This is a very important exercise in the overall MDM implementation, because it defines the characteristics of each master data object. Once all the attributes and characteristics of a master object are clear, one can start linking them to software applications.

Do you have master data management processes?

Whenever this question gets asked, the response is often a counter-question: 'Do I need one?' And the answer is always yes. People often confuse a process with a workflow. Every organization has processes, and surely for master data management as well. The question to ask is: is the current process to manage and maintain master data enough? Having a well-defined process, from the creation of master data to its retirement, through retention and maintenance over its lifecycle, is very important and should be thought through during an SAP MDM implementation.

How do you maintain master data quality in your organization?

What does that mean? We don't maintain it, our IT systems maintain data quality: this is the most common answer I have received so far. But I must admit that some companies are further ahead on this aspect than others, and I have come to realize, based on statistical data presented to me, that an enterprise needs to think about having processes and procedures to maintain data quality.

Have you implemented Master data governance?

Governance is a hot topic everywhere. From corporate governance to IT governance, it has changed the way we manage things. The general purpose of governance is to have an authoritative body that defines policies and procedures, implements them, and then monitors them throughout. The same applies to master data management; it can be part of overall IT governance or information governance, but it should be thought about as part of the overall MDM strategy.


Retrieving Data from MDM server using the MDM Java API

In my previous weblog, MDM Connectivity to Java Applications, I introduced the SAP MDM connector.
In this weblog I will go deeper and show how to establish a connection and retrieve data from MDM using the connector APIs.

The classes in the MDM Java library expose the functionality of the MDM catalog to a Java application. The main functional areas are:

  1. Establishing connections
  2. Catalog data model
  3. Multi-dimensional search
  4. Catalog data retrieval
  5. Catalog data management

The main class in the library is the CatalogData class, which exposes all the services of the MDM Server.
Among the services that it provides outside of those of the MDM catalog itself are a local catalog data cache,
a pool of MDM Server connections, and full thread-safety. Other objects in the library exist to support the
CatalogData class. They are either used as arguments to methods or return values for methods and properties
of the CatalogData class.
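The connection pooling that CatalogData provides can be pictured with a generic sketch. This is not the CatalogData implementation; the names below are invented to illustrate the underlying pattern of reusing expensive server connections (the synchronized methods hint at the thread-safety mentioned above).

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Generic connection-pool sketch (illustrative only, not the MDM code):
// connections are expensive to open, so released ones are kept for reuse.
public class SimplePool<T> {
    public interface Factory<T> { T create(); }

    private final Deque<T> idle = new ArrayDeque<>();
    private final Factory<T> factory;
    private int created = 0;

    public SimplePool(Factory<T> factory) { this.factory = factory; }

    public synchronized T acquire() {
        if (!idle.isEmpty()) return idle.pop();   // reuse an idle connection
        created++;
        return factory.create();                  // otherwise open a new one
    }

    public synchronized void release(T conn) { idle.push(conn); }

    public synchronized int totalCreated() { return created; }
}
```

A caller acquires a connection, uses it, and releases it back to the pool; only when no idle connection exists is a new one created.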

The following code shows how to retrieve categories from the MDM repository.
Step 1: Obtain a connection to the MDM repository.
1.1 Get the context:
Context ctx = new InitialContext();
1.2 Look up the connection factory class in JNDI:
IConnectionFactory connectionFactory = (IConnectionFactory)ctx.lookup("deployedAdapters/MDMEFactory/shareable/MDMEFactory");
1.3 Get Connection Spec to set connection properties
IConnectionSpec spec = connectionFactory.getConnectionSpec();
1.4 Set Connection Properties
spec.setPropertyValue("UserName", "Administrator");
spec.setPropertyValue("Password", "123456");
spec.setPropertyValue("Server", "server1");
spec.setPropertyValue("Port", "50000");
spec.setPropertyValue("RepositoryLanguage", "Chinese [HK]");
1.5 Get the Connection
IConnection connection = connectionFactory.getConnectionEx(spec);
Step 2: Obtain access to the native object
2.1 Retrieve the native interface:
INative nativeInterface = connection.retrieveNative();
2.2 Get the CatalogData object (the physical connection):
CatalogData catalog = (CatalogData) nativeInterface.getNative(CatalogData.class.getName());
Step 3: Retrieve data from Catalog
3.1 Create a ResultSetDefinition for the Categories table:
ResultSetDefinition resultDefinition = new ResultSetDefinition("Categories");
3.2 Create a Search object for the Categories table:
Search search = new Search("Categories");
3.3 Get data from the repository:
A2iResultSet rs = catalog.GetResultSet(search, resultDefinition, "Id", true, 0);
Once we have all the data in the result set, one can manipulate it and retrieve the data.
Step 4: Close the connection when you are done:
connection.close();
The last step is configuration:
Step 5: Set the connector libraries in the application-j2ee-engine.xml file
The MDME Connector uses the SAP Connector Framework and MDME4J libraries as a shared library.
You need to add these libraries into the application-j2ee-engine.xml file.

 <?xml version="1.0" encoding="UTF-8"?>
 <!DOCTYPE application-j2ee-engine SYSTEM "application-j2ee-engine.dtd">
 <application-j2ee-engine>
   <reference reference-type="hard">
     <reference-target provider-name="" target-type="library">tc/conn/connectorframework</reference-target>
   </reference>
   <reference reference-type="hard">
     <reference-target provider-name="" target-type="library">MDME4J</reference-target>
   </reference>
   <provider-name></provider-name>
   <fail-over-enable mode="disable"/>
 </application-j2ee-engine>




Using the MDM 5.5 Java Connector with SAP Enterprise Portal 6.0

This example uses a portal component which opens a connection to a repository running on MDM Server,
retrieves data from the repository, and closes the connection.
There are 5 steps we will implement:

  1. Set the connector libraries in the deployment descriptor file portalapp.xml
  2. Open a connection to a repository
  3. Get the physical connection via the native interface
  4. Get data from a repository
  5. Close the connection

Step 1: Configure the portalapp.xml file
The MDME Connector uses the SAP Connector Framework and MDME4J libraries as a shared library.
You need to add these libraries into portalapp.xml.
<property name="SharingReference"

Step 2: Open a connection to a repository (using the Connector Gateway Service)

The useful EP6 features of system object aliases and single sign-on user mapping are available only by using the Connector Gateway Service approach.

// Get the Connector Gateway Service 
IConnectorGatewayService cgService = (IConnectorGatewayService)
PortalRuntime.getRuntimeResources().getService( IConnectorService.KEY);

// Create ConnectionProperties for portal user
ConnectionProperties prop =
new ConnectionProperties(request.getLocale(), request.getUser());

// Get a connection
IConnection connection =
cgService.getConnection("MDME_SYSTEM_ALIAS_NAME", prop);

Step 3: Get the physical connection via the native interface

Retrieve the native interface from the connection, and invoke getNative method.

// Retrieve the native interface
INative nativeInterface = connection.retrieveNative();

// Get the CatalogData object (the physical connection)
CatalogData catalog = (CatalogData) nativeInterface.getNative(CatalogData.class.getName());

Step 4: Get data from a repository

All methods of the CatalogData object are available for retrieving data.

The following code uses the GetResultSet method as an example.

// Create ResultSetDefinition for products table
ResultSetDefinition rsd = new ResultSetDefinition("Products");

// Create Search for products table
Search search = new Search("Products");

// Get Data from table
A2iResultSet rs = catalog.GetResultSet(search, rsd, "Name", true, 0);

Step 5: Close the connection

// Release the connection when done
connection.close();







The Architect's World - Episode 11



The B2B bust did not kill the ideas behind it; those ideas are returning. MDM 5.5 could potentially bring the concept back to life. It is the strategic direction SAP takes with MDM in the future that could prove to be the trigger for a B2B resurrection. There exists today a substantial gap between the promise and the reality of collaborative commerce, unlike what was touted in the late 90s. Visionaries always wanted us to believe that we would all soon be connected to vast, virtual, paradigm-changing digital public marketplaces and start collaborating. Like the X-Files, these marketplaces were supposed to be “somewhere out there”; we were never supposed to know where or when. From these initial thoughts is derived a set of key capabilities and requirements that may be used in evaluating specific collaborative commerce implementation technologies.

More importantly, it is about understanding the space, having a single platform vendor address business needs rather than just technological ones, and not looking at MDM 5.5 as a lone, disconnected product, as many see it today. Virtual collaboration links the business processes of a customer's value chain partners through the Internet, so that available knowledge can be exchanged throughout the value chain for a specific industry vertical, using universally accepted identification numbers. Virtual collaboration is a means to increase efficiency while maintaining or even reinforcing uniqueness, a solution that now seems within reach with the integration of MDM into the SAP NetWeaver platform. Thus the improvement scope for an organization broadens from reducing costs to increasing revenues, accelerating time-to-market, and enhancing customer satisfaction.

A new form of virtual interaction is added to our comprehensive platform model: collaboration. Naturally, this will increase the complexity of the partners' interactions, which are now mostly limited to informational and transactional interactions with the customers attempting this initiative with MDM 5.5 today. At first, collaborative interactions between organizations need to focus on the operational level, upon establishing virtual links between the business processes involved. With EP as the front end and a little flexibility, SP02 proves this; let us call this the e-Integration stage. SP02 will at least assuage MDM 5.5 users with its display abilities through the Enterprise Portal. This may be followed by a second collaboration stage, e-Partnering, in which value chain partners extend their collaborative interactions to tactical and perhaps even strategic levels. For this, interaction with BI 3.5 would make sense. The opportunities created by such virtual collaboration are immense, especially in the area of composites. Let us try to see how MDM 5.5, in conjunction with GDS and industry-specific needs, will shape the vertical nets of the future as an integral part of the NetWeaver stack, and help us define a case for MDM to carve a road ahead with ESA for an organization exploring such a possibility towards collaborative commerce.

Blast from the past:

An eMarketplace can be a business destination that provides a broad offering of products, services, and content as well as a venue for electronic business transactions, including exchanges; eProcurement is the purchasing of materials, mainly indirect materials or services, through an electronic exchange. MDM 5.5 would be a great solution for catalog management and syndication for both. In the past, there used to be advanced Excel-based tools to create .cif files for catalog management.

The buying and selling of direct materials, finished goods, and services come under eMarketplaces. eMarketplaces most often include an exchange as part of their services. For example, a marketplace might offer catalogs, negotiations, and an exchange in addition to a number of supply chain services (a combined digital marketplace of Ariba Marketplace and Ariba Dynamic Trade). The core engine for this is an advanced catalog management engine, something like the MDM 5.5 solution.

Name calling: Exchanges, eHubs, iHubs, e-Markets, Trading Networks, e-MarketPlaces, NetMarkets etc. etc. etc.

Resurrection of the e-marketplaces?

Early adopters across the value chain had experimented with a variety of new business-to-business commerce models, technologies and application designs. At the same time, there was always a growing demand for new standards to facilitate the exchange of information, catalog content and transactions between buyers and sellers. While the methods have been numerous and complex, the underlying goal has remained the same: to bring buyers and sellers together with an automated flow of information and transactions, while still supporting individual business and contractual relationships between trading partners, but for small collaborative chains and inherent players as depicted in our storyboard. The essence of the B2B boom now gets carried forward, except now there is a promise of reality.

The Extended Enterprise concept now gets stretched to become an Extended Collaborative Enterprise, and the cases for logical composites become:

1. Design: Product conception and design for New Product Introduction (NPI)

2. Planning: Determine product mix and quantities based on demand forecast and manufacturing capacity

3. Sourcing and reverse auction: Identify and select suppliers and negotiate and establish purchase contracts with suppliers

4. Marketing & Sales: Market and create demand for new and existing products

5. Manufacturing & Inventory Management: Work with sourcing to maintain low inventory levels and manage an efficient just-in-time (JIT) process

All of these have one element in common: master data. A virtually connected extended collaborative enterprise, which the original players could not manage owing to a myriad of technological platforms with no master data management backbone.

The Matrix Evolution:

First-generation digital e-marketplaces (sell-side: one supplier, many buyers) and second-generation ones (buy-side: one buyer, many suppliers) were served by vendors like CommerceOne, Ariba (ex-Tradex), FreeMarkets (now Ariba), and others, whose commerce solutions fell short of this goal with their top-down approach, compared with the bottom-up approach of SAP NetWeaver. They limited their focus to either the buy side or the sell side of the equation, without truly understanding how to bring buyers and sellers together into small collaborative chains. The other shortcoming was that, although the concept of vertical marketplaces had taken shape by then, the industry backing behind the key standards bodies was disparate. A stance SAP has changed completely.

This lopsided view of the commerce process resulted in one participant (the buyer or the seller) inappropriately dictating proprietary solutions or standards to the other. And in most cases, this strategy did not scale. The next generation of commerce solutions saw the birth of eMarketplaces, specifically designed to enable multi-buyer/multi-seller interaction and collaboration. They provided a common trading hub where multiple buyers and sellers could come together and conduct commerce without compromising individual processes and relationships among the participants, which is best understood through the pictorial depiction of the evolution below:


The Matrix Revolution: MDM could be the cup of life for e-marketplaces

E-marketplaces and vertical exchanges never died; there was only a shakeout of the incompetent, a result of nascent technology. E-procurement systems, e-marketplaces, and exchanges have always ridden on catalog management systems. MDM 5.5 (xCat) would have ideally provided the perfect catalog management solution for them. For those who have been through the B2B boom and bust, it would not be wrong to say that these concepts existed at that point in time; they set the foundation in place. Catalog syndication is not a new concept: we were already consolidating MRO items and finished goods to be hosted on the exchanges, marketplaces, and e-procurement solutions. Where the B2B bubble failed was in the disjointed way these solutions integrated with the backend ERP systems. Again, this is only ONE factor. Today's MDM 5.5 (some call it a regression from the MDM 3.0 solution, which ran on the ABAP stack) is a standalone application with nothing to do with WAS, a pure xCat application, and by itself does not hold much promise. But what is important is the vision one may want to decipher that SAP may be pursuing with it. One would presume that the next version (MDM 7.0?) will be on a WAS (Java + ABAP) stack, fully integrated with EP in the true sense, using XI and BI (and beyond) for a complete solution.

Some solution architecture thoughts on MDM in your landscape:

(a) Acting as the online hub for catalog management and syndication across the landscape, with the master being the source of truth, giving an organization the flexibility to create online stores with ease through the Enterprise Portal, specific and business partner driven.

(b) Business-partner-specific product catalogs published online to help organizations wire up exchanges and e-marketplaces, providing community-specific information like pricing and specifications to create a network of coherent business partners where the data is harmonized within the organization and extended to the partner community.

(c) Synchronization of trading item catalogs with vertical-specific hubs like the Transora or UCCnet data pools, using GDS to create a meta-directory of items that facilitates the use of industry-specific standards: industry-specific yellow pages for all materials to be synchronized and used for trading. (This would necessitate XI and BI, leading to Analytics and CAF.)

(d) Extended supplier extranets to facilitate reverse auctions, linked up with the e-marketplaces, to be considered as xApps extended from what pure-play vendors set out towards. These composites need to switch between intranets and extranets to address various processes.

This is where composites need to seamlessly address processes to complete process loops over the extended value chain and this is the culmination of all our efforts with ESA.

MDM 5.5 – Old wine in a new bottle, and it tastes good!

Before we get into the MDM mould and explore some of the details, it is important to understand a little of the background on the product's evolution. The MDM product has been available for the last few years; its last version was MDM 3.0, an ABAP-based tool with decent capabilities in large-scale data integration and consolidation between SAP systems. The problem with this application was that it was way too complicated for a quick implementation and lacked the feel-good factor on the front end. SAP may have had to decide whether to invest more in development efforts or to buy another application from a vendor that could provide the missing capabilities. The quest for this tool led SAP to A2i and their product called xCat (of course, there would have been other strategic considerations as well).

Even though the xCat product was intended for product catalog management, it seems to have the potential, both from a conceptual and a technical standpoint, to fit snugly under the SAP NetWeaver umbrella. At first this product was called MDME (MDM Extension), but now it has been announced as the official MDM tool by SAP. The MDM 5.5 application helps with data modeling, data consolidation, fast data extraction, and decent search capabilities, and it seems relatively simple to install and implement (both from an infrastructure standpoint and in terms of end user experience). Add to this the good experience in business scenario handling for MDM, accumulated over the years in the "old" MDM development team, and technology that enables one to do complex things with little effort.

Product Catalog Management, doesn’t it sound familiar?

A PCM product helps the catalog manager of a product company manage the different aspects of information about a product. This can be complex, with a lot of information to be handled. Every product-oriented company has to have a product catalog in order to use it both internally (for manufacturing, purchasing, etc.) and externally (customers, suppliers, etc.). Complexity can arise because a product can be built from several parts in various configurations and can be connected to other associated products. In most companies the product information comes from different places in the organization (manufacturing, purchasing, operations, etc.) and is consequently stored in different IT systems. A PCM product therefore helps in:

a. Extracting product information from different source systems

b. Consolidating information and converting the different terminologies used for the same product

c. Providing ways of organizing the information depending on the user group for specific information

d. Providing fast and intuitive search

e. Providing different ways of displaying the catalog – especially, quick publishing online via EP using Java APIs
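Point (b) above, converting different terminology for the same product, can be sketched as a synonym-to-canonical-name mapping applied during consolidation. This is a toy illustration, not the MDM matching engine; all names here are invented.

```java
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;

// Toy terminology consolidation (not the MDM engine): source systems may call
// the same product different things; a synonym table maps each variant to one
// canonical term after normalizing case, punctuation, and whitespace.
public class TermConsolidator {
    private final Map<String, String> synonyms = new HashMap<>();

    public void register(String canonical, String... variants) {
        synonyms.put(normalize(canonical), canonical);
        for (String v : variants) synonyms.put(normalize(v), canonical);
    }

    public String canonicalize(String term) {
        // fall back to the input if no mapping is known
        return synonyms.getOrDefault(normalize(term), term);
    }

    private static String normalize(String s) {
        return s.toLowerCase(Locale.ROOT)
                .replaceAll("[^a-z0-9]+", " ")   // collapse punctuation/spacing
                .trim();
    }
}
```

Real consolidation is of course far richer (fuzzy matching, qualified lookups, per-field rules), but the principle of normalizing before comparing is the same.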

The Online Laundromat - MDM

Here we need to answer: what is master data? Defined simply, master data is descriptive data regarding a specific entity. Master data is fairly static and does not contain transactional information.

Master Data Management systems came to solve these issues by providing tools for extracting master data from different IT systems, consolidating master data coming from different sources, managing it centrally across IT systems, and provisioning master data from and to the different IT systems in the organization. PCM is just a special case of MDM with some extensions, like printing catalogs or image management. To ensure that we progress towards the B2B vision, MDM 5.5 comes in useful when you need to define a generic, flexible data model that can pull in data from different sources to form a consolidated source for all master data. Then comes the job of cleansing the data across the organization (finding duplicates, finding similarities, defining transformation rules, etc.) in the master server, then creating slaves to help use the data, with personalized access to master data according to end user profile (progress is already up for review with SP02), and finally synchronizing the master data with the different systems holding equivalent information (something that may not be fully possible today). However, the speed of accessing the data with MDM 5.5 is nothing short of phenomenal, especially with a large number of records.
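The cleansing step mentioned above (finding duplicates across sources) can be sketched as grouping records by a normalized match key. The field choice and normalization rules below are invented for illustration; real MDM matching strategies are configurable and far richer.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Locale;
import java.util.Map;

// Illustrative duplicate finder (not the MDM matching engine): records that
// normalize to the same match key are treated as candidate duplicates.
public class DuplicateFinder {
    public static String matchKey(String name, String city) {
        return norm(name) + "|" + norm(city);
    }

    private static String norm(String s) {
        // crude normalization: lowercase, strip punctuation, collapse spaces
        return s.toLowerCase(Locale.ROOT).replaceAll("[^a-z0-9]+", " ").trim();
    }

    // each record is {id, name, city}; returns matchKey -> list of record ids
    public static Map<String, List<String>> group(List<String[]> records) {
        Map<String, List<String>> groups = new LinkedHashMap<>();
        for (String[] r : records) {
            groups.computeIfAbsent(matchKey(r[1], r[2]), k -> new ArrayList<>())
                  .add(r[0]);
        }
        return groups;
    }
}
```

Any group with more than one id is a candidate duplicate set; the surviving record and the key mapping to the losing records is exactly the kind of information a BI system can then report on.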

The plug-ins (QuarkXPress and Adobe InDesign), which come in very handy for publishing catalogs, are part of the PCM scenarios supported by this tool; the Image Manager handles the images usually kept in a product context, and the plug-ins are meant for publishing the product catalog in different ways. Customers without an MDM solution to date have been trying out all possibilities to handle this, for example sample third-party applications containing this data that needs to be integrated via the Enterprise Portal as part of a POC. Add to that the Java APIs, which organizations can use to create their own storefronts. Now, this is an area that SAP has to put in place very fast. Imagine customers with an MDM solution in place creating web applications in IBM WebSphere or any other application server to create their own static web stores. Of course, this is a technological solution, but it falls completely out of sync with the bigger picture with SAP. It is debatable whether this approach is good or not.

The Management Console

The console is primarily intended for two roles: the MDM administrator, who deals with connectivity, accessibility, security, and so on; and the MDM modeler, who creates and maintains the object/data model defined according to business needs.

A Stop-gap Arrangement for XI – Bring on the Import Manager

With limited data sources to deal with, the Import Manager is today meant for importing data from several data sources (a relational DB, an Excel sheet) in order to define and populate the data model. With it, one can define the rules for importing data into the model according to the source structure. The punch lies in aggregating data from an electronic format (the source), transforming it using rules, and rationalizing and normalizing the data to finally arrive at a clean data set, all without having to do any coding.
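To make the idea of import rules concrete, here is a minimal sketch of the kind of value mapping the Import Manager applies during an import (plain Java, not the MDM API; the mapping entries are invented for the example):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Sketch of the value mapping an import step applies: raw source values -> canonical values. */
public class ValueMapper {
    // Hypothetical mapping table, the kind built interactively in the Import Manager.
    private final Map<String, String> valueMap = new LinkedHashMap<>();

    public ValueMapper() {
        valueMap.put("IBM CORP.", "IBM");
        valueMap.put("I.B.M.", "IBM");
        valueMap.put("HEWLETT PACKARD", "HP");
    }

    /** Normalize a raw source value: trim, uppercase, then map to its canonical form. */
    public String normalize(String raw) {
        String key = raw.trim().toUpperCase();
        return valueMap.getOrDefault(key, key); // unmapped values pass through unchanged
    }
}
```

In the Import Manager itself this table is built by pointing and clicking, mapping source values to destination values; no code is written, which is exactly the punch line above.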

Of course, it has always been a point of debate whether XI 3.0 should be used for the same purpose. But if it is a one-time load (which is not the idea here), it makes no sense; it is like using a Cadillac for buying groceries. Once the entire landscape is in perspective, with the master and the slaves defined and data upload and distribution strategies in place, XI starts making sense. It has to come in sooner or later, as the landscape will move towards GDS (if the vertical is retail) with 1SYNC today, along with the mandate for the usage of BI and Solution Manager in the landscape. But it is today's need that should define the IT strategy. Maybe not a stop-gap at all.

A Stepping stone towards ESA – The Syndication Manager

The Syndication Manager is the component that knows how to exchange data with other systems, meaning this tool will handle the synchronization between the MDM storage and the source systems. The Syndication Manager will not do it alone; it will warrant the use of XI in order to provide a robust way of interacting with the myriad systems the client would have in the landscape. Then again, this is an area which will have to be explored in detail before a solution can be proposed.

Content Manager

The Content Manager is what the end user needs to maintain and work with master data. Also known as "the client", it finds its use in fetching data in the most intuitive way across users working on the same data model. The workflow for managing the master data stored in MDM (add, change, flag for delete), supposedly handled by the workflow feature (a .exe file available with SP02, which is currently being explored in-house), may or may not work; if it does, it should really set the ball rolling. If it doesn't provide the desired results, there are options to get this in place: create customized iViews for changing and deleting records, create a custom workflow engine integrated with EP to handle this, use Webflow, or use a third-party workflow engine. MDM being deployed on WAS would be very useful indeed from an integration standpoint.

The invisible men - MDM Java APIs

The MDM 5.5 server exposes APIs for both Java and Microsoft applications (.NET and earlier) that enable your applications to use the data and functionality it provides. Because most of MDM 5.5's functionality resides in the server, everything you can do from one of the clients above, you can do from your own applications via the API. An API for ABAP is being developed as we speak and will be available in the near future. Now, the point that needs to be kept in mind is: how does one, as a Solution Architect, influence the decision of having this online web store within the boundaries of SAP applications?


GDS and Vertical Nets:

The next step of the solution makes sense when we bring the vertical industry-specific nets into the picture. This is the area where XI and BI start making sense in the big scheme of things, leading one to believe that MDM 7.0 (or whatever it is called) would soon be an application deployed on WAS (Java and ABAP), and xCat would soon be a thing of the past. When we extend our solution as an online marketplace, with or without SRM/EBP, using MDM to publish catalogs on an online store via the Java APIs in an application on or off WAS, it may not help an organization derive the true business benefits of collaborative commerce with what SAP has on the anvil. Extend this with GDS and the RFID concept, and that leads one to the concept of resurrecting B2B exchanges.


This is something that may or may not happen in toto. But it is logically possible. And with the SAP NetWeaver platform in place, MDM 5.5 does become the backbone for these c-chains in the future. Conservative companies will be in a position to reach out to their business partners in a collaborative manner with minimal investment. The emergence of 1SYNC, the transition of Transora from a B2B marketplace running Ariba Marketplace, e2open with Dynamic trace…..the new IT landscape layout and the orchestration of the various elements would render ESA useful. After all, wasn't SOA a core concept of the B2B boom? This is where ESA begins. More on this topic in my next blog.




MDM as a core NetWeaver component

MDM is a wonderful tool for data consolidation and harmonization. All customers with multiple applications, each with multiple definitions of data assets, will sooner or later face redundancy, waste, and lots of lost opportunities. MDM comes along to solve all this, and while it can be used as a standalone application, I've found that using it with other NetWeaver components greatly helps to raise its value.

In order to demonstrate the huge flexibility of MDM as a core part of NetWeaver, I propose the following architecture as a simulation of a data integration system:


In this example, I'm planning to use any standard SQL-based system such as MySQL, or even MS Excel, to demonstrate its flexibility and ease of use. Both sources should have different data structures and some duplicated data, as in real life. MDM will then harmonize and syndicate all data, using XI as the main medium. This way, we will be able to use XI capabilities such as workflow integration.

It's clear that the same integration could be done without SAP XI; however, establishing point-to-point connections will prove to be unmanageable due to the overhead caused by the addition of new systems, duplication of business objects, and lack of transparency in the data flow. By introducing XI, all data becomes compatible, and as a result we obtain a highly scalable landscape, reducing the cost and time of implementation.
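The overhead argument can be made concrete: n systems connected point-to-point need on the order of n(n-1)/2 bidirectional interfaces, while a hub-and-spoke topology such as XI needs only n. A quick sketch:

```java
/** Compares interface counts for point-to-point integration versus a hub (XI-style) topology. */
public class InterfaceCount {
    /** Bidirectional point-to-point links needed to fully connect n systems: n*(n-1)/2. */
    static int pointToPoint(int n) {
        return n * (n - 1) / 2;
    }

    /** Links needed when every system connects only to a central hub: one per system. */
    static int hub(int n) {
        return n;
    }
}
```

With 10 systems, that is 45 point-to-point interfaces versus 10 hub connections; every new system adds n new links in the first topology but only one in the second, which is where the scalability claim comes from.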

I am currently developing this architecture and I want to share my results. Soon I’ll post my experiences, recommendations and problems I encounter during this process.




Brief Introduction to Masks in MDM 5.5

What are Masks?

As Roman Rytov explains in one of his blogs, masks are static lists of selected objects, not queries executed every time. To put it another way, a mask is a snapshot of a chosen set of records.

SAP’s MDM Console guide’s definition: A mask acts like a stencil, in that it blocks (“masks”) all main table records from view except the defined subset of records that are included in the mask, to allow the subset to be viewed and manipulated as a whole.

Is there any separate table for Masks?

Yes. There is a special table (of type Hierarchy) called Masks. Whenever a new repository is created, the "Masks" table is created by default (along with the other special tables).

How to create a Mask?

Select the menu option "Records->Modify Mask..." and, in the list of available masks, right-click and add a mask as a sibling or a child.


How to add/remove records to/from a Mask?

From the MDM Client, go to "Records" mode and, from the list of records displayed, select the records that you want to add to a mask, right-click, and choose "Add to Mask" from the context menu. Then choose the target mask from the cascading hierarchy of menus.

To remove records from a mask, or to replace the records in a mask, use the other options displayed in the context menu.


Where can I use the Masks?

Consider a customer with a product repository mounted on MDM and different users accessing this repository. The customer's requirement may be to restrict users from accessing the complete set of records and allow them to access only a subset.

This can be achieved easily by utilizing the masking capability in MDM 5.5. First, different masks are created and records are added to these masks as per the requirement. After this, each user is given access to a specific mask containing the records that are relevant to that user (hint: just play around with "Users" and "Roles" in the Console).
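Conceptually, a mask is just a stored set of record IDs that the visible record list is filtered against. A toy model in plain Java (not the MDM API; the class and method names are invented for illustration):

```java
import java.util.*;

/** Toy model of an MDM mask: a static set of record IDs that filters the visible records. */
public class Mask {
    private final Set<Integer> memberIds = new HashSet<>();

    public void add(int recordId)    { memberIds.add(recordId); }
    public void remove(int recordId) { memberIds.remove(recordId); }

    /** Return only the records whose IDs are in the mask, preserving the input order. */
    public List<Integer> visible(List<Integer> allRecordIds) {
        List<Integer> out = new ArrayList<>();
        for (int id : allRecordIds) {
            if (memberIds.contains(id)) out.add(id);
        }
        return out;
    }
}
```

Combined with Users and Roles in the Console, this is how each user ends up seeing only the subset of main-table records in "their" mask.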




MDM - Using Standard iViews for Search and Display


After the initial problems of downloading SAP MDM SP02, we were finally successful in getting the server up. However, the tricky part is getting the iViews. Unlike the standard business packages, SAP MDM gives you a couple of .SDA files which, when deployed, will get the SAP MDM connector up and some MDM content onto the EP side. It will not give you ready-made iViews which you can start using immediately, and there is a good reason for this: the repository will be different from customer to customer, and so will the data model. So the iView has to pick fields from the specified repository and then build the screen.
In order to do that, the standard thing you have to do is create an MDM system on the Enterprise Portal, wherein you specify the MDM server details and also which repository you want to connect to. Once that is done, do the user mapping for the MDM system.
Now, coming to the creation of iViews, SAP gives you the following standard iView types for MDM. The following is a screenshot of what is displayed in EP while you are creating the iView (I did this so I don't have to type the list myself).
You might be wondering what each of these means, since the list looks different from the standard iView types that you usually see for R/3, CRM, and other systems. Let me try and explain.
The following four are meant for customizing the search criteria.
MDM Search Hierarchies - This iView will allow you to search the data by any lookup table that you have defined as Lookup [Hierarchy]. While creating this iView, it will prompt you to select which hierarchy table you want to use, if you have more than one.
MDM Search Attributes - This iView is for searching on the attributes of your product. However, it has a dependency: it requires a category, so that it can fetch the list of attributes for that category.
Once we have chosen the category, the iView we created by specifying the table displays its attributes. In the picture above, we have selected Goggles as the category, so below we see the attributes of that category.
I have also tried to show that it lists the possible values of each attribute, so that if you want to narrow down your search further, you can do that too.
MDM Search Picklists - This is more or less a straightforward search on a specific column of your main table in the repository. In this case, I have chosen Manufacturers.
MDM Search Texts
This is the iView that I personally like best: here you can search by a keyword.
Now comes the fun part. Now that I have narrowed down my search, where is my list of products displayed?
This happens in the MDM Resultset iView.
We need to keep in mind that these iViews are interdependent: the data shown is the filtered result of what you have done in the other iViews. While creating this iView, you can select which fields you want to display.
SAP MDM Item Details
This is pretty straightforward: once you select the radio button in the Resultset iView, this one shows all the details of the product. Here too, while creating the iView, you can select which fields you want to display. If the product has attachments, they will appear as hyperlinks, and one click will open the corresponding document.
Oops, I almost forgot about the last iView type, SAP MDM Search States. What this does is tell you which search criteria the user has applied.
If you want to remove search criteria, it's pretty straightforward: just click on reset search for the specific field, or on the button for resetting the entire search criteria.
So, from a catalog management perspective, we are good to go. We need to remember that the standard functionality given by these iViews is only Search and Display. If you have a business process that requires extending the same, we need to enhance them.
In further continuation of the MDM learning process, we will look at synchronizing data between R/3 and MDM using the Syndicator and XI.



The Architect's World - Episode 12


In continuation of my previous blog on MDM, where we looked at MDM as an application, a product, or just another application under the umbrella of SAP NetWeaver that could help resurrect the era of digital marketplaces, in this blog I will approach the same solution from the other end of the spectrum: from the world of PIM, or Product Information Management, applications, linking the same up with GDS and RFID, and taking a look at SAP MDM as another PIM solution and how it can fit into your overall scheme of things. Organizations are not new to the solution; rather, some solutions have been out there for a very long time. Having burnt its fingers with best-of-breed applications facing rough financial weather, this organization (Org.A) is now in the process of evaluating point solutions vis-à-vis MDM, to leverage the SAP NetWeaver platform and use its existing SAP landscape and other systems to meet these requirements. PIM products have been around, with great ideas coming from vendors with many different strengths and approaches, encompassing enterprise data management/master data management, enterprise content management, global data synchronization services, product life-cycle management solutions, printing and publishing catalog solutions, and vertical industry-specific applications. In this blog, I will put forth a part of my study on this subject for Org.A, helping them arrive at an ESA roadmap.

The Evolution of PIM

The PIM application space evolved a few years back, as Web catalog management and print catalog management products merged and extended themselves towards:

1. Extracting and transforming product data. Ensuring that the solution helps in normalizing, classifying, and cleansing the existing master data in terms of the various attributes and rules specified.

2. Integrating structured product data with unstructured content. Having the application assist in merging structured and unstructured content like long descriptions, images, video clips, and data sheets for items with the items in the same repository.

3. Print/publishing catalog content. A need that evolved during this time for catalogs to be published online as well as on paper, and for the product to continue as a catalog-handling system for digital marketplaces, exchanges, and e-procurement systems.

4. Data synchronization. Synchronization that goes beyond existing data held internally in single- or multiple-source systems, aimed primarily at global data synchronization with external data pools such as Transora or UCCnet, or other pools.

5. Advanced integration capability. A need that helps the solution offer additional functionality, like pricing and promotion management, by making use of advanced search, workflow, and auditing/reporting integrated with the existing applications.

The hunt for an ideal PIM Solution in Org. A

1. Enterprise data management, or master data management, as the online laundromat solution: the process of consolidating, harmonizing, and cleansing or enriching the data, and of distributing reference information based on the nature of the master data, forms the very core of Org.A's needs as they exist today.

2. Enterprise Content Management or ECM to address the managing of various types of structured data and unstructured content such as Web content, documents, digital and media assets, report data, and collaboration objects.

3. Catalog publishing has always been the need of the hour for Org.A (as discussed above) in terms of web publishing for its online store. This is the starting point for all its initiatives around SAP MDM.

4. Trading Partner Community Management through global data synchronization (GDS): as Org.A belongs to this vertical, it wants to leverage the CPG effort around the exchange of standardized electronic data, starting with item data. Product data is to be routed through data pools, which need to be connected to a global registry. Org.A is looking at Transora and UCCnet (1SYNC) as the data providers to help streamline all PIM projects internally.

5. Product Life-Cycle management or PLM as an iterative, technology-assisted process to continually improve the cost-effectiveness and profitability of a portfolio of products within Org.A leading to vertical specific business content usage in the future. Org. A is not looking at a PIM solution that would just offer a catalog management application that would be limited for the use of a few departments within the organization.

From MDM to GDS and RFID

Master Data Management, as the backbone application for Org.A, now with SP02, has to be viewed from the perspective of a vertical-industry-driven application, unlike the technological solution it may seem to some today. MDM as an initiative needs to be viewed from this perspective only; aligning with vertical-specific business content through EP, BI, and XI cannot be a big-bang project. The solution architecture involving the orchestration of the other applications is what is going to make all the difference.

The alignment with RFID (radio frequency identification), towards which there is a cautious approach in the industry today, warrants the need for global data synchronization (GDS). In the retail industry, the move does make sense. MDM is required as a stepping stone here: unless complex product management can be taken care of by a PIM application, there cannot be a move to GDS and RFID. As a result, the first step Org.A will take with MDM is to get it online as a web storefront, nothing different from the scene 7-8 years back, and wait for customers to come. But that is a starting point.

The next step is when Org.A will start extending these capabilities to RFID. This is the step at which Org.A expects to start seeing the results of its investment: the feedback mechanism, along with analytics for the storefront and POS, about which products sell, where they sell, and why they sell, based on the categorization of data and the taxonomy that needs to be built today to help Org.A reap these benefits. Again, the point to note is where the technological solution becomes a foolish investment and where the business need is justified. It is this need that justifies understanding the data models that will have to be built into the MDM solution to make it useful at the distribution level, and not just at a product level at the point of sale: many attributes, many linkages, cross-selling and up-selling needs, justifying the fact that MDM cannot be a BIG-BANG implementation.

The PIM Solution needed for Org. A

a. The brass tacks: The PIM process needs a data audit to diagnose the current state of product data and to cleanse, extract, and enrich existing data before putting a unified product information master repository into use. Org.A is not planning to retire the systems supplying data in the near future.

b. Prepare for GDS: Implementing data management services within Org.A is a way to prepare for global data synchronization, manage the challenges associated with conflicting product information, run business processes more efficiently, and create, cleanse, and publish product content. The GDS solution sitting on top of the Master Data Management platform, which is designed to support data initiatives for organizations that view data as a corporate asset and want to create a data-centric architecture, provides a good start.

c. Based on ESA: The PIM solution needs to be designed to support the data model and the associated process model for different types of data, such as metadata, reference, transactional, operational, and analytic data. It needs to address the uniqueness of Org.A. Industry-specific business content should help them focus on the vertical industry solutions for retail and CPG, so that it aligns with the ESA roadmap being designed for Org.A with SAP NetWeaver.

e. Product packaging hierarchy maintenance: Org.A is constantly packaging its products in different configurations, based on trading partners' demands or for seasonal variations, by building up the packaging hierarchy from the bottom up. The PIM solution needs to enable assortment maintenance as well as adding/deleting/correcting links among its products to create and maintain packaging hierarchies, such that an individual product can belong to one or more assortments at any point in time, either for seasonal-variant reasons or for normal packaging reasons. Relationship-dependent data maintenance: whether ranging a product nationally or only for specific locations, Org.A needs a solution designed to model and maintain information such as transportation parameters and/or pricing information at a group level or at an individual-location level.

d. No changes to underlying systems: Product life cycle maintenance needs to address the product information issue by creating an enterprise-wide repository of product content, incorporating GTINs into internal data without changing underlying systems. The application has to have the necessary workflows for managing the life cycle of the product such as introducing a new product, changing and/or correcting an existing product, as well as discontinuing a product temporarily or permanently from the supply chain. This would also involve re-categorizing a product to a different merchandising group, associating the same product to more than one product hierarchy and/or internal hierarchy, and cross-referencing these products to internal item numbers coming from one or more business systems.

f. Data synchronization and data pools: Furthermore, the solution needs to be designed to model and maintain business workflows around the complexities of trading-partner-specific price management. When it comes to internal synchronization of item, party, and price data inside the enterprise, or external synchronization with trading partners through GDS, the solution needs to work with the leading data pools in the respective geography (the US for Org.A) as well as country-specific data pools owned by EAN member organizations, by conforming to EAN.UCC-prescribed standards and protocols.

g. An integrated solution: The PIM solution needs to interact with both internal enterprise systems and external data pools. It needs to help Org.A orchestrate an integration strategy for internal bi-directional synchronization, assuming distributed master data management among different applications. This synchronization capability needs to work on top of an existing integration infrastructure and to add value in the business domain and process layer, to avoid having to rip and replace existing systems. This is the scenario in which the Solution Architect brings in XI.

i. Supply chain model maintenance: Viewing a product life cycle in the retail industry includes sourcing and business planning, such as merchandising and replenishment in the buying and marketing group, as well as order management, logistics, and distribution in the supply chain planning group. For Org.A, a product would first go through engineering, then through the planning group, and on to Operations, streamlining demand management, supply planning, supplier-network planning, inventory optimization, manufacturing, and fulfillment-optimization business processes. Org.A needs a solution that helps it manage and maintain master data for the products it carries, not only from a pricing and packaging standpoint but also from the point of view of other planning business processes. These needs will be addressed in Org.A's next phase, to ensure compliance by associating the same products with other key data elements of a supply chain model and supporting data-management workflows such as introducing planning items, introducing new stores, supply or vendor calendar maintenance, store assignment for an item, and so on.

h. The online storefront: Org.A's business partners need to be part of the global GDS network. Some will publish/subscribe only national brands through GDS and prefer to have private-label items sent directly (the creation of a small trading network, or c-chain); others want public information through the network while sharing the remaining information directly, owing to its sensitivity. The PIM solution needs to let Org.A manage and segment business partners based on what they are capable of and what they are comfortable with, resulting in the creation of small online hubs, or whatever is needed to enable master data collaboration across the board.


j. Master Data Organization: A key issue in Org.A today is how to organize the master data. The problem is not new and has been around for a number of years, but due to the increased complexity it has to be planned out better. Unlike MDM 5.5, which grew out of xCat, MDM 3.0 was originally developed to cover master data management within the boundaries of an organization, typically focusing only on SAP-related master data (A2A/enterprise master data management). With A2i, the outcome of this move can be seen in the release of SAP MDM, SAP MDM Extended, and SAP MDM Global Data Synchronization this year as three separate products. Hopefully, all these releases will be merged soon, and the result will be a tool that harmonizes the full scope of master data within and across organizations. With the resurrection of B2B interactions, technologies and concepts like ESA, RFID, and CAF, and industry initiatives like global data synchronization, will require not only open standards and integration technology but also harmonized master data across systems.

k. The importance of GDS: It is important to know that Global Data Synchronization (GDS) is the foundation upon which the full benefits of electronic collaboration can be achieved and scaled. GDS is also a prerequisite for the Electronic Product Code (EPC) based on radio frequency identification (RFID). By continuously synchronizing and harmonizing the master data between Org.A's systems and its trading partners' systems, Org.A can ensure that master data is the same in all systems. This can enable global trade, increase data accuracy among trading partners, and drive costs out of the supply chain. The solution needs to address both the product information management problems behind the firewall of the enterprise and the need to synchronize product information with trading partners through an established Global Data Synchronization Network (GDSN).

GDS is the starting point, not the end goal:

GDS is the existence of an infrastructure that provides and facilitates a seamless flow of product information throughout the supply chain and ensures that all business entities within the supply chain use a common product description and classification to establish a consistent process. It also allows for workflow to ensure systems remain synchronized as products continue to change.

Strategic and Tactical Goals for Org.A:

a. Establish common industry standards

b. Single item registry

c. Item synchronization for the industry

d. Collaborative transaction management with MDM

e. Collaborative supply chain management

f. Collaborative sales and promotion planning

g. Collaborative insight and product development


RFID is the next step with GDS and MDM for Org.A and its business partners (Org.B, Org.C, and Org.D) to improve velocity and accuracy in their supply chains, while also looking for value improvements through reduced out-of-stock conditions. High-value product manufacturers such as Org.A are looking for business solutions to reduce the impact of unwanted losses. The product paradigm is out. (Please refer to the storyboard for "Creating a Comprehensive Collaborative Platform with SAP NetWeaver".)


The road ahead with any solution is not simple and standalone. For laying down an ESA roadmap, it is essential not only that the orchestration between the various applications be understood well, but also that the overall landscape be put into perspective. To create a comprehensive collaborative platform, digital supply chain participants need a central data repository to obtain, exchange, and update product information: a roadmap with MDM, maybe. Maybe the product will not get you to the end goal today, but it is a start in the right direction. The new-age digital marketplaces and exchanges will rely on data, data that would range from RFID-transmitted product information at the dock doors of receiving to the alerts that notify of a pending raw-material shortage on the manufacturing floor, with a common registry that would overlay an industry vertical for industry-specific master data. The end-to-end infrastructure needs to be created on the SAP NetWeaver platform as the road ahead with ESA. There are many facets to the above; I have blogged only a part. It leads to the world of linked-up systems with XI, using BI, and finally leveraging analytics, and warrants the need for creating xApps to build small c-chains that increase value-chain efficiencies.

And MDM 5.5 is only a step in that direction. And Org.A is going ahead with it.



