Benefit Now From Support Package 5 for SAP NetWeaver MDM 7.1

Following the path that started with SAP NetWeaver MDM 7.1 SP04, the brand-new Support Package 5 extends the benefits of integrating SAP NetWeaver MDM with BPM for collaborative MDM scenarios in heterogeneous, multiple-system contexts. In particular, it offers greater flexibility in the master data management UIs, providing additional WebDynpro components for matching, comparing, and merging records.

From a process perspective in a customer data creation process, a typical flow in determining duplicate data can be as follows:


The three steps highlighted in blue are now backed by comprehensive WebDynpro UIs:

  • Match Records
  • Compare Records
  • Merge Records
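For illustration, the match-compare-merge sequence can be sketched in a few lines of Python (field names and the similarity threshold are hypothetical; in SAP NetWeaver MDM, matching is configured via matching strategies rather than hand-coded):

```python
from difflib import SequenceMatcher

def match(records, threshold=0.85):
    """Pair up records whose names are similar enough to be duplicate candidates."""
    pairs = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            score = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
            if score >= threshold:
                pairs.append((a, b, score))
    return pairs

def compare(a, b):
    """List field-level differences so a data steward can review them."""
    return {f: (a.get(f), b.get(f)) for f in set(a) | set(b) if a.get(f) != b.get(f)}

def merge(a, b):
    """Merge two records, preferring non-empty values from the surviving record."""
    return {f: a.get(f) or b.get(f) for f in set(a) | set(b)}

records = [
    {"id": 1, "name": "ACME Corp.", "phone": "555-0100", "city": ""},
    {"id": 2, "name": "Acme Corp", "phone": "", "city": "Boston"},
]
candidates = match(records)
a, b, score = candidates[0]
diffs = compare(a, b)       # shown to the steward in the Compare Records step
merged = merge(a, b)        # result of the Merge Records step
```

The sketch mirrors the UI flow: matching proposes candidate pairs, comparing surfaces the differing fields, and merging produces one surviving record.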


In addition, there are enhancements in the MDM workflow (additional step), and enhanced SAP BusinessObjects Data Services blueprints for SAP NetWeaver MDM use cases (e.g. address cleansing) using the MDM Enrichment Architecture. For example, there is specific business content for vendor and customer data to directly interact with SAP BusinessObjects Data Services dataflows, which reduces the implementation time significantly. 

For a complete picture, see the SP05 Overview Presentation and watch the session recording.

MDM Quick Starter: Aggregate Physician Spend

What the Aggregate Physician Spend Quick Starter is all About

The U.S. government introduced the Physician Payment Sunshine Act, which requires manufacturers and group purchasing organizations to report a wide range of payments to physicians and physician-owned entities. Under this act, all physician payments over a cumulative value of $100 per year must be reported, with the first report due by March 31, 2011.
This legal obligation is not the only reason why health care providers and organizations struggle with the lack of a single view of their physician data. The data mostly resides in multiple business application systems and is referenced by a number of transactions, such as payment or logistical processes.

Situation without aggregated spend data

 

The sample graphic shows a typical scenario in which physician spend data is dispersed inconsistently across several source systems. Aggregating the data reveals that the total expenses exceed the allowed range (based on fictitious data).
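The aggregation idea behind the graphic can be sketched as follows (fictitious records and identifiers, assuming the physician keys have already been consolidated to a single key per physician):

```python
from collections import defaultdict

# Hypothetical spend records as they might arrive from different source systems;
# the same physician appears in several systems under one consolidated key.
spend_records = [
    {"system": "ERP", "physician_key": "DR-100", "amount": 60.00},
    {"system": "CRM", "physician_key": "DR-100", "amount": 25.00},
    {"system": "SRM", "physician_key": "DR-100", "amount": 40.00},
    {"system": "ERP", "physician_key": "DR-200", "amount": 30.00},
]

THRESHOLD = 100.00  # cumulative yearly reporting threshold from the Sunshine Act

# Aggregate the spend per physician across all source systems
totals = defaultdict(float)
for rec in spend_records:
    totals[rec["physician_key"]] += rec["amount"]

# Physicians whose aggregated spend crosses the reporting threshold
reportable = {k: v for k, v in totals.items() if v > THRESHOLD}
```

No single system sees DR-100 cross the threshold; only the aggregated view reveals the reporting obligation.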


This quick starter package facilitates a fast implementation of a Physician Spend Aggregation process based on SAP NetWeaver MDM. It covers the data extraction from SAP Business Suite systems as well as the actual import into SAP NetWeaver MDM for consolidation. Once the data has been consolidated, it is ready for use by business intelligence tools such as SAP NetWeaver BW. Life sciences companies can directly leverage the out-of-the-box integration content to reduce the time to value to a minimum.

What's in It for You

The Aggregate Physician Spend Quick Starter provides pre-defined integration content including:

  • Out-of-the-box data model covering both health care providers and organizations
  • Extractors for SAP Business Suite systems (incl. SAP ERP, CRM, and SRM) for both reference and master data
  • Pre-defined MDM import mappings for SAP Business Suite systems
  • Matching strategies for physician data
  • Pre-defined MDM export mappings fueling SAP Business Suite systems
  • Configurable export mappings fitting your reporting system

 
 

For further details including download information, see the SDN Download Catalog site dedicated to this quick starter.

MDM Quick Starter: Collaborative Master Data Creation Processes

What are MDM Quick Starters? 

SAP NetWeaver MDM provides a generic infrastructure to address manifold business pain points related to misaligned master data. To accelerate the implementation process for specific MDM use cases, SAP NetWeaver MDM incorporates preconfigured Business Content. This concept has proven successful, which is why we have decided to extend the approach further by introducing MDM Quick Starters.

MDM Quick Starters (QS) are preconfigured business packages that are intended to enable SAP NetWeaver Master Data Management (MDM) customers to rapidly deploy and implement a working scenario for master data management processes on top of MDM and in some cases, in conjunction with SAP NetWeaver Business Process Management (BPM). 

What is the Quick Starter for Collaborative Material Master Data Creation all About?

Basic information about collaborative master data creation processes in heterogeneous landscapes based on SAP NetWeaver MDM and SAP NetWeaver BPM, including the associated benefits, has already been highlighted in my recent blog about SAP NetWeaver MDM 7.1 SP4 and in the Error-Free, Consistent Master Data Starts at the Source article.

SDN now features an MDM quick starter package for collaborative material master data creation. It provides customers with additional value through an accelerated implementation of a scenario streamlined towards the central creation of globally relevant, identifying master data attributes and their propagation to the relevant application systems in a heterogeneous landscape. The QS package can be used as a template providing relevant sample content for a collaborative creation process for material master data based on SAP NetWeaver MDM and SAP NetWeaver BPM. It serves as a starting point and can easily be tailored to suit specific business needs.

Sample Process Flow


Think of the following typical business scenario in which the quick starter package can be leveraged:

An LOB owner requires a new material in order to process it further in the supply chain:

  1. The LOB owner checks whether the required material record already exists in the ERP system. If not, the LOB owner requests its creation via a specific request form. The completed request form can be checked against SAP NetWeaver MDM to find out whether the material record already exists in the central registry. If not, the LOB owner dispatches the request.
  2. The request appears in the inbox of a designated data steward, who enriches the material record, runs specific validations, and finally approves the record.
  3. The approved record then appears in the inbox of the requestor for review and confirmation.
  4. Once confirmed, the record is automatically created in SAP NetWeaver MDM and propagated to the requestor's system. The LOB owner who requested the record is notified that the data is available.
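The four steps above can be sketched in code (a minimal sketch with hypothetical field names; in the quick starter these steps are BPM tasks backed by MDM web services and data governance UIs):

```python
def check_exists(registry, material_name):
    # Step 1: look up the material in the central registry.
    return any(r["name"] == material_name for r in registry)

def enrich_and_approve(request):
    # Step 2: the data steward enriches the record, runs validations, approves.
    assert request["name"], "validation: material name must not be empty"
    request["plant_data"] = "0001"  # hypothetical enrichment
    request["status"] = "approved"
    return request

def confirm_and_create(registry, request):
    # Steps 3-4: the requestor confirms; the record is created centrally and
    # would then be propagated to the requesting system.
    request["status"] = "created"
    registry.append(request)
    return request

registry = [{"name": "BOLT-M8"}]
if not check_exists(registry, "GASKET-42"):
    request = {"name": "GASKET-42", "requested_by": "lob_owner"}
    request = enrich_and_approve(request)
    record = confirm_and_create(registry, request)
```

In the real scenario each function corresponds to a human or automated task in the BPM process flow, not to in-process Python calls.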

Process Flow in the Process Composer

Technically, MDM's functionality is exposed as Web services and web-based data governance UIs, which are wrapped as workflow steps and combined into a logical process flow using SAP NetWeaver Business Process Management. In the Process Composer, the flow of this master data creation process looks as follows:


What's in the Quick Starter Package for You?

The new quick starter package for governed material creation provides the following capabilities:

  • Preconfigured BPMN-based process flow
  • Preconfigured Web-based data governance UI components (e.g. Enrich & Approve)
  • Request form
  • Portal roles (e.g. Data Steward)
  • Preconfigured MDM Web services (e.g. Update Record)
  • Use of the standard MDM material data model
  • Enterprise service consumption (Create Material Basic Data)

For further details on the MDM Quick Starter including download information, see the SDN Download Catalog and the associated demo showing the end-to-end process.

Continuously Assuring and Controlling Master Data Quality (Part One)

The Business Case for Continuous Data Quality Assurance

Continuous data quality assurance for enterprise-critical data is indispensable for companies striving for sustained enterprise performance: Flawless business execution and trusted cross-company analyses depend on the quality of the underlying master data. The challenge of continuously assuring premium master data quality as an ongoing activity is even greater when enterprises operate on the basis of diversified system landscapes. Tackling this challenge is key to gaining a competitive edge.


The Situation Today


Today, however, it is still common that organizations have little or no transparency into which data entities are relevant for a given business process, nor do they have comprehensive capabilities to define and control the required quality KPIs for these data entities. In such a setting, the negative impact of poor master data quality is only revealed when the damage is already done, i.e., through broken business transactions and shaky company analyses that produce wrong decisions. As long as organizations cannot statistically measure the quality of enterprise-critical data and its compliance with company terms, they are in a poor position when it comes to safeguarding and improving overall performance.

The Way to Continuous Data Quality Assurance


To tackle the situation and accommodate this high-priority need, SAP features a comprehensive and sustained way to manage data quality. Using this approach, companies can:

  • Define the relevant metrics and set up the quality rules with which their critical enterprise master data must comply,
  • Subsequently monitor compliance statistically and clearly visualize the prevailing data quality, and
  • Trigger follow-up actions if master data reveals quality issues.

These key activities are integrated into a collaborative end-to-end process that empowers data stewards and data administrators to exercise overall data control.
Using such an approach, companies can establish a closed quality loop around their enterprise master data management strategy.


Fig: SAP BusinessObjects Data Services and Xcelsius dashboards clearly visualize the prevailing data quality and trends.

Defined quality dimensions can comprise (to name just a few):

  • Completeness (e.g., all mandatory fields contain data)
  • Conformity (e.g., all formats must match given patterns)
  • Validity (e.g., data must be in a valid range)
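As an illustration, such dimension checks can be scored as the share of records passing each rule (a minimal sketch with hypothetical records and rules; in the actual solution the rules are defined in SAP BusinessObjects Data Services, not hand-coded):

```python
import re

# Hypothetical supplier records; real rules would be defined in Data Services.
records = [
    {"name": "ACME Corp", "postal_code": "69190", "discount": 0.05},
    {"name": "",          "postal_code": "6919A", "discount": 1.50},
]

def completeness(rec):
    # all mandatory fields contain data
    return all(rec.get(f) for f in ("name", "postal_code"))

def conformity(rec):
    # formats must match given patterns (here: a 5-digit postal code)
    return bool(re.fullmatch(r"\d{5}", rec["postal_code"]))

def validity(rec):
    # data must be in a valid range (here: discount between 0 and 1)
    return 0.0 <= rec["discount"] <= 1.0

# Score each dimension as the share of records that pass its rule
scores = {
    dim.__name__: sum(dim(r) for r in records) / len(records)
    for dim in (completeness, conformity, validity)
}
```

Scores of this kind are what the dashboard visualizes per dimension and over time.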


This data quality scorecard and remediation solution brings data, systems, and people together into one collaborative and coherent process flow. It combines SAP NetWeaver Master Data Management, SAP BusinessObjects Data Services, and Xcelsius dashboards into a cohesive monitoring environment, and flexibly integrates with SAP NetWeaver Business Process Management to seamlessly trigger follow-up actions if the revealed data quality requires it. It is a perfect means to get companies into good shape and keep them there on an ongoing basis.

Sounds interesting? Then stay tuned for part two of this blog series, which will focus on architectural and implementation considerations for this scenario.


 

Optimized Supplier Spend Reporting

What is Optimized Supplier Spend Reporting (OSS)? 

With each LOB in a company operating with different information on a single supplier entity, it is difficult to get a single view comprising the best information about that supplier. This holds true for both operational and analytical contexts. Is this supplier related to another, similarly named supplier? Which address is the best? Which name is correct? What is the most recent phone number? Is my supplier a subsidiary of another company with which my company does business? How much money do I spend on my individual suppliers?

When this information is not aligned, business gets cumbersome. To ensure that you can effectively analyze your supplier interactions, such as global spend reporting, you need complete, accurate, and synchronized supplier and material data. SAP helps you tackle this challenge with an easy-to-deploy solution for optimized supplier spend (OSS) reporting from which organizations can gain immediate benefit.

Main process steps and software involved
A software package consisting of SAP BusinessObjects Data Services and SAP NetWeaver Master Data Management (MDM) lets you easily extract, cleanse and consolidate supplier and material master data from dispersed IT systems to enable trusted cross-system reporting, such as global supplier spend analysis.

After optimizing data for global spend reporting, using SAP BusinessObjects Data Services as the upfront data quality layer and SAP NetWeaver Master Data Management as the overall master data persistency and processing layer, a trusted global supplier spend dashboard can look like this:

 
 


The information displayed in the supplier spend dashboard now reveals the truth and can be used as a credible and trustworthy foundation for strategic sourcing: The output is based on consolidated supplier and material data where redundancies have been detected and eliminated. The screenshot shows an Xcelsius dashboard which runs against SAP NetWeaver BW, where the key mapping information provided by SAP NetWeaver MDM has been processed accordingly.

For more details, 

Optimizing data for global spend reporting is a smart way to start a company-wide MDM strategy. It is easy to deploy and delivers immediate ROI.

Markus Ganser is a product manager for SAP NetWeaver Master Data Management (MDM).

SAP Business Objects and SAP NetWeaver MDM - Bringing Together The Best of Two Worlds

This is really good news for data stewards and other IT professionals dedicated to information management and data quality management: SAP Business Objects and SAP NetWeaver continue to team up in a common approach to establishing sustained enterprise strategies.

Attentive community readers may already have noticed that the SAP NetWeaver Information Management space and the SAP Business Objects Information Management space are semantically closely related, both covering one and the same hot spot in current enterprise strategies applied to gain a competitive edge. This has also been addressed in the Extending the Value of SAP NetWeaver Business Intelligence with the Business Objects Business Intelligence Platform presentation. And the common ground can also be observed in the master data management space.

The SAP NetWeaver Master Data Management roadmap graphic below shows one example of how these areas have started to interact, combining the best of two worlds to feature end-to-end, data-quality-driven master data management.

 
 


[*) Please note that possible future developments are subject to change and may be changed by SAP at any time for any reason without notice]

With the Data Quality Management (DQM) for SAP NetWeaver MDM package at hand, you can leverage the powerful address cleansing capabilities of SAP Business Objects Data Services in SAP NetWeaver MDM*.

For an architectural overview of SAP NetWeaver MDM and SAP Business Objects interacting through the DQM-MDM Enrichment Adapter see the following graphic.

  "

The package provides the following benefits:

  1. Ready-to-run, sophisticated address cleansing capabilities on an international scale
  2. Master data excellence by combining master data quality and master data persistency
  3. Streamlined service-based integration via the MDM Enrichment Architecture
  4. A workflow-embedded data enrichment process

The related user guide provides a general overview of BusinessObjects™ Data Quality Management software, version for SAP NetWeaver MDM, as well as specific information for installing and integrating this product with SAP NetWeaver MDM.

SAP Business Objects Data Services customers can download the Data Quality Management (DQM) for SAP NetWeaver MDM package from the SAP Download Center (Path: Download > Installations and Upgrades > Entry by Application Group > SAP Business Objects packages and products > BOBJ DQM FOR SAP NW MDM).

In addition, please check the following information material about SAP NetWeaver MDM integrating with SAP Business Objects:

SAP NetWeaver MDM and SAP Business Objects Data Services - a winning combination.

The SAP NetWeaver MDM and SAP BusinessObjects Data Services Integration Path Continues

The initial data quality integration between SAP NetWeaver MDM and BusinessObjects Data Services is based on the MDM Enrichment Architecture, where the Business Objects Enrichment Adapter allows cleansing and enrichment of business partner master data using Business Objects' inherent postal address directories. The cleansed data is transferred to the MDM Enrichment Controller and made available in SAP NetWeaver MDM (based on MDM workflow). In addition, ETL connectivity between SAP BusinessObjects Data Services and SAP NetWeaver MDM, with data pre-cleansing done by SAP BusinessObjects Data Services, has also been explained in an SDN How-to Guide.

Both integration scenarios are explained in my blog on the SAP NetWeaver MDM and SAP BusinessObjects Data Services connectivity.

What has been missing so far, however, is a cohesive and comprehensive in-depth picture of the current integration scenarios, providing best practices for implementation and covering additional integration mechanisms (e.g. DB views). Therefore, I'd like to draw your attention to a new series of MDM How-to Guides dealing in depth with SAP NetWeaver MDM and BusinessObjects Data Services integration topics. The first publications in this series are:

Part III of this series is planned to cover the SAP NetWeaver MDM and SAP BusinessObjects Data Services interaction when it comes to central master data creation.

For additional information about integrating SAP NetWeaver MDM with SAP BusinessObjects Data Services, see the Integrating SAP NetWeaver MDM with SAP BusinessObjects Data Services Wiki site on SDN.

I hope that this information is beneficial for you!

Integrating SAP NetWeaver MDM with SAP BusinessObjects Data Services


* The SAP NetWeaver MDM and SAP BusinessObjects Data Services Integration Path Continues

* SAP Business Objects and SAP NetWeaver MDM - Bringing Together The Best of Two Worlds

* Optimized Supplier Spend Reporting

* Continuously Assuring and Controlling Master Data Quality (Part One)

* Collaborative Material Master Data Creation Quick Starter (featuring SAP NetWeaver BPM and SAP BusinessObjects Data Services integration)

* Aggregate Physician Spend Quick Starter

* Benefit Now From Support Package 5 for SAP NetWeaver MDM 7.1 NEW!

Access Multiple Links in Web Tab

SAP Master Data Management (MDM) provides a completely flexible mechanism for linking MDM to the outside world using the Links table. However, only one link at a time can be set as Active in the Web tab of the MDM Data Manager. By using MDM together with HTML, you can simultaneously access any number of links in the Web tab without changing the configuration.
 
Steps to be followed:
 
1. Create an HTML file with the code below and name it Links.html. The code shown contains four links; you can add any number of links by adding further lines starting with the <A> tag, using the same syntax shown.

 <HTML>
<BODY bgcolor="cyan">
<FONT face="Bookman Old Style" size=2> 
<A href="http://www.google.com" target="Output">Google Search</A>
<A href="http://www.yahoo.com" target="Output">Yahoo</A>
<A href="http://www.capgemini.com" target="Output">Company URL</A>
<A href="http://Tewall22:50000/irj/webdynpro/dispatcher/MyBookings" target="Output">My Bookings</A> 
</FONT>
</BODY>
</HTML>
 
2. Create another HTML file as shown below:

<HTML>
<FRAMESET name="frameset1" rows="10%,*">
<FRAME name="Links" src="Links.html" noresize scrolling="no">
<FRAME name="Output">
</FRAMESET>
</HTML> 
 
The purpose of this file is to divide the page into two parts (known as frames). The first frame contains the links to be accessed, and the second frame opens the clicked link.
 

3. Place both HTML files in the same directory and add an entry in the Links table of the repository referencing the second HTML file.


4. Open the MDM Data Manager, go to Configuration -> Options, and select the link added in the above step for the entry Web pane URL for selected records.

5. Check the output in Web tab.


Note: This Wiki focuses only on the desired functionality rather than the look and feel of the HTML pages. You can use the power of HTML to format the pages to your requirements or liking.

Getting Started with MDM CLIX

CLIX is used to automate and secure various console-based operations. It provides added security by eliminating user intervention. So far, SAP has introduced CLIX commands only for console-based activities.

Steps necessary before starting with CLIX:

  • Before using CLIX, check the version of the MDM server, find the compatible version of MDM CLIX, and install it.
  • To execute CLIX commands, open a command prompt and go to the folder where CLIX is installed.
  • Now type CLIX to validate that the correct patch is installed. If the CLIX patch is compatible, a screen like the one below should be displayed.
                          

  • To find out all the operations that can be performed on the MDM server, type CLIX m.
     
     


  • Similarly, type CLIX r for repository commands, CLIX c for repository copying commands, and CLIX d for DBMS commands.
  • In all CLIX commands, the first argument is always MDMHostSpec, comprising the MDM server host name or IP address, followed by a colon and the password (the password is optional). E.g.: 10.48.64.52:admin
  • A few CLIX commands require one more parameter, DBMSHostSpec, comprising the DBMS instance name, type, user name, and password, separated by colons. E.g.: dtpxp-vagarwal:s:sa:admin
  • Several commands also require RepositorySpec, comprising the repository name followed by a colon and the DBMSHostSpec. E.g.: Test_Vendor:dtpxp-vagarwal:s:sa:admin
  • To create a batch file, type edit CLIX.bat at the command prompt, write the desired commands, and save the file.
  • Below is a sample CLIX batch file for loading and archiving a repository in a single run.

  • To schedule this batch file, go to Scheduled Tasks under the Control Panel and schedule it as per the business requirement.
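The sample batch file itself appeared as a screenshot in the original post. A hedged sketch of what such a file could contain is shown below, reusing the example host, repository, and DBMS specs from the bullets above. The load command name repLoadImmediate is an assumption; verify the exact command names with CLIX r and CLIX c on your installation (the cpyArchive syntax is documented later on this page).

```
rem CLIX.bat - load a repository and archive it in a single run
rem (hypothetical sketch; verify command names with "CLIX r" / "CLIX c")
set path=C:\Program Files\SAP MDM 5.5\CLIX

rem Load the repository (repLoadImmediate is an assumed command name)
CLIX repLoadImmediate 10.48.64.52:admin Test_Vendor:dtpxp-vagarwal:s:sa:admin

rem Archive the repository, overwriting any existing archive file (-F)
CLIX cpyArchive 10.48.64.52:admin Test_Vendor:dtpxp-vagarwal:s:sa:admin -F -A Test_Vendor.a2a
```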

To view the abbreviations for all MDM CLIX commands, just type CLIX a at the command prompt.

Details of all the flag options available in MDM CLIX are:


So, CLIX is definitely the best solution for automating and scheduling most console-based activities without any human intervention.

Using SAP MDM Import Manager Batch

About Import Manager Batch

Import Manager Batch is a tool for uploading data into MDM in batch or background mode. It has to be used hand in hand with the online Import Manager: an important component of the batch import process, the import map, needs to be created online.

On the Windows platform, this utility is delivered as an '.exe' file (ImportManagerBatch.exe). The tool can be run as a scheduled background job, which makes it handy for uploading large amounts of data.

The tool also provides a log of import activities.


 

Setting up the Import Manager Batch

Once installation has been completed, a few pieces of information are needed to effectively import data using Import Manager Batch. An MS-DOS batch program can then be created to execute the utility interactively or as a scheduled batch job.

Information required

Server and Login

The server information is meant to locate the particular repository into which the data needs to be uploaded. The login consists of information such as user ID, password, and language preferences.

Import File

This is the file where the source data is stored. The type of file (Excel, Access, etc.) and its full location must be known.

Import Map

This is created as a part of the online Import Manager activities.

We will see below how to collect some of the required information from the Data Manager and the Import Manager utility.

The Import Manager Batch settings ('.ini') file

Once the required information has been collected, it goes into the 'ImportBatch.ini' file as configuration data. This file can be found in the same directory as the 'ImportManagerBatch.exe' file. Create a copy of this file before editing it.


 

Time to '.ini'

Server and Login

Execute the MDM Data Manager and log in to the target repository. Click Help -> Repository Info. You should see a popup as displayed.



Fig 1.1

Use the 'MDM Server', 'Repository', and 'Port' information to identify the target repository. The 'User' and password are to be supplied for login via batch mode. The information goes in as displayed in Fig 1.2.



Fig 1.2

Import File

As mentioned earlier, the type and the full path of the file to be uploaded are required. This information is passed into the 'ini' file as displayed in Fig 1.3.



Fig 1.3
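Taken together, the settings file is a plain text file in INI style. The sketch below is purely illustrative; the section and key names are hypothetical, so always start from the delivered ImportBatch.ini and keep its real entries (Figs 1.2 and 1.3 show the actual format). The host name reuses the example from the Links.html sample earlier in this document.

```
; Illustrative sketch only - section and key names are hypothetical.
; Copy the delivered ImportBatch.ini and keep its real key names.
[Server]
MDM Server = Tewall22
Repository = Equipment
Port       = 2345
User       = Admin
Password   = admin

[Import File]
File Type   = Excel
Source File = D:\MDM\IMB\EquipmentData.xls
```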

Import Map

This is the most important step with respect to the quality of the import process. As part of the online Import Manager activities, partitioning, value mapping, record matching, etc. are carried out. This information is critical to the background Import Manager Batch process, too.

Once all the online configuration has been completed in Import Manager, this information needs to be stored in an import map. The 'Import Status' tab should show 'Ready to import' before the final version of the map is saved. This ensures consistent import execution.



Fig 1.4


 

Usage of ImportManagerBatch.exe

Once the previously mentioned setup activities have been completed, the batch process is ready to be implemented. Let us now have a look at the utility itself.

Execute the 'ImportManagerBatch.exe' file or double-click it in Explorer, and you will see this dialog.



Fig 1.5

Command Line Options

The following 'arguments' or command line options need to be specified.

/INI: the complete file path storing server login and import file information.

/CLIENTSYSTEM: MDM system name.

/MAP: Map name as stored previously.

/LANGUAGE: Language applicable, for example: English [US].



Fig 1.6

The /CLIENTSYSTEM and /LANGUAGE information can be looked up during the online Import Manager activities.



Fig 1.7

Creating an MS-DOS Batch File

'ImportManagerBatch.exe' must be executed from the path where it is stored. This can be accomplished with an MS-DOS batch file (extension '.bat'), which can then be executed from any directory.

Here's a sample.

cd C:\Program Files\SAP MDM 5.5\Import Manager

c:

importmanagerbatch /INI "D:\MDM\IMB\ImportBatch.ini" /CLIENTSYSTEM "MDM" /MAP "Equipmentmap" /LANGUAGE "English [US]"

Execution

The execution of the batch file results in the following activities.



Fig 1.8


 

Monitoring

A log of ImportManagerBatch activities can be found in an XML file, stored in the same directory as ImportManagerBatch.exe itself. The file name starts with 'ImportManagerBat' and ends with the '.xml' extension. Error and success notifications are stored in this file.


 

An example of a failed process is illustrated below.



Fig 1.9

The above log states that the import process failed because the map could not be found.

Successful processes result in data being loaded, which can be verified with the SAP MDM Data Manager.

Can we use SAP MDM for Central master data management?

SAP MDM for CMDM

The prerequisite for CMDM is that the other two scenarios, consolidation of master data and harmonization to target systems, have already been implemented. In a central-hub MDM implementation, any creation, update, or deletion of a master record has to be performed in SAP MDM first and then syndicated to the remote systems (e.g. ECC). While implementing CMDM, we come across the following limitations in SAP MDM:

  1. Global vs. contextual data: One of the key challenges in an MDM project is deciding which attributes will be part of the MDM data model. As suggested by SAP, only global attributes should be in MDM. The limitations can be further categorized as follows:
  • Nested qualified tables/tuples are not supported.
  • Validations/assignments/matching strategies are not supported for qualifiers.
  • Change tracking is not supported for qualified fields.

  2. User interface for CRUD operations: The MDM GUIs are not very user friendly and have the following limitations:

  • Use of constraints/masks, required for controlling access, is very limited.
  • Controlling the display of sensitive fields to users.
  • Controlling access to modifications of the matching strategy in the Data Manager.
  • ...

The above limitations require the development of a Portal UI, which in turn requires greater time and effort. The Portal acts as a thin client for managing CRUD operations on the MDM database for specific repositories.

3. Processes around master data: Implementing CMDM also implies mapping the processes that exist around master data, such as governance rules and approval workflows. This can be achieved via the native MDM workflow, which has the following limitations:

  • Fixed roles and users for tasks
  • Parallel steps not supported
  • Limited workflow history and performance
  • Limited features in notification steps; sending customized information is not possible

4. Deriving number ranges

MDM supports numbering of master records either based on Auto ID (calculated fields) or by defining remote systems with key generation. Neither method supports external number generation, which is widely used for some customer and vendor account groups.

Suggestion: Given the limitations above, MDM should be used for CMDM only when the data model, processes, governance, etc. are kept simple and restricted mostly to global attributes. A very feasible design is to use CMDM for the global attributes only and, for the local attributes (sales data, purchasing org data, etc.), to develop governance processes in the local systems such as ECC.

"socket error" bug fixed in MDM 7.1 SP03 p19

I was working on the MDM business content. When I imported the reference data regions.xml, I got a "Socket error". I had checked that all components were the same version (Note 1345858).

The MDM server was running on Windows 2003 x64. Later, I installed an MDM server on a Linux x86_64 system (Build a SAP MDM system on Linux platform) and got the same error, so I had to send a message to SAP.


They said version 7.1.03.69 would solve this problem. It became available a few days later.

In the Note (Note 1425021 - MDM 7.1 SP03 Patch19 Release Note) you can find the following entry (fixed in build 7.1.03.69): when importing into a hierarchy table using Import Manager, Import Manager crashed and displayed an error saying: "The operation completed successfullyWindows sockets error code: 0 This application will now exit".

Automatic Backup of MDM Repository (Using MDM CLIX in Windows OS)


 MDM CLIX makes most operational MDM Console commands accessible via a command line interface and from within batch files, making it useful for scheduling and automating routine maintenance. In addition, CLIX provides an added measure of security and consistency, since automated batch operation eliminates the potentially unrestricted access and the types of errors that may be introduced through a manual GUI interface. CLIX runs on all platforms for which MDM has been released, including Windows, Linux and Solaris (contrasting with the MDM Console, which is exclusively a Windows-based application). 

CLIX Command Syntax

CLIX command_name [arguments] [option flags]

Advantages of Automatic Backup

No user intervention required: Since the backup command executes automatically, no user action is involved.

Provide Security & Consistency: MDM CLIX provides an added measure of security and consistency, since automated batch operation eliminates the potentially unrestricted access and the types of errors that may be introduced through a manual GUI interface.

Avoid Manual Backup overhead: Using MDM CLIX commands in a batch file avoids the overhead of taking regular backup of master data manually. 

How to Perform an Automatic Backup (Archive of an MDM Repository) Using a CLIX Command

 The CLIX cpyArchive command creates an archive of the repository specified as a parameter in the command. 

Syntax of cpyArchive command 

CLIX cpyArchive MDMHostSpec RepositorySpec [-A filename] [-S segSize] [-F] [-N] [-O] [-D] 

Where:

MDMHostSpec = MDMHostName : MDMPassword

RepositorySpec = RepositoryName : DBMSHostName : DBMSType : DBMSUser : DBMSPassword     

Steps

1. Create a new file and add the following lines to it: 

     set path=c:\Program Files\SAP MDM 5.5\CLIX

     CLIX cpyArchive Test_MDMServer Test_Repository:Test_DBMSServer:s:sa:pass1234 -F -A TestClixBackup.a2a 

2. Save the file with .bat extension

3. Navigate to Start -> Programs -> Accessories -> System Tools -> Scheduled Tasks.



4. Double click on Add Scheduled Task and click Next to continue.

5. Click Browse, navigate to the path where the batch file was saved, and select it.

6. After selecting the batch file, choose when the backup should be taken and complete the remaining steps.



Whenever the task is due, the commands in the batch file are executed automatically, overwriting the previously saved archive with the new, updated one (since -F is used in the command). 







NOTE

Any backup-related command (Archive, Duplicate, Synchronize Slave, etc.) can be executed automatically by entering it in the batch file and scheduling the task. 
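For illustration, the archive command from the steps above can be assembled in a small script. This is a minimal sketch using the same placeholder host, repository, and credentials as the batch file above, written in POSIX shell rather than a Windows batch file:

```shell
# Sketch only: build the CLIX archive command from its parts.
# Host spec, repository spec, and credentials are placeholders from the example above.
MDM_HOST_SPEC="Test_MDMServer"
REPOSITORY_SPEC="Test_Repository:Test_DBMSServer:s:sa:pass1234"
ARCHIVE_FILE="TestClixBackup.a2a"

# -F overwrites the previous archive; -A names the archive file.
CMD="CLIX cpyArchive ${MDM_HOST_SPEC} ${REPOSITORY_SPEC} -F -A ${ARCHIVE_FILE}"
echo "$CMD"
```

On Windows the equivalent batch file is scheduled with Scheduled Tasks as described above; on Linux or Solaris the same command line can be placed in a cron job.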


 

SAP MDM FAQs

    


 

Q) What platforms are supported by SAP Master Data Management?

A) Availability and supported-platform information is available on SAP Service Marketplace, alias PAM (http://service.sap.com/pam); drill down into NetWeaver -> SAP MDM -> SAP MDM 5.5. Note that appropriate Service Marketplace authorization is required.

Q) How integrated is SAP NetWeaver MDM 5.5 with SAP NetWeaver and applications?

A) SAP NetWeaver MDM 5.5 is an integral part of the NetWeaver stack. In the current feature release, enterprise application integration, both SAP and non-SAP, is accomplished through SAP XI. Interoperability with other systems is possible via SAP NetWeaver MDM 5.5's APIs (including an ABAP API currently in development). Tight, native integration is part of the SAP NetWeaver MDM 5.5 roadmap, and further pre-built integration points will be rolled out as development progresses. SAP MDM 5.5 SP2 will provide view-only iViews for SAP Enterprise Portal.

Q) Is the Product Catalog Management application part of the SAP NetWeaver Integration and Application Platform? Does print publishing belong to this platform as well?

A) Yes, these are all part of the SAP NetWeaver platform, and print publishing is an extension of the product content management capability. By definition this is the case, since the former A2i xCat application, now further augmented and known as SAP NetWeaver MDM 5.5, is part of the SAP NetWeaver MDM family of products.

Q) How will MDM fit into Enterprise Services Architecture? Which Web services will be provided, and when?

A) MDM is integral to SAP's ESA strategy. The initial list of documented Web services was provided with the MDM 3.0 information release. These refer to the ability to access master data in MDM as a service, for example to create records. New web services will be made available as per the roadmap. With SAP MDM 5.5 in conjunction with SAP Exchange Infrastructure, one can create web services by exposing MDM functions through the MDM Java or .NET APIs.

Q) What tools are available to integrate SAP MDM and other non-SAP applications and platforms?

A) SAP MDM 5.5 exposes its core functions through published Java and .NET APIs. Any integration between MDM and other non-SAP software can be handled using these APIs. MDM functions can also be exposed as web services using the APIs in conjunction with SAP Exchange Infrastructure. Broader integration between SAP MDM 5.5 and other SAP NetWeaver components will be delivered through the product roadmap.

Q) Can Mask functionality be used for determining which BP records exist in R/3?

A) There is no need for a mask to be generated, as the Syndicator can filter the records to be sent according to the agency and remote key stored within MDM. The "suppress records without key" option needs to be set to "Yes".

Q) Can a mask be recreated automatically from a saved search selection criteria?

A) This is not currently supported. Records can be hidden per role using the "constraints" functionality in the Console.

Q) Can MDM send only changed fields and data and not the whole record?

A) There are two possible answers to this.

1.
If you are extracting changed data through the API, you can set the timestamp field to change only when your key fields change. This will allow you to select only those records whose changes need to be sent to R/3.

2.
Using the Syndicator you can use the timestamp technology in calculated fields or set up the relevant search criteria in the Syndicator to select only those records that have relevant changes.

Q) What options are available for resending from MDM within XI or R/3 in case an update fails?

A) If the failure lies with XI or R/3, the same XML can be reprocessed (no resending is required). If there is a validation or data problem, the records need to be identified and modified in the MDM Data Manager client, and the Syndicator batch will resend them, since they were updated after the last syndication.

Q) How easy is it to maintain the front-end when the data model changes?

A) The effort depends on the number of fields required for the front-end. Fields that are added have no impact. Fields that are deleted (and maintained in the front-end), need to be removed. Fields that are renamed need to be updated.

Q) Is it possible to develop web forms (outside of EP6) that link to standard Java MDM APIs and communicate with the MDM repository?

A) Yes, it is possible; you are not limited to the existing iViews. You can create your own application-specific iViews, and you can also access the server with direct API calls from the Java environment.

Q) Is it possible to assign the saved search criteria to a role or person to restrict what he or she can view in the search?

A) The saved search option is client computer specific. That means that a user's search criteria are available only to the user and not to other users. Therefore the saved search is not an option in this case. Using role constraints you may achieve the required results.

Q) Are adapters/extensions available in MDM for integrating monitoring tools? (i.e., does Tivoli register if an exception occurs in MDM?)

A) MDM currently does not trigger external processes on errors. The system uses its logging capabilities to register errors, and there are specific log files for the various components of the system. If the monitoring system(s) can be triggered on changes to the log files, then the system can be monitored.
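As a minimal sketch of the log-watching approach this answer describes, an external monitor could periodically count error lines in an MDM log file. The log path and message format here are assumptions for the demo, not the actual MDM log layout:

```shell
# Sketch only: scan a log file for error lines so an external monitor
# (e.g. a cron job feeding Tivoli) can react when the count grows.
LOG_FILE="/tmp/mds_demo.log"

# Synthetic log content for the demo; a real monitor would read the MDM log directory.
printf 'INFO repository loaded\nERROR socket timeout\n' > "$LOG_FILE"

ERROR_COUNT=$(grep -c '^ERROR' "$LOG_FILE")
echo "errors=$ERROR_COUNT"
```

A real monitor would remember the previous count (or file offset) between runs and raise an alert only on new errors.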

Q) Is it possible to hide certain fields and their values (depending on profile)?

A) The MDM security mechanism allows you to define constraints to be used for display and hide field values in MDM Data Manager Client. Currently the MDM capabilities do not allow you to entirely hide fields upon a constraint setting. However, you can use the APIs for building a User Interface to allow display/hide of fields and attributes as required.

Q) Is it possible to trigger external processes depending on type of errors raised, for example alert management functionality?

A) Extended error handling with follow-up processing is currently not on the roadmap. However, the MDM Expression Language could be evaluated for this purpose.

Q) MDM stores change history in a separate database which can track the selected fields in any table, the before and after state of a record for that field and the user performing the change. As a result, if you activate too many fields or have frequent updates to the same field, you experience performance problems. How can I better manage this?

A) Limit the number of tracked fields to the minimum required. Establish a daily archive-and-purge procedure on the change-tracking log/database to keep its size to a minimum, ensuring optimal performance.

User Interface - Client and Web Front End

Q) Are saved searches shared between users or roles?

A) The saved searches (produced from the top menu Search -> Save current search) in the Client or the Syndicator are, in the current version, saved locally per repository. That means they are shared among different users working on the same workstation. Although this may seem like a limitation, it makes saved searches more flexible: the saved searches (as files) can be distributed to different workstations working with the same catalog, or accessed from a share.

Q) Can a saved search be shared between the client and the syndicator?

A) Searches are saved locally to a file and hence can be shared between the Client and the Syndicator by copying the files (with the .sqf extension) to the Syndicator or Client directories.

Q) Do file-shared searches break security restrictions?

A) Searches are merely sets of query criteria. In other words, every user who opens a saved search will see as results only the records he or she is allowed to see.

Q) There are too many search tabs in the Client's "Search parameters" pane. How can I pick up only ones I want to display?

A) In the Console, every field has a "Display Field" parameter which accepts Yes/No values. The Client's "Search parameters" pane shows only fields with "Display Field" set to Yes. Another way to hide a field is in the Client: right-click on the search tab and choose "Hide".

SRM MDM Catalog 3.0 - Page not found or not available / Error connecting to

Kindly check the Standard Call structure.

The catalog call structure is attached below.

Seq  Parameter Name  Parameter Value                             Type
10                   http://xxxx.xxx.com:53000/SRM-MDM/SRM_MDM  0 URL
20   username        Master                                     2 Fixed Value
30   password        newuser123                                 2 Fixed Value
40   server          XXX.XX.X.XXX                               2 Fixed Value
50   catalog         SRMMDMCAT20                                2 Fixed Value
60   UIlanguage      EN                                         2 Fixed Value
70   datalanguage    EN                                         2 Fixed Value
80   HOOK_URL                                                   0 URL
90   returntarget    _parent                                    2 Fixed Value
100  ~caller         CTLG                                       2 Fixed Value
110  ~OkCode         ADDi                                       2 Fixed Value

You can refer to my recently published article "SRM-MDM Catalog Setup - Ready Reference"; the SDN link is below:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/806f3d56-0d29-2d10-9abf-c0df6f0fdfe8?QuickLink=index&overridelayout=true
Kindly refer "Web-Service Configuration in SRM" and "Accessing of Catalog through SRM Portal" sections.


 


I had the same error initially. The steps that helped me are:

1. Create another user in the MDM Console, with only the Catalog User role.
2. Provide a password of your choice.
3. Enter that ID/password in the web service ID configuration in SRM, as per the steps enumerated above.
4. Note that the parameters are CASE-SENSITIVE, so maintain the remaining parameters as per the configuration guide on SMP.



MDM Server Settings

Several parameters used by the MDM Server are configured in the mds.ini file, which is stored in the directory where the MDM Server executable is located. These parameters are located in the top section of the file following the line:

[MDM Server]

Each parameter name is followed by an equal sign (=) and either a value or nothing. There is no space between the name and the equal sign and each name is case sensitive. The parameters are described in the table below.


Under Windows, mds.ini is located in the same directory as mds.exe. Under Linux/Solaris, it is located in the working directory of the MDM Server (the one from which the MDM Server is started); this is typically under the home directory of the user account used to launch the MDM Server.

By default, many of the settings are not entered in the mds.ini file and are added as needed. We assume that the security of this file is controlled via access to the MDM Server host machine.
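As a sketch, a minimal mds.ini might look like the following. The parameter values are illustrative only; the names come from the table below and follow the name=value convention (no space around the equal sign, names case sensitive):

```ini
[MDM Server]
Autostart=True
CPU Count=2
Mail Server=mail.example.com
Log SQL Errors=True
```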

The following table shows the MDM Server Settings file (mds.ini) parameters. For True/False values that have a default, the default is highlighted in bold.

Name

Description

Accelerator Dir

String. The directory path for the location of the MDM Server index files. On Windows, default is Accelerators under the MDM Server executable directory. On UNIX, default is under the working directory (which is usually the launch directory).

Allow Console to Delete Files

True/False. Allows the MDM Console or CLIX user to delete files from the MDM Server Report Directory.

Allow Console to Retrieve Files

True/False. Allows the MDM Console or CLIX user to retrieve files from the MDM Server Reports directory.

Allow Console to Retrieve SysInfo

True/False. Allows CLIX user to retrieve the contents of the mds.ini, Server*.xml, and Assertion*.xml files.

Allow Modification of mds.ini

True/False. Allows mds.ini parameters to be modified by the "CLIX xs" command.

Archive Dir

String. The directory path for the location where MDM archive files are stored and retrieved. On Windows, default is Archives under the MDM Server executable directory. On UNIX, default is under the working directory (which is usually the launch directory).

Autostart

True/False/Number. Causes the MDM Server to automatically perform a Load Immediate of all mounted repositories that are not outdated or already running elsewhere when the MDM Server starts. MDM treats a positive value as the number of seconds to delay prior to loading so that other processes, especially a DBMS, can fully initialize.

To load some but not all of the mounted repositories, place the line Autostart=False in the specific repository section of the mds.ini for each repository you don't want to load.

BULK_IMPORT_SILO_LEVEL

Hidden, development-only setting. If visible, should be set to 3. Using a lower value turns off optimized SQL access methods and reduces performance. Default is 3.

Client Ping Timeout Minutes

Number. Causes the MDM Data Manager (Client) to send a ping to the MDM Server after the specified number of minutes of inactivity. Use to keep the socket connection active on networks where inactive sockets are killed by the system. Default is 0 (no pings).

See also Inactive Client Timeout Minutes.

CPU Count

Number. The actual number of CPUs on the machine where the MDM Server is running. A dual core CPU counts as two CPUs. A hyper-threaded CPU counts as one CPU. Setting this parameter to a value higher than the actual number of CPUs can degrade performance. This setting applies only to building sort indices during the Load Repository operation as of MDM 5.5 SP5 but will be used to tune additional internal operations within MDS in the future.

Database Log Level

0 or 1. Enables (1) or disables (0) database log messages.

DBMS Reconnection Attempts

How many times the MDM Server should attempt to reconnect to a DBMS instance when an existing connection is lost. The wait period between attempts is 3 seconds. Default is 10.

DEBUG\UseAssert

True/False. Causes errors to Assert, which logs them to the Assertion*.xml. Valid when the MDM Server executable is built in DEBUG mode.

Default Interface Country Code

String. The code for the default region to use for the default interface language.

Default Interface Language Code

String. The code for the default language to be used by interfaces to the MDM Server.

Disable Log Retrieval

True/False. Disables retrieval of log and report files by the MDM Console. Applies to all platforms.

Distribution Root Dir

String. The directory path for the root of the fixed directory structure where files associated with the MDM Server's ports are stored and retrieved. Default is Distributions under the MDM Server executable directory.

Duplicate Repositories By Rows Only

True/False. MDM Server does not attempt to duplicate by the whole table method. Should be set to True if you have a SQL Server DBMS and the tempdb cannot grow large enough to accommodate the largest table.

Exclude Originals and Variants

Deprecated as of SP6 Patch 3. Use archive options instead (See Archive Options Dialog)

Extra DBConnection Validation

True/False. Makes sure that the DBMS connection is live prior to every DBMS request and silently restores it if necessary. Useful for the small minority of MDM installations where the network connection between the MDM Server and the DBMS is unreliable and frequently lost. Improves reliability but slows the MDM Server.

IBMDB2 Best Block Size

Number. Number of records the MDM Server attempts to insert/modify at a time. Default is 256.

Ignore Mismatched Images

True/False. Allows repositories to load when Thumbnails or Originals tables do not match main data tables. Use with caution. A mismatch usually indicates you are using the wrong Thumbnails or Originals partitions compared to the Main partition.

Import Slice Size

Number. Number of records which MDS will process as a set upon reception from Import Manager, MDIS, or Import Records API. Default is 2048 (1024 on Windows 32-bit). Higher settings risk out-of-memory failures and failed DBMS transactions. Lower settings reduce MDS performance.

Inactive Client Timeout Minutes

Number. Minutes of inactivity to allow from a client (MDM application, API, service, etc.) before the MDM Server starts sending ping packets to the client. Used to clean up dead connections on networks which kill inactive sockets. If the connection is alive, the pings succeed and nothing else happens. If the connection is dead, the ping attempts ultimately fail and the network informs the MDM Server that the connection is dead. Default is 0 (disabled). See also Client Ping Timeout Minutes.

Lexicon Dir

String. The directory path for the location where lexicon information is stored and retrieved. Default is Lexicons under the MDM Server executable directory.

Log Dir

String. The directory path for the location where MDM log files are stored and retrieved. On Windows, default is Logs under the MDM Server executable directory. On UNIX, default is under the working directory (which is usually the launch directory).

Log Protocol Transactions

True/False. Logs info about every client interaction with the MDM Server. Also logs socket timeouts. Slows the MDM Server.

Log SQL Errors

True/False. Adds SQL error info to the rolling log.

Log SQL Modifications

True/False. Logs every SQL modification of the underlying databases. Should be set to True only at SAP request since this generates a huge amount of rolling log info and slows the MDM Server.

LOG_REC_MATCH_PERF

0 or 1. Enables (1) or disables (0) the performance log for record matching.

Logged-In Role Maintenance

True/False. Allows role maintenance even when one or more users who are assigned to the role are logged on; otherwise, the maintenance of that role is prohibited until all the users assigned to that role log off.

Mail Server

The IP address or domain name of the mail server which the MDM Workflow feature uses to send notification emails.

Mail SMTP Timeout

Number. Number of seconds the SimpleMail client waits for a response from the mail server before aborting a mail task. The repository is locked during this period. Default is 1.

Master Slave Port

Number. Specifies the port number used for communications between master and slave repositories. Default is 20004. If another value is used, the new value must also be set on the mds.ini files of all MDM Servers where the master and slave repositories are mounted.

Max Initial MB to Retain

Number. Megabytes of modification notification packets that MDM Server will hold in memory while an API client is logging in. Necessary since there is a brief period where an API connection is marked active on the MDM side, but the API Client is not yet ready to receive the modification notifications. If this limit is exceeded, subsequent notifications will be dropped. Default is 4.

Max Large Text Length

Number. Maximum bytes read for Text Large fields during the loading of an MDM repository. If you have Text Large fields that contain more than this number of characters, you need to increase the number to prevent data truncation. Data truncation will cause the load to fail. Default is 100000.

See also Number of Rows Per Fetch.

Max Send Failure MB to Retain

Number. Megabytes of packet send data to any connection that will be held if the entire packet cannot be sent. Slow networks and busy clients are sometimes not able to receive the entire packet from the MDM Server. In these cases MDM will hold onto the data and attempt to resend it every minute. If this limit is exceeded, MDM will terminate the client connection. Default is 4.

MaxDB Best Block Size

Number. Number of records the MDM Server attempts to insert/modify at a time. Default is 2048.

MaxDB\Dll

String. Name of the MaxDB Interface Library (DLL, SO, or SL) for MDM Server. Default base name is libSQLDBC76_C. The extension depends upon the operating system (e.g. dll for Windows).

Maximum DBMS Bind Count

Number. The number of bind descriptors to pre-allocate for queries to the DBMS. Default is 512. If a greater value is required, MDS will write a message to the MDM Server log (in which case, increase the value in 50% increments until the error message is resolved, up to the limit of 4096).

MDS Ini Version

Number. The default and only valid value is 1.

MDS Scone

String. The encrypted password for MDM Console access to this MDM Server. Removing it will remove the password for the MDM Server.

Modifications Dir

String. The directory path for the location where master change log files are stored and retrieved. On Windows, default is Modifications under the MDM Server executable directory. On UNIX, default is under the working directory (which is usually the launch directory).

Number of Rows Per Fetch

Number. Number of rows to fetch per iteration during the loading of an MDM repository. Our tests have shown minimal increase in loading performance from raising this past 100; however, memory limitations may require it to be lower. The total memory required for loading a table's data is the size of all fields times this number. Note that for Text Large fields, the "Max Large Text Length" setting is used as the field size, so you may need to lower this value if you have Text Large fields containing long strings. If the required memory is not available, the load will fail. Default is 100.
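As a rough illustration of that memory formula, assuming one Text Large field and the defaults mentioned above (illustrative arithmetic only, not a sizing recommendation):

```shell
# Per-fetch memory for a single Text Large field:
# Max Large Text Length (bytes per field) * Number of Rows Per Fetch.
MAX_LARGE_TEXT_LENGTH=100000   # default Max Large Text Length
ROWS_PER_FETCH=100             # default Number of Rows Per Fetch
BYTES=$(( MAX_LARGE_TEXT_LENGTH * ROWS_PER_FETCH ))
echo "approx ${BYTES} bytes (~$(( BYTES / 1048576 )) MB) per Text Large field per fetch"
```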

Oracle Best Block Size

Number. Number of records the MDM Server attempts to insert/modify at a time. Default is 2048.

Oracle DBA Username

String. DBMS account (database) name having system privileges. Default is SYSTEM.

Oracle Tablespace Files

Number. When an MDM repository is created, determines how many files are used for each repository partition. This allows the repository to grow beyond the per-file limit, which is a function of block size times 4,194,303; for the default block size of 8 KB this is approximately 32 GB. Block size is determined when the database is created. If you expect to exceed that limit, set this greater than one so that MDS will handle the growth automatically, saving your DBA the trouble of adding tablespace files later. Default is 1.
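The per-file limit can be checked against the stated formula (illustrative arithmetic only):

```shell
# Per-file size limit = DB block size * 4,194,303 (maximum blocks per file).
BLOCK_SIZE=8192                       # default 8 KB Oracle block size
LIMIT=$(( BLOCK_SIZE * 4194303 ))     # just under 32 GB
echo "limit=${LIMIT} bytes"
```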

Oracle\Dll

String. Name of the primary Oracle Call Interface Library DLL for MDM Server running under Windows. Default is OCI.DLL.

Protect Family Nodes with Locked Data

True/False. When a family node is locked in the Publisher, MDM will reject any record modification which would result in that family node being deleted.

Protocol Log Level

0 or 1. Enables (1) or disables (0) protolog log messages.

RELEASE\UseAssert

True/False. Causes errors to Assert, which logs them to the Assertion*.xml. Valid when the MDM Server executable is built in RELEASE mode.

Remove Unchanged Field Modifications

Deprecated in MDM 5.5 SP6 P3. Use Skip Unchanged Records instead.

Report Dir

String. The directory path for the location where MDM report files are stored and retrieved. On Windows, default is Reports under the MDM Server executable directory. On UNIX, default is under the working directory (which is usually the launch directory).

Report Progress Percentage

Number from 0 to 99. Writes a line to the report file indicating progress each time the specified percentage of a table has been processed, in addition to the line indicating that processing the table is complete. Useful for time-consuming table operations. Default is 0 (does not write intermediate progress lines).

See also Report Progress Threshold parameter.

Report Progress Threshold

Number. Minimum number of rows a table must have before MDM will write lines to the report file based on the setting of the "Report Progress Percentage" parameter. For object tables, MDM uses 1/50th of the specified number as the threshold. Default is 1 (all tables regardless of the number of rows in the table).

See also Report Progress Percentage parameter.

Reserved Memory Megabytes

Number. Used only for Windows NT/2000. Specifies either a percentage or an exact amount of total physical memory that the MDM Server will request from the system at startup. Once the MDM Server allocates memory above this point (which occurs when repositories are loaded and in use), Windows will not reduce MDM's portion of memory below this point by paging. A number from 1 to 100 indicates a percentage; settings above 100 indicate the exact number of megabytes to reserve. Default is 0. When 0 or not specified, MDM accepts whatever minimum memory is allotted by Windows (usually 200 MB).

This setting is provided to overcome Windows' tendency to take back physical memory even when no other process is demanding it, reserving it for unknown programs that might run later and causing MDM to page even while a large amount of idle memory is available. Since MDM should be the principal or only significant application running on a Windows server, there is no need to let Windows keep such a significant reserve.

Session Timeout Minutes

Number. Causes MDM Console, CLIX, and applications based on the new Java API to expire after the specified number of minutes elapses. Default is 14400 (10 days). When set to 0, sessions never time out.

Skip Unchanged Records

True/False. Records updated by an import will not receive a change timestamp if their field data values are not changed by the import. Import- and syndication-tracking timestamps will also be skipped. Skipped records are recorded in the Import log.

SLD Registration

True/False. Registers the MDM Server with the SAP System Landscape Directory.

SQL Server Allow Windows Authentication Mode

True/False. Allows the MDM Server to attempt to connect to a DBMS via Windows Authentication Mode when a DBMS username is not provided. SQL Server only.

SQL Server Best Block Size

Number. Number of records the MDM Server attempts to insert/modify at a time. Default is 256.

SQL Server DBA Username

String. DBMS system database name. Default is MASTER.

Stemmer Install Dir

String. The directory path for the InXight installation directory.

String Resource Dir

String. The directory path for the location of the MDM Server string files. On Windows, default is LangStrings under the MDM Server executable directory. On UNIX, default is under the working directory (which is usually the launch directory).

Valid Keyword Chars

String. The valid characters for keyword indexing. Default is abcdefghijklmnopqrstuvwxyz0123456789.

Value Retrieval Threshold

Number. Limits the total number of data field elements that MDS will attempt to send back for operations that request multiple records. The formula is number of records * number of fields * number of languages. If the result exceeds this threshold, MDS returns a failure.
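A quick sanity check of that formula, with made-up numbers for illustration:

```shell
# Data field elements returned = records * fields * languages;
# if this exceeds Value Retrieval Threshold, MDS returns a failure.
RECORDS=5000
FIELDS=40
LANGUAGES=3
ELEMENTS=$(( RECORDS * FIELDS * LANGUAGES ))
echo "elements=${ELEMENTS}"
```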

Verify Attribute Linkage

Nocheck/Check/Relink/Delete. Determines how Verify Repair should handle attribute values that correspond to unlinked attributes. Nocheck does nothing. Check identifies orphan attribute values. Relink identifies orphan attribute values and relinks the corresponding attributes. Delete identifies orphan attribute values and deletes them.

Wily Instrumentation

True/False. Whether or not Wily Instrumentation monitoring is enabled.

Wily Instrumentation Level

Level of monitoring to perform. Default is 1.

Workflow Detailed Report

True/False. Writes detailed log information into the Workflows report. Used primarily for debugging.
