
Getting started with SAP NetWeaver MDM

The Beginner's Guide to MDM

Master Data Management - some common information

The following chapters provide general information about MDM. They are not tied to any specific MDM software or solution.

Surely SAP is not the only source of truth for MDM. The following links provide general thoughts and insights about MDM.

MDM Scenarios

There are three major generic scenarios for handling master data: Master Data Consolidation, Master Data Harmonization and Central Master Data Maintenance.

Master Data Consolidation

The main target is to consolidate master data for companywide analysis and reporting. Master Data Consolidation (MDC) defines the processes required for cleansing various types of master data in an interactive manner, in particular within a heterogeneous environment. After consolidation, it stores the information from different systems in a centralized repository. Related rich content can be included to augment the data store. Consolidated data is the prerequisite for leveraging the results, e.g. for performing enterprise-wide analytics like global spend analyses. The common process consists of the following steps:

  • load master data - including its related reference data - into MDM
  • enrich and cleanse master data in MDM
  • distribute related key and/or grouping information to BI for analytics

MDC does not include the re-distribution of master data records into their original source systems.

Master Data Harmonization

The main target is to send consolidated master data back to its source systems. Master Data Harmonization (MDH) enables consistent data maintenance and distribution to ensure permanent harmonization of master data. Using global attributes, you can ensure that all systems receive the same master data during distribution - and enrich the distributed data objects with additional attribute values in the target systems. Distribution can be controlled, visible, and traceable, with active status management at each distribution step. MDH defines the processes required for cleansing and distributing various types of master data in an interactive manner, in particular within a heterogeneous environment. It includes leveraging the results to perform enterprise-wide analytics like global spend analyses. The common process consists of the following steps:

  • load master data - including its related reference data - into MDM
  • enrich and cleanse master data in MDM
  • re-distribute master data back to the original source systems
  • distribute related key and/or grouping information to BI for analytics

MDH means that you can only update already existing master data records in their original source systems. You can neither create new master data records in any source system based on the consolidated data nor create new records centrally in MDM.

Central Master Data Maintenance

Central Master Data Maintenance (cMDM) supports companywide quality standards by ensuring that central control of master data begins as soon as the data is created. Centrally created master data can subsequently be distributed to client systems as required using interactive distribution. cMDM defines the processes required for the central creation of various types of master data in an interactive manner in MDM and the distribution of the data into a heterogeneous environment. It includes leveraging the results to perform enterprise-wide analytics like global spend analyses. The common process consists of the following steps:

  • create master data centrally in MDM
  • distribute master data to remote systems to create new or update existing records in the systems
  • distribute related key and/or grouping information to BI for analytics

cMDM usually requires executing the MDH scenario at least once to initialize the central MDM system with the master data already existing in the current environment.

SAP NetWeaver Master Data Management - what is in it?

SAP NetWeaver MDM helps you to consolidate, store, and enhance master data by improving data governance, data quality, change management, and alignment with business goals.

SAP offers a comprehensive portfolio of services to help you through all phases of your master data management (MDM) life cycle. The portfolio delivers consulting services from a corps of experienced SAP experts, a full curriculum of MDM training courses, and a field-tested roster of tools, methods, and best practices to ensure secure, efficient, and cost-effective deployments. We blend services at a level of engagement that makes sense for your organization. You benefit from targeted services and a foundation of knowledge to support all aspects and phases of your SAP NetWeaver MDM project.

SAP NetWeaver MDM services...

... deliver real value to your organization, including:

  • Effective MDM projects - Our comprehensive portfolio of services helps you through all phases of your MDM life cycle. You have the capabilities you need to plan, build, and run your MDM solution.
  • Efficient MDM deployments - Proven tools, methods, and best practices ensure secure, efficient, and cost-effective deployments.
  • Enhanced knowledge and skills - Our full curriculum of training courses enhances your understanding of MDM. Qualified instructors cover all technical and practical issues. You can take courses at training centers around the world or take advantage of onsite training for flexible scheduling and substantial cost and time savings. E-learning options offer remote access to SAP training systems.
  • Optimized MDM performance - SAP provides maintenance offerings and service-level agreements to meet your business needs. Delivered by a global team of experienced specialists, SAP support delivers the advice, action, and knowledge transfer you need to maintain and continuously improve your MDM solution.

SAP NetWeaver MDM Software...

... is an enabling foundation for enterprise services and business process management - providing a single version of the truth for customer, product, employee, supplier, or user-defined data objects.

Working across heterogeneous systems at disparate locations, SAP NetWeaver MDM ensures cross-system data consistency through interactive distribution. It integrates business processes across the extended value chain, delivering features and functions to enable:

  • Support of the scenarios Consolidation, Harmonization, and Central Maintenance as defined above.
  • Administration of master data - Manage master data without custom code. A powerful interface supports administrative tasks such as data exception handling and assignment of role-based access to business processes and information. Data managers use the interface to configure data source merging, business rules, and distribution details to downstream applications.
  • Management of internal content - Collect and centralize all your content - including parametric information and rich content such as images, paragraphs of text, PDF documents, and organizational intelligence about content - in an enterprise-wide repository.
  • Catalog search - Deploy intuitive interfaces that help you locate items internally, publish Web catalogs on e-commerce storefronts and in supplier enablement programs, and integrate easy-to-search catalogs into e-procurement solutions - all from a centralized repository, and all at speeds surpassing normal SQL-based queries.
  • Print catalog customization - Disseminate product information directly from a centralized catalog repository to popular desktop publishing programs, and automatically generate fully formatted and populated page layouts.
  • Multichannel syndication of product catalog content - Publish restructured and reformatted extracts or incremental updates of your product catalog content - and distribute them to trading partners in several delimited text and XML formats - on an unscheduled or regular basis.
  • Business process support - Enable communication in a heterogeneous environment, and insert master data into other systems. Details are part of the Master Data Integration pages in SDN.
  • Business analytics and reporting - Leverage synchronized data for reliable analysis and accurate reporting.

How to continue?

Find even more information on the main Getting Started with SAP NetWeaver MDM SDN page. A good place to drill down deeper into the world of MDM is the MDM eLearnings on SDN. Finally, you can continue browsing the SAP Master Data Management Wiki, check the FAQ about SAP NetWeaver MDM and the Links sections, and/or have a look at the different Wiki space categories to improve your MDM knowledge.

SAP Customers Talk MDM: Part Five - Surgutneftegas

The Customer: Surgutneftegas

Surgutneftegas, one of the largest oil and gas producers in Russia, employs innovative technologies and automated processes. To create an IT infrastructure and facilitate its transition to service-oriented architecture, the company deployed sophisticated SAP® software and technology. By helping to raise data quality, the solutions allowed Surgutneftegas to optimize inventory, reduce purchasing costs, and achieve a dramatic return on investment.

“SAP Consulting professionals helped us build a solution that supports our business processes with current, consistent master data. This project is the first step in building our new, service-oriented IT architecture.”

Rinat Gimranov, CIO, Surgutneftegas

Key software components involved:

  • SAP NetWeaver Master Data Management
  • SAP NetWeaver Portal
  • SAP NetWeaver Process Integration
  • SAP NetWeaver Business Warehouse
  • SAP ERP

For complete scenario and implementation details, click the link above.

Stay tuned for the next customer example.

Regards,
Markus

Markus Ganser is a solution manager in SAP Master Data Management (MDM).

What were you doing ten years ago?

Ten years is a pretty long time in the world of modern technology. Today I had reason to think back to what life was like 10 years ago in 2001. I decided to look up what was going on in the tech world and found that in 2001:

  • Apple introduced the iPod
  • Dell became the largest PC maker
  • Hewlett Packard announced plans to buy Compaq
  • Wikipedia was founded
  • Microsoft released the original Xbox game console
  • And most importantly, Sybase Afaria was the market-leading mobile device management solution

Back in 2001 enterprise mobility was a bit different too. Even though I worked for Sybase, I didn’t have a cell phone, there were no smartphones or tablets, I didn’t even carry a slim lightweight laptop. For enterprises, mobility was mostly about task workers – and mobility management was about controlling laptops.

A lot has changed over the past ten years. But one thing has remained the same – Sybase Afaria has been the mobile device management (MDM) market leader every year from 2001 through 2011. In fact, today we announced that leading IT market research advisory firm IDC, in its recent Worldwide Mobile Device Management Enterprise 2011 – 2015 Forecast and 2010 Vendor Shares report, recognized Sybase as the leader in the mobile device management (MDM) enterprise software market for the tenth consecutive year.

For well over a decade, Sybase Afaria has transformed enterprises and managed mobility partners around the world by empowering the mobile workforce. Sybase Afaria enables enterprise IT to fully manage and secure a wide range of personal and corporate smartphones and tablets, including Android, iOS, Windows Mobile and more, in both hosted and behind the firewall environments. Additionally, Sybase Afaria's seamless over-the-air (OTA) delivery of in-house and publicly available apps, combined with on-device enterprise portal technology, ensures that only approved mobile users are granted access to internal applications and data.

I encourage you to check out the press release on the IDC report, where IDC evaluated over 20 MDM vendors. You can also find a link to purchase the full IDC report here.

Milja Gillespie is director of mobility at SAP and drives thought leadership programs for the company's leading-edge mobility products.

You’re not the only one wearing a fireman’s hat: tactical to strategic treatment of information is still a struggle

I’ve been to quite a few customer meetings lately, which is awesome. Best part of the job. One consistent theme is that many groups are still failing to get traction for information management initiatives.  A few anecdotes:

  • CompanyA said that they were so decentralized that every individual region just did whatever they wanted in regard to the treatment of information, from each region having different ERP systems (from different vendors) to no global roll-up of key information elements. Now the company says they want to be strategic, but the mountain is *high*. Not only will they have major organizational change management issues to deal with regarding regional independence, but the amount of disparity in the information they have seems never-ending.  They are struggling to even identify which key information elements to start with.
  • CompanyB said that they have started strategic information management projects multiple times in the last two years, but every project has failed. Why? CompanyB was playing eenie-meenie-minie-mo to pick an executive sponsor. That sponsor would read some analyst reports and talk to a few internal people. From there, grand (but shallow) plans were developed, full of statements like “data is an enterprise asset”. But without a solid, tactical execution plan, and without an executive with a deep understanding of how information feeds an enterprise, these grand plans gradually fizzled and are now gathering dust.
  • CompanyC said that there has been so much business volatility that they are barely treading water in IT. Projects aren’t planned strategically…there are always fires to put out. IT and Business are constantly rushing to simply put out the largest fire of the week. Consequently, each fire is put out as quickly as possible (it IS a fire, after all). The short-term impact of this approach is obvious, but also consider this: business units accustomed to dealing with fires have more trouble understanding why they need to be strategic in their approach to information management. They don’t understand how information propagates through the organization and has long-term effects, because, after all, the fire from last week is no longer burning. Something must be working.

In all of these cases, some employees can see the problems. They show up to workshops and conferences and you can clearly see the frustration on their faces. Their questions are not technical. They are not asking about Hadoop integration. They want to know how to get their organization to see that the poor information virus has not just made the person in front of them sick, but has become an epidemic.

If only I were that smart. Every organization is different, and the answer to the question depends so much on politics and organizational makeup (for more, see the blog entry “Refrigerator smells and information governance”). However, there are a few approaches to follow:

  1. Keep a Data Quality Tales of Woe notebook, as recommended by Maria Villar. In this notebook, capture the stories you hear around your organization. Also note who told you the story and the impact of the bad data. Then, as you have time, track down the real root cause of the data problem. Finally, go back to the story originator and tell them what you’ve discovered. At that point, you’ve converted them into a Friend of Data. (And yes, I’m considering adding FoD to my signature. Why should Hall of Famers be the only ones with HoF? FoD is good enough for me!)
  2. As the notebook fills up, you should be able to notice some common themes. Start rolling up the Tales of Woe into multiple theme areas (e.g. Customer, Vendor, or Material; BI, Business Process, or Marketing)

Guess what you just did? You captured a great list of information problems and their real business value. No guessing…straight from the horse’s mouth.

  3. Now look at which area has the most damaging Tales of Woe. Armed with your notebook and business language to describe your problem, start to work your network. Eventually, you should get to an upper-level manager in that area who can help you get executive sponsorship.

Only that executive manager can accomplish these key tasks for you:

  • Establish funding and multi-year commitment to an information program.
  • Spearhead incentive programs that reward data creators and consumers for the right behavior (for example, are call centers rewarded for how fast they key in the information, instead of how accurate the information is?).
  • Drive the organizational change management across the company. Without this, your program will not work. Did I say that out loud? True.

Give it a try. Everyone likes to tell Tales of Woe, so starting should be easy!

Ina Felsheim is a Solution Manager focusing on EIM.

Key EIM and Information Governance sessions for SAP TechEd 2011

Markus Ganser wrote a great blog on some key EIM sessions at SAP TechEd. I’m going to highlight a few from the TechEd Las Vegas session, because that’s where I get to go. I’m also including Expert Networking Sessions and Pod discussion topics from the show floor. Because Information Governance spans so many technology areas, I thought I’d pull them together for you.

Pre-conference seminar: Newly added SAP Master Data Governance session on Monday.

Key Hands-On sessions

  • EIM266: Next Generation Archiving: Extend Compliance in Your Corporate Environment
  • EIM161: Using Business Content Extractors in SAP BusinessObjects Data Services
  • EIM164: SAP Data Migration Solution Overview: Best Practices Hands-On
  • EIM163: Profile and Create Data Quality Scorecards to Understand the Health of Your Data
  • EIM267: The Importance of Cleansing and Standardizing Product Data
  • PMC265: Accelerating Business Rules with SAP NetWeaver BRM
  • PMC263: Process Analytics with SAP NetWeaver Business Process Management
  • EIM260: Getting Started with SAP Master Data Governance
  • EIM262: SAP NetWeaver Master Data Management and SAP BusinessObjects Information Steward

Key Lecture sessions

  • SAP BusinessObjects Data Services 4.0 and Beyond
  • EIM102: SAP Data Migration Solution Overview: Best Practices
  • EIM204: Connecting ECM to Business Processes: Evolving Needs and Technologies
  • EIM100: Enterprise Information Management Overview
  • PMC228: IDEXX Runs Marketing Points Program on BRF Rules Engine
  • PMC228: Applied NetWeaver BPM and BRM in Agile and High Volume Scenarios
  • PMC102: Business Rules Management with SAP: BRFplus and SAP NetWeaver BRM
  • EIM203: SAP BusinessObjects Information Steward 4.0 Product Overview
  • EIM112: Strategies and Tools to Ensure the Quality of Your SAP Data
  • EIM114: Information Governance: Reducing Costs and Increasing Customer Satisfaction
  • EIM211: Showcasing MDM Workflow Integration with BPM and Data Services
  • EIM106: SAP NetWeaver MDM for Customer Data Integration: Rapid Delivery in Eight Weeks
  • EIM201: Applying Information Governance in End-to-End MDM Scenarios

Pod topics on the SAP Show Floor

  • EIM-P11: Access and transform data from any source (Tuesday at 10am)
  • EIM-P01: SAP BusinessObjects Data Services Roadmap (Tuesday at 11am)
  • EIM-P19: SAP NetWeaver Information Lifecycle Management (Tuesday at 12pm)
  • PMC-P06: Business Rules Management with SAP (Tuesday at 2pm)
  • EIM-P17: Master Data Management at SAP (Wednesday at 10am)
  • PMC-P02: SAP NetWeaver Business Process and Rule Management (Wednesday at 1pm)
  • EIM-P14: Support your Information Governance program with Information Steward (Thursday at 2pm)

Expert Networking Sessions

  • Tuesday at 2pm: Guidelines on Starting a SAP Data Archiving Project (Karin Tillotson)
  • Wednesday at 11:30am: Build your own network with SAP StreamWork (Sharon Haver)
  • Wednesday at 1:30pm: Getting started with Information Governance (me!)
  • Thursday at 6pm: How to organize and deliver a business rules project (Carsten Ziegler)

PLEASE join me in the Influence Council session for EIM350: Enterprise Information Management on Tuesday afternoon.

To get a complete picture of the lectures and hands-on sessions offered in the Enterprise Information Management track, click the EIM session overview link and select the topics that you are interested in and would like to attend. To display the overall education and session catalogue, click the URL at the top of the blog.

Mark your calendars! Join us at a lecture, hands-on session, Expert Networking Lounge, or Expert Pod. Find us when we're there by sending a tweet to @InaSAP, @SAPMDMGroup, @SAPILM, @SAPECM, or @SAPBOEIM. 

Ina Felsheim is a Solution Manager focusing on EIM.

Buy vs. Build an MDM (Master Data Management) Solution

Goel Ankur

I believe this is an interesting topic to debate. During most of my pre-sales engagements we touched upon this point as well. I have been part of ABAP-based developments and also part of configuration and implementation projects. However, I never had an opportunity to work on a development project that built a complete solution from scratch, and hence I would like to clarify my understanding of the topic.

In my earlier blogs I also mentioned that problems with data have existed for a long time, and there were many solutions catering to those challenges and requirements. For master data challenges and requirements, however, the MDM solutions currently available in the market didn’t exist a decade ago, so to address and solve master data problems organizations had to take the lead and develop in-house solutions. By the time packaged MDM solutions were created, many organizations were already maintaining their own, and we can discuss some of the issues organizations faced in maintaining such solutions. Moreover, master data solutions are driven more by business requirements and decisions than by IT requirements; Gartner identifies MDM as a 'technology-enabled business discipline'. Hence many organizations still believe that it is better to build their own MDM solutions, since they know their master data and the issues related to it better than the solution providers do, and they would in any case have to invest considerable time and effort in configuring a packaged MDM solution to their requirements. I definitely agree that nobody understands an organization’s master data and its issues better than the organization itself. However, it takes considerable effort, direction, and knowledge to develop your own MDM solution, and unfortunately many organizations overrun their budgets and timelines in doing so. This is largely because an MDM solution is not only a data quality solution: it is a quite complex solution combining data quality, governance, management, process workflows, and integration across the landscape, catering to all the different systems. Moreover, once the solution has been built, it requires resources to maintain. This can be more costly, and require more effort to keep dedicated resources, than packaged MDM solutions for which experienced resources are readily available.

Below are some key considerations that are important when making a decision:

  • Web Services, SOA enabled and EIM
  • Data volumes
  • Hierarchies in master Data
  • Dimensional and Domains of Master Data
  • Duration
  • Budget
  • and Resources (Build and Maintain)

It makes sense that organizations which already have their own MDM solution, as long as it supports all their requirements and resolves their issues, have no specific need to migrate to another solution. However, if an organization is not happy with its own solution, is looking to enhance it, or still doesn’t have a solution in place, then it might be better to explore the various MDM solutions and implement one as per the requirements.

Moreover, it’s important to note that many MDM solutions have now existed in the market for more than 5 years and have improved considerably. They are definitely on the right track to provide an integrated solution.

Testing and Monitoring an Interface Between MDM & XI Part 2

  • Select a message and press Display.

You may notice that I have selected a message that contains an error and did not actually reach its destination. In Call Adapter -> SOAP Header, take a look at Error. If you double-click that button, a screen will appear on the right-hand side that shows the details of the error.

This error tells us that something is wrong with the IDoc adapter. It tells us that transaction IDX1 contains errors, but in this case the error is actually in the configuration of our communication channel, in which we have made reference to the wrong Port. If you select Call Adapter -> Payloads, you can see the content of the XML message that came from MDM.

If you go back to SXMB_MONI, you may also want to take a look at the Processing Statistics program, which shows a good overview that can be helpful when testing your interface with thousands of materials.

3. Testing

Now we're going to test the interface from end to end. I'm assuming that by now you have turned on the MDM Syndication Server and that your XI interface is activated in the Integration Directory. Let's log into the MDM Data Manager and create a new material for testing purposes.

  • Right-click -> Add
  • Enter enough data to satisfy your interface requirements (i.e., which fields must be populated?)
  • Click on another material to save changes
  • Close the MDM Data Manager
  • Turn on your MDM Syndication Server (if it's not already turned on)

If your Syndication Server settings have been configured correctly, then we can assume that because you added a new material in the Data Manager, it will syndicate as soon as your interval cycles through (set in the mdss.ini file on your server). Let's move over to the Exchange Infrastructure Runtime Workbench to see if it has processed our message. Keep in mind that, depending on your interval time, it may take a few minutes. Hopefully you should see something like this:

If the Runtime Workbench shows the message transferred successfully, then let's log into ECC and see if the IDoc was posted.

  • Log into the ECC system
  • Run transaction WE02
  • Press F8
  • In the left-hand pane, select Inbound IDocs -> MATMAS
  • In the right-hand pane, select the IDoc that just transferred and double-click it
  • In the IDoc display, on the left-hand side expand E1MARAM and select E1MAKTM
  • Verify that the material data is correct
  • Expand Status Records -> 53 and double-click the only record available
  • In the pop-up window, copy the message number that was issued to the IDoc
  • Press Proceed
  • Paste the message number that you copied
  • Press F8

You may notice that my image says material 11696 created. This is because a modification was made to an ABAP program to create a material when an IDoc is processed with a certain code. In this blog, the ABAP modification is out of scope, but I'm assuming that if you are familiar with ALE then this process should be familiar as well. In any case, this is not a permanent solution, just a temporary one to finish our prototype. If we take that newly generated material number and run transaction MM02, we should be able to pull up the details of that material.

Press Select Views, select Basic Data, and continue.

Hopefully, if all went as planned, the material transferred smoothly, with no loss of data. This concludes the three-part series on MDM and XI. Thanks for reading; hopefully it helps!

Testing and Monitoring an Interface Between MDM & XI Part 1

2. Exchange Infrastructure

Now we'll take a look at the second half of this scenario and test out our XI interface.

2.1 Check Configuration

The only configuration we are going to check is the outbound communication channel. This is what tells Exchange Infrastructure where to pick up which file (location, filename) and what to do after it's processed by the inbound communication channel (processing mode, i.e. delete).

  • Start your Integration Directory (Integration Builder: Configuration).
  • Navigate to your outbound communication channel.
  • Examine your File Access Parameters.

In my case, because this is a test scenario, I have a bash script picking up the file from the port directory and dropping it onto a drive that all of the SAP systems have access to, this being the /depot filesystem. As you can see, I made a temporary folder on that filesystem where the files for this interface are stored while waiting to be processed. Of course, the simplest way to do this would be to mount the Port directory from your MDM machine on your XI machine. Next, take a look at your Processing Parameters and change the settings accordingly. For this particular scenario I have set the poll interval to 5 seconds for testing purposes. Also, notice that I am using delete as the processing parameter. This is so that I can verify that the file was processed, and so the folder doesn't get cluttered up with files.

If everything is the way you want it, let's take a look at some important locations that will come in handy for testing and debugging the interface.

2.2 Important Locations

2.2.1 Integration Repository - Map Testing

Start the Integration Repository (Integration Builder: Design) and navigate to the map that we built in Part II. Select the Test tab.

To test our map, we can actually use the XML document that MDM generated via the Syndication Server. Let's try this.

· Press the "Load Test Instance" button.

clip_image004

· Select the XML file MDM generated.

clip_image005

· Press the "Start Transformation" button.

clip_image006

If everything went smoothly, you should see a pop-up screen that says "Executed successfully". Otherwise you will receive an error, from which you can begin your debugging process.

2.2.2 Runtime Workbench - Component Monitoring

The runtime workbench is one of the most powerful and useful features of Exchange Infrastructure. Here we can get very detailed descriptions of errors that may occur with each component of XI. The component that we will want to pay particular attention to is the Adapter Engine.

  • Log into your Runtime Workbench and select Component Monitoring -> Display.
  • Click the Adapter Engine link.

Here you can view the status of the adapter. If there is an error in your configuration of a particular adapter, it will show up here.

2.2.3 Runtime Workbench - Message Monitoring

Follow a similar procedure to display the Message Monitoring.

  • Select your time filter; in this case I will select the last hour.
  • Press Start.

You can now see the list of messages that have been processed by the Adapter Engine over the last hour. On my system, only one message has been processed in the last hour. You can press either Details or Versions to view more information about any particular message that was processed.

2.2.4 Integration Engine - Monitoring

This is a particularly useful component of Exchange Infrastructure that allows us to view various aspects of the messages that get processed. Let's start by logging into the XI system and taking a look.

  • Run transaction SXMB_MONI.
  • Double-click Monitor for Processed XML Messages.
  • Press F8 or the Execute button.

First look at SAP's Data Migration Solution

Recently my role at SAP has changed a bit; I am still in solution management but now focused on Data Migration. My first steps have been to learn the scope of the topic, what is going on with it, who's involved, all that good stuff. Being a big fan of SCN, I went to check out who is posting what and saying what about data migration. I expected to find a lot of discussion, since data migration is a common topic and really a practical requirement for implementation; instead I found only a few discussions, and even fewer discussions by customers. (Please send any good links you have! I did find a few good posts, but compared to other topics I've covered, it seemed there was very little information on the topic!) Maybe it's not talked about because it's not seen as much ‘fun' as new development, or maybe no one has taken the time to write about their experiences. Either way, I hope to share what I'm learning and hear about what you already know on the topic!
In this first blog I'll discuss what I've learned so far about SAP's solution for Data Migration. Then I'll continue to blog about working with the tools, how hard or easy they are to learn, and the effort it takes to up-skill on our offerings for Data Migration. So, first things first, let's talk about what SAP offers for data migration projects:
Option 1 - A new SAP customer migrating data into an SAP system, or a current customer bringing in new plants, new business units, etc. that need to migrate data to an SAP system.
This is the option I am currently exploring (and whose software I am downloading onto my laptop), so this is the one I'll be blogging on moving forward. SAP provides software, primarily Data Services; there is some other software too (like the Metadata Manager and some reports), but Data Services is where the bulk of the work takes place. This is good news for me! Last year I did some blogs and e-learning on SAP101 for Business Objects experts, and I've been trying to up-skill on the Business Objects offerings. Before this position I had dabbled in Data Services: I downloaded it and got a small transform working moving data from SAP. I was impressed because it only took me about half a day to go from knowing nothing to reading something, downloading, installing, and getting a small job running. So, I was glad I'd be learning more about Data Services! If you are an expert in SAP and want to learn about the Business Objects offerings, maybe you should start to work with Data Migration!
In addition to software, SAP also provides Best Practice content. I've always known that we have a Best Practice organization; I worked with them previously on some Best Practice content for SAP NetWeaver Process Integration, but I didn't really understand the depth of the offering. If you go to the help portal, there is a tab for "Best Practices". On the overview page you'll notice it says "SAP Business All-in-One", but don't let that scare you away from exploring the content. The content for Data Migration can be found under the Cross-Industry Packages; then select Data Migration. This is like finding a gold mine for data migration! Or maybe an encyclopedia would be a better description! This content has everything you need to get started on migrating non-SAP data to an SAP system. In a subsequent blog I'll walk through the steps I took to read, learn, download, and install, but for now the important thing to know is that content plays a big role in fast development. The content includes the following: guides to install Data Services and the other components required for the migration, actual content to load that includes jobs to load data into SAP via IDocs, mapping tools to help you map the non-SAP data to the IDoc structure, and some reports. It includes IDoc mappings and structures for objects like material master, vendor and customer masters, pricing, BOM, cost element, and some receivables content. I haven't explored all of it in detail, but I did notice that once I installed the content and documentation I had a whole folder of Word documents on each piece of content; for example, the document on Material is a 39-page Word document covering the IDoc structures, what you need to know, and how to map data to the structure. At this point I have Data Services installed and the content installed, but I'm still checking out everything that gets delivered.
The third item delivered by SAP for data migration is services. Of course, data migration is more than the software and technical tool-set; you also need a methodology for the project, testing, ensuring project success, etc. SAP provides that service: we have an expert consulting community ready to assist. SAP Data Migration Services consist of a framework, templates, a methodology, tools, and expertise to analyse, extract, cleanse, validate, upload, and reconcile legacy data into an SAP ERP environment. SAP Data Migration Services provide a mature information management infrastructure and enable data governance best practices that live on after the project. SAP's Data Migration Services extend the delivered content and software with additional templates, a methodology, and a holistic view of how the data migration project fits in with the overall SAP implementation and how to use data management and governance to ensure data is a strategic asset, enabling successful business process execution.
For a full list of available EIM services, look here.
If you'd like to see a good demo of a data migration example using Data Services, you can check out the demo available from the Service Marketplace. It's part of the page http://service.sap.com/bp-datamigration.
In addition to migrating data to SAP, a decision must be made on what to do with the source system. Does it remain? Should it be archived? Maybe the system should eventually be decommissioned to save costs. SAP also offers capabilities to decommission systems via Information Lifecycle Management. I'm not an expert in this area, but it is part of the data migration story, so I am trying to learn more about the offering.

Option 2 - Migrating from one or many SAP systems to another SAP system
In this option maybe you have multiple SAP systems on different releases, say one on 4.6C and one on 4.7, and you want to consolidate to a single ECC 6.0 system. It could also be the case that you don't take everything, but only parts of the data. As part of this scenario you could do system decommissioning with Information Lifecycle Management, as with the previous option. For the actual conversions of the SAP systems, SAP offers a system landscape transformation offering from the SLO group (System Landscape Optimization). This group specializes in ‘carving out' parts of SAP, like a company code, to move or to convert. For example, when the Euro was introduced, a new currency was not just added to systems; some conversion was required as well. SLO specializes in these sorts of SAP-centric data conversions. SLO understands the customizing/configuration, meaning that in addition to the data it understands the business context for the data, and can convert the data and the associated business process rules and configuration. I'm learning more about this area and will be happy to share what I learn in the future!
Option 3 - Migrating to other systems, for example master data management or data warehouse
This option includes migrating to any other system: it could be master data management, a data warehouse, or another system. This option will still use the Data Services software, and it will also use blueprints, which are guides for Data Services that include content for common ETL and data quality scenarios. I haven't investigated the blueprints yet. If you have experience in this area, please post away!
If you're interested in this topic, please blog your experiences; I'll continue to blog mine. I'll be speaking from the perspective of someone new to Data Services and migration projects. I hope you'll join me along the journey!

Why machines can learn from peanut butter sandwiches and why they should forget about it over time

You might remember my last interview with Matthias Kaiser about semantics and apples. In that interview Matthias addressed the aspect of objects and how machines need to be able to give a description of an object the correct meaning. He used the example of an apple, which could either be something to eat or a computer.
I met with him again, and this time Matthias explained two other aspects of semantics to me. Machines don't just need to know what kind of object (e.g. an apple) they are dealing with, but also what kind of context the user has who, for example, searches for specific information. In the business world this context is normally the role of a user in a company. Combining the search results of a user with a specific role, and using that information for other, maybe similar searches by users with the same role, makes a search algorithm much more precise and relevant for an enterprise user.
But it's not only about storing this information. It's also about being able to let the machine "forget" certain information that might become irrelevant over time.
Listen to Matthias and find out what peanut butter sandwiches have to do with all of this and how these concepts find their way into SAP products.
I really loved how Matthias explained these complex topics to me, and during the interview he also started talking about a new concept he has developed called "Unified Information Access". But that will be part of another interview with Matthias.

BRFplus and MDG – the future ahead

Recently I was going through the documentation of MDG and I was wonderstruck by the offerings SAP has delivered, particularly the integration of BRFplus and SAP Business Workflow with Master Data Governance (MDG).
All the views provided here are my personal opinions, based on my experience with data management, and do not necessarily reflect my employer’s. Please treat the inputs in this blog as just opinions; you should definitely seek professional expertise before making your business decisions.

BRFplus is one of the core components in Master Data Governance.
The way SAP has integrated BRFplus into the data load into MDG is mind-blowing. Unlike the traditional way of writing ABAP code for validations and derivations, SAP has closely knit BRFplus with MDG to carry out these operations. This gives business users the flexibility to modify business rules without much dependency on the technical team.
BRFplus can be used to define validation rules and derive values as per the business rules. We can use rules, rulesets, decision tables, decision trees, and many other features of BRFplus to carry out these activities.
Of course, SAP has still given the option of using BAdIs for writing these validations and derivations.
Another important feature SAP has provided is the close integration of BRFplus with SAP Business Workflow in MDG. SAP has delivered many pre-configured features and standard workflows for this purpose. These are handy for creating and modifying complex rules in workflows using decision tables and decision trees, which is very effective for ever-changing business demands.
With BRFplus set to be the core of business rules management at SAP, and its ever-increasing usage in SAP applications such as CRM, Transportation Management, Social Sector, and Tax and Revenue Management to name a few, techies should focus more on BRFplus and understand its full potential.
There is huge potential for BRFplus in the coming years, particularly in workflows, validations, and derivations in MDG, integration with third-party decision management systems, and more. No doubt this will give clients return on investment (ROI) and true value; more significantly, business users are empowered to modify business rules with ease in response to ever-changing and dynamic business decisions.
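To make this concrete, here is a minimal ABAP sketch of calling a BRFplus function to derive a value, using the generic BRFplus processing API. The function ID and the context element name COUNTRY are made-up placeholders; in a real MDG scenario they would come from your own BRFplus application:

DATA: lv_function_id TYPE if_fdt_types=>id,
      lv_timestamp   TYPE timestamp,
      lt_context     TYPE abap_parmbind_tab,
      ls_context     LIKE LINE OF lt_context,
      lv_result      TYPE string.

FIELD-SYMBOLS: <lv_value> TYPE string.

* ID of the BRFplus function to be called (made-up placeholder)
lv_function_id = '00145E5E46BA02EE98D94258A8A1E2F3'.
GET TIME STAMP FIELD lv_timestamp.

* Fill the function context with the attribute value to evaluate
ls_context-name = 'COUNTRY'.
CREATE DATA ls_context-value TYPE string.
ASSIGN ls_context-value->* TO <lv_value>.
<lv_value> = 'DE'.
INSERT ls_context INTO TABLE lt_context.

* Process the BRFplus function and retrieve the derived result
cl_fdt_function_process=>process(
  EXPORTING iv_function_id = lv_function_id
            iv_timestamp   = lv_timestamp
  IMPORTING ea_result      = lv_result
  CHANGING  ct_name_value  = lt_context ).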

Cross System Master Data Processes with SAP Master Data Governance

Applies to:

ERP 6 EhP5 – Master Data Governance for Supplier

Summary

This article provides implementation details for a simplified cross-system supplier on-boarding scenario leveraging SAP’s Enterprise Master Data Management portfolio, consisting of SAP NetWeaver Master Data Management (available since 2004) and SAP Master Data Governance (currently in Ramp-Up). The overarching process is modeled using SAP NetWeaver Business Process Management.

Author: Lars Rueter
Company: SAP AG, Germany
Created on: 4 March 2011

Author Bio

Mr. Rüter works at SAP in the area of SAP Master Data Governance. In the past 11 years at SAP he has held different positions in Asia Pacific and EMEA. He has extensive experience in SAP's Master Data Management product portfolio, Java development, and SAP NetWeaver Portal. Mr. Rüter has been involved in a large number of SAP implementations worldwide.

Cross-system create supplier process

In our example we will build a cross-system supplier self-service registration and approval process. A supplier registers via a website and enters some initial data such as company name, street, city, and postal code. These global attributes are stored in NetWeaver MDM for further distribution to non-SAP systems. When the supplier is approved by the master data specialist, a change request is automatically generated in SAP Master Data Governance. A workflow in SAP Master Data Governance ensures that all ERP-specific attributes are maintained. After final approval in SAP Master Data Governance, the new supplier is activated and distributed. After activation, a notification is sent to the original requester.


Figure 1: High-level process overview

In figure 2 below you see which systems are part of the process:

  • (A) SAP Master Data Governance – maintenance and distribution of ERP-specific attributes
  • (B) SAP NetWeaver MDM – maintenance and distribution of global attributes
  • (C) NetWeaver CE – process runtime, process-specific UIs, process worklist, web service consumption and provisioning
  • (D) NetWeaver Developer Studio – process design time


Figure 2: System Landscape

As mentioned above, the process was implemented using NetWeaver BPM for the design and execution of the cross-system process. But we also leverage the out-of-the-box governance process in SAP Master Data Governance for the maintenance of ERP-specific attributes.


Figure 3: Technical process overview

The figure above provides a more technical process overview using the Business Process Modeling Notation (BPMN) from SAP NetWeaver BPM:

  • (1) The initial supplier registration web page triggers the start web service of the BPM process
  • (2) The global attributes from the registration web page are used to create a new supplier record in SAP NetWeaver MDM
  • (3) In this human interaction step, the new supplier is approved
  • (4) BPM calls an SAP Master Data Governance Web Service to create a change request with the initial supplier data. This also triggers an SAP Business Workflow in SAP Master Data Governance.
  • (5) This step in BPM is called an intermediate message event. The process waits for a message to come in from Master Data Governance before the flow continues. Early in the SAP Business Workflow process we have inserted a task to call BPM. In this call we transmit the ID of the change request.
  • (6) BPM uses the change request ID from SAP Master Data Governance to send an e-mail to the original requestor. The e-mail contains a link to the SAP Business Workflow log. Using this link, the original requestor can monitor the status of the change request in MDG.
  • (7) After sending the e-mail, the BPM process waits again for a message from SAP Master Data Governance. This time, SAP Master Data Governance sends a message at the end of the SAP Business Workflow process, after the supplier has been finally approved and activated. The message includes the final ID of the Business Partner in the primary persistence.
  • (8) The last step in the BPM process informs the original requestor that the new Business Partner has been created and activated.

Implementation Steps

Integration between NetWeaver MDM and BPM has already been sufficiently documented on SDN. In this section the focus is on the integration between SAP Master Data Governance and NetWeaver BPM. Therefore we look specifically at the three integration points numbered step 4, step 5, and step 7 in figure 3 above. In step 4 we show how the inbound communication to SAP Master Data Governance was realized using a standard Web Service. Steps 5 and 7 are technically very similar in the sense that they both use a web service client proxy to transmit process status information from the SAP Business Workflow back to SAP NetWeaver BPM.

Using the inbound Business Partner Web Service

The ESR Web Service used to create a Business Partner in our scenario is called BusinessPartnerSUITEBulkReplicateRequest_In. In order to leverage this Web Service to automatically create a change request and key-mapping in SAP Master Data Governance, the method INBOUND_PROCESSING of BAdI MDG_SE_BP_BULK_REPLRQ_IN in Enhancement Spot MDG_SE_SPOT_BPBUPA has to be implemented.

IF_MDG_SE_BP_BULK_REPLRQ_IN~INBOUND_PROCESSING

METHOD if_mdg_se_bp_bulk_replrq_in~inbound_processing.

  DATA ls_user_setting TYPE mdg_user_proxy_setting.
  DATA lt_user_setting TYPE mdg_user_proxy_setting_t.
  DATA lv_crtype       TYPE mdg_sup_change_req.

* Only process messages whose business scope id is set to 'BPM'
  IF in-message_header-business_scope-id-content = 'BPM'.

*   Store the incoming data via the proxy persistence (staging area)
    ls_user_setting-field_name  = 'PROXY_PERSISTANCE'.
    ls_user_setting-field_value = '1'.
    APPEND ls_user_setting TO lt_user_setting.

*   Determine the change request type from the first SUP1 process type
    ls_user_setting-field_name = 'SUPPLIER_CHANGE'.
    SELECT SINGLE usmd_creq_type INTO lv_crtype
      FROM usmd1601
      WHERE usmd_process = 'SUP1'.
    ls_user_setting-field_value = lv_crtype.
    APPEND ls_user_setting TO lt_user_setting.

*   Set up the file upload framework so that a change request is created
    CALL METHOD cl_mdg_bp_bupa_si_in=>if_mdg_upload_proxy~setup_for_file_upload
      EXPORTING
        iv_instance     = 1
*       io_upload_dialog =
        it_user_setting = lt_user_setting.

  ENDIF.

ENDMETHOD.



The code first checks the scope-id element in the message header. The SAP Master Data Governance load will only continue if the scope-id element is set to BPM. The proxy implementation of the inbound service uses the context of the SAP Master Data Governance file-upload framework to determine how the incoming data has to be processed. We use the enhancement spot to set the file-upload framework context in such a way that the incoming data is stored in the SAP Master Data Governance staging area and a change request of type SUPPL01 (Create Supplier) is created. If key-mapping information was sent as part of the Web Service call, the key-mapping for the new supplier will automatically be updated.

The ABAP code in the Enhancement Spot looks for the first process type SUP1 in table USMD1601 and takes the change request type from that line. In our example, LRDEMO will be selected as the change request type when the web service is called (refer to table USMD1601 in figure 4 below). You may have to adapt the ABAP code to ensure your custom change request type (as defined in the following section, Customizing the governance process) is correctly assigned in the Enhancement Spot.

Figure 4: Table USMD1601

An example XML document to test the web service is attached to this wiki.

Test your scenario – Inbound Web Service


You should now test whether the implementation is working. Using a Web Service test tool such as the SAP Web Service Navigator, you can call the Web Service. After successful execution you should find a new change request in the POWER List (Personal Object Work Entity Repository). You can access the POWER List via the supplier role in SAP Master Data Governance.

Customizing the governance process

Your Web Service is working? Good! Your inbound connection to SAP Master Data Governance is now ready. Next we need to establish the outbound connection to NetWeaver BPM. In our example we extend the governance process for create supplier by two additional SAP Business Workflow tasks. Each of the two tasks sends a message to NetWeaver BPM.

Since we do not want to modify the SAP-delivered workflow template and change request type, we first create copies.

  • Look up the id of the workflow template for Create Supplier: Open MDG IMG activity Create Change Request Type and find the row with change request type SUPPL01 (Create Supplier). In the same row you find the workflow template id for this change request type.
  • Open the Workflow Builder (transaction SWDD) and create a copy of the SAP-delivered workflow template for Create Supplier (use the workflow template id from the previous step). Do not forget to save and activate your new workflow template.
  • In MDG IMG activity Create Change Request Type, create a copy of the SAP-delivered change request type SUPPL01 (Create Supplier). Assign the new workflow template id from the previous step to the new change request type. In our example we have created a custom change request type LRDEMO, which is linked to workflow template WS99900008 (Figure 5 below).

Figure 5: IMG activity Create Change Request Type

  • In MDG IMG activity Define workflow step numbers, create a copy of the rows from the SAP-delivered workflow template for the create supplier workflow template. In your copied rows, change the workflow template id to the id of your custom workflow template.

Figure 6: IMG activity Define Workflow Step Numbers

  • To ensure that receiver determination will work for your new change request type, enhance the BRF+ table GET_AGENT. Start transaction BRF+ and search for MDG as shown in figure 7 below.

Figure 7: BRF+ Search

Navigate to the GET_AGENT table as shown in figure 8. Create one row in the table for each workflow step. In column CREQUEST_TYPE enter the name of your custom change request type. In columns OTYPE and OBJID enter the object type (e.g. user / organization) and the corresponding value respectively.

Figure 8: BRF+ table GET_AGENT

Test your scenario – custom change request type


This is a good point to test your new change request type and workflow template. From the supplier role menu in SAP Master Data Governance, choose the Create Supplier menu item. You should see your custom change request type in the drop-down box of the create supplier screen. Select your custom change request type, enter the new business partner details, and approve the individual workflow steps until activation. At the end of the workflow you should have created a new Business Partner in the primary persistence. You can check by doing a search for the business partner id. You may have to change the receiver determination in BRF+ to ensure you can approve all the workflow steps.

Test your scenario – inbound web service with custom change request type


If you have confirmed that your new change request type and workflow template are working, repeat the test using the BusinessPartnerSUITEBulkReplicateRequest_In web service. Verify that after calling the web service a change request is created and the associated custom workflow is started. You can use transaction SWUD to find the workflow instance.

Exchanging process context information between SAP Business Workflow and NetWeaver BPM


In our example we decided to implement the exchange of process context information between NetWeaver BPM and SAP Business Workflow using asynchronous web services, in particular for exchanging the MDG change request id and business partner id. In NetWeaver BPM, two Trigger Events were created (wsdl1, wsdl2) for this purpose. The consumption side is technically realized by generating an ABAP proxy for each service and calling this proxy from a task in the SAP Business Workflow. In step 5 (figure 3) we use the Web Service call to transfer the change request id to the SAP NetWeaver BPM context. In step 7 (figure 3) we transfer the business partner id after activation in the back end. After transferring the change request id in step 5 (figure 3), we can use the change request id in NetWeaver BPM to generate a URL to the SAP Business Workflow log for this change request. We use this URL in our BPM process to send an e-mail notification to the original requestor with the link to the workflow log.

Generating a SAP Business Workflow log URL from a change request id in SAP MDG


https://<HOSTNAME>:<PORT>/sap/bc/webdynpro/sap/usmd_crequest_protocol2?SAP-CLIENT=<CLIENT>&SAP-LANGUAGE=EN&CREQUEST=<change request number>
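As a small illustration, the log URL can be assembled in ABAP from the change request number. Host, port, and client below are made-up example values:

DATA: lv_url  TYPE string,
      lv_creq TYPE string VALUE '12345'.   " change request number (example)

* Host, port, and client are made-up example values
CONCATENATE 'https://mdghost.example.com:44300'
            '/sap/bc/webdynpro/sap/usmd_crequest_protocol2'
            '?SAP-CLIENT=100'
            '&SAP-LANGUAGE=EN'
            '&CREQUEST=' lv_creq
       INTO lv_url.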



For sending the change request id to NetWeaver BPM, a task was added after the Set Initial Status of Change Request task at the beginning of the workflow (figure 9). In the object method section of the task maintenance screen (figure 10), a custom class and method are selected. The called method calls the web service proxy to transmit the change request id. To be able to select your custom class from the workflow task, it must implement the interfaces BI_OBJECT, BI_PERSISTENT, and IF_WORKFLOW (figure 13). You can find more information regarding ABAP OO for workflow in the references section at the bottom of the article.

Figure 9: Create business partner workflow with custom task
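For illustration, a skeleton of such a workflow class might look like the following sketch. The class and method names are made up, and the actual web service proxy call is only indicated by a comment:

CLASS zcl_mdg_bpm_notify DEFINITION PUBLIC CREATE PUBLIC.
  PUBLIC SECTION.
    " IF_WORKFLOW comprises BI_OBJECT and BI_PERSISTENT
    INTERFACES if_workflow.
    " Called from the workflow task; sends the change request id to BPM
    METHODS send_crequest_id IMPORTING iv_crequest TYPE string.
ENDCLASS.

CLASS zcl_mdg_bpm_notify IMPLEMENTATION.
  METHOD send_crequest_id.
    " Call the generated ABAP client proxy of the NetWeaver BPM trigger
    " event here, passing iv_crequest in the message payload.
  ENDMETHOD.
  " Minimal (empty) implementations of the workflow interface methods
  METHOD bi_persistent~find_by_lpor. ENDMETHOD.
  METHOD bi_persistent~lpor. ENDMETHOD.
  METHOD bi_persistent~refresh. ENDMETHOD.
  METHOD bi_object~default_attribute_value. ENDMETHOD.
  METHOD bi_object~execute_default_method. ENDMETHOD.
  METHOD bi_object~release. ENDMETHOD.
ENDCLASS.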






Figure 10: Custom task maintenance

After successful activation of the Business Partner, we can send the Business Partner id to NetWeaver BPM. A new task was created after the Set Status of Change Request task, close to the end of the process (figure 11). In the task maintenance (figure 12) we assign a custom method that in turn uses the web service proxy to call NetWeaver BPM.

Figure 11: Create business partner workflow with custom task

Figure 12: Custom task maintenance

Figure 13: Interface tab of the class that gets called from the custom workflow task

Test your scenario – complete process


You can now call the inbound SAP Master Data Governance web service. In turn, a change request will be created and the change request id will be transmitted to NetWeaver BPM. After approval and activation of the Business Partner, the Business Partner id is transmitted to NetWeaver BPM. You can use transaction SXI_MONITOR to view incoming and outgoing messages.

NetWeaver BPM implementation considerations


While the focus of this article is not so much on NetWeaver BPM but more on the interfaces of SAP Master Data Governance, one important aspect with regard to intermediate message events in NetWeaver BPM should be mentioned here. In NetWeaver BPM it is necessary to define a message correlation between the incoming messages from SAP Master Data Governance and the process instance in NetWeaver BPM. Any unique id (e.g. a process id) can be used for this purpose. The unique id can be sent as part of the BusinessPartnerSUITEBulkReplicateRequest_In web service call, for example in the message header. Subsequent calls from SAP Master Data Governance to NetWeaver BPM can then transmit this unique id as part of the message payload. NetWeaver BPM uses the unique id to find the right process instance for the incoming message.

Summary

In this article it was shown how SAP Master Data Governance can be used as part of a NetWeaver BPM cross-system master data process. The article also highlights how the capabilities of the built-in governance processes can be easily extended using SAP NetWeaver BPM. The combination of SAP Master Data Governance with SAP NetWeaver BPM and SAP NetWeaver Master Data Management addresses a wide range of scenarios in the area of Enterprise Master Data Management.

Related Content