SAP MDM Jobs Mahindra Satyam

We are Mahindra Satyam (NYSE: SAY), a leading information, communications and technology (ICT) company providing top-class business consulting, information technology and communication services. Leveraging deep industry and functional expertise, leading technology practices and a global delivery model, we enable companies to achieve their business goals and transformation objectives.

We are powered by a pool of talented IT and consulting professionals across enterprise solutions, client relationship management, business intelligence, business process quality, operations management, engineering solutions, digital convergence, product lifecycle management, and infrastructure management services, among other capabilities. Our development and delivery centers in the US, Canada, Brazil, the UK, Hungary, Egypt, UAE, India, China, Malaysia, Singapore and Australia serve numerous clients, including several Fortune 500 companies.

We are part of the $7.1 billion Mahindra Group, a global industrial conglomerate and one of the top 10 industrial firms based in India. The Group’s interests span financial services, automotive products, trade, retail and logistics, information technology and infrastructure development.
Designation SAP MDM Consultant
Job Description  Role - SAP MDM Consultant

 Experience -7 - 12 Years

 Work Location – Bangalore/Hyderabad/Chennai

 Type of Position – Permanent

 Lead time to Join – ASAP

Roles & Responsibilities:
 Should have 7 – 12 years of SAP MDM experience.

 The consultant should have handled a minimum of 1 – 4 end-to-end implementations in SAP MDM.

 Candidate should be a quick learner.

 Ideally, the candidate should be flexible to travel for short durations.


Desired Profile
(Details as per the Job Description above.)



Experience 7 - 12 Years
Industry Type IT-Software/ Software Services
Role Project Lead
Functional Area ERP, CRM
Education UG - Any Graduate - Any Specialization,Graduation Not Required
PG - Any PG Course - Any Specialization,Post Graduation Not Required
Location Bengaluru/Bangalore, Chennai
Keywords SAP MDM
Contact Vijayanath Siddhareddy
Satyam Computer Services Ltd
Telephone 9962090069
Email Vijayanath_Siddhareddy@mahindrasatyam.com
Website http://www.mahindrasatyam.com
Job Posted 30 Mar

SAP MDM-Data Migration consultant, Singapore

Summary
Experience: 3 - 8 Years
Location:
Singapore
Compensation: Less than 5,000 (salaries are in Singapore $)
Education: UG - Any Graduate - Any Specialization,Graduation Not Required PG - Any PG Course - Any Specialization,Post Graduation Not Required
Industry Type: IT-Software/ Software Services
Role: EDP Analyst
Functional Area: ERP, CRM
Posted Date: 11 Apr

Desired Candidate Profile


Must have 5 years of related experience

With functional experience in the data migration/conversion process - master data and transaction data analysis

With 3-5 full-cycle SAP implementations, a minimum of 2 cycles of which involved global solution roll-out/deployment

Familiar with SAP interfaces in a typical Manufacturing/Logistics/Finance environment

Experience in the pharmaceutical industry

Experience in using SAP migration/conversion tools

Basic ABAP program development experience is necessary

Please email your resume to jobs@wdcasia.sg for a confidential discussion.

Job Description

Lead data conversion/migration process and coordinate end-to-end conversion

This role assists clients in the selection, implementation, and production support of packaged application solutions.

Use in-depth consulting skills, business knowledge, and packaged-solution expertise to effectively integrate packaged technology into the client's business environment in order to achieve the client's expected business results.


Required Skills: 

Has undertaken large-scale data migration - familiar with the multiple-loads-and-tests approach and with cutover planning and execution.

Must have experience in managing data conversion from legacy systems to SAP, and in coordinating with the business on the data conversion process

A candidate who knows the BackOffice data migration tool EzMap will have an advantage

(Plus the candidate requirements listed under Desired Candidate Profile above.)
Keywords: SAP, MDM, Data Migration

Company Profile

Web Development Company Limited (WDC Ltd.), founded in 1998 and an ISO 9001:2000 company, has over 10 years in software development. WDC today is a well-established brand and is amongst the prominent players in IT consulting and software development in South East Asia. WDC grows its software development practice through cross-skilling and technology enablement for employees across business divisions. With various growth options, WDC aims to attract the best talent and boasts lower-than-industry attrition. With significant growth in revenues, profitability, human resources, partnerships, infrastructure and clientele, WDC has handled a number of prestigious projects for clients such as IGA, Nestle, Avaya, Medtronics, Lexmark, Shell and Telstra, and has earned distinctive awards and acclaim in the domestic and international markets.

WDC Ltd. is the first company in Asia to be awarded the BEACON award for best partner and outstanding performance supporting IBM Global Services. WDC Ltd. has a current employee strength of 1000-plus professionals working across various IT verticals such as SAP, Mainframe, PeopleSoft, Web Technologies, Microsoft Technologies, Oracle Applications, DBMS, DW/BI, Lotus Notes and other niche skills. Above all, the team at WDC has an undying passion to excel in all areas of business operations. The executive and management team, along with all operational and functional managers at WDC, accept stiff business challenges as opportunities to prove the company's capabilities and go all out to exceed the highest customer expectations. The testimony to this lies in the fact that WDC has 100% client retention, and significant revenue growth has been achieved from the existing customer base.

• WDC is part of the "Business Partner Tech Council 2008", which represents a handful of chosen successful Business Partners who work closely and consult with IBM.
• WDC was also nominated to be part of the IBM "Asia Pacific Partner Advisory Council 2007".
• WDC is the 2nd company from India to have won an IBM Beacon Award – "Best Regional Consultant and System Integrator, Asia Pacific 2005".
Contact Details
Company Name: WDC Consulting Singapore Pte Ltd
Executive Name: Ritesh Srivastava
Address: Not Mentioned
Telephone: +6568279760

SAP FICO with MDM Client of Summit HR

Summary
Experience: 3 - 5 Years
Location:
Bengaluru/Bangalore
Education: UG - Any Graduate - Any Specialization PG - Any PG Course - Any Specialization
Industry Type: IT-Software/ Software Services
Role: Software Developer
Functional Area: IT-Other
Posted Date: 13 Apr

Desired Candidate Profile

SAP FICO with MDM (master data management)

· SAP MM

· SAP SD

· They should have exposure to the following:

· Master data / transaction data / LSMW

· Should have done at least 1 implementation project

· 3-5 years of relevant experience

Job Description

(Same as the Desired Candidate Profile above.)


Keywords: FICO,MDM

Company Profile

Client of Summit HR
Contact Details
Company Name: Client of Summit HR
Website: Not Mentioned
Executive Name: Usha R
Address: Not Mentioned
Email Address: ushar@summithrindia.com
Telephone: 80-30731084

SAP MDM -- SUPPORT PROJECT -- NIGHT SHIFTS -- CHENNAI

Summary

Experience: 2 - 4 Years
Location:
Chennai
Education: UG - Any Graduate - Any Specialization,Graduation Not Required PG - Any PG Course - Any Specialization,Post Graduation Not Required
Industry Type: IT-Software/ Software Services
Role: Software Developer
Functional Area: Application Programming, Maintenance
Posted Date: 11 Apr

Desired Candidate Profile

The opening is for a support project - candidates should be open to night shifts.

Should have been involved in at least one support project.
Years of experience: 2 - 4 years.
Job Responsibilities:

Implementation of the Approved Manufacturer's List

Reliability of the system, redundancy, recovery procedures, data integrity and loading, interface modifications, performance and tuning of the system, and upgrades.

Will be part of a global team working closely with business and IT organizations

Job Requirements:
Understanding of MDM and its architecture, knowledge of MDM tools, and experience with data governance.

Project management experience

Unix and database experience (Oracle, MS SQL) is highly desirable

Baan/SAP knowledge is highly desirable.

Business:

Experience with ERP systems, materials management, and supplier onboarding.

Job Description

(Same as the Desired Candidate Profile above.)


Keywords: sap mdm, support projects, architecture

Company Profile

Greetings from Varmuk Soft Solutions Chennai.


We are pleased to introduce ourselves as a professionally managed human resources consultancy.

We help our clients find the right person for the right position at the right time through our human resource team. We have competencies directed towards helping our customers in recruitment. Our promise is to meaningfully utilize our acumen, focus on specific verticals, and appreciate client demands and their unique technology and domain requirements.

• We believe that candidates must match not only the job, but also the company

• We first understand the complexities of our client's culture and the JD

• We apply a systematic, innovative and collaborative consulting approach that ensures consistent delivery and satisfaction

• While recruiting a candidate for any role, position, level or function, we ensure there is no compromise in the quality of people

• We believe in meeting the client's requirements within a specific schedule and in a cost-effective manner

• We provide candidates with proper information, such as KRAs and required skills, to meet the employer's needs

Why Varmuk

• Improvement in the quality of employees

• Improvement in service and speed

• Help with present and future requirements

• Increased productivity through a reduced attrition rate

It will be our pleasure to provide our services to your esteemed organization. We would appreciate an early response from your side. If you have any further queries or require any further details, please feel free to get in touch with us.

Warm Regards,



Sree Ranjani R
Lead Recruiter | VARMUK SOFT SOLUTIONS
Mobile +91-95001 28743 | Direct +91-44-4208 2773 | Skype: sreeranjani.r | www.varmuk.com | Follow us on Twitter @ varmuk
Contact Details
Company Name: Varmuk Soft Solutions
Executive Name: Sree Ranjani R
Address: Not Mentioned
Telephone: 42082773

Moving Your Data to a New System?

Author: Duane Failing, Senior Consultant at Utopia, Inc.
It's Just Like Moving Your Stuff into a New House!
I recently had the opportunity to discuss the subject of getting data ready to move to a new system with one of my clients. After the standard presentation, I made the following analogy to help the idea sink in. Imagine two established adults getting married and moving their stuff into a new house. Both have complete sets of items, but they need to make decisions on what to take and what not to take.
A few cases to consider:
  • You go through every item that you currently use. Do I really need this item in the new house? What is its function or purpose? If I don't need it, then don't take it. If I need it, then does it need to be cleaned or repaired?

  • You go through the items that are stored in the attic or basement. These are the old items that you currently don't use. Do you want to keep them or are you going to write them off? What is their value? If you were to take them, where would you store them in the new house?

  • You must decide on duplicate items, like washers and dryers. You don't need them both, but how do you decide which one to take? Newer, better fit, more functional? How do you manage the negotiation - who wins and who loses? Maybe neither one wins, and you decide to buy a new one and not take the old ones.

  • Maybe the new house needs items that neither person currently has? What is the new standard, type, color, size?

This work can be very time consuming, especially if you have a lot of items to consider. You need to make decisions, sometimes against items that you have a history with but that will not fit well in the new environment. Clean things up and make the decision before you move everything over, rather than finding out afterwards that it doesn't work. Look forward: what do you really want?
If you do this well, it will make your future life a lot better.

Parametric Data - Don't let the "metric" scare you.

Authored by Peter Dahl, Senior Consultant, Utopia, Inc. 


It's funny how looking at different parts of a compound word gives different connotations. In our taxonomy world, parametric has nothing to do with the metric measurement system... it's all about parameters. Parameters are the "old" version of our current terms, attributes or characteristics. However, parametric also implies that there is order to the characteristics.

Why would we care about order? Ok, that's a tongue-in-cheek question; order is invariably a good thing. Seriously though, nowadays we seldom manage to run any data system - whether it is a mom-and-pop shop with a simple database of retail products, or a full-fledged ERP - without considering the value of an excellent taxonomy. And, until someone creates and donates a worthy full-fledged taxonomy to the public domain, it's usually developed in-house.

A good taxonomy will include all the elements that allow the advantages of database tools - a noun and modifier (or class), attributes (or characteristics), a short description and a longer, fuller description. Ideally, the attributes are ordered. Why ideally? Well, let's think back to my parametric term. The implied order in the list of attributes allows the use of the noun and modifier (class) and the attributes (characteristics) to automatically generate the full description. The order dictates, then, that the description will contain the relevant information about the object from most to least relevant. Why? Well, again, we can use database tools to make comparisons and allow the use of fuzzy logic in searches. If the data in a text block flows from most to least relevant, the most relevant attributes can be exact matches, while the least relevant can be fuzzy matches... which allows one to find something "close enough" in an emergency. Parametric data also allows for the most effective reviews for potential duplications.

The critical thing, of course, is the order ... and it is quite difficult to arrive at a perfect order. Five taxonomy experts will likely have five opinions on the order of any specific item.  To confirm that, check out product catalogs for five different bearing companies and you'll very likely find that none agree on all the attributes, or on the order of relevance.  My rule of thumb for dictating the order of attributes is that each successive attribute should narrow the set of possible matches the most. For example, the noun bearing will contain all possible bearings.  The class roller bearing is an immediate huge reduction in the bearing set. One then needs to look at OD, ID, width, material of construction and other such relevant attributes to determine which will reduce the set of potentials most.
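One crude way to operationalize this rule of thumb is to rank attributes by how many distinct values they take in a sample set - a rough proxy for how much a match on that attribute narrows the candidates. Here is a small Python sketch; the bearing records and attribute names are invented for illustration:

```python
# Hypothetical bearing records; attribute names are invented for illustration
bearings = [
    {"class": "roller", "od_mm": 52, "id_mm": 25, "width_mm": 15},
    {"class": "roller", "od_mm": 52, "id_mm": 20, "width_mm": 15},
    {"class": "ball",   "od_mm": 52, "id_mm": 25, "width_mm": 12},
    {"class": "roller", "od_mm": 62, "id_mm": 25, "width_mm": 17},
]

def distinct_values(records, attribute):
    """More distinct values means a match on this attribute narrows the set more."""
    return len({r[attribute] for r in records})

# Order attributes so each successive one tends to narrow the candidate set most
order = sorted(bearings[0], key=lambda a: distinct_values(bearings, a), reverse=True)
print(order)  # ['width_mm', 'class', 'od_mm', 'id_mm'] for this sample
```

A real taxonomist would weigh relevance as well as selectivity, but the idea of measuring how much each attribute cuts down the candidate set is the same.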

While one can only dictate the ordered attribute set and the resulting parametric description for one's own business, a well-defined and controlled taxonomy will allow the use of standardized tools to reduce the variability of data and/or goods, as well as helping ensure the data/items are easy to find and hard to inadvertently duplicate.

What is Master Data?

Authored by:  Jim Whyte, Director of Solutions Consulting, Utopia, Inc.

Definition:  Master data is reference data that is shared or exchanged across strategic applications, work processes, lines of business, or enterprises.

My definition has been personally field tested for the better part of 20 years and has served a number of Fortune 500 organizations very well. For me, the focus is on the sharing and leveraging of reference data across the various organizational boundaries that master data can and must cross.

The definition is based on the goals and objectives my former employer was trying to achieve back in the late 80's - ERP consolidation; moving from a decentralized, LOB/regional multi-national company to global, top-down, centralized management; and standardizing applications, infrastructure, business processes, roles and responsibilities, and reporting. To achieve that, we needed common reference data across our solution landscape. Does that sound familiar to you?

We were reducing the number of ERP systems, moving to four regional instances based on SAP software (common applications). This required a broader view of master data than most are willing to adopt today. We wanted to ensure that our four SAP instances were configured identically. It was the start of our implementation of enterprise data governance, data standards, and standardized master data content. We knew we couldn't achieve one without the other.

I often wonder if I would have the same deep belief in my MDM definition and support of centralized data governance if it had been possible to run our entire enterprise on a single instance back then. I might not have been as concerned with managing configuration table changes across four production instances (e.g. order types, pricing conditions, units of measure, currencies and exchange rates, in addition to customers, vendors and materials), or with a system development landscape with common data content across development, user acceptance, training and production environments.

We successfully implemented a global data warehouse a few years after our ERP consolidation, and we recognized that all fifty - yes, 50 - master data objects and hierarchies we maintained in our proprietary MDM application were either enterprise KPIs or global reporting dimensions in our data warehouse. That wasn't by design up front, but it became a benchmark for us when determining what additional reference data objects we should consider centrally managing on behalf of our enterprise.

So what is master data? It is somewhat in the eye of the beholder - but I firmly believe it is those data objects shared and exchanged across your enterprise, that are core to your business operations.  Additionally, they are the data objects that you will pick up and migrate from one strategic application to the other, as they are timeless. They are your enterprise "family jewels" and will be handed down from one generation to the next.

MDM Auto ID: Common Questions Asked

I was always thrilled by the idea of writing a blog. When I looked into the MDM blogs and articles section, I found that most topics have already been covered, so I thought of writing about the MDM Auto ID. I feel this is useful and informative for everyone, especially MDM beginners. This is my first blog, so I will try my best to make this information easily understood by everyone.

What is the MDM Auto ID field? In simple words, it is a field of type INTEGER which is automatically incremented by one for every new record created in MDM. It is also known as a Unique Identifier.
E.g. if you have 5 records in MDM with Auto IDs 1 to 5 and you delete the record with Auto ID 5, the next time a new record is created in MDM it will get Auto ID 6, not Auto ID 5. That is why it is a unique identifier: you can easily tell that somebody has deleted the record with Auto ID 5 and that it is no longer present.
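To make that behavior concrete, here is a tiny Python sketch - a toy model, not the MDM API - of a table whose Auto ID counter only ever moves forward:

```python
class AutoIdTable:
    """Toy model of the MDM Auto ID: an INTEGER that increments and is never reused."""
    def __init__(self):
        self.records = {}  # auto_id -> record
        self.next_id = 1   # the counter only moves forward

    def create(self, data):
        auto_id = self.next_id
        self.records[auto_id] = data
        self.next_id += 1  # deleting records never rewinds this counter
        return auto_id

    def delete(self, auto_id):
        del self.records[auto_id]

table = AutoIdTable()
for name in ["rec1", "rec2", "rec3", "rec4", "rec5"]:
    table.create(name)
table.delete(5)
print(table.create("rec6"))  # prints 6, not 5 -- the gap reveals the deleted record
```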

Commonly asked Questions:
1. Can we load our own value into the MDM Auto ID field, manually or using Import Manager?
No. The Auto ID field is populated automatically for every new record and incremented by one, and it is always displayed as read-only. So you can't set its value from a source file through Import Manager.

2. Can we syndicate the MDM Auto ID field to a target system?
Yes, you can easily map the Auto ID field to a target field and then syndicate the Auto ID.

3. In a CMDM scenario, where I send data from MDM to the distributing systems, can the MDM Auto ID field be used as a Global ID?
No, this is not feasible, because when records are syndicated to the target systems and come back to MDM with more field information, such as the remote system's local ID, you cannot make use of the remote key concept: a source field cannot be mapped to the MDM Auto ID during import.

What to do in this case: the solution is to create one more field of type integer/text in the main table, say MDM Global ID, and populate it using an assignment expression from the Auto ID, so that it carries the same values as the Auto ID. When the data comes back to MDM with more field information from a remote system, you can map the source Global ID (the MDM Auto ID) field to this MDM Global ID, and map a clone of the source Global ID field to the Remote Key during import. In this way, you can maintain the remote keys of the respective remote systems, which you can see in MDM Data Manager by right-clicking on a record and using Edit Key Mappings.

I assume that you are familiar with the concept of key mapping in MDM, so I won't discuss it here. I hope this gives everybody a fair idea of the MDM Auto ID field.

Data Standardization and Enrichment an integral part of Spend Performance Management

In every SPM project I invariably get asked the same questions: what is DSE? Is it part of the product? I will try to answer these questions here.

So what is DSE?
Data Standardization and Enrichment ensures the reliability of your spend analyses. DSE compiles global market data from external sources, including diversity and risk content, in order to validate and enrich your suppliers and classify your transactions.

1. Validation & Enrichment
Supplier identities are often vague due to poor data quality and lack of content. It's tough for an enterprise to maintain consistency across the multiple systems that hold supplier data, let alone maintain parent/child relationships across the entire list of suppliers they work with. Because of this, procurement organizations are not able to analyze their spend by supplier, and they cannot leverage total spend with a supplier parent.
Supplier validation and enrichment is the process of transforming unstructured and incomplete supplier data into valid, enriched information for reliable analyses and greater leverage. DSE standardizes the format and content of your suppliers, removes duplicates and then validates them against a global business directory. Once validated, the supplier records are enriched with information such as legal name, trade names, corporate ownership, risk indicators, and so on.
Before validation & enrichment

Supplier ID | Description      | Spend
SAP001      | SAP America      | $5500.00
BOBJ01      | Business Objects | $3500.00
SAP002      | SAP              | $2200.00

After validation & enrichment

Supplier ID | Description      | Cleansed Supplier | Spend
SAP001      | SAP America      | SAP Inc           | $5500.00
BOBJ01      | Business Objects | SAP Inc           | $3500.00
SAP002      | SAP              | SAP Inc           | $2200.00

By creating linkages between supplier corporate parents and their children, procurement organizations are able to see that all of this spend is against the same supplier.
Additional enrichment facilitates strategic decision-making such as diversity status (e.g. Minority-owned, Women-owned, Veteran-owned), credit/financial risk, and other attributes supporting corporate initiatives.
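As a quick illustration of the leverage this creates, the following Python snippet (using the sample rows from the tables above) rolls the spend up to the cleansed parent supplier:

```python
# Sample rows from the "after validation & enrichment" table above
rows = [
    {"supplier_id": "SAP001", "cleansed": "SAP Inc", "spend": 5500.00},
    {"supplier_id": "BOBJ01", "cleansed": "SAP Inc", "spend": 3500.00},
    {"supplier_id": "SAP002", "cleansed": "SAP Inc", "spend": 2200.00},
]

# Roll spend up to the cleansed (parent) supplier
total_by_parent = {}
for row in rows:
    total_by_parent[row["cleansed"]] = total_by_parent.get(row["cleansed"], 0.0) + row["spend"]

print(total_by_parent)  # {'SAP Inc': 11200.0} -- one parent, full leverage
```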

2. Data Classification
Classification is the process of assigning purchases to a standard structure such as UNSPSC, eCLASS or a custom structure. The issue at most organizations is that a single classification structure is not used across all procurement and financial transaction systems; hence procurement organizations are unable to effectively analyze their category spend across the entire enterprise. The benefits of using a single classification structure are:
  • A standard view of all goods and services purchased across the enterprise
  • The ability to leverage total spend to drive cost savings, monitor compliance, and rationalize suppliers
Below is a good example of data classification.

Unclassified Data

Product ID | Description | Supplier | ERP Category | Spend
X08092010  | Shoes       | Athletic | Clothing     | $5000.00
AJRD2009B  | Shoes       | Balance  | Footwear     | $2500.00
TP2008CRV  | Blackberry  | A-Mobile | Fruit        | $3500.00
PX2009BD   | Blackberry  | B-Mobile | Telephone    | $2300.00

Classified Data

Product ID | Description | Supplier | ERP Category | Standard Category | Spend
X08092010  | Shoes       | Athletic | Clothing     | Footwear          | $5000.00
AJRD2009B  | Shoes       | Balance  | Footwear     | Footwear          | $2500.00
TP2008CRV  | Blackberry  | A-Mobile | Fruit        | Telephone         | $3500.00
PX2009BD   | Blackberry  | B-Mobile | Telephone    | Telephone         | $2300.00
Note: the description is not the only field used for categorization. Depending on the vendor and the engine they use, many more fields may be involved in correctly categorizing your data.

Looking at the unclassified data, the business is in no position to find the total spend in the Footwear or Telephone category; once the data is classified, this question can be answered correctly and accurately.
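A minimal Python sketch of that answer, using the classified rows from the table above:

```python
# (product_id, standard_category, spend) from the classified table above
classified = [
    ("X08092010", "Footwear",  5000.00),
    ("AJRD2009B", "Footwear",  2500.00),
    ("TP2008CRV", "Telephone", 3500.00),
    ("PX2009BD",  "Telephone", 2300.00),
]

# Sum spend per standard category -- possible only after classification
spend_by_category = {}
for _pid, category, spend in classified:
    spend_by_category[category] = spend_by_category.get(category, 0.0) + spend

print(spend_by_category)  # {'Footwear': 7500.0, 'Telephone': 5800.0}
```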
Although it is not mandatory, it is very beneficial to use the DSE process, so it is important to talk to your business users and explain how DSE improves leverage. This is one of the important decisions that will drive project timelines. The next logical step is to decide which vendor you would like to work with.

As you have realized by now, the DSE process is not part of the SPM application. It is an additional service that the data goes through, which means (in most cases) you will need to send your unclassified data out of the SPM application to your third-party vendor for the DSE process. Each vendor has their own way of processing data. Following are some questions that you might want to ask yourself and your DSE partner to ensure the process is smooth:
  • How frequently do you want to send your data for the DSE process?
  • How extensive is your DSE vendor's business directory (number of businesses, geographic distribution, type of content)?
  • What supplier enrichment attributes are needed by your business users (diversity, risk, sustainability, clinical, parent/child linkages, franchises, sales, geocodes, etc.)?
  • What categorization structure (UNSPSC, eCLASS, custom categorization) would the business like to use?
  • How long does it take your DSE vendor to pass your data through their engine for the first iteration, and how long for subsequent iterations?
  • Last but not least, what do they charge you for this process?
Note: DSE partners usually take more time during the first iteration, as their engine is still learning your data. Subsequent iterations generally take less time.

In my next blog I will talk about how the SPM application handles all this DSE in its data model.

Lexmark on Using SAP NetWeaver MDM and SAP BusinessObjects Data Services

SAP BusinessObjects Data Services and SAP NetWeaver MDM are a perfect match when it comes to integrating and consolidating customer data spread across the enterprise. At Lexmark, a global provider of business and consumer printing solutions, this combination is applied in an MDM initiative for streamlining enterprise analytics. Listen to what Lexmark's Information Delivery Manager, Joe Young, says in this 4-minute testimonial.
For more details on combining SAP NetWeaver MDM and SAP BusinessObjects Data Services, you may visit this SDN Wiki space.

Import of PDF files using MDM GUI Clients

I have been watching the threads on SAP Master Data Management for the last 5 to 6 months. I noticed that SDN members keep asking about the import of PDF files and are often confused about this concept, so I think this will be useful for the SDN community whenever there is a requirement to load PDF files.
Pre-requisite: in order to load PDF files, Adobe Acrobat® Standard or Professional is required, and it needs to be installed on the machine that is loading the PDF - that is, the machine where you are using MDM Data Manager - not on the server machine. It is also required in order to generate thumbnails. If you don't install Adobe Acrobat Standard/Professional, you will get an error like the one below:
"Importing [English US] 'xyz.pdf'...failed.
Unable to open Adobe Acrobat.
See your administrator about installing Adobe Acrobat."

Procedure: PDF files cannot be imported through Import Manager. Instead, you need to upload the PDF files with the help of MDM Data Manager, using a data group. While uploading a file through the data group, you can store it either with the option Link to original file only or in the repository. Once you have uploaded all the files in the data group, you can use Import Manager to link main-table records with the corresponding PDF files via the PDF file names.
Some SDN members are often confused about the Allow Invalid PDFs property, which can be set in MDM Console. I want to throw some light on it.
Note: MDM considers a PDF invalid only if it is unable to generate a thumbnail for it. If you have installed Adobe Acrobat Professional on the machine that is loading the PDF, you don't need to set the Allow Invalid PDFs property in MDM Console; that is, right-clicking on the repository in the Console and setting properties -> Allow Invalid PDFs = Yes is absolutely not required.
References:
SAP Note: 1258537
MDM Console Reference Guide
MDM Import Manager Reference Guide
MDM Data Manager Reference Guide

MDM Expressions: Part 2

In continuation with previous blog: MDM Expressions: Part 1

So far I have discussed some of the functions in the previous blog. Let's continue with the use of some other functions, with suitable examples:

7. MID( ) : Returns a specified section of a string. Please have a look at how it is used for a project requirement in this thread.

8. IS_NULL( ) : Please refer to the usage of this function in this thread.

9. Qualified table validations : The functionality of expressions is limited for qualified tables, so I just want to make you aware of validating non-qualifiers inside a qualified table via this thread.

10. Lookup table expressions : Refer to this thread for details.

11. Field population from Tuple table fields : This is possible only if your Tuple table is single-valued. For more details please refer to this thread.

12. Assignments to a field of type Lookup : This is not possible, as you get the error below:
"Assignment operation failed: A runtime error occurred while evaluating an assignment, validation or calculated field." For more details please refer to this thread.

13. Difference between a Calculated field and an Assignment : Refer to this thread for details.

14. HAS_ALL_CHARS( ) : Refer to this thread for its usage.

15. Validation that a field does not contain 1) alphanumeric values, 2) negative numbers, 3) all zeros, 4) strings, or 5) special characters, and has only NUMERIC values : Thread ID


16. Use of the OR operator : This is useful when there is more than one condition. If any of the conditions is TRUE, it returns TRUE; it returns FALSE if none of the conditions is satisfied. Please refer to this thread for more details.

I hope this gives everyone a fair idea of how to understand and deal with the usage of expressions, after going through all of the above-mentioned threads.

References:
MDM Data Manager Reference Guide

MDM Expressions: Part 1

I am not here to tell you the basic functionality of functions and operations, as you can read and understand that easily from the MDM Data Manager Reference Guide. I have a keen interest in writing expressions and have already answered a lot of threads in SAP Master Data Management, so I have consolidated some of these good threads for your reference. I think this blog will be really helpful for the SDN MDM community in order to understand and get familiar with the use of operations, functions and more under expressions.

What is an MDM expression? MDM expressions are Excel-like formulas that are evaluated by MDM and return a distinct value for each record.

Let's start with the use of some Functions with suitable examples:

1. SYSTIME( ) : If you write SYSTIME(0) in an expression, it gives you the current date and time, and you can build on this accordingly.
E.g. SYSTIME(1) adds 1 day to the current date, i.e. tomorrow.
SYSTIME(NEGATE(1)) subtracts 1 day from the current date, i.e. yesterday.
   Business requirements where you can use this function:
   Comparing the system date with a date field
   Assigning a constant time to a field in Data Manager
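As a rough Python analogue of this behavior (plain illustration, not MDM expression syntax):

```python
from datetime import datetime, timedelta

def systime(days_offset):
    """Rough analogue of SYSTIME(n): current date/time plus n days."""
    return datetime.now() + timedelta(days=days_offset)

print(systime(0))   # now, like SYSTIME(0)
print(systime(1))   # tomorrow, like SYSTIME(1)
print(systime(-1))  # yesterday, like SYSTIME(NEGATE(1))
```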

2. TRIM ALL, TRIM LEFT and TRIM RIGHT : These functions trim leading and trailing spaces only, not special characters. If SAP could provide such a function for removing special characters in the future, that would be a great help when writing expressions for assignments. In the meantime, see:
Special Characters Removal Using MDM Import Manager

3. Callable functions : You can write an expression in the Validation tab, and if you have set the property Callable = Yes, you can use this expression in any other validation/assignment expression; the validation name will appear in the Functions tab of the expression editor. Please refer to this thread for more details.

4. Concatenate : The Concatenate function always inserts a comma (,) delimiter between two field values. Alternatively, you can use the & operator, which joins values without any delimiter. Thread ID
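A quick Python illustration of the difference (not MDM syntax):

```python
fields = ["Lexan", "IP", "300"]

print(",".join(fields))  # Concatenate-style: comma-delimited -> Lexan,IP,300
print("".join(fields))   # &-operator-style: no delimiter    -> LexanIP300
```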

5. Concatenation of attribute values : This is possible using an Add Branch assignment. Please refer to this thread for more details.

6. Language field validation : Multilingual validations cannot validate the inherited language if Data Manager is opened in the primary language, and vice versa. For more details please refer to this thread.

MDM Expressions: Part 2

SAP TechEd 2010 is Near - See the EIM Tracklist

Benefit from the broad spectrum of lectures and hands-on sessions in the Enterprise Information Management area that will be given at this year's TechEd in Berlin (October 12 - 14, 2010) and Las Vegas (October 18 - 22, 2010). Select the EIM sessions (Berlin/Las Vegas) that best suit you regarding content and the level of expertise required.
As a teaser, I'd like to pick just a few sessions per EIM discipline, hoping you'll get hungry for more from the broad offering:
Overview
Enterprise Data Warehousing and SAP NetWeaver Business Warehouse Accelerator:
  • EIM 201 Enterprise Information Management and Metadata Management Around SAP NetWeaver BW 7.3 (Berlin/Las Vegas)
  • EIM 261 Utilize SAP NetWeaver BW Accelerator and SAP BusinessObjects Explorer to Analyze Huge Amounts of Non-SAP Data (Berlin/Las Vegas)
  • EIM 300  SAP NetWeaver Business Warehouse 7.3 Feature List Overview and Roadmap (Berlin/Las Vegas)
Data Integration and Quality Management
  • EIM 105  Product Preview: The Next Generation of Data Quality Assessment and Analysis (Berlin/Las Vegas)
  • EIM 106  What's New for SAP BusinessObjects Data Services 4.0: Vision and Roadmap (Berlin/Las Vegas)
  • EIM 160  Getting Started with SAP BusinessObjects Data Services (Berlin/Las Vegas)
Master Data Management and Data Governance
  • EIM 162  How to Jumpstart Your SAP NetWeaver MDM and SAP NetWeaver BPM Projects with SAP NetWeaver MDM Quick Starter Packages (Berlin/Las Vegas)
  • EIM 200  Master Data Management: SAP's Strategy and Roadmap (Berlin/Las Vegas)
  • EIM 263  SAP NetWeaver MDM and SAP BusinessObjects Data Services Integration (Berlin/Las Vegas)

I hope you'll find what you need for your expertise in the individual EIM disciplines. Please note that this personal track list is just an excerpt of all EIM-related sessions. For a complete overview, see the links above.
Kind regards,
Markus Ganser

Matching & Merging: Equals Vs Token Equals (Part 1)

I came across a thread where the author was concerned about the scores of matching duplicates. He was getting matching scores different from what he had set when executing strategies; he just wanted to know how this happens, and so do other SDN MDM members.
I divide the difference between "Equals" and "Token Equals" into two parts. This blog contains the information for executing strategies with Function = Equals. I will try my best to make this difference easily understood by everyone.

Working with Function "Equals"
For executing matching, you need to define a Transformation, a Rule and a Strategy.
Open MDM Data Manager and go to Matching mode:

1. Define a Transformation:
[Screenshot: Transformation definition]

2. Define a Rule: in this Rule, "Matching Material Description", I have included the Transformation just created above and set Function = Equals and the other parameters as shown in the screenshot below.
[Screenshot: Rule definition]

3. Define a Strategy: in this Strategy I have included the Rule "Matching Material Description" created above and set the other parameters as shown in the screenshot below.
[Screenshot: Strategy definition]

Now coming to Data Manager, Record mode: I have four records with values for the field Material Description, as shown below:
[Screenshot: Records with Material Description values]

Since we have set the Rule property Function = Equals, it treats these 3 tokens as a whole, that is, as one token. Let's see the scores when we execute the Strategy.
[Screenshot: Executing the Strategy]

After executing the Strategy, we get the following scores in Matching mode:
[Screenshot: Matching scores]

With Function = Equals, all the tokens are treated as one individual token.
It tells us that there are 2 duplicate records with Material Description "Lexan IP 300", so their Success score is 100, as defined in the Rule above.
"Lexan IP 600" is different from "Lexan IP 300", so it shows the Failure score of 10.
"Lexan ID 900" is different from "Lexan IP 300", so it shows the Failure score of 10.

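In other words, Equals behaves like a plain whole-string comparison. A minimal Python sketch of this scoring (an approximation for illustration, not the MDM engine itself):

```python
def equals_score(a, b, success=100, failure=10):
    """Function = Equals: the whole description is one token - exact match or not."""
    return success if a == b else failure

print(equals_score("Lexan IP 300", "Lexan IP 300"))  # 100 (duplicate)
print(equals_score("Lexan IP 600", "Lexan IP 300"))  # 10
print(equals_score("Lexan ID 900", "Lexan IP 300"))  # 10
```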
Now it is the turn of the Token Equals example, which follows in Part 2.

References:
MDM Data Manager Reference Guide

Matching & Merging: Equals Vs Token Equals (Part 2)

In continuation with the previous blog: Matching & Merging: Equals Vs Token Equals (Part 1)

Through this blog, I will try my best to make the difference between Equals and Token Equals easily understood by everyone, and to show how, in the case of Function = Token Equals, the scores turn out different from what we set.

Working with Function "Token Equals"
By now you know that for executing matching, you need to define a Transformation, a Rule and a Strategy.
Open MDM Data Manager and go to Matching mode:

1. Define a Transformation:
[Screenshot: Transformation definition]

2. Define a Rule: in this Rule, "Matching Material Description", I have included the Transformation just created above and set Function = Token Equals and the other parameters as shown in the screenshot below.
[Screenshot: Rule definition]

3. Define a Strategy: in this Strategy I have included the Rule "Matching Material Description" created above and set the other parameters as shown in the screenshot below.
[Screenshot: Strategy definition]

Now coming to Data Manager, Record mode: I have four records with values for the field Material Description, as shown below:
[Screenshot: Records with Material Description values]

Since we have set the Rule property Function = Token Equals, it treats these 3 tokens as separate (distinct) tokens. Let's see the scores when we execute the Strategy.
[Screenshot: Executing the Strategy]

After executing the Strategy, we get the following scores in Matching mode:
[Screenshot: Matching scores]

Now you may wonder: with Material Description "Lexan IP 300" we get the expected score of 100, which we defined in the Rule, but what about the other two records? How does the score of 20 arise for Material Description "Lexan ID 900", and the score of 50 for Material Description "Lexan IP 600", since we never set these scores when defining the Rule?

The logic is pretty simple:
First, with Function = Token Equals, each token is considered a separate token (identity), unlike Function = Equals, where the 3 tokens are considered one single token.
Second, the score is given by the formula below:
Score = Success * Number of Unique Matching Tokens / Total Number of Unique Tokens

Now let's see the score for each of these Material Descriptions:

For Material Description "Lexan IP 300" vs "Lexan IP 300":
Number of unique matching tokens: 3 (Lexan, IP and 300)
Total number of unique tokens: 3 (Lexan, IP and 300)
So the score is 100 * 3 / 3 = 100.

For Material Description "Lexan ID 900" vs "Lexan IP 300":
Number of unique matching tokens: 1 (Lexan)
Total number of unique tokens: 5 (Lexan, ID, IP, 300 and 900)
So the score is 100 * 1 / 5 = 20.

For Material Description "Lexan IP 600" vs "Lexan IP 300":
Number of unique matching tokens: 2 (Lexan and IP)
Total number of unique tokens: 4 (Lexan, IP, 600 and 300)
So the score is 100 * 2 / 4 = 50.
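The same formula in a small Python sketch (an approximation of the behavior, assuming simple whitespace tokenization, not the MDM engine itself):

```python
def token_equals_score(a, b, success=100):
    """Score = Success * unique matching tokens / total unique tokens across both."""
    tokens_a, tokens_b = set(a.split()), set(b.split())
    matching = tokens_a & tokens_b  # tokens the two descriptions share
    total = tokens_a | tokens_b     # all distinct tokens across both
    return success * len(matching) / len(total)

print(token_equals_score("Lexan IP 300", "Lexan IP 300"))  # 100.0
print(token_equals_score("Lexan ID 900", "Lexan IP 300"))  # 20.0 (1 of 5)
print(token_equals_score("Lexan IP 600", "Lexan IP 300"))  # 50.0 (2 of 4)
```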

My primary objective in these blogs is to show the score calculation. I hope you will no longer wonder why the matching score differs from what you defined in the Rule when you use the property Function = Token Equals.

References:
MDM Data Manager Reference Guide

SAP NetWeaver MDM Trends and the Changing Role of MDM Consultants, Part 1

There have been multiple queries about the features newly added to MDM after the recent activities - SAP acquiring Business Objects, the use of BPM for cMDM and collaborative master data processes, and enhanced features like the WS generator and Web Dynpro configurator added as new emerging trends in MDM 7.1.
In this series of two blogs I will try to list all the newly added technologies and scenarios, and we will also see how they change the MDM consultant's role in a project, what new skills and knowledge are required, and so on.
Until 2007-08 the focus was more on the MDM core services, like improving the performance of the MDM servers and of the import and syndication processes. A year later we saw MDM 7.1 released, and within a short span of time we now have 5 patches. The top differentiators between MDM 5.5 and 7.1 are:
1. Flexibility in data modeling - by introducing the Lookup (Main) and Tuple data types, we can now model anything under the sun in an MDM repository. Deeply nested structures, cross-linkages between multiple entities, and managing relationship data - all things one could only dream of until 5.5 - are now easily achieved in 7.1.
2. CTS+ ensures improved administration and life-cycle management for MDM projects. It also improves project life-cycle auditing compliance and monitoring.
3. Others worth mentioning are enhanced security measures (user passwords, activity logs), change-tracking support for qualified tables, and the use of DB views on slave repositories for specific enterprise reporting.
4. MDM 7.1 offers new functionality through the MDM PI Adapter, which enables direct communication between MDM and PI. The MDM PI Adapter delivers data to and pulls data from MDM by using the MDM Java API port functions. This enables tighter MDM-PI integration as well as message monitoring. It also supports asynchronous data flow (the exactly-once model) and acknowledgments from the target systems.
In part 2 of this blog we will focus on the changes in the process layer and the UI layer that are required for master data management: managing CRUD operations, data migration from SAP and non-SAP sources, and enhanced data quality. We will also see how this impacts MDM consultants working on MDM programs.

Data Migration: Installing Data Services and Best Practice Content

In a previous blog I discussed the options for data migration and the technologies used. We stated that SAP delivers software and content for data migration projects. In this blog I'll discuss the installation of the Data Services software and the loading of the best practice content.
In order to get started you can use online help or the bp-migration alias in SMP. In this blog I used the best practices link in online help. There is a lot of great collateral there to take advantage of; when I was ready to install, the combination of the content download (bp-migration) and the online help gave me everything needed to get started. It takes you to the ERP quick guide that literally walks through what you need, including a silent installation. I tried the silent installation with no luck, so I went to the appendix and followed the steps for the manual installation. I'm sure the silent installation works if everything is set up correctly, but I wasn't too worried about it, since I wanted to see the required steps for the manual installation anyway.
Experience with the manual installation
With the manual installation you install Microsoft SQL Express and Data Services, and do things like create specific repositories, load best practice content into Data Services, and ensure everything is ready to run. I had to do the manual installation between meetings, so it took me a couple of days, but in man-hours it probably only took half a day - and a lot of that was me reading the guide to ensure it was done correctly. There are a lot of scripts that you need to run, so I'd suggest setting the passwords to be the same as what would be set in the silent installation.
The installation of Microsoft SQL Express and Data Services is very straightforward. I did them on my laptop, which has other SAP software, and didn't run into any problems. I couldn't find a trial download of Data Services on SDN; hopefully your company has already purchased the product. Someone mentioned you can download it and then just get a temporary key from http://service.sap.com/licensekey, but I'm not sure. I also think there is a version of BI-Edge that includes it, but it's not in the basic BI-Edge; you at least need the "BI Edge with Integrator Kit". If you have experience with using Data Services as a trial, please post a response and share!
Once the basic installation is done you start to import the content into Data Services. The content is grouped into rar files, which you download and unzip. The import happens into a couple of Data Services repositories. The first repository is related to all the lookups that are required during the migration. The first content I unzipped had spreadsheets for IDOC mapping and files that will be used for lookups to validate data; all of this goes into the lookup repository. One example is the following screenshot, which is for HCM data. It has a description of each field, whether a lookup is required, and whether the field is required, along with other information.

Spreadsheet with HR IDOC information
At this point I wanted to know what else is delivered to help understand the IDOCs, so I explored a couple of things. The first is the documentation with the best practices in online help. In online help you'll find sample project plans and links to the ASAP methodology, which has data migration content. I haven't explored the project management content in detail; I'll do that and blog about it soon. You can also download the documentation from SWDC on the Service Marketplace; the documentation is the same in online help and in the download area on SWDC. The documentation includes a Word document for each of the objects. For example, for the HCM object mentioned above, the Word document is 32 pages and includes a description of the IDOC segments, number ranges, and process steps.
After the import I had datastores, jobs, and project content in Data Services. Examples are below:
A project and jobs were created for building lookup tables in the staging area for the relevant migration objects. These lookup tables will be used to validate the data when the migration jobs execute.
[Screenshot: Delivered project]

Data flows created in Data Services for the lookup table creation jobs:
[Screenshot: Data flows]
Datastores were created. The datastores provide local storage for temporary data for the migration, as well as linking to the SAP target system. I haven't yet created a datastore for the source system. The next step will be to update the DS_SAP datastore with the actual SAP system that will be the target of the migration.
[Screenshot: Data stores]
The second repository holds all the migration jobs, with the target being the IDOCs going to the SAP system. Once the import for this repository was completed, I had a project with jobs related to the IDOC structures:
[Screenshot: Project and jobs for migrating data]



From this screenshot you can see the jobs for some of the content objects, such as accounts payable, bill of material, cost elements, etc.

That was pretty much it for the manual installation. The last piece was to add the repositories to the Data Services Management Console, a web-based way to schedule and monitor the jobs. OK - now I'm ready for the post-installation work! Look for the next installment, which will discuss post-installation and starting to use the content!

Special Characters Removal Using MDM Import Manager

With the latest market trend, MDM-BO integration is emerging, as it results in cleansed data and a lot of other benefits. On the other hand, we don't have any function available under MDM expressions in MDM Data Manager for removing special characters. So, other than MDM-BO, do we have any alternative? Everyone will have different views on this, but I feel that MDM Import Manager is a very impressive tool for removing special characters effectively and doing other conversions before data enters MDM.
So I want to describe here some of the capabilities and functionalities of MDM Import Manager for cleansing data before it enters MDM.
When you right-click on a source field, you will find the property Set Value Conversion Filter -> Multiple, with these filters:
    1. Apply Operator
    2. Normalize
    3. Change Case
    4. Replace
    5. Accept Truncated Value
Points to note:
1. Which of these properties you can use depends on the type of the field defined in MDM.
2. You can set these properties only after mapping your source field to a target field.

Now, if you map your source field to an MDM field of type INTEGER:
after right-clicking on the source field -> Set Value Conversion Filter -> Multiple, you can use only Apply Operator.
You can apply the operators +, -, x, /, Equals, Null Equals, Round, Ceiling and Truncate to your source values.
These are very useful if, for example, you want your source field values to be multiplied by a certain number before coming into MDM; there you use the multiply operator "x". Similarly, the other operators can be used as per the requirement. A rough analogue in plain Python is sketched below.
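The sketch below is plain Python for illustration only, not the Import Manager itself; the factor 100 is an invented example:

```python
# Source values for an INTEGER field; the factor 100 is just an example
source_values = [10, 25, 40]
converted = [v * 100 for v in source_values]  # like Apply Operator "x" with operand 100
print(converted)  # [1000, 2500, 4000]
```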

Now, if you map your source field to an MDM field of type TEXT:
after right-clicking on the source field -> Set Value Conversion Filter -> Multiple, you can use Apply Operator (Append and Prepend only) as well as the other filters: Normalize, Change Case, Replace and Accept Truncated Value.
Here I will discuss only the Normalize, Change Case and Replace functionalities.
Sometimes we have a requirement to remove special characters like @, &, #, $, etc. So the question is when to go for the Normalize and when to go for the Replace conversion filter.
If the requirement is to remove everything, including spaces, then you should go for the Normalize functionality.
E.g. if a field has the value Lexan$@ ID& 300
the outcome after normalizing would be LEXANID300.
You can also make use of Change Case (UPPER CASE, lower case, Title Case, etc.) along with Normalize, as per your requirement. If you set Change Case = Sentence case along with Normalize, you will get the output Lexanid300.

But if you want to keep the spaces between tokens and only get rid of special characters, you should go with Replace. An added benefit is that you can remove specific special characters if required - in other words, custom special-character removal.
E.g. if a field has the value Lexan$@ ID& 300
and the requirement is to remove only the special characters @ and &, you thereby keep the $ as well as the spaces between tokens. Set Replace as shown in the screenshot below.
[Screenshot: Replace filter settings]

You will get the desired result, as shown below, where the $ symbol and the spaces between tokens are maintained.
[Screenshot: Converted values]
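A rough Python analogue of the two filters (for illustration only - the real work happens inside Import Manager):

```python
import re

value = "Lexan$@ ID& 300"

# Normalize: strip everything that is not a letter or digit, then upper-case
normalized = re.sub(r"[^A-Za-z0-9]", "", value).upper()
print(normalized)  # LEXANID300

# Replace: remove only the characters you name, keeping $ and the spaces
replaced = value.replace("@", "").replace("&", "")
print(replaced)    # Lexan$ ID 300
```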

I am not sidelining the benefits of MDM-BO integration, but just wanted to make everyone aware that MDM Import Manager is also proficient, up to a certain extent, at cleansing data. I hope this blog is useful and informative for everyone, especially MDM beginners.

Five key factors for Data Migration Success

Authored by Ashvinder Rana, Data Migration Lead, Utopia, Inc.

Many of us have been part of various data migration projects at some point or other. We all know data is important. But can we name at least five key factors that will ensure a successful go-live? Well, here are my two cents (although I believe they're worth a lot more). Though I know there are more than just these five, the following are my top five.

  • Understanding the data - it is absolutely imperative for the data team to understand the legacy data, how it exists in the legacy systems, and how it is structured there.

  • Resource scheduling - of course you want the right resources, with the right skill sets and knowledge, to do your data migration project! Imagine calling a plumber to fix your AC at home! I'm sure we would never think of doing that. Just the same, we should ensure the data team houses resources with the data skill sets the project requires.

  • Scoping the requirements accurately and on time - not too early, not too late. You know how the saying goes: "there's a time and place for everything!" You don't want to scope out the data requirements before the business blueprint requirements are signed off on. If you do, then just be prepared for those "change orders" to start rearing their ugly heads well into the beginning of the development phase. Unfortunately, we've all run into these way too frequently!

  • Data quality framework - ensuring that a data quality framework is in place is equally important among the things that lead to a successful data conversion. This implies not just having the means and tools to perform data profiling and analysis, but also a plan to identify data bottlenecks, recommended solutions, and/or a data cleansing plan of action.

  • Data validation vs. data testing strategy - projects need to ensure an effective data migration testing strategy is in place and is part of the data migration project. A clear demarcation should also be made between data validation and data migration testing. Data validation can comprise random sampling methods to ensure that the data is converted accurately as per the data mapping rules. A data migration testing strategy, however, should comprise a series of iterative "mock conversion runs" for all objects in scope, where the converted data is utilized by the business process / functional teams to thoroughly test the integration points and transactions as well. In addition, these iterative "mock conversion runs" also allow for validation of the conversion programs, conversion error analysis, and fixes - all of which will eventually lead to a "zero-error" data migration!

Can Data Quality Influence A Company's Ability to Sustain Environmental Initiatives?

Authored by:  Pawan Gupta, Senior Consultant, Utopia, Inc.

In reading through our various blogs, my colleague Tony Stypinski's blog The Greening of Data got me thinking about another aspect of data: data quality can help corporations become more efficient by supporting their "green" sustainability initiatives.

The "green" initiative is on the rise, with many companies scrambling to join the movement to be environmentally friendly and reduce their carbon footprint. The push can stem from a number of reasons: the desire to be environmentally friendly, the need to reduce costs, legislative requirements, customer preferences, and so on.

One real-life example that immediately comes to mind is UPS. The company has several initiatives to continuously improve its environmental performance and reduce its carbon footprint. Not only is UPS feeling the benefits, but so are its customers - they are feeling the relief in their checkbooks. UPS's environmental initiatives focus on improving efficiency, specifically by reducing miles flown and driven and by using the most fuel-efficient mode of transportation. This philosophy has made the company acutely aware of, and focused on, its environmental impact in how it conducts business and operates.

Now you are probably thinking that is all fine and dandy, but where does data come into play? In the case of UPS's initiative to improve efficiency, I can only assume the approach was first to provide drivers with optimized routes based on package destination, to reduce the miles driven to deliver packages. A route-planning application and good supporting data would be required for that. Also, due to the dynamic nature of fuel prices, UPS needs the capability to continuously monitor fuel consumption, including the type of fuel consumed, in order to determine fuel cost and emissions, ensure compliance, and decide which fleet uses the most fuel and which type of fuel should be used. UPS most likely could not have reached such critical decisions without good-quality data. For a company to sustain and improve its initiatives over the long term, it needs to continuously monitor fuel cost, emissions, and services offered to achieve the proper balance - a toy sketch of such monitoring follows below. Data governance can play a key role in enabling these initiatives, accelerating their adoption, and sustaining them so the benefits they offer are realized.
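As a toy illustration of that monitoring capability, the Python sketch below aggregates fuel cost per fleet and fuel type while flagging records whose missing attributes would silently skew the totals. The record layout and numbers are invented for illustration - this is not UPS data.

from collections import defaultdict

# Hypothetical fuel log records; real data would come from fleet telematics.
records = [
    {"fleet": "air",    "fuel": "jet-a",  "liters": 5200.0, "cost": 4940.0},
    {"fleet": "ground", "fuel": "diesel", "liters": 800.0,  "cost": 1120.0},
    {"fleet": "ground", "fuel": None,     "liters": 760.0,  "cost": 1050.0},  # bad record
]

def monitor(records):
    """Aggregate liters and cost per (fleet, fuel), rejecting records that
    fail basic data-quality checks instead of letting them distort totals."""
    totals, rejected = defaultdict(lambda: [0.0, 0.0]), []
    for rec in records:
        if not rec["fuel"] or rec["liters"] <= 0:
            rejected.append(rec)  # a data-quality problem, not a fuel problem
            continue
        bucket = totals[(rec["fleet"], rec["fuel"])]
        bucket[0] += rec["liters"]
        bucket[1] += rec["cost"]
    return totals, rejected

totals, rejected = monitor(records)
for (fleet, fuel), (liters, cost) in totals.items():
    print(f"{fleet}/{fuel}: {liters:.0f} L at {cost / liters:.2f} per liter")
print(f"{len(rejected)} record(s) rejected for data-quality reasons")

The point is the rejected bucket: without it, the bad record would quietly lower the cost-per-liter figure that the fleet decision is based on.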

No matter what a company's initiative is, it will always be influenced by data in some fashion. Whether your initiative is "green," pink, or purple with yellow polka dots, it will always need good-quality data to become a reality and succeed. Don't you agree?

Podcast: How SAP IT Uses BPM and Data Services for Post Merger Data Migration

 
Download the podcast (11 mins)!
One of the things I love about the BPM approach is its amazing ability to sync business and IT implementation. Nowhere does this show up more clearly than when applying BPM to what are commonly considered IT processes in order to involve the business.
As I've pointed out in my blogs before, Enterprise Information Management (EIM) is one such area. In companies with a less mature approach to managing their information and data, data quality issues are considered an "IT problem," and IT is left to figure out how to mitigate them - usually with varying degrees of success. But when BPM is leveraged to orchestrate data quality and governance into a company's business processes, the whole thing works much more seamlessly and efficiently.
Recently, I came across such a story here at SAP, where our IT group is facilitating the integration of some recently acquired companies. The team has leveraged our own technologies, SAP NetWeaver BPM and SAP BusinessObjects Data Services, to create a data cleansing, migration, and governance process. This transformed an IT-centric process into an evolving best practice that engages IT data analysts and LOB managers in accepting new customer data as it is migrated into SAP's systems. (A product-agnostic sketch of that accept/reject pattern appears below.)
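The podcast and the figure cover the real architecture; as a purely illustrative, product-agnostic sketch, the cleanse -> validate -> route pattern the team describes might look like this in Python (all fields and rules here are hypothetical, not SAP's actual logic):

from dataclasses import dataclass, field

@dataclass
class Customer:
    # A hypothetical customer record arriving from an acquired company's system.
    name: str
    country: str
    issues: list = field(default_factory=list)

def cleanse(c: Customer) -> Customer:
    """Automated cleansing pass (the role Data Services plays in the real solution)."""
    c.name = " ".join(c.name.split()).title()
    c.country = c.country.strip().upper()
    return c

def validate(c: Customer) -> Customer:
    """Rule checks; any failure turns the record into a candidate for human review."""
    if len(c.country) != 2:
        c.issues.append("country must be an ISO 3166 alpha-2 code")
    if not c.name:
        c.issues.append("name is empty")
    return c

def route(c: Customer) -> str:
    """The BPM-style decision point: clean records load automatically,
    questionable ones become a work item for a data analyst or LOB manager."""
    return "auto-load" if not c.issues else "human review: " + "; ".join(c.issues)

for raw in [Customer("  acme   gmbh ", "de"), Customer("Widgets Ltd", "Germany")]:
    print(route(validate(cleanse(raw))))

The payoff of putting a process engine around these steps is that the human-review branch becomes a managed business task rather than an ad hoc email thread.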


Figure 1: System Architecture of SAP's BPM-based data migration process
Listen to my latest podcast as I interview Oktavian Wagner of SAP IT's Merger and Acquisition Business Process and Application Migration team as he describes the solution they built.
The technical details of this story are also described in the forthcoming book, "Real World BPM in an SAP Environment" from SAP Press, available this fall.

Podcast Summary

Here is a short summary of the questions that I ask Oktavian in this podcast:
0:00        Introductions
0:40        What role does your team play in the integration of acquired companies?
1:15        What kinds of things does your team have to accomplish when acquiring a company?
1:52        Is your team using SAP NetWeaver BPM and SAP BusinessObjects Data Services to enable this post-merger data migration?
2:35        What is SAP trying to achieve with this project?
3:29        Why is data quality such an issue for SAP when acquiring companies?
4:05        Describe the solution your team has built at a high level.
5:02        Tell us how you are using SAP BusinessObjects Data Services.
6:11        Give us a little more detail about how SAP NetWeaver BPM and SAP NetWeaver Composition Environment fit into this solution.
6:58        In what ways is SAP NetWeaver Business Rules Management used in this solution?
8:30        Tell us about the user experience that was built to go with this new automated process.
9:30        Tell us about how your team worked with SAP Value prototyping to get your project built.

Dealing with Multilingual Data – Is it a Nightmare??

Authored by Prashanth K. Vaida, Data Migration Consultant at Utopia, Inc.


Jim Whyte, director of solutions at Utopia, Inc., wrote an interesting and educational article entitled Data Governance - It's Your Business Processes, Stupid... In the blog he stated "data governance involves addressing the framework for data-related decision making".

That statement triggered a thought: what if international data is involved in a data governance project? How difficult is it to profile, standardize, enrich, de-duplicate, or integrate data in multiple languages (e.g., Chinese, Japanese, or French)? How difficult is it to leverage the 20% that is technology to maintain this data? How difficult is it to analyze multilingual data? Is it really going to be a nightmare???

In my past professional experience with multilingual data, I have successfully leveraged the SAP® BusinessObjects™ Data Services tool to deal with such data. Handling multilingual data involves setting the code page and language packs at three levels: the data, the operating system, and the tool (a small illustration of the data level follows below). In my experience, SAP® BusinessObjects™ Data Services has proved very compatible and efficient with multiple languages, and its data cleansing packages allow your company to work with multilingual data successfully.
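Outside any particular tool, the "code page at the data level" point comes down to being explicit about encodings and normalization. Here is a minimal Python sketch under that assumption; the sample strings and legacy code pages are invented for illustration:

import unicodedata

# Hypothetical raw extracts: (bytes as stored in the legacy system, its code page).
# In a real migration, each extract's code page comes from the source system's spec.
extracts = [
    ("Müller GmbH".encode("latin-1"), "latin-1"),    # Western European legacy system
    ("東京商事".encode("shift_jis"), "shift_jis"),    # Japanese legacy system
    ("MÜLLER GMBH ".encode("latin-1"), "latin-1"),   # duplicate differing only in case/space
]

def normalize(text: str) -> str:
    """NFC-normalize and casefold so visually identical strings compare
    equal during matching and de-duplication across languages."""
    return unicodedata.normalize("NFC", text).casefold().strip()

# Decode each extract with its declared code page (never guess), then de-duplicate.
seen, unique = set(), []
for raw, codepage in extracts:
    text = raw.decode(codepage)
    key = normalize(text)
    if key not in seen:
        seen.add(key)
        unique.append(text)

print(unique)  # ['Müller GmbH', '東京商事']

Once everything lives in one normalized Unicode form, the usual profiling and matching techniques apply regardless of the source language.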

I am not saying the task at hand is easy, but it is definitely feasible, especially with the right tools and skill sets.

I am curious to hear how other organizations are dealing with multilingual data. What tool set are you leveraging?

SAP MDM Project - Key Stakeholders

Master data is crucial for any organisation to sustain continuous growth. We all know that business events - for example, new product introductions, M&A, or outsourcing - impact the overall quality of master data. SAP MDM provides one solution for ALL master data across industry-specific business processes and can be leveraged to continuously improve master data quality. MDM supports several scenarios:
1. Core MDM scenarios - Consolidation, Harmonization, and Central Master Data Management.
2. Horizontal scenarios - Customer Data Integration, Product Information Management, and improved Business Intelligence.
3. Industry scenarios - GDS and collaborative MDM for tailored needs.
4. Embedded scenarios, such as the SRM-MDM catalog.
For a client's business needs (read: master data), the applicable MDM scenarios define the scope of the MDM program, which is delivered using internal resources or ISVs/implementation partners. For any MDM implementation program, one of the initial tasks is preparing the project charter and identifying the different players who will be part of the program - this becomes crucial for the program's success. Any MDM program should broadly have three layers:
  • Vision/Strategic Planning
  • Decision-making and monitoring of the program (go/no-go?)
  • Execution of the program.
The hierarchy can look like this: Joint Management Council (strategic planning) -> Service Delivery Management (steering committee for decisions and monitoring) -> Project Manager (execution of the project) -> Project team (different tracks) -> Team leads and developers.
Below are the roles and responsibilities of the different stakeholders.
  1. Master Data Council - represents the top management/executives who own the MDM program, its fit with the overall IT plan, the definition of the MDM roadmap, etc.
  2. MDM Steering Committee - consists of data stewards and specialists from different divisions of the business, responsible for defining the processes and expectations. They also monitor program execution, more from the business-needs side than from the delivery side.
  3. Solution Architect - responsible for designing the overall solution landscape, the technologies used for interfaces, the design principles that bring in best practices, etc.
  4. MDM Program Manager - responsible for the MDM program from the delivery point of view, managing the overall team, which further consists of:
  • Track leads for related technologies - EP, XI, ECC, Data Migration
  • MDM functional consultants for different tracks/master data entities - Customer, Vendor, Product, etc.
  • Development team- MDM, EP, XI, ECC
  • Basis and Governance team
The team structure and size can vary depending on the complexity and the volume of business/master data, but the above hierarchy can serve as a first template for any MDM implementation program.

SAP Developer Network Latest Updates