ASUG Annual Conference Session: Managing Transports Centrally in an SAP Landscape Using Change and Transport System (CTS+)

Please join Storm Archer at the 2009 SAPPHIRE/ASUG Annual Conference in Orlando, FL, May 11-14. Storm will be presenting a session on the SAP enhanced Change and Transport System (CTS+). Below is the session information:
In an SAP landscape, there are different development, content, and configuration objects that require transport from a development system to a testing environment and finally into productive systems. Learn how to leverage the ABAP-based enhanced Change and Transport System (CTS+) for transporting changes across the entire SAP landscape, for both ABAP and Java objects. In addition, attendees will receive answers to questions such as "What is new in the area of CTS+?" and "What is planned for the future?"
The session will be offered:
May 14, 2009
11:30am-12:30pm
Room 220E
Hope to see you there!
Warmest Regards,
Storm Archer
Senior Product Manager
SAP Labs, LLC.

"Driven to Perform" Webcast: Managing Performance and Risk in a Down Economy – A Practical Approach to Optimizing Business Performance for Everyone

"Driven to Perform" Webcast: Managing Performance and Risk in a Down Economy – A Practical Approach to Optimizing Business Performance for Everyone

 

 

Managing Performance and Risk in a Down Economy – A Practical Approach to Optimizing Business Performance for Everyone
What differentiates you from your competitors? How can you improve performance in difficult times by finding and executing the most effective and efficient strategy, and then tuning it to achieve even better results? How can you achieve alignment across your organization and partner network? Don’t miss this important webcast, where you can get answers to all these questions. Come hear directly from the authors of Driven To Perform – Risk Aware Performance Management from Strategy Through Execution as they describe how to strategize, plan, execute, monitor, analyze, and optimize performance, all within the context of the myriad risks and compliance requirements that companies face today.
Driven To Perform provides a unified approach to performance, risk, and compliance management that can help you effectively manage performance across a global business network. Driven to Perform shows how to apply the principles of risk-aware performance management to your area of expertise, whether it is Finance, HR, Sales, Marketing, Service, Supply Chain, Procurement or Product Development. You will receive valuable context for all of these business areas as it relates to managing performance and risk, and learn the collaboration points between them, enabling you to optimize performance beyond your four walls.
The concepts of Driven To Perform are essential for those who are serious about driving transformation and who are looking to create a high-performance, data-driven culture that goes beyond mere concepts. Driven to Perform takes you from strategy through execution, with a focus on end-to-end business processes. A detailed case study shows how a corporate strategy is executed across an organization, fleshing out the meaning of risk-aware performance management so you see a practical use case for applying it to your business.

SAP NetWeaver MDM 7.1 Entered Unrestricted Shipment

Dear Community,
SAP NetWeaver MDM 7.1 has successfully completed the ramp-up phase and has now entered unrestricted shipment.

The highlights of this release are:

  • High flexibility in data modeling and support for complex data structures
  • Optimized inbound and outbound processing
  • Integration into SAP NetWeaver’s standard administration and life-cycle components
  • Pristine data quality (for example, through connectivity to SAP BusinessObjects Data Services)
  • Acceleration of service-based implementations
Our customers already have SAP NetWeaver MDM 7.1 in productive operation today and are enjoying the benefits: they have confirmed the optimized performance of the platform and reinforced that new features, such as the data modeling enhancements, are exactly what they were looking for.

If you want to learn more about SAP NetWeaver MDM 7.1, you can access our collection of informative material stored on SDN:

SAP NetWeaver MDM Capability Page 
SAP NetWeaver MDM 7.1 Overview Page
Opinions about SAP NetWeaver Master Data Management 7.1 (Blog)
SAP NetWeaver MDM 7.1 Features Overview (eBook)
Thread on SAP NetWeaver MDM 7.1 Documentation Updates

Kind regards,

Thomas Ruhl

We are Definitely faster...

In continuation of Oliver's blog If it were a hundred times faster...
 
Just a small thought: if the lifts in our building worked at the speed they did 10 years ago, would it make a difference in our reaching the office? All of us hate waiting for the lift to arrive and reach our floor. Hence I am with the users on their requirements.
However, as Vijay pointed out in Oliver's blog, everybody loves work to finish as soon as possible, but what is the limit?
  

For example, I remember one of my projects where the MDM repository used to take 3-4 hours to reload, which was required as a nightly activity (there was a reason why we needed it, and it was later solved). For this reason, the MDM server was unavailable for a long time, albeit at night. If we had needed this system across multiple geographies, we would have had to plan something else, as the current system took too long to finish the job.

Another example: if an organization's system or process is held up waiting for the customer master creation process while the customer is waiting for the product, then the system is creating a problem rather than solving one.
  

The work we used to do in 8 hours 10-20 years ago, we now do much more of in the same time.
Maybe we became faster, or the systems made us perform faster!


Ankur Ramkumar Goel has 8+ years of SAP experience in ABAP, BI and MDM, with implementation, rollout, maintenance, due diligence and strategy projects across Europe, the USA and APJ, and has been involved in MDM for 3 years. The blog contents don't necessarily represent my organization's positions, strategies or opinions.

Perspective on Improving Matching Performance of Large SAP MDM Repositories

Some immovable requirements result in replicating the data model of SAP ECC in SAP MDM. The result is a large repository with over 250 fields. Once this repository is loaded with over a million records and you start performing matching or searching through SAP Enterprise Portal, you will encounter several performance challenges.
A slow MDM system will never be accepted by business users, so it is essential to proactively identify and mitigate these risks. Clients have to be aware that replicating the data model in SAP MDM might be the easiest of the options, compared to building a solution on eSOA, but this approach risks poor real-time performance. For accurate matching results we use the Token Equals feature, but this results in huge lead times, especially while processing a million records with over 250 fields.
The following SAP document is a good starting point for improving performance: "How to Optimize an MDM Matching Process".
Perspective:
To enable faster matching and search results, one option is to use 2 separate repositories: one dedicated to matching and searching tasks, and the other being the parent repository with all fields. The dedicated matching repository must hold only the crucial fields, such as the matching fields and the primary and foreign keys. The Portal could then connect to this smaller repository for matching. Once the results are displayed on SAP Enterprise Portal, the user can choose to add, delete or update, and the resulting action would then connect to the main repository.
Keeping a smaller dedicated repository for matching also reduces loading time. You cannot use the Slave feature in the Console, as a slave repository must have the same fields as its parent. As per the SAP document, another good practice is to improve speed by using calculated fields in the smaller repository. These calculated fields hold trimmed values of the matching criteria: for example, First Name can be trimmed to its first three characters stored in one calculated field, its first two characters stored in another calculated field, and so on. Using the Equals feature in matching, performance can then be extremely fast, but per our analysis the results might not be as accurate as with Token Equals.
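To make the calculated-fields idea concrete, here is a minimal, purely illustrative Java sketch (the class, method and sample values are my own assumptions, not part of the MDM API): it derives the kind of trimmed key a calculated field would hold and compares two keys with plain equality.

public class TrimmedKeyDemo {

    // Derive a trimmed matching key: normalized to lower case, cut to maxLen characters.
    static String trimmedKey(String value, int maxLen) {
        String normalized = value == null ? "" : value.trim().toLowerCase();
        return normalized.substring(0, Math.min(maxLen, normalized.length()));
    }

    public static void main(String[] args) {
        // "Jonathan" vs "Jonathon": a token-based comparison sees two different
        // tokens, but the 3-character keys are identical, so an Equals match on
        // the calculated field surfaces the pair as a candidate duplicate.
        String key1 = trimmedKey("Jonathan", 3);
        String key2 = trimmedKey("Jonathon", 3);
        System.out.println(key1.equals(key2)); // prints: true
    }
}

The speed gain comes from comparing short, precomputed keys with plain equality instead of tokenizing full field values at matching time; the accuracy trade-off described above is also visible here, since unrelated names sharing a prefix would collide as well.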
Handling the dilemma of choosing precision with long delays versus faster results with average accuracy was a good learning experience for us. A lot of "what if" scenarios need to be run, with multiple options and the time taken for each: trying different calculated fields, different matching strategies and scores, and the choice of Equals or Token Equals for each matching field. Analysis of these performance improvements gives ideal insight into the approach, supported by quantified data. With this study of matching behavior, one should be able to identify the approach that yields accurate results in the shortest time.
A good practice during Blueprint workshops is to present all the results along with the choice of matching strategy, scores and threshold limits. If there is disagreement among client stakeholders while identifying the best matching criteria, statistical techniques such as average ranking, Spearman's rank correlation, etc. can be used. Since each project is unique, generalizing the approach is difficult. For business partners, to increase accuracy, using the calculated-fields approach for both First Name and Last Name would be more efficient than using calculated fields for First Name alone, for example.
Insight into the behavior of business users helps decide whether to choose Token Equals or Equals with calculated fields. You could choose matching criteria with Equals on calculated fields for everyday use by business users, purely for the high-speed results, while matching with Token Equals can be run on a periodic basis, say weekly or fortnightly, by a data administrator to identify possible duplicates. This dual approach might involve redundant activities but would ensure healthy data.
Data analysis using random sampling gives insight into the spread of master data across different categories such as country, organization, etc. Depending on the pattern of classification, you could filter records based on country, region, or category (in retail, for example, apparel, food, etc.). Filtering enables faster matching performance.
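As a purely illustrative sketch of this filtering idea (the class and field names are assumptions, not MDM objects), bucketing records by such a filter key before matching shrinks each comparison set:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MatchPartitioner {

    // Minimal stand-in for a master data record with a filterable attribute.
    static class RecordStub {
        final String id;
        final String country;
        RecordStub(String id, String country) { this.id = id; this.country = country; }
    }

    // Group records by country so that matching only compares records
    // within the same bucket instead of across the whole repository.
    static Map<String, List<RecordStub>> partitionByCountry(List<RecordStub> records) {
        Map<String, List<RecordStub>> buckets = new HashMap<>();
        for (RecordStub r : records) {
            buckets.computeIfAbsent(r.country, k -> new ArrayList<>()).add(r);
        }
        return buckets;
    }
}

With N records split evenly across k buckets, the pairwise comparisons drop from roughly N²/2 to about N²/(2k), which is where the faster matching performance comes from.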
The best practice is to stick to the principle that (global) master data should be stored in SAP MDM and the remaining transactional fields in the respective systems, such as SAP ECC. This enables a standardized data model and attributes for global use, instead of replicating the legacy or SAP ECC data model in SAP MDM.
Navendu Shirali is a Consultant in the SAP MDM Center of Excellence at Infosys. His areas of work include building new solutions, converting opportunities and master data consulting.

Using MDM Java APIs to retrieve and execute Matching strategies in MDM

Taking forward the build of my customised Data Manager using MDM Java APIs, please find this blog as the new addition to the series:
 Using MDM Java APIs to retrieve Taxonomy attribute Values                                 
 Using MDM Java APIs to add Text(type) taxonomy attribute allowable values
Before I demonstrate using the MDM Java APIs to retrieve matching strategies, let us understand what matching strategies are and how we create or add them in MDM.
A matching strategy comprises one or more matching rules and a pair of numeric thresholds. Each strategy can be executed for a set of one or more source records against that set, against the current search results, or against all of the records in the repository.
Matching strategies identify potential matches for each record based on the matching scores of the individual rules that make up the strategy and the thresholds that determine which records are potential matches.
1. As a first step, we create a matching strategy from within the Matching mode of the Data Manager. To add a new strategy to the list of strategies:

  • If necessary, click on the Strategies tab to make it the active tab.
  • Right-click in the Strategies pane and choose Add Strategy from the context menu, or choose Records > Matching > Strategies > Add Item from the main menu.
  • MDM adds a new matching strategy named "New Strategy" to the list of strategies, and highlights it for editing.
  • Include the columns against which target records are to be matched, say Material_Number in the current scenario.
  • Type the name 'Material Strategy' for the matching strategy and press Enter.
2. Now that we have created the matching strategy named 'Material Strategy', let us look at the MDM Java API code snippets required to execute it. Using the Java API, we first retrieve the matching strategies available in the MDM repository:

RetrieveMatchingStrategiesCommand retMatStr = new RetrieveMatchingStrategiesCommand(connection);
retMatStr.setSession(authUserSession);
try {
    retMatStr.execute();
} catch (CommandException e) {
    e.printStackTrace();
}
// We have only one matching strategy, so retrieve the first strategy's id (array index 0)
matchStID = retMatStr.getMatchingStrategies()[0].getId();
....where
connection and authUserSession hold the MDM connection and the authenticated user session, respectively.
After fetching the matching strategy id, we are left with the following two steps:
3. For any new record being created, we use the matching strategy id to execute the matching strategy and find matching records:

ExecuteMatchingStrategyForNewRecordValuesCommand exeMatstr = new ExecuteMatchingStrategyForNewRecordValuesCommand(connection);
exeMatstr.setSession(authUserSession);
exeMatstr.setStrategyId(matchStID);
exeMatstr.setSource(recs);
// Set the target either as an explicit list of record ids...
exeMatstr.setTarget(rids);
// ...or as a search object (the later call overrides the earlier one); an empty
// search on the main table matches against all records in the main table.
exeMatstr.setTarget(new Search(new TableId(1)));
try {
    exeMatstr.execute();
} catch (CommandException e) {
    e.printStackTrace();
}
....where
recs is the array of source records, i.e. the records to find matches for; currently only one record is supported
rids is the array of target record ids, i.e. the records to match against
setTarget(new Search(new TableId(1))) sets the target records in the form of a search object, here all records in the main table; since each setTarget call overrides the previous one, use either the record ids or the search object
4. After executing the matching strategy, we execute the RetrieveMatchedRecordsCommand to retrieve the records that matched the source record for the matching strategy that was just executed:

RetrieveMatchedRecordsCommand retMR = new RetrieveMatchedRecordsCommand(connection);
retMR.setSession(authUserSession);
retMR.setMatchingTaskId(exeMatstr.getMatchingTaskId());
retMR.setRecordId(new RecordId(-1));
retMR.setResultDefinition(new ResultDefinition(new TableId(1)));
try {
    retMR.execute();
} catch (CommandException e) {
    e.printStackTrace();
}
.....where
setMatchingTaskId(exeMatstr.getMatchingTaskId()) sets the matching task id fetched from the execute-matching-strategy command object; this identifier is a handle to the matching strategy execution from the previous step.
setRecordId(new RecordId(-1)) sets the source record id on which the matched records are based. For external record matching, the record id is -1.

As discussed in this blog, one can use these steps to execute matching strategies that already exist in MDM in a different way. One might argue for running parallel searches in MDM via the API to accomplish certain kinds of matching between records instead of executing matching strategies this way. I would say this is a feature provided by the Java API: it empowers developers to replicate Data Manager features over the web, and I personally feel that using it this way saves a lot of the development time that would otherwise be required to explicitly code such a matching strategy with the APIs.

"Driven to Perform" Podcast from SAPPHIRE Orlando 2009 on SearchDataManagement.com: Are you ready for corporate performance management?

"Driven to Perform" Podcast from SAPPHIRE Orlando 2009 on SearchDataManagement.com: Are you ready for corporate performance management?

Stephanie Buscemi, co-author of Driven to Perform – Risk-Aware Performance Management From Strategy Through Execution, did a podcast at SAPPHIRE 2009 discussing the key concepts of the book. This is a great way to get a quick overview!

Are you ready for corporate performance management?

Corporate performance management (CPM) is one of those hazy terms that means different things to different people, but the basic concept almost everyone can agree on is that CPM is the discipline of tracking progress against preset goals to make sure those goals are being met in as efficient a way as possible.

How to actually achieve effective CPM, which involves aspects of business intelligence (BI) and risk management, is where things get tricky. There are any number of performance management methods that companies can follow, but choosing one over another is often an exercise in guesswork.

To help you better understand the concept of CPM and develop specific steps to successfully implement and maintain a performance management framework, we're speaking with Stephanie Buscemi, vice president of marketing for enterprise performance management and governance, risk and compliance at SAP and the coauthor of the new book Driven to Perform.

In this 30-minute podcast, appropriate for both business and IT professionals, listeners will:


  • Learn just what elements make up CPM and how the discipline has evolved over the last several years (1:45).
  • Get an understanding of the current state of CPM adoption and the major stumbling blocks most companies encounter (5:30).
  • Find out how CPM demands a balance among people and process issues and technical demands (10:15).
  • Get advice on the sometimes overwhelming task of starting a CPM initiative and managing the CPM lifecycle (14:20).
  • Get tips for evaluating CPM technologies and how to make buying decisions that match your CPM needs (21:50).
  • Find out what you risk if you fail to develop a comprehensive CPM framework (25:00).
You can also download the podcast directly.
About the speaker: Stephanie Buscemi is vice president of marketing for enterprise performance management and governance, risk and compliance at SAP, where she is responsible for go-to-market plans, cross-product solutions, and product strategy. She has served in leadership roles within performance management and business intelligence over the past 14 years. Stephanie joined SAP from Hyperion/Oracle, where she was most recently senior director of global marketing. Prior to Hyperion/Oracle, Stephanie was at Business Objects, where she led in building the company's U.S. presence. She holds a B.A. from UCLA.

Changes in life...of all !

For most of us, there was someone who changed our path a lot…
 
It was my cousin because of whom I am in the SAP world. He gave me the idea of entering the SAP world and supported me in doing so. Without him I would not be where I currently am. Once in the SAP world, it was my dream to join the parent organization, SAP itself.
 
Then there was one of my managers in a previous organization who, over lunch, asked me to take up the new challenge of MDM, and that's how I entered the MDM world. There was certainly a history behind why he asked me, but the point is that this decision paved the way for me to join SAP.
 
I still remember when I first touched a computer in my engineering course, and I soon realized that my logical reasoning for programming was far better than my theoretical understanding of engineering subjects. It was clear that I had chosen the wrong stream for my engineering degree. However, someone had different plans, and I finally landed in the computing world after all.
 

One thing I learned for sure is that we all finally end up where we have to be; where we deserve to be…

Producing Printed Catalogs with SAP NetWeaver MDM

SAP NetWeaver Master Data Management is a multi-domain solution for master data consolidation and harmonization scenarios, applicable to customer, vendor, product, employee or custom-built data objects. When it comes to product data, there is a native SAP NetWeaver MDM feature to which I'd like to draw your attention: the definition and publication of printed product catalogs. For this scenario, SAP NetWeaver MDM provides a specific UI client application enabling you to expose MDM repository data in a catalog publication and make the required layout settings in a DTP-like environment. A new how-to guide provides you with insight into this client application (MDM Publisher), walking through a basic publication scenario and the key process steps involved.
I hope this guide gives you an overall understanding of the print publishing capabilities of SAP NetWeaver MDM.

Standards and the Role of Data Quality with Master Data

REFERENCE:
http://www.iso.org/iso/iso_catalogue/catalogue_ics/catalogue_detail_ics.htm?csnumber=51653
Let me quote what Wikipedia states about ISO 8000-110: "Data quality - Part 110: Master data: Exchange of characteristic data: Syntax, semantic encoding, and conformance to data specification" (revision of ISO/TS 8000-110:2008). It basically states the importance of bringing standardization to master data and the advantages of data quality in the master data space.
What do we get from this certification?
Most importantly, an MDM implementation is assured the advantage of a data quality program already available as part of the system, leading to the highest intangible of all: customer and vendor confidence that the data requirements are accurate.

I think we may all agree on three main MD benefits, viz.:

1. Application interpretable data requirement statement
2. Unambiguously labeled data
3. Automated gap analysis
While the data quality program we propose and undertake as part of due diligence with most clients may be a separate initiative that the organization needs to pursue, it should not be tied to the MDM program alone. Newer cutting-edge technologies and processes in today's industries have emphasized the growing complexity of information sharing; Business Suite 7, launched by SAP, is another way of looking at the same. The advent of the Internet and revolutions in interface design have resulted in companies being closely intertwined with each other and with the consumer (GDS for GS1 could be an apt analogy in the MDM world), which we as consultants should be able to leverage. With the simultaneous transfer of information constantly taking place, it is more important than ever for clients to share and display quality data, and MDM can ideally be the hub for that master data. Duplicates and obsoletes have crept in, swelling data volumes above the expected level; data quality is the key message here.
A QAS survey indicated that only 46% of organizations have their own documented data quality strategy, which sets an important precedent for the MDM program, and shows why we as MDM consultants should also focus on DQ functionality and on a due diligence study supporting the MDM implementation.
I welcome thoughts on the same.

Rajesh Iyer is involved in the strategic management of complex proofs of concept, pilots and custom demonstrations for clients, COIL engagements with SAP and various prospects in the SAP MDM space. He has earned several accolades from clients and customers. He has also been a key contributor in a technical pre-sales role, where he often had not only to prove out the capabilities of the SAP product in extremely demanding and challenging environments, but also to craft the correct messaging for the relevant groups evaluating the software against competitive products such as IBM MDM, Hyperion and Kalido.

SAP MDM Business Content and What Is In It For You

SAP NetWeaver MDM provides a generic infrastructure for multi-domain master data management. This is all perfectly described on the MDM home and the MDM 7.1 pages at SDN. However, bundled information about the business content delivered with the software and its benefits in implementation projects was somehow missing there, or at least less obvious to find.
Therefore, a new SDN site in the MDM space is explicitly dedicated to providing you with the required information at a glance. Check the new MDM Business Content page.
You may be aware that organizational intricacies are often an issue when it comes to designing the perfect data model to manage a company's master data. Repeated discussions and redesign cycles cost valuable time and effort. And once you are through with it, you'll probably spend even more time integrating it into your IT landscape! In such a situation, predefined business content for SAP NetWeaver Master Data Management can help to reduce the implementation effort, and save you time and money.
For a quick overview, watch this SDN presentation to get informed about the existing MDM business content and planned content enhancements.

Why Solaris for SAP?

I'm happy to announce the release of a new whitepaper: "Platform Design, TCO, and Cost-Effective Flexibility: The Role of Solaris in Support of SAP Enterprise Applications." What excites me about this new paper is that it really outlines the imperatives facing the CIO today and then provides answers to those challenges:

  • CIOs are being asked to cut costs for computing platforms, but must also increase flexibility to better meet business needs.
  • New versions of enterprise applications, such as SAP Business Suite 7, are driving companies to upgrade and consolidate.
  • CIOs are also under pressure to implement Green IT: that means less power demand and less heat generation in the data center.
  • Server consolidation and virtualization are increasing trends that require deft management of increasingly complex and diverse machine arrays across the enterprise.
  • Platform as a Service and Infrastructure as a Service (Cloud Computing for short) are driving decisions such as when it makes sense to move applications and infrastructure to the cloud to lower costs and provide faster provisioning.

Under these conditions, scalability is necessary. You need that scalability across heterogeneous existing hardware, and you need a truly top-tier operating system to orchestrate the pieces effectively. In Hasso's keynote speech, The Power of Speed, at SAPPHIRE last month, he used a few vivid examples of how the data retrieval paradigms used by most enterprises' operating systems vastly underutilize the capacity of the hardware you already have. He compares the cycle time of one processor to the number of cycles needed to retrieve data for business applications as "going to Mars to get a sip of water." It's quite astounding when you think of it that way, and I encourage you to watch the whole thing.

This paper tells the story of how the Solaris and SAP combination is ideal for meeting the challenges I have described above. It will make the SAP community more aware of how Solaris solves a lot of problems that people otherwise try to solve with many separate technologies, which can run you into hazardous and expensive territory when support and integration issues arise. In addition to racking up some of the most impressive SAP performance benchmarks, Sun has a facility based in Walldorf, the Solaris Lab for SAP, which allows other hardware vendors to certify their platforms for SAP on Solaris and also answers partner and customer questions on anything related to Sun, SAP and Solaris (saponsolaris@sun.com). Of course you can always visit our wiki too!

Many people know Solaris is the leading enterprise operating system. What many people don't realize is that Solaris is supported on over 1000 x86 and SPARC platforms and delivers the performance, stability and security that users and customers demand. Solaris has a platform for virtualization, for security, and a related file system, all in one package that is engineered to meet the needs of the enterprise. Solaris is a one-stop shop for meeting all of these challenges. The recent blog by Andre Bogelsack gives a great overview of how the University of Munich uses Solaris and its built-in virtualization capabilities to serve 82,000+ students. Pretty impressive!

You can get the paper by going to https://www.sun.com/offers/details/Solaris_for_SAP_Enterprise.xml, but I’d like to summarize some of the highlights.

This paper demonstrates the big trends I’ve outlined here: cloud computing, virtualization, server sprawl, and the need for Green IT. It then demonstrates how Solaris delivers a stable, reliable, scalable and innovative solution for all of these demands.

Customers benefit from the singular vision and engineering muscle that created Solaris, while, through OpenSolaris, they can also capitalize on the flexibility of open source.

The paper describes how easy it is to migrate your SAP installation to Solaris (which has binary compatibility, by the way, which many other operating systems don’t – so if you need to run a Solaris 8 application on Solaris 10, you can).

Then you can see how Solaris is optimized for virtualization and high-availability computing, even on ordinary commodity hardware. Customers who switch to Solaris have recorded an average 30 to 40% decrease in downtime.

Solaris is free. You can get started today.  Solaris can run on the full range of hardware available.   Additional details about Hardware Certification for SAP on Solaris x64 can be found at saponsolaris.com.

Again, I encourage you to download the paper and see for yourself. It’s a succinct document that illustrates clearly why the Solaris platform is ideal for running your SAP installation, and indeed your entire enterprise.

Information complements Intelligence...

Information brings/facilitates intelligence... however, correct information is the key to intelligence.

I came across an article describing how Coke is using RFID technology to collect information from the customer's perspective (Coke's RFID-Based Dispensers Redefine Business Intelligence). It is a wonderful utilization of technology. The collected information will be used to formulate correct and effective strategies and processes to meet specific demands.

Intelligence is always required. Data mining and analytics offer completely new environs of information. This experiment again emphasizes and proves that the right information is vital. I am not sure if the dispenser already captures the climate conditions and the type of outlet at the time of a drink order. This kind of information and these trends are very important for establishing or redefining strategy and rectifying mistakes (if any) to achieve more success.

The type of outlet is important for any company, since only right and complete information leads to right decisions. For example, at a burger or pizza joint, cold drinks will sell more, whereas at another kind of eating joint, beer will sell more. Moreover, time matters: in the evening, tea might sell more, while in the morning, juices will win.

The right information puts the right perspective on the reports clients regularly see or want to see. This simplifies the job of management; or maybe 'rectifies' would be the right word.

Ankur Ramkumar Goel has 8+ years of SAP experience in ABAP, BI and MDM, with implementation, rollout, maintenance, due diligence and strategy projects across Europe, the USA and APJ, and has been involved in MDM for 3 years. The blog contents don't necessarily represent my organization's positions, strategies or opinions.

What's new around the world of MDM - SAP guys should know

I see large SAP shops also running competitive products, and a partner has the cruel, cross-bearing job of delivering the best-of-breed solution, more so when he or she is more comfortable with the SAP ecospace.
I find the constant updates from other product vendors in the MDM space a good occasion to look at SAP MDM with a different hat on. While the information on the updates is sought from different sources, the inferences and observations come from my experience with the products and my interactions with key users and experts in the field.
Let us take, for example, a couple of updates that have been published in the MDM suites:
A) Kalido - an old horse with a niche fan following - has brought a matching capability into its matching engine. This uses bipartite graphs, which is a good feature. How do we compare this with SAP MDM? Personally, SAP MDM wins hands down; however, bipartite matching as run by Netrics has DQ global-ID functionality built in. Food for thought for SAP folks.
Ref:www.cnbc.com/id/30961009
B) Oracle - probably perceived as the least threat by the MDM players; it has decided to sell a product data cleansing solution in conjunction with its product data hub through one of its partners in the market. For SAP MDM this should not be a problem, as SAP has a clear leaderboard advantage over Oracle on the MDM front.
Ref:www.tcdii.com/oraclemdm.html
C) Last but surely not least - SAP's biggest threat, IBM MDM: Exeros is to be acquired by IBM. The impact: IBM WPC is now tougher and stronger, having profiling capabilities during the data discovery phase of an MDM program. My view from the SAP family and fan club: the BusinessObjects suite integration in MDM 7.0 and the further roadmap would really be the key to breaking free of stringent competition in this space. Siperian and DataFlux, who are players in this segment, would also see this as an opportunity to share the pie. MDM implementations would hence become more elaborate, to make sure data profiling is covered, and the diligence time required for it would be extended. Consultants and partners need to be aware of more integration techniques and prerequisites required during an implementation.
Ref:www.gartner.com/DisplayDocument?ref=g_search&id=976312
While I feel these are early days for slow-economy acquisitions that restate the pie under a new banner, it is surely challenging to see more products in this domain. If you felt implementation was tough, now you also have the changes and buy-outs to keep an eye on.
I would welcome thoughts and more insights on the products my fellow SDN bloggers have come across. I am sure these points are quite arguable; hence the idea of a blog: to have opinions and to substantiate them with demos.
Rajesh Iyer is involved in the strategic management of complex proofs of concept, pilots and custom demonstrations for clients, COIL engagements with SAP and various prospects in the SAP MDM space. He has earned several accolades from clients and customers. He has also been a key contributor in a technical pre-sales role, where he often had not only to prove out the capabilities of the SAP product in extremely demanding and challenging environments, but also to craft the correct messaging for the relevant groups evaluating the software against competitive products such as IBM MDM, Hyperion and Kalido.

Managing Product Hierarchies using SAP MDM

Managing global product hierarchies for an organization can be a huge challenge, especially when there are multiple systems and no ownership of hierarchy management. This blog does not aim to provide more information on importing or syndicating hierarchies; rather, I have tried to explain how the management of complex hierarchies can be a strong business case when implementing MDM.
The need to manage Product Hierarchy
Any organization offering products or services for sale needs to categorize its offerings systematically. A simplistic way to group different products is into distinct product lines. But many times this high-level grouping does not suffice, especially for organizations with large, exclusive product offerings. Hence, it is necessary to drill down to a logical level of hierarchy to which a finished good can be linked. This can be achieved by creating a logical hierarchy, which helps not only in categorizing the products but also in analytics and forecasting.
Building a Product Hierarchy
A few points to keep in mind while building a product hierarchy:
  1. Applicable globally for the organization
  2. Covers all Products & are categorized uniquely
  3. It should be at a logical level, with no more than 5 levels recommended
A product hierarchy can be built by grouping products by line, type, family, group, series, etc. The order and the levels of these categories must be identified carefully so as to avoid expansion or repetition of nodes.
Strong capabilities of SAP MDM
SAP MDM comes with a strong mechanism to build and manage hierarchies, and this is one of the strong business cases for implementing MDM in an organization. MDM allows you to easily manage complex product hierarchies, import them from Excel files and automatically create a tree structure.
SAP MDM provides features to add siblings and children, sort the tree, and so on. This makes it a strong tool. Additional functionalities like taxonomies, images, etc. add to its strength. The capability of SAP MDM to trigger workflows on the hierarchy table helps in better management of hierarchies.
Though a product hierarchy is lookup reference data for master data objects like materials and products, it can still be managed centrally using SAP MDM. Syndicating changes and additions in the hierarchy can be easily managed using SAP MDM and can help in having a synchronized, unified hierarchical product classification. Further, it can also be leveraged in catalog management, in SRM, and with BI to perform analytics. A small illustrative sketch of the underlying tree idea follows below.
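As a purely illustrative sketch (plain Java, not the MDM API or its data model): the tree-building idea behind importing a Line > Type > Family > Group > Series hierarchy can be pictured like this, with each record's category path inserted once and duplicate nodes reused.

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class HierarchyNode {
    final String name;
    final List<HierarchyNode> children = new ArrayList<>();

    HierarchyNode(String name) { this.name = name; }

    // Return the child with this name, creating it if absent, so repeated
    // category values map onto a single shared node (no duplicate branches).
    HierarchyNode child(String childName) {
        for (HierarchyNode c : children) {
            if (c.name.equals(childName)) return c;
        }
        HierarchyNode created = new HierarchyNode(childName);
        children.add(created);
        return created;
    }

    // Insert one record's category path, e.g.
    // root.insertPath("Line A", "Type B", "Family C").
    void insertPath(String... path) {
        HierarchyNode current = this;
        for (String level : path) {
            current = current.child(level);
        }
    }

    // Sort the whole tree alphabetically, mirroring the "sort the tree" feature.
    void sortTree() {
        children.sort(Comparator.comparing(c -> c.name));
        for (HierarchyNode c : children) {
            c.sortTree();
        }
    }
}

Feeding each row of an Excel export through insertPath reproduces the tree structure; keeping the number of levels small, as recommended above, keeps both the tree and its maintenance manageable.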

SAP UK and Ireland World Tour 2009: Your complete guide to Business Intelligence, all under one roof.

Join us on the UK leg of the SAP World Tour on the 15th July in Birmingham and discover how the latest tools and technologies from SAP and our partner companies worldwide deliver clarity, accountability and insight to businesses of all sizes.
What can I expect to gain from attending?
This event offers a unique showcase for the latest business strategies, industry best practices and SAP solutions, with particular emphasis on business intelligence and enterprise performance management:
  • Find out how you can manage and mitigate risks and optimise business performance while closing the gap between strategy and execution
  • Tame information chaos by turning structured and unstructured data from disparate sources into high quality, actionable insight that can drive business process improvement
  • Get the inside track on our range of affordable, quick-win solutions that can deliver rapid return on investment in a changeable climate and help you identify new opportunities by putting your existing data to work.
 

What if my business doesn’t run SAP?

SAP BusinessObjects' tools are heterogeneous: you don't need to run SAP platforms, as our applications are designed to run across any and all data sources, making this the ideal event to get up to speed with our solutions to your business challenges and to talk to partners and customers who can share their first-hand experience.
 

Affordable SAP e-Learning

Virtual SAP TechEd (VSTE) provides access to valuable education, with over 500 sessions from SAP TechEd events from 2006-2008, and is used by SAP practitioners worldwide to learn the latest SAP products and technologies and develop new skills.
For SAP practitioners, staying current with the latest and greatest in SAP technologies and solutions is key to professional success. In the current economic climate, keeping your team's SAP skills up to date has become more affordable: ASUG members and members of other SAP professional groups are now eligible for a 30% discount.
To buy, preview, or learn more, please visit the Virtual SAP TechEd page. If you are an ASUG member (or belong to another SAP user group), please send us an email at sdn_subscriptions@sap.com with your full contact information and the association you belong to, for instructions to purchase Virtual SAP TechEd at the special price.

Producing Printed Catalogs with SAP NetWeaver MDM - Part 2

The recent How to Create Publications with SAP NetWeaver MDM Using MDM Publisher guide introduced the MDM Publisher as a UI client application enabling you to expose MDM repository data to a catalog publication and define the desired layout in a DTP-like environment. While the focus of that beginner's guide was to draw a basic publication scenario with the key process steps involved, a follow-up guide provides you with information on how to optimize table layouts for more advanced publications.
I hope you'll find the new How To Create Publications with SAP NetWeaver MDM Using MDM Publisher - Advanced Topics guide helpful for setting up more sophisticated print publications.

Removing Complexity from SAP EDI and B2B Processes

I had a conversation with a large Oil and Gas company today about complexity in the areas of EDI and B2B processes. They have, like many large companies, acquired different subsidiaries which run different ERPs and have different EDI translators. A quick review of their IT infrastructure easily uncovered huge amounts of complexity and costs associated with operating, supporting and maintaining these multiple disparate systems.

Who can fix this complexity? Is it the EDI department? No, the EDI department is usually task oriented and rarely can say NO to the business. So where is the complexity, who owns it and who can fix it? The complexity is in the integration of data with the multitude of database applications and ERPs, and in supporting the various file formats that the business units agree to support. The challenge is in developing, managing and documenting all of these integration scripts. This is often outside the responsibility of the EDI department, but never really embraced by the software development teams in IT either. It is a kind of no-man's land. No one wants to own it. Software developers will grudgingly develop an integration script, but only if they are promised it is a short-term project. The moment it mostly works, the developer moves on to sexier development projects, leaving an undocumented and poorly supported integration script to be found and deciphered next year by the next unlucky soul. These undocumented integration scripts grow like weeds in July. After a few years, there are hundreds and thousands of these scripts just waiting to be broken by an upgrade or process improvement somewhere, which will bring IT to its knees fast.

The complexity involved in supporting all the various business processes and data requirements of dozens of different database applications and ERPs can be enormous. Companies need to either develop specialized database applications to help manage all of the integration scripts or buy some specialized application (I have never seen an application that does exactly this, except one I wrote myself many years ago).

Added to the complexity of managing hundreds and even thousands of internal integration scripts are the data and file format requirements that the business units agree to support with customers and suppliers. One simple electronic purchase order, which can easily be mapped and supported by an EDI expert, suddenly becomes a nightmare when the business unit agrees to support a different file format for every customer's purchase order process. These kinds of complexities have a name: "combinatorial explosion"! Multiply all of the different database systems and ERPs that require different data and processes by an understaffed EDI department and business units agreeing to support hundreds of different file formats for simple processes. HELP! No wonder EDI departments are often considered slow and unresponsive. They are shell-shocked and buried alive in complexity!

SAP's Business Network Transformation strategy and recent investment in an automated business exchange is designed, for the first time, to solve these problems - at least for SAP customers. It is a new and much simplified and standardized paradigm for SAP EDI and B2B.

SAP's Netweaver PI can facilitate the internal enterprise integration of all of the various components of SAP and aggregate the integration of data into one set of standardized interfaces that can be pre-developed and stored in the ESR (enterprise services repository). These standardized sets of business processes and integration points, connected to the automated business exchange, eliminate the need for internal integration scripts and their development, maintenance and support. This greatly simplifies EDI and B2B projects and significantly reduces the costs of implementing EDI.

The automated business exchange runs an SAP-centric hub for SAP users. The automated business exchange utilizes SAP's Netweaver and other solutions co-developed by SAP on its network, which enables it to integrate with all other SAP customers through a Netweaver PI-to-PI connection. Once an SAP customer is connected to the automated business exchange, they gain access to the 40,000-plus companies that are already connected and supporting a huge library of B2B business processes for various SAP applications. The 40,000 are growing exponentially now as the network effect kicks in.

SAP's new paradigm for EDI and B2B removes the complexity of integrations, standardizes processes, reduces costs, enables rapid on-boarding of large numbers of trading partners and addresses the remaining issue that causes uncontrolled complexity which is supporting large varieties of different EDI standards and trading partner required custom file formats. In SAP's new strategy, once an SAP customer connects to the automated business exchange through Netweaver PI, IDocs or tRFC, the exchange as a managed EDI/B2B service provider takes over the management of the infinite number of different file formats and communication protocols required by trading partner communities.

Let's review - Integration complexity is now resolved as there is a simple and standardized way of integrating with an EDI/B2B system (operated by the automated business exchange co-owned by SAP) via Netweaver PI, IDocs or tRFC. Netweaver PI can also integrate all the back-office applications into an enterprise service bus architecture (eSOA) so data can be shared as well. The automated business exchange (operating in a cloud computing environment) already has over 40,000 connected companies that can be accessed by connecting once to the exchange.

Every new company that connects to the automated business exchange can be available, with permission, to exchange data with all other connected members. This SAP-centric network effect means that for the first time SAP EDI and B2B data exchanges are guaranteed to get easier, faster and simpler over time as the network expands and the library of pre-developed and pre-integrated business processes and data exchanges grows.

Investing in SDN Blogging

Many of my colleagues and friends that work in and around SAP solutions enjoy reading other people's blog postings, but have never contributed a blog article themselves.  Often they say they either don't have the time, or are uncomfortable writing for public consumption.  Several of my friends that have published blog articles on SDN took literally 2-4 weeks to write one article as they wanted it perfect before publication.  They misunderstood the purpose of blogs.
Blogging on SDN takes time, but it does not have to be perfect. Blogs are the sharing of ideas and experiences in real time. These are your ideas and experiences. I don't think I have ever re-read a blog article I wrote in the past without wanting to edit it again (what was I thinking?). This is a dynamic medium where a free-flowing stream of ideas and experiences can be shared. Sometimes you may even change your mind about a subject. Perfect! Write another blog article explaining why your ideas have changed!
Blogging on SDN is about sharing.  It is about community learning and the distribution of ideas and best practices across a community with a shared interest.  I often have emails and comments sent to me after posting a blog.  These are from readers offering personal experiences, asking questions or even, can you believe it, disagreeing.  Wonderful!  That is how learning and progress happens.  If along the path you can build your personal reputation as a subject matter expert, then even better.
I write on SAP Netweaver, EDI and B2B business processes and strategies.  I have posted over 50 articles on these subjects in the last 12 months that can be found by searching on my profile or the words EDI and B2B.  These articles are posted in real-time, but also have longevity.  My experiences are archived for others that want to learn in the future.  Blogging on SDN is helping create a knowledge base of experiences, an intellectual asset that the entire SAP community can share.
I encourage you to share your experiences, ideas and strategies with the SDN community!
Kevin Benedict is an independent consultant on mobile and EDI/B2B strategies, http://www.linkedin.com/in/kevinbenedict

Exchange ECC customer master standard field with the CRM Z- fields (ECC -> CRM)

This blog is a continuation of my previous blog CRM 7.0 How to --4.

In this blog I will cover Scenario 1: exchanging an ECC customer master standard field with the CRM Z-fields (ECC->CRM).

Since you have used the AET/EEWB to enhance the BP master, you don't have to perform any task on the CRM side, as the tool (AET/EEWB) has already taken care of all the necessary tasks for you.

What you need to do is map the ECC fields to the CRM custom fields. For that, perform the few simple steps mentioned below and you are done.

Step 1: In ECC transaction SE11, look for the structure BSS_CENTI, double-click on CI_CUST and create the structure CI_CUST. Add all the fields that were added to the BUT000 table in the structure CI_EEW_BUT000. Make sure that you add the fields in the same sequence (this is very important).


Step 2: Do the same for the structure BSS_CENTIX: double-click on CI_CUST_X and create the structure CI_CUST_X. In CI_CUST_X follow the same sequence of fields, but use the component type GB_BAPIUPD (a flag to indicate a change in the field).


Step 3: Copy the FM SAMPLE_FCTMODULE_DE_EIOUT to a Z function module and add all the code mentioned below to map the standard fields to the custom fields.
DATA: LV_PARTNER_GUID TYPE BU_PARTNER_GUID_BAPI,
      LS_XKNA1        TYPE KNA1,
      LS_BUSEI_EXTERN TYPE BUSEI_COM_EXTERN,
      LS_CRMKUNNR     TYPE CRMKUNNR.

* Field symbol name reconstructed (the angle brackets were lost in the posting)
FIELD-SYMBOLS: <FS_EXTERN> TYPE BUSEI_COM_EXTERN.

READ TABLE IT_XKNA1 INTO LS_XKNA1 WITH KEY KUNNR = IV_CUSTOMER.

READ TABLE IT_CRMKUNNR INTO LS_CRMKUNNR WITH KEY
  CUSTOME_NO = LS_XKNA1-KUNNR.
CHECK SY-SUBRC = 0.
LV_PARTNER_GUID = LS_CRMKUNNR-PARTN_GUID.

READ TABLE CT_MAIN_EXTERN ASSIGNING <FS_EXTERN> WITH KEY
  HEADER-OBJECT_INSTANCE-BPARTNERGUID = LV_PARTNER_GUID.

IF SY-SUBRC = 0.
* For each custom field, assign the standard field (LS_XKNA1-XXXX is a
* placeholder for the field you want to exchange) and set the change flag.
  <FS_EXTERN>-CENTRAL_DATA-COMMON-DATA-CI_INCLUDE-ZZAFLD000003  = LS_XKNA1-XXXX.
  <FS_EXTERN>-CENTRAL_DATA-COMMON-DATAX-CI_INCLUDE-ZZAFLD000003 = 'X'.

  <FS_EXTERN>-CENTRAL_DATA-COMMON-DATA-CI_INCLUDE-ZZAFLD000004  = LS_XKNA1-XXXX.
  <FS_EXTERN>-CENTRAL_DATA-COMMON-DATAX-CI_INCLUDE-ZZAFLD000004 = 'X'.

  <FS_EXTERN>-CENTRAL_DATA-COMMON-DATA-CI_INCLUDE-ZZAFLD000005  = LS_XKNA1-XXXX.
  <FS_EXTERN>-CENTRAL_DATA-COMMON-DATAX-CI_INCLUDE-ZZAFLD000005 = 'X'.

  <FS_EXTERN>-CENTRAL_DATA-COMMON-DATA-CI_INCLUDE-ZZAFLD000006  = LS_XKNA1-XXXX.
  <FS_EXTERN>-CENTRAL_DATA-COMMON-DATAX-CI_INCLUDE-ZZAFLD000006 = 'X'.

ENDIF.


The next two steps are required so that the FM created above is called while sending data from ECC to CRM.
Step 4: Go to transaction SM30 and maintain table TBE24. Create a product and mark it as active.


Step 5: Go to transaction SM30 and maintain table TBE34 for the event DE_EIOUT. This event is triggered when data flows from ECC to CRM.


That is it. Your standard field is mapped to the CRM custom fields and is ready for exchange from ECC to CRM.

Reference note: Note 736595 - Exchange of EEW fields with R/3 customer master

Master Data Management Represented in BPX Community Space

In the SDN community space there have been repeated discussions about which information is actually needed. Should it be merely IT-related content, exclusively treating the technical aspects that are relevant for software developers and technical consultants, or should this community channel also provide a broader picture that integrates information about the underlying business dimension of the software? There is no once-and-for-all answer to this, since SDN covers areas with a predominant IT spin as well as others where the IT and business aspects are very close.
When it comes to master data management, it becomes very clear that it is about IT and business matters alike: the impact of applied MDM on business execution in diversified IT landscapes is dramatic (which, after all, is the main reason why there is such a thing as MDM), and therefore I'd always consider the business context an intrinsic part of MDM. To get an overview of the MDM business impact, you may watch this animated 3-minute demo.
Finally, this means nothing more than that comprehensive MDM information for the community needs to cover the What, the How and the Why, adjusted to the specific target group.


This figure perfectly illustrates the bottom-up impact of the quality of master data on related business transactions and analytical processes. If, in a specific business situation (e.g., during a merger & acquisition phase), the quality of master data decreases, it is very likely that the quality of related business and analytical processes deteriorates as well.

To pay tribute to this natural ambivalence of MDM being relevant for both the IT and the business world, it is clear that MDM needs its due space on BPX, where the MDM business dimension is the focus.
As a consequence, Master Data Management is going to be featured in the BPX Community space under Business Themes starting today.
I hope you'll find this "new" BPX topic interesting and the information therein beneficial for your business.
Kind regards,

Master Data Quality, Governance and Management

During one of my client implementations, I met very good salespeople to whom I had to sell the idea of data standards, quality and governance.

This was due to the fact that in their current landscape, data was entered in whatever format anyone wished. New customers and materials were created all the time without checking whether such a customer or material had already been created in the system. The first reason I was given was that the search functionality of SAP ECC is not very good. The second reason was that it is by now very hard to find the right customer or material, since there are already so many customers and materials with the same parameters. Over time, so many duplicates had accumulated in the system that it was hard to choose the right customer or material. After looking into their system, I was shocked to see that any or all of Name 2, Name 3 and Name 4 held address details while fields like Street were empty. Where Street data was filled, it held the city details. The City field held the district, the region or even the country. For some customers, though these were rare cases, the data was properly maintained. One good thing was that one of their systems was following a standard procedure and had a governance mechanism; the data was 90% correct in that system. However, it too had scope for duplicates and some wrong details.

One of the challenges was definitely data standards: while creating customer master data, as far as possible, users should enter the data in the respective fields only, and in the correct format. One of the beautiful questions I got was why MDM is required at all, since day-to-day work activities were being carried out as usual. The person was quite right, since the organization was able to carry on its business; however, there were challenges and gaps which I had to highlight in that meeting.

Another challenge was that management could not get correct information about its customers and therefore could not use the customer base to expand product profiles by cross-selling.

One of their pain points was that if someone entered the wrong pin code with the right city (because they did not know the correct pin code), the shipment reached the wrong location, since courier delivery is based on pin codes. After the shipment came back, the correct details had to be tracked down, either by calling the customer directly or by getting the information from the salesperson. The correct pin code was then passed back to the courier company, and the shipment was finally delivered. This whole exercise led to increased shipment cost, longer delivery times, and, most importantly, customer dissatisfaction.

For duplicates, I took a very simple approach. I asked one of the salespeople for his mobile phone and went to his contacts folder. As expected, he was using the company name as a prefix or suffix. I asked him a simple question: if you have many contacts with a common name, how do you make sure you always call the right one? He answered that he uses the company name along with the contact name to distinguish between them. My next question was: without the company name, how would he recognize the right contact, unless he memorized the phone number (which is not realistic)? By then he understood that without the company name he might have to call every contact with the same name to reach the right person, wasting a lot of time and money.

On data quality, I was both puzzled and amazed by how scattered the data was. I explained that if the right data had been entered in the right fields, supported by standards, validation checks, processes, and a little governance, the system would have been in much better shape, and perhaps we would have had no reason to meet.

I was sincerely thankful to be part of such an engagement and learning experience, and ultimately to solve their problems. It is an amazing feeling to be part of such challenging assignments and to help customers with the right solutions. This is what keeps the fire burning for taking on such engagements.

 

Share your best practices, tips, and ideas for implementing IFRS on SAP for a chance to present it during an SAP IFRS Webinar LIVE!

Share your best practices, tips, and ideas for implementing IFRS on SAP for a chance to present it during an SAP IFRS Webinar LIVE! 

SAP is presenting a series of four one-hour Webinars designed to help you understand the SAP strategy and road map for planning, implementing, and reporting for IFRS compliance using SAP solutions.

We are seeking best practices and ideas from SAP community members with expertise to contribute to the topics of these webinars. If you or your organization has a unique idea, best practice, or tips and tricks that will benefit those seeking to implement IFRS reporting using SAP solutions, submit your idea at the SAP IFRS Best Practices Wiki. The best idea will be selected by the presenters of each webinar, and the winner will be invited to present it for five minutes during that webinar.
This is a great opportunity to showcase your knowledge and thought leadership. Submit your idea now!

 

SAP Webinar Series: Successful Transition to IFRS on SAP


SAP Webinar Series: Successful Transition to IFRS on SAP

The U.S. Securities and Exchange Commission has mandated that by 2014, publicly listed companies in the U.S. will need to transition their financial reports from the current U.S. Generally Accepted Accounting Principles (U.S. GAAP) to the International Financial Reporting Standards (IFRS). For many, this impending transition is merely an "accounting" issue. In reality, it will have a significant impact on operational areas well outside the finance department of every publicly listed enterprise.

SAP is presenting a series of four one-hour Webinars designed to help you understand the SAP strategy and road map for planning, implementing, and reporting for IFRS compliance using SAP solutions. Gain the strategic insights from experts that you need to prepare for the transition from U.S. GAAP to IFRS. From this series attendees will:
  • Gain the knowledge and insights you need to prepare for the IFRS transition within your own organization or at your customers
  • Understand the IT and business issues and implications of transitioning to IFRS
  • Learn about the SAP product road map and solution functionality supporting U.S. GAAP-to-IFRS transition implementations
  • Learn from expertise and best practices based on how SAP implemented the solution itself
  • Learn about key resources, tools, and community support available from SAP to help you transition successfully to IFRS reporting
To learn more and register for this webinar series, please visit the SAP EcoHub site. All webinars in this series are scheduled for 10 am PT / 1 pm ET, and they are:

SAP Road Map for IFRS Compliance – August 13, 2009
Learn about the SAP product road map and how SAP solutions provide the functionality required for businesses to implement IFRS
Pete Graham, VP, Solution Management, ERP Financials, SAP
James Fisher, Senior Director, Solution Marketing, Enterprise Performance Management, SAP 

SAP Migration from U.S. GAAP to IFRS – August 20, 2009
Learn about the transition of SAP financial reporting from U.S. GAAP to IFRS
Dr. Christoph Hütten, Chief Accounting Officer, SAP AG

Implementing IFRS at SAP – September 9, 2009
Learn from the project leader how SAP planned the needed business and IT functionality and managed its own transition to IFRS reporting, including managing parallel ledgers (IFRS, U.S. GAAP, and local GAAP)
Christiane Ohlgart, Head, Subsidiary Accounting, Corporate Financial Reporting, SAP AG 

Optimizing the Financial Closing Process for IFRS – September 16, 2009
Understand best practices in IFRS adoption, and learn how the SAP ERP application, including the SAP Financial Closing cockpit and the SAP Central Process Scheduling application by Redwood, will improve your financial close performance while adhering to IFRS dual-reporting standards
Pete Graham, VP, Solution Management, ERP Financials, SAP
Peter Minck, VP, Redwood Software

 

 

 

On The Road with the SAP TechTour 2009: Atlanta, Georgia

On The Road with the SAP TechTour 2009: Atlanta, Georgia 

SAP TechTour made a stop in Atlanta during the ASUG Georgia chapter meeting on Friday, April 24, 2009. The meeting was hosted by IBM, who presented a great overview of the history and accomplishments of SAP at IBM, and of how IBM went about implementing SAP to transform its supply chain, yielding many tangible and intangible benefits.

SAP TechTour and Communities topics presented at this event were the following:
  • Business Process Management by Nick Holshauser
  • Joint Roadmap - SAP NetWeaver BI and BusinessObjects by Katie Beaver
  • Update on SAP NetWeaver Master Data Management by Klaus David
  • Enterprise Information Management (EIM) for SAP Environments by Klaus David
  • SAP Communities by Siva Darivemula
  • Virtual SAP TechEd by Siva Darivemula
With participants from major Atlanta-area employers such as Home Depot, Coca-Cola, Delta Air Lines, Newell Rubbermaid, UPS, and others, the event offered numerous educational and business networking opportunities, including several compelling vendor demonstrations.
At SAP, we are pleased with our experience and partnership with ASUG and its Georgia chapter, and we look forward to similarly successful collaboration in the future.

 

Master Data Assessment

Master Data Assessment

 

Nowadays, many organizations running SAP are looking to assess their master data as part of a data management initiative. Such an assessment helps identify areas of unclean data as well as the need to adopt the SAP Master Data Management module for better data quality.
This blog gives an overview of the procedure for assessing master data such as the material master, vendor master, customer master, and so on. Vendor master data is considered here for the discussion.
Basically, every master data record is created according to the specific business rules of the industry; any deviation from these rules makes the record unclean. Master data can be assessed against various criteria, such as completeness, conformance, duplicates, and consistency.
Initially, we download all the key master data fields from the system into Excel or a text file (or into MS Access for large data volumes). Examples of key fields are NAME1, NAME2, Street, City, PO Box number, the address fields, the deletion indicator, and the tax fields from the general data view, as well as the purchase order currency, terms of payment, reconciliation account, double-invoice validation, and deletion indicators from the purchasing and company code data. The tables LFA1, LFB1, LFM1, and ADRC are used for downloading the vendor master data.
The downloaded data can then be assessed against the following criteria.
1. Completeness: all required fields are filled
Case study: the client's business rule states that the NAME1, NAME2, TAX1, and TAX2 fields must be maintained in the vendor master. Fields can be flagged as required in the configuration, but in some cases the settings are exceptions that depend on region, account group, and similar requirements.
Here, every entry is checked to see whether each required field actually holds a value or is empty. The percentage of required fields left unmaintained, out of the total valid entries, indicates the degree of incompleteness; a minimal sketch of such a check follows below.
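As an illustration, here is a possible implementation of this completeness check in Python with pandas, assuming the vendor extract has been saved as vendors.xlsx with the SAP field names as column headers (the file name and the list of required fields are illustrative assumptions, not SAP defaults):

    import pandas as pd

    # Load the vendor master extract (columns named after the SAP fields).
    df = pd.read_excel("vendors.xlsx")

    # Fields the business rule declares mandatory (illustrative list).
    required = ["NAME1", "NAME2", "TAX1", "TAX2"]

    # Treat NaN and whitespace-only values as missing.
    def is_missing(col):
        return col.isna() | (col.astype(str).str.strip() == "")

    missing = df[required].apply(is_missing)

    # Share of records with at least one required field unmaintained.
    print(f"Incomplete records: {missing.any(axis=1).mean():.1%} of {len(df)}")

    # Per-field breakdown helps target the cleanup effort.
    print(missing.mean().sort_values(ascending=False))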

2. Conformance: standards that need to be adopted
Case study: the client's business rule states that all name fields must be entered in capital letters (e.g., CHARLES instead of Charles), that contact numbers must follow a specific format (e.g., 123-456-7890 instead of 1234567890), and so on.
Here, every entry is checked against these standards; a sketch of such a check follows below.
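One possible encoding of these two rules with pandas; NAME1 and TELF1 (telephone number) are standard vendor fields, while the exact patterns are assumptions derived from the rules above and would need tuning to the client's conventions:

    import pandas as pd

    df = pd.read_excel("vendors.xlsx")

    # Rule 1: names must be all capitals (digits, spaces, '&', '.' and '-' allowed).
    name = df["NAME1"].fillna("").astype(str)
    name_ok = name.str.fullmatch(r"[A-Z0-9 &.\-]*")

    # Rule 2: phone numbers must follow the 123-456-7890 pattern.
    phone = df["TELF1"].fillna("").astype(str)
    phone_ok = phone.str.fullmatch(r"\d{3}-\d{3}-\d{4}")

    print(f"Names violating the capitals rule: {(~name_ok).mean():.1%}")
    print(f"Phone numbers in the wrong format: {(~phone_ok).mean():.1%}")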

3. Duplicates: multiple records referring to the same vendor
In most cases, the same vendor has been created as multiple vendor master records in SAP for various reasons.
Duplicates can be checked over various combinations of fields, such as NAME1 plus address fields or NAME1 plus the tax ID field. Address fields that can be used for duplicate checking include street, postal code, city, PO box number, telephone number, and fax number; a sketch of a simple check follows below.
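A simple duplicate check on the name/city/postal-code combination (ORT01 and PSTLZ are the standard city and postal-code fields; normalizing case and whitespace first is a pragmatic choice, since trivial differences would otherwise mask duplicates):

    import pandas as pd

    df = pd.read_excel("vendors.xlsx")

    # Build a normalized comparison key: upper-case, trimmed text.
    key = (df[["NAME1", "ORT01", "PSTLZ"]]
           .fillna("")
           .apply(lambda col: col.astype(str).str.upper().str.strip()))

    # keep=False marks every member of a duplicate group, not just the repeats.
    dupes = df[key.duplicated(keep=False)].sort_values("NAME1")
    print(f"Records sharing a name/city/postal-code key: {len(dupes)} of {len(df)}")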
           
4. Consistency: sanctity of the data
Case study: the client's business rule states that certain fields must not contain specific words, that certain special characters are not allowed, and so on.
For example, the NAME1 field must not contain any special character other than '&', and spaces are not allowed between the letters of an abbreviation (e.g., AT&T instead of A T & T). Here, inconsistency is measured as the share of entries violating these rules out of all entries checked; a sketch follows below.
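One possible encoding of the two example rules; the regular expressions are assumptions that would need adjusting to the client's actual naming conventions:

    import pandas as pd

    df = pd.read_excel("vendors.xlsx")
    name = df["NAME1"].fillna("").astype(str)

    # Rule 1: no special characters other than '&' (letters, digits, spaces allowed).
    bad_chars = name.str.contains(r"[^A-Za-z0-9 &]", regex=True)

    # Rule 2: spaced-out abbreviations such as 'A T & T'
    # (heuristic: two single-letter words separated by a blank).
    spaced = name.str.contains(r"\b\w \w\b", regex=True)

    print(f"Entries with disallowed special characters: {bad_chars.mean():.1%}")
    print(f"Entries with suspected spaced abbreviations: {spaced.mean():.1%}")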

The following conclusions can be drawn from the assessment:
- The cleanliness of the data can be quantified, which helps redesign the master data creation or upload process to meet business requirements.
- A standard assessment template can be applied to input master data before it is created in SAP.
- The need to implement SAP MDM (Master Data Management) as an alternative for maintaining clean master data can be identified.

 

Additional Flexibility in SAP MDM Publishing

Additional Flexibility in SAP MDM Publishing 

In addition to the flexible catalog publication features provided by the MDM Publisher UI client, which have been thoroughly explained in a recent starter's guide and an advanced user's guide, SAP NetWeaver MDM 7.1 offers another versatile option for creating product publications, based on dedicated Java Publishing APIs.
 
These MDM Java Publishing APIs provide an interface for managing MDM publishing activities, enabling third-party applications to access and modify publishing and layout data in order to automate their publishing processes.
Currently, there are two specific use case scenarios for the MDM Java Publishing APIs:
  1. Retrieve and modify layout data for various reports and web layouts:
    Used to modify existing publications that have been created, for example, with the MDM Publisher (or via the Publishing APIs), to receive the publication information in XML format, and to provide it, for example via an XSLT transformation, to third-party consumers that expose the data as a web representation.
  2. Automatic mass generation of data layouts:
    Used to determine the products to be collected into a publication, assign layout properties, and expose the result to the appropriate output device (print or web), all without manual intervention or prior layout definitions in the MDM Publisher.
Both scenarios are well suited to quickly setting up and launching, for example, ad hoc web marketing campaigns for specific products. The MDM Java Publishing APIs provide an easy-to-manage environment geared towards "light" publishing purposes, where users can quickly and effectively obtain the desired output; a small sketch of the consumer side of the first scenario follows below.
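To make the first scenario more tangible, here is a minimal sketch of the consumer side: applying an XSLT stylesheet to publication data that has already been exported as XML. It uses Python with lxml purely for illustration; the file names are assumptions, and the MDM-specific retrieval calls themselves are covered by the API guide referenced below.

    from lxml import etree

    # Publication information previously retrieved via the Publishing APIs
    # and saved as XML (file names are illustrative).
    publication = etree.parse("publication.xml")

    # Stylesheet that renders the layout data as HTML for the web.
    transform = etree.XSLT(etree.parse("web_layout.xsl"))
    html = transform(publication)

    with open("catalog.html", "wb") as out:
        out.write(etree.tostring(html, pretty_print=True))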
More details about the Java Publishing APIs can be found in the latest edition of the MDM Java API Guide for SAP NetWeaver MDM 7.1 at the SAP Documentation Center on SAP Service Marketplace.
For a quick overview of the key features of SAP NetWeaver MDM in the publishing area, you may watch the eBook on publishing features with SAP NetWeaver MDM 7.1. I hope you benefit from the flexible publishing options offered.
Regards,
Markus

 

MDM & BPM: A perfect combination for User Centric Processes

 Hi,
Implementing user-centric business processes with SAP NetWeaver MDM and Guided Procedures was and still is very popular, since for certain use cases it has many advantages over the standard MDM workflow tool (such as integrating additional services, custom UIs, and cross-system workflows). With the combination of SAP NetWeaver MDM and SAP BPM/BRM you can do the same, while adding a lot of new features to your solutions, such as:
  • Code-free modeling with a visual process modeler in the SAP NetWeaver Developer Studio
  • Out-of-the-box substitution, delegation, and revoke functionality
  • An easily adaptable MDM web user interface consumed directly by BPM (a preview of the SAP NetWeaver MDM Web Dynpro components, planned to be released at a later point in time)
  • Integrated, code-free business rules functionality (little or no logic left in Web Dynpro code)
  • Exception handling for broken process instances
  • Deadlines and automated escalations
  • Use of standard MDM Web services in automated BPM activities
  • Out-of-the-box notes and attachments handling
  • Enhanced dynamic texts for process titles in the Portal UWL
  • Reusable and embedded sub-processes (with the upcoming BPM release)
  • Enhanced monitoring capabilities
  • ...and a lot more
Over the past few months I have implemented several proofs of design and demo scenarios to show these advantages. If you are interested, you can view this ScreenCam:
Expanding the Reach of your MDM Solution with SAP BPM/BRM

The business benefits are fairly obvious:
  • Higher master data accuracy
  • Lower TCO and implementation effort thanks to out-of-the-box functionality and reusable sub-processes
  • Increased flexibility and design collaboration
  • Empowered business users, since the tools are largely code-free
  • Better data governance throughput
  • Better tracking and tracing capabilities
  • Faster rollouts of new versions (no thick clients, no Excel)

The following screenshot shows the design time in the SAP NetWeaver Developer Studio for BPM modeling:
image"
The next screenshot shows an example of a decision table used in the scenario to determine the correct process flow for the corresponding input parameters:
image"
Last but not least, this screenshot shows the configuration options of the new MDM Web Dynpro components used in the BPM process:
image"
Please feel free to leave comments or suggestions, or contact me if you have questions.
 
Webinar: Maximizing Business Intelligence in a Data-Driven Environment – with customer Lexmark & Processing Magazine

 
 

Webinar: Maximizing Business Intelligence in a Data-Driven Environment – with customer Lexmark & Processing Magazine

I'd like to draw your attention to an upcoming Business Intelligence webinar, featuring customer Lexmark and partner Wipro Technologies.
  
The webinar will address how companies can find relief in these challenging economic times by equipping themselves with the tools to make the right decisions.

How do you, as a company, best deal with information overload without missing what's important? Lexmark will share how, with the help of BI, they have been able to cut through the overwhelming amount of data that confronts companies today, enabling fast, accurate, and confident decisions. Bottom line: better information management has helped them increase business value.

THE SPEAKERS
Joe Young, Senior Manager, Enterprise Information Management Systems at Lexmark International, will share his experiences and ideas with Mike Wasson, Publisher of Processing Magazine. 

Lexmark is a manufacturing company with many of the above challenges, and will share with you how they worked with Wipro Technologies and SAP to understand these problems better and develop solutions to them. Joining the discussion will be Rajesh Shewale, Program Manager at Wipro; and Siddharth Taparia, Principal at SAP. 

These three leaders from three different companies will provide their unique perspectives on the key issues and trends facing the manufacturing industry, and how companies are implementing better use of analytics and business intelligence to respond to these business challenges.

Register here to join this webinar on Wednesday, September 16, at 8:00 a.m. PT / 11:00 a.m. ET.

 

SAP Developer Network Latest Updates