
FAQ about SAP NetWeaver MDM

Can I install and run several versions of the MDM Clients in parallel on my PC?

Yes, both installation and parallel use of all MDM Clients (e.g. Console, Data Manager, and so on) are possible. To achieve this, change the default path suggested by the MDM installer during the MDM Client installation. You can use any destination folder on your local PC.

Example

In SAP MDM Development we use a central folder, SAP MDM Clients, as the root folder for our installations. Below this folder there is a specific folder for each MDM build version we need to work with. The screenshot shows that it is even possible to install the MDM 5.5 and MDM 7.1 Clients in parallel.

Where can I find Repositories with Data?

The predefined SAP Business Content contains a small set of sample data that can be used to demonstrate the MDM capabilities. The sample data ships directly with the content deliverables. Additionally, there is a NetWeaver MDM Demo Package available for MDM 5.5.

How can I connect MDM with BI?

The predefined SAP Business Content uses a file-based approach to send data to BI. Each repository contains a syndication map that sends the relevant data to BI. Alternatively, you can use the MDM ABAP API to set up a direct connection between the systems.

 

Where can I find more information?

The SAP MDM WIKI space contains several articles on various topics with additional information that is too large for the FAQs. Use the links below to browse through the existing information.

SAP Business Content — FAQ, tips and tricks related to the SAP predefined Business Content

Links — Links to additional information related to MDM outside the Wiki pages.

Multiple Main Tables and Import Manager — Some things to know about the import and multiple main tables.

Efficient Repository Designing in MDM — Useful tips, tricks and recommendations for data modeling in MDM.

Getting started with SAP NetWeaver MDM — The beginner's guide to MDM

Information Governance Tips + Tricks from a Practitioner

SAP is leading a series of Information Governance workshops, co-sponsored by Deloitte. The first of these workshops kicked off last week in Palo Alto. It was awesome. I’m providing a quick highlight in this blog, but feel free to email Philip On for details on upcoming workshops.

The headliner for the event is the incomparable Maria Villar. Not only has Maria implemented strategic information management organizations and policies at a number of other companies, she is now the GVP for SAP’s own Customer Data Management organization. She teaches courses on information governance at E-Learning curve. And did I mention she wrote a book? She wrote a book: Managing Your Business Data: From Chaos to Confidence.

Really, an information governance program is a series of decision points, all to help you scope your efforts for the most meaningful business value. Maria level-set the room: information governance can be applied to a single project. You can start information governance with a single data type. However, keep in mind that information governance should ultimately be extended to the enterprise.

Information governance is also not one-size-fits-all. For any hope of success, it needs to be tailored to your organization. Evaluate your organization across these dimensions and adjust your strategy appropriately:

  • Culture
  • Information management maturity
  • Management sponsorship
  • Data domains and processes affected
  • Priorities

Getting started with Information Governance

So, how do you know where to start? First, we polled the room. About 25% had started an information governance program in the last year. Another 25% had been doing information governance for more than a year, and all of those were multi-domain. As you might expect, about 25% have Business Data Steward roles for information governance.

In general, follow this process:

  1. Pick the information
  2. Pick the projects
  3. Pick the team
  4. Pick the management system

Maria also walked the group through the 10 questions to answer to help shape your company’s information governance program:

  • What business priority is enabled most by quality information?
  • What data fields and aspects of data are the most important to govern?
  • What are the potential benefits of governing this information?
  • Who are the stakeholders for the business priority?
  • What is the perceived and actual level of data quality?
  • What processes and systems are used to create and update information?
  • Are there standards in place today?
  • What technology is used to govern information today?
  • How can I ensure success?
  • What does great information management look like?

Problem with information governance executive sponsorship: no one wants to OWN it, but they don't want anyone else to own it either. There are only a handful of executives who can go across Lines of Business (LOBs) and silos, which makes it hard to build information governance success. However, look for an executive sponsor who is strong and has the most to gain with information governance success. Oh, and do not use “metadata” when talking to an executive about information governance. Use KPIs and metrics instead. Once the executive sponsor is buying what you are selling, know that it is critical to get multi-year funding for your information governance program. Get that commitment up front. And how do you speak in the metrics and KPIs mentioned? You MUST establish operational goals and metrics and strongly tie them to benefits and value.

If you can, manage the Create process of information governance very robustly. In fact, this is a great place for a data quality firewall. Data quality should be a major component of any information governance program, and a good place to start with data quality is by collecting "tales of woe". Some of these tales exist because the fit-for-purpose definition differs across Lines of Business (LOBs) and business processes. In one case, a customer had a CRM implementation that, without information governance, resulted in a 2-year delay! Sounds like woe to me. Other great places to look for these tales are financial restatements, regulatory issues, customer issues, and business process gaps. And keep in mind that data quality work requires business process re-engineering. In one case a customer was creating master data in 1,300 different places!

One workshop customer talked about information governance and business process: Business process improvement projects are a great place to start information governance because metrics can turn RED and are then very visible. Information governance can, in fact, have great business process benefits: cross-sell/upsell, lead management, opportunity creation, days sales outstanding (DSO), and 360-degree view, among others. One good place to start is to select fields for information governance by reverse engineering reports, regulatory reports, and business process data.

Three final tips from Maria:

  • Human intervention happens to make business analytics and Business Intelligence (BI) *look* like what people expect regardless of what the data actually says.
  • Information Architecture plays a key role in enabling information governance.
  • Regulators and Auditors are also big friends of information governance. Use them!

After Maria’s session, Deloitte stepped in to talk about their experiences implementing information governance. Deloitte provided some good tips:

  • The introduction of new applications is driven by business process or executives. These large projects *start* with goals, but end up just trying to finish by a certain date.
  • You need a way to talk about information governance as a critical business enabler, including information lifecycle. Be a fact-driven organization. It turns out that this switch to being a fact-driven company requires a great deal of organizational change management. Do not underestimate the culture change coming your way, or the effort you’ll have to apply to understand the change management forces.
  • Start with a list of all important information and then define how the system of record should behave instead of starting with technology and working your way backward.

Information Governance technology

Information governance tools (all offered by SAP) include data profiling, assessment, and metadata management with Information Steward; ETL and data quality with Data Services; Master Data Management; Extended Enterprise Content Management (ECM); Business Process Management (BPM); a business rule engine with BRM; BI; and archiving and retention with ILM. One main pain point is that consumers of the information don’t understand the context of the information and its use, making it hard to enforce the information governance policies. This is a good place to automate, to reduce the pain and time required of those consumers. One customer created an internal class called Data Appreciation for Developers, with great success.

Hopefully these highlights helped you shape your information governance program. If you are interested in attending a session like this, please email Philip On for details on upcoming workshops.

Ina Felsheim is a Solution Manager focusing on EIM.

See what Enterprise Information Management has in the bag at SAP TechEd 2011

 

By attending the SAP TechEd 2011 lectures and hands-on sessions about SAP Enterprise Information Management, you can see how this solution portfolio helps you create, cleanse, integrate, manage, govern, and archive structured and unstructured data along the information lifecycle.

The session topics cover enterprise data warehouse management, master data management and governance, data integration and data quality management, information lifecycle management, and enterprise content management. You'll also find out the latest and greatest about in-memory computing including SAP HANA.

Fig: SAP EIM solution portfolio managing information along the lifecycle

What's in it for you?

To get a complete picture of the lectures and hands-on sessions offered in the Enterprise Information Management track, click the EIM session overview link and select the topics that you are interested in and would like to attend.

To display the overall education and session catalogue, click the URL at the top of the blog.

You see, at SAP TechEd 2011 there is plenty of information about Enterprise Information Management. Get the most out of it and mark your calendars!

Additional info on social media

To stay up to date with what's going on in SAP's key EIM domains, feel free to follow the SAP EIM Twitter accounts:

@SAPMDMgroup

Group, sharing information about SAP solutions for master data management and governance (MDM)
@SAPILM

Group, sharing information about SAP solutions for information lifecycle management (ILM), e.g. archiving, system decommissioning, retention management
@SAPECM

Group, sharing information about SAP solutions for enterprise content management (ECM)
@SAPBOEIM

Group, sharing information about SAP BusinessObjects Enterprise Information Management (EIM) solutions for Data Services, Data Quality, Data Migration and Information Governance

Regards,

Markus

Benefit from Pre-TechEd Workshop about SAP Master Data Governance

Seminar Topics

Now that you've hopefully caught a glimpse of SAP Master Data Governance (MDG), for example by watching the quick demos featured on SCN, take the chance to attend a dedicated MDG pre-SAP TechEd workshop, scheduled for Sept. 12, 2011 in Las Vegas. There are still seats available.

In this seminar, you can learn how master data governance approaches the data management problem within the SAP software landscape with utmost ease and efficiency. Whiteboard-led sessions provide insight into all aspects of the solution including data model, security, workflow, data replication, extensibility, and customization.

For seminar details and registration information, see the ASUG Pre-Conference Seminars site.

Enjoy the session.

Regards,

Markus

Markus Ganser is a solution manager in SAP Master Data Management (MDM).

Buy vs. Build an MDM (Master Data Management) Solution, by Goel Ankur

I believe this is an interesting topic to debate; it came up during most of my pre-sales engagements as well. I have been part of ABAP-based developments and of configuration/implementation projects. However, I never had the opportunity to work on a development project that built a complete solution from scratch, and hence I would like to clarify my understanding of the topic.

In my earlier blogs I mentioned that data problems have existed for a long time and that many solutions have catered to those challenges and requirements. However, the MDM solutions currently available on the market did not exist a decade ago, so organizations had to take the lead and develop in-house solutions to their master data problems. By the time packaged MDM solutions appeared, many organizations were already maintaining their own; we will discuss some of the issues organizations faced in maintaining such solutions. Moreover, master data solutions are driven more by business requirements and decisions than by IT requirements; Gartner identifies MDM as a 'technology-enabled business discipline'. Hence many organizations still believe it is better to build their own MDM solutions, since they know their master data and its issues better than any solution provider, and since configuring a packaged MDM solution to their requirements also takes considerable time and effort. I agree that nobody understands an organization's master data and its issues better than the organization itself. However, developing your own MDM solution takes considerable effort, direction, and knowledge, and unfortunately many organizations overrun their budgets and timelines doing so. This is largely because an MDM solution is not only about data quality: it is a complex combination of data quality, governance, management, process workflows, and integration across the landscape, catering to all the different systems. Moreover, once the solution has been built, it requires resources to maintain. This can be more costly, and require more effort to secure dedicated resources, than an existing MDM solution for which experienced resources are readily available.

Below are some key considerations to take into account when making the decision:

  • Web Services, SOA enabled and EIM
  • Data volumes
  • Hierarchies in master Data
  • Dimensional and Domains of Master Data
  • Duration
  • Budget
  • Resources (build and maintain)

For organizations that already have their own MDM solution, as long as it supports all their requirements there is no specific need to migrate to another solution. However, if an organization is unhappy with its solution, looking to enhance it, or still has no solution in place, it may be better to explore the various MDM solutions on the market and implement one that fits the requirements.

Moreover, it's important to note that many MDM solutions have now been on the market for more than five years and have improved considerably. They are on the right track to providing an integrated solution.

Support for SAP NetWeaver 7.30

Support for SAP NetWeaver 7.30 has been qualified for the following MDM components:

● MDM Web Dynpro Components

● MDM Portal iViews

● MDM Portal Content (Product and Business Partners)

● MDM Web Services (design time and runtime)

● MDM Enrichment Controller

● MDM PI Adapter

● MDM Java API

● MDM ABAP API

● Collaborative Processes for Material Master Data Creation

For more information, see the SAP NetWeaver MDM 7.1 Master Guide on SAP Service Marketplace at http://service.sap.com/installmdm71.

MDM Repository Structure

An MDM repository consists of the following tables:

Main table

Every MDM repository has exactly one main table, which contains the primary information about each main table record. For example, an MDM repository of product information would include an individual record for each product and an individual field for each piece of information that applies to all products, such as SKU, product name, product description, manufacturer, and price. Most of the time you will be looking at information in the main table.

Subtables

An MDM repository can have any number of subtables. A subtable is usually used as a lookup table to define the set of legal values to which a corresponding lookup field in the main table can be assigned; these tables hold the lookup information. For example, the main table of an MDM repository of product information may include a field called Manufacturer; the actual list of allowed manufacturer names would be contained in a subtable. Only values that exist in records of the subtable can be assigned to the value of the corresponding lookup field in the main table.

 

Lookup subtables are just one of the powerful ways that MDM enforces data integrity in an MDM repository. The set of legal values associated with lookup fields also makes the MDM repository much more searchable, since a consistent set of values is used across the entire repository.
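The lookup-subtable constraint described above can be sketched in a few lines of Python. This is a hypothetical illustration only: the field and value names are invented, and a real MDM repository enforces the constraint internally rather than in application code.

```python
# Hypothetical sketch of the lookup-subtable constraint: a main-table
# field may only hold values that exist in the corresponding subtable.
manufacturers = {"ACME Corp", "Globex", "Initech"}  # lookup subtable values


def validate_record(record):
    """Return a list of integrity errors for a main-table record."""
    errors = []
    value = record.get("Manufacturer")
    if value not in manufacturers:
        errors.append("Illegal Manufacturer value: %r" % (value,))
    return errors


print(validate_record({"SKU": "123", "Manufacturer": "ACME Corp"}))   # → []
print(validate_record({"SKU": "456", "Manufacturer": "Unknown Inc"}))
```

Because every record draws its value from the same small set, searching and filtering on that field stays consistent across the whole repository, which is the point made above.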

Object tables

Object tables include the Images, Text Blocks, Copy Blocks, Text HTMLs, and PDFs tables. An object table is a special type of lookup subtable, where each object table is used to store a single type of object, such as images, text blocks, copy blocks, HTML text blocks, or PDF files. You cannot store an object directly in a main or subtable field in an MDM repository. Instead, each object is defined or imported into the repository once and then linked to a main or subtable field as a lookup into the object table of that type.

 

Object tables eliminate redundant information, since each object appears only once in the MDM repository even if it is linked to multiple records.

Special tables

Special tables include the Masks, Families, Image Variants, Relationships, Workflows, Data Groups, and Validation Groups tables.

System tables

System tables appear under the Admin node in the Console Hierarchy and include the Roles, Users, Logins, Change Tracking, Remote Systems, XML Schemas, Reports, and Logs tables.


Testing and Monitoring an Interface Between MDM & XI Part 2

clip_image001

· Select a message and press Display.

clip_image002

You may notice that I have selected a message that contains an error and did not actually reach its destination. In Call Adapter -> SOAP Header, take a look at Error. If you double-click that button, a screen will appear on the right-hand side showing the details of the error.
clip_image003
This error tells us that something is wrong with the IDoc Adapter. It tells us that transaction IDX1 contains errors, but in this case the error is actually in the configuration of our communication channel, in which we have made reference to the wrong Port. If you select Call Adapter -> Payloads you can see the content of the XML message that came from MDM.
clip_image004
If you go back to SXMB_MONI, you may also want to take a look at the Processing Statistics program, which shows a good overview that can be helpful when testing your interface with thousands of materials.
clip_image005

3. Testing

Now we're going to test the interface from end to end. I'm assuming that by now you have turned on the MDM Syndication Server and your XI interface is activated in the Integration Directory. Let's log into the MDM Data Manager and create a new material for testing purposes.

· Right click -> Add

clip_image006

· Enter enough data to satisfy your interface requirements (ie: which fields must be populated?)

clip_image007

· Click on another material to save changes

· Close the MDM Data Manager

· Turn on your MDM Syndication Server (if it's not already turned on)

If your Syndication Server settings have been configured correctly, then because you added a new material in the Data Manager, it will syndicate as soon as your interval cycles through (set in the mdss.ini file on your server). Let's move over to the Exchange Infrastructure Runtime Workbench to see if it has processed our message. Keep in mind that, depending on your interval time, it may take a few minutes. Hopefully you should see something like this:
clip_image008
If the runtime workbench shows the message transferred successfully, then let's log into ECC and see if the IDoc was posted.

· Log into ECC system

· Run transaction WE02

clip_image009

· Press F8

· In the left hand pane, select Inbound IDocs -> MATMAS

clip_image010

· In the right hand pane, select the IDoc that just transferred and double click on it

· In the IDoc display, on the left hand side expand E1MARAM and select E1MAKTM

clip_image011

· Verify that the material data is correct

clip_image012

· Expand Status Records -> 53 and double click the only record available

clip_image013

· In the pop up window, copy the message number that was issued to the IDoc

· Press Proceed

· Paste the message number that you copied

clip_image014

· Press F8

clip_image015

You may notice that my image says material 11696 created. This is because a modification was made to an ABAP program to create a material when an IDoc is processed with a certain code. In this blog, the ABAP modification is out of scope, but if you are familiar with ALE then this process should be familiar as well. In any case, this is not a permanent solution, just a temporary one to finish our prototype. If we take that newly generated material number and run transaction MM02, we should be able to pull up the details of that material.
clip_image016
Press Select Views and select Basic Data and continue.
clip_image017
Hopefully, if all went as planned, the material transferred smoothly with no loss of data. This concludes the three-part series on MDM and XI. Thanks for reading; hopefully it helps!

Testing and Monitoring an Interface Between MDM & XI

Hello and welcome back to the last of a three-part series on integrating MDM with ECC via XI (sorry for the onslaught of acronyms). If you missed the first two, you can find them here: Part I, Part II. In this one we will focus on testing your scenario and on how to troubleshoot (where to look) in both MDM and XI. You may have already noticed that in the previous two parts of this series I used a sample scenario dealing with material master data and the MATMAS05 IDoc structure. Ultimately we want to generate a new material in ECC based on the creation of a material in MDM. So let's go ahead and get started.

1. MDM

First we'll start with the syndication process in MDM, and making sure our settings are correct.

1.1 Check Configuration

1.1.1 Client-Side Settings

· Open the MDM Console

· Navigate to Ports in the repository in which your map is located.

clip_image001

· Verify that you have selected the correct map (built in Part I)

· Select your processing mode as Automatic

clip_image002

· Open the MDM Syndicator (Select the repository and press OK)

· Select File->Open

· Select the remote system representing ECC

· Select your map and press OK

· Select the Map Properties tab

clip_image003

· Check Suppress Unchanged Records so we automatically update only changed records.

· Close and save your map

clip_image004

1.1.2 Server-Side Settings

· Open your mdss.ini file on the MDM server

· Verify that Auto Syndication Task Enabled=True

· For testing purposes, change the Auto Syndication Task Delay (seconds) to something rather small, such as 30 or less. This way you don't have to wait a long time for syndication when testing.

clip_image005

· Verify that the service is started.

· UNIX systems: ps -ef | grep mdss

· WINDOWS systems: open services, and look for entry regarding syndication server

· If the service is not running, run the command ./mdss (UNIX) or right-click -> Start Service (Windows)

clip_image006
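The server-side checks above can also be scripted. The sketch below is a hedged illustration: the section name and exact key spellings in the ini fragment are assumptions based on the settings named in this post, and a real mdss.ini may be laid out differently.

```python
import configparser

# Hypothetical mdss.ini fragment; a real file may use different sections/keys.
ini_text = """[MDM Syndication Server]
Auto Syndication Task Enabled=True
Auto Syndication Task Delay (seconds)=30
"""

cfg = configparser.ConfigParser()
cfg.read_string(ini_text)
section = cfg["MDM Syndication Server"]

# Verify the two settings this post tells you to check.
assert section.getboolean("Auto Syndication Task Enabled")
delay = section.getint("Auto Syndication Task Delay (seconds)")
print("Syndication runs every %d seconds" % delay)  # → Syndication runs every 30 seconds
```

For real use you would pass the server's mdss.ini path to cfg.read() instead of the inline string.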

1.2 Important Locations

I'd like to go over some of the important locations (directories) on your server that will come in handy when troubleshooting and testing. One of the trickiest parts of working with MDM is figuring out where things go and where to look. Because it's so different from the SAP software that we are all used to, navigating the system is not as easy as running a transaction code. Also, MDM reacts to certain situations differently than you may expect, so it's important to know where to look when things aren't working properly. I'm working with MDM installed on HP-UX; however, I will try to address each topic as it would appear in Windows to the best of my knowledge.

1.2.1 Home

Log onto your MDM server and navigate to the home directory for the MDM application server. On the server I am working with (sandbox) it happens to be located on the opt filesystem, and the path looks like /opt/MDM. In this directory take note of several important directories:

/opt/MDM/Distributions
/opt/MDM/Logs
/opt/MDM/bin

The Distributions folder is very important because this is where the port directories get created. When you create a port in the MDM Console for a particular repository, MDM creates a subset of folders in the Distributions directory based on which repository the port was created in and whether the port is inbound or outbound. For example, in our particular scenario we may navigate to the path /opt/MDM/Distributions/install_specific_directory/Material/Outbound/. Here we will notice a folder titled ECC, which (if you followed the first part of this series) corresponds to the port that we created earlier. This directory was created as soon as the port was created in the MDM Console. We will focus more on the contents of our port directory shortly.

The Logs folder contains several important log files; however, most of them will not apply to our particular scenario, because the logs we will want to look at are specific to the syndication process and are located within the port directory. Nevertheless, in certain troubleshooting scenarios, don't forget that these log files also exist.

The bin directory is critical because it contains the executables that start the application servers: mds, mdss, and mdis.

1.2.2 Port

Your port directory is going to have the following format:
/MDM_HOME_DIRECTORY/Distributions/MDM_NAME/REPOSITORY/Outbound/REMOTE_SYSTEM/CODE/
For example, the port directory we created looks like this:
/opt/MDM/SID.WORLD_ORCL/Material/Outbound/ECC/Out_ECC/
Here you should see the following directories:

/Archive
/Exception
/Log
/Ready
/Status

The Archive directory is not as important during syndication as it is when importing data into MDM. It contains the processed data. For example, if you were to import an XML document containing material master data, a copy would get placed in the Archive directory in case you ever needed to check it later.
The Exception directory is very important because, when an error occurs, you will often find that a file has been generated in the Exception folder similar to the file that the import server or the syndication server was attempting to import or syndicate. In other words, let's say you were attempting to import an XML document containing material master data, but the map built in MDM has a logic error; the document will instead get passed to the Exception folder, and the status of the port will be changed in the MDM Console to "blocked".
The Log directory is important for the obvious reason. Logs are created each time the syndication server runs, so if your interval is 30 seconds, a log will be generated in this folder every 30 seconds. It gives you the details of the syndication process, which can ultimately be critical when troubleshooting.
The Ready folder is the most important folder in our scenario. When the Syndication Server polls during its interval and performs the syndication, the generated XML message appears in the Ready folder. So in our scenario, material master data is exported to this directory, and ultimately Exchange Infrastructure picks up the data and processes it to ECC.
The Status directory contains XML files that hold information pertaining to the import/export of data, including processing codes and timestamps.
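The port directory layout described above can be sketched as follows. The SID and port names are just the examples used in this post; a real installation derives them from the MDM Console configuration.

```python
from pathlib import Path
import tempfile

# Recreate the example port layout under a throwaway root directory.
root = Path(tempfile.mkdtemp())
port = root / "Distributions" / "SID.WORLD_ORCL" / "Material" / "Outbound" / "ECC" / "Out_ECC"
for sub in ("Archive", "Exception", "Log", "Ready", "Status"):
    (port / sub).mkdir(parents=True, exist_ok=True)

print(sorted(p.name for p in port.iterdir()))
# → ['Archive', 'Exception', 'Log', 'Ready', 'Status']
```

Knowing this layout, a monitoring script only needs the port path to watch Ready for outgoing messages and Exception for failures.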

1.3 Testing

Now we are going to test our scenario and make sure that the export of materials works correctly. First things first, we need to create a new material in the MDM Data Manager. Make sure that your MDM Syndication Server is turned on! Remember, on UNIX we can start it by running ./mdss in the /bin directory, and on Windows by simply starting the service.

1.3.1 MDM Data Manager

· Start MDM Data Manager

· Connect to Material repository.

clip_image007

· Add a new material via right-click -> Add.

clip_image008

· Fill in required fields to satisfy the map built in Part I.

clip_image009

· Verify the new product is saved by clicking elsewhere in the Records screen, and then back to the new Material.

clip_image010

1.3.2 Check Syndication

We are now going to verify that the syndication process is taking place as it should, based on the settings in your mdss.ini file. If you have set the MDM Syndication Server to perform the syndication every 30 seconds, as I did for testing purposes, then by the time you log into your server the syndication should have already occurred. Let's check by logging onto the server and navigating to the Ready folder in our port directory.
/opt/MDM/SID.WORLD_ORCL/Material/Outbound/ECC/Out_ECC/
If all went as planned your Ready folder may look something like this:
clip_image011
Those files are XML files containing the data for each material in your repository that has changed. In this case the only materials in my repository are the two that I just added, so the MDM Syndication Server updated the Ready folder with both new materials. Now they are waiting for XI to pick them up and process them. Before we move over to the XI part, let's take a look at one of these files and verify that the data in them is correct. Keep in mind that if you have already configured XI to pick up the files from this directory and process them, you may not see them here, because XI has already deleted them (based on the settings in your communication channel).

1.3.3 Verify Data

Let's go ahead and open one of these files. I copied the file from the server to my local Windows computer to examine it, but of course you can read the file straight from the server if you prefer. If your mapping was done similarly to mine, your file should look like a MATMAS05 IDoc in XML structure. This makes it easier for XI to process, since we can export in this format from MDM without much difficulty.
clip_image012
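To verify the data without opening the file by hand, you could parse the syndicated XML. The fragment below is a tiny, hypothetical MATMAS05-like structure limited to the segments discussed in this series (E1MARAM and E1MAKTM); a real IDoc XML contains many more segments and fields.

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily trimmed MATMAS05-style fragment for illustration.
xml_text = """<MATMAS05>
  <IDOC>
    <E1MARAM>
      <MATNR>000000000000011696</MATNR>
      <E1MAKTM>
        <SPRAS>E</SPRAS>
        <MAKTX>Test material from MDM</MAKTX>
      </E1MAKTM>
    </E1MARAM>
  </IDOC>
</MATMAS05>"""

root = ET.fromstring(xml_text)
matnr = root.findtext(".//E1MARAM/MATNR")   # material number segment
maktx = root.findtext(".//E1MAKTM/MAKTX")   # material description segment
print(matnr.lstrip("0"), "-", maktx)  # → 11696 - Test material from MDM
```

For a batch check you would loop over the files in the Ready folder and parse each one the same way.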

SAP MDM Books : Enterprise Master Data Management using SAP Netweaver MDM


CLIX ERRORS

Common CLIX errors are listed in Table 112.

Table 112. Common CLIX Errors


Number        Message                      Explanation / Probable Cause

0x83000002    WinSock error on connect     MDM Server is not running; the CLIX client could not connect.

0x80020002    Error opening file           File specified does not exist, or argument not in quotes.

0x80010004    Already exists               Target file or repository already exists (use -FORCE option to override).

0x8001000d    Directory already exists     Target file same as an existing directory.

0x80020002    Error opening file           Source archive file not found; mistyped, lacking full path, or needing quotes.

0xFFAB4010    The repository is invalid    Source repository was mistyped, not in quotes, or not mounted on MDM Server.

0x8402000B    Repository already exists    Target repository cannot be overwritten unless the -FORCE option is added.
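For scripted CLIX runs, the table above can be turned into a simple lookup. Note that 0x80020002 appears twice in the table with different probable causes, so this sketch merges both into one description; the texts are paraphrased from the table.

```python
# CLIX error codes from Table 112, as a lookup table for scripted runs.
CLIX_ERRORS = {
    0x83000002: "WinSock error on connect: MDM Server is not running",
    0x80020002: "Error opening file: file missing, mistyped, or argument not in quotes",
    0x80010004: "Already exists: target file or repository exists (use -FORCE)",
    0x8001000D: "Directory already exists: target file same as an existing directory",
    0xFFAB4010: "The repository is invalid: mistyped, not quoted, or not mounted",
    0x8402000B: "Repository already exists: cannot overwrite unless -FORCE is added",
}


def explain(code):
    """Map a CLIX return code to a human-readable explanation."""
    return CLIX_ERRORS.get(code, "Unknown CLIX error 0x%08X" % code)


print(explain(0x83000002))  # → WinSock error on connect: MDM Server is not running
```

A wrapper script could call explain() on each CLIX return code and log the result instead of the raw hex value.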

DBMS Settings

The applicable brands for each DBMS setting are listed in Table 113.

Table 113. DBMS Settings and Brands

NOTE ►► See “DBMS Settings” on page 234 for more information on the DBMS settings.
