Central user administration (CUA)

Central user administration (CUA) is a great tool. Although SAP has tried to replace it with IDM (Identity Management) tools, CUA remains efficient and reliable.

Questions that will be answered in this blog are:

  • What are use cases for CUA?
  • How to setup CUA?
  • How to monitor CUA?
  • Is CUA working in S4HANA?

Use cases for central user administration

Use cases for central user administration:

  • Management of users in the entire landscape (including production servers)
  • Management of users in non-production (sandbox, development, acceptance)
  • Management of users in client 000

Suppose you have a larger landscape consisting of 100 SAP systems and a new basis person joins. Good luck creating 100 user accounts… With CUA connected, this is done in one shot.

And every now and then you need to go to client 000. You have forgotten the password, or due to security settings your user is automatically locked there after a number of days. With CUA you can simply reset your password there and log on.

Check if you are using SAP GRC access control, since this might conflict with CUA.

Set up of central user administration

In the central CUA system (also called CUA master) you need to set up a logical system for each CUA child system. Use transaction BD54 to create them.

Also set up one RFC destination in SM59 to each child system with this naming convention:

<SID>CLNT<MANDT>

For example: PRDCLNT100 for system PRD, client 100.

Use a non-expiring background user with the appropriate rights in this RFC destination. Make sure you update the whitelist for CUA in the RFC, otherwise you might get RFC callback errors. See this blog.

Now start transaction SCUA:

Create a new model view and add the child system:

Do check that the RFC status is fine.

Save and activate the CUA model view:

Check in the master CUA system that the distribution model is created correctly. Start transaction BD64 and look for the CUA model:

Check in WE20 in the master CUA system that the partner profiles are correctly generated towards the child system:

Check that the outbound settings are set to collect the idocs.

If you have a user base of up to 1000 users, you could set the idocs to be processed immediately. With larger user bases, set them to collect. The reason is that CUA will compare the child and master daily and generate one idoc per user. This will clog the child system if you do not set the processing to collect.

Check on the CUA child system that the WE20 partner profiles are also created correctly:

Also here, set the processing to collect instead of process immediately.

In transaction SCUM you can make a very detailed configuration per field: which fields are globally maintained in the CUA master, and which are maintained locally:

First synchronisation

After the first setup you need to do an initial synchronisation.

Start transaction SCUG:

First synchronise the Company address. Then synchronise the users. During user synchronisation you will get errors due to user groups. Each user group in the CUA child system needs to be defined in the CUA master system as well.

Transaction SCUL can be used to check the logging:

For text comparison a traffic light shows whether the child system supports it or not. See SAP note 1642106 – CUA|PFCG: Automatic text comparison of roles for central system. This note explains how to update table USR_CUST:

For issues remaining with first setup, read OSS note 333441 – CUA: Tips for problem analysis.

Regular batch jobs

In the CUA master system plan the following batch jobs:

  • RSCCUSND (Send user master data to child systems), daily
  • SUSR_ZBV_GET_RECEIVER_PROFILES (text comparison between child and central), daily
  • RSEOUT00 (Send idocs to child systems), every 5 minutes

In the CUA child system plan the following batch jobs:

  • SUSR_ZBV_GET_RECEIVER_PROFILES (text comparison between central and child), daily
  • RBDAPP01 (Process idocs from the master system), every 5 minutes

Due to the jobs, a change in the CUA master can take up to 10 minutes to become effective in the child system.

In the central system the next standard jobs are scheduled:

  • BAT_CUA_USER_MASTER_DATA
  • BAT_CUA_SEND_IDOCS
  • BAT_CUA_COMPARISON_PROFILES
  • BAT_CUA_SEND_IDOC_ERRORS

In the child systems the next standard jobs are scheduled:

  • BAT_CUA_PROCESS_IDOCS     
  • BAT_CUA_COMPARISON_PROFILES

More background information can be found in OSS note 399271 – CUA: Tips for optimizing ALE distribution performance.

CUA in action

If you go to SU01 in the master system, you see there is an extra tab called Systems. And you have to specify the system for each role you assign to a user:

Copying a user can be done for multiple systems.

Also password resets can now be done for multiple systems in one shot.

Emergency cases

There might be emergency cases, when the CUA master is down or having maintenance or issues, in which you need to temporarily disconnect CUA.

Read OSS note 320449 – Deactivating the CUA temporarily. Run program RSDELCUA in the child system.

CUA and S4HANA

Despite several rumors, CUA is fully supported with S4HANA. See help.sap.com on CUA in S4HANA 2021.

Background information

More background information:

SAIS_MONI: Generic audit report about system changes

SAIS_MONI is a central audit report about system changes. It collects changes from client opening, the audit log, change log, transport log and more, in one central place.

Questions that will be answered in this blog are:

  • How to install the SAIS_MONI tool?
  • How to run the SAIS_MONI tool?

How to install the SAIS_MONI tool?

The SAIS_MONI tool is installed via OSS note 2423576 – SAIS | Generic audit report about system changes. Or it is standard as of the basis support package stated in this note.

Bug fix notes:

How to run the SAIS_MONI tool?

You start the tool with transaction SAIS_MONI:

Depending on your input the output will be shown as ALV output.

For a full description of each option, read OSS note 2915635 – SAIS | Generic audit report about system changes.

Logging for users is described in OSS note 139418 – Logging of user actions (ABAP server).

Bug fix OSS notes

Bug fix notes:

Content server migration of documents

If you have configured attachments of document info records to be stored in the content server, you might still have a lot of old documents stored in the SAP database.

This blog will explain how to migrate these documents from the database to the content server.

Questions that will be answered are:

  • How to migrate documents from database to content server?
  • What are relevant background OSS notes?

Running the migration

The main OSS note is 389366 – Relocation of documents. This basically tells you to run program RSIRPIRL. The exact use is explained in OSS note 2459712 – How to use report RSIRPIRL.

To run it, start transaction SE38, start program RSIRPIRL and fill out the required data:

Select a time frame that contains few documents in a test environment first. Check how long it takes and that it ends correctly. After the relocation is done you get a list of the technical IDs that were migrated. When confident in the test environment, run in the production environment, and monitor the storage of the content server (so it does not fill up to 100%).

New modification note for delayed deletion: 2991944 – Introducing the Delay mode in report RSIRPIRL.

Migration of GOS objects

The program RSIRPIRL does not have many selection criteria. You might also find out that the migration takes too long. If you need to migrate GOS documents (Generic Object Services attachments), you can use program RSGOS_RELOCATE_ATTA:

This program migrates the GOS documents specified per type and page. GOS documents are normally the bulk of the documents. This way you can migrate most of the documents before running the full run with RSIRPIRL. Full background of program RSGOS_RELOCATE_ATTA can be found in OSS note 2293171 – RSGOS_RELOCATE_ATTA: Relocating attachments from generic object services.

Copying content repository

If you want to copy content from a content repository to another (not re-locate), install the program Z_DOC_COPY from OSS note 2774469 – Program to copy SAP content repositories.

Relevant OSS notes

OSS notes:

APC: Abap push channel

The ABAP push channel (APC) is the ABAP implementation of websockets. Its goal is to enable the ABAP stack to send push messages to registered web clients.

This blog will answer the following questions:

  • How to setup an ABAP push channel?
  • How to implement the ABAP push channel?
  • How to test the ABAP push channel?
  • Where to find more background and examples on ABAP push channel?

Setting up an ABAP push channel

To setup an ABAP push channel go to transaction SE80 and right click, select create / connectivity / ABAP push channel notification.

Now press the Generate Class and Service button. The classes and services will now be generated as placeholders. Save your work.

If you try to activate the service at this point in time you get this error message:

The reason is that we didn’t implement two methods of the new class yet: the ON_START and ON_MESSAGE.

Implementing the actual APC class

To do this, we go to SE24, look up the generated class and select the ON_START method:

Press the redefine button to redefine the method.

Use this code in the method:

TRY.
    " send the message on the WebSocket connection
    DATA(lo_message) = i_message_manager->create_message( ).
    lo_message->set_text( |ON_START has been successfully executed !| ).
    i_message_manager->send( lo_message ).
  CATCH cx_apc_error INTO DATA(lx_apc_error).
    MESSAGE lx_apc_error->get_text( ) TYPE 'E'.
ENDTRY.

This basically confirms the push channel registration.

Now redefine the ON_MESSAGE method:

TRY.
    " create the message object
    DATA(lo_message) = i_message_manager->create_message( ).
    " send the message
    lo_message->set_text( |Hello World !| ).
    i_message_manager->send( lo_message ).
  CATCH cx_apc_error INTO DATA(lx_apc_error).
    MESSAGE lx_apc_error->get_text( ) TYPE 'E'.
ENDTRY.

It simply pushes the message: ‘Hello World’.

Save and generate the class in SE24.

Now we can go back to the SE80 ABAP push channel we have created and activate it as well. You can run the consistency check to see all is fine:

Testing the ABAP push channel

Now you can test the ABAP push channel by hitting the test button in the SE80 screen of the ABAP push channel. The test service will launch an ABAP webdynpro screen.

If the ABAP webdynpro screen does not launch, activate the following 2 nodes in transaction SICF: WDR_TEST_APC and WDR_TEST_APC_WSP.

Test result:

As an alternative to SE80 you can also use transaction SAPC:

Background information

Excellent blogs on ABAP push channels are:

Chat bots via SAP conversational AI

This blog will explain how easily you can set up a chat bot via SAP conversational AI.

Questions that will be answered in this blog are:

  • How can I set up a test chat bot with SAP conversational AI?
  • How can I test the chat bot?

Setting up the chat bot

SAP conversational AI is the technology behind the chat bots of SAP. It only runs in the cloud. It does not run on premise.

You can register for a free test account at https://cai.tools.sap/.

After registration you can create your first bot:

Here we choose Perform Actions and press the CREATE A BOT button:

Continue with the wizard:

Now create the bot.

First thing to add is an intent:

And enter the detailed words (expressions) for the intent:

When the user keys in one of these words, it will trigger the bot.

But this is only the trigger. There is no action yet. Go to the Build tab and create a skill first:

Now open the skill. In the skill you can go to the details to add a trigger:

Here we use our test intent we created before as trigger.

Now go to the Actions tab and create the wanted action:

In our simple case, we respond back with a text saying “Me too!”.

Testing the chat bot

Now we will test our chat bot. Test result:

First thing we say is “tell a joke”. Then the bot will tell a joke.

Then we ask: I need a holiday. Since we have set the word “holiday” as an intent, the bot responds with the action and says “Me too!”. The same happens for I need beer. Any question that is not in the chat bot script goes to the fallback.

OpenSAP course

SAP has a nice OpenSAP training on chatbot building. Follow this link.

Data archiving: reducing amount of parallel batch jobs

When executing data archiving you have to act carefully. The data archiving write and delete processes can consume a lot of CPU power from the database. Also, if you are not careful, you might accidentally claim all background processes. This blog will explain how to limit the number of batch jobs used for data archiving. The data archiving run process itself is described in this blog.

Questions that will be answered in this blog are:

  • How can I limit the number of deletion jobs?
  • How can I restrict the archiving jobs to run on a specific application server only?

Limit the number of deletion jobs

When the write run of data archiving is finished, it can have delivered many files. If you are not careful with the deletion, you select all files and each file will start a deletion run. This will consume a lot of CPU power on database level, since the deletion runs will fire many DELETE statements to the database in rapid sequence. Also, you might consume all batch jobs, leaving no room for any business batch job.

Instead of running the deletion from SARA, you can also run the deletion via program RSARCHD:

With this example, MM_EKKO files will be deleted. A maximum of 50 files from one archiving run will be processed, with a maximum of 2 deletion batch jobs running at the same time.

The general OSS note for this program is 133707 – Data archiving outside transaction SARA.
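If you want to run this deletion regularly in the background, you can also schedule RSARCHD yourself. Below is a minimal ABAP sketch using the standard JOB_OPEN / SUBMIT VIA JOB / JOB_CLOSE pattern; the report name, job name and variant name (which would hold the file and job limits shown above) are assumptions for illustration only.

REPORT zschedule_rsarchd.

" Sketch: schedule RSARCHD in a background job using a pre-created variant.
" Job name and variant name 'Z_MM_EKKO_DEL' are illustrative assumptions.
DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_ARCH_DELETE_MM_EKKO',
      lv_jobcount TYPE tbtcjob-jobcount.

" Open the background job
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

" Add the RSARCHD step with the deletion variant
SUBMIT rsarchd USING SELECTION-SET 'Z_MM_EKKO_DEL'
       VIA JOB lv_jobname NUMBER lv_jobcount
       AND RETURN.

" Close the job and release it for immediate start
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    strtimmed = abap_true.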

Relevant bug fix OSS notes:

General application server restrictions via batch job server group

In SM61 you can set up a special batch job server group. Here you can assign a single application server for your data archiving batch job processing. We assume here you created a group called DATA_ARCH.

In SARA you can now go to the general data archiving settings:

Now you can link the batch job server group:

With the button JobClasses you can specify the job priorities per data archiving function:

A = high priority, C = low priority. The above screen shot is an example.

The second part of OSS note 2269004 – How to reduce parallel archiving jobs on Integration Engine describes the procedure as well. The first part of the note is only relevant for SAP PI.

SAP interfacing: REST

The SAP ABAP stack can also interface using REST protocol. To support this interface protocol SAP has developed special classes in the ABAP stack.

Questions that will be answered in this blog are:

  • How do I create a REST interface in ABAP stack?
  • How do I test a REST interface in ABAP stack?
  • Which tools to use to develop a REST interface?

REST in ABAP

SAP delivers the ABAP REST library in the ABAP Netweaver stack. The full specification can be found on the SAP help portal. The help portal also contains a small tutorial. Next to the pre-delivered REST library classes there are no tools available to speed up REST development in ABAP. It comes down to SE24 and SE80.

A good reference is the SAP blog on the usage of REST in Netweaver 7.4. This also explains the generic REST architecture implementation in ABAP.

Creating REST service in ABAP

We will create a simple Hello World REST service in ABAP. There are 2 main classes in REST ABAP: the application class handling the URL and the resource class where the logic is.

Start transaction SE24 and create a new class inheriting from the SAP delivered class CL_REST_HTTP_HANDLER:

REST create class

Important here: press the inheritance button! Fill out CL_REST_HTTP_HANDLER as the superclass:

REST create class as inheritance

It is mandatory to redefine the GET_ROOT_HANDLER method:

For now just leave the method empty. Save and generate.

Now create the REST resource class based on inheritance of CL_REST_RESOURCE:

REST define resource class

Now redefine the GET method:

REST resource class redefine GET method

Now we add a simple implementation by simply returning the text ‘Hello World’:

REST resource class GET method implementation
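A minimal sketch of what this redefined GET method body could look like, assuming the standard MO_RESPONSE attribute inherited from CL_REST_RESOURCE (your actual implementation may differ):

METHOD if_rest_resource~get.
  " Create a response entity and return a plain text body
  mo_response->create_entity( )->set_string_data( iv_data = 'Hello World' ).
ENDMETHOD.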

Save and activate this class.

Now we go back to the previous class: the application class. In here we now edit the GET_ROOT_HANDLER implementation we left empty earlier:

REST implementation of root handler

If the URL path contains /hello, the handler class (our resource class) ZCL_HELLO_WORLD_RES_REST is called. This class will return the string.
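A minimal sketch of such a GET_ROOT_HANDLER implementation, using the standard CL_REST_ROUTER class to attach the /hello template to our resource class (treat it as an illustration):

METHOD if_rest_application~get_root_handler.
  " Route the /hello URL template to our resource class
  DATA(lo_router) = NEW cl_rest_router( ).
  lo_router->attach( iv_template      = '/hello'
                     iv_handler_class = 'ZCL_HELLO_WORLD_RES_REST' ).
  ro_root_handler = lo_router.
ENDMETHOD.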

Save and activate again. The coding work is done.

Runtime implementation

Now we need to make a runtime implementation. Go to transaction SICF and select the main node default_host first. Then select from the menu Service/Host the option Create Service:

REST SICF create service

Fill out the name of the service and click ok. In the next screen give a description and in the Handler List section refer to the application class ZCL_HELLO_WORLD_REST:

REST SICF handler

Save the service. The service is created but not active. To activate right click on the service and select Activate:

REST SICF activate service

Testing the service

From the previous SICF screen right click the service again and select the option Test Service. A screen will appear that says “No suitable resource found”. Now modify the URL by adding /hello after the test service name in the URL, and press enter again:

REST SICF test service

The URL build-up: test is the service name defined in SICF; /hello was defined in the application class.
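As an alternative to the browser test, you can also call the service from ABAP with the standard HTTP client. The small report below is only a sketch: the report name is made up, and you need to replace <host> and <port> with your own system values.

REPORT ztest_rest_client.

DATA lo_client TYPE REF TO if_http_client.

" Create an HTTP client for the REST service URL (replace <host> and <port>)
cl_http_client=>create_by_url(
  EXPORTING
    url    = 'http://<host>:<port>/test/hello'
  IMPORTING
    client = lo_client ).

" Send the GET request and read the response body
lo_client->send( ).
lo_client->receive( ).

WRITE: / lo_client->response->get_cdata( ).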

Authorizations and security

The REST library has no specifics about authorization and security, so you have to take care of this yourself.

Business authorization security has to be built in via AUTHORITY-CHECK statements at the correct spots.
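As an illustration, a small sketch of such a check at the start of the resource class GET method; the authorization object and field values are placeholders that you should replace with your own design:

" Placeholder authorization check; replace object and values with your own
AUTHORITY-CHECK OBJECT 'S_TCODE'
  ID 'TCD' FIELD 'SE80'.
IF sy-subrc <> 0.
  " Reject the request with HTTP 403 when the check fails
  mo_response->set_status( cl_rest_status_code=>gc_client_error_forbidden ).
  RETURN.
ENDIF.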

Technical security is provided in the Logon Data tab on the SICF node. Here you can set requirements for the technical logon method and if you only allow https.

REST versus ODATA

ODATA is based on REST and has more features. If you have a choice, it is best to use ODATA. ODATA exposing is described in this blog.

In SAP, REST is supported, but you have to code a lot and limited tools are available. For ODATA many more development and monitoring tools are available.

SAP interfacing: ODATA

In the previous blog we have set up an RFC enabled function module. If you want to expose this function module as an ODATA service, you can use the wizard in transaction SEGW. This blog assumes the basic ODATA activation has been performed (see this blog).

Questions that will be answered in this blog are:

  • How do I generate an ODATA service based on an RFC function module?
  • How do I test if the ODATA service is properly working?

Set up of the ODATA service

Start transaction SEGW and create a new project:

Now start the RFC import wizard by right clicking on Data Model and selecting the option Import and then RFC/BOR interface:

Now select the data parameters:

And enter which field is key field:

After pressing finish the wizard will generate the needed classes.

Save your work and press the check button to validate if everything is ok:

Now we need to map the implementation to the RFC module. Right click on the GetEntitySet below ZODATADEMOENTITYSet and select Map to Data Source:

Now map the fields (you can use drag and drop):

Now you need to map the data fields correctly and press check.

Save your work.

Generation of objects

You can see that the Runtime Artifacts section is still empty.

Now press the button Generate Runtime Artifacts:

Wait for the generation to finish:

Now the runtime artifacts are generated, but the service maintenance is not done yet. Open the section Service maintenance and double click on the system:

Now press the Register Service button:

Accept settings and assign package for transport:

Now the registration status is green.

Testing the ODATA service

Press the button SAP Gateway Client (or start transaction /IWFND/GW_CLIENT directly, and then enter the correct service):

The test client starts:

Enter the correct input data: /sap/opu/odata/SAP/ZODATADEMO_SRV/ZODATADEMOENTITYSet(‘1’)

And check the output:

Attention points

The example above seems simple, but you will face more issues in real-life implementations when you need to add tables and more complex structures. In those cases additional configuration and, many times, extra coding in the methods of the generated classes is required.
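To give an idea of such extra coding, below is a purely illustrative sketch of a redefined GET_ENTITYSET method in the generated ..._DPC_EXT class, calling the demo RFC module directly. The method name and the entity field are assumptions; the names generated in your system will differ.

METHOD zodatademoentity_get_entityset.
  " Illustrative only: call the RFC module and map the result to the entity set
  DATA ls_entity LIKE LINE OF et_entityset.

  CALL FUNCTION 'ZBAPIDEMO'
    EXPORTING
      zimport = '1'
    IMPORTING
      zexport = ls_entity-zexport.

  APPEND ls_entity TO et_entityset.
ENDMETHOD.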

Nice blogs to start with:

ODATA security

The user calling the ODATA service needs a special right in SAP to be allowed to call the ODATA service.

Start transaction PFCG and create a new role. On the Menu tab select the option Authorization Default. Then select type TADIR and object type IWSV (gateway business suite enablement). Now you can finally search for our own developed and activated ODATA service:

Now save the role and assign it to the user(s) needing to call this ODATA service.

The application security relies on the authorization checks inside the RFC function module.

ODATA V2 and V4

SAP is now moving from ODATA V2 towards ODATA V4. Read more on ODATA V4 activation in this blog.

SAP interfacing: consuming web services

In the previous blog we have exposed a web service. Now we will show how to consume a web service in ABAP. As example we will consume the web service we exposed in the previous blog. This blog assumes you have configured the basic web service SOAP runtime (if not, read this blog).

Questions that will be answered in the blog are:

  • How to generate a web service consumption proxy?
  • How to setup SOAMANAGER for web service consumption?
  • How to test the web service consumption setup in SE80?
  • How to use the generated web service consumption proxy in ABAP code?
  • What are the authorisation and security aspects for web service consumption?

Generating web service consumption proxy

Start in SE80 by exporting the WSDL file from your previously generated web service. Go to the WSDL tab and press export to save the WSDL file locally:

In SE80 in your package select Enterprise Services and right click on it to create a new service:

In the object type screen select Service Consumer:

Now select External WSDL/schema:

Select local file:

Select the local file:

Select the package, transport and use Z as prefix:

Then select Finish to complete the roadmap.

Wait for the system to compile the software:

Save and Activate. Now the design time proxy is ready.

SOAMANAGER settings

In the previous steps we have setup the design time proxy. Now we add the runtime artefacts as well.

Now go to transaction SOAMANAGER:

Select Web Service Configuration, and search for the newly created design time object:

Click on the blue internal name to reach the configuration screen:

On the screen press Create and then manual configuration:

Give the logical port a name and description and set the Logical Port is Default tickbox. Then continue with the roadmap.

Now fill out the user ID and password and continue:

You can look up the access URL from the service defined in the previous blog by checking the transport settings tab:

Do not use the WSDL URL address, but the binding URL!

Now fill out the URL details in the next screen.

Now finish the roadmap. And on this screen hit the ping web service test button to check if all is ok:

The design time artefacts can be transported. The SOAMANAGER settings need to be repeated in each system. This is intended, since on a test system you might want to call a test web service URL, and on production the same web service from the production URL.

Testing the web service consumption setup

Now go back to SE80 and test the web service consumption:

Select the port you created above in SOAMANAGER:

Edit the data:

And press test to get the results:

Using the web service consumption proxy in ABAP code

Now we are ready to use the web service consumption proxy in our ABAP code. ABAP code example:

*&---------------------------------------------------------------------*
*& Report ZCONSUMEWS
*&---------------------------------------------------------------------*
*&
*&---------------------------------------------------------------------*
REPORT zconsumews.

* Data Declarations
DATA: zcl_proxy TYPE REF TO zco_zbapidemowebservice, " Proxy Class
      zdata_in  TYPE zzbapidemo, " Proxy Input
      zdata_out TYPE zzbapidemoresponse, " Proxy Output
      zfault    TYPE REF TO cx_root. " Generic Fault

* Instantiate the proxy class providing the Logical port name
CREATE OBJECT zcl_proxy EXPORTING logical_port_name = 'ZDEMOWS'.

* Set Fixed Values
zdata_in-zimport = '1'.

TRY.
    zcl_proxy->zbapidemo( EXPORTING input = zdata_in
                          IMPORTING output = zdata_out ).
    WRITE: / zdata_out-zexport.
  CATCH cx_root INTO zfault.
* here is the place for error handling

ENDTRY.

Run the ABAP and see the result:

How to get the right parameters? All the required structures can be found on the SE80 ABAP web service consumption proxy internal view:

Authorizations

The end users using the ABAP that is consuming the web service must be given the rights for the correct S_SERVICE object. Otherwise they will get an error that they are not authorized to call the proxy service object.

Monitoring the availability of the web service

It was explained above how you can test the connection. Unfortunately there is no out-of-the-box way to test this connection in a batch job on a frequent basis. If you want to frequently test and be alerted on issues with the connection to the web service, you can read this blog to deploy a simple custom program that executes this function and can be planned in the background.

Background notes and blogs

More information and details can be found in these two SAP wikis: wiki1 and wiki2.

Relevant OSS notes:

SAP interfacing: exposing web services

In the previous blog we have created a test RFC module. We will now expose this test RFC module as a web service. This blog assumes the basic SOAP web service runtime has been set up according to the manual in this blog.

If you are looking for information on how to consume a web service in the ABAP stack: read this blog.

Questions that will be answered are:

  • How can I generate a web service design time based on an RFC module?
  • How do I activate the web service runtime via SOAMANAGER?
  • How do I test my web service?

Creating the web service based on RFC module

Go to transaction SE80 and search for the test BAPI:

Now right click on the ZBAPIDEMO function module and select the option Create / Enterprise Service:

Fill out the name for the service definition and the description. Press Cont. to continue to the next screen:

Press Cont to go to the next step:

Press Cont. to go to the next screen:

Fill out your package and transport request.

Important here: on a sandbox you might want to use a local object ($TMP). In a development system, NEVER use the local option. A lot of data structures and coding will be generated. If you later try to move the objects from $TMP to a real package, you will be faced with a lot of issues. See note 886682 - Proxy inconsistencies on the use of repair programs SXIVERI_PROXY_HASHID_CHECK and SXIVERI_PROXY_HASHID_CHECK_70. After the cumbersome and painful repair you will not make the mistake again...

Press Cont. to go to the last screen:

On the screen you can already see the next action after completion: SOAMANAGER. But first press Complete to start the generation of the objects.

After the generation, do not forget to Activate the objects!

Activation success message:

Setting up the runtime with SOAMANAGER

To setup the runtime, start transaction SOAMANAGER. It is assumed that the basis team has performed the initial SOAP runtime setup. If not done, ask the basis team to follow the steps in this blog.

On the SOAMANAGER start screen choose the option Web Service Configuration:

In the next screen search for the design time object we created and activated in the previous section (if you forgot to activate, you will not find it now…):

Select the service and on the next screen press the button Create Service:

Fill out the definition details:

Press Next and define the security settings:

Remark: in the newer versions, the default security is set to high. If you need lower security, go back to the SE80 definition, on the Configuration tab, to change the security profile (save and regenerate!):

Press next and define the SOAP protocol settings:

On the last screen of the wizard press finish:

Wait for the runtime generation to finish.

The screen returns to the generated runtime artifacts:

The most important artifact is the WSDL file, which you can open from here.

Testing the service

Go to transaction SE80 and select the Enterprise Services Browser (if not visible go to menu path Utilities/Settings and add the tool):

Now open your service by clicking the Open Object button and search for the service in the second tab:

Check that the WSDL file is properly showing:

If ok, press the test button (F8) to start the test tool:

On the next screen first press the XML editor button to allow the content to be changed:

Now press execute to test. The result:

Web service security

The functional security of the web service is the same as for generic RFC handling (see the blog on this).

The technical security of web services is mainly driven from the security settings in SOAMANAGER. There you can set the transport protocol security and you can indicate if you want simple user ID / password security or work with additional certificates for server to server authentication.

The user calling the SAP web service must have the authorization object S_SERVICE. In S_SERVICE you can define the specific web service it needs to be able to call.

Troubleshooting web services security issues

For troubleshooting web services note 2321968 – SOAP Web Service Security Troubleshooting refers to a very extensive SAP site for web service security issues troubleshooting.

Monitoring web services

For monitoring web services, read this dedicated blog.