
Complete XI Backup / Restore in an hour

Systems fail for many reasons. Some of us try to recover from the error and learn new things along the way, but most of the time what we really need is a proper backup and restore procedure that is not time consuming. This blog is about taking a complete backup of XI and restoring it to the same state, even after a reinstallation or crash of the operating system. The procedure described here is specific to the Oracle database.

Steps to take Backup

Backup files

There are many ways to take a backup. The most traditional way is to zip it! Let's zip the necessary files:

  • /usr
  • /oracle (including ctrl & data files)

The total size of the compressed files comes to around 8 GB (the figure is for SP15).

Backup registry

image

Export only the keys marked in red to a registry (.reg) file.

Steps to Restore XI

Restoring XI is the tricky part. You can't just extract the files and bring the server up instantly. On the other hand, the machine does not need Oracle installed or any other general prerequisite apart from the JDK.

File Extraction

Extract the compressed backup to the same location as previously installed.

Registry Update

Remove the existing registry entries for Oracle and SAP and run the backed-up registry file. Restart your system after updating the registry.

Bring the Oracle database up

The next challenge is to bring the Oracle database up. Log in as <SID>adm, go to the console and type

image

C> sqlplus /nolog

SQL> startup mount;

You will get an error like this:

ORA-00214: controlfile C:\oracle\<SID>\origlogA\cntrl\CNTRLCXI.DBF
version 762 inconsistent with file
C:\oracle\<SID>\sapdata1\system_1\cntrl\CNTRLCXI.DBF version 758

Don't panic.

Type

SQL> shutdown;

Now, copy the file from C:\oracle\<SID>\origlogA\cntrl\CNTRLCXI.DBF to C:\oracle\<SID>\sapdata1\system_1\cntrl\CNTRLCXI.DBF.

After copying, retype:

SQL> startup mount;

This time you will get a different error, like the one below, for C:\oracle\CXI\sapdata1\system_1\SYSTEM.DATA1:

ORA-01113: file 1 needs media recovery

Now, type

SQL> shutdown;

SQL> recover database;

SQL> startup;
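
Written out as one SQL*Plus session, the recovery roughly looks like this (note that RECOVER DATABASE expects a mounted instance, so mount first if the database is down; adapt the exact sequence to your situation):

SQL> startup mount;

SQL> recover database;

SQL> alter database open;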

The Oracle database must be up and running before you proceed to the next step. These are the only two problems you should have to face.

Register the services

Even though we updated the registry with the backed-up entries, the services will not be registered. The best way to register them is to re-run the central instance step of the installation (which takes less than 10 minutes), since it includes the step that registers the services. Make sure you give the same SID and other details in the CI installation.

image

Start the XI Server

Go to MMC and start the instances. Bingo! The server is up and running as of the date of the backup.

Final note:

The server is up and the restore is over. But make sure you clear the Java Web Start cache if your restore is done after applying some patches. If you don't, you will not be able to see the Adapter Engine in any adapter. This is mainly due to the different SP versions in Web Start and the server when the cache is not cleared.

Managing bulky flat messages with SAP XI (tunneling once again) - UPDATED

I've recently run into the challenge of managing a really bulky message (over 25 Mb!) in SAP XI, where both source and target are flat structures. So I wondered: why go through the overhead of memory-consuming XML conversion? Can I get rid of it in this case? Sure I can!

Benchmark

At the beginning of May 2006 (after writing this blog) I had the chance to benchmark this approach against the classic one (IDoc XML - graphical mapping - JMS conversion from XML to flat). Here are the astonishing results. I won't unveil the XI box's sizing here, as I don't consider it very interesting. The only thing I'll say is that both tests were run on the same machine, under the same load conditions.
Test data: 60 messages of 4 Mb each (flat size), "dropped in" XI almost simultaneously.
Processing: pure XI (adapters, routing, mapping)
Results:
- classical method: between 45 and 50 minutes
- this method: 3 minutes

Introduction

The process I am about to describe here could seem a kind of negation of XI's nature, but I tend to consider it rather another possibility to achieve a goal. Well, you may be asking yourself, what is this goal? It's easier said than done.
I have an IDoc going out of SAP R/3 that must be mapped against a flat target structure (yeah, the good old classic one with a 4-byte record type at the beginning), which has to be sent to an MQ queue through the JMS adapter. Unfortunately this IDoc is bulky: its size can easily reach 25 Mb, if weighed flat (imagine when converted into XML!). We all know that managing this kind of message in XI with a mapping that takes place in the J2EE stack (be that graphical, Java or XSLT) can be dangerous, and make our XI box sit down and have coffee&cigarette... ;-)

Flow

So I first investigated the IDoc tunneling technique (also described in Michal's weblog), sadly realizing that it's only possible when both sender and receiver are SAP systems exchanging the same IDoc with no mapping at all.
Then the weird solution came to my mind:

  1. The R/3 system dumps the IDoc in flat format through a file port (in place of the canonical tRFC port)
  2. XI reads the flat IDoc with a sender file adapter with no content conversion
  3. data are mapped with an ABAP mapping, thus avoiding any unnecessary JCo data flow between the two stacks
  4. a receiver JMS adapter writes data to the queue with no content conversion

Assumptions

I assume you are familiar enough with WE20 (Partner Profiles) and WE21 (Port Definition) in SAP R/3, so I won't go into the details. I also assume you're able to configure both communication channels (file sender and JMS receiver), which should be easier than ever, considering my technique allows you to cut them short (remember: no content conversion). The only thing that probably needs to be mentioned is how to enable ABAP mapping (which comes disabled in a brand new XI installation): you need to change something in the Exchange Profile and reboot your J2EE stack. For this purpose refer to SAP documentation.

Realization

Before diving into ABAP code, another general concept. You may be wondering where and how I have designed the source and target structures... Well, as far as the IDoc is concerned, I create them dynamically by reading XI's definition of it, while for legacy target structures it's up to you; I decided to define ABAP Dictionary structures with transaction SE11 (you can also have them in an ABAP include, as I did for the IDoc, see below).
The first step is to create an ABAP mapping class, which must implement the standard interface IF_MAPPING (see this documentation chapter for details). For details about how to create the ABAP package, change request and so on, please refer to SAP documentation or have a look at this weblog o' mine, where these steps are covered in full detail.
Here below is a simplified version of my own ABAP mapping, hopefully commented enough to be self-explanatory.

THE ABAP MAPPING CODE HERE

 

------------------------------------------------------------------------------------------------------------------------------------

*******************************************************************
*******************************************************************
* Very important prerequisite: for ZGUA_IDOCSTR_GENERATE_INCL to
* work correctly, if IDoc was changed in the backend,
* it must be manually deleted (and optionally regenerated) in IDX2.
* No automatic refresh is performed by XI in this case because no
* IDoc adapter is actually involved.
*******************************************************************
*******************************************************************

METHOD if_mapping~execute.

  TYPES: idocline(1065) TYPE c.

  DATA: tmp      TYPE string,
        idocline TYPE idocline,
        theidoc  TYPE TABLE OF string,
        edidc    TYPE edi_dc40,
        edidd    TYPE myedidd,
        linenr   TYPE i,
        lv_str   TYPE string,
        lv_tab   TYPE string,
        dummy    TYPE TABLE OF string.

  DATA: lv_trg    TYPE string,                      " target line
        lt_trg    TYPE TABLE OF string,             " target table
        lv_trgstr TYPE string.                      " target table stringfied

* Target structures (DDIC structures)
  DATA: ls_cl TYPE zgua_st_cl,
        lt_cl TYPE TABLE OF zgua_st_cl,
        ls_rm TYPE zgua_st_rm,
        lt_rm TYPE TABLE OF zgua_st_rm,
        ls_tc TYPE zgua_st_tc,
        lt_tc TYPE TABLE OF zgua_st_tc.

  FIELD-SYMBOLS: <idocline> TYPE string.

* Infinite loop for real runtime debugging
* DATA: debug. WHILE debug IS INITIAL. ENDWHILE.

* Process start --------------------------------------------------------------

* Convert source from xstring to string
  CALL FUNCTION 'ECATT_CONV_XSTRING_TO_STRING'
    EXPORTING
      im_xstring = source
    IMPORTING
      ex_string  = tmp.

* Create string table at crlf
  SPLIT tmp AT cl_abap_char_utilities=>newline INTO TABLE theidoc.

* Last row is dirty??? (bad, but done by xstr2str function), so get rid of it
*  DESCRIBE TABLE theidoc LINES linenr.
*  DELETE theidoc INDEX linenr.

* Get the control record
  READ TABLE theidoc INTO edidc INDEX 1.
  DELETE theidoc INDEX 1.
* Check if include with IDoc segments structure has to be regenerated before including it
  SUBMIT ZGUA_idocstr_generate_incl
    WITH idoctyp = edidc-idoctyp
    WITH cimtyp  = edidc-cimtyp
    WITH port    = edidc-sndpor
    AND RETURN.

  INCLUDE zi_my_std_or_cust_idoc/.

* Collect data of data segments ----------------------------------------------------
  LOOP AT theidoc ASSIGNING <idocline>.
    MOVE <idocline> TO edidd.
    seg_e2_e1 edidd-segnam.
    CASE edidd-segnam.
      WHEN 'Z1SEGMENT01'.
        MOVE edidd-sdata TO ls_z1segment01.
        APPEND ls_z1segment01 TO lt_z1segment01.

      WHEN 'Z1SEGMENT02'.
        MOVE edidd-sdata TO ls_z1segment02.
        APPEND ls_z1segment02 TO lt_z1segment02.

      WHEN 'Z1SEGMENT03'.
        MOVE edidd-sdata TO ls_z1segment03.
        APPEND ls_z1segment03 TO lt_z1segment03.

    ENDCASE.

  ENDLOOP.

** Mapping ------------------------------------------------------------
*
* Notice that in this simple case I'm doing just a "trivial" move-corresp.
* but the mapping logic could be as complicated as needed

  LOOP AT lt_z1segment01 INTO ls_z1segment01.
    ls_cl-rectype = 'CLIE'.
    MOVE-CORRESPONDING ls_z1segment01 TO ls_cl.
    ls_cl-crlf = cl_abap_char_utilities=>cr_lf.
    APPEND ls_cl TO lt_cl.
  ENDLOOP.

  LOOP AT lt_z1segment02 INTO ls_z1segment02.
    ls_rm-rectype = 'REME'.
    MOVE-CORRESPONDING ls_z1segment02 TO ls_rm.
    ls_rm-crlf = cl_abap_char_utilities=>cr_lf.
    APPEND ls_rm TO lt_rm.
  ENDLOOP.

  LOOP AT lt_z1segment03 INTO ls_z1segment03.
    ls_tc-rectype = 'TECO'.
    MOVE-CORRESPONDING ls_z1segment03 TO ls_tc.
    ls_tc-crlf = cl_abap_char_utilities=>cr_lf.
    APPEND ls_tc TO lt_tc.
  ENDLOOP.

* ...
* ...      [ MORE MAPPING LOGIC TO COME HERE ]
* ...

* Final steps ----------------------------------------------------------

* Collect target data and put them as fixed length records
  trgstr2flatstring lt_trg ls_cl lt_cl.
  trgstr2flatstring lt_trg ls_rm lt_rm.
  trgstr2flatstring lt_trg ls_tc lt_tc.

* Convert flat string table to string
  LOOP AT lt_trg INTO lv_trg.
    CONCATENATE lv_trgstr lv_trg INTO lv_trgstr.
  ENDLOOP.

* Convert string to xstring
  CALL FUNCTION 'ECATT_CONV_STRING_TO_XSTRING'
    EXPORTING
      im_string  = lv_trgstr
    IMPORTING
      ex_xstring = result.

ENDMETHOD.

------------------------------------------------------------------------------------------------------------------------------------

MACROS CODE HERE

------------------------------------------------------------------------------------------------------------------------------------

* Get segment type from segment definition
DEFINE seg_e2_e1.
  segnam = &1.
  seglen = strlen( segnam ).
*  describe field segnam length seglen in character mode.
  seglen = seglen - 3.                                " get rid of trailing three chars
  &1 = segnam(seglen).
  replace '2' with '1' into &1
    length 3.
END-OF-DEFINITION.

* Put the given target structure or table to the flat strings table
DEFINE trgstr2flatstring.
* We have a table (0..n element)
  if not &3 is initial.
    loop at &3 into &2.
      oneline = &2.
      append oneline to &1.
    endloop.
  else.
    oneline = &2.
    append oneline to &1.
  endif.
END-OF-DEFINITION.

 

------------------------------------------------------------------------------------------------------------------------------------

The report invoked is the real magic one: it exploits a standard function module behind transaction IDX2 to get the IDoc definition and dynamically create an ABAP include with all the segment definitions.

THE REPORT CODE HERE

 

------------------------------------------------------------------------------------------------------------------------------------

*&---------------------------------------------------------------------*
*& Report  ZGUA_IDOCSTR_GENERATE_INCL
*&
*&---------------------------------------------------------------------*
*&
*&
*&---------------------------------------------------------------------*

REPORT  ZGUA_idocstr_generate_incl.

DATA: lv_incname TYPE programm.

DATA: lv_idocdate TYPE d,
      lv_idoctime TYPE t,
      lv_idocts(14) TYPE n,
      lv_incdate TYPE d,
      lv_inctime(6),
      lv_incts(14) TYPE n,
      ls_segdef TYPE idxedsappl,
      lt_segdef TYPE TABLE OF idxedsappl,
      lv_strname TYPE string,
      lv_tabname TYPE string.

*       The dynamic internal table structure
DATA: BEGIN OF seg,
      name(30) TYPE c,
      BEGIN OF struct,
        fildname(8) TYPE c,
        abptype TYPE c,
        length TYPE i,
      END OF struct,
      END OF seg.

* The dynamic program source table
TYPES: BEGIN OF incstr,
         line(72),
       END OF incstr.
DATA: incstr TYPE incstr,
      inctabl TYPE STANDARD TABLE OF incstr,
      generr TYPE string.

PARAMETERS: idoctyp TYPE edipidoctp,
            cimtyp TYPE edipidoctp,
            port TYPE idx_port DEFAULT 'SAPSID'.

START-OF-SELECTION.

* -------------------------------------------------------------------------------- *
* Init include name.
  CONCATENATE 'ZI_' idoctyp '/' cimtyp INTO lv_incname.

* Refresh IDoc structure, if needed (the function is smart enough :-)
  CALL FUNCTION 'IDX_STRUCTURE_GET'
    EXPORTING
      port                  = port
      doctyp                = idoctyp
      cimtyp                = cimtyp
      release               = ''
      direction             = ''
      saprel                = '46C'
    TABLES
      edsappl               = lt_segdef
*    EXCEPTIONS
*      no_doctyp             = 1
*      wrong_rfc_destination = 2
*      communication_error   = 3
      .

*  IF sy-subrc <> 0.
*    EXIT.
*  ENDIF.

* Caching mechanism: is IDoc metadata younger than generated include?
  SELECT SINGLE upddate updtime
    FROM idxsload
    INTO (lv_idocdate, lv_idoctime)
    WHERE port = port
    AND   idoctyp = idoctyp.
  CONCATENATE lv_idocdate lv_idoctime INTO lv_idocts.

  SELECT SINGLE sdate stime
    FROM trdir
    INTO (lv_incdate, lv_inctime)
    WHERE name = lv_incname.
  CONCATENATE lv_incdate lv_inctime INTO lv_incts.

* Not in sync with IDoc: include must be regenerated
  IF lv_idocts > lv_incts.
    SORT lt_segdef BY segtyp pos.

* Create the dynamic internal table definition in the dyn. program
    LOOP AT lt_segdef INTO ls_segdef.
      AT NEW segtyp.
*       Type for segment
        CONCATENATE 'types: begin of' ls_segdef-segtyp ','
          INTO incstr-line SEPARATED BY space.
        APPEND incstr TO inctabl.
      ENDAT.

      CONCATENATE ls_segdef-fieldname '(' ls_segdef-expleng '),'
        INTO incstr-line.
      APPEND incstr TO inctabl.

      AT END OF segtyp.
        CONCATENATE 'end of' ls_segdef-segtyp '.'
          INTO incstr-line SEPARATED BY space.
        APPEND incstr TO inctabl.
*       Internal table
        CONCATENATE 'lt_' ls_segdef-segtyp INTO lv_tabname.
        CONCATENATE 'data:' lv_tabname 'type table of' ls_segdef-segtyp '.'
          INTO incstr-line SEPARATED BY space.
        APPEND incstr TO inctabl.
*       Working area
        CONCATENATE 'ls_' ls_segdef-segtyp INTO lv_strname.
        CONCATENATE 'data:' lv_strname 'type' ls_segdef-segtyp '.'
          INTO incstr-line SEPARATED BY space.
        APPEND incstr TO inctabl.

      ENDAT.
    ENDLOOP.

* Create and generate the dynamic include
    INSERT REPORT lv_incname FROM inctabl PROGRAM TYPE 'I'.
* GENERATE REPORT lv_incname MESSAGE generr.
    COMMIT WORK AND WAIT.
  ENDIF.

END-OF-SELECTION.

 

------------------------------------------------------------------------------------------------------------------------------------

Conclusion

You may find the whole thing a bit crazy, but I can guarantee that performance is terrific... 1.5 seconds mapping time for a 20 Mb file on a DEV box!
Last but not least: in my case the source IDoc is custom, so I designed it so that each segment always carries the key field(s) of its parent, just for convenience. If this is not your case, segment numbering and hierarchy are present in the IDoc flat representation anyway, and you may need to improve my code in order to handle them in your ABAP mapping.
In the next weblog I will show you how I split the resulting message into several smaller, packaged messages in order to overcome an MQ limitation on JMS message size.

Process Flow of Generic Synchronization in MI ABAP Server Component

INTRODUCTION:

This blog is about the records created in the Web Application Server and the R/3 backend when a generic synchronization is performed for an asynchronous application. These records will help you understand the process and monitor the data flow in the WAS and the R/3 backend during generic synchronization. Now let me give a general overview of the process. The end user has the mobile client installed on his mobile device, which might be a laptop, a PDA, etc., and wants to work offline with his application. When he is finished with his work in the offline scenario, he needs to synchronize his data with the backend. The data flows from the client to the WAS (middleware) and from the WAS to the backend; when another synchronization is performed, data is transferred from the backend to the WAS and then to the client.

This blog will help you understand the process that takes place between the WAS and the backend.

ASSUMPTIONS:

  • Mobile Client & Server are up and running.
  • Mobile Application is deployed in the MI client
  • The end user has synchronized his data and the data container has reached the WAS. That is a separate process and is out of the scope of this blog.

PROCESS:

Step-1

Let us start with the processes in the Backend.

A “ZFM” – a function module with all the business logic – is created, and a wrapper for this “ZFM” is generated using the transaction ME_WIZARD. The method name is the alias for the wrapper function we just created; we use this method name to access the function module from the application. There is also an optional name field which will become the name of the wrapper; when it is left blank, a name is generated automatically in the format “ME_ZFM”. This alias method gets an entry in the table BWAFMAPP in the backend against the generated function module.

image

Step-2

On the Web AS – MI Server, we enter the alias method and the name of the “generated function module” in the table BWAFMAPP, and a corresponding entry in the table MEMAPPDEST with the RFC destination pointing to the backend. With the above-mentioned entries, the function module WAF_MW_SYNC in the server maps to the backend and executes the business logic – “ZFM”.
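
For illustration only, the two entries could look like this (the alias and function module names are hypothetical; the labels are descriptive, not the actual table field names):

BWAFMAPP   : alias method ZMYAPP_GETORDERS  ->  generated function module ME_ZFM
MEMAPPDEST : alias method ZMYAPP_GETORDERS  ->  RFC destination pointing to the R/3 backend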

image

image

Step-3

When the client synchronizes the application, the data container is generated and arrives in the inbound container of the MI Server. The details of the container are in the table MESYBODY. The type I highlighted in the picture means INBOUND CONTAINER. The device ID seen in the picture is generated automatically and is different from the device ID on the client; it always remains unique for that client device. The data container is transferred to the backend and the “ZFM” is executed through the process described in step 2.

image

Step-4

The data container from the backend resides in the outbound container of the MI Server with the same device ID but a different container ID. Since this process is asynchronous, a second synchronization from the client is needed to get the updated data. When this happens, the outbound container is delivered to the client and MESYHEAD is cleared. The type I seen in the above picture is converted to O, which means OUTBOUND CONTAINER.

image

ISSUES:

If Step 2 is overlooked, you will get the following error.

image

PCK for Enabling & Testing SAP NetWeaver XI Adapter Scenarios

As many of you already know, the main purpose of the PCK (Partner Connectivity Kit) is to enable smaller partners or subsidiaries to integrate their landscape with the primary business partner’s SAP NetWeaver XI landscape. However, did you also know that it is a great tool, essentially a development environment, for developing and testing modules (and adapters) and for testing adapter features and configuration without the need for an XI system at all? Yes, this is the case, and when leveraged properly it can be an essential asset in enabling your adapter scenarios.

Background

Much like the central and de-central Adapter Engine (AE), the PCK is a J2EE-based application with the Adapter Framework as its core. Thus it also includes the adapters that come delivered with the AE (e.g. File, SOAP, JDBC, etc.) and, of course, communicates with XI using the native XI messaging protocol. However, because its intended use is outside the immediate XI landscape, it has its own Integration Builder-like user interface for configuration and, more usefully, it can be set up in a “loop” configuration such that the PCK is both the sender and receiver of a message. This allows for the testing of adapters without having an Integration Server available.

image

As you can see, the UI has the same look and feel as the Integration Builder Tools.

Loop Configuration

As mentioned, the loop configuration is used when XI is not in the landscape or you simply want to do some local testing without XI. The most essential changes take place in the SAP XI Adapter XI service in the J2EE Visual Administrator tool. The following properties should be changed:

image

image

To complete the loop configuration and set up a test, please refer to the following link: PCK Loop Configuration

Benefits

The primary benefit of using the PCK in such a fashion is the ability to run and test adapter-specific scenarios without an Integration Server. Right now, it’s really the next best thing to having your own portable XI. Not having a dependency on an XI system can provide many advantages in certain circumstances:

  • Greater flexibility to experiment without putting main XI installation at risk
  • Local environment (e.g. laptop can test XI scenarios) - can configure and test anywhere, anytime.
  • Fast way to test and experiment with new adapter features and scenarios
  • Since this is your own local environment, you have control and can slap on the latest SP without waiting for it to be applied to the XI system.
  • Develop, deploy, and test adapter modules, and also the new SAP Conversion Agent.

    Limitations

    Obviously without an Integration Server in the mix, there are limitations in many respects, only a few of which I'll list:

  • Not for end-to-end integration testing (e.g. no real receiver determination, routing rules, mapping, etc.)
  • JPR (Java Proxy Runtime) is not supported. Although the functions of the Adapter Framework are also used by the Partner Connectivity Kit (PCK), the PCK does not support the JPR.
  • Rather "heavy" environment - full SAP NetWeaver AS Java is a prerequisite for installing the PCK. Memory intensive - recommend having at least 1GB of RAM.

    So that there is no misunderstanding, the PCK’s main business use is to exchange messages with an Integration Server. However, that is outside the scope of this blog and thus was not discussed here.

    Installation

    The SAP Partner Connectivity Kit SR1 installation guide is available on SAP Service Marketplace: PCK Installation Guide - SR1

    Think objects when creating Java mappings

     

    Most XI consultants come from an SAP (ABAP) background and face many problems when they need to implement a Java mapping for the source-to-target transformation. This weblog attempts to give a guideline on the approach to take when developing mapping programs.

    Assuming we have an IDoc to flat file scenario, let us first have a look at the source IDoc structure.

    Source IDoc XML structure

    Graphical representation for quick understanding

    graphical

    Starting from the lowermost level, T08 is a composite structure, i.e. made up of other simple properties. Think of T08 as a class having all String properties, with names the same as the tag names in the IDoc XML. So a typical T08 class structure would be:

    ------------------------------------------------------------------------------------------------------------------------------------

    public class T08{
        private String TRANSACTION_TYPE;
        private String METER_POINT_REFERENCE;
        private String DELIVERY_POINT_ALIAS;
        private String PO_BOX;
        private String SUB_BUILDING_NAME;
        private String BUILDING_NAME;
        private String BUILDING_NUMBER;
        private String DEPENDENT_STREET;
        private String PRINCIPAL_STREET;
        private String DOUBLE_DEPENDENT_LOCALITY;
        private String DEPENDANT_LOCALITY;
        private String POST_TOWN;
        private String COUNTY;
        private String POSTCODE_OUTCODE;
        private String POSTCODE_INCODE;
        private String PAF_INDICATOR;
    }

    ------------------------------------------------------------------------------------------------------------------------------------

    The fastest way to create this class is to go to the segment editor in SAPGUI, copy all the property names and create String variables with these names. Next, assuming that you are using NWDS as your development tool, create the getter / setter methods for all these properties. It's a matter of a right click only!!! Right-click anywhere inside the class body, select Source --> Generate getter / setter --> Select All. Click OK.
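
    To give an idea, the methods NWDS generates for one of the T08 fields would look roughly like this (a minimal sketch; only one property shown):

    ------------------------------------------------------------------------------------------------------------------------------------

    public String getTRANSACTION_TYPE() {
        return TRANSACTION_TYPE;
    }

    public void setTRANSACTION_TYPE(String transaction_type) {
        TRANSACTION_TYPE = transaction_type;
    }

    ------------------------------------------------------------------------------------------------------------------------------------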

    Now that we are done with the T08 class, let's move on to its parent, i.e. A00. Create the A00 class along the same lines as T08; the key point to note is that A00 should have a class member of type T08.

     

    ------------------------------------------------------------------------------------------------------------------------------------

    public class A00{
        private String TRANSACTION_TYPE;
        private String ORGANISATION_ID;
        private String FILE_TYPE;
        private String CREATION_DATE;
        private String CREATION_TIME;
        private String GENERATION_NUMBER;
        private String INT_FILE_TYPE;
        private String RECEIVER_ID;
        private T08 t08; //reference to the T08 structure
    }

     

    ------------------------------------------------------------------------------------------------------------------------------------

    Generate the getter / setter methods for this class too.

    We are now ready with the basic data structure in which the data will be stored at runtime, and the following steps, with more or less modification, can be taken as the basis for the actual program logic:

    1. The data type of the class member should be decided based on the occurrence of the particular sub-structure. E.g. in our case, T08 has only one occurrence, so we declared A00 with a class member of type T08. If T08 had multiple occurrences, we would declare a Vector / ArrayList of T08 as a class member of A00.
    e.g. private T08 t08; //reference to the T08 structure i.e. for single T08 occurrence.
    private Vector t08List ; //reference to a dynamically growing Vector of T08 structures.

    2. If multiple source structures appear in the input to the mapping program, simply have a class-level ArrayList or Vector member which contains the various A00 structures.
    private Vector idocList; // list of idocs

    3. Build the child object completely before adding / setting it to the parent object. So, in our case, we have to make sure that we set all the possible properties of T08 before adding it to the A00 structure.

    4. The sequence of building all the hierarchical objects should be from the lowermost composite structure to the top-level composite structure. So typical code based on this approach would look like:

     

    ------------------------------------------------------------------------------------------------------------------------------------

    a00.setTRANSACTION_TYPE(value);
    ....
    ....
    t08.setPAF_INDICATOR(value);
    ....
    ....
    t08.setBUILDING_NUMBER(value);
    ....
    ....
    a00.setT08(t08);
    ....
    ....
    idocList.add(a00);

     

    ------------------------------------------------------------------------------------------------------------------------------------

    5. Once you are done populating the complete data structure, imagine how handy it is to refer to any value within the data structure using the getXXX methods.
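
    For example, once the structure is populated, drilling down is just a chain of getters (illustrative snippet, using the classes above; the cast is needed because the Vector is untyped):

    ------------------------------------------------------------------------------------------------------------------------------------

    A00 a00 = (A00) idocList.get(0);                      // first IDoc in the list
    String outcode = a00.getT08().getPOSTCODE_OUTCODE();  // navigate the object graph via getters

    ------------------------------------------------------------------------------------------------------------------------------------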

    Advantages of this approach:

    1. You can write more readable, modular, and maintainable code.

    2. Using objects along with Java’s predefined data structures (collections) helps you reduce programming effort and increase program speed and quality, as these predefined structures are well designed and well tested. The performance and efficiency overheads, as well as the chances of runtime errors associated with the use of arrays, can be safely avoided.

    3. The toString method of the composite structure classes can be overridden to meet any specific output requirements. E.g. if the FILE_TYPE attribute of A00 should appear with double quotes in the final output, the logic can be written in the toString of the A00 class. toString is the method which, if overridden, gets called automatically when we use a statement like
    System.out.println(a00);
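
    A minimal sketch of such an override for A00 (the delimiters and the fields written out are purely illustrative; put whatever your flat output requires here):

    ------------------------------------------------------------------------------------------------------------------------------------

    public String toString() {
        // FILE_TYPE is wrapped in double quotes for the final output
        return TRANSACTION_TYPE + ";" + ORGANISATION_ID + ";\"" + FILE_TYPE + "\"";
    }

    ------------------------------------------------------------------------------------------------------------------------------------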

     


    BPM: Single Sender and Multiple Receivers based on synchronous exchange (switch) - part 1

    Overview

    In this exercise we focus on the Business Process Management capabilities. The goal is to send a purchase order request from a mail interface to one particular receiver out of a pool of receivers and to get a sales order back, i.e. sending a request selectively to one receiver and getting the response back from that receiver. (The BPM uses switch steps, correlations, and containers.)

    Description

    A purchase order is sent by mail; the mail adapter picks this purchase order up from the mail server inbox and sends it to the corresponding receiver to generate a sales order. The receiver selection is done based on the vendor number. The sender is the mail server and the receivers are SAP, PeopleSoft, and the file system.

    Note: This is not a productive example; I hope it helps you understand BPM scenarios based on the switch and synchronous mechanisms.

    BPM Diagram

    image

    Exercise steps

    Step 1-System Landscape Directory

    1. Create SLD objects for the file system or reuse existing ones. (Receiver)

    2. Create SLD objects for the PeopleSoft system or reuse existing ones. (Receiver)

    3. Create SLD objects for SAP R/3 or reuse existing ones. (Receiver)

    4. Create SLD objects for the mail system. (Sender)

    Step 2-Integration Repository

    2.1 Import the RFC ZRFC_SALESORDER_CREATEFROMDAT1 from SAP R/3.

    2.2 Prepare Interface objects for the Mail system (Req Data Type, Req Message Type, Res Data Type, Res Message Type, Outbound Synchronous Interface) or reuse.

    2.3 Purchase Order Req Data type

    image

    2.4 Purchase Order Message type

    image

    2.5 Res Data Type (Sales Order DT_SO )

    image

    2.6 Res Message Type (Sales Order MT_SO)

    image

    2.7 Message Interface (Outbound Synchronous, Out: MT_PO, Inp: MT_SO)

    image

    2.8 Inbound Synchronous Message Interface (Out: MT_SO, Inp: MT_SO) for File and PeopleSoft.

    image

    Note: the Req Data type MT_PO, Req Message Type MT_PO, Res Data type MT_SO, and Res Message Type MT_SO are common to the File System, Mail System, and PeopleSoft System.

    2.9 Abstract Asynchronous Interface for Purchase Order (MT_PO).

    image

    2.10 Abstract Asynchronous Interface for Sales Order (MT_SO)

    image

    2.11 Abstract synchronous Interface ( MT_PO,MT_SO)

    image

    Mapping Objects: Message mapping

    2.12 Message Mapping between MT_PO and RFC Req.

    image

    2.13 Message Mapping between RFC response to MT_SO

    image

    2.14 Message Mapping between MT_PO and MT_PO.

    image

    2.15 Message Mapping between MT_SO and MT_SO

    image

    Mapping Objects : Interface Mapping

    2.16 Interface mapping between synchronous abstract interface and RFC called IM_PO_SO.

    image

    2.17 Interface mapping between synchronous abstract interface(MI_PO_SO_ABS) and inbound synchronous interface (MI_PSFT_FILE)

    image

    Step 3 BPM

    3.1 Create an Integration Process.

    3.2 Create the following containers:

    1.PO (Category: Abstract Interface, Type: MI_PO_ABS)

    2.SO (Category: Abstract Interface, Type: MI_SO_ABS)

    3.3 Create Correlation Object.

    1.Correlation Name: CO

    2.Correlation Container: CR, type: string, Involved messages: MI_PO_ABS,MI_SO_ABS

    3.4 Receiver Step 1.

    Step Name: PO_Rec

    Message: PO

    StartProcess: yes

    Active Correlation: CO

    3.5 Switch 1.

    Step Name: Sel_Rec

    Branch1 Condition: (PO./p1:MT_PO/DT_ORDER_PARTNERSNEW/PARTN_NUMB = PSFT )

    Branch2 Conditions: (PO./p1:MT_PO/DT_ORDER_PARTNERSNEW/PARTN_ROLE = FILE)

    Branch3 Condition : (PO./p1:MT_PO/DT_ORDER_PARTNERSNEW/PARTN_NUMB = SAP)

    3.6 Sender 1 (Branch 1)

    Step Name: PSFT

    Mode: Synchronous

    Synchronous Interface: MI_PO_SO_ABS

    Request Message: PO

    Response Message: SO

    Activate Correlation: CO

    3.7 Sender 2 (Branch 2)

    Step Name: FILE

    Mode: Synchronous

    Synchronous Interface: MI_PO_SO_ABS

    Request Message: PO

    Response Message: SO

    Activate Correlation: CO

    3.8 Sender 3 (Branch 3)

    Step Name: SAP

    Mode: Synchronous

    Synchronous Interface: MI_PO_SO_ABS

    Request Message: PO

    Response Message: SO

    Activate Correlation: CO

    3.9 Sender 4

    Step Name: PSFT

    Mode: Asynchronous

    Message: SO

    Activate Correlation: CO

    Integrate SAP Conversion Agent by Itemfield with SAP XI

    Introduction

    In a joint partnership with SAP, Itemfield provides the so-called SAP Conversion Agent by Itemfield. It enables data conversions from unstructured and semi-structured data formats into XML, e.g. from ASCII, EBCDIC, MS Word, MS Excel, PDF, HL7, EDIFACT, etc.

    The Conversion Agent consists of the Conversion Agent Studio and the Conversion Agent Engine. The former is for designing and configuring transformations, which can consist of parsers, serializers, mappers, or transformers. Once your transformation is done, you deploy it as a Conversion Agent service. The services are run by the Conversion Agent Engine, which is the runtime environment of the Conversion Agent system.

    In general, there are two possibilities for integrating the SAP Conversion Agent with SAP XI: either via a module within the SAP XI Adapter Framework, or via the Conversion Agent Java API. Both approaches are shown here.

    Prerequisites

    The SAP Conversion Agent by Itemfield has been shipped since SAP XI 3.0 SP14. For a complete list of supported platforms, please refer to SAP Note 894815.

    However, the Java API should also be supported prior to SAP XI 3.0 SP14.

    How to integrate using SAP XI Adapter Framework Module

    After you have deployed your transformation as a service, you can call it from a module within any adapter that runs on top of the SAP XI Adapter Framework. You can call the service either on the sender or on the receiver side.

    In the SAP XI Integration Directory, create a communication channel, choose an adapter type, and maintain the appropriate parameters. Switch to the Module tab to define a local Enterprise Bean. Maintain localejbs/sap.com/com.sap.nw.cm.xi/CMTransformBean as the module name, and set the parameter TransformationName to the service that you deployed.
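
    For orientation, the Module tab entries then look roughly like this (the module key "cm" is only an example; the TransformationName value is the name of the service you deployed, e.g. the EBCDIC2XML service used in the Java example further below):

    ------------------------------------------------------------------------------------------------------------------------------------

    Processing Sequence:
      1   localejbs/sap.com/com.sap.nw.cm.xi/CMTransformBean   Local Enterprise Bean   cm

    Module Configuration:
      cm   TransformationName   EBCDIC2XML

    ------------------------------------------------------------------------------------------------------------------------------------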

    image

    How to integrate using Conversion Agent Java API

    Let's assume you need to call the transformation service while sending a message to XI. Furthermore, you intend to use a transport protocol that does not require a sender agreement, or, strictly speaking, does not allow you to maintain a module within a sender communication channel, e.g. when using the plain HTTP or XI protocol. In that case, the module approach above can't be applied.

    Alternatively, you can call the transformation service within a mapping via the Conversion Agent Java API. The Java API is exposed by the CM_JavaApi.sda J2EE library as part of the SAP Conversion Agent software package, see below.

    Define a Java class that implements the Java interface com.sap.aii.mapping.api.StreamTransformation. It contains two methods: public void execute, to run the mapping, and public void setParameter, to access message header data at runtime. Import the packages com.sap.aii.mapping.api and com.itemfield.contentmaster. Call the transformation service by establishing a Conversion Agent parser engine session, as shown in the code below.


    ------------------------------------------------------------------------------------------------------------------------------------

    package com.sap.rig.apa;

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.util.Map;

    import com.itemfield.contentmaster.*;
    import com.sap.aii.mapping.api.*;

    public class callConversionAgent implements StreamTransformation {

      private Map myParam;   
      public void setParameter(Map param) {
        myParam = param;
      }

      public void execute(InputStream in, OutputStream out)
        throws StreamTransformationException {

    // get header data
        String senderInterface = (String)myParam.get(StreamTransformationConstants.INTERFACE);

    // get CA service name
        String caService = new String();
        if (senderInterface.equalsIgnoreCase("account_detail_req_ob")) {
          caService = "EBCDIC2XML";
        } else if (senderInterface.equalsIgnoreCase("account_detail_resp_ob")) {
          caService = "XML2EBCDIC";
        }

        byte[] inbyte = null;
        try {

    // convert InputStream into byte stream
          int bufsize = in.available();
          inbyte = new byte[bufsize];
          in.read(inbyte);

    // convert byte stream into InputBuffer
          InputBuffer inbuf = new InputBuffer(inbyte);
          OutputBuffer outbuf = new OutputBuffer();

    // build the CA parser engine session and run the transformation service
          ParserEngineSession session = new ParserEngineSession(caService, inbuf, outbuf);
          session.exec();

    // convert OutputBuffer into byte stream
          byte[] outbyte = outbuf.toByteArray();

    // convert byte stream into OutputStream
          out.write(outbyte);

    // wrap any I/O or Conversion Agent error into the mapping runtime exception
        } catch (Exception e) {
          throw new StreamTransformationException("Conversion Agent call failed: " + e.getMessage());
        }
      }
    }

     

     

    ------------------------------------------------------------------------------------------------------------------------------------

     

    Create a .jar and import it into the Integration Repository.

    image

    Create an Interface Mapping, and choose the appropriate Java Class as mapping program.

    image

    For more details about the SAP Conversion Agent including installation guidelines, please refer to SAP Service Marketplace, and navigate to → Media Library → Documentation → Conversion Agent Documentation. The SAP Conversion Agent software can be downloaded under SAP Software Distribution Center → Download → Support Packages and Patches → Entry by Application Group → SAP NetWeaver → SAP NETWEAVER → SAP NETWEAVER 04 → Entry by Component → Process Integration (PI/XI) → NW 04 CONVERSION AGENT 1.0.

    XI Mapping Module for AFW

    Introduction

    I have recently been searching for a module that could execute a generic mapping program, but with no satisfaction. If you are wondering why I needed it, keep reading and you'll probably find many scenarios yourself in which this technique can be applied.
    I searched the standard documentation first, only finding the MessageTransformBean, which has serious limitations: first of all it was born for XI 2.0, and it needs the mapping program to be packaged and deployed along with the module in an .sda file.
    Sure, a module that can execute a Java or XSLT mapping program packaged in the module .jar itself could have been enough, but I just never stop, I'm not satisfied by half-solutions...

    The Goal

    So the goal was to have my adapter module perform these basic tasks:

    • read its own parameters in the Communication Channel configuration to get mapping coordinates and other settings
    • dynamically retrieve the mapping (be that graphical, Java or XSLT)
    • execute the mapping by invoking the mapping runtime
    • put the resulting document as message payload

    Graphically Speaking

    In a common and simplified XI scenario, the data flow can be represented by the picture below. Notice that steps 3 and 4, in which data is sent via RFC by the Integration Engine to the Mapping Runtime and back (a synchronous call to a JCo server instance), are exactly what I aimed to avoid: performance is seriously affected by this step. Moreover, suppose you have an ABAP mapping as the main transformation program in your scenario, but you need a kind of pre-mapping Java or XSLT program to be executed before the ABAP class can process the data... Don't you think it's really an overhead to go back and forth between the two stacks?
    image
    So comes the modified scenario, in which steps 3 and 4 (which are actually a single logical step, that is one mapping) are removed. In the picture below the Mapping Runtime is now invoked by the Adapter Framework in step 2, before the message is sent to the ABAP stack, and in step 5, before the message is sent to the Messaging System for final delivery. In this kind of scenario, the two steps can both be present, or only one of them: each step represents one logical step, that is one transformation.
    image
    The module is flexible enough to be used both in sender and receiver communication channels: if you use it in step 2, it acts as a kind of pre-processor, while if you use it in step 5, it acts as a post-processor. You can even have multiple instances of the same module in one communication channel, so that you can emulate a multi-mapping, where the result document of the previous mapping becomes the source of the following one...
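
    To make this concrete, a communication channel using the module could be configured roughly as follows on the Module tab (the module key, JNDI name and parameter values are purely illustrative; the parameter names are the ones read by the bean code at the end of this weblog):

    ------------------------------------------------------------------------------------------------------------------------------------

    Processing Sequence:
      1   localejbs/AFWJavaMap   Local Enterprise Bean   xmap

    Module Configuration:
      xmap   swcv.guid              <GUID of the SWCV containing the mapping>
      xmap   mapping.type           GRAPHICAL
      xmap   mapping.name           MM_MyPreMapping
      xmap   namespace              urn:mycompany:demo
      xmap   trace.level            INFO
      xmap   audit.source.message   true
      xmap   audit.result.message   true
      xmap   hang.on.error          yes

    ------------------------------------------------------------------------------------------------------------------------------------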

    NWDS project

    So let's get it built.
    For general information about how to build the right project in NetWeaver Developer Studio, please refer to this SAP howto. Here I will just underline and focus on needed additional settings.

    image

    In addition to the common .jar files required by any module project, you need to download a couple more from your XI box and put them in your project build path (not as external jars, but using the “Add Variable...” and “Extend” functions; otherwise the .jar would be packaged in your final .ear file, which is not dangerous but really useless).

    Jar name / Can be found at...

    guidgenerator.jar
    /usr/sap/<SID>/DVEBMGS00/j2ee/cluster/server0/bin/ext/com.sap.guid

    aii_utilxi_misc.jar
    /usr/sap/<SID>/DVEBMGS00/j2ee/cluster/server0/apps/sap.com/com.sap.xi.services

    aii_mt_rt.jar
    /usr/sap/<SID>/DVEBMGS00/j2ee/cluster/server0/apps/sap.com/com.sap.xi.services

    aii_ibrun_server.jar
    /usr/sap/<SID>/DVEBMGS00/j2ee/cluster/server0/apps/sap.com/com.sap.xi.services/EJBContainer/applicationjars

    Very important now are the J2EE references, which you can find in the application-j2ee-engine.xml of your additional Enterprise Application project. In addition to the common references needed by a module to run smoothly in the J2EE engine, you need a couple more. You can find the XML source needed here below.

    ------------------------------------------------------------------------------------------------------------------------------------

    <reference reference-type="weak">
        <reference-target provider-name="sap.com"
            target-type="application">com.sap.xi.services</reference-target>
    </reference>

    <reference reference-type="weak">
        <reference-target provider-name="sap.com"
            target-type="library">com.sap.guid</reference-target>
    </reference>

     

    ------------------------------------------------------------------------------------------------------------------------------------

    Hint: remember also to set up the JNDI name in your ejb-j2ee-engine.xml, which must match the name you'll use to invoke the module from a communication channel.
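
    A minimal sketch of such an ejb-j2ee-engine.xml (the bean name and the JNDI name are only examples; the JNDI name must be the one you enter in the channel's module Processing Sequence):

    ------------------------------------------------------------------------------------------------------------------------------------

    <ejb-j2ee-engine>
        <enterprise-beans>
            <enterprise-bean>
                <ejb-name>AFWJavaMapBean</ejb-name>
                <jndi-name>AFWJavaMap</jndi-name>
            </enterprise-bean>
        </enterprise-beans>
    </ejb-j2ee-engine>

    ------------------------------------------------------------------------------------------------------------------------------------
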
    Now you're almost there... Just missing one thing more: the bean code.
    I won't comment on the code here in this weblog: it's really full of comments in itself.

     

    ------------------------------------------------------------------------------------------------------------------------------------

    package com.guarneri.xi.afw.modules;

    import javax.ejb.SessionBean;
    import javax.ejb.SessionContext;
    import javax.ejb.CreateException;

    // XI specific imports
    import com.sap.aii.af.mp.module.ModuleContext;
    import com.sap.aii.af.mp.module.ModuleData;
    import com.sap.aii.af.mp.module.ModuleException;
    import com.sap.aii.af.ra.ms.api.*;
    import com.sap.aii.af.service.auditlog.*;

    // XML manipulation imports
    import com.sap.aii.ibrun.sbeans.mapping.*;
    import com.sap.aii.ibrun.server.mapping.MappingHandler;
    import com.sap.aii.ibrun.server.mapping.api.TraceList;
    import com.sap.guid.*;

    // Other imports
    import java.lang.reflect.Field;
    import java.util.Date;
    import java.util.HashMap;
    import java.util.Hashtable;
    import java.util.Map;

    /**
    * @ejbHome <{com.guarneri.xi.afw.modules.AFWJavaMapHome}>
    * @ejbLocal <{com.guarneri.xi.afw.modules.AFWJavaMapLocal}>
    * @ejbLocalHome <{com.guarneri.xi.afw.modules.AFWJavaMapLocalHome}>
    * @ejbRemote <{com.guarneri.xi.afw.modules.AFWJavaMap}>
    * @stateless
    * @transactionType Container
    */
    public class AFWJavaMapBean implements SessionBean {

        final String auditStr = "*** AFWJavaMap - ";
        ModuleContext mc;
        boolean hangOnError;
        Object obj = null; // Handler to get Principle data
        Message msg = null; // Handler to get Message object
        Hashtable mp = null; // Module parameters
        AuditMessageKey amk = null;
        // Needed in order to write out on the message audit log

        public ModuleData process(
            ModuleContext moduleContext,
            ModuleData inputModuleData)
            throws ModuleException {

            Date dstart = new Date();
            byte[] out = null; // Mapping result

            // Creation of basic instances
            try {
                mc = moduleContext;
                obj = inputModuleData.getPrincipalData();
                msg = (Message) obj;
                mp =
                    (Hashtable) inputModuleData.getSupplementalData(
                        "module.parameters");

                if (msg.getMessageDirection() == MessageDirection.INBOUND)
                    amk =
                        new AuditMessageKey(
                            msg.getMessageId(),
                            AuditDirection.INBOUND);
                else
                    amk =
                        new AuditMessageKey(
                            msg.getMessageId(),
                            AuditDirection.OUTBOUND);

            } catch (Exception e) {
                Audit.addAuditLogEntry(
                    amk,
                    AuditLogStatus.ERROR,
                    auditStr
                        + "Error while creating basic instances (obj,msg,amk,mp)");
                throw new ModuleException(
                    auditStr
                        + "Error while creating basic instances (obj,msg,amk,mp)");
            }

            Audit.addAuditLogEntry(
                amk,
                AuditLogStatus.SUCCESS,
                auditStr + "Process started");

            // See what we want to do in case of mapping failure (default is HANG)
            hangOnError = true;
            String hoe = mpget("hang.on.error");
            if (hoe != null)
                hangOnError =
                    !(hoe.equalsIgnoreCase("no") || hoe.equalsIgnoreCase("false"));

            // Audit source document if requested
            String auditSource = mpget("audit.source.message");
            if (auditSource != null)
                Audit.addAuditLogEntry(
                    amk,
                    AuditLogStatus.SUCCESS,
                    auditStr
                        + "Source document: "
                        + new String(msg.getDocument().getContent()));

            // SWCV guid generation
            String swvcGuidStr = mpget("swcv.guid");
            if ((swvcGuidStr == null)
                || (swvcGuidStr.length() != 32 && swvcGuidStr.length() != 36))
                return this.handleReturn(
                    inputModuleData,
                    auditStr
                        + "No SWVC Guid was provided or Guid has wrong length. Exiting!",
                    true);

            if (swvcGuidStr.length() == 32)
                // Perform some required (homemade) formatting
                swvcGuidStr =
                    swvcGuidStr.substring(0, 8)
                        + "-"
                        + swvcGuidStr.substring(8, 12)
                        + "-"
                        + swvcGuidStr.substring(12, 16)
                        + "-"
                        + swvcGuidStr.substring(16, 20)
                        + "-"
                        + swvcGuidStr.substring(20, 32);

            IGUID swcv = null;
            try {
                IGUIDGeneratorFactory guidGenFac =
                    GUIDGeneratorFactory.getInstance();
                IGUIDGenerator guidGen = guidGenFac.createGUIDGenerator();
                swcv = guidGen.parseGUID(swvcGuidStr);
            } catch (GUIDFormatException e2) {
                return this.handleReturn(
                    inputModuleData,
                    auditStr
                        + "Error while creting SWVC Guid (GUIDFormatException). Exiting!",
                    true);
            }

            // Get the mapping type (if none provided assume it's graphical mapping)
            String mappingType = mpget("mapping.type");
            if (mappingType == null)
                mappingType = "GRAPHICAL";
            else
                mappingType = mappingType.toUpperCase();
            if (!mappingType.equalsIgnoreCase("GRAPHICAL")
                && !mappingType.equalsIgnoreCase("JAVA")
                && !mappingType.equalsIgnoreCase("XSLT")) {
                return this.handleReturn(
                    inputModuleData,
                    auditStr
                        + "Wrong mapping type supplied "
                        + "(only GRAPHICAL | JAVA | XSLT are allowed). Exiting!",
                    true);
            }

            // Get mapping namespace (this is not vital, as the mapping search
            // is tolerant enough to look into the whole SWVC)    
            String ns = mpget("namespace");
            if (ns == null)
                ns = new String();

            // Get mapping name. Here I wanna be a good boy and allow people
            // to put it the natural way ;-)
            String mappingName = mpget("mapping.name");
            if (mappingName == null)
                return this.handleReturn(
                    inputModuleData,
                    auditStr + "No mapping name was provided. Exiting!",
                    true);

            if (mappingType.equalsIgnoreCase("GRAPHICAL")) {
                // Let's build the real class name
                mappingName = "com/sap/xi/tf/" + "_" + mappingName + "_";
                mappingType = "JAVA";
            }

            // Check which trace level was requested (default to warning)
            char trLevCh = '1';
            String trLevel = mpget("trace.level");
            if (trLevel != null) {
                if (trLevel.equalsIgnoreCase("INFO"))
                    trLevCh = '2';
                else if (trLevel.equalsIgnoreCase("DEBUG"))
                    trLevCh = '3';
                else if (trLevel.equalsIgnoreCase("WARNING"))
                    trLevCh = '1';
                else if (trLevel.equalsIgnoreCase("OFF"))
                    trLevCh = '0';
            }

            // Create the messenger object
            Messenger mes = MappingDataAccess.createMessenger(trLevCh);

            // Instantiate MappingData object
            MappingData md = null;
            try {
                md =
                    MappingDataAccess.createMappingData(
                        mappingType,
                        mappingName,
                        ns,
                        swcv,
                        -1,
                        "AFWJavaMap");
            } catch (Exception e) {
                return this.handleReturn(
                    inputModuleData,
                    auditStr + "Error instantiating MappingData: " + e);
            }

            // Put this MappingData object in an array (this emulates an
            // interface mapping with several mapping steps)   
            MappingData[] mds = new MappingData[1];
            mds[0] = md;

            // Build the map object
            Map map = null;
            try {
                // Here I just feed the messageId but with the same technique
                // other useful stuff can be put
                // (it strongly depends on which Header Fields you access
                // in your mapping!)
                map = new HashMap();
                map.put("MessageId", msg.getMessageId());

                // Here I need to use reflection to get a valid instance
                // of the trace object
                Object tr = null;
                Class c = mes.getClass();
                Field trf = c.getDeclaredField("trace");
                trf.setAccessible(true);
                tr = trf.get(mes);
                map.put("MappingTrace", tr);
            } catch (Exception e) {
                return this.handleReturn(
                    inputModuleData,
                    auditStr
                        + "Error during HashMap and MappingTrace creation - "
                        + e
                        + " - Exiting!");
            }

            // Create the MappingHandler object
            MappingHandler mh = null;
            try {
                mh =
                    new MappingHandler(
                        msg.getDocument().getContent(),
                        mds,
                        map,
                        mes);
            } catch (Exception e) {
                return this.handleReturn(
                    inputModuleData,
                    auditStr
                        + "Error during MappingHandler creation - "
                        + e
                        + " - Exiting!");
            }

            // Rock n' Roll: execute the mapping
            boolean mappingFailed = false;
            try {
                out = mh.run();
            } catch (Exception e1) {
                mappingFailed = true;
                // handleReturn is postponed to allow mapping trace
                // to be in the msg audit
            }

            // Trace management
            TraceList tl = mes.getTraceList();
            for (int i = 0; i < tl.size(); i++)
                Audit.addAuditLogEntry(
                    amk,
                    AuditLogStatus.SUCCESS,
                    "Mapping TRACE: "
                        + "("
                        + tl.getItem(i).getLevel().toString()
                        + ") "
                        + tl.getItem(i).getMessage());

            if (mappingFailed)
                return this.handleReturn(
                    inputModuleData,
                    auditStr + "Error during mapping execution - Exiting!");

            // Audit resulting document if requested
            String auditResult = mpget("audit.result.message");
            if (auditResult != null)
                Audit.addAuditLogEntry(
                    amk,
                    AuditLogStatus.SUCCESS,
                    auditStr + "Output document: " + new String(out));

            // New payload insertion
            try {
                XMLPayload newPayload = msg.getDocument();
                newPayload.setContent(out);
                msg.setDocument(newPayload);
                inputModuleData.setPrincipalData(msg);
            } catch (Exception e) {
                return this.handleReturn(
                    inputModuleData,
                    auditStr + "Error while inserting new payload.");
            }

            // Stats
            Date dend = new Date();
            Audit.addAuditLogEntry(
                amk,
                AuditLogStatus.SUCCESS,
                auditStr
                    + "Process completed - "
                    + "(execution "
                    + (dend.getTime() - dstart.getTime())
                    + " ms)");

            // Return
            return inputModuleData;

        }

        private ModuleData handleReturn(ModuleData md, String strmsg)
            throws ModuleException {
            Audit.addAuditLogEntry(amk, AuditLogStatus.ERROR, auditStr + strmsg);
            if (hangOnError) {
                // Audit anyway source document to let developers understand
                // what went wrong by doing some test/debug
                Audit.addAuditLogEntry(
                    amk,
                    AuditLogStatus.SUCCESS,
                    auditStr
                        + "Source document: "
                        + new String(msg.getDocument().getContent()));
                // TODO: This is a temporary workaround!
                // Put a dummy error XML message as output so that the message
                // won't have any application meaning
                Message lmsg = (Message) md.getPrincipalData();
                XMLPayload dummypl = lmsg.createXMLPayload();
                try {
                    dummypl.setContent(
                        new String(
                            "<?xml version=\"1.0\"?>"
                                + "<Error>Mapping failed in module</Error>")
                            .getBytes());
                    lmsg.setDocument(dummypl);
                } catch (PayloadFormatException e) {
                    // Ignored: if the dummy payload can't be set, the message
                    // keeps its original document
                } catch (InvalidParamException e) {
                    // Ignored, same as above
                }
            }

            return md;
        }

        // Overload for fatal parameter errors: abort module processing
        // altogether by throwing a ModuleException
        private ModuleData handleReturn(
            ModuleData md,
            String strmsg,
            boolean throwEx)
            throws ModuleException {
            throw new ModuleException(strmsg);
        }

        // Shortcut: read a module parameter from the module context
        private String mpget(String pname) {
            return (String) mc.getContextData(pname);
        }

        public void ejbRemove() {
        }

        public void ejbActivate() {
        }

        public void ejbPassivate() {
        }

        public void setSessionContext(SessionContext context) {
            myContext = context;
        }

        private SessionContext myContext;
        /**
         * Create Method.
         */
        public void ejbCreate() throws CreateException {

        }

    }

     

    ------------------------------------------------------------------------------------------------------------------------------------

    Usage

    The module supports these 8 parameters:

    | Parameter Name | Mandatory | Values | Description |
    |----------------|-----------|--------|-------------|
    | swcv.guid | Yes | Any valid SWCV GUID | The GUID of the Software Component Version in which the mapping resides. |
    | mapping.name | Yes | Any valid mapping name | The name of the mapping program. For a graphical mapping it's just the name of the mapping you can see in the repository. For a Java mapping, the name must be fully qualified with its package (e.g. com.yourcompany.mapping), visible in the relevant imported archive. For an XSLT mapping, the name of the XSL file, visible in the relevant imported archive. |
    | mapping.type | No | GRAPHICAL (default) / JAVA / XSLT | Determines the type of mapping. |
    | namespace | No | Any valid namespace | The namespace in which the mapping resides. I know it sounds strange, but this is not mandatory because the mapping lookup function is tolerant enough to search the whole SWCV. Of course, giving the right namespace improves performance. |
    | trace.level | No | OFF / WARNING (default) / INFO / DEBUG | Determines the mapping trace level written to the message audit log. |
    | hang.on.error | No | Yes (default) / No | Determines whether message processing stops if something fails. This still needs additional work: currently, upon error, a dummy XML document is put into the message in place of the real one. |
    | audit.source.message | No | Any value | Deactivated by default. If set, the source document is written to the audit log. Useful for sender channels. |
    | audit.result.message | No | Any value | Deactivated by default. If set, the result document is written to the audit log. Useful for receiver channels. |

    I think the above table is eloquent enough to let you guess how powerful this guy is...

    image
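    To make the table concrete, here is a sketch of how the module could be plugged into a receiver communication channel's Module tab. The JNDI name, module key and all parameter values below are purely illustrative and depend on how you deploy and name the EJB:

        Processing Sequence:
        1   localejbs/yourcompany/MappingModule   Local Enterprise Bean   map
        2   (the adapter's own standard module(s), left as generated)

        Module Configuration (module key = map):
        map   swcv.guid              <your SWCV's Object ID - see the hint below>
        map   mapping.name           MM_Order_to_Flat
        map   mapping.type           GRAPHICAL
        map   trace.level            INFO
        map   audit.result.message   1

    With such a configuration the module would run the (hypothetical) graphical mapping MM_Order_to_Flat before the adapter itself, writing the mapping trace and the resulting document to the message audit log.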

    Just one hint, maybe obvious: to find out the Software Component Version GUID, in the Integration Repository just double-click the SWCV in the left tree, let it open in the right pane, then go to the "Software Component Version" menu and choose "Properties...". The GUID is the "Object ID".

    Conclusion

    I consider this development o' mine a beta version, though perfectly workin', which I will refine with the help of all you SDN'ers out there.
    I also expect someone to write a weblog on the implications and use cases that this module can have in XI projects.
    Finally, I must be honest: it was a painful one ;-) ... But in the end greatly satisfying!
    Special thanks go to my wife Nunzia for puttin' up with my frustration while I was strugglin' to finalize it, and to the great Pietro Marchesani for his support and for our chat about Java Reflection one drunken night in Athens.
