Monday, March 13, 2017

Daylight saving : Prod outage : server shut down / FAILED



If any of you faced the issue of the server getting stopped automatically yesterday (Sunday, March 12, 3:00 AM), it could be due to the daylight saving time change. Error log as below:

soa_server1.log:####<Mar 12, 2017 3:00:02 AM CDT> <Critical> <Health> <xxxx> 
<soa_server1> <[ACTIVE] ExecuteThread: '35' for queue: 'weblogic.kernel.Default 
(self-tuning)'> <<WLS Kernel>> <> <xxxxssdd:15a91b0a340:-7ffd-0000000018e09e53> 
<zzzzzz> <BEA-310006> <Critical Subsystem JTAMigratableGroup has failed. 
Setting server state to FAILED.>

The leasing table was not updated with the proper time, and any gap of more than 1 hour marks the corresponding server as dead. This caused SOA_SERVER1 to shut itself down.

This symptom is described in Doc ID 1590774.1.

Solution: Patch 17033308 is available for this issue.

Thursday, March 9, 2017

SOA Performance Tip : Reduce the MDS size !!!!



Generally, when heap usage is high and you are getting back-to-back full GCs in Prod, it is really difficult to handle.
Digging into the details, MDS is one of the components that is very heavy and is loaded into memory up front. As a best practice, all XSDs are stored in MDS, and they are generally large. These XSDs often carry a lot of inline documentation, which can take up more space than the actual definitions. Of course, that documentation helps with understanding :) :) :)

Tip : See if this documentation can be removed to reduce the size of the XSDs. This can be done easily with a very powerful tool/utility in Linux: the sed command. Sed is the ultimate stream editor command that can do wonders.
Let's take such an XSD as an example and see how sed can be used to remove the documentation/enumeration lines from the file:

sed -i '/xsd:documentation/d' sample.xsd
sed -i '/xsd:enumeration/d' sample.xsd

Note that the '/pattern/d' form deletes whole lines, so it only works when each element sits on a single line; a multi-line xsd:documentation element needs a range delete instead:

sed -i '/<xsd:documentation>/,/<\/xsd:documentation>/d' sample.xsd

Also keep in mind that removing xsd:enumeration facets changes the schema's validation behaviour, so use that one with care.

Here you go !!! Check the file size to see it reduced after removing the documentation/enumerations.
Save the modified file and import it back into MDS.
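If sed is not at hand, or the elements span multiple lines, the same cleanup can be sketched in Java with the standard DOM API. This is only an illustration of the technique above, not part of the original tip; the class name XsdSlimmer is made up:

```java
import java.io.ByteArrayInputStream;
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class XsdSlimmer {

    // Removes every element with the given local name in the XML Schema namespace.
    static void removeAll(Document doc, String localName) {
        NodeList nodes = doc.getElementsByTagNameNS(
                "http://www.w3.org/2001/XMLSchema", localName);
        // iterate backwards: the NodeList is live and shrinks on removal
        for (int i = nodes.getLength() - 1; i >= 0; i--) {
            Node n = nodes.item(i);
            n.getParentNode().removeChild(n);
        }
    }

    // Strips xsd:documentation (and optionally xsd:enumeration) from an XSD string.
    public static String slim(String xsd, boolean dropEnumerations) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document doc = dbf.newDocumentBuilder()
                          .parse(new ByteArrayInputStream(xsd.getBytes("UTF-8")));
        removeAll(doc, "documentation");
        if (dropEnumerations) {
            removeAll(doc, "enumeration"); // caution: changes validation behaviour
        }
        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                          .transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }
}
```

Unlike the line-oriented sed approach, this works regardless of how the XSD is formatted.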

Oracle B2B : Performance tips – 2


This tip is again related to the XEngine !!!
Oracle B2B is a multi-protocol engine catering to message exchange with trading partners. The widely used standards are X12, EDIFACT, and HL7. HL7 is also used by another product, SSHI (SOA Suite for Healthcare Integration). All of these (X12, EDIFACT, HL7) use the XEngine to translate messages from native format to XML and vice versa.
Along with the document definition (ecs file), the XEngine also uses a parser schema for the translation. These parser schemas are listed in one of the configuration files in the XEngine installation location (Oracle_Home\soa\soa\thirdparty\edifecs\XEngine\config\XERegistry.xml). Since parser schemas for several standards are present in this file, the parser goes through them sequentially to find the correct one for every message !!!
Tip : Keep the required parser schemas at the top of the list, so that the engine finds them in the first few iterations and proceeds, which improves processing time. This means that if we are using the HL7 standard, we place the HL7-specific parser schemas at the top of the list, as in the snippet below
<Registry Name="XERegistry">
 <Product Name="Parser">
  <Category Name="ParserSchema">
   <Item Name="SchemaFile">${XERoot}/config/schema/HL7_parser_schema_FHS-NV-2.6.ecs</Item>
   <Item Name="SchemaFile">${XERoot}/config/schema/HL7_parser_schema_MSH-NV-2.6.ecs</Item>
   <Item Name="SchemaFile">${XERoot}/config/schema/HL7_parser_schema_BHS-NV-2.6.ecs</Item>
   ..........


A couple of important notes:
• The server needs to be restarted after this change
• This is not documented by Oracle. Hence, please get acceptance by raising an SR, in order to keep the configuration supported by Oracle
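To verify the resulting order without eyeballing the file, a small DOM walk can print the SchemaFile entries in the order the XEngine will encounter them. A hypothetical sketch (the class name is made up; point the input stream at the XERegistry.xml path from the tip):

```java
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class XERegistryOrder {

    // Returns the SchemaFile entries in document order, i.e. the order
    // in which the parser will try them for each message.
    public static List<String> schemaFiles(java.io.InputStream in) throws Exception {
        List<String> files = new ArrayList<>();
        NodeList items = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(in)
                .getElementsByTagName("Item");
        for (int i = 0; i < items.getLength(); i++) {
            Element item = (Element) items.item(i);
            if ("SchemaFile".equals(item.getAttribute("Name"))) {
                files.add(item.getTextContent().trim());
            }
        }
        return files;
    }
}
```

If the first few entries printed are the HL7 ones, the reordering took effect (after the restart noted above).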

Saturday, February 18, 2017

Oracle B2B : Performance tips - 1


Oracle B2B is a multi-protocol engine catering to message exchange with trading partners. The widely used standards are X12, EDIFACT, and HL7. HL7 is also used by another product, SSHI (SOA Suite for Healthcare Integration). All of these (X12, EDIFACT, HL7) use the XEngine to translate messages from native format to XML and vice versa.

Consider a use case where any of the document standards involved (X12, EDIFACT, HL7) has more than 30 definitions (ecs / XSD) used in the enterprise. This slowly impacts message processing performance, because the ecs definitions are read from MDS for each message exchange. However, the XEngine has a property to cache ecs definitions, where we can specify the number of definitions to cache. It is present in one of the configuration files in the XEngine installation location (Oracle_Home\soa\soa\thirdparty\edifecs\XEngine\config). Change it to the number of unique ecs files in use.

<Category Name="Settings">
        <Item Name="CacheSize">0</Item>
        .............   
</Category>


A couple of important notes:

  • The server needs to be restarted after this change
  • This is not documented by Oracle. Hence, please get acceptance by raising an SR, in order to keep the configuration supported by Oracle

Thursday, January 26, 2017

Performance Improvement Tip for a BPEL project – Continued [IMPORTANT]


Having covered a few tips in the previous post, there are still some tables that continue to persist data and can cause concerns. Instance data occupies space in the Oracle BPEL Process Manager schema tables, and data growth from auditing and dehydration can have a significant impact on database performance and throughput.

This occurs mainly when we have async BPEL processes. The very reason BPEL persists information such as references, instance state, and metadata is to be able to handle faults and recover instances.

A few of the tables are:
document_dlv_msg_ref : Stores references to dlv_message documents stored in the xml_document table.
xml_document : Stores all large objects in the system (for example, dlv_message documents). This table stores the data as binary large objects (BLOBs). Separating the document storage from the metadata enables the metadata to change frequently without being impacted by the size of the documents.
dlv_message : Stores incoming (invocation) and callback messages upon receipt. This table only stores the metadata for a message (for example, current state, process identifier, and receive date).
With this, there is a high chance of data growth in these tables, resulting in tablespace or disk space issues.

Ways to handle this would be:
  • Incorporate a proper purge strategy (preferred)
  • Consider revisiting the retention policy
  • Increase disk space (last option)
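Before picking one of these options, it helps to know how fast the tables actually grow. A minimal sketch that builds row-count queries for the tables named above, to be run periodically and charted; the DEV_SOAINFRA schema prefix and the class name are assumptions for illustration:

```java
import java.util.List;

public class DehydrationGrowthCheck {

    // Dehydration tables named in the post that grow with async BPEL traffic.
    static final List<String> TABLES =
            List.of("DLV_MESSAGE", "DOCUMENT_DLV_MSG_REF", "XML_DOCUMENT");

    // Builds a row-count query for one table; run these on a schedule
    // (via JDBC or SQL*Plus) and watch the deltas between runs.
    public static String countQuery(String schema, String table) {
        return "SELECT COUNT(*) FROM " + schema + "." + table;
    }

    public static void main(String[] args) {
        for (String t : TABLES) {
            System.out.println(countQuery("DEV_SOAINFRA", t));
        }
    }
}
```

The growth rate between two samples tells you whether a purge schedule alone will keep up or the retention policy also has to change.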

Thursday, December 29, 2016

Performance Improvement Tip for a BPEL project



Dehydration of runtime instances is a major bottleneck in any performance improvement discussion. Depending on the product in use, various runtime tables are involved in persisting the runtime data. This results in huge volumes of data in these tables, causing
  • I/O issues
  • Latency in retrieving records
  • Disk space issues, and many more
One of the ways is to enable the “inMemoryOptimization” property, a BPEL property that can be set to true.
With this set to true, another property, “completionPersistPolicy”, determines the persistence behaviour.
completionPersistPolicy accepts on (default) / deferred / faulted / off. Among these, setting it to “faulted” persists only faulted instances, which can then be used for recovery and related tasks.

<property name="bpel.config.inMemoryOptimization">true</property>
<property name="bpel.config.completionPersistPolicy">faulted</property>

Saturday, December 3, 2016

CSV to DVM conversion



There are situations where we get documents as CSV (comma-separated values) data to be used in a SOA integration, especially in XSLT transformations. In these cases the data needs to be converted to a DVM (Domain Value Map) to look up the various values for a given key.

I managed to develop a Java program, and you can use it for your requirements. In this example the input file is "C:\\Work\\sample.csv" and the output will be "C:\\Work\\output.dvm".

Please use this appropriately.

import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.logging.Level;
import java.util.logging.Logger;


public class DVMCreator {
    private final static Logger logger =
        Logger.getLogger(DVMCreator.class.getName());

    public DVMCreator() {
        super();
    }


    private String inputFile;

    /**
     * @param inputFile
     */
    public void setInputFile(String inputFile) {
        this.inputFile = inputFile;
    }


    /**
     * Reads the CSV file into a list of rows, each row a list of cell values.
     *
     * @return sheetData
     * @throws Exception
     */
    public List<List<String>> getSheetData() throws Exception {
        List<List<String>> sheetData = new ArrayList<>();

        // try-with-resources closes the reader even on failure; the original
        // version returned from a finally block, which swallowed exceptions.
        try (BufferedReader br = new BufferedReader(
                new InputStreamReader(new FileInputStream(inputFile)))) {
            String line;
            while ((line = br.readLine()) != null) {
                // split on commas, trimming surrounding whitespace
                sheetData.add(new ArrayList<>(Arrays.asList(line.split("\\s*,\\s*"))));
            }
        }
        return sheetData;
    }


    private static String convertExcelToDVMData(List sheetData, String dvmName) {
        StringBuilder xmlData = new StringBuilder();

        // strip the ".dvm" extension; the dot must be escaped because
        // replaceAll() treats its first argument as a regular expression
        dvmName = dvmName.replaceAll("\\.dvm$", "");

        xmlData.append("<?xml version=\"1.0\" encoding=\"UTF-8\" ?>");
        xmlData.append("<dvm name=\"").append(dvmName)
               .append("\" xmlns=\"http://xmlns.oracle.com/dvm\">\n");
        xmlData.append("<description></description>\n");

        // the first CSV row supplies the column names
        xmlData.append("<columns>\n");
        List columnNameList = (List) sheetData.get(0);
        for (int j = 0; j < columnNameList.size(); j++) {
            xmlData.append("<column name=\"").append(columnNameList.get(j)).append("\"/>\n");
        }
        xmlData.append("</columns>\n");

        // the remaining rows become DVM rows, one <cell> per column
        xmlData.append("<rows>\n");
        for (int i = 1; i < sheetData.size(); i++) {
            xmlData.append("<row>\n");
            List list = (List) sheetData.get(i);
            for (int j = 0; j < list.size(); j++) {
                xmlData.append("\t<cell>").append(list.get(j)).append("</cell>\n");
            }
            xmlData.append("</row>\n");
        }
        xmlData.append("</rows>");
        xmlData.append("</dvm>");

        return xmlData.toString();
    }
  

    public static boolean writeDVM(String xmlData, File file) throws Exception {
        // try-with-resources ensures the stream is closed even on failure
        try (FileOutputStream fOut = new FileOutputStream(file)) {
            fOut.write(xmlData.getBytes());
            fOut.flush();
            logger.log(Level.INFO, "File created: " + file.getName());
        } catch (Exception e) {
            logger.log(Level.SEVERE, "writeDVM() failed: " + e.getMessage());
            throw e;
        }
        return true;
    }
  
  
    /**
     * main method
     *
     * @param args not used; the input and output paths are hard-coded below.
     * @throws IOException  When there is an error processing the file.
     */
    public static void main(String[] args) throws IOException {
        DVMCreator reader = new DVMCreator();
        reader.setInputFile("C:\\Work\\sample.csv");
        File file = new File("C:\\Work\\output.dvm");

        try {
            List data = reader.getSheetData();
            String xmlData = convertExcelToDVMData(data, file.getName());
            writeDVM(xmlData, file);
        } catch (Exception e) {
            logger.log(Level.SEVERE, "Error occurred: " + e.getMessage());
            e.printStackTrace();
        }
        logger.log(Level.INFO, "DVM write done.");
    }
}
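For illustration, a hypothetical sample.csv such as:

```
Country,Code
India,IN
France,FR
```

would produce an output.dvm roughly along these lines (whitespace aside):

```xml
<?xml version="1.0" encoding="UTF-8" ?><dvm name="output" xmlns="http://xmlns.oracle.com/dvm">
<description></description>
<columns>
<column name="Country"/>
<column name="Code"/>
</columns>
<rows>
<row>
	<cell>India</cell>
	<cell>IN</cell>
</row>
<row>
	<cell>France</cell>
	<cell>FR</cell>
</row>
</rows></dvm>
```

The header row becomes the DVM columns, and every subsequent row becomes a row of cells, which is exactly the shape the XSLT dvm:lookupValue functions expect.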