STROBing a transaction

In the last few weeks, we were challenged with a system that was supposed to process orders but was not processing them in a timely fashion. TMON indicated that the process was sitting in an IC wait, suggesting that the delay was intentional. However, the concern was that the process was waiting on some unknown resource such as the database. The process consisted of a CICS transaction that called a number of COBOL programs and accessed an ADABAS database.

In an attempt to identify the actual line of code that was causing the delay, we turned on STROBE. Since this is a CICS transaction, you have to run an “ADD ACTIVE” measurement, as the CICS region is an executing job. Set the job name to the CICS region, the session duration to 10 minutes, and the target sample size to 20,000 (change these numbers as required for your context). In addition, select the “Data Collectors” option with a “Y”. On the Data Collectors panel, select the CICS+ option with a “Y”.

On the CICS Data Collector Options, select “Collect Region Data” with a “Y” and provide the name of the transaction in the “Detail by Transaction” section of the panel.

At this point, the measurement should be active. Monitor the measurement for completion and then generate a profile in the foreground. You can do this in batch, but it is a relatively trivial operation, so it can be performed quickly in the foreground.

After the profile was generated, the transaction summary (#TUS) report indicated that 599 seconds of the measurement were attributable to one transaction. Further, all of this time was attributed to a single program. STROBE was able to provide us with the actual CICS command “EXEC CICS DELAY” and the offset into the compile listing. We were able to take this offset, look into the compile listing, and identify that the line of code generating the delay was an

EXEC CICS
     DELAY FOR SECONDS(number_of_seconds)
END-EXEC

This proved that the delay in the process was intentional and not the result of a database or other resource contention.

Pro*C – PCC-F-02104, Unable to connect to Oracle

If you are attempting to compile Pro*C code and get the error “PCC-F-02104, Unable to connect to Oracle”, perform the following steps:

Instead of connecting to the Oracle database as

sqlplus userid/password@somelistener.domain.com:1521/MYPRODDB

and then attempting to compile, edit your tnsnames.ora to add the entry below (change it appropriately to reflect your database SID or service name, host, port, etc.)

MY_DB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = SOMELISTENER.DOMAIN.COM)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SID = MYPRODDB)
    )
  )

Export the TWO_TASK environment variable to point at this entry as follows

export TWO_TASK=MY_DB

And now connect as

sqlplus userid/password

i.e. without specifying the usual connect string.
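
Once connected, it is worth confirming that TWO_TASK routed you to the intended database. A quick check (not part of the original steps, using the standard sys_context function) is:

select sys_context('USERENV', 'DB_NAME') from dual;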

In case you are wondering about the name “TWO_TASK”, Tom Kyte explains it best:
“because in days gone by – the 1980’s – two tasks (having two tasks working together) just made ‘sense’”.

Copying a file from a Linux on-prem server to an AWS RDS Oracle instance

Today we had a requirement to copy a file from a Linux on-prem server to an AWS RDS Oracle instance. We did this using a DB link and the utl_file package. For the rest of this post, “ONP” will refer to the on-prem instance, and “ADS” will refer to the AWS RDS Oracle instance.

On the AWS RDS Oracle instance, create a DB link to the on-prem instance

create database link AWS_TO_ON_PREM CONNECT TO ONPUSER
IDENTIFIED BY SECRET-PASSWORD
USING '(DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = ONPREM.HOST.COM)(PORT = 1776))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = ONPDB)
    )
  )'
   ;
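
Before creating any directories, it can be worth confirming that the link resolves and authenticates. A minimal sanity check, assuming the link name above, is a simple query over the link:

select * from dual@AWS_TO_ON_PREM;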

On the AWS RDS Oracle instance, create a directory to store the file

exec rdsadmin.rdsadmin_util.create_directory('ADS_DIRECTORY');

Check that the directory was created

Select * from dba_directories;

Because AWS manages the path internally, the above SQL will show the actual directory path that RDS assigned.
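
If you only want to see the new directory and the path RDS assigned to it, you can narrow the query, for example:

select directory_name, directory_path
from   dba_directories
where  directory_name = 'ADS_DIRECTORY';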

On the on-prem instance, create a directory pointing to the location of the file that you want to copy to AWS

CREATE OR REPLACE DIRECTORY
FOR_AWS AS '/onprem/host/file/location';

GRANT READ, WRITE ON DIRECTORY FOR_AWS TO ONPUSER;

Now to copy the file from the on-prem instance to the AWS instance. This is accomplished via the PL/SQL below, which is to be executed while connected to the AWS RDS Oracle instance. In our case, the data was binary (raw) data.

declare
  ONP    utl_file.file_type@AWS_TO_ON_PREM;
  ADS    utl_file.file_type;
  ldata  raw(32767);
begin
  -- open the source file on the on-prem instance (through the DB link) for reading
  -- and the target file on the AWS RDS instance for writing, both in byte mode
  ONP := utl_file.fopen@AWS_TO_ON_PREM(location=>'FOR_AWS', filename=>'file_for_aws.bin', open_mode=>'rb');
  ADS := utl_file.fopen(location=>'ADS_DIRECTORY', filename=>'file_for_aws.bin', open_mode=>'wb');
  begin
    loop
      begin
        -- read up to 32K of raw data from the on-prem file and append it to the RDS file
        utl_file.get_raw@AWS_TO_ON_PREM(ONP, ldata, 32767);
        utl_file.put_raw(ADS, ldata, true);
      exception
        when no_data_found then
          exit;  -- end of file reached
      end;
    end loop;
  end;
  utl_file.fclose@AWS_TO_ON_PREM(ONP);
  utl_file.fclose(ADS);
exception
  when others then
    -- close both files before re-raising the error
    utl_file.fclose@AWS_TO_ON_PREM(ONP);
    utl_file.fclose(ADS);
    raise;
end;
/

After the above is complete, check that the file size in AWS matches the file size from on-prem with this SQL

select    filesize, mtime 
from      table(rdsadmin.rds_file_util.listdir('ADS_DIRECTORY')) 
where     filename='file_for_aws.bin'
;

If you need to delete a file in AWS, you can use this command

exec utl_file.fremove('directory-name','file-name');
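
Similarly, once the file has been removed, the directory itself can be dropped with the companion rdsadmin procedure (a sketch; adjust the directory name to yours):

exec rdsadmin.rdsadmin_util.drop_directory(p_directory_name => 'ADS_DIRECTORY');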