
The Configuration File

The hashedFiles section of the S2PX configuration file allows you to set defaults for the Parallel DRS stage used to replace Server Hashed Files. Here’s an example:

hashedFiles:                               # Special settings for handling the translation of Server Hashed Files
  type: ORACLE                             # The type setting used for the generated DRS Connector stages. Valid values are DB2, ORACLE or ODBC
  variant: 11                              # The DRS variant setting
  connection: MCIDEMO                      # The DRS connection name
  username: MCITESTDEV                     # The DRS database username (this can also be a job parameter name if required)
  password: '{iisenc}CeGmkL8fluDU7OVw=='   # The DRS database password (this can also be a job parameter name if required)
  schema: myuser                           # (OPTIONAL) Prefix all Hashed File table names with the schema (e.g. myuser.tablename)

Note that only a single Parameter Set, called SPXHashedFiles, is required, as all Hashed Files are assumed to be managed in the same underlying DRS-compatible database.

S2PX takes these properties and generates a Parameter Set called SPXHashedFiles which contains each of these properties as a parameter, with the default value for each parameter taken from the values specified in the configuration file. S2PX also creates a single value file called default whose values are likewise taken from the configuration file.
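As an illustration (the exact on-disk layout of a Parameter Set value file is internal to DataStage, so treat this as a sketch), the default value file generated from the example configuration above would hold one name=value pair per parameter, mirroring the configuration file:

  type=ORACLE
  variant=11
  connection=MCIDEMO
  username=MCITESTDEV
  password={iisenc}CeGmkL8fluDU7OVw==
  schema=myuser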

Generated Jobs expect the Parameter Set to be passed

The top-level Sequence also passes the SPXHashedFiles Parameter Set to each Job Activity with the value file default.

Reason: if no Value File is specified, the Parameter Set default values are used, which means that changing those parameter values later requires re-compiling every Job which uses that Parameter Set. Using a single value file centralises the parameters in one location which can be altered without requiring a change to your compiled Jobs.
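As a sketch of how this works operationally: because the value file is selected by name, a run can also be pointed at a value file from the command line. Assuming the standard dsjob convention for Parameter Sets (the value file name is passed as the value of the Parameter Set parameter; project and job names here are placeholders), an invocation might look like:

  dsjob -run -param SPXHashedFiles=default <project> <job>

Editing the default value file on the engine tier then changes the database settings for every Job that references it, without recompiling anything.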

Job Decomposition and Sequences

Where a Server Job has been decomposed into multiple Parallel Jobs, S2PX will generate a co-ordinating Job Sequence with the same name and parameters as the original Server Job. Where these Jobs use Hashed Files, the Sequence will not feature an instance of the SPXHashedFiles Parameter Set as one of its parameters, as this would change the signature of the Sequence, rendering it incompatible with any other Job Sequences or external scheduling mechanisms which call it. In this case each Job Activity is hard-coded to use the default Value File.

Schema Parameter

An optional schema property can be specified in the hashedFiles section to add a schema parameter to the SPXHashedFiles Parameter Set. When supplied, this value is used as a prefix for all Hashed File table names. The . delimiter is added automatically and does not need to be specified in the configuration file. If this value is omitted, no schema parameter is added to the SPXHashedFiles Parameter Set.
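For example, with schema: myuser configured as above, a generated DRS Connector stage would reference a Hashed File table as follows (the table name itself is purely illustrative):

  myuser.MYHASHEDFILE     (schema supplied)
  MYHASHEDFILE            (schema omitted)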
