Hashed File Parameterisation
The Configuration File
The hashedFiles section of the S2PX configuration file allows you to set defaults for the Parallel DRS Connector stage used to replace Server Hashed Files. Here's an example:
hashedFiles: # Special settings for handling the translation of Server Hashed Files
  type: ORACLE # The type setting to be used for the generated DRS Connector stages. Valid values are DB2, ORACLE or ODBC
  variant: 11 # the DRS variant setting
  connection: MCIDEMO # the DRS connection name
  username: MCITESTDEV # the DRS database username (this can also be a job parameter name if required)
  password: '{iisenc}CeGmkL8fluDU7OVw==' # the DRS database password (this can also be a job parameter name if required)
  schema: myuser # (OPTIONAL) Prefix all Hashed File table names with the schema (e.g. myuser.tablename)
Note that only a single Parameter Set called SPXHashedFiles is required, as all Hashed Files are assumed to be managed in the same underlying DRS-compatible database.
S2PX takes these properties and generates a Parameter Set called SPXHashedFiles which contains each of these properties as a parameter, with the default value for each parameter taken from the values specified in the configuration file. S2PX also creates a single values file called default whose values are likewise taken from the configuration file.
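As an illustration only, assuming the generated parameter names match the configuration keys and that the values file holds one name=value pair per line, the default values file produced from the example configuration above would look something like this:

type=ORACLE
variant=11
connection=MCIDEMO
username=MCITESTDEV
password={iisenc}CeGmkL8fluDU7OVw==
schema=myuser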
Generated Jobs using Hashed Files will have the SPXHashedFiles Parameter Set added as a Job parameter.
Where a Server Job has been decomposed into multiple Parallel Jobs, S2PX will generate a co-ordinating Job Sequence with the same name and parameters as the original Server Job. These Sequences will also pass the SPXHashedFiles Parameter Set to each of their Job Activities with the Value File default. The reason for this is that if S2PX didn't specify the Value File and relied on the Parameter Set default values, changing those values would require recompiling every Job which uses the Parameter Set. Using a single Value File centralises the parameters in one location which can easily be altered without recompiling all Jobs that use Hashed Files.
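For example, repointing every converted Job at a different database only requires editing the default values file; a minimal sketch, assuming the name=value layout shown above and hypothetical MCIPROD connection details:

connection=MCIPROD
username=MCIPRODUSER

The remaining parameters are left as generated, and the change takes effect the next time the Jobs are run with that Value File, with no recompilation.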
Job Decomposition and Sequences
As mentioned, Server Jobs decomposed into multiple Parallel Jobs are co-ordinated by a Job Sequence with the same name and parameters as the original Server Job. This Sequence will not have the SPXHashedFiles Parameter Set as one of its parameters, as this would change the signature of the Sequence (which is intended as a drop-in replacement for the original Server Job), thereby rendering it incompatible with any other Job Sequences or external scheduling mechanisms which invoke it. Within this Sequence, each Job Activity is hard-coded to use the default Value File.
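As a hedged illustration using hypothetical parameter names, the relationship between the original Job, the generated Sequence, and its Job Activities might look like this:

Original Server Job parameters:     SourceDir, RunDate
Generated Sequence parameters:      SourceDir, RunDate   (identical signature, no SPXHashedFiles)
Each Job Activity in the Sequence:  SourceDir and RunDate passed through; SPXHashedFiles bound to the default Value File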
Schema Parameter
An additional optional property called schema can be specified to add a schema parameter to the SPXHashedFiles Parameter Set. When supplied, this value is used as a prefix for all Hashed File table names. The . delimiter is added automatically between the schema and the table name and does not need to be specified in the configuration file. If this value is omitted, no schema parameter is added to the SPXHashedFiles Parameter Set.
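For example, assuming a Hashed File replaced by a table named CUSTOMER_LOOKUP (a hypothetical name), the two behaviours would be:

schema: myuser   ->  table reference becomes myuser.CUSTOMER_LOOKUP
schema omitted   ->  table reference remains CUSTOMER_LOOKUP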