Executing S2PX Conversion
S2PX's Analysis and Conversion functions are only intended for execution against ISX files (for Analysis) and DSX files (for Conversion) which contain only DataStage Jobs (and associated assets) that compile successfully.
Once the S2PX conversion utility has been set up, it can be executed using the MettleCI Command Line Interface:
$> mettleci s2px convert help
MettleCI Command Line (build 161)
(C) 2018-2022 Data Migrators Pty Ltd
Was passed main parameter 'help' but no main parameter was defined
Usage: s2px convert [options]
  Options:
  * -config
      Path to server to parallel configuration file
  * -source-dsx
      Source DSX export to be converted from server to parallel
  * -target-dsx
      Target DSX that will contain all converted jobs
Command failed.
$> mettleci s2px convert \
     -source-dsx "MyServerJobs.dsx" \
     -target-dsx "MyParallelJobs.dsx" \
     -config s2px_config.yaml
MettleCI Command Line (build 161)
(C) 2018-2022 Data Migrators Pty Ltd
Loading configuration
Preprocessing 'MyServerJobs.dsx'
Converting jobs
Decomposing MyParallelJobs
Translating MyParallelJobs
Done.
As specified in the command line, a file MyParallelJobs.dsx was created containing the S2PX-generated Parallel Jobs. The re-naming scheme used by the s2px convert command is specified in the transfer section of the S2PX configuration file. If you select the RENAME_PARALLEL_JOB transfer strategy then the output file can be imported into the original DataStage project from which your Server Jobs were exported, without overwriting your original Server Jobs.
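For example, using a hypothetical Server Job name (not one from this document), the RENAME_PARALLEL_JOB strategy with a suffix of Px allows the converted output to sit alongside the original in the same project:

LoadCustomers     (original Server Job, left untouched)
LoadCustomersPx   (S2PX-generated Parallel Job, safe to import into the same project)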
As an example of the various outputs generated by running the conversion command, consider the following scenario, which uses an S2PX configuration file with these settings:
transfer:                                          # General S2PX settings
  strategy: RENAME_PARALLEL_JOB                    # The job naming strategy. Valid values are RENAME_PARALLEL_JOB or BACKUP_SERVER_JOB
  suffix: Px                                       # The suffix appended to the name of the job to be converted prior to decomposition
hashedFiles:                                       # Special settings for handling the translation of Server Hashed Files
  type: ORACLE                                     # The type setting used for the generated DRS Connector stages. Valid values are DB2, ORACLE or ODBC
  variant: 11                                      # The DRS variant setting
  connection: MCIDEMO                              # The DRS connection name
  username: MCITESTDEV                             # The DRS database username (this can also be a job parameter name, if required)
  password: '{iisenc}CeGm9U7OVw=='                 # The DRS database password (this can also be a job parameter name, if required)
decomposition:
  bufferDirectory: /data/project/s2px/transient    # A directory for the storage of temporary information during conversion
  mode: OPTIMIZE_RUNTIME                           # (OPTIONAL) Optimise runtime or minimise the number of generated jobs. Valid values are OPTIMIZE_RUNTIME and MINIMIZE_JOBS
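The '{iisenc}' prefix denotes a value encrypted by IBM Information Server. As a hedged sketch, assuming a default Linux engine-tier installation path (the script location and prompts may differ in your environment), such a value can be generated with the Information Server encrypt utility:

$> /opt/IBM/InformationServer/ASBNode/bin/encrypt.sh
Enter the text to encrypt:
Enter the text again to confirm:
{iisenc}<encrypted-value>

Paste the resulting string, including the {iisenc} prefix, into the password setting shown above.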
Running the conversion command …
$> mettleci s2px convert \
     -config plugins/s2px_config.yaml \
     -source-dsx assets/S2P/DSX/Project.dsx \
     -target-dsx assets/S2P/DSX/Project_Converted.dsx
MettleCI Command Line (build 161)
(C) 2018-2022 Data Migrators Pty Ltd
Loading configuration
Preprocessing 'assets/S2P/DSX/Project.dsx'
Converting shared containers
Decomposing ErmChildOberParentPx
Translating ErmChildOberParentPx
Decomposing ErmittelnGSPx
Translating ErmittelnGSPxP01
Translating ErmittelnGSPxP02
Translating ErmittelnGSPxP03
Decomposing ErmittelnNamePx
Translating ErmittelnNamePxP01
Translating ErmittelnNamePxP02
...
Decomposing ReadDSJobPx
Translating ReadDSJobPx
Converting jobs
Decomposing ACT_ST_9109_SET_INITIAL_XML_STRUCTUREPx
Translating ACT_ST_9109_SET_INITIAL_XML_STRUCTUREPx
Decomposing ALT_OkavoKUBA_DB_4KB_PRPx
Translating ALT_OkavoKUBA_DB_4KB_PRPx
...
Decomposing ZHUK24_HCK_SAS_Tabellen_aufbereitenPx
Translating ZHUK24_HCK_SAS_Tabellen_aufbereitenPx
Decomposing ZVertissPKVTeil2Ab201909_wegwerfenPx
Translating ZVertissPKVTeil2Ab201909_wegwerfenPxP01
Translating ZVertissPKVTeil2Ab201909_wegwerfenPxP02
Translating ZVertissPKVTeil2Ab201909_wegwerfenPxP03
Translating ZVertissPKVTeil2Ab201909_wegwerfenPxP05
Translating ZVertissPKVTeil2Ab201909_wegwerfenPxP04
Translating ZVertissPKVTeil2Ab201909_wegwerfenPxP06
Done.
… we see that the following actions have taken place (in the order shown):
Server Shared Containers are decomposed into a set of equivalent Parallel Shared Containers
Server Jobs are then decomposed into a set of equivalent Parallel Jobs
Note that the generated Parallel Jobs have adopted the specified suffix, followed by “P” and a two-digit number where the Server Job required decomposition
Where no decomposition is required, the job ending in Px is a Parallel Job. Where decomposition has occurred, the job ending in Px is a Sequence Job, and the similarly-named jobs ending in P01, P02, etc. are the decomposed Parallel Jobs invoked by the top-level Sequence Job.
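As an illustration, for a hypothetical Server Job named MyJob (not one from the transcript above) converted with the suffix Px, the generated assets would be named along these lines:

MyJobPx      Sequence Job (or a Parallel Job, if no decomposition was needed)
MyJobPxP01   decomposed Parallel Job invoked by MyJobPx
MyJobPxP02   decomposed Parallel Job invoked by MyJobPx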
To better understand the behaviour of the conversion process, you can consult the two log files described here.