This sheet provides a high-level overview of the level of support S2PX will provide for converting your DataStage solution’s particular set of stage types.


Supported Server Jobs - Summary

The percentage of jobs in the supplied ISX file that contain…

  • SUPPORTED (green): Stage types supported by S2PX conversion. For a Job to be classified in this segment, all of its stage types must be supported by S2PX. In the example shown here we see that approximately 95% of Jobs in this ISX are supported by S2PX conversion.

  • TBD (yellow): Stage types that may be supported by S2PX conversion in the near future. If you encounter a large volume of Jobs in this category, speak to your IBM representative about the timing for introducing S2PX conversion support for the relevant stage(s). For a Job to be classified in this segment, every one of its stage types must be either supported or TBD, with at least one classified as TBD and none classified as unsupported. Any instance of an unsupported stage type means the entire Job is classified as UNSUPPORTED.

  • UNSUPPORTED (red): Stage types not supported by S2PX conversion. If a Job contains at least one unsupported stage then the whole Job is classified as unsupported. (A sketch of this classification logic follows this list.)
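As an illustration of the classification rules above, the following sketch shows how a Job's set of stage types maps to one of the three segments. The SUPPORTED_TYPES and TBD_TYPES sets are illustrative assumptions, not the actual S2PX support matrix.

  # Hypothetical sketch of the Job classification rules described above.
  # SUPPORTED_TYPES and TBD_TYPES are placeholders, not the real support matrix.
  SUPPORTED_TYPES = {"Transformer", "Sequential File", "Aggregator"}
  TBD_TYPES = {"Pivot"}

  def classify_job(stage_types):
      """Return SUPPORTED, TBD or UNSUPPORTED for a Job's set of stage types."""
      if any(t not in SUPPORTED_TYPES and t not in TBD_TYPES for t in stage_types):
          return "UNSUPPORTED"   # any unsupported stage makes the whole Job unsupported
      if any(t in TBD_TYPES for t in stage_types):
          return "TBD"           # no unsupported stages, but at least one TBD stage
      return "SUPPORTED"         # every stage type is supported

  print(classify_job({"Transformer", "Sequential File"}))   # SUPPORTED
  print(classify_job({"Transformer", "Pivot"}))             # TBD
  print(classify_job({"Transformer", "UniVerse"}))          # UNSUPPORTED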

A description of which stages are currently supported and which are still in the planning stage is available here.

Number of unsupported Stages by Status

A colour-coded breakdown of the number of UNSUPPORTED (red) or TBD (yellow) stages by stage type, ordered (left-to-right) by descending number of instances discovered in your ISX file. In the example shown here we see that, of all the non-supported stages in this ISX, the most commonly used is the UniVerse Data Access stage, of which 16 instances were encountered.
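The ordering of this chart amounts to a simple aggregation. The sketch below counts non-supported stage instances by stage type and sorts them in descending order; the stage records and field names are hypothetical and shown only for illustration.

  from collections import Counter

  # Hypothetical non-supported stage instances extracted from an ISX file.
  non_supported_stages = [
      {"job": "Find_Untraced_Jobs_DB2", "stage_type": "UniVerse"},
      {"job": "Load_Warehouse",         "stage_type": "UniVerse"},
      {"job": "Load_Warehouse",         "stage_type": "Pivot"},
  ]

  counts = Counter(s["stage_type"] for s in non_supported_stages)
  for stage_type, n in counts.most_common():   # descending number of instances
      print(stage_type, n)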

Unsupported Jobs - Details

Lists the individual Jobs containing UNSUPPORTED (red) or TBD (yellow) stages, together with the number of stage instances discovered in each classification. In the example shown here we see that the Job Find_Untraced_Jobs_DB2 will require the greatest attention because it contains 6 instances of unsupported stages. The exact names and types of these stages are described in the associated Support Data spreadsheet.
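The same data can be regrouped per Job to reproduce this table. The sketch below uses hypothetical records and statuses purely for illustration.

  from collections import Counter, defaultdict

  # Hypothetical non-supported stage instances, with their classification.
  non_supported = [
      {"job": "Find_Untraced_Jobs_DB2", "stage_type": "UniVerse", "status": "UNSUPPORTED"},
      {"job": "Find_Untraced_Jobs_DB2", "stage_type": "UniVerse", "status": "UNSUPPORTED"},
      {"job": "Load_Warehouse",         "stage_type": "Pivot",    "status": "TBD"},
  ]

  per_job = defaultdict(Counter)
  for s in non_supported:
      per_job[s["job"]][s["status"]] += 1

  # Jobs ordered by descending total number of non-supported stage instances.
  for job, counts in sorted(per_job.items(), key=lambda kv: -sum(kv[1].values())):
      print(job, dict(counts))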

...

This sheet details the level of manual intervention that humans will need to apply to the Parallel Jobs generated by S2PX.

...

Jobs Remediation - Summary

The summary graph describes how many Server Jobs will convert to Parallel Jobs which are expected to replicate Server behaviour without problems.

In this example we can see that S2PX expects that…

  • Not Required (green): Approximately 45% of Jobs will convert to Parallel Jobs which function identically to their Server equivalents.

  • Required (yellow): The remaining approximately 55%, however, will require some level of manual intervention from a human before they can replicate the behaviour of their Server ancestors.

There are a number of reasons why manual intervention may be required, but it is often because a Server Job uses one or more capabilities for which a direct equivalent in the DataStage Parallel engine is simply not available. In some cases the user may be required to re-interpret the original Job requirements to identify whether the compromise solution presented by S2PX will suffice, or whether further enhancement of the generated Job(s), using native Parallel functionality, is required.

Jobs Remediation - Details

This section describes the details of the required interventions identified in the summary graph above. The Issue requiring remediation column lists, in order of descending number of instances encountered, the S2PX Asset Query which identified the issue. The documentation for each of these Asset Queries (linked in the associated Remediation Data sheet) suggests a course of remediation action, where one is required.

In this example we can see that the converted ISX has 183 instances (across 106 distinct Server Jobs) of the Disable Schema Reconciliation issue. In this case remediation is likely not required, but particular attention must be paid to these Jobs during Parallel Job testing.

We can also see 130 instances of a Sequential File with Backup issue across 102 distinct Server Jobs. The actual functionality of the Job itself isn’t affected, but for some customers the loss of Sequential File backup functionality may be unacceptable, and so an alternative solution (of which many are available) should be sought.
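One of the many possible alternatives, sketched here only as an illustration and not as an S2PX feature, is a before-job step that takes a timestamped copy of the target file before the Job overwrites it. The file path below is a hypothetical example.

  import os, shutil, time

  # Illustrative before-job step: back up the target sequential file (if it exists)
  # before the Job overwrites it.
  target = "/data/output/customers.txt"   # hypothetical path
  if os.path.exists(target):
      backup = f"{target}.{time.strftime('%Y%m%d%H%M%S')}.bak"
      shutil.copy2(target, backup)        # copy preserving timestamps where possible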

The Reserved Words in Transformer Stages issue, as a further example, is one where every instance discovered will require human intervention to resolve, as these Jobs will not compile in their current form, and S2PX will not attempt to generate new variable names for your Job because the existing names are likely to encode meaning which is useful to Job developers.
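As a concrete illustration of the kind of check involved, the sketch below flags Transformer stage variable names that clash with reserved words. The reserved-word list here is a small, assumed subset used only for illustration; the full list is documented with the Asset Query.

  # Illustrative subset of reserved words; the real Parallel engine list is longer.
  RESERVED_WORDS = {"PARTITION", "ORDER", "GROUP", "FUNCTION"}

  def clashing_variables(stage_variables):
      """Return the stage variable names that clash with reserved words."""
      return [v for v in stage_variables if v.upper() in RESERVED_WORDS]

  print(clashing_variables(["Order", "svTotal", "Partition"]))  # ['Order', 'Partition']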

...

(This summary sheet derives its data from the Function Data sheet, which you can inspect for further details.)


Job Function Calls - Summary

The Summary graph at the top identifies the proportion of Server Jobs whose function calls are expected to work without further intervention (because an identically-named Parallel equivalent function already exists), and the proportion expected to require some level of human involvement.

In the example here we see…

  • BUILT-IN (green): Just over 50% of Jobs feature one or more function calls, all of which are natively supported in the target DataStage environment (v11.7.1.3 SP4 or greater).

  • REQUIRES MAPPING (yellow): Slightly less than 50%, however, have no Parallel equivalent - either because they are IBM-supplied DataStage Server functions for which no Parallel version exists, or because they are custom functions you have coded yourself in DataStage Basic. In this case you will need to configure S2PX with one or more custom routine mappings.
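The split described above amounts to checking each called function name against the set of built-in Parallel functions. The sketch below is only an illustration of that idea: the function names and the PARALLEL_BUILT_INS set are assumptions, and the actual S2PX routine-mapping configuration format is not shown here.

  # Hypothetical illustration: calls with an identically-named Parallel equivalent
  # are "built-in"; anything else requires a custom routine mapping.
  PARALLEL_BUILT_INS = {"Trim", "UpCase", "Len"}   # assumed subset, for illustration

  def needs_mapping(function_calls):
      return sorted(set(function_calls) - PARALLEL_BUILT_INS)

  print(needs_mapping(["Trim", "UpCase", "MyCustomLookup"]))  # ['MyCustomLookup']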

Function or Routine Name - Details

The table below lists all function calls which S2PX doesn’t understand, and for which some level of manual conversion will be required. This table of function names, ordered by descending number of distinct calls, also details the number of distinct Jobs from which the calls originate.

...

(This summary sheet derives its data from the Connector Migration Data sheet, which you can inspect for further details.)


Jobs requiring Connector Migration

Percentage of jobs involving Connectors (components that provide data connectivity and integration for external data sources, such as relational databases or messaging software) that are classified as legacy components and need migration using IBM’s Connector Migration Tool ('CCMT').
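As an illustration of what "requiring Connector Migration" means in practice, the sketch below flags stages whose types are legacy data-access stages with a Connector replacement. The mapping shown is a small illustrative subset, not CCMT's full migration table.

  # Illustrative subset of legacy stage types and typical Connector replacements;
  # the real tool covers many more stage types.
  LEGACY_TO_CONNECTOR = {
      "ODBC":       "ODBC Connector",
      "DB2 API":    "DB2 Connector",
      "Oracle OCI": "Oracle Connector",
  }

  def stages_requiring_migration(job_stage_types):
      return {t: LEGACY_TO_CONNECTOR[t] for t in job_stage_types if t in LEGACY_TO_CONNECTOR}

  print(stages_requiring_migration(["Transformer", "ODBC", "Oracle OCI"]))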

Stage Types requiring Connector Migration

A breakdown of the number of stages requiring conversion by stage type, ordered (left-to-right) by descending number of instances discovered in your ISX file.

Jobs requiring Connector Migration

A table of Jobs along with the number of stages on each Job requiring migration, ordered by descending number of stages requiring migration.

...

Prevalence of Hashed Files in Jobs - Summary

The Summary graph at the top simply identifies the proportion of Jobs which make use of one or more Hashed File stages. Jobs are categorised simply as

  • Jobs referencing Hashed Files (yellow), or

  • Jobs not referencing Hashed Files (green)

Hashed File Usage Patterns - Summary

The next summary graph identifies, for each Hashed File reference, whether that reference is…

  • RED Shared between Jobs or Job invocations. Identifies Hashed File instances in a Job which are…

    • Reading but not writing, meaning the Hashed File has been generated by an upstream Job, or

    • Writing but not reading, meaning the Hashed File is being produced for consumption by a downstream Job, or by a different invocation of the same Job.

  • ORANGE Hashed File Appending Data. Identifies Hashed File instances which append data, meaning the Hashed File already existed at the point of Job invocation, and is being appended to produce data for consumption by a downstream Job, or by a different invocation of the same Job.

  • YELLOW Hashed File Synchronisation. Identifies Hashed File instances accessing the same Hashed File from multiple stages on the same canvas, potentially introducing a synchronisation issue that could require manual effort to implement in a Parallel environment.

  • GREEN Hashed File Transient, Exclusive. Identifies ephemeral Hashed Files, seemingly created, written, and read exclusively within the same Job, which appear to be used in a transient manner to act simply as a synchronisation point.
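A rough sketch of the pattern classification described above is shown below. It is a simplification: the read/write/append inputs and the precedence of the rules are assumptions made purely for illustration, not the exact logic used by S2PX.

  # Simplified, illustrative classification of a Hashed File's usage within one Job.
  # "reads"/"writes" are counts of stages on the Job's canvas that read from or
  # write to the Hashed File; "appends" indicates an append-mode write.
  def classify_hashed_file(reads, writes, appends=False):
      if appends:
          return "ORANGE - appending data"
      if (reads and not writes) or (writes and not reads):
          return "RED - shared between Jobs or Job invocations"
      if reads + writes > 2:
          return "YELLOW - synchronisation across multiple stages"
      return "GREEN - transient, exclusive to this Job"

  print(classify_hashed_file(reads=1, writes=0))                # RED
  print(classify_hashed_file(reads=1, writes=1, appends=True))  # ORANGE
  print(classify_hashed_file(reads=2, writes=1))                # YELLOW
  print(classify_hashed_file(reads=1, writes=1))                # GREEN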

Hashed File Instances per Job - Details

A list which breaks out the Hashed File Usage patterns summary above by Job, ordered by descending frequency of usage.

...