
How does S2PX support duplicate keys in Hashed Files?

How does S2PX support the conversion of Server Jobs which write data rows with duplicate keys to Hashed Files?

By default, DataStage Server Jobs that write to a Hashed File target do not log errors when input rows carry duplicate keys. Instead, each incoming row simply overwrites any existing row with the same key in the target Hashed File, so the last row written for a given key wins.
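
To make this overwrite behaviour concrete, here is a minimal Python sketch (illustrative only; the keys and row values are made up) that models a Hashed File as a dictionary keyed on the row key:

    # Illustrative rows only; key K1 appears twice with different values.
    incoming_rows = [("K1", "first"), ("K2", "x"), ("K1", "second")]

    hashed_file = {}  # a Hashed File behaves like a map from key to row
    for key, row in incoming_rows:
        hashed_file[key] = row  # a duplicate key silently replaces the earlier row

    print(hashed_file)  # {'K1': 'second', 'K2': 'x'} -- last write wins, no error logged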

The DRS Connector stage that replaces the Hashed File is configured to use an insert/update operation, which replicates this behaviour: rows with new keys are inserted, and rows whose keys already exist in the target table update the existing rows in place.
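
The sketch below illustrates the insert/update (upsert) pattern in general terms; it is not the actual SQL the DRS Connector generates, and the table and column names are hypothetical. It attempts an insert first and falls back to an update on a duplicate key, producing the same end state as the Hashed File overwrite shown above:

    import sqlite3

    # Hypothetical target table keyed on `key`; stands in for the DRS target.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE target (key TEXT PRIMARY KEY, value TEXT)")

    rows = [("K1", "first"), ("K2", "x"), ("K1", "second")]  # K1 is duplicated
    for key, value in rows:
        try:
            # Insert succeeds for a key not yet present in the target.
            conn.execute("INSERT INTO target VALUES (?, ?)", (key, value))
        except sqlite3.IntegrityError:
            # Duplicate key: update instead, overwriting the earlier row.
            conn.execute("UPDATE target SET value = ? WHERE key = ?", (value, key))

    print(conn.execute("SELECT * FROM target ORDER BY key").fetchall())
    # [('K1', 'second'), ('K2', 'x')] -- same rows the Hashed File would hold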


© 2015-2024 Data Migrators Pty Ltd.