
AutoImporter

Installation, Configuration & Operation Manual

Author: Mark Watts
Date: Wednesday, 22 October 2008
Version: 0.1 (Draft)


1 Document Information

1.1 Document information

Document ID: Extended Lookup Set Control
Project:
Version: 0.1 (Draft)
Title: Extended Lookup Set Manual
Author: Mark Watts
Owner:
Client:
Contact:

1.2 Document history

Revision date | Previous revision | Summary of Changes | Changes marked
22 Oct. 08 | - | Initial version | -

1.3 Approvals

The following individuals must approve this document before distribution.

Name | Signature | Title | Date of Issue | Version

1.4 Distribution

When authorised modifications are made to this document, it must be distributed to the following people.

Name | Title | Date of Issue | Version


2 Table of Contents

1 Document Information
2 Table of Contents
3 Introduction
4 Overview
5 Installing Auto Importer
6 Configuring the Service
7 Configuring Auto Importer


3 Introduction

This document describes the installation, configuration and operation of the Auto Importer product.


4 Overview

Auto Importer is a .NET 2 multi-threaded application designed to facilitate the bulk importing of documents from multiple sources into TRIM.

Auto Importer runs as a Windows Service.

The product is configured through a single XML file which defines one or more Jobs for the Auto Importer to run.

The product logs all activity into a log file.

4.1 Terminology

Job: A Job is a single step in an import process allowing either the capture of information, the reworking of captured data or the export of data to TRIM itself.

Work Item: A work item is an individual set of metadata (and optionally a document) captured by the Auto Importer and processed through a sequence of Jobs.

4.2 Generic Job

A generic job, the job from which all other job types are descended, has a source, an operation, some transformations and an output.

A generic Job polls a folder looking for IDX files to process, processes them and places the resultant IDX file (and any associated electronic document) into an output folder.

Import and Export jobs differ from this in that import jobs do not necessarily poll a folder looking for IDX files and export jobs do not necessarily drop their output IDX files into a folder.

IDX files are the internal intermediary format for metadata and have the field=value format, for example: -

Title=The title of the document

Author=A Person

DOS_FILE=c:\test\1234.doc

etc.

Beneath the polling directory of a generic job there will be a "Done" and a "Failed" folder. If the job succeeds it will place the processed source file into the Done folder; if it fails, the source file goes to the Failed folder.

4.3 Reserved Field Names

Within an IDX file there are a number of reserved field names: -

DOS_FILE - this field tells the Auto Importer that an electronic document (Word file, TIFF, PDF etc.) is attached to the metadata.

RECORD_TYPE - this field tells TRIM what type of record to create during the export to TRIM process.

Field Names are case sensitive!
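For illustration only, a minimal IDX fragment using both reserved fields might look like this (the title and record type name are hypothetical; the file path follows the example above): -

Title=Invoice 1234
RECORD_TYPE=Document
DOS_FILE=c:\test\1234.doc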

4.4 Capture Jobs

Capture Jobs have their input defined as a folder to poll, a directory structure to traverse or a SQL statement to execute.

Their process defines how to disassemble captured items.

Their transformations describe any initial data manipulation which needs to be performed.

Their output is a directory where captured metadata and documents will be put.

4.5 Transformation Jobs

A transformation Job is designed to allow data manipulation. It is capable of reworking field names and field values, performing lookups, splitting and concatenating values, etc.

A transform Job performs as a generic job taking input and delivering output in IDX format.

4.6 Folder Finder Job

A special job can be defined to allow the identification or creation of a container in TRIM into which to place the current work item. This operates in the same mode as a generic job having both input and output in IDX format.


It is essentially a specialised lookup that searches TRIM for a folder and, if one cannot be found, creates one. In either case the folder (Container) is returned to the work item as a simple metadata field.

4.7 Export Jobs

Export jobs take their input in the generic IDX format but the output is destined for another system i.e. TRIM.


5 Installing Auto Importer

The installation package contains 2 files: -

Installer.msi
Setup.exe

Both files are required to install the product on the target machine.

Execute either file to begin the installation.

Select “Next” to proceed with the installation.

Press "Next" and allow the install to complete.


6 Configuring the Service

Before the service can be used it needs to be configured to run as the identity of a user who has administrative TRIM access.

From the Windows services configuration utility select the AutoImporter service and modify its properties.

Change the log on identity to one with TRIM admin rights and apply the changes.
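As an illustrative sketch only (it assumes the service is registered under the name AutoImporter, and the account name and password shown are placeholders), the same change can also be made from an elevated command prompt using the Windows sc.exe utility: -

sc config AutoImporter obj= "DOMAIN\TRIMAdmin" password= "AccountPassword"
sc stop AutoImporter
sc start AutoImporter

Note that sc.exe requires the space after obj= and password=, and the new identity only takes effect once the service is restarted.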


7 Configuring Auto Importer

Auto Importer is configured in a single XML file named “AutoImporter.xml” located in the product’s installation folder.

7.1 Outer XML Structure

The outer XML of the Auto Importer’s configuration file defines basic operating parameters and one or more jobs.

<Config>

<Marshaller>

<StartFile>path</StartFile>

<StopFile>path</StopFile>

<HoldFile>path</HoldFile>

<PollPeriod>integer</PollPeriod>

</Marshaller>

<Logger>

<LogFileDuration>1</LogFileDuration>

<LoggingLevel>Verbose</LoggingLevel>

</Logger>

<Jobs>

<Job>

………. Job detail ………

</Job>

<Job>

………. Job detail ………

</Job>

</Jobs>

</Config>

Each individual Job is then defined in an XML structure depending on the type of Job being defined.

7.2 Marshaller Definition


The marshaller is responsible for starting and stopping processing of jobs due to external triggers.

These triggers are the existence of designated files in fixed locations.

The marshaller will look for these files at a configured interval.

The marshaller will look for these files in the order RUN, HOLD, STOP & START.

Each of these files, if defined, takes precedence over the subsequent ones, in the following order: -

1. RunFile - if this is defined then it becomes the single active flag. All other flags are ignored.

2. HoldFile - If this is defined and the RunFile is not set then this becomes the single functional flag.

3. StartFile & StopFile - if both the RunFile & HoldFile flags are not set then both of these flags are active.

All of the triggers are optional to use, but each element must still be present in the XML structure.

7.2.1 Poll Period

This integer value dictates how many seconds the marshaller will sleep between searches for the trigger files. Values of 1 to 3600 are valid.

7.2.2 Start File

This is the fully qualified path and file name of the file the marshaller will look for to signify it should move from a paused state to a running one.

If the file is found the marshaller will log the event and then attempt to delete the file. Failure to delete the file will simply cause the marshaller to rediscover the file at the next poll cycle.

7.2.3 Stop File

This is the fully qualified path and file name of the file the marshaller will look for to signify it should move from a running state to a paused one.

If the file is found the marshaller will log the event and then attempt to delete the file. Failure to delete the file will simply cause the marshaller to rediscover the file at the next poll cycle.

7.2.4 Hold File

This is the fully qualified path and file name of the file the marshaller will look for to signify it should move into and stay in a paused state.

This file overrides both start and stop files but is secondary to the Run File. If a hold file is detected then the autoimporter will not resume processing until the hold file is removed.

If the file is found the marshaller will log the event but will not attempt to delete the file.


7.2.5 Run File

This is the fully qualified path and file name of the file the marshaller will look for to signify it should move into and stay in a running state.

This file overrides both start and stop files and the Hold File. If a Run file is detected then the autoimporter will not stop processing until the Run file is removed.

If the file is found the marshaller will log the event but will not attempt to delete the file.
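By way of illustration (the paths are hypothetical, and the RunFile element name is assumed by analogy with the other trigger elements since it does not appear in the outer structure shown earlier), a marshaller section using all four triggers might look like: -

<Marshaller>
<StartFile>C:\AutoImporter\triggers\start.txt</StartFile>
<StopFile>C:\AutoImporter\triggers\stop.txt</StopFile>
<HoldFile>C:\AutoImporter\triggers\hold.txt</HoldFile>
<RunFile>C:\AutoImporter\triggers\run.txt</RunFile>
<PollPeriod>30</PollPeriod>
</Marshaller>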

7.1 Logger Definition

The logger is responsible for writing out information to the Auto Importer's log file.

At startup (every time the service starts) the logger will calculate its current log file name by appending the current date in the format yyyymmdd to the fixed characters AI_ and then appending .log as a file extension. If the date were 30th June 2003 the log file would be named AI_20030630.log.

The logger then opens the file, creating it if it does not exist, and appends log items to it.

The LogFileDuration parameter stipulates the maximum number of days a log file should be used (assuming the AutoImporter service is not restarted) before rolling over into a new log file named for the current date. Setting this value to 1 will cause the log file to be rolled over daily. The rollover point is 00:00 hours local time.

The LoggingLevel parameter indicates what detail of logging is required. There are 3 options: -

Errors - Only error messages will be written to the log.

Warnings - Both warnings and errors will be written.

Verbose - All messages generated during processing will be written.

7.2 Generic Job Definition XML

Every job defined will at minimum have the following structure: -

<Job>
<JobType>DirectoryWatcher</JobType>
<ExcludeJob>FALSE</ExcludeJob>
<Name>DirectoryWatcherTest</Name>
<Description>Directory watcher test job</Description>
<Interval>3</Interval>

</Job>


7.2.1 JobType

Job type defines the type of process the job will perform. This is a limited list of fixed names each of which then dictates what additional XML elements are required to declare it.

Valid Values are: -

DirectoryWatcher - A job that polls a file system directory

TreeWatcher - A job that traverses a directory structure

SQLWatcher - A job that executes an SQL query

SharepointWatcher - A job that harvests documents from SharePoint sites

MetadataTransformer - A job that reworks metadata

TRIMContainerFinder - A job that queries TRIM for a container

TRIMExporter - A job that exports work items to TRIM

7.2.2 ExcludeJob

This tag declares that, when it starts, the Auto Importer should ignore this job and not load or run it. This allows for testing and development.

Valid Values are TRUE or FALSE

7.2.3 Name

This is the short name of the job and must be unique within the list of jobs declared.

Valid Values are any string of characters.

7.2.4 Description

This is a verbose string which is designed to simply describe the job. It does not need to be any particular value or unique.

Valid Values are any string.

7.2.5 Interval

The period in seconds between the end of one action to look for work for the job and the beginning of the next one.

Valid Values are any integer value from 1 to 3600


7.3 Directory Watcher Job Definition

A DirectoryWatcher Job will poll the folder defined in the InputFolder at an interval defined in the Interval for a maximum of MaxItems files which match the PollFileMask format.

Each item found will then be opened and depending on the IndexFileFormat value processed into one or more IDX files which will be placed (along with any associated documents) into the Output Folder.

The resulting output IDX files will contain all fields defined in the source files with any values which were found.

If a job is defined as a DirectoryWatcher then the following XML tags are required: -

<Job>
<JobType>DirectoryWatcher</JobType>
<ExcludeJob>FALSE</ExcludeJob>
<Name>DirectoryWatcherTest</Name>
<Description>Directory watcher test job</Description>
<Interval>3</Interval>
<InputFolder>C:\AutoImporterStages\DIR1</InputFolder>
<PollFileMask>*.txt</PollFileMask>
<IndexFileFormat>MultiRowDelimited</IndexFileFormat>
<SeparatorCharacter>|</SeparatorCharacter>
<MaxItems>10</MaxItems>
<OutputFolder>C:\AutoImporterStages\DIR2</OutputFolder>
<DeleteOnSuccess>FALSE</DeleteOnSuccess>
<DeleteDOS>FALSE</DeleteDOS>

</Job>

In addition to the tags defined in the Generic Job description above the following additional tags are required: -

7.3.1 InputFolder

This declares where the directory watcher job should poll for new items to process

This can be any local or UNC path.

7.3.2 PollFileMask

This is the search mask, in DOS format, to apply when looking for new items to process

7.3.3 IndexFileFormat

This tells the directory watcher the format of the files it should find. The processor can deal with either individual files with field=value metadata on separate lines, much like the IDX format described above, or multi-row delimited files where the field names are defined in the first row and each item is defined on one following row with the metadata values placed in the corresponding columns.


Valid values are FieldEqualsValue or MultiRowDelimited

7.3.4 SeparatorCharacter

This tells the processor the character which in the case of FieldEqualsValue files separates the field name from the value; or in the case of MultiRowDelimited files the character which separates the columns

Valid values are TAB (this is reserved and signifies a tab character) or any single ASCII character.
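As an illustration only (the field names and values are hypothetical), the two formats might look as follows. A FieldEqualsValue file, with "=" as the SeparatorCharacter: -

Title=Invoice 1234
Author=A Person
DOS_FILE=c:\test\1234.doc

The same item expressed as a MultiRowDelimited file with "|" as the SeparatorCharacter, where the first row names the fields and each subsequent row is one item: -

Title|Author|DOS_FILE
Invoice 1234|A Person|c:\test\1234.doc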

7.3.5 MaxItems

This tells the process the maximum number of items it should process before forcibly discontinuing and entering the sleep delay declared in the Interval parameter.

It allows for throttling of the whole process.

Any valid integer value from 1 to 65000

7.3.6 OutputFolder

This declares where the directory watcher job should deliver resulting IDX files after successful processing.

This can be any local or UNC path.

7.3.7 DeleteOnSuccess

This tells the process whether the source files which the job processed should be deleted after a successful work item is completed and delivered to the OutputFolder.

If false then the source file will be moved to the Done subdirectory.

Valid values are TRUE or FALSE

7.3.8 DeleteDOS

This instructs the capture process to attempt to delete any electronic attachment (DOS_FILE) after the successful processing of a work item.

This is needed where both the metadata and the associated documents have been intentionally delivered to the importer solely for its use. If the documents are located in another system and are not for the importer to clean away after processing, this should be switched off.

Valid values are TRUE or FALSE.


7.4 Tree Watcher Job Definition XML

A TreeWatcher Job will begin at the folder defined in the InputFolder and traverse it and all sub folders at an interval defined in the Interval for a maximum of MaxItems files which match the PollFileMask format.

Each item found will then be profiled and processed into an IDX file which will be placed (along with a copy of the original file) into the Output Folder.

The resulting IDX file will contain the following details: -

DOS_FILE - The file path of the original file after it is copied to the output folder
FileDate - The DOS file date of the original file
ContainerPath - The folder name in which the file was found
FileName - The file name of the original file less path and extension
FileExtension - The DOS file extension of the original file
Drive - The drive letter of the disk on which it was found.
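As an illustration only (the paths, date and names are hypothetical), the IDX produced for a single harvested file might contain: -

DOS_FILE=C:\AutoImporterStages\DIR2\report.doc
FileDate=22/10/2008
ContainerPath=C:\Shared\Projects\Alpha
FileName=report
FileExtension=doc
Drive=C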

If a job is defined as a TreeWatcher then the following XML tags are required: -

<Job>
<JobType>TreeWatcher</JobType>
<ExcludeJob>FALSE</ExcludeJob>
<Name>DirectoryWatcherTest</Name>
<Description>Directory watcher test job</Description>
<Interval>3</Interval>
<InputFolder>C:\AutoImporterStages\DIR1</InputFolder>
<PollFileMask>*.txt</PollFileMask>
<OutputFolder>C:\AutoImporterStages\DIR2</OutputFolder>
<DeleteOnSuccess>FALSE</DeleteOnSuccess>

</Job>

In addition to the tags defined in the Generic Job description above the following additional tags are required: -

7.4.1 InputFolder

This declares where the tree watcher job should start looking for new items to process.

This can be any local or UNC path.

7.4.2 PollFileMask

This is the search mask, in DOS format, to apply when looking for new items to process

7.4.3 DeleteOnSuccess

This tells the process whether the source files which the job processed should be deleted after a successful work item is completed and delivered to the OutputFolder.


If false then the source file will be moved to the Done subdirectory.

Valid values are TRUE or FALSE

7.4.4 OutputFolder

This declares where the tree watcher job should deliver resulting IDX files after successful processing.

This can be any local or UNC path.


7.5 SQL Watcher Job Definition XML

An SQLWatcher Job will, at an interval defined in Interval, make an ODBC connection to a database as defined in the ConnectionString parameter and execute the query defined in SelectSQL.

Each item found will then be processed into an IDX file with all of the fields found in the SQL result set included.

If the process is successful then the SQL declared in UpdateSQLSuccess will be executed; alternatively, on a failure, the UpdateSQLFailure SQL will be executed. In both cases any items in the SQL bounded with # characters will be substituted with the values from the current item being processed.

For example, if the current item contained a field called ID_NO and the update SQL looked like this:

UPDATE MYDB SET FIELDX='Y' WHERE ID_NO='#ID_NO#'

then the update SQL would have #ID_NO# replaced with the value of the field ID_NO from the current work item.

7.5.1 SQL Note

The update SQL strings can contain multiple sequential calls if separated by a semi-colon ";", e.g. UPDATE TABLE SET FIELD=VALUE WHERE FIELD=#FIELDNAME#;UPDATE...
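As a sketch only (the table and column names other than ID_NO are hypothetical), an UpdateSQLSuccess value containing two sequential statements might be: -

UPDATE MYDB SET STATUS='Y' WHERE ID_NO='#ID_NO#';UPDATE MYDB_AUDIT SET PROCESSED='Y' WHERE ID_NO='#ID_NO#'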

If a job is defined as a SQLWatcher then the following XML tags are required: -

<Job>
<JobType>SQLWatcher</JobType>
<ExcludeJob>FALSE</ExcludeJob>
<Name>SQLWatcherTest</Name>
<Description>SQL watcher test job</Description>
<Interval>3</Interval>
<ConnectionString>Provider=OLEDB;Database=AB</ConnectionString>
<SelectSQL>SELECT * FROM MYDB</SelectSQL>
<UpdateSQLSuccess>UPDATE MYDB SET status='Y' …</UpdateSQLSuccess>
<UpdateSQLFailure>UPDATE MYDB SET status='X' …</UpdateSQLFailure>
<OutputFolder>C:\AutoImporterStages\DIR2</OutputFolder>

</Job>

In addition to the tags defined in the Generic Job description above the following additional tags are required: -

7.5.2 ConnectionString

The ADO compliant connection string to attach to the desired database.

7.5.3 SelectSQL

The SQL to execute to get a list of items to process.


7.5.4 UpdateSQLSuccess

The sql to execute to indicate to the source DB that the item was processed successfully and to change the item so that it is not reprocessed from subsequent SelectSQL calls.

7.5.5 UpdateSQLFailure

The sql to execute to indicate to the source DB that the item was NOT processed successfully and to change the item so that it is either flagged for attention or to ensure it is reprocessed from subsequent SelectSQL calls.

7.5.6 OutputFolder

This declares where the SQL watcher job should deliver resulting IDX files after successful processing.

This can be any local or UNC path.


7.6 Sharepoint Watcher Job Definition XML

A SharePoint Watcher Job will attempt to archive the document contents of sharepoint sites.

The process is orchestrated by use of an administration database.

Each site nominated in the administration database is accessed in turn and: -

The site's security details are read. Each permission/person is translated into TRIM locations & permissions. If matching locations are not found in TRIM then the process will fail, but it will know to retry after a given period.

The list of document stores from the site is read and compared to an optional list of stores to be handled. The default is to harvest all stores.

Each store is then interrogated and a list of documents compiled with their metadata.

Each document is then retrieved and the metadata (site, store & document) is put into an IDX file for subsequent stages to process.

A SharepointWatcher Job will, at an interval defined in Interval, make an ODBC connection to an administration database as defined in the ConnectionString parameter and execute the query defined in SelectSQL.

Each returned row is then examined to see if a previous attempt has been made to harvest the site. This LastTriedTime is a database field optionally selected during the initial SQL Select. If a last tried time is found it will then be compared to the RetryDelayHours value configured in the job to see if enough time has elapsed to retry.

If a site processes correctly then the SQL defined in the UpdateSQLSuccess job parameter will be executed; a failure due to missing location details or other non-critical circumstances will result in the UpdateSQLRetry SQL being executed; and an outright failure will result in the UpdateSQLFailure SQL being executed.

7.6.1 SQL Note

The update SQL strings can contain multiple sequential calls if separated by a semi-colon ";", e.g. UPDATE TABLE SET FIELD=VALUE WHERE FIELD=#FIELDNAME#;UPDATE...

7.6.2 Minimal SQL parameters

To process a site the SQL expects to return at least these minimal fields: -

URL - a string containing the full URL of the SharePoint site to archive.

Any field can be coerced into the URL name using SQL.

7.6.3 SQL string substitution

During processing the SQL strings defined in the three (3) update parameters of the job will have values substituted.

Any item bounded by # symbols is assumed to be a substitution candidate.


#FieldName# will be replaced with the corresponding value from the current item’s field values collection.

#NOW# will be replaced with the current date/time in the format YYYY-MM-DD HH:MM:SS.
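For example (a sketch only; the ADMINDB table name and DTLASTTRY column are taken from the sample job below, while the rest is illustrative), an UpdateSQLRetry string might be: -

UPDATE ADMINDB SET DTLASTTRY='#NOW#' WHERE URL='#URL#'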

If a job is defined as a SharepointWatcher then the following XML tags are required: -

<Job>
<JobType>SharepointWatcher</JobType>
<ExcludeJob>FALSE</ExcludeJob>
<Name>SharepointHarvester</Name>
<Description>Job to harvest sharepoint sites</Description>
<Interval>3</Interval>
<ConnectionString>Provider=OLEDB;Database=AB</ConnectionString>
<SelectSQL>SELECT FIELDA, ... FROM ADMINDB</SelectSQL>
<UpdateSQLSuccess>UPDATE ADMINDB SET FIELDA='Y'</UpdateSQLSuccess>
<UpdateSQLRetry>UPDATE MYDB SET status='X'</UpdateSQLRetry>
<UpdateSQLFailure>UPDATE MYDB SET status='X'</UpdateSQLFailure>
<OutputFolder>C:\AutoImporterStages\DIR2</OutputFolder>
<TemporaryFolder>C:\AIStages\TEMP</TemporaryFolder>
<StoreNames>Archive</StoreNames>
<TRIMWorkgroupServer>mwatts1</TRIMWorkgroupServer>
<TRIMWorkgroupServerPort>1137</TRIMWorkgroupServerPort>
<TRIMDataset>AP</TRIMDataset>
<DelayRetryHours>0</DelayRetryHours>
<DelayRetryFieldName>DTLASTTRY</DelayRetryFieldName>

</Job>

In addition to the tags defined in the Generic Job description above the following additional tags are required: -

7.6.4 ConnectionString

The ADO compliant connection string to attach to the desired database.

7.6.5 SelectSQL

The SQL to execute to get a list of items to process.

7.6.6 UpdateSQLSuccess

The sql to execute to indicate to the source DB that the item was processed successfully and to change the item so that it is not reprocessed from subsequent SelectSQL calls.

7.6.7 UpdateSQLRetry

The sql to execute to indicate to the source DB that the item was not processed successfully but to set appropriate flags to cause a retry.


7.6.8 UpdateSQLFailure

The sql to execute to indicate to the source DB that the item was NOT processed successfully and to change the item so that it is either flagged for attention or to ensure it is reprocessed from subsequent SelectSQL calls.

7.6.9 OutputFolder

This declares where the job should deliver resulting IDX files & documents after successful processing.

7.6.10 TemporaryFolder

This declares where the job should temporarily download documents before creating output files in the output directory.

7.6.11 StoreNames

A pipe "|" delimited list of Sharepoint stores which should be included in processing. A blank value indicates all stores should be handled.
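For example, to restrict processing to two stores (the store names here are purely illustrative): -

<StoreNames>Shared Documents|Archive</StoreNames>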

7.6.12 TRIMWorkgroupServer

This is either the IP Address or DNS resolvable name of the machine to which TRIM connections will be made to validate locations.

7.6.13 TRIMWorkgroupServerPort

This is the port number on which to try to connect to TRIM on the specified Workgroup Server.

7.6.14 TRIMDataset

This is the 2 character identifier of the TRIM dataset which is to be connected to.

7.6.15 DelayRetryHours

This is a decimal number indicating the number of hours after a failed attempt that a site should be left before attempting to harvest it again. A blank value defaults to 0.

7.6.16 DelayRetryFieldName

The name of the field from the database which is used to hold the last harvest attempt date/time after a failure. A blank value will disable the feature.


7.7 Metadata Transformer Job Definition XML

A Metadata Transformation Job will, at an interval defined in Interval, poll the folder defined as InputFolder looking for IDX files. Each file found will be reworked based on the defined transformations and delivered to the folder defined in OutputFolder.

A successful transformation will result in the original file being deleted or moved to the Done folder; a failure will see the original file moved to the Failed folder.

If a job is defined as a MetadataTransformer then the following XML tags are required: -

<Job>

<JobType>MetadataTransformer</JobType>

<ExcludeJob>FALSE</ExcludeJob>

<Name>Prep</Name>

<Description>Data Prep</Description>

<Interval>3</Interval>

<MaxItems>100</MaxItems>

<DeleteOnSuccess>TRUE</DeleteOnSuccess>

<InputFolder>C:\AutoImporterStages\DIR2</InputFolder>

<OutputFolder>C:\AutoImporterStages\DIR3</OutputFolder>

<Transformations/>

</Job>

In addition to the tags defined in the Generic Job description above the following additional tags are required: -

7.7.1 InputFolder

This is the path to a folder the module should poll for IDX files.

7.7.2 OutputFolder

This is the path to a folder to which the module should deliver completed IDX files.


7.7.3 Transformations

This is a section comprising 0 to many individual transformation actions. Transformations are executed in the order listed.

Each individual transformation section takes the following XML format: -

<Transformation>

<SourceField> </SourceField>

<TargetField> </TargetField>

<Action> </Action>

<Data></Data>

</Transformation>

SourceField defines the field name in the existing IDX file to use as the source value.

TargetField defines the name of the field in the IDX file which will be the result of the process.

Action defines the transformation action to conduct.

Data contains additional information needed by the transformation.

Different transformation actions require different subsets of these 4 fields to have values.

The possible actions are listed here: -

7.7.3.1 #FIXED_VALUE#

This instructs the transformer that it should update the field named in both SourceField & TargetField to the value contained in the Data element. If the field does not already exist then it is added.

e.g.

<Transformation>

<SourceField>FOLDERRT</SourceField>

<TargetField>FOLDERRT</TargetField>

<Action>#FIXED_VALUE#</Action>

<Data> CLIENT FOLDER</Data>

</Transformation>


7.7.3.2 #COPY_VALUE#

This instructs the transformer that it should copy the value from the field named in SourceField into the field named in TargetField. If the TargetField does not already exist then it is added.

The Data element is not needed

e.g.

<Transformation>

<SourceField>Field1</SourceField>

<TargetField>Field2</TargetField>

<Action>#COPY_VALUE#</Action>

<Data></Data>

</Transformation>

7.7.3.3 #DROP_FIELD#

This instructs the transformer that it should remove from the IDX file the field (and its value) whose name is supplied in the SourceField, if it exists.

The Data & Target elements are not needed

e.g.

<Transformation>

<SourceField>Field 1</SourceField>

<TargetField> </TargetField>

<Action>#DROP_FIELD#</Action>

<Data> </Data>

</Transformation>


7.7.3.4 #CHANGE_FIELD_NAME#

This instructs the transformer that it should (if it exists) rename the field named in SourceField to the name listed in the TargetField.

The Data element is not needed

e.g.

<Transformation>

<SourceField>Field 1</SourceField>

<TargetField>Field 2</TargetField>

<Action>#CHANGE_FIELD_NAME#</Action>

<Data> </Data>

</Transformation>

7.7.3.5 #SIMPLE_LOOKUP#

This instructs the transformer that it should (if it exists) use the value of the field named in SourceField to look up a second value and put that value into a field named in TargetField. If TargetField does not exist it will be created.

The data element contains the fully qualified path to a simple text file containing lookup values in the format: -

SourceValue1=ResultValue1

SourceValue2=ResultValue2

SourceValue3=ResultValue3

e.g.

<Transformation>

<SourceField>Field 1</SourceField>

<TargetField>Field 2</TargetField>

<Action>#SIMPLE_LOOKUP#</Action>

<Data>c:\temp\values.txt</Data>

</Transformation>


7.7.3.6 #EXTERNAL_LOOKUP#

This instructs the transformer that it should (if it exists) use the value of the field named in SourceField to look up a second value and put that value into a field named in TargetField. If TargetField does not exist it will be created.

The data element contains the class name of a COM class which implements the required interface: -

The interface exposed in the autoimporter executable expects a single function called Lookup which accepts a single string parameter and returns a collection of field/value pair strings.

The results if any will either be updated or added to the current fields and their values in the IDX file.

The TargetField parameter is not needed.

e.g.

<Transformation>

<SourceField>Field 1</SourceField>

<TargetField> </TargetField>

<Action>#EXTERNAL_LOOKUP#</Action>

<Data>my.lookup.class</Data>

</Transformation>

7.7.3.7 #CONCATENATE#

This instructs the transformer that it should parse the formatting string passed in the Data tag and then attempt to substitute the values of the fields named. The resulting string is then put into the field named in the TargetField parameter.

The formatting string can be looked at as a multi-element string separated by pipe "|" symbols. Items which should be substituted with values from the existing fields collection are designated by being enclosed in exclamation marks "!". The characters (if any) between a pipe and an exclamation mark are inserted only if the substituted field contains a value.

For example, |12!field1!34! means that if field1 exists and contains a value it will be put in the output string wrapped in 12 and 34. This allows conditional inclusion of commas etc. only when a value exists.

Take an example of an address which is held in the fields as multiple elements: -

Street=24 High Street

District=

Town=Lington


County=Someshire

Postcode=LS1 1LS

To have this address rendered into a single string with commas separating it could be done as follows: -

!Street!|, |!District!|, |!Town!|, |!County!|, |!Postcode!

This when processed would result in: -

24 High Street, , Lington, Someshire, LS1 1LS

Using the conditional method the formatting string would look like: -

!Street!|, !District!|, !Town!|, !County!|, !Postcode!

This will then result in an output of: -

24 High Street, Lington, Someshire, LS1 1LS

e.g.

<Transformation>

<SourceField>Field 1</SourceField>

<TargetField>Field 2</TargetField>

<Action>#CONCATENATE#</Action>

<Data>!Street!|, !District!|, !Town!|, !County!|, !Postcode!</Data>

</Transformation>


7.8 TRIM Container Finder Job Definition XML

A TRIM Container Finder Job will, at an interval defined in Interval, search the defined input folder looking for items to process.

It will then make a connection to a TRIM database as defined by the parameters and attempt to find a folder which matches the criteria.

A search will be carried out in TRIM which will be parameterised by: -

The record type to look for

The classification in which to look (if specified)

The ID to look for

The client associated to the folder (if any)

If a folder is found, its record number will be returned in a new field called CONTAINER. If not, a new folder will be made, titled using the search criteria, and for each field in the metadata the importer will examine the record to see if a user defined field exists on that record and, if so, the value is set.

If the process is successful then the work item is moved to the done folder or deleted if that is the configuration. If the process fails then the item is moved to the failed folder.

If a job is defined as a TRIMContainerFinder then the following XML tags are required: -

<Job>

<JobType>TRIMContainerFinder</JobType>

<ExcludeJob>FALSE</ExcludeJob>

<Name>TRIMFolder</Name>

<Description>Carefirst TRIM Folder Finder</Description>

<Interval>3</Interval>

<MaxItems>100</MaxItems>

<DeleteOnSuccess>TRUE</DeleteOnSuccess>

<TRIMWorkgroupServer>SSSRV0006A</TRIMWorkgroupServer>

<TRIMWorkgroupServerPort>1137</TRIMWorkgroupServerPort>

<TRIMDataset>CF</TRIMDataset>

<ContainerRecordTypeFieldName>FOLDERRT</ContainerRecordTypeFieldName>

<ContainerTitlingMethod>TEXT</ContainerTitlingMethod>

<ParentClassificationFieldName>CLASSIFICATION</ParentClassificationFieldName>

<ContainerTitleFieldName>FOLDERTITLE</ContainerTitleFieldName>

<ContainerClientFieldName>SUBJECT</ContainerClientFieldName>

<ContainerExternalIDFieldName>SUBJECT</ContainerExternalIDFieldName>

<InputFolder>C:\AutoImporterStages\DIR3</InputFolder>

<OutputFolder>C:\AutoImporterStages\DIR4</OutputFolder>

</Job>


In addition to the tags defined in the Generic Job description above the following additional tags are required: -

7.8.1 TRIMWorkgroupServer

This is either the IP Address or DNS resolvable name of the machine to which TRIM connections will be made.

7.8.2 TRIMWorkgroupServerPort

This is the port number on which to try to connect to TRIM on the specified Workgroup Server.

7.8.3 TRIMDataset

This is the 2 character identifier of the TRIM dataset which is to be connected to.

7.8.4 ContainerRecordTypeFieldName

This contains the name of the IDX metadata field which will contain the name of a TRIM record type to be used for searching and creating folders.

7.8.5 ContainerTitlingMethod

Declares the titling method to be used by containers. May contain TEXT or CLIENT. If CLIENT, the client specified in the value of the field named by ContainerClientFieldName will be used to search for and title the folder.

7.8.6 ParentClassificationFieldName

Declares the metadata field which will contain the classification path to search for and to create containers in.

7.8.7 ContainerTitleFieldName

Declares which metadata field will contain the title to be used in folder creation.

7.8.8 ContainerExternalIDFieldName

Declares which metadata field will contain the ID value to be used in searching and folder creation.

7.8.9 InputFolder

This is the path to a folder the module should poll for IDX files.


7.8.10 OutputFolder

This is the path to a folder to which the module will deliver completed IDX files.
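As an illustration only (the field names are those used in the sample job above; the values, classification path and record number are hypothetical), a work item entering this job might carry: -

FOLDERRT=CLIENT FOLDER
CLASSIFICATION=Clients - Active Clients
FOLDERTITLE=Smith, John
SUBJECT=CL-00123

After a matching folder is found or created, the same work item gains a simple metadata field such as: -

CONTAINER=09/000123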


7.9 TRIM Export Job Definition XML

A TRIM Export Job will, at an interval defined in Interval, search the defined input folder looking for items to process.

It will then make a connection to a TRIM database as defined by the parameters and attempt to create records in TRIM for each work item.

If the process is successful then the work item is moved to the done folder or deleted if that is the configuration. If the process fails then the item is moved to the failed folder.

As the new TRIM record is created each of the fields in the work item is examined and if the field name is one of the TRIM reserved field names the value is set on the new record. If the field is not one of the reserved field names then the importer will examine the record to see if a user defined field exists on that record and if so then the value is set.

If a job is defined as a TRIMExporter then the following XML tags are required: -

<Job>

<JobType>TRIMExporter</JobType>

<ExcludeJob>FALSE</ExcludeJob>

<Name>ExportToTRIM</Name>

<Description>This job creates records in TRIM</Description>

<MaxItems>100</MaxItems>

<DeleteOnSuccess>TRUE</DeleteOnSuccess>

<TRIMWorkgroupServer>SSSRV0006A</TRIMWorkgroupServer>

<TRIMWorkgroupServerPort>1137</TRIMWorkgroupServerPort>

<TRIMDataset>CF</TRIMDataset>

<DefaultRecordType>Document</DefaultRecordType>

<InputFolder>C:\AutoImporterStages\DIR5</InputFolder>

</Job>

In addition to the tags defined in the Generic Job description above the following additional tags are required: -

7.9.1 TRIMWorkgroupServer

This is either the IP Address or DNS resolvable name of the machine to which TRIM connections will be made.

7.9.2 TRIMWorkgroupServerPort

This is the port number on which to try to connect to TRIM on the specified Workgroup Server.


7.9.3 TRIMDataset

This is the 2 character identifier of the TRIM dataset which is to be connected to.

7.9.4 DefaultRecordType

When a record is created, if no explicit record type is found in the metadata of the IDX file then this record type will be used. The explicit type can be passed in a metadata field named RECORD_TYPE. This must be the exact record type name used in TRIM. It IS case sensitive.

7.9.5 InputFolder

This is the path to a folder the TRIMExporter module should poll for IDX files.

7.9.6 Reserved Field Names

The following field names (NOT CASE SENSITIVE) are reserved and map directly to the corresponding TRIM fields: -

TITLE - Record Title (Free Text Part)

CLASSIFICATION - Used to look up a classification from its name e.g. Level1 - Level 2 - Level 3

FOLDER or CONTAINER - Used to try to find a container for the record based on its exact record number

RECORD_TYPE - The exact name of a TRIM record type. This will be the type of record created.

ASSIGNEE - The unique name of a TRIM location to set as the new record's assignee.

AUTHOR - The unique name of a TRIM location to set as the new record's author.

HOME or HOME_LOCATION - The unique name of a TRIM location to set as the new record's home.

OWNER - The unique name of a TRIM location to set as the new record's owner.

CLIENT - The unique name of a TRIM location to set as the new record's client.

DATE_CREATED - The record's created date in a recognisable format e.g. yyyy-mm-dd, dd/mm/yyyy etc.

DATE_REGISTERED - The record's registered date in a recognisable format e.g. yyyy-mm-dd, dd/mm/yyyy etc.

DATE_FINALISED - Sets the finalised date on the record using the value passed. If the value is empty then it uses the current date & time.

EXTERNALID or EXTERNAL_ID - The record's external ID field.

NOTES - The value is appended to the record's notes field.

RETENTION or SCHEDULE - Used to look up a retention schedule from TRIM to assign to the record. Needs to be the TRIM schedule's exact name.

DOS_FILE - This field must contain the fully qualified path to a file to be used as the record's primary electronic attachment.

PERMISSION_VIEW_RECORD - Contains a pipe (|) delimited list of location sort names to add to the record permission.

PERMISSION_VIEW_DOCUMENT - Contains a pipe (|) delimited list of location sort names to add to the record permission.

PERMISSION_UPDATE_RECORD - Contains a pipe (|) delimited list of location sort names to add to the record permission.

PERMISSION_UPDATE_DOCUMENT - Contains a pipe (|) delimited list of location sort names to add to the record permission.

PERMISSION_MODIFY_ACCESS - Contains a pipe (|) delimited list of location sort names to add to the record permission.

PERMISSION_DELETE_RECORD - Contains a pipe (|) delimited list of location sort names to add to the record permission.

PERMISSION_ADD_CONTENTS - Contains a pipe (|) delimited list of location sort names to add to the record permission.
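As an illustration only (the record type, locations, date, record number and path are hypothetical), an IDX file presented to a TRIMExporter job might look like: -

TITLE=Invoice 1234
RECORD_TYPE=Document
CONTAINER=09/000123
AUTHOR=A Person
DATE_CREATED=2008-10-22
DOS_FILE=c:\test\1234.doc
PERMISSION_VIEW_RECORD=Records Team|Finance Team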

The following example shows two further jobs as they might appear in the configuration file: a MetadataTransformer job which drops the CLASSIFICATION field, followed by a TRIMExporter job: -

<Job>

<JobType>MetadataTransformer</JobType>


<ExcludeJob>FALSE</ExcludeJob>

<Name>Prep2</Name>

<Description>Carefirst Data Prep 2</Description>

<Interval>3</Interval>

<MaxItems>100</MaxItems>

<DeleteOnSuccess>TRUE</DeleteOnSuccess>

<InputFolder>C:\AutoImporterStages\DIR4</InputFolder>

<OutputFolder>C:\AutoImporterStages\DIR5</OutputFolder>

<Transformations>

<Transformation>

<SourceField>CLASSIFICATION</SourceField>

<TargetField>CLASSIFICATION</TargetField>

<Action>#DROP_FIELD#</Action>

<Data></Data>

</Transformation>

</Transformations>

</Job>

<Job>

<JobType>TRIMExporter</JobType>

<ExcludeJob>FALSE</ExcludeJob>

<Name>TRIMExport</Name>

<Description>Carefirst TRIM Exporter</Description>

<Interval>3</Interval>

<MaxItems>100</MaxItems>

<DeleteOnSuccess>TRUE</DeleteOnSuccess>

<TRIMWorkgroupServer>SSSRV0006A</TRIMWorkgroupServer>

<TRIMWorkgroupServerPort>1137</TRIMWorkgroupServerPort>

<TRIMDataset>CF</TRIMDataset>

<DefaultRecordType>Document</DefaultRecordType>

<InputFolder>C:\AutoImporterStages\DIR5</InputFolder>


<OutputFolder>C:\AutoImporterStages\DIR6</OutputFolder>

</Job>

© TOWER Software EMEA - Commercial In Confidence