
Release Notes for Tivoli Workload Scheduler, version 8.5

Release notes

Abstract

This document supplies the release notes for IBM® Tivoli® Workload Scheduler, version 8.5.

Content

The Release Notes for Tivoli Workload Scheduler, version 8.5 contain the following topics:

- What is new in this release
- Interoperability tables
- Installation limitations and problems, and their workarounds
- Upgrade notes
- Software limitations and workarounds
- Globalization notes
- APARs fixed in this release
- Documentation updates

To download the appropriate package for your operating system, see the Tivoli Workload Scheduler download page.

For detailed system requirements for all operating systems, see the Detailed System Requirements page.

To access the Tivoli Workload Scheduler documentation, see the online Information Center.

What is new in this release

This section summarizes the product enhancements included in this version of IBM Tivoli Workload Scheduler:

- Installation improvements
- Variable tables
- Workload Service Assurance
- Problem determination and troubleshooting enhancements
- Enhanced integration features
- New Tivoli Dynamic Workload Console features
- What has changed in the Tivoli Workload Automation library


Installation improvements

The installation process is significantly improved by the following enhancements:

Common shared architecture for Tivoli Workload Scheduler and Tivoli Dynamic Workload Console

Starting from this version, Tivoli Workload Scheduler and Tivoli Dynamic Workload Console can be installed together as an integrated whole. This gives the following benefits:

- Reciprocal discovery during installation
- Optimized installation with shared middleware and a common installation layout
- Shared embedded WebSphere® Application Server
- Optimized disk space and memory consumption
- Simplified access to the administration tools
- Installation interfaces with the same look and feel
- A single Planning and Installation Guide that now includes all the installation documentation for both Tivoli Workload Scheduler and Tivoli Dynamic Workload Console, including installation troubleshooting and messages.

Enhanced installation launchpad

Use the installation launchpad to install and upgrade the majority of the Tivoli Workload Automation products. This gives the following benefits:

- Guided installation with typical installation scenarios
- Simplified choice-making during the deployment and configuration phases
- Graphical representation of typical network topology for a small and medium business environment
- More intuitive component selection
- Access point for information:
   o Download documentation
   o IBM Tivoli Information Center
   o IBM Publications Center
   o IBM Software Support Center
   o Product pages from ibm.com®

Automatic response file generation during the Tivoli Workload Scheduler and Tivoli Dynamic Workload Console installation and upgrade processes

You have the option, during the installation process, to require that a response file is generated with the selections made during the process. The generated response file can then be used to clone the installation on a number of other systems, greatly simplifying the product deployment process.
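As a sketch of how a generated response file might be reused: the file name, the option keys, and the installer flags below are illustrative assumptions, not documented values; check the Planning and Installation Guide for the exact syntax.

```shell
# Hypothetical example of reusing a generated response file.
# The option names below are invented for illustration.
RESPONSE=/tmp/tws85_response.txt

cat > "$RESPONSE" <<'EOF'
-W twsUser.userName="twsuser"
-W installLocation.directory="/opt/IBM/TWA"
EOF

# Cloning the installation on another system would then be along the lines of:
#   ./SETUP.bin -options /tmp/tws85_response.txt -silent
grep -c '^-W' "$RESPONSE"    # two option settings recorded
```

Because the response file captures the selections made interactively, the same file can be copied to each target system and passed to the installer, keeping all cloned installations identical.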

Addition of diagnosis and resume of a failed Tivoli Dynamic Workload Console installation

In the same way as Tivoli Workload Scheduler, Tivoli Dynamic Workload Console can automatically detect a number of specific failing points during installation, so that you can correct the problem and rerun the steps in error.

Fix pack rollback function for Tivoli Dynamic Workload Console

Automatic upgrading from previous versions is possible. In addition, a rollback function is available for when a fix pack is applied.

Variable tables

Variable tables are new scheduling objects that can help you reuse job and job stream definitions.

In Tivoli Workload Scheduler version 8.5, what were previously called parameters are now called variables. You can place your defined variables into variable tables, which are simply collections of variables. You can define variables with the same name and different values and save them in different tables. You can then use such tables in job definitions to provide different input values according to which table is in use.
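As an illustrative sketch (the table and variable names are invented here; see the composer reference in the User's Guide and Reference for the exact syntax), two variable tables defining the same variable names with different values might look like:

```
VARTABLE PAYROLL_TEST
MEMBERS
 DBNAME "testdb"
 OUTDIR "/tmp/payroll"
END

VARTABLE PAYROLL_PROD
MEMBERS
 DBNAME "proddb"
 OUTDIR "/var/payroll"
END
```

A job definition referencing DBNAME would then resolve to "testdb" or "proddb" depending on which table is associated with the job stream or run cycle.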

Workload Service Assurance

Workload Service Assurance helps job scheduling personnel to meet their service level agreements. Using Workload Service Assurance, job schedulers can flag mission-critical jobs for privileged processing throughout the scheduling process. At plan run time the Workload Service Assurance components, Time Planner and Plan Monitor, dynamically monitor that the conditions for the timely completion of mission-critical jobs are met, and automatically promote these jobs or their predecessors if they are late.

Using Tivoli Dynamic Workload Console views, operators can monitor the progress of the critical network, find out about current and potential problems, release dependencies, and rerun jobs.
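A hypothetical job stream fragment (workstation and job names invented; keywords as introduced with Workload Service Assurance) flagging a job as mission-critical with a deadline might look like:

```
SCHEDULE MDM#PAYROLL
ON RUNCYCLE DAILY "FREQ=DAILY;"
:
MDM#PAYJOB
 DEADLINE 0600
 CRITICAL
END
```

With a definition along these lines, Time Planner computes the critical network for MDM#PAYJOB and Plan Monitor promotes the job or its predecessors if they risk missing the 0600 deadline.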

Enhanced integration features

Tivoli Workload Scheduler version 8.5 supports new integration with:

Tivoli Dynamic Workload Broker version 1.2 Tivoli Storage Manager version 5.4 and later

See IBM Tivoli Workload Scheduler: Integrating with Other Products for details of how Tivoli Workload Scheduler integrates with these and other products.

New Tivoli Dynamic Workload Console features

Tivoli Dynamic Workload Console version 8.5 includes modeling and topology functions. You can now use a single graphical user interface to model and monitor your workload environment.

A Workload Designer was added for modeling, covering all the use cases related to creating and maintaining the definition of the automation workload, such as jobs, job streams, resources, variable tables, workstation classes, and their dependencies.

You can now use the Tivoli Dynamic Workload Console to define Tivoli Workload Scheduler topology objects, covering all the use cases related to the definition and maintenance of the scheduling environment, such as workstations and domains.


What has changed in the IBM Tivoli Workload Automation version 8.5 publications

These are the changes made to the Tivoli Workload Automation publications, which are divided into a series of libraries:

Tivoli Workload Automation cross-product publications

The following summarizes the changes made to the Tivoli Workload Automation cross-product publications:

Publications
New publication. It lists, with a brief description, all publications in the Tivoli Workload Automation library.

Glossary
New publication. The product glossary that was previously included at the back of all the publications is now in a separate publication.

Download documents, System Requirements documents, and Release Notes documents
Improvements have been made to the layout and more content has been added.

Tivoli Workload Scheduler distributed library

The following summarizes the changes made to the Tivoli Workload Scheduler distributed library:

Planning and Installation Guide
This publication has been reorganized into a complete single source of reference for everything related to installing, upgrading, or uninstalling. All material that can be used for ongoing configuring and customizing has been moved to other publications. The details are as follows:

All information about supported platforms and hardware and software prerequisites has been moved to the Tivoli Workload Scheduler and Tivoli Dynamic Workload Console System Requirements documents

All information about installing, updating, and uninstalling the Tivoli Dynamic Workload Console has been moved into this publication

All information about troubleshooting the installations of Tivoli Workload Scheduler and Tivoli Dynamic Workload Console has been moved to this publication

All information about configuring and customizing either Tivoli Workload Scheduler or Tivoli Dynamic Workload Console that you might also need to use during the life of the product has been moved to the Administration Guide. This includes:

o General configuring and customizing
o Global options
o Localopts
o Useropts
o Configuring SSL
o Security file and authorizations, including the makesec command
o Configuring connection parameters (command-line client connection)

All information about configuring and customizing either Tivoli Workload Scheduler or Tivoli Dynamic Workload Console that you only need to use once, just after the installation, and never subsequently change, has remained in the Planning and Installation Guide. This includes:

o Setting environment variables
o Configuring the master domain manager
o Configuring the backup master domain manager

All information about integrating Tivoli Workload Scheduler with other products has been moved to Integrating with Other Products

All messages relating to installing, upgrading, or uninstalling Tivoli Workload Scheduler and Tivoli Dynamic Workload Console have been moved to this publication

User's Guide and Reference
All information relating to administering the product has been moved to the Administration Guide. This includes:

o Global options and optman
o Tivoli Workload Scheduler console messages and prompts
o Security file and authorizations, including makesec
o Enabling time zones
o Miscellaneous administrative commands: evtsize, StartUp

The manual is now in two parts, a user's guide, which includes conceptual information and describes user scenarios, followed by a reference to the commands and utilities.

Administration and Troubleshooting
This publication has been split into three new publications:

Administration Guide
Includes all the administration information previously in the Planning and Installation Guide, the User's Guide and Reference, and the Tivoli Dynamic Workload Console: Installation and Troubleshooting.

Troubleshooting Guide
Includes all the troubleshooting information previously in the Installation Guide, the User's Guide and Reference, and the Tivoli Dynamic Workload Console: Installation and Troubleshooting, except that relating to installation, upgrade, and uninstallation, which has been moved to the Planning and Installation Guide.

Messages
Includes all the messages for Tivoli Workload Scheduler and Tivoli Dynamic Workload Console, except those relating to installation, upgrade, or uninstallation, which have been moved to the Planning and Installation Guide.

Designing Your Workload
This new publication describes how to design your workload with IBM Tivoli Workload Scheduler by providing an overview of product capabilities. It is designed to help the first-time user of Tivoli Workload Scheduler understand how the product can satisfy the business needs of the user's organization.

Integrating with Other Products
This new publication describes how Tivoli Workload Scheduler integrates with other products. It includes some integration information previously included in Appendix A of the Planning and Installation Guide, and some new integration scenarios and information.

Variable Table Scenarios
This new publication describes various scenarios for using variable tables, introduced in this release. It is designed to help a user who already knows Tivoli Workload Scheduler understand how to use the new variable tables feature.

Application Programming Interfaces
This new publication provides information about using the following:

The Tivoli Workload Scheduler Integration Workbench, to create custom interfaces for the management of scheduling objects in the database and in the plan

The Web Services Interface, to use the Web Services access mechanism for the creation of your own client application to perform a subset of Tivoli Workload Scheduler tasks that manage jobs and job streams in the plan.

Tivoli Dynamic Workload Console: Installation and Troubleshooting
Discontinued. The Tivoli Dynamic Workload Console has now been fully integrated as a component into Tivoli Workload Scheduler, so the information contained in this publication has been similarly integrated into the following Tivoli Workload Scheduler publications:

All information relating to the installation, upgrade, and uninstallation, and the troubleshooting of these activities, has been moved to the Installation Guide

All information relating to the configuration or administration of the Tivoli Dynamic Workload Console and the embedded WebSphere Application Server has been moved to the Administration Guide

All information relating to the troubleshooting of the Tivoli Dynamic Workload Console and the embedded WebSphere Application Server has been moved to the Troubleshooting Guide

Tivoli Workload Scheduler LoadLeveler® library
Information about the Tivoli Workload Automation LoadLeveler library has been included in this publication for the first time. Access to the LoadLeveler library is now possible from within the Tivoli Information Center.

For full details of what is new, see the IBM Tivoli Workload Automation Overview.

Interoperability tables

Support at level of older product or component: For all products and components described in this section, the level of support is at that of the older product or component.

In the tables in this section the following acronyms are used:

TWS Tivoli Workload Scheduler


MDM Tivoli Workload Scheduler master domain manager

BKM Tivoli Workload Scheduler backup master domain manager

DM Tivoli Workload Scheduler domain manager

FTA Tivoli Workload Scheduler fault-tolerant agent

z/OS Conn Tivoli Workload Scheduler z/OS® connector feature

z/OS Controller Tivoli Workload Scheduler for z/OS

TDWB Tivoli Dynamic Workload Broker

JSC Job Scheduling Console

TDWC Tivoli Dynamic Workload Console

TWS for Apps Tivoli Workload Scheduler for Applications

FP Fix pack

Tivoli Workload Scheduler: compatibility

                MDM             DM              FTA             z/OS Conn   z/OS Controller
                8.5, 8.4, 8.3,  8.5, 8.4, 8.3,  8.5, 8.4, 8.3,  8.3, 8.2    8.3, 8.2
                8.2.1, 8.2.0    8.2.1, 8.2.0    8.2.1, 8.2.0
MDM 8.5         Y               Y               Y
DM 8.5          Y               Y               Y               Y           Y
Agent 8.5       Y               Y               Y               Y           Y

Tivoli Dynamic Workload Console: compatibility

                  MDM/DM/FTA 8.5   MDM/DM/FTA 8.4                   MDM/DM/FTA 8.3
TDWC 8.5          Y                Y, with Fix Pack 2 (or later     Y, with Fix Pack 5 (or later
                                   fix packs) installed on the      fix packs) installed on the
                                   Tivoli Workload Scheduler        Tivoli Workload Scheduler
                                   V8.4 component                   V8.3 component
TDWC 8.4,                          Y, with or without any fix
with or without                    packs installed on the Tivoli
any fix packs                      Workload Scheduler V8.4
                                   component
TDWC 8.3                                                            With Fix Pack 2 (or later
                                                                    fix packs) installed on the
                                                                    Tivoli Workload Scheduler
                                                                    V8.3 component

Note: The indicated fix pack levels are the minimum required. It is strongly recommended that you maintain all Tivoli Workload Scheduler components at the latest fix pack level.

Tivoli Workload Scheduler Job Scheduling Console: compatibility

          MDM/DM/FTA   MDM/DM/FTA   MDM/DM/FTA   z/OS Conn   z/OS Controller
          8.5          8.4          8.3          8.3         8.3
JSC 8.4   Y (1)        Y            Y (3)        Y (4)       Y (5)
JSC 8.3   Y (1)        Y (2)        Y            Y           Y (5)

Notes

1. The list below summarizes the behavior and limitations in using the Job Scheduling Console V8.4 or version V8.3 with a master domain manager V8.5:

o Limitations in using variable tables:
   - You cannot manage variable tables. You can manage only the parameters defined in the default variable table.
   - You cannot create parameters when the default variable table is locked.
   - You can perform only the browse action on objects that address variable tables. For example, to modify the properties of a job stream that references a variable table you must use the command line or the Tivoli Dynamic Workload Console.
   - During the submission, you cannot change the properties of a job stream that uses a variable table.
   - You can perform only the browse action on jobs that are flagged as critical.

o Limitations in using the integration with Tivoli Dynamic Workload Broker:
   - Both in the database and in the plan, you cannot perform any operations on workstations defined as broker workstations; you can only browse them. They are shown as standard agent workstations. In particular, the Create Another function, although not disabled, must not be used, because it does not produce a copy of the original broker workstation definition but a standard agent workstation definition.
   - Both in the database and in the plan, you cannot perform any operations on jobs that manage Tivoli Dynamic Workload Broker jobs; you can only browse them. In particular, the Create Another function, although not disabled, must not be used, because it does not produce a copy of the original job definition but a job definition without the workload broker specifications.

2. For firewall support, Java™ Virtual Machine 1.4.2 SR9 must be installed on the computer where the Job Scheduling Console is installed

3. For firewall support, Fix PK47309 must be installed on the Tivoli Workload Scheduler component

4. For firewall support, Fix Pack 3 of the z/OS Connector 8.3 must be installed
5. With fix PK41611 installed on the z/OS controller
6. If you have applied the standard agent integration PTFs (UK17981, UK18052, UK18053) you must also install fix PK33565 on the z/OS Controller

Tivoli Workload Scheduler Command-line client: compatibility

                                   MDM/BKM 8.5   MDM/BKM 8.4   MDM/BKM 8.3
Command-line client, version 8.5   Y
Command-line client, version 8.4                 Y
Command-line client, version 8.3                               Y

Tivoli Workload Scheduler for Applications: compatibility

                         MDM/DM/FTA                    JSC
                         8.5, 8.4, 8.3, 8.2.1, 8.2.0   8.4, 8.3, 8.2.1
TWS for Apps
8.4, 8.3, 8.2.1, 8.2.0   Y                             Y

Note: It is recommended that you upgrade Tivoli Workload Scheduler for Applications to the latest fix pack version before upgrading the Tivoli Workload Scheduler engine to version 8.5. This applies to all Tivoli Workload Scheduler for Applications versions up to version 8.4. If you have already upgraded the Tivoli Workload Scheduler engine to version 8.5 and now need to upgrade any version of Tivoli Workload Scheduler for Applications up to version 8.4, refer to Tech Note 1384969.

Tivoli Workload Scheduler database, version 8.5: compatibility

                                              Engine V8.5   Engine V8.4
Database version V8.5, freshly installed      Y
Database version migrated from V8.4 to V8.5   Y
Database version migrated from V8.3 to V8.5   Y

Note:


A database at any given level can only be used with a freshly installed engine, if that engine is at the same level as the database. This means, for example, that a version 8.5 database, that you migrated yesterday from version 8.4, cannot be used with a version 8.4 engine that you installed today, but only with a version 8.4 engine that had already been installed prior to the migration. It can also be used with any version 8.5 engine, whenever installed.

Installation limitations and problems, and their workarounds

The following are software limitations that affect the installation of Tivoli Workload Scheduler on all platforms:

The Tivoli Workload Scheduler installation fails if the vcredist_x64.exe package is missing

In a Windows 64-bit environment, when performing a fault-tolerant agent, master domain manager, or backup master domain manager installation on a pristine workstation, or on a workstation where the DLL contained in the vcredist_x64.exe package is missing or is not at the required level, the installation fails because the Tivoli Workload Scheduler services cannot start. (48555)

Workaround:

Install the vcredist_x64.exe package before installing Tivoli Workload Scheduler. You can find the vcredist_x64.exe package in the Microsoft Download Center. If you have already performed the installation and it failed, install the vcredist_x64.exe package and restart the installation steps that did not complete successfully.

On the installation window some buttons seem to disappear if you press Next and then Back

In the installation window, if you choose Install a new instance and then click Back, in the previous window you can no longer see the buttons Install a new instance or Use an existing instance.

Workaround:

When you click Back, the two buttons mentioned do not disappear; they are just moved a little further down in the window. You must scroll down to see the buttons again.

Parentheses () are not permitted in the Tivoli Workload Scheduler installation path

When installing Tivoli Workload Scheduler, you cannot specify parentheses in the installation path field.

In the interactive InstallShield wizard on UNIX® and Linux® platforms, any passwords you enter are not validated at data input

No password validation is performed in the interactive InstallShield wizard on UNIX and Linux platforms at data input. If you make an error in the password, it is only discovered when the installation wizard attempts to use the password during the performance of an installation step.

Workaround:

Rerun the installation, using the correct value for the password.

On Red Hat Enterprise Linux V5.0 the automount feature does not work

On Red Hat Enterprise Linux V5.0 after inserting the DVD and double-clicking on the desktop icon, the following message is displayed:

./setup.sh: /bin/sh: bad interpreter: Permission denied

This is because the automount feature that mounts the DVD does so with the option noexec, which is not compatible with the way Tivoli Workload Scheduler needs to use the DVD.

Workaround:

To solve the issue, umount the DVD, and manually remount it, using the following command:

mount /dev/scd0 /media

Multiple installations on the same workstation with the same TWSuser

Two installations cannot coexist on the same workstation if they have the same TWSuser name, where one is a local user account and the other is a domain user account.

Workaround:

Install the two instances of Tivoli Workload Scheduler with different TWSuser names.

Upgrade notes

Upgrade procedure for platforms out of support

To upgrade master domain managers or backup masters running on platforms that are no longer supported by version 8.5 (such as AIX® 32-bit kernel), run the parallel upgrade procedure on a supported system. Upgrade procedures are documented in the IBM Tivoli Workload Scheduler: Planning and Installation Guide.

When performing a direct upgrade from version 8.2 to version 8.5, the tokensrv process must not be stopped


When performing a direct upgrade from Tivoli Workload Scheduler version 8.2 to version 8.5, in a Windows® environment, the token server process must be active for the automatic data migration steps to complete successfully.

Software limitations and workarounds

The following are software limitations that affect Tivoli Workload Scheduler, and their workarounds (when available):

Tivoli Workload Scheduler erroneously installs licenses to be counted by the License Metrics Tool and Tivoli Asset Discovery for Distributed

For a detailed description of the problem and the workaround to apply, refer to the technote TWS incorrectly installs licenses of some other products.

Interoperability problem on Linux platforms

On Linux platforms there are connectivity problems between Tivoli Dynamic Workload Console version 8.5 and older Tivoli Workload Scheduler versions configured with the Custom user registry.

The connection does not work due to a missing variable in the configuration of the embedded WebSphere Application Server on Tivoli Workload Scheduler.

If you run Tivoli Workload scheduler version 8.4 (where Custom user registry is the default configuration value on Linux), or if you use PAM authentication, you must expect to experience such connectivity problems with Tivoli Dynamic Workload Console version 8.5.

To check if the Tivoli Workload Scheduler version 8.4 engine on the Linux system is configured based on the Custom user registry, do the following on the master domain manager of the Tivoli Workload Scheduler version 8.4 engine:

1. Run the showSecurityProperties.sh script from <TWS_home> and check the value of the activeUserRegistry property; also take note of the value of LocalOSServerREALM.

2. If the value of activeUserRegistry is not Custom, there are no connectivity problems and you need do nothing. If the value is Custom, apply the following workaround on Tivoli Workload Scheduler:

a. Edit the <TWS_home>/appserver/profiles/twsprofile/config/cells/DefaultNode/security.xml file as follows:

   i. Find the activeUserRegistry key. The value is similar to: CustomUserRegistry_<series of numbers>

   ii. Search the userRegistries section of the file, where the value of xmi:id is equal to the value of the activeUserRegistry key.

   iii. In the same section, after the LocalOSServerPassword variable declaration, add the following string:

      realm=<LocalOSServerREALM value>

      where <LocalOSServerREALM value> is the value you took note of in step 1.

b. Save the file.

c. Run stopWas.sh and then startWas.sh to restart the embedded WebSphere Application Server. (44551)
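The pairing between the activeUserRegistry key and its userRegistries section can be illustrated with a trimmed-down stand-in file (the real security.xml under <TWS_home>/appserver/profiles/twsprofile is much larger, and the id below is invented):

```shell
# Illustrative sketch only: a minimal stand-in for security.xml.
SAMPLE=/tmp/security-sample.xml
cat > "$SAMPLE" <<'EOF'
<security activeUserRegistry="CustomUserRegistry_1182357"/>
<userRegistries xmi:id="CustomUserRegistry_1182357" serverPassword="{xor}abc"/>
EOF

# Extract the active registry id, then locate the matching userRegistries entry.
ID=$(sed -n 's/.*activeUserRegistry="\([^"]*\)".*/\1/p' "$SAMPLE")
grep "xmi:id=\"$ID\"" "$SAMPLE"
```

Running the same pattern against the real security.xml locates the section in which the realm string must be added.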

File eif.templ is recreated when migrating from version 8.4 GA to version 8.5

When Tivoli Workload Scheduler version 8.4.0 (the General Availability version, without the addition of any fix pack) is migrated to version 8.5, before the embedded WebSphere Application Server is restarted, the file <TWS_home>/appserver/profiles/twsprofile/temp/TWS/EIFListener/eif.templ is removed and replaced with a new one.

This implies that, if you had changed the value of the BuffEvtmaxSize property in this file, your changes are lost.

If this happens, you must set the value of this property again in the new version of the file. The section Managing the event processor in the IBM Tivoli Workload Scheduler: Administration Guide documents how to do this. (38971)

Note that the new copy of the file is created in the new path, which is: <TWA_home>/eWas/profiles/twaprofile/temp/TWS/EIFListener/eif.templ

Deploying large numbers of event rules

The rule deployment process (run either automatically or with the planman deploy command) performs slowly when you deploy large numbers of new and changed rules (2000 or more).

Workaround

If you need to deploy large numbers of event rules, there are two measures you can take to improve performance:

Use planman deploy with the -scratch option

To deploy large numbers of rules collectively in an acceptable time limit, use planman deploy with the -scratch option (37011).

Increase the Java heap size of the application server

Increase the Java heap size of the application server, as described in the Performance chapter of the Administration Guide. The point at which you should increase your heap size is difficult to quantify, but consider a deployment of several thousands of rules as being at risk of an out-of-memory failure.

"Absolute" keyword required to resolve time zones correctly with enLegacyStartOfDayEvaluation set

If the master domain manager of your network runs with the enLegacyStartOfDayEvaluation and enTimeZone options set to yes, the startOfDay time set on the master domain manager is converted to the local time zone set on each workstation across the network. If you submit a job or job stream with the at keyword, you must also add the absolute keyword to make sure that the submission times are resolved correctly.

The absolute keyword specifies that the start date is based on the calendar day rather than on the production day. (41192)

Interoperability problems with pre-V8.3 agents

When a Tivoli Workload Scheduler network includes agents running on versions older than V8.3 managed by a version V8.3 or later master domain manager with the enLegacyId option set to yes, having multiple instances of a job stream as pending predecessors produces errors caused by identification problems at submission time.

Workaround

None. (40729)

When using the Legacy ID and Start of Day Evaluation, dependencies are not handled correctly

You have the enLegacyID and enLegacyStartOfDayEvaluation options set to yes, but the planner is unable to correctly change the pending predecessors if the plan contains, as pending predecessors, various instances of the same job stream.

Workaround

None. (40729)

Deploy (D) flag not set after ResetPlan command used

The deploy (D) flag is not set on workstations after the ResetPlan command is used.

This is not a problem that affects the processing of events but just the visualization of the flag which indicates that the event configuration file has been received at the workstation.


Workaround

You can choose to do nothing, because the situation will be normalized the next time that the event processor sends an event configuration file to the workstation.

Alternatively, if you want to take a positive action to resolve the problem, do the following:

- Create a dummy event rule that applies only to the affected workstations
- Perform a planman deploy to send the configuration file
- Monitor the receipt of the file on the agent
- When it is received, delete the dummy rule at the event processor. (36924 / 37851)

Some data not migrated when you migrate the database from DB2® to Oracle, or from Oracle to DB2

Neither of the two migration procedures migrates the following information from the source database:

- The pre-production plan.
- The history of job runs and job statistics.
- The state of running event rule instances. This means that any complex event rules, where part of the rule has been satisfied prior to the database migration, are generated after the migration as new rules. Even if the subsequent conditions of the event rule are satisfied, the record that the first part of the rule was satisfied is no longer available, so the rule will never be completely satisfied. (38017)

Incorrect time-related status displayed when time zone not enabled

You are using Tivoli Workload Scheduler in an environment where nodes are in different time zones, but the time zone feature is not enabled. The time-related status of a job (for example, LATE) is not reported correctly on workstations other than that where the job is being run.

Workaround

Enable the time zone feature to resolve this problem. See the User's Guide and Reference to learn more about the time zone feature, and the Administration Guide for instructions about how to enable it in the global options. (37358)
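As a sketch (the option key shown is an assumption; verify the exact key with optman ls on your master domain manager), enabling the feature from the command line might look like:

```
optman chg tz=YES      # enTimeZone global option
JnextPlan              # the change takes effect from the next plan generation
```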

Event LogMessageWritten is not triggered correctly

You are monitoring a log file for a specific log message, using the LogMessageWritten event. The message is written to the file but the event is not triggered.

Cause


The SSM agent monitors the log file. It sends an event when a new message is written to the log file that matches the string in the event rule. However, there is a limitation. It cannot detect the very latest message to be written to the file, but only messages prior to the latest. Thus, when message line "n" is written containing the string that the event rule is configured to search for, the agent does not detect that a message has been written, because the message is the last one in the file. When any other message line is written, whether or not it contains the monitored string, the agent is now able to read the message line containing the string it is monitoring, and sends an event for it.

Workaround

There is no workaround to resolve this problem. However, note that in a typical log file, messages are being written by one or other processes frequently, perhaps every few seconds, and the writing of a subsequent message line will trigger the event in question. If you have log files where few messages are written, you might want to attempt to write a dummy blank message after every "real" message, in order to ensure that the "real" message is never the last in the file for any length of time. (33723)
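The dummy-message workaround can be sketched in shell as follows; the log path and message text are invented for illustration:

```shell
# After writing a "real" message, append a follow-up line so that the
# monitored message is never the last line in the file.
LOG=/tmp/app-sample.log

echo "ERROR AWS0001: something failed" >> "$LOG"   # the monitored string
echo "heartbeat"                       >> "$LOG"   # pushes it off the last line

# A later line now exists, so the monitored line is visible to the agent:
tail -n 2 "$LOG" | head -n 1
```

In practice, any subsequent write to the log file serves the same purpose as the "heartbeat" line above.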

Microsoft® Remote Desktop Connection must be used with "/console" option

If you use the Microsoft Remote Desktop Connection to operate Tivoli Workload Scheduler, you must always use it with the "/console" parameter; otherwise Tivoli Workload Scheduler gives inconsistent results.

The planman showinfo command displays inconsistent times (IZ05400)

The plan time displayed by the planman showinfo command might be inconsistent with the time set in the operating system of the workstation. For example, the time zone set for the workstation is GMT+2 but planman showinfo displays plan times according to the GMT+1 time zone.

This situation arises when the WebSphere Application Server Java Virtual Machine does not recognize the time zone set on the operating system.

Workaround:

Set the time zone defined in the server.xml file equal to the time zone defined for the workstation in the Tivoli Workload Scheduler database. Proceed as follows:

1. Create a backup copy of the file appserver/profiles/twsprofile/config/cells/DefaultNode/nodes/DefaultNode/servers/server1/server.xml.

2. Open server.xml with an editor.

3. Find the genericJvmArguments string and add:

genericJvmArguments="-Duser.timezone=time zone"

where time zone is the time zone defined for the workstation in the Tivoli Workload Scheduler database.

4. Stop WebSphere Application Server.

5. Restart WebSphere Application Server.
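After the edit, the JVM entry in server.xml might look like the following sketch; the xmi:id and the Europe/Rome value are illustrative only, and your file will contain additional attributes:

```xml
<jvmEntries xmi:id="JavaVirtualMachine_1"
            genericJvmArguments="-Duser.timezone=Europe/Rome"/>
```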

WebSphere Application Server limitations in a pure IPV6 environment when using the Job Scheduling Console or the Tivoli Dynamic Workload Console (35681)

When you install Tivoli Workload Scheduler, the following WebSphere Application Server variables are initialized as follows to allow communication in a mixed IPv4 and IPv6 environment:

java.net.preferIPv6Addresses=false
java.net.preferIPv4Stack=false

If your configuration requires the use of a pure IPv6 environment, or you have specific firewall configuration settings that block IPv4 packets, the connection between the Tivoli Workload Scheduler master domain manager and the Tivoli Dynamic Workload Console or the Job Scheduling Console fails.

Workaround:

To establish a connection in this specific environment, you must initialize the variable as follows:

java.net.preferIPv6Addresses=true

by editing the server.xml file in the following path:

$TWS_home/appserver/profiles/twsprofile/config/cells/DefaultNode/nodes/DefaultNode/servers/server1

If, instead, you want to use IPv4 communication exclusively, set:

java.net.preferIPv4Stack=true
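As an illustrative sketch, assuming the same genericJvmArguments mechanism described for the planman showinfo workaround earlier (the xmi:id value is hypothetical), the property can be passed to the JVM in server.xml like this:

```xml
<jvmEntries xmi:id="JavaVirtualMachine_1"
            genericJvmArguments="-Djava.net.preferIPv6Addresses=true"/>
```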

Different behavior of UNIX and Windows operating systems at springtime daylight savings (94279)

During the springtime daylight savings time change (01:59:59-03:00:00), different operating systems behave differently.

For example:

Windows

The command conman submit job at=02xx sets the start time to 01xx.

HP-UX

The command conman submit job at=02xx sets the start time to 03xx.


Workaround

Avoid creating jobs that have a start time in the "lost" hour and are due to run on the night of the daylight savings change (a Saturday night in spring).

The writer process on a fault-tolerant agent does not download the Symphony file (22485)

If you delete the Symphony file on a fault-tolerant agent, writer should automatically download it when the agent next links to the master domain manager. However, this does not happen if you launch conman before the file has been downloaded.

Workaround

Delete the mailbox.msg file and writer downloads the Symphony file.

File monitor provider events: older event configurations might stay active on workstations after rule redeployment (34103)

If you deployed rules containing event types from the FileMonitor provider and then you redeploy rules that no longer include these file monitoring events, you might find that, despite the new configuration, the local monitoring agents still forward the file monitoring events to the event processor. The event processor correctly ignores these events, in accordance with the new configuration specifications deployed on the server; however, a certain amount of CPU time and bandwidth is needlessly consumed.

The status of the local monitoring configuration on the agent will be corrected when one of the following occurs:

- The planman deploy -scratch command is issued
- The event processing server is restarted
- Another rule containing an event condition involving the FileMonitor provider is deployed to the agents

Event rule management: Deploy flag is not maintained in renewed symphony (36924)

The deploy flag (D) indicates that a workstation is using an up-to-date monitoring configuration package, and can be displayed by running the appropriate conman command. Testing has revealed that the flag is lost from the Symphony file when the file is renewed by a JnextPlan or ResetPlan command. Although the event monitoring configuration deployed to the agents is the latest one, and event management works properly, an incorrect monitoring agent status is shown for the workstations.

Time zone not enabled: wrong schedtime reported on the command line of agents (36588)

The following problem has been found when a job stream is defined with time restrictions and the time zone feature is not enabled: after running JnextPlan, you run the showschedules command on both the master and the agents. You notice that, while the time restriction times remain the same, the schedtimes shown on the agents are converted to the workstation's time zone. This should not happen when time zone conversion is not enabled.

For example, this is the showschedules output on the master:

                                         (Est)  (Est)  Jobs Sch
Workstation Job Stream   SchedTime State Pr Start Elapse # OK Lim
FTA1 #CLEM1      0600 07/26 READY 10 ( 0:01) 1 0
FTA1 #JS1        0600 07/26 READY 10 ( 0:01) 1 0
FTA1 #JS_EVERY1  0600 07/26 READY 10 ( 0:01) 1 0
FTA1 #TEST1      1900 07/26 READY 10 ( 0:01) 1 0 <19:10; <19:20
MDM  #CLEM0      0600 07/26 READY 10 ( 0:01) 1 0
MDM  #FINAL      0559 07/27 READY 10 ( 0:04) 4 0 [Carry]
MDM  #JS0        0600 07/26 READY 10 ( 0:01) 1 0
MDM  #JS_EVERY0  0600 07/26 READY 10 ( 0:01) 1 0
MDM  #TEST0      1900 07/26 READY 10 ( 0:01) 1 0 <19:10; <19:20

This is the showschedules output on FTA1:

                                         (Est)  (Est)  Jobs Sch
CPU Schedule     SchedTime State Pr Start Elapse # OK Lim
FTA1 #CLEM1      2300 07/25 READY 10 ( 0:01) 1 0
FTA1 #JS1        2300 07/25 READY 10 ( 0:01) 1 0
FTA1 #JS_EVER+   2300 07/25 SUCC  10 00:00 1 0
FTA1 #TEST1      1200 07/26 HOLD  10(19:00) ( 0:01) 1 0 <19:10; <19:20
MDM  #CLEM0      2300 07/25 READY 10 ( 0:01) 1 0
MDM  #FINAL      2259 07/26 HOLD  10(07/27) ( 0:04) 4 0 [Carry]
MDM  #JS0        2300 07/25 READY 10 ( 0:01) 1 0
MDM  #JS_EVER+   2300 07/25 READY 10 ( 0:01) 1 0
MDM  #TEST0      1200 07/26 HOLD  10(19:00) ( 0:01) 1 0 <19:10; <19:20

The SchedTime shown on FTA1 is converted from the one displayed on the master, which is not correct. The time restrictions are the same, which is correct.

 Jobmanrc is unable to load libraries on AIX 6.1, if streamlogon is not root. (IZ36977)

If you submit a job using a non-root user as the streamlogon, you see in the joblog several error messages like these:

Could not load program /usr/maestro/bin/mecho.
Could not load module /usr/Tivoli/TWS/ICU/3.4.1/lib/libicuuc.a.
Dependent module libicudata34.a could not be loaded.
Could not load module libicudata34.a.
System error: No such file or directory.
Could not load module mecho.
Dependent module /usr/Tivoli/TWS/ICU/3.4.1/lib/libicuuc.a could not be loaded.
Could not load module.

If you run the date command after setting tws_env, the date is reported in GMT, and not in your time zone. (IZ36977)

On AIX 6.1 the batchman process does not correctly recognize the time zone of the local workstation (IZ70267)

On AIX 6.1 the batchman process does not correctly recognize the time zone of the local machine, which is set to GMT, even if, in the Tivoli Workload Scheduler CPU definition, it is correctly set to the correct time zone. You see the following messages in the stdlist log:

10:29:39 24.11.2008|BATCHMAN:AWSBHT126I Time in CPU TZ (America/Chicago): 2008/11/24 04:29
10:29:39 24.11.2008|BATCHMAN:AWSBHT127I Time in system TZ (America/Chicago): 2008/11/24 10:29
10:29:39 24.11.2008|BATCHMAN:+
10:29:39 24.11.2008|BATCHMAN:+ AWSBHT128I Local time zone time differs from workstation time zone time by 360 minutes.

Batchman does not recognize the correct time zone because AIX 6.1 uses ICU (International Components for Unicode) libraries to manage the time zone of the system, and these ICU libraries conflict with the Tivoli Workload Scheduler ones.

Workaround

Before starting Tivoli Workload Scheduler, export the TZ environment variable in the old POSIX format (for example, CST6CDT) instead of the Olson name convention (for example, America/Chicago). This avoids the new default time zone management through the ICU libraries in AIX 6.1, by switching to the old POSIX behavior (as in AIX 5.x).
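As a sketch, the workaround amounts to the following, run in the shell that will start the product (the startup command is illustrative and depends on your installation):

```shell
# Switch to the old POSIX TZ convention (CST6CDT) instead of the Olson name (America/Chicago)
export TZ=CST6CDT
date +%Z        # now resolves through the POSIX rule: prints CST or CDT
# ./StartUp     # illustrative: start Tivoli Workload Scheduler after exporting TZ
```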

Globalization notes

This section describes software limitations, problems, and workarounds for globalized versions of Tivoli Workload Scheduler.

The InstallShield wizard installation fails if DBCS characters are used in the is:tempdir path (36979)

If you are installing using the -is:tempdir option and you specify DBCS characters in the path, the installation fails.

Workaround:

Do not specify DBCS characters when using this option.

In the output of the composer list and display commands, the list and report headers are in English (22301, 22621, 22654)

This has been done to avoid a misalignment of the column headers in DBCS versions, which was making it difficult to understand the information.

In the output of the product reports, the report headers are in English

This has been done to avoid a misalignment of the column headers in DBCS versions that was making it difficult to understand the information.

Data input is shorter in DBCS languages (IY82553, 93843)

All information is stored and passed between modules as UTF-8, and some characters occupy more than one byte in UTF-8. For DBCS languages, each character is three bytes long. Western European national characters are two bytes long. Other Western European characters are one byte long.
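These byte lengths can be verified with a quick check in any UTF-8 environment; the sample characters are illustrative:

```shell
printf '%s' 'あ' | wc -c   # Japanese (DBCS) character: 3 bytes in UTF-8
printf '%s' 'é' | wc -c    # Western European national character: 2 bytes
printf '%s' 'a' | wc -c    # other Western European (ASCII) character: 1 byte
```

This is why a field that holds, for example, 120 bytes accepts fewer characters in DBCS languages than in English.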

On Windows operating systems, you cannot create a calendar with a name containing Japanese characters using the "makecal" command (123653).

Workaround:

Enclose the calendar name in double quotes.

On Windows operating systems, the Tivoli Workload Scheduler joblog is created with incorrect characters (APAR IY81171)

You are working in a non-English language environment and you have correctly set the LANG and TWS_TISDIR environment variables. However, the Tivoli Workload Scheduler joblog is created with incorrect characters in the body of the log (the headers and footers of the log are correct).

Workaround

The problem is caused by the code page in use. Windows editors and applications use code page 1252, which is correct for writing text files. However, the DOS shell uses the default code page 850. This can cause problems when displaying particular characters.

To resolve this problem for Tivoli Workload Scheduler jobs, add the following line to the beginning of the file jobmanrc.cmd on the workstation:

chcp 1252

For further details about the jobmanrc.cmd file, see the section on customizing job processing on a workstation in the Tivoli Workload Scheduler: User's Guide and Reference

It is possible to resolve this problem for all applications on the workstation by using regedit to set the DOS code page globally in the following registry keyword:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Nls\CodePage\OEMCP = 1252

You must reboot the workstation to implement the change.

Note: Microsoft warns you to take particular precautions when changing registry entries. Ensure that you follow all instructions in the Microsoft documentation when performing this activity.

APARS Fixed in this release

Tivoli Workload Scheduler version 8.5 includes all applicable APARs fixed in V8.3 Fix Pack 4 and V8.4 Fix Pack 1, plus the following:

IZ05159 : On Windows 2003, running an upgrade from Tivoli Workload Scheduler V8.2.1 to V8.3 does not update the system registry.

IZ07038 : Access check for scripts defined without fully qualified path.

IZ08936 : If a job has "recovery rerun", submitting a job stream logs AWSBHT023E to twsmerge.log.

IZ10297 : When a job stream containing an "until" dependency after midnight is submitted ad hoc, +1 day is incorrectly added to the "until" time.

IZ10349 : Wrong date in "last start time" and "scheduled start" is set by

IZ11522 : Makesec error updating user.

IZ12472 : On Tivoli Workload Scheduler V8.3 Fix Pack 4, security errors can occur when connecting the Job Scheduling Console when a Symphony file is not present.

IZ12504 : "Schedule not found" error when the SAP R/3 extended agent interception collector attempts to put an interception job onto a nonexistent job stream.

IZ12845 : AWSJPL528E error occurs on "conman sbs" with alias, if it is performed between makeplan and switchplan.

IZ13008 : Composer add/replace memory leak.

IZ13027 : Tivoli Workload Scheduler V8.3 fix pack installation overwrites the jobmanrc in the maestro_home directory.

IZ13668 : Rccondsucc condition defined on a recovery job is ignored if the job is an ad hoc submission made using the Job Scheduling Console.

IZ13672 : "Scheduled time" of a job submitted without a date (to start immediately) shows the next day.

IZ13822 : A job stream with a valid "from" date is never submitted.

IZ14290 : Fault-tolerant agent stays unlinked even after issuing link commands.

IZ14618 : Job streams that do not have an "at" time dependency at the job stream level have a start date of next day instead of current day

IZ14657 : Rep7 -f not working.

IZ14746 : After FINAL job stream, the unison_sched_id of job streams matches the carried forward FINAL job stream's sched_id.

IZ15392 : Cannot kill two jobs with same job name in userjobs from the Job Scheduling Console.

IZ15414 : Conman submit with a "follows" dependency does not behave as expected.

IZ15584 : Concurrent conman sbs fails with AWSJCS011E with Oracle database.

IZ15782 : "AWSBHT156E. Batchman could not open the mailbox file deadmessage.msg" error received when "enswfaulttol"=yes.

IZ16002 : Multiple users browsing joblog at same time, caused crash of Tivoli Workload Scheduler V8.3 with Fix Pack 4.

IZ16020 : The explicit conman prompt defined in localopts (=%%%) is removed by a fix pack installation.

IZ16421 : The current default workstation in a "follows" dependency of any submit command is the workstation from which conman is run. This default is wrong and it is different from the default used in Tivoli Workload Scheduler V8.2.x, breaking the compatibility between them.

IZ16601 : Some job streams are duplicated on different days in the plan during migration.

IZ16832 : Tivoli Event Console event management fails when a fault-tolerant agent is not reachable.

IZ16857 : Unixlocl extended agent stops if backup domain manager is stopped when "enswfaulttol"=yes.

IZ17294 : Extended agent is shown as unlinked on Job Scheduling Console, although it is linked.

IZ17475 : The rep7 report on V8.3 displayed the "elapsed time" in minutes. In V8.4 it is displaying the elapsed time in seconds.

IZ17479 : Jobmon terminates after applying Fix Pack 04.


IZ17565 : The usage command xref -u does not list the -when

IZ17655 : Installing Tivoli Workload Scheduler on Windows 2003 64-bit domain controller causes "application error ..." running makesec.exe.

IZ17806 : Garbled MBCS characters in a joblog header, if the MBCS characters are passed as a parameter in a scriptname field.

IZ18166 : Unixlocl extended agent stops if backup domain manager is stopped when "enswfaulttol"=yes.

IZ18196 : Logical OR missing in the Tivoli Workload Scheduler V8.4 metafile.mdl file.

IZ18938 : When a dependency is deleted in Tivoli Workload Scheduler, it can cause all Windows and Solaris fault-tolerant agents to unlink and go down.

IZ19308 : FINAL job stream not scheduled correctly at Daylight Saving Time change.

IZ19459 : Audit plan wrong date displayed.

IZ19740 : Canceled jobs are not removed from JHR_Job_History_Runs Table.

IZ20328 : On the Job Scheduling Console, the last run time information disappears in the All Job Definitions list view after right-clicking to access the job properties panel.

IZ21378 : Incorrect output when running "reptr" and outputting to text file.

IZ21379 : In the "Set Alternate Plan" panel of the Job Scheduling Console, V8.3, the "Plan Start Time" and "Plan End Time" have incorrect times when the timezones feature is switched off in optman.

IZ21464 : After using the "Set Alternate Plan" panel of the Job Scheduling Console, V8.3, to view the job streams in an old plan, removing a schedlog file browsed by that panel does not increase the available disk space as it does if the removed schedlog file had not been browsed by that panel.

IZ21879 : Special characters in passwords are not allowed during the installation.

IZ21941 : IBM Tivoli Monitoring scripts not working properly on AIX.

IZ22263 : Standard agent job output header showing wrong date (02/07/70).

IZ22417 : If Tivoli Security Compliance Manager is already installed, during the installation of Tivoli Workload Scheduler, V8.4.

IZ22712 : Tivoli Workload Scheduler V8.4 "child workstation link changed" event rule does not work.

IZ22904 : SSM 4.0 in Tivoli Workload Scheduler V8.4 logs Windows events about the missing driver pcamp50.sys.

IZ22949 : The V8.3 rmstdlist did not work properly when launched outside the maestro_home directory.

IZ22954 : After applying Fix Pack 04, XRXTRCT does not put all of the jobs in the output files.

IZ23253 : "Datamigrate -topology" does not handle DBCS characters correctly.

IZ24025 : Batchman went down when the job stream was canceled by the keywords onuntil canc.

IZ24042 : The Windows version of the reptr script (reptr.cmd) does not delete the temporary file <TWS_home>\datafile when it has finished using it.

IZ24047 : Twsclusteradm.exe -uninst with hosts=<target> fails to uninstall Tivoli Workload Scheduler from the specified target cluster node.

IZ24747 : For a chain of rerun jobs stageman is considering the first job in the chain to determine if it should be carried forward, instead of the latest job. If the first rerun job is in ABEND state and the last in SUCC state, the job is carried forward, even though it has run successfully.

IZ24748 : The Sendmail action performed by an event-driven function sets different character sets for the "Subject" and "Body" parts of the mail.

IZ25226 : After running switchmgr to the backup master domain manager, when the master domain manager is brought back online or restarted the fault-tolerant agents are unlinked.

IZ26291 : After running many job streams, Report 10A in CreateReports shows an incorrect value in the Total Job Streams item.

IZ26739 : A job stream that has a pending predecessor incorrectly goes into status.

IZ27478 : Jobs fail or do not execute properly when the system path on Windows exceeds 1024 characters.

IZ27977 : On a fault-tolerant agent, the temporary file "TWS<xx>" is created in "C:\Windows\temp" and is not removed immediately.

IZ28114 : When the Logmansmoothpolicy global option is set to -1 (default value), the average_cpu_time is not calculated correctly.

IZ28131 : A job stream with the onuntil cont option is not carried forward.

IZ28400 : Specifying "schedid" without a job stream ID did not return a syntax error.

IZ36977 : Jobmanrc is not able to load libraries on AIX 6.1, if the streamlogon user is not root.

IZ37152 : The WebSphere Application Server terminates when planaudit=1, if a job with a recovery rerun option is submitted from the Job Scheduling Console.