
12.0 Component Development Developer's Guide


CHAPTER 1 Workflow Components
CHAPTER 2 Understanding Workflow Components
CHAPTER 3 Component Commands
CHAPTER 4 Understanding Job Template Libraries
Workflow Canvas
Component Editor Dialog
Inputs/Outputs
Component Instances
Component Revision
Component Versioning
Update Instances Window
Switch Versions Window
Specializing vs. Publishing a Component
Component Icon Variations
Permissions
CHAPTER 5 Configuring Component Properties
Prompt Editor Window
Prompt Types
Prompt Details
Previewing Prompts
CHAPTER 6 Understanding Advanced Features
Automation and Application Integration
Commands Window
Common Scripting Environment
Alternate Scripting Language
Safe Handling of Component Inputs with Scripts
Script Library
Interval Statistics
CHAPTER 7 Script Definitions
Signiant Core
Scheduler Service
Server Configuration
Miscellaneous
Statistics
Input/Output Processing
Volume Shadow Services
File/Path Operations
CHAPTER 8 Publishing Components
Publishing a Component
Viewing Component Help
Deleting a Published Component
Deleting a Toolbox Classification Type
CHAPTER 9 Importing and Exporting Components
Importing a Component
Exporting a Component
CHAPTER 10 Perl Modules
Linux, Unix, Macintosh, Windows
Windows Only
CHAPTER 11 Understanding Component Keywords
Property Keywords
Continuous Pre-File Command Notes
Continuous Post-File Command Notes
Variable Definition Keywords
Notification Keywords
Statistics Keywords
Keyword Availability
CHAPTER 12 Sequence Diagrams
Push Transfer Model
Pull Transfer Model
CHAPTER 13 Tutorial
Creating a Component
Creating a Workflow
Running the Workflow

Workflow Components

A Signiant Workflow or Job Template component is a structured set of scripts and properties that executes specific tasks on Signiant agents (for example, transferring files between two agents or running specific commands on a given agent). Workflow or Job Template components are simple building blocks that developers can combine in an almost unlimited number of ways to automate data transfer operations, including scheduling, file transfer, processing, reporting and notification. Component developers are the users who create and maintain these components.

To edit and create components, developers must be assigned Component Editor privileges.

To assign a developer component editor privileges, do the following:

  1. In the Manager, select Administration > Users > List.
  2. Select the user to grant Component Editor privileges and click Edit.
  3. Select the Roles tab.
  4. Under Full User, enable Component Editor.
  5. Click OK.

Understanding Workflow Components

This section provides reference material on Workflow or Job Template components, including a description of those included with the Signiant software. It also provides an overview of the Job Template Library interface and the Workflow Canvas, and a checklist of issues you may need to consider before creating your Workflow or Job Template component.

A Workflow or Job Template component is a set of scripts and properties that execute a specific task (for example a file transfer between two agents), or provide output values that other components can use as inputs. Workflow or Job Template components are simple building blocks that you can combine in an almost unlimited number of ways to automate your data transfer operations, including scheduling, reporting and notification. Workflow components can include the following information:

  • rules for pre- and post-processing of data (for example: templates can run a user-defined program or script).
  • existing scripts, programs, or batch files -- agents can run any command line interface that is valid for the agent's operating system.

Component developers can create these Workflow or Job Template components in the Signiant Manager. Once created, Workflow designers can drag-and-drop these components in the Workflow Canvas and connect them to create a business workflow. All Workflows or Job Templates must begin with a start component (for example, ScheduledStart, DropBoxStart, AvidEditorStart, and so on). The start component provides the prompts and input values required for the entire Workflow or Job Template. Components that follow the start component (such as a remote command or file transfer) specify the tasks for the Workflow or Job Template.

Before you create your component, you need to consider the following items:

Scripting Requirements

Every command you specify in a component should include the SolutionStandardHeaderPerl script library reference at the beginning. This script library provides the following functions, among others:

  • Sets Perl Unicode support

  • Sets standard output and standard error for automatic flushing

  • Sets the Perl variable LOGLEVELNAME to one of the strings ERROR, WARN, INFO, or DEBUG, depending on the setting of the Log Detail Level input. If the level is DEBUG, it also sets the Perl variable DEBUG to true

You can select the SolutionStandardHeaderPerl script from the Reference Script drop-down in the Script editing screen.
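
For example, a minimal command script using the header might look like the following (the log text is illustrative):

   %script_lib_obj:SolutionStandardHeaderPerl%

   # Normal job information; honored according to the Log Detail Level
   SigPrintLog("Command starting", LOGINFO);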

Inputs

Since variables and mapped properties are set in scripts via a preprocessor phase (which performs textual substitutions), it is important to set input variables in a specific manner to avoid having special characters cause syntax or parsing errors in the script.

To use an input variable in a Perl script, use a heredoc like the following:

   my $varname = ModifyInputValue( <<'__VARNAME__', 0 );
   %Inputs.server%
   __VARNAME__
         

The ModifyInputValue function removes extraneous characters that may have been added by user variable prompting, as well as leading and trailing whitespace. You must reference the ModifyInputValue routine from the script library by including the following in the script header:

   %script_lib_obj:SolutionModifyInputValue%
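
Putting the pieces together, a command that consumes the server input shown above might begin as follows (a sketch, not a complete component):

   %script_lib_obj:SolutionStandardHeaderPerl%
   %script_lib_obj:SolutionModifyInputValue%

   # Clean up the raw input value before use
   my $server = ModifyInputValue( <<'__SERVER__', 0 );
   %Inputs.server%
   __SERVER__

   SigPrintLog("Using server: $server", LOGINFO);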

 

Outputs

For compatibility with existing and future components, you must set the following outputs in the Agent Completion, Error, and Finalize commands.

  • ReturnCode: Set to 1 if the command failed, 0 otherwise

  • ReturnMessage: Set to a message string describing the result (for example, the failure reason)

These should be added as outputs to the command. You may add additional outputs to the command as required. You must set all outputs using the dds_property_set syntax or the SigSetOutput function (which abstracts the dds_property_set syntax). The following is an example:

   my $ReturnCode = 0;
   my $ReturnMessage = "Command Component Completed Successfully";

   print STDOUT "%dds_property_set%Outputs.ReturnCode=$ReturnCode\n";
   print STDOUT "%dds_property_set%Outputs.ReturnMessage=$ReturnMessage\n";
          

Percent Complete

File transfer components automatically show progress during their execution (based on the amount of data transferred) but Command components do not. If a Command component is to report progress, this must be done explicitly in the script code. The following example shows how this can be done:

   # Register that we will be sending percent complete information
   print STDOUT "%dds_cmd_capable%\n";
   # Initialize "Total Work Units" (twu) and "Work Units Complete" (wuc)
   print STDOUT "%dds_cmd_statistics%twu=$num_files,wuc=0\n";
   # Update percent complete as new progress information becomes available
   print STDOUT "%dds_cmd_statistics%twu=$num_files,wuc=$file_count\n";
            

Debugging

Commands should generate additional debug and tracing output to standard output and/or standard error when the Perl variable DEBUG is set to true. The following is an example of a debugging script:

   %script_lib_obj:SolutionStandardHeaderPerl%

   SigPrintLog("This is normal job info", LOGINFO);
   SigPrintLog("This is some extra debug info", LOGDEBUG);
   SigPrintLog("WARNING: This is a warning message", LOGWARN);
   SigPrintLog("ERROR: This is an error message", LOGERROR);
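
Because the header sets the Perl variable DEBUG when the Log Detail Level is DEBUG, you can also guard verbose tracing explicitly (the variable shown is illustrative):

   if ($DEBUG) {
       SigPrintLog("Raw SourceData value: $SourceData", LOGDEBUG);
   }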
            

File Names

File names presented to a component on an input are by default in an XML-encoded list. Legacy components (from Signiant releases before 8.x) may use a comma-separated list, so any input accepting a list of file names should accept both formats. The script library provides routines that implement this functionality and preserve backward compatibility in the event that the format changes. For example:

   %script_lib_obj:SolutionSigListXML% 
   %script_lib_obj:SolutionSplitPaths% 

   # Is SourceData in SigList XML format? 
   if (&IsSigListXmlFormat($SourceData)) { 
       # Parse the XML input 
       if (&SigListXMLParse($SourceData,\$PathListType,\@SourceElements) != 0) { 
           SigPrintLog("ERROR: SourceData XML is not parsable",LOGERROR); 
           return ($FALSE); 
       } 
        @SourcePathArray = SigListGetElementsByAttribute(\@SourceElements,"V");
   } else { 
       # SourceData is in legacy format... 
       @SourcePathArray = SolutionSplitPaths($SourceData,','); 
   } 
         

File names in an output must be XML encoded using the provided function. For example:

   # Create list of processed files. This may be a subset of the input 
   # file list if some files did not process correctly 

   if (&SigListXMLCreate($XMLString,'PATHLIST',\@SourcePathArray)) { 
       print STDERR "Failed to create XML file list\n"; 
       return ($FALSE); 
   }

   # Set component output 
   print STDOUT "%dds_property_set%Outputs.processedFiles=$XMLString\n";             
            

Component Commands

This chapter details the command line options for the Manually Enter a File List Command field.

The Manually Enter a File List Command field accepts any command that generates a list of file names (one file per line). For example, you can use the findfiles utility and a set of arguments as the Filenames Command value. The findfiles utility is installed with the Agent software and works regardless of the Agent’s operating system. The findfiles utility uses the following syntax:

   findfiles <dir_path_list> <option_list>

The <dir_path_list> is a series of whitespace-separated directory pathnames to be searched recursively for entries that match criteria specified by items in <option_list>. If no directory paths are specified, either in this initial list or via the "-path" option (see below), the current working directory is used as the sole <dir_path_list> item. Any Windows directory pathname incorporating a drive letter must be specified as an absolute path.

The <option_list> is a series of command line options used to specify the selection criteria for directory entries to be displayed, as well as to indicate various traversal and listing directives. If no options are specified, the "-name '*'" option list is used by default.

Available command line options are listed in the following table:

 

Parameter Description
-path <pathname> Adds the supplied <pathname> string to the aforementioned list of directory pathnames <dir_path_list> for searching purposes.

-name <pattern>

-exclude <pattern>

Used to enable (‘-name’ option) or disable (‘-exclude’ option) the display of non-directory entry names based on successful matches against the supplied filename <pattern>. Note that these options have no effect on directory traversal. Any character in the pattern, except the special pattern characters described below, matches itself; to match a special character literally, escape it with a backslash. The special characters have the following meanings:
  •  * matches zero or more characters
  • ? matches any single character
  • [...] matches any one of the enclosed characters
A pair of digit, lowercase or uppercase characters separated by a hyphen ‘-’ denotes a range set and matches any character sorted in the range. If the first character following the ‘[’ is ‘^’ or ‘!’, then any character not enclosed is matched. Use commas to specify multiple distinct patterns as an alternative to using multiple option specifications.
-leafname <pattern> Enables display of both directory and non-directory entries matching the supplied <pattern> argument in the same manner as the -name option described above.
-ic Specifies that case is to be ignored. Case is ignored by default on Windows systems. Note that the ‘-ic’ and ‘-noic’ options are mutually exclusive.
-noic Specifies that case is to be respected. Case is respected by default on UNIX systems. Note that the ‘-ic’ and ‘-noic’ options are mutually exclusive.
-depth <max_scan_depth> Used to limit the traversal depth of each specified directory to <max_scan_depth> directory levels. For example, using “-depth 1” will limit criteria tests and display functions to the immediate contents of each specified directory.

-dirinclude <dirpattern>

-direxclude <dirpattern>

Used to enable (‘-dirinclude’ option) or disable (‘-direxclude’ option) directory traversal, and display of the corresponding directory name, based on successful matches against the supplied <dirpattern> string. Such pattern strings allow the same match capability as for the ‘-name’ and ‘-exclude’ options, with the addition of path component separators (such as / or \ ) to apply specific patterns to components at specific directory levels. The matching of ‘-dirinclude’ <dirpattern> strings, as well
as ‘-direxclude’ <dirpattern> strings prefixed with the ‘@’ character against sub-directory names is “anchored” under the particular <dir_path_list> item being traversed. Matching of ‘-direxclude’ <dirpattern> strings not prefixed with the ‘@’ character is not “anchored” and is performed against the appropriate number of the lowest level components in the sub-directory name. Use commas to specify multiple distinct patterns as an alternative to using multiple option specifications. Special characters allow users to make use of pattern matching on the directory path. You must escape special characters with a backslash to be matched literally. Characters
include the following:
  • * (matches zero or more characters)
  • ? (matches any single character)
  • [...] (matches any one of the enclosed characters - for example, [ch] would match the characters “ch”)
A pair of digit, lowercase or uppercase characters separated by a hyphen ‘-’ denotes a range set and matches any character sorted in the range. If the first character following the ‘[‘ is ‘^’ or ‘!’, then any character not enclosed is matched.
-pathexclude <dirpath> Disables traversal of the specified directory path (Note: No pattern matching is performed.) On Windows systems, if the <dirpath> is not prefixed with a recognized drive specification then directory traversal is disabled for that directory path for each "drive qualified" top-level directory. For example, ... -path c:/ -path d:/ -pathexclude /tmp ... prevents traversal of both c:/tmp and d:/tmp whereas ... -path c:/ -path d:/ -pathexclude d:/tmp ... only prevents traversal of d:/tmp.

-accessed <relop> <date_time_val> [ <relop> <date_time_val> ]

-changed <relop> <date_time_val> [ <relop> <date_time_val> ]

-created <relop> <date_time_val> [ <relop> <date_time_val> ]

-modified <relop> <date_time_val> [ <relop> <date_time_val> ]

Used to display entries whose access, change, creation or modification times match the given expression, where <relop> is one of the relational operators ‘lt’, ‘le’, ‘eq’, ‘ge’ or ‘gt’ and where <date_time_val> is one of the date/time value forms described below. A second (optional) relational expression enables time comparisons against lower & upper bound values. For example, “-accessed gt 17h lt 23h” would display files accessed between 17 and 23 hours ago. The following date/time forms are supported:
  • <count><unit> is a relative, counted time span measured from the current time in <unit> values of ‘s’(seconds), ‘m’(minutes), ‘h’(hours), ‘d’(days), ‘w’ (weeks), ‘M’(months) or ‘y’(years). For example, 17h means 17 hours ago and 8M means 8 months ago
  • <hour>:<minute>:<second> is a time of day reference for the current calendar day (from the UTC-based system time).
  • <year>/<month>/<day> is a UTC date reference for a valid day occurring from 1970 to 2037, inclusive.

The following caveats apply:

When using relational operators, “less than” (i.e., ‘lt’ operator) means “newer than” and “greater than” (i.e., ‘gt’ operator) means “older than”. For example, ‘-mod lt 1d’ means “modified within the last 24 hours”.
On UNIX-type systems, the ‘-created’ option is an alias for the ‘-changed’ option which tests the file’s content and/or inode modification time. On Windows systems, the ‘-changed’ option is an alias for the ‘-modified’ option.

-size <relop> <num_kbytes> [ <relop> <num_kbytes> ] Used to display file entries whose sizes (in kilobytes) match the given expression where <num_kbytes> is a number greater than zero and <relop> is one of the relational operators ‘lt’, ‘le’, ‘eq’, ‘ge’ or ‘gt’. A second (optional) relational expression enables size comparisons against lower & upper bound values. Note that size comparisons against directory entries are skipped so display of directory names is not suppressed based on size.
-matchallexpr Used to require that all time/size based expressions specified are matched in order for the entry to be displayed (i.e., an “AND” boolean relationship is applied across the expressions). If ‘-matchallexpr’ is not present, only one of the expressions need match for the entry to be displayed (i.e., an “OR” relationship).
-markerfile <marker_file_name> Used as an automatically-updated file-based time reference, this option is used to display file entries whose modification times are newer than the modification time of the file named by the option argument <marker_file_name>. If the named marker file does not exist, it is created and modification time comparisons against file entries are skipped. Upon program termination, the named marker file’s access/modification
times are assigned the modification time of the newest file entry listed (if any). NOTE: The ‘-markerfile’ option is mutually exclusive of other time-based entry selection options (i.e., ‘-accessed’, ‘-changed’, ‘-created’ or ‘-modified’).
-maxpathlen <max_pathname_length> Used to display file/directory entries whose pathname length is a maximum of <max_pathname_length> characters. When a pathname exceeding the specified length is encountered, an error message (to “stderr”) is generated. If unspecified, the system-defined maximum will be the default, typically 1024 for both Windows and UNIX. (Note: Values are approximate.)
-parentdirs Suppresses the display of directory entry names that are not part of a parent directory hierarchy containing items matching the selection criteria.
-pathparentdirs Incorporates all the functionality of the ‘-parentdirs’ option, but also displays the parent directories of any <dir_path_list> hierarchy containing items matching the selection criteria.
-reverse Used to reverse the normal directory tree traversal order such that directory entries in a given directory are traversed first, followed by the evaluation of the entries in that directory against the available display criteria. Normally, entry display evaluation is done first then directory entries are traversed.
-follow Used on UNIX-type systems to cause symbolic links to directory entries to be "followed" during directory tree traversal. This option is silently ignored on Windows systems.
-attributes <attrspec> Selects Windows entries for display whose attributes meet the criteria given by <attrspec>, which has the following forms:
  • <attrs> - entity attributes must equal <attrs>
  • !<attrs> - entity attributes must not equal <attrs>
  • -<attrs> - entity attributes must have all <attrs> members
  • +<attrs> - entity attributes must have one or more <attrs> members
  • !-<attrs> - entity attributes must not have any <attrs> members
  • !+<attrs> - entity attributes must not have all <attrs> members

where <attrs> is one or more of the supported attribute codes, namely ‘a’ for archive, ‘c’ for compressed, ‘e’ for encrypted, ‘h’ for hidden, ‘n’ for normal, ‘o’ for offline, ‘r’ for read-only, ‘s’ for system and ‘t’ for temporary.

This option has no effect on entry selection for non-Win32 platforms, but the <attrspec> string will be syntactically validated regardless of platform type.

-nameinfo

-backupinfo

Displays selected entries in a format suitable for the file transfer agent. The entries contain sufficient information for the file agent to begin transferring the file without the need for additional information fetches, thus improving its performance considerably. Use the "-backupinfo" option to request that the privileges of the Windows backup operator be enabled and that the information from encrypted directories be appended to the file information.

NOTE: The ‘-nameinfo’ and ‘-backupinfo’ options are mutually exclusive of other “long” format display options.

-long

-longtimes

-longunix

-longunixtimes

Used to display selected entries in “long” format where the displayed pathname is prefixed with a UNIX-style mode mask (in octal), an entry size (in bytes) and the last modification time. When ‘-longtimes’ or ‘-longunixtimes’ is used, the last access and creation/change times are shown as well. When ‘-long’ or ‘-longtimes’ is used, all entry time information is displayed in “YYYY/MM/DD HH:MM:SS” format. When ‘-longunix’ or ‘-longunixtimes’ is used, all entry time information is displayed as the number of seconds elapsed since January 1, 1970 UTC.
-optfile <options_file> Specifies a text file containing one or more command line options. This option can be useful for constructing large command lines, for building command lines in an incremental fashion across multiple “options” files, or for avoiding undesirable processing behaviors of command line interpreters. The contained <option_list> can span multiple lines and can include other ‘-optfile’ options.
-deleteoptfiles Indicates that any text files named in "-optfile" option specifications should be automatically removed during program termination operations.
-hidesandbox Indicates that when operating with a sandbox, the sandbox directory should be taken as the file system root for all names displayed as well as for the search. Without this option the full path including the sandbox is displayed for all absolute names.
-getsize Calculates the size of the files or directories passed in <dir_path_list>. The output format is '<size><tab><path>'.
-pedantic Used to send error messages (to “stderr”) for each entry that cannot be processed while scanning the specified list of directories. The default behaviour is to silently ignore such errors, except in the case of the specified search directories themselves.
-rootlist Used to display the available “root” directories on the system before attempting to traverse the specified directory list. On UNIX-type systems, the ‘/’ directory is displayed. On Windows systems, the currently available drive letters are displayed. If ‘-rootlist’ is specified with no directory list, the current working directory is not scanned and the process exits.
-nodirs Used to suppress the display of all entry names corresponding to directory entries.
-pwd Used to display the current working directory, followed by an immediate process exit. No directories are traversed and no other command line options are acted upon.
-- Acts as a syntactic separator for the <dir_path_list> and <option_list> command portions. It has no operational effect.
-Version Used to display the executable release version number and build information, followed by an immediate process exit.
-8.3 Used on Windows systems to indicate that, when available, the "classic 8.3" format of the entry name should be used for pattern matching and output display purposes. This option is silently ignored when running on UNIX-type systems.

-help

-?

Used to display this help information block (to "stderr"), followed by an immediate process exit.
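
For example, the following invocation (with illustrative paths and patterns) lists all .mxf files under /data/incoming that were modified within the last day, suppressing directory names:

   findfiles /data/incoming -name '*.mxf' -modified lt 1d -nodirs

Recall that for time expressions ‘lt’ means “newer than”, so ‘-modified lt 1d’ selects files modified within the last 24 hours.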

Understanding Job Template Libraries

A Job Template Library is the container for a Job Template or Workflow that you create or modify to automate data transfer operations. Each Job Template or Workflow must belong to a unique Job Template Library. Once a Job Template Library is created, you can create and define your Job Template or Workflow graphically on the Workflow Canvas.

To assemble a Job Template or Workflow, you drag components from a toolbox onto the Workflow Canvas, linking the components together to create a customized transfer and automation path. Once components are positioned on the Workflow Canvas they may be opened for modifications, or if needed, you can create your own components.

The following is an example of the Job Template interface:

The interface has two main areas:

  • A workflow object toolbox (contains reference components)

  • A workflow layout canvas (sequenced instances of components)

The components toolbox contains icons that represent objects that you can include in a workflow. These components represent tasks (such as schedule, file transfer, e-mail notification, and so on) that you combine to create the workflow by dragging an icon from the toolbox to the canvas.

The following table describes the Signiant workflow components commonly available in this release; there may be additional components not listed here. (Not all components are available for use: some require user licenses.)

Name Purpose
Editors
AvidEditorStart Specifies the start properties (such as source host, target host, and so on) for transferring media among Avid workgroups.
AvidIngestMedia Specifies a remote command to move staged media to a final location and ingest a metadata file via the Avid TM (Transfer Manager)-Auto API.
AvidResolveExistingMedia Specifies a remote command to determine which media files already exist at the target, in an Avid transfer.
AvidtoAvidTransfer Specifies a File Transfer Component for Avid TM (transfer manager) to Avid TM (transfer manager) transfers.
Foundation
Command Specifies a command script, including properties such as the agent on which the script runs, the user name as which it runs and so on.
DropBoxStart Includes predefined prompts that can be used to implement drop box functionality, which transfers data between a single source location and one or many destination locations. It polls the specified location at a user-determined interval and sends any changes detected in the source location to the specified destination locations. This allows you to transfer common files from a central repository. The Interrupt on Failure attribute can be manually added to any of your Start components (ensure the Name and attribute type are correct).  
FileList Produces a list of files on an agent based on include/exclude criteria. This list is an input to subsequent components in a workflow.
Group Job components can be grouped to address more complex data transfer requirements. When job components are associated as a group, you can configure them to run in a sequence or concurrently, specify the path type, or associate commands and conditions with them. Note: It is implied that components in a group are linked to the end of the group.
ScheduledStart Specifies the start date and time when the workflow will run. Also includes properties relating to how the workflow will run (Use UDP Control Channel, Run Until Successful, and so on). The Interrupt on Failure attribute can be manually added to any of your Start components (ensure the Name and attribute type are correct).
Notification
Email A simple component used to send a user-defined email message.
Email Attachment A simple component used to add an attachment to an email.
JobReport A component which generates a full job report, emails it to specified recipients and optionally sends out SNMP traps related to job completion status.
SNMPtrap A simple component used to generate an enterprise SNMP trap and send it to one or more trap receivers (network management stations).
Transport
FTPPut Specifies an FTP “Put” transfer, including properties such as agent, server, port, files and target directory for the transfer. Note: You must always place a FileList component before the FTPPut component in a workflow.
FTPGet Specifies an FTP “Get” transfer.
SigniantAggregate Specifies an advanced pull file transfer that exposes all available properties, including bandwidth, and delivery type.
SigniantDistribute Specifies an advanced push file transfer that exposes all available properties, including bandwidth, and delivery type.
SigniantSimple Specifies a file transfer workflow item, including properties such as bandwidth, and delivery type.
SmartJog Specifies a SmartJog transfer, including properties such as agent, server, port, files and target directory for the transfer.
Watermark
NexGuard Specifies a Thomson NexGuard transfer, including properties such as agent, server, port, files and target directory for the transfer.

All components have a set of properties that control their behavior. A property value can be either fixed, or a value that is passed into the component from a previous component in the workflow (an input). All components produce information as they execute, and can export information exposed by keywords as outputs.

Dragging an out node onto an in node links the nodes and components. You must have Component Editor privileges to edit existing components and create new ones.

Workflow Canvas

The Workflow Canvas is where you can do the following:

  • Create new Workflows or Job Templates for specific tasks.
  • Edit and create Workflow or Job Template components.
  • Publish and run the reference Workflows or Job Templates (such as SmartJog, NexGuard, Agility, and so on) for which you are licensed.

Component Editor Dialog

The component editor dialog displays the component properties that are associated with the selected component, as in the following:

The left pane lists input properties that are associated with the selected component. The Add option allows you to create custom properties. To delete a property, click on it and choose Delete. You are not prompted to confirm deletion.

Icons beside the properties indicate the following:

Indicates the property is not an output or an input, nor is it visible.

The binoculars indicate the property is visible. The arrow pointing down indicates the property is an output.

The binoculars indicate the property is visible. The arrow pointing up indicates the property is an input.

Indicates the property is a command with a value.

Indicates the property is a command.

Indicates the property is a group.

You can drag and drop properties onto other properties to move them. Properties containing other child properties are ignored by the system and used only for grouping.

You can edit a property by selecting it and clicking the pencil icon. When editing a property, you can drag and drop or double-click another property, and it will be inserted at the point where the cursor appears in the right-side of the screen. Click Update to save changes to the property, or Cancel to exit the property editing screen. The properties that appear in the left pane vary depending on the type of component you are editing.

Inputs/Outputs

You can specify whether a component generates outputs and/or accepts inputs. Workflow developers link different components so that the output of one component (for example, a list of files) becomes the input of another component. Note that you must set all outputs using the dds_property_set syntax or the SigSetOutput function.

Component Instances

Signiant makes a distinction between "versions" of a component and "revisions" of a component. There are different actions you can take to revert to or update different instances of a component, depending on whether you are working with a revised or versioned component.

Component Revision

Every time a user opens a component and saves it by clicking OK, a new revision of the component is created. The component editor screen allows you to switch among revisions by choosing the revision number from the drop-down list, clicking Apply (beside the "Switch to revision" field) and clicking OK. To go back to a previous revision of the component, follow these steps:

  1. On the Workflow Canvas, right-click the component and choose Edit Component.
  2. In the Switch to revision drop-down, choose the revision of the component you want to use and click Apply.
  3. Click OK to save and exit.

    When you click OK, the revision numbering (as displayed in the component editing dialog) of the component on the canvas still increments by one. The CONTENT of the component will be the same as the earlier revision you selected; however, the revision will be listed as the latest revision. For example, if you have 4 revisions, choose to revert to revision 2, and then save the component, it will be listed as revision 5.

Component Versioning

If you re-publish a component with the same name, Signiant overwrites the existing version and creates a new version of the component in the toolbox area (for example, Version 2, Revision 1). However, the component on the canvas will still be the older version. If you want to update the version on the canvas, you can drag the new version from the toolbox and drop it over the one on the canvas. You are prompted to update the component on the canvas. Once updated, the component on the canvas will be labeled with the same version and revision number as the one in the toolbox from which you updated it (for example, Version 2, Revision 1).

Update Instances Window

To update all instances of the component (or select specific instances to update), right-click the component in the toolbox and choose Update Instances. Place a check beside the instances you want to update, or choose Select all to update all instances of the component.

Switch Versions Window

To return to a previous version of a component that has more than one version, in the tool bar, right-click the component and select Versions. In the Switch to Version dialog, select the version to which you want to revert and click OK.

When you switch to the previous version, the revision numbering (as displayed in the component editing dialog) of the component on the canvas still increments by one. The CONTENT of the component will be the same as the earlier version you selected; however, the revision will be listed as the latest revision. For example, if you have 4 revisions, choose to revert to revision 2, and then save the component, it will be listed as revision 5.

Specializing vs. Publishing a Component

A specialized component is a published component that a user has modified, but not yet published. A "starburst" icon on the component indicates that it has been specialized. A specialized component is a single instance of the component, available only in the workflow canvas in which it was created.

A published component is one that is available globally in the software. Once you have specialized a component, you may want to publish it so that others can use it.  You may also want to create documentation for your component in PDF format as well as an icon in PNG (Portable Network Graphics) format.

Component Icon Variations

Signiant uses several small icons to indicate different states for components (for example, whether they have been modified or licensed, or vary between instances on the canvas and in the toolbar).

The following section describes these variations:

A starburst icon indicates that the component has been edited (specialized) in the canvas.

An exclamation mark indicates that the instance on the canvas is different from the version of the component in the toolbar. For information on updating the instance, see Update Instances Window.

A large "x" in the middle of a component indicates that the icon associated with the icon is not available. The component is still available in the workflow.

A small "x" on an icon indicates that the component is not installed or not licensed.

Permissions

Permissions allow administrators to control user and group access to management objects. Access permissions include Read, Edit, Delete, View Jobs, Schedule, and Publish. By default, all users are able to read and edit their own user properties.

To add permissions, follow these steps:

  1. Select the Permissions tab.
  2. In the Available Users/Groups list, select the users or groups to add to the Current Permissions list.
  3. Click the appropriate check boxes beside the corresponding permissions.
  4. To remove permissions, in the Current Permissions list, select the user or group you want to remove and then click Remove.
  5. Click the OK button when complete.

Configuring Component Properties

When you create a component, you can choose from a list of input and output properties. The following sections describe these properties. At the end of the description of each of the properties is a list of sample components that contain the property. Properties containing other child properties are ignored by the system and used only for grouping.

You can configure component properties to use prompts. Prompts enable a user to customize the actions of a component when scheduling it to run as a job. For example, you can provide a prompted variable called “FileType” that asks “What type of file should be sent?” The user can then choose a value (such as *.txt, *.doc, *.pmt, and so on) from a pick list. The value the user chooses is then available to all templates and links within the same job template.

Note that once you create a variable, it is not automatically used in the template. You must apply the variable in other areas of the template (such as an override for a specific option like the source agent user name). When combined with the overrides that appear in the various tabs and screens, variables allow users to specify which values they can choose from when overriding a value.

Once you create a variable, you need to build the logic behind it. For example, if you create a variable that prompts for the type of file the agent should transfer, you need to tell the agent what to do with the value a user chooses: if a user says they want to transfer *.doc files, you need to include the script or command that tells the transfer agent to transfer only *.doc files. You can create this command in the Commands tab or in the Files tab, as in the sketch below.
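
For example, assuming the prompted value reaches the command as an input named FileType (a sketch; the paths are illustrative), the command could read the value safely and pass it to findfiles:

   %script_lib_obj:SolutionStandardHeaderPerl%
   %script_lib_obj:SolutionModifyInputValue%

   my $file_type = ModifyInputValue( <<'__FILETYPE__', 0 );
   %Inputs.FileType%
   __FILETYPE__

   # List only the files matching the prompted pattern, one per line
   my @matches = `findfiles /data/outgoing -name '$file_type' -nodirs`;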

Prompt Editor Window

To open the Prompt Editor window, do the following:

  1. In the Manager, select Jobs > Templates.
  2. Double-click the Job Template Library in which you want to create a property prompt.
  3. Right-click the start component (ScheduledStart or DropBox) and choose Edit Prompts.
  4. Click the property for which you want to create/edit a prompt and click Edit.

The Prompt Editor window displays the name and type of prompt. Depending on the type of prompt selected from the drop-down list, different fields will appear in the Prompt Details area of the window. The following describes the fields in the Prompt Editor window:

  • Prompt Name: The name of the prompt you are creating.
  • Prompt Type: The kind of prompt you want to add to the property (see Prompt Types).
  • Prompt Details: Miscellaneous fields appear, depending on the prompt and property type.

Prompt Types

Prompt Type Description

Agent

Prompts users to choose from a list of Agents. Only agents to which a user has access appear in the agent list prompt. Users can assign a default value of one of the agents in their system; if this is not a required field, you can leave it blank. To remove a value from this prompt, select the prompt and press the Delete key. List options for the type of agents the prompt will display include “All Agents,” “Windows Agents,” “Non-Windows Agents,” “Agents in a Group” (from which users can select a specific agent group), or a specific agent platform. As well, users can specify the following:

  • exclude agents with aliases
  • exclude agent groups
  • exclude the Manager agent
  • allow users to select more than one host
  • whether the variable appears as “IDs” (Windows Security IDs/UNIX User IDs) or “Names” (Windows User Names/UNIX User Names). Using “IDs” allows for more selection.

Bandwidth Limit

Prompts users to specify a bandwidth limit, either in bytes per second (to a maximum of whatever the CPU can handle) or as a percentage of a selected network connection (for example, 75% of 128 Kbps). Note that bandwidth limiting is done on each stream connection, so a value specified here is passed to the controlling agent for each template executed and divided among the streams as follows:

  • It is divided by the number of streams the agent will use for each remote agent (typically four).
  • It is further divided by the potential number of concurrent remote agents. This will be the lesser of the maximum allowed number of concurrent agents, and the number of remote agents specified in the template.

    Note that bandwidth throttles may also be employed by other network devices and policies (e.g., QoS), therefore, a bandwidth throttle (or target maximum) defined here may not be achievable. If you are having difficulty achieving a particular bandwidth target ensure that other policies are not impacting your ability to reach the desired throughput.
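
    To illustrate the division (the figures are hypothetical): a limit of 4,000 KB/s with four streams per remote agent and two potential concurrent remote agents yields 4000 ÷ 4 ÷ 2 = 500 KB/s per stream.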

Bandwidth Limit Time of Day

Prompts users to specify start/end times and days of the week for certain bandwidth limits to be in force. Users can assign default values in bytes per second or as a percentage of a selected network connection type. Once a job has started, all bandwidth throttles are applied at times based on the Daylight Saving Time (DST) in effect when the job started. If DST changes while the job is running, bandwidth time-of-day changes may be off by the time change value (plus or minus an hour) after the change.

Note that bandwidth throttles may also be employed by other network devices and policies (e.g., QoS), therefore, a bandwidth throttle (or target maximum) defined here may not be achievable. If you are having difficulty achieving a particular bandwidth target ensure that other policies are not impacting your ability to reach the desired throughput.

Boolean

A simple Yes or No choice.

CheckRadio

Prompts users to choose a value from multiple checkboxes. Users list multiple values separated by commas. As well, users can specify whether only one or multiple values can be selected, and the type of separator that will be used.

CheckRadioForm

Prompts users to choose a value from multiple radio buttons. Based on the radio button selected the sub-prompts change. You can specify the number and type of sub-prompts displayed. Sub-prompt types include: text field, number field, or picklist.

Date Time

Prompts users to enter the date and time at which the job should run. Options include the date in the following forms: yyyy/mm/dd hh:mm:ss OR yyyy/mm/dd OR hh:mm:ss.

Day of Week

Prompts users to choose which days of the week the job will run. Options include any combination of days of the week, Sunday through Saturday, inclusive.

Email Address

Prompts users to include their e-mail address for notification. There is an option to specify the logged in user's e-mail address as the default by clicking on Default to Logged in User's Email Address.

File Directory

Prompts users to enter the directory path for the job in a text field. The prompt will display the agent directory, allowing the user to click through to the desired directories.

List options include “All Agents”, “Source Agents”, “Target Agents”, “Windows Agents”, “Non-Windows Agents,” “Agents in a Group” (from which users can select a specific agent group) or “Only the Agent(s) Specified via a Linked Agent Variable”. As well, users can specify whether the variable is limited to directory only, file only, or directory and file; the text box size in characters; an agent variable to link to this variable; the (optional) default user as which to browse directories; and the path limit (users are able to access only certain directories). If you specify a path limit, you can also set whether or not the limited path is displayed in the browser, and whether or not truncated paths are used when saving the form.

The "Allow local file uploads" option applies only to Media Exchange, and has no impact on jobs a user accesses through the Web Interface. Enabling this value allows a user to browse files on their local PC. The default is not enabled.

Findfiles Date Filter

Prompts users to filter by date when running a transfer. Users can specify a default value by date, unit of time or when the file was created, last accessed or modified.

Findfiles Directory Depth

Prompts users to choose whether a transfer includes the working directory and all sub-directories or the working directory only. Users can specify a default value of "Working Directory and all Sub-Directories" or "Working Directory Only".

Findfiles Size Filter

Prompts users to filter on the size of files when running a transfer. Users can specify a default value of “less than”, “greater than”, “less than or equals”, or “greater than or equals” a specific number of kilobytes. Use the two fields to create a range: for example, “less than 100 kb, greater than 60 kb”.

Include or Exclude MX User Group

Specify whether to include or exclude the MX user group.

Include or Exclude Selected Channels

Specify whether to include or exclude the specified channels (or categories).

Index

Indicates the point of execution within the List. This is used within an Iteration Group.

Iteration Count

Indicates the number of times a List should be executed. Configure this prompt when working with an Iteration Group.

Job Frequency

Prompts users to choose how often a job runs. Users can choose from basic or advanced frequency. Under the basic frequency, users can specify none, hourly, daily, weekly, monthly, yearly, monthend or once. Under the advanced frequency, users can specify the options in the basic frequency category, as well as the following:

  • every x unit of time (i.e., minutes, days, and so on)
  • a specific day of the week (i.e., the first Sunday, the second Monday and so on, of each month)

List

Can be either a TokenList or a Siglist. A TokenList can have multiple elements that are executed in order. A SigList has only one element. Configure this prompt when working with an Iteration Group.

List Size

Indicates the number of elements in the List. Configure this prompt when working with an Iteration Group.

Pick List

Prompts users to choose one or more values from a list. Users list multiple values separated by commas. As well, users can specify whether only one or multiple items can be selected, and the type of separator (comma, space, semi-colon, colon).

MX Category

The name of the Media Exchange category (or channel) that you want to include.

MX User Group

The name of the Media Exchange user group that you want to include.

SmartJog Certificate

Prompts for the SmartJog Certificate. It is filled in behind the scenes and is not visible to the user.

SmartJogServer

The name of the SmartJog server; the name must be resolvable by DNS. This field is automatically populated with a list of SmartJog servers defined in the system, from which users can choose.

Text Display

Displays a constant value to the user; may also be useful as a label. Users can specify whether or not the variable is a section header in the schedule screen. Users can also specify whether the section header appears expanded or collapsed by default.

Text Box

Prompts users to enter a value. Users can specify that the text in the field must be a certain type (any text, integer, float) and the size of the text box in characters. They can also specify whether the text appears as "clear" text or "masked" text. Clear text appears as regular characters when a user types in the field. Masked text appears as asterisks.

Time

The time associated with the prompt. Specify the time in hh:mm:ss format.

Timezone

The timezone associated with the prompt.

Value

This is the value at the List execution point. This is used within an Iteration Group.

Prompt Details

The following describes some of the additional options associated with prompts. Not all of these prompt options are available with every prompt type. They vary depending on the type of prompt specified:

  • Break on Error: When enabled, the Iteration Group will stop executing upon detection of an error. This option is enabled by default.
  • Browse User: Specify the (optional) default user as which to browse directories. Associated with the File/Directory Path prompt type.
  • Default Value(s): The default value(s) that will appear in the prompt.
  • Display As: Whether text entered in the field is clear (as is) or masked (appears as asterisks).
  • Expanded By Default: Whether or not the section appears expanded by default in the scheduling screen. Associated with the Text Display prompt type.
  • Field Validation: A drop-down list of values that indicate the correct format for information to appear in a field. For example, with the date/time prompt, one can choose the date/time format (for example, yyyy/mm/dd hh:mm:ss). With Delivery Poll Interval, one can choose Integer.
  • Linked Agent Variable: Associated with the File/Directory Path prompt type.
  • List Labels: Labels for the list.
  • List Values: A comma-separated list of items or a range of numbers from which a user can choose.
  • Multiple Selections: Whether a user can choose from more than one item. When Yes is selected, the user can then choose the type of separator used for the list of items. Users can choose from comma, space, semi-colon, colon or tab. This allows users to set the separator type based on their scripting language needs.
  • Prompt Placement: Whether the prompt appears to the left, right or above the values.
  • Required Field: Whether or not the user must complete the field in order for the job to run.
  • Scope: Whether the variable includes the directory only, the directory or file, or the file only. Associated with the File/Directory Path prompt type.
  • Section Header: Whether or not the variable appears under a section header in the scheduling screen. Associated with the Text Display prompt type.
  • Separator: The kind of character (comma, space, semi-colon, colon) that separates a list of values.
  • Text Box Height: The height of the text box, in characters.
  • Text Box Width: The width of the text box, in characters.
  • Type: Associated with the Job Frequency prompt type. Allows users to choose from basic or advanced. The basic option allows users to choose from a list of default values including hourly, monthly, weekly and so on. The advanced option allows users to choose more specific time options such as a certain day of the month.

Previewing Prompts

To preview the prompts associated with a start component, do the following:

  1. In the Manager, select Jobs > Templates.
  2. Double-click the Job Template Library in which the component you want to preview is located.
  3. Right-click the start component (ScheduledStart or DropBox) and choose Preview Prompts.
  4. Click Cancel to return to the component and edit the prompts, or fill in the prompts and click Validate to run a job.    

Understanding Advanced Features

All Workflow components can make use of more advanced features such as prompted inputs and scripts.

Automation and Application Integration

You can enter a script in any Commands property type in a Component. The script should use a scripting language that is available to the agent on which the workflow component runs. Note that dds_perl is installed on all agents as part of the agent installation process. This scripting capability enables you to create powerful pre- and post-processing of data and advanced notification techniques. Every command you specify in a component should include the SolutionStandardHeaderPerl script library at the beginning. This script library provides the following functions:

  • Sets Perl for Unicode support
  • Sets standard output and standard error for automatic flushing
  • Sets the Perl variable LOGLEVELNAME to one of the strings ERROR, WARN, INFO, or DEBUG, depending on the setting of the Log Detail Level input. If the level is DEBUG, it also sets the Perl variable DEBUG to true

You can select the SolutionStandardHeaderPerl script from the “Insert Script …” drop-down in the Script editing screen.

The following limitations apply to scripts used in components:

  • Scripts must run in non-interactive mode (in other words, require no end-user intervention).
  • Scripts must in no way depend on the use of graphical interfaces. If a script attempts to create a graphical element, the workflow component fails immediately and an error notification is generated. Normally this is an issue for Windows-based agents, but may also be a problem on some UNIX-based installations.

Commands Window

Use the Commands Window as a workspace to create the commands you want to use with any given component.

To access the Commands window, do the following:

  1. In the Manager, select Jobs > Templates.
  2. Double-click the Job Template Library with the component to which you want to add a command.
  3. Right-click the component in which you want to specify a command, and choose Edit Component.
  4. In the left menu of the dialog, choose a command from the list of commands.
  5. Click the pencil/edit icon in the Value prompt.
  6. You can modify an existing script by clicking within it.

    The following describes the fields in the Commands page:

    • Input: The values in this field vary, depending on the inputs defined for the component.
    • Notification Keyword: Keywords that relate to job log/status notification.
    • Property Keyword: Keywords that relate to various properties.
    • Reference Script: Allows you to insert a reference to an existing script (located in the script library).
    • Insert Script: Allows you to insert the code from an existing script (located in the script library).
    • Preview: Clicking here opens a new window, and shows any referenced scripts expanded inline with line numbers. This helps with script debugging when a job reports line numbers associated with component errors.

     

  7. Click Preview to view the script (including reference scripts).
  8. Click Apply to save the script.
  9. Click OK to exit.

If you create a script that you want to reuse regularly, you can add it to the Script Library. Scripts created in the Script Library are displayed in the Reference Script and Insert Script drop-down lists. These drop-down lists allow you to refer to or insert a script stored in the Script Library. Whenever possible, use script referencing. You can make changes (bug fixes, and so on) in just the one script, instead of having to search for every command in which a commonly-used script appears.

Common Scripting Environment

The Signiant software is distributed with a common scripting environment which can be referenced by using the following keyword on the first line:

#!dds_perl

Using a common scripting environment allows users to minimize processing issues among heterogeneous systems (Windows and UNIX hosts taking part in the same workflows).

The script library allows you to store scripts which are commonly used in the commands found in workflow components. They can be referenced or inserted into component commands. The primary advantages include the following:

  • update bug fixes to scripts in one common location (i.e., instead of having to update commands within each affected component).
  • share scripts among workflow component developers.
  • import and export scripts (e.g., between development and production Signiant Managers).

Referencing a script library script simply inserts a token in the command script which represents the referenced script. This option should be used when you wish to use the most up-to-date functionality that is available for a common script.
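
For example, referencing a (hypothetical) library script named MyCleanupScript places only the following token in the command; the current library version of the script is expanded when the command runs:

   %script_lib_obj:MyCleanupScript%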

Inserting a script into a command takes a copy of the existing script and places it directly into your command script. This option should be used when you need to modify an existing script to create functionality that is unique to the workflow component you are building.

Alternate Scripting Language

You use the first line of any command script to inform the agent which script interpreter it should use. You can also use the first line to indicate the file extension that should be given to the script file. This allows Windows-based systems to launch the appropriate associations.

Syntax:

#!{script interpreter with options}#ft={file extension to be used}

For example, on Windows, to use WSH (Windows Script Host), use the following:

#!cscript //NoLogo#ft=vbs 
Wscript.Echo "Command terminated successfully"

The option "//NoLogo" must always be used, as it will ensure that the agent launches no dialog windows. The agent executes commands in the following manner:

  • The agent takes the contents of the command and substitutes any variables which it can now resolve (i.e., any variables that were defined using the %{variable}% syntax).
  • The agent takes the contents of the newly-substituted command (i.e., the lines which make up the command) and writes them to disk in either C:\tmp or /tmp (depending on the OS of the agent).
  • The agent writes this file out using the following file naming syntax:

    "dds_agent_{process id}.{command interpreter extension}"

    For example:

    c:\tmp\dds_agent_a01234.vbs

  • The agent executes this file and traps the return within the normal template execution.

By default, if no script interpreter is specified on the first line of the script, a Windows-based agent will try to execute commands using its default shell, CMD, and a UNIX-based agent will use its default shell, SH. There is also a common scripting environment which the command can reference by using the following keyword on the first line:

#!dds_perl

Safe Handling of Component Inputs with Scripts

To safeguard against security vulnerabilities, we recommend handling component inputs safely. The following example demonstrates this safe handling for an input used in the remote execution of commands:

my $default_directory = <<'__myHereDocTerminator__';
%dds_default_directory%
__myHereDocTerminator__
chomp $default_directory;
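
The single-quoted here-document causes Perl to treat the substituted %dds_default_directory% value as literal text, so nothing in the input can be interpreted as Perl code. A sketch of how the captured value might then be used (the existence check is illustrative, not part of the original example):

# $default_directory now holds the raw input value, never interpreted by Perl.
if (! -d $default_directory) {
    print STDERR "Directory does not exist: $default_directory\n";
    exit 1;
}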
						

Script Library

The script library allows users to create frequently-used scripts or portions of scripts and store them in a central location from which they can be imported, exported or reused in workflow components. Users can either copy the particular script from the script library into the workflow component, or create a reference to the particular script located in the script library. Script library components are available globally across the Manager. There are no Access Control Lists associated with them, so any user with access to the Script Library menu item can use scripts that are in the library.

  1. In the Manager, select Developer > Script Library.
  2. Create and manage scripts using the operations described below.

Add

To add a script to the script library, do the following:

  1. Click Add to open the dialog.
  2. In Script Name, type a name for the script.
  3. In Script Code, type the script you want to create.

    Use the Insert Notification Keyword and Insert Property Keyword drop-down lists to assist in creating the script.

  4. Click Save.

Delete

To delete a script from the script library, do the following:

  1. Select the script to delete and click Delete.
  2. Click Yes at the confirmation prompt.

Import

To import a script library file, do the following:

  1. Click Import.
  2. In File To Import, type the directory path of the file you want to import, or click Browse to choose the directory path.
  3. Click do not import if you want to ignore a script library that has the same name as an existing script library, or click import as a new version to import the file.
  4. Click Import.

    On import, a results page displays any messages and warnings generated during the import of a script library.

Export

Exporting the script library allows you to save the scripts in the library to a different location, as a backup or to use in another program. The entire script library is exported in an XML-formatted file, allowing you to edit or use the file as needed.

To export a script from the script library, do the following:

  1. Click Export.
  2. Follow the directions to specify the location to which you want to export the script.

Version

The Version drop-down in the top right corner allows users to select different versions of the script. Click Set Active Version to make the selected script the active version in any component in which it is inserted or referenced. By default, a maximum of ten versions of the script are stored in the database. The Version drop-down displays all the versions that are in the database.

To view versions of a script, do the following:

  1. Select the script for which you want to view version history.
  2. In the Version drop-down, select the script version to view.
  3. Click Set Active Version beside the version of the script that you want to make the active version.

Edit

To edit a script, select the script, apply your changes, and click Save.

Rather than edit the code for a component command from within the component using the Manager, it is possible to reference a file on disk from within the command. In this manner, you can modify the code using an editor of your choice, and every time a job that uses the component runs, it reads the latest code from disk. Use this method only during the development of a component. Never use it on a production system.

To set up a command to reference a file on disk, replace the component command with a single line (it must be the first line of the script) such as the following:

Windows Manager:

file://c:/tmp/myscript.pl

Linux Manager:

file:///tmp/myscript.pl

You can then edit the myscript.pl file as desired. When the component runs, it automatically reads the latest code from disk. The file is read from the Manager even if the command is being executed on a remote agent.

Interval Statistics

You may want to allow running Jobs to send statistics back to the Manager to show the current progress of the Job. Signiant uses two special tokens for this purpose: twu (total work units) and wuc (work units complete). In the sample script below, if twu=10, then wuc would equal the number of files transferred, and would increment as each file is transferred. The percentage of the job completed is calculated based on the number of work units transferred in relation to the total number of work units.

There are two commands you use to do this:

   %dds_cmd_capable%

(Called once to tell the agent the application can report statistics.)

   %dds_cmd_statistics%

(Called throughout the job to report back current job progress.)

Sample script snippet:

             sub DdsCmdCapable {
                 print STDOUT "%dds_cmd_capable%\n";
             }

             sub DdsCmdStats {
                 my $toTransfer = shift;
                 my $transferred = shift;

                 print STDOUT "%dds_cmd_statistics%twu=$toTransfer,wuc=$transferred\n";
             }
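
For example, a transfer loop might report progress as follows (a sketch; list_files_to_send() and transfer_file() are hypothetical stand-ins for the component's real work):

   my @files = list_files_to_send();   # hypothetical helper returning the file list

   DdsCmdCapable();                    # tell the agent this command reports statistics

   my $total = scalar(@files);         # total work units (twu)
   my $done  = 0;
   foreach my $file (@files) {
       transfer_file($file);           # hypothetical per-file work
       $done++;
       DdsCmdStats($total, $done);     # report work units complete (wuc)
   }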
           

Script Definitions

Signiant Core

These functions provide functionality directly related to interacting with Signiant-specific files, directories, and data structures. The underlying files and data structures should never be accessed directly, as their format may change; use these functions instead.

GetBaseDDSInstallDir()

Retrieves the base installation directory for the Signiant software. By default, this is:

  • Linux:

    /usr/signiant/dds

  • Windows:

    C:/Program Files/Signiant/Mobilize

    Note that the Windows path is returned with forward slashes, not back slashes.

Parameters

  • None

Return

  • $installDir Installation directory

getBaseDDSConfigPath()

Retrieves the path to the Signiant configuration file. By default, this is:

  • Linux:

    /etc/dds.conf

  • Windows:

    C:\Program Files\Signiant\Mobilize\bin\dds.cfg

Parameters

  • None

Return

  • $configPath Path to the Signiant configuration file

readConfig($configFile)

Reads the signiant.ini file into an associative array.

Parameters

  • $configFile Path to signiant.ini file

Return

  • %ini Key/value pairs for all entries in signiant.ini

readDDSConfig($configFile)

Reads the dds.conf (Linux) or dds.cfg (Windows) file into an associative array.

Parameters

  • $configFile Path to the dds.conf (Linux) or dds.cfg (Windows) file

Return

  • %ini Key/value pairs for all entries in the dds configuration file
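
A minimal sketch combining these functions, assuming they have been referenced into the command from the script library:

   my $installDir = GetBaseDDSInstallDir();
   my $configFile = getBaseDDSConfigPath();
   my %config     = readDDSConfig($configFile);

   print "Signiant is installed in $installDir\n";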

Scheduler Service

The SchedulerService script provides a set of functions that give easy access to the Signiant web services SOAP interface. The following is an example:

   use SOAP::Lite;

   my $schedSoapCalls = eval{new SchedulerService();} or die ($@);

   $schedSoapCalls->setup("https://myserver/signiant", "admin", "password");
   $schedSoapCalls->commandForJob("jobGroup", "jobName", "kill");
         

For all subroutines that return a return code, zero indicates success and non-zero indicates failure. Details of a fault or failure can be retrieved from $object->lastError().

addJobVariable($username, $password, $jobGroupName, $jobName, $name, $value)

Adds a custom job variable.

Parameters

  • $username the username for the user calling this method
  • $password the password for the user calling this method
  • $jobGroupName the name of the job group
  • $jobName the name of the job
  • $name the name that will be mapped to the value of the variable
  • $value the value that will be mapped to the name of the variable

Returns

  • $rc 0 = success, non-zero = failure

commandForJob($jobGroup, $jobName, $action)

Performs a specific action on an existing job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $action Action to perform on job

    Valid actions:

  • delete Deletes job
  • force Starts job running
  • kill Cancels running job
  • resume Resumes schedule for suspended job
  • suspend Suspends schedule for job
  • setbwlimits_#:#:# Sets one or more bandwidth limits for a given job. "#" is replaced by the overall bandwidth, the bandwidth floor, and the bandwidth ceiling respectively. You may leave any of the limits blank. Values are in bits/second. Non-WAN accelerated transfers ignore the floor and ceiling parameters.

Returns

  • $rc 0 = success, non-zero = failure
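
For example, a sketch that suspends a job and then sets only its overall bandwidth limit (10 Mb/s), leaving the floor and ceiling blank; the server URL, credentials, and job names are placeholders:

   my $sched = new SchedulerService();
   $sched->setup("https://myserver/signiant", "admin", "password");

   my $rc = $sched->commandForJob("MyGroup", "MyJob", "suspend");
   die $sched->lastError() if $rc != 0;

   # Overall limit only; floor and ceiling left blank (values in bits/second).
   $rc = $sched->commandForJob("MyGroup", "MyJob", "setbwlimits_10000000::");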

dumpComplexHashAsString($depth, %theHash)

Returns the supplied hash as a string.

Parameters

  • $depth Starting depth for tabbing
  • %theHash A hash

Returns

  • $string String containing hash

getActiveStatusMsg($jobGroup, $jobName)

Retrieves the active status message (custom message from a component) from either the current running instance of this job or the last message from the most recent run of this job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name

Returns

  • $rc 0 = success, non-zero = failure
  • $statusMessage Last status message issued from the job

getJobActiveState($jobGroup, $jobName)

Retrieves the active state of job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name

Returns

  • $rc 0 = success, non-zero = failure
  • $state Current active state, one of RUNNING or IDLE

getJobGroupId($username, $password, $groupName)

Retrieves the integer ID for the given job group.

Parameters

  • $username the username for the user calling this method
  • $password the password for the user calling this method
  • $groupName the name of the group

Returns

  • $rc 0 = success, non-zero = failure
  • $jobGroupID the ID for the job group

getJobInfo($jobName, $jobGroupName)

Retrieves information for a given job from the specified job group name.

Parameters

  • $jobName the name of the job
  • $jobGroupName the name of the job group

Returns

  • $rc 0 = success, non-zero = failure
  • $jobName the name of the job
  • $jobGroup the name of the job group
  • $jtlName the name of the template library in which the workflow of the job is located
  • $jobTemplateName the name of the actual template (workflow) that is used for the specified job
  • $timeZone the time zone that is used to calculate the original date values that are stored for the job
  • $jobGroupID the job group ID
  • %variables a hash containing any other variables defined for the job

getJobScheduledState($jobGroup, $jobName)

Retrieves the scheduled state (DORMANT or SUSPENDED) of a job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name

Returns

  • $rc 0 = success, non-zero = failure
  • $state Current scheduled state, one of DORMANT or SUSPENDED. Dormant means the job will run at its next scheduled start time; suspended means the job will not run at its next scheduled start time.

getJobStatus($jobGroup, $jobName)

Retrieves both the active state and the scheduled state of the job. The active status (RUNNING or IDLE) and schedule status (DORMANT or SUSPENDED) are separated with a plus (+) sign.

Parameters

  • $jobGroup Job group name
  • $jobName Job name

Returns

  • $rc 0 = success, non-zero = failure
  • $jobState Current active state (RUNNING or IDLE) followed by current scheduled state (DORMANT or SUSPENDED) separated by a plus "+" sign.
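
Because the two states are joined with a plus sign, they can be separated with a simple split. A sketch, assuming $sched is a SchedulerService object set up as shown earlier and the two documented return values come back as a Perl list:

   my ($rc, $jobState) = $sched->getJobStatus("MyGroup", "MyJob");
   if ($rc == 0) {
       # e.g. "RUNNING+DORMANT" splits into the two states
       my ($activeState, $scheduledState) = split /\+/, $jobState;
       print "Active: $activeState, Scheduled: $scheduledState\n";
   }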

getLastJobResult($jobGroup, $jobName)

Retrieves the overall job exit code for the specified job for the most recent run of the job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name

Returns

  • $rc 0 = success, non-zero = failure
  • $exitCode Job exit code. 0 = Success, non-zero = failure

getStats($jobGroup, $jobName, $runNumber, $fields, $fieldSep, $recordSep)

Retrieves transfer statistics for a given job run. Will always return the following fields (regardless of whether or not they are requested):

package_name (also known as component name)

package_type (also known as component type)

source_host

target_host

Other valid fields: job_id, package_name, status, effective_bytes, byte_count, files_deleted, remote_start_time, transfer_end_time, bandwidth_throttle, transfer_rate_min, directories_deleted, file_attr_bytes, file_bytes_skipped, ovhd_src_rsync, ovhd_tgt_mnfst_comp, ovhd_src_cchnl, sf_bytes_sent, udp_aggressiveness, udp_pkts_recvd, udp_pkts_resent, unrecovered_errors, proc_data_bytes_unconsumed, tnnl_data_bytes_recvd, source_host, package_type, file_count, files_transferred, failed_files, names_cmd_end_time, remote_end_time, transfer_rate, directories_transferred, failed_directories, file_data_bytes_comp, file_bytes_deleted, ovhd_tgt_rsync, ovhd_src_prtcl, ovhd_tgt_cchnl, ntwk_bytes_recvd, udp_payload_size, udp_pkts_rjctd, udp_ceiling, total_errors, tnnl_data_bytes_sent, agent_stat_time, target_host, transport_type, directory_count, files_skipped, agent_start_time, transfer_start_time, agent_end_time, transfer_rate_max, directories_skipped, file_data_bytes, file_attr_bytes_comp, rsync_bytes_skipped, ovhd_src_mnfst_comp, ovhd_tgt_prtcl, sf_bytes_recvd, ntwk_bytes_sent, udp_header_size, udp_pkts_sent, udp_floor, proc_data_bytes_sent

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $runNumber Job run number
  • $fields Statistics fields to retrieve
  • $fieldSep Field separator to be used in returned string
  • $recordSep Record separator to be used in returned string

Returns

  • $rc 0 = success, non-zero = failure
  • $stats String containing requested statistics with specified field and record separators
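
A sketch retrieving a few fields for run 1 of a job, using a comma as the field separator and a newline as the record separator (again assuming $sched is set up as shown earlier):

   my ($rc, $stats) = $sched->getStats("MyGroup", "MyJob", 1,
                                       "file_count,byte_count,transfer_rate",
                                       ",", "\n");
   if ($rc == 0) {
       # Each record also carries the always-returned fields
       # (package_name, package_type, source_host, target_host).
       print "$_\n" for split /\n/, $stats;
   }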

listGroups

Returns a list of group names and contractIDs.

Parameters

None

Returns

  • $rc 0 = success, non-zero = failure
  • %groupInfo List of group names and contract IDs.

listJobs($jobGroup)

Returns a list of job names within a given job group.

Parameters

  • $jobGroup Job group name

Returns

  • $rc 0 = success, non-zero = failure
  • @jobs List of matching jobs

listQueuedJobs($username, $password, $groupName)

Retrieves a list of queued jobs.

Parameters

  • $username the username for the user calling this method
  • $password the password for the user calling this method
  • $groupName the name of the group

Returns

  • $rc 0 = success, non-zero = failure
  • @queuedJobs the list of queued jobs

listResourceControlQueue($username, $password, $resourceControlName)

Retrieves a hash of all queued jobs along with certain properties of those jobs.

Parameters

  • $username the username for the user calling this method
  • $password the password for the user calling this method
  • $resourceControlName the name of the resource control

Returns

  • $rc 0 = success, non-zero = failure
  • %allQueuedJobsInfo a hash of all queued jobs with specific properties of the jobs

listResourceControls

Retrieves a hash of all resource controls.

Parameters

None

Returns

  • $rc 0 = success, non-zero = failure
  • %resourceControls a hash of all resource controls

listResourcesForJob($username, $password, $jobName, $jobGroupName)

Retrieves hash with certain job parameters and the agent resource controls that apply for that job.

Parameters

  • $username the username for the user calling this method
  • $password the password for the user calling this method
  • $jobName the name of the job
  • $jobGroupName the name of the job group

Returns

  • $rc 0 = success, non-zero = failure
  • %queuedJobInfo a hash of job parameters and agent resource controls that apply for the job

modifyJobPriority($username, $password, $jobGroupName, $jobName, $priority)

Sets a priority for a job.

Parameters

  • $username the username for the user calling this method
  • $password the password for the user calling this method
  • $jobGroupName the name of the job group
  • $jobName the name of the job
  • $priority the priority of the job. The options are: Low, Medium, High, Urgent, Immediate.

Returns

  • $rc 0 = success, non-zero = failure

moveJobInQueue($username, $password, $jobGroupName, $jobName, $offset)

Moves a job to a position in the queue relative to its current position.

Parameters

  • $username the username for the user calling this method
  • $password the password for the user calling this method
  • $jobName the name of the job
  • $jobGroupName the name of the job group
  • $offset the number of positions by which the job will be moved in the queue

Returns

  • $rc 0 = success, non-zero = failure

moveJobInResourceQueue($username, $password, $consumerId, $offset)

Moves jobs that have the specified consumer ID in the queue by the given offset.

Parameters

  • $username the username for the user calling this method
  • $password the password for the user calling this method
  • $consumerId the id of the consumer
  • $offset the number of positions by which the job will be moved in the queue

Returns

  • $rc 0 = success, non-zero = failure

newJob($jobGroup, $jobName, $jobTemplateLibrary, $jobTemplate)

Creates a new job. If a job with the same name exists in the job group, an exception will be thrown. The time zone is set to the current time zone. No variables are passed to the job. A new job group will be created if one does not exist and ENABLE_AUTO_JOB_GROUP_CREATE is set to "yes" in the signiant.ini file on the Manager.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $jobTemplateLibrary Job template library
  • $jobTemplate Job template

Returns

  • $rc 0 = success, non-zero = failure

newJobGroup($jobGroupName)

Creates a new job group and returns the ID of the created job group. In the event that the job group already exists, the ID of the existing job group is returned.

Parameters

  • $jobGroupName the name of the job group

Returns

  • $rc 0 = success, non-zero = failure
  • $jobGroupID the ID of the job group.

removeJob($jobGroup, $jobName)

Delete an existing job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name

Returns

  • $rc 0 = success, non-zero = failure

removeJobVariable($jobGroup, $jobName, $name)

Removes a variable from a job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $name Variable name

Returns

  • $rc 0 = success, non-zero = failure

resetAllJobVariables($jobGroup, $jobName, %varValues)

Clears all of the job's variables and uses varValues as the new set.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • %varValues New variable values

Returns

  • $rc 0 = success, non-zero = failure

resumeJobSchedule($jobGroup, $jobName)

Resumes a job that has been suspended. The job will run at its next scheduled start time.

Parameters

  • $jobGroup Job group name
  • $jobName Job name

Returns

  • $rc 0 = success, non-zero = failure

runJob($jobGroup, $jobName, $timeout)

Starts a job (commandForJob("force")) and monitors it (every half second) until the timeout expires.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $timeout Time in seconds that the job will be monitored after it's started

Returns

  • $rc 0 = success, non-zero = failure

setBandwidthCeiling($jobGroup, $jobName, $bwCeiling)

Sets the bandwidth ceiling for the specified job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $bwCeiling Bandwidth ceiling in bits/second

Returns

  • $rc 0 = success, non-zero = failure

setBandwidthFloor($jobGroup, $jobName, $bwFloor)

Sets the bandwidth floor for the specified job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $bwFloor Bandwidth floor in bits/second

Returns

  • $rc 0 = success, non-zero = failure

setBandwidthLimit($jobGroup, $jobName, $bwLimit)

Sets the bandwidth limit for the specified job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $bwLimit Bandwidth limit in bits/second

Returns

  • $rc 0 = success, non-zero = failure

setBandwidthLimits($jobGroup, $jobName, $bwLimit, $bwFloor, $bwCeiling)

Sets the bandwidth limit, floor, and ceiling for the specified job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $bwLimit Bandwidth limit in bits/second
  • $bwFloor Bandwidth floor in bits/second
  • $bwCeiling Bandwidth ceiling in bits/second

Returns

  • $rc 0 = success, non-zero = failure

setJobVariable($jobGroup, $jobName, $name, $value)

Sets a single job variable, overwriting the existing setting if already set.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $name Variable name
  • $value Variable value

Returns

  • $rc 0 = success, non-zero = failure

setJobVariables($jobGroup, $jobName, %varValues)

Sets the variables that have been passed in via %varValues. Does not remove any other variables that might also have been set prior to this call.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • %varValues Variable values (key is variable name)

Returns

  • $rc 0 = success, non-zero = failure

setup($baseUrl, $username, $password)

Sets up the SOAP environment.

Parameters

  • $baseUrl SOAP URL
  • $username Username
  • $password Password

Returns

  • $rc Returns the string "ok".

shortCreateJob($jobGroup, $jobName, $jobTemplateLibrary, $jobTemplate, $timeZone, %varValues)

Creates a new job. If a job with the same name exists in the job group, an exception will be thrown. A new job group will be created if one does not exist and ENABLE_AUTO_JOB_GROUP_CREATE is set to "yes" in the signiant.ini file on the Manager.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $jobTemplateLibrary Job template library name
  • $jobTemplate Job template (component) name
  • $timeZone Time zone
  • %varValues Variable values (key is variable name)

Returns

  • $rc 0 = success, non-zero = failure
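
A sketch creating and then starting a job with one custom variable; the library, template, and variable names are placeholders, and $sched is a SchedulerService object set up as shown earlier:

   my %vars = ( SourceDirectory => "/data/outgoing" );
   my $rc = $sched->shortCreateJob("MyGroup", "MyJob",
                                   "MyTemplateLibrary", "MyTemplate",
                                   "America/New_York", %vars);
   $rc = $sched->startJob("MyGroup", "MyJob") if $rc == 0;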

shortUpdateJob($jobGroup, $jobName, $timezone, %varValues)

Updates time zone and variable values in an existing job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name
  • $timeZone Time zone
  • %varValues Variable values (key is variable name)

Returns

  • $rc 0 = success, non-zero = failure

showFault($soapResult)

Prints SOAP fault.

Parameters

  • $soapResult SOAP result containing the fault

Returns

  • $rc 0 = success, non-zero = failure

startJob($jobGroup, $jobName)

Starts a non-running job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name

Returns

  • $rc 0 = success, non-zero = failure

stopJob($jobGroup, $jobName)

Stops (cancels) a running job.

Parameters

  • $jobGroup Job group name
  • $jobName Job name

Returns

  • $rc 0 = success, non-zero = failure

suspendJobSchedule($jobGroup, $jobName)

Suspends a job's run schedule. If the job is currently running, it will continue running until its normal completion.

Parameters

  • $jobGroup Job group name
  • $jobName Job name

Returns

  • $rc 0 = success, non-zero = failure

Server Configuration

This section describes functions associated with the ServerConfiguration script.

GetLinuxCPU()

Retrieves the CPU information from a Linux machine.

Parameters

  • None

Pre-requisite Scripts

  • None

Return

  • $cpu CPU information

GetLinuxMemory()

Retrieves the physical memory from a Linux machine.

Parameters

  • None

Pre-requisite Scripts

  • None

Return

  • $memory Memory information

GetLinuxVolumeInfo()

Retrieves the volume information from a Linux machine.

Parameters

  • None

Pre-requisite Scripts

  • SolutionParseUnixVolumeInfo

Return

  • @volinfo [0] = Name, [1] = Total Size (Kbytes), [2] = Used (Kbytes), [3] = Available (Kbytes), [4] = Percent Used, [5] = Mounted Point

GetSolarisCPU()

Retrieves the processor information from a Solaris machine.

Parameters

  • None

Pre-requisite Scripts

  • None

Return

  • $cpu CPU information

GetSolarisMemory()

Retrieves the physical memory from a Solaris machine.

Parameters

  • None

Pre-requisite Scripts

  • None

Return

  • $memory Memory information

GetSolarisVolumeInfo()

Retrieves the volume information from a Solaris machine.

Parameters

  • None

Pre-requisite Scripts

  • SolutionParseUnixVolumeInfo

Return

  • @volinfo [0] = Name, [1] = Total Size (Kbytes), [2] = Used (Kbytes), [3] = Available (Kbytes), [4] = Percent Used, [5] = Mounted Point

GetUnixOS()

Returns the operating system details for Unix machines.

Parameters

  • None

Pre-requisite Scripts

  • None

Return

  • $os Operating system details

GetWindowsCPU()

Retrieves Windows processor information.

Parameters

  • None

Pre-requisite Scripts

  • SolutionConnectWMIOLE

Return

  • $cpu CPU information

GetWindowsMemory()

Retrieves Windows memory information.

Parameters

  • None

Pre-requisite Scripts

  • SolutionConnectWMIOLE

Return

  • $memory Memory information

GetWindowsOS()

Returns the Operating System details for Windows.

Parameters

  • None

Pre-requisite Scripts

  • SolutionConnectWMIOLE

Return

  • $os Operating system information

GetWindowsVolumeInfo()

Retrieves volume information from a Windows machine.

Parameters

  • None

Pre-requisite Scripts

  • SolutionConnectWMIOLE

Return

  • @volinfo [0] = Name, [1] = Total Size (Kbytes), [2] = Used (Kbytes), [3] = Available (Kbytes), [4] = Percent Used, [5] = Mounted Point

 

Miscellaneous

This section describes a variety of functions associated with different scripts.

BuildFullPathSearchVariable($debug, $srcdir)

Formats a list of search paths for findfiles.

Parameters

  • $debug Set non-zero to enable debugging
  • $srcdir List of comma-separated search paths

Pre-requisite Scripts

  • ModifyInputValue
  • DebugLog

Return

  • $searchPath Search path for findfiles.

ConfigSimpleEmail($to, $subject, $type)

Configures Signiant job success and/or failure notification emails, assigning the recipient ("to") address and a subject. The subject for a failure email is the provided subject prefaced with "Failure".

Parameters

  • $to Email address of recipient
  • $subject Subject of email
  • $type 0 = Configure success email only, 1 = Configure failure email only, 2 = Configure both success and failure emails

Pre-requisite Scripts

  • None

Return

  • None

ConnectWMIOLE($server, $cimroot)

Connects to Windows Management Instrumentation with OLE.

Parameters

  • $server Server, empty for local server
  • $cimroot CIM root to connect to

Pre-requisite Scripts

  • None

Return

  • $services WMI services object, or null if an error occurred

DateTimeLabel($timeZone)

Creates a label based on current date/time combination.

Parameters

  • $timeZone 1 = Use GMT, otherwise use the local time zone

Pre-requisite Scripts

  • None

Return

  • $timestamp Time stamp in the format YYMMDDHHMMSS

DebugLog($line)

Prints passed string to STDOUT.

Parameters

  • $line String to print

Pre-requisite Scripts

  • None

Return

  • None

 

SolutionPgDump($ddsHome)

Resolves the location of the postgres database dump command line tool. Invokes a Perl die command if the tool cannot be found.

Parameters

  • $ddsHome Signiant install directory (for example, /usr/signiant/dds)

Pre-requisite Scripts

  • None

Return

  • $pgPath Path to pg_dump tool

SolutionFindPsql($ddsHome)

Resolves the location of the postgres command line query tool. Invokes a Perl "die" command if the tool cannot be found.

Parameters

  • $ddsHome Signiant install directory (for example, /usr/signiant/dds)

Pre-requisite Scripts

  • None

Return

  • $psqlPath Path to psql tool

GetDebugFromLogLevel($jobLogLevel)

Checks the job log level to determine whether debug logging is requested. A job log level of three means that debug logging is requested.

Parameters

  • $jobLogLevel Job log level, numeric, 0 through 3

Pre-requisite Scripts

  • None

Return

  • $debug 0 = debug not enabled, 1 = debug enabled

SolutionKillProcess($pid)

Kills the process identified by the given PID.

Parameters

  • $pid Process id

Pre-requisite Scripts

  • None

Return

  • $rc 0 = Success, non-zero = Failure

NotifyMessage($message, $interfaces)

Sends the supplied message to the requested notification interfaces. The interfaces are defined bitwise by $interfaces:

  • Bit 1: Job log; INFO
  • Bit 2: Active status
  • Bit 4: Body of job success notification email
  • Bit 8: Body of job failure notification email
  • Bit 16: Job log; WARN

A value of zero for $interfaces is equivalent to bits 2, 4, 8, and 16.

Parameters

  • $message Message to output
  • $interfaces Interfaces on which to place the message

Pre-requisite Scripts

  • None

Return

  • None
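
For example, to send a message to both the job log (INFO) and the active status, combine bits 1 and 2 (a sketch):

   # 1 (job log, INFO) + 2 (active status) = 3
   NotifyMessage("Processed 42 of 100 files", 3);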

Win32MultiByteToWideChar($string, $codePage)

Converts a UTF-8 character string to Windows wide characters. The function invokes a Perl "die" if an error occurs. If no code page is specified, then a code page of zero is used.

Parameters

  • $string String to convert
  • $codePage Code page to use

Pre-requisite Scripts

  • None

Return

  • $ustring Converted string

Win32WideCharToMultiByte($ustring, $codePage)

Converts a Windows wide-character string to UTF-8. The function invokes a Perl "die" if an error occurs. If no code page is specified, then a code page of zero is used.

Parameters

  • $ustring String to convert
  • $codePage Code page to use

Pre-requisite Scripts

  • None

Return

  • $string Converted string

SolutionStandardHeaderPerl

This code should be included at the beginning of every Perl-based command used in each component that is developed. It performs the following functions:

  • Enforces "strict" operation.
  • Enables Unicode support.
  • Enables automatic flushing of STDOUT and STDERR.
  • Sets the variables $LOGLEVELVALUE, $LOGLEVELNAME, and $DEBUG.
  • Sets the variables $TRUE = 1 and $FALSE = 0.

  • Sets the variables $IsWindows, $IsUnix, and $SLASH as follows:

Windows:

    $IsWindows = $TRUE
    $IsUnix = $FALSE
    $SLASH = \

Non-Windows:

    $IsWindows = $FALSE
    $IsUnix = $TRUE
    $SLASH = /

Parameters

  • Not applicable. This is not a function, but is inserted as straight code.

Pre-requisite Scripts

  • Not applicable

Return

  • Not applicable
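
Once the header code has been inserted, the variables it defines can be used directly in the rest of the command; for example (a sketch):

   if ($IsWindows) {
       print "Running on a Windows agent\n" if $DEBUG;
   }
   my $incoming = "data" . $SLASH . "incoming";   # platform-native separator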

dumpComplexHash($depth, %theHash)

Prints all keys and values in a hash.

Parameters

  • $depth Starting depth for tabbing
  • %theHash A hash

Pre-requisite Scripts

  • Not applicable

Return

  • Not applicable

dumpSimpleHash(%theHash)

Prints all keys and values in a hash.

Parameters

  • %theHash A hash

Pre-requisite Scripts

  • Not applicable

Return

  • Not applicable

 

Statistics

This section describes a variety of functions associated with different statistics.

ParseStatsSummaryForJob($report)

Parses the workflow summary statistics message, and returns a hash with the template as key and a hash of name/value pairs per key.

Parameters

  • $report Statistics report to parse, set to %workflow_summary_stats%

Pre-requisite Scripts

  • None

Return

  • $hashTemplate

ParseStatsConstantsForJob ($report)

Parses the stats_const message, and returns a hash with the template as key and a hash of name/value pairs per key.

Parameters

  • $report Statistics report to parse, set to %workflow_const_stats%

Pre-requisite Scripts

  • None

Return

  • $hashTemplate

ParseStatsReportForJob ($report)

Parses the workflow report statistics message, and returns a hash with the template as key and a hash of name/value pairs per key.

Parameters

  • $report Statistics report to parse, set to %workflow_report_stats%

Pre-requisite Scripts

  • None

Return

  • $hashTemplate

ParseStatsAggregateForJob ($report)

Parses the workflow aggregate statistics message, and returns a hash with the template as key and a hash of name/value pairs per key.

Parameters

  • $report Statistics report to parse, set to %workflow_aggregate_stats%

Pre-requisite Scripts

  • None

Return

  • $hashTemplate

 

Input/Output Processing

This section describes a variety of functions associated with processing inputs and outputs.

ModifyInputValue($value, $replaceBS)

Modifies the input value to remove quotes and trailing "/" or "\" characters. This strips off spurious characters that get inserted into variables by substitution from prompts. If $replaceBS == 1, then single backslashes are changed into double backslashes.

Parameters

  • $value Value to modify
  • $replaceBS 1 = Replace backslashes

Pre-requisite Scripts

  • None

Return

  • $modifiedValue Modified value
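
A sketch cleaning up a directory prompt value before use, combining the safe-input pattern shown earlier with ModifyInputValue (%source_directory% is a placeholder input variable):

my $rawDir = <<'__EOT__';
%source_directory%
__EOT__
chomp $rawDir;

# Strip quotes and trailing slashes; 0 = leave single backslashes alone.
my $srcDir = ModifyInputValue($rawDir, 0);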

ModifyInputValueMulti($values, $replaceBS)

Calls ModifyInputValue for each component of the comma-delimited string. If $replaceBS == 1, then single backslashes are changed into double backslashes.

Parameters

  • $values Values to modify; comma-separated list
  • $replaceBS 1 = Replace backslashes

Pre-requisite Scripts

  • SolutionModifyInputValue

Return

  • $modifiedValues Modified values, comma-separated list

SigListXMLCreate($xmlString, $listType, @elArray, $xAttrs)

Creates an XML string based on the type and element array passed. The array can either be a simple array of strings, or an array of hashes where each hash element represents an element attribute. (In other words, $elArray[n]{'V'} = "C:/temp" indicates that this element has an attribute called "V" with the value "C:/temp".)

Parameters

  • $xmlString XML string to create
  • $listType The string indicating the list type: ("FILEDIR","PATHLIST","MULTILINEVAL")
  • @elArray Simple array of strings OR an array of hashes (element attribute name/value pairs). By reference.
  • $xAttrs Optional

Pre-requisite Scripts

  • None

Return

  • $rc 0 = String successfully created, 1 = A failure occurred
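
A sketch building a PATHLIST string from an array of hashes, where each element carries a 'V' (value) attribute; argument passing follows the documented signature, and the exact by-reference conventions depend on the script library implementation:

   my @elArray = ( { 'V' => 'C:/temp' }, { 'V' => 'C:/data' } );
   my $xmlString;                 # filled in by the function
   my $rc = SigListXMLCreate($xmlString, "PATHLIST", @elArray, undef);
   print "$xmlString\n" if $rc == 0;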

IsSigListXmlFormat($xmlString)

Returns TRUE or FALSE indicating whether the string appears to be in SigListXML format.

Parameters

  • $xmlString String to check

Pre-requisite Scripts

  • None

Return

  • True if supplied string is in SigListXML format, false otherwise.

SigListXMLParse($xmlString, $listType, @elArray, $xAttrs)

Parses the XML string and returns an array of hashes. One array element (hash) is returned per element. Each hash element represents an element attribute name/value pair.

Parameters

  • $xmlString XML string to parse
  • $listType The string indicating the list type ("FILEDIR","PATHLIST","MULTILINEVAL"). By reference.
  • @elArray Simple array of strings OR an array of hashes (element attribute name/value pairs). By reference.
  • $xAttrs By reference. Optional

Pre-requisite Scripts

  • None

Return

  • $rc 0 = Success, 1 = String is not SigList type, 2 = String is not parsable

SigListGetELementsByAttribute(@elArray, $elAttr)

Returns an array of attribute values from the passed element hash of the requested type.

Parameters

  • @elArray Array of elements (simple array of strings OR an array of hashes). By reference.
  • $elAttr Element attribute type requested

Pre-requisite Scripts

  • None

Return

  • @elArray Array of elements of the requested type

SigArrayToMultiLineStr(@elArray)

Takes an element array and creates a multi-line string out of the element values. The input array can either be a simple array of strings or an array of hashes. If the input array is an array of hashes, only those hash elements with a key of 'V' (meaning "value") are used to create the output string.

Parameters

  • @elArray Array of elements (simple array of strings OR an array of hashes)

Pre-requisite Scripts

  • None

Return

  • $rc 0 = Success, 1 = Failure

SolutionSNMPGetTrap($trapName, $trapOID, $trapText)

Retrieves the trap OID and optional text for a given trap name from the SNMP map file.

Parameters

  • $trapName Trap name
  • $trapOID Trap OID, by reference
  • $trapText Trap text, by reference

Pre-requisite Scripts

  • None

Return

  • None

SolutionSNMPSendTrap($trapOID, $trapText, $trapHostList, $community, $enterprseID)

Sends an SNMP trap using the dds_snmptrap utility.

Parameters

  • $trapOID The enterprise trap number in the Signiant MIB
  • $trapText The optional trap text to send along with the trap
  • $trapHostList A comma-separated list of trap hosts
  • $community SNMP community string
  • $enterprseID Optional

Pre-requisite Scripts

  • None

Return

  • $rc

SolutionSplitPaths($paths, $separator)

Splits a list of paths into an array of individual paths. The list to split may either be in SigListXML format or a list with a generic separator. If the list is in SigListXML format, then the supplied separator is ignored.

Parameters

  • $paths List of paths to split
  • $separator Separator used between paths

Pre-requisite Scripts

  • None

Return

  • @paths List of paths
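
A sketch splitting a comma-separated list (a string in SigListXML format would be handled automatically and the separator ignored):

   my @paths = SolutionSplitPaths("C:/temp,C:/data,D:/archive", ",");
   print "$_\n" for @paths;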

 

Volume Shadow Services

This section describes a variety of functions associated with Volume Shadow Services.

SolutionSubstituteShadowDevices($multiPathString, $deviceMapping)

Replaces all occurrences in the string $multiPathString that match keys in the device mapping hash with the values stored in the given hash bucket.

Parameters

  • $multiPathString String containing the paths to map
  • $deviceMapping Device mapping hash, by reference

Pre-requisite Scripts

  • None

Return

  • $multiPathString Original string with substitutions performed.

SolutionParseShadowMapFile($deviceMapping, $setID, $mapFile, $stdoutSafe)

Reads the shadow device map file. If drive "n" exists in the map file, then sets the value from the map file in the device map hash.

Parameters

  • $deviceMapping Device mapping, by reference
  • $setID Shadow set ID
  • $mapFile Shadow map file
  • $stdoutSafe If set to true, then print informational messages

Pre-requisite Scripts

  • None

Return

  • %deviceMapping

SolutionOSSupportsVolumeShadowCopy($agent, $stdoutSafe)

Returns true if the OS supports volume shadow copy service.

Parameters

  • $agent Check "source" or "target"; assumes a file transfer
  • $stdoutSafe If true and job log level set to debug, then send check result to job log

Pre-requisite Scripts

  • SolutionGetWindowsOS

Return

  • $vscSupported 0 = Not supported, 1 = supported

SolutionCreateShadow($deviceMapping, $timeout, $lifetime, $mapFile, $errorFile)

Creates a process to execute dds_shadow, then loops until the mapping file appears or the timeout expires.

Parameters

  • $deviceMapping Device mapping (by reference)
  • $timeout Maximum time to wait for shadow file to be created; in seconds
  • $lifetime Time to wait for dds_shadow to execute; in seconds
  • $mapFile Map file name
  • $errorFile Error file name

Pre-requisite Scripts

  • None

Return

  • $pid Process id of dds_shadow process, or -1 if an error occurred.

SolutionDeleteShadow($setID)

Deletes all shadow copies within the given ShadowSetID.

Parameters

  • $setID Shadow set ID

Pre-requisite Scripts

  • None

Return

  • $pid Process id of dds_shadow or -1 if an error occurred

 

File/Path Operations

This section describes a variety of functions associated with File/Path operations.

ValidateRootDirectory($rootDir)

Validates that a root directory was specified. This will check Windows drives, Unix root, and UNC names. Forward and backward slashes are treated as equal.

Parameters

  • $rootDir Directory to validate

Pre-requisite Scripts

  • None

Return

  • $rc 1 = Root directory, 0 = Not a root directory

SolutionValidateUnixAbsolutePath($path, $checkPathExists)

Validates whether supplied path is an absolute Unix path. If it is an absolute Unix path, then optionally check whether it exists and is a directory. If the path is a file, then this check will indicate the path does not exist. No check is made that the path is readable. Forward and backward slashes are treated as equal.

Parameters

  • $path Path to validate
  • $checkPathExists Should an existence check be made on the path if it is validated as an absolute path? 0 = Do not check, 1 = Check

Pre-requisite Scripts

  • None

Return

  • $rc 0 = Path is not absolute, 1 = path is absolute and (optionally) exists as a directory, 2 = path is absolute and (optionally) does not exist as a directory

ShareAccessible($sharePath)

Checks that the specified share is accessible.

Parameters

  • $sharePath Path to share

Pre-requisite Scripts

  • None

Return

  • $rc 0 = Share not accessible, 1 = Share accessible

SolutionValidateWindowsAbsolutePath($path,$checkPathExists)

Validates whether supplied path is an absolute Windows path. If it is an absolute Windows path, then optionally check whether it exists and is a directory. If the path is a file, then this check will indicate the path does not exist. No check is made that the path is readable. Forward and backward slashes are treated as equal. Both local and UNC paths are supported.

Parameters

  • $path Path to validate
  • $checkPathExists Should an existence check be made on the path if it is validated as an absolute path? 0 = Don't check, 1 = Check

Pre-requisite Scripts

  • None

Return

  • $rc 0 = Path is not absolute, 1 = path is absolute and (optionally) exists as a directory, 2 = path is absolute and (optionally) does not exist as a directory

GetShortName($pathName)

Returns the DOS 8.3 name for a given file. The function invokes a Perl "die" if an error occurs.

Parameters

  • $pathName Path to convert

Pre-requisite Scripts

  • None

Return

  • $shortPathName Converted path

MakeNativePath($path)

Converts backslashes to forward slashes or vice-versa depending on the running platform.

Parameters

  • $path Path to modify

Pre-requisite Scripts

  • None

Return

  • $nativePath Path with native separators

SolutionGetCommonDrives($searchDrives, $availableDrives)

Finds the drives common to two lists.

Parameters

  • $searchDrives Drives to search, by reference
  • $availableDrives Available drives, by reference

Pre-requisite Scripts

  • None

Return

  • %deviceMapping List of common drives

GetBaseDirectoryListing($directory)

Gets the files in the base directory.

Parameters

  • $directory Directory to read

Pre-requisite Scripts

  • None

Return

  • @files List of files

CreateDirectory($dir)

Creates a directory based on a full path. Sets owner/group to admin/admingrp on GOS-based systems.

Parameters

  • $dir Directory to create

Pre-requisite Scripts

  • None

Return

  • $rc 0 = Success, non-zero = failure

CreateSimpleShortCut($filename, $share, $description)

Creates a simple Windows shortcut. The shortcut name will be $filename.lnk.

Parameters

  • $filename Shortcut name
  • $share Target file name
  • $description Description

Pre-requisite Scripts

  • None

Return

  • $rc 0 = Success, non-zero = failure

SolutionExistsLongPathname($pathname)

Checks the existence of a file via a slow but reliable method. Perl cannot handle pathnames greater than 256 characters, so this function gets the directory name (still subject to a 256-character maximum) and uses string comparisons for the filename, eliminating any length restriction on the filename itself.

Parameters

  • $pathname Pathname

Pre-requisite Scripts

  • None

Return

  • $fileFound 0 = Not found, 1 = Found

SolutionGetAvailableDrives()

Returns a hash representing the local drives found on the system. The hash is indexed by the drive itself.

Parameters

  • None

Pre-requisite Scripts

  • SolutionGetWindowsVolumeInfoByDriveType

Return

  • %AvailableDrives

SolutionGetPathDrives($searchPaths,$pathCount)

Returns a hash representing the drives found in the search paths. The hash is indexed by the drive itself.

Parameters

  • $searchPaths List of paths
  • $pathCount Number of paths

Pre-requisite Scripts

  • None

Return

  • %searchDrives

GetFileName($path)

Gets just the filename from a full path. Works for both Linux and Windows file separators.

Parameters

  • $path Full path to a file (for example, /foo/bar)

Pre-requisite Scripts

  • None

Return

  • $filename File name (for example, bar)

GetFixedLocalDrives()

Returns a hash representing the local (fixed) drives found on the system. The hash is indexed by the drive itself. For Windows only.

Parameters

  • None

Pre-requisite Scripts

  • None

Return

  • %availableDrives List of local drives

SolutionGetPathComponents($searchPath)

Returns an array representing the path components in the comma-separated path.

Parameters

  • $searchPath Comma-separated path components

Pre-requisite Scripts

  • None

Return

  • @searchPaths List of path components

separatePathAndFile($filePath)

Gets the path and file name as separate elements.

Parameters

  • $filePath Full path to the file

Pre-requisite Scripts

  • None

Return

  • $justDir Directory portion of the path
  • $justName File name portion of the path

Publishing Components

Since a published component is available globally in the software, you may want to create documentation describing it to make using the component easier for others in your organization. This documentation must be in PDF format. You can associate the PDF file with the component when you publish it.

Publishing a Component

To publish a component, do the following:

  1. In the Manager, select Jobs > Templates.
  2. Select the Job Template Library which contains the Workflow or Job Template component you want to publish and click Open Canvas (or double-click the template library).
  3. Right-click the component you want to publish and choose Publish > Component.
  4. Type a name for the component (note that the field defaults to the current component name prefaced with “New”).
  5. Choose the toolbox menu under which you want the component to appear by selecting it from the Classification drop-down.
  6. Type a description.
  7. If you have created an icon for the component, click Browse to choose a picture to use as the icon. If you do not select a picture, a default system icon is used.
  8. If you have created documentation for the component, click Browse to choose the PDF file that is displayed when a user selects Help for this component.
  9. Click OK.

    The icon appears in the toolbox menu you specified in the Classification field.

Viewing Component Help

If a component developer has included documentation with the component, it is available for viewing by the user in a PDF reader. To view documentation that has been included with a component, do the following:

  1. Double-click the Job Template Library.
  2. Expand the area of the toolbox in which the component is located or locate it on the Workflow Canvas.
  3. Right-click the component and choose Help. Note that if Help is grayed out, there is no documentation included with this component.

    The documentation included with the component by the component developer opens.

Deleting a Published Component

When you delete a published component, existing Workflows or Job Templates that use the component are not affected by its removal, because components on the canvas are copies, not references.

To delete a published component, do the following:

  1. Double-click the Job Template library.
  2. Expand the area of the toolbox in which your published component is located.
  3. Right-click the component you want to delete and select Delete.
  4. At the confirmation prompt, click Yes.

Deleting a Toolbox Classification Type

In order to delete a toolbox classification type, you must remove all components from it, transferring them to other classification types.

To delete a toolbox classification type, do the following:

  1. Double-click the Job Template Library.
  2. Expand the area of the toolbox in which your published component is located.
  3. Drag the component to the Workflow Canvas.
  4. Right-click the component you want to publish and choose Publish Component.
  5. Make sure the component has the same name as the existing component in the toolbox classification you want to remove (you may need to remove the word New that prefaces the name).
  6. Choose a different toolbox menu under which you want the component to appear by selecting it from the Classification drop-down.
  7. Click OK.

    You are prompted to confirm overwriting the existing version.

  8. Click Yes.

    The component is transferred to the new specified classification.

  9. Repeat for all components located in the classification you want to remove.

    Once the specified classification has no more components in it, it disappears from the toolbox.

Importing And Exporting Components

Importing a Component

Published components are exported as ZIP files. Exported components are useful as backups or for use on other Managers. To use an exported component on another Manager, you must import it.

When you import a component to a different Manager, the component appears in the toolbox menu from which it was exported. If the menu does not exist, it will be created. For example, if the component was exported from a menu called MyPublishedComponents, a toolbox menu called MyPublishedComponents is added to the component toolbox in each Workflow or Job Template on the Manager to which it is imported.

To import a component, do the following:

  1. Copy the component ZIP file to a directory location from which you can import it.
  2. As a system administrator, log in to the Manager to which you want to import the component.
  3. Choose Administration > Manager > Applications.
  4. Choose Install.
  5. Click Browse, and navigate to the location of the exported component ZIP file.
  6. Select the ZIP file, as per your operating system's process (so that it appears in the Signiant Application Install window), and click OK.
  7. In Customizer > Job Template Libraries choose a Job Template Library.

    The component you imported appears in the toolbox menu from which it was exported.

Exporting a Component

You may want to export a component as a backup or to use the component on another Manager. Organizations that use multiple Signiant Managers will probably want the component on all of them. Developers creating components to sell or distribute to others need a way to deliver the component to customers.

Exporting a Job Template Library

Exporting a Job Template Library creates an XML file. Any specialized components used are exported. If the specialized component is a base component (an edited Signiant-developed component), it will appear as a regular icon and users can edit and publish it. If it is a custom published component (an edited component that has been published), the icon will appear as a red "x" (but users can still run the workflow). However, you cannot edit the component or use it in other workflows.

To export a Job Template Library, do the following:

  1. Double-click the Job Template Library where the Workflow or Job Template is saved.
  2. Click Export.
  3. In the File Download dialog, click Save.
  4. In the Save As dialog, browse to your backup location and click Save.

Exporting a Published Component

To export a component that you have created and published, do the following:

  1. Double-click the Job Template Library.
  2. Expand the area of the toolbox in which your published component is located.
  3. Right-click the component you want to export, and select Export.
  4. Follow the directions to save the file.

Perl Modules

The following table lists the Perl modules available on Linux, Unix, Macintosh, and Windows.

Linux, Unix, Macintosh, Windows

AnyDBM_File Mail::Field::Date
Apache::SOAP Mail::Field::Generic
Apache::XMLRPC::Lite Mail::Filter
App::Prove Mail::Header
AppConfig Mail::Internet
Archive::Tar Mail::Mailer
Archive::Tar::Constant Mail::Mailer::qmail
Archive::Tar::File Mail::Mailer::rfc822
Archive::Zip Mail::Mailer::sendmail
Attribute::Handlers Mail::Mailer::smtp
attributes Mail::Mailer::testfile
attrs Mail::Send
AutoLoader Mail::Util
AutoSplit Math::BigFloat
autouse Math::BigFloat::Trace
B Math::BigInt
B::Asmdata Math::BigInt::Calc
B::Assembler Math::BigInt::Trace
B::Bblock Math::BigRat
B::Bytecode Math::Complex
B::C::Section Math::Trig
B::CC Memoize
B::Concise Memoize::AnyDBM_File
B::Debug Memoize::Expire
B::Deparse Memoize::ExpireFile
B::Disassembler::BytecodeStream Memoize::ExpireTest
B::Lint Memoize::NDBM_File
B::Showlex Memoize::SDBM_File
B::Stackobj Memoize::Storable
B::Stash MIME::Base64
B::Terse MIME::Lite
B::Xref MIME::QuotedPrint
base MIME::Type
Benchmark MIME::Types
bigint Module::Build
bignum NDBM_File
bigrat Net::Cmd
blib Net::Config
Bundle::DBD::Pg Net::Domain
Bundle::DBI Net::FTP
Bundle::LWP Net::FTP::A
ByteLoader Net::FTP::dataconn
bytes Net::FTP::E
Carp Net::FTP::I
CGI Net::FTP::L
CGI::Carp Net::hostent
CGI::Cookie Net::HTTP
CGI::Fast Net::HTTP::Methods
CGI::Pretty Net::HTTP::NB
CGI::Push Net::HTTPS
CGI::Util Net::netent
charnames Net::Netrc
Class::ISA Net::NNTP
Class::Std Net::Ping
Class::Std::Fast Net::POP3
Class::Std::Fast::Storable Net::protoent
Class::Struct Net::servent
Compress::Raw::Bzip2 Net::SMTP
Compress::Raw::Zlib Net::SSL
Compress::Zlib Net::Time
Compress::Zlib NEXT
Config O
constant Opcode
CPAN open
CPAN::FirstTime ops
CPAN::Nox overload
Crypt::SSLeay Package::Constants
Crypt::SSLeay::Conn PerlIO
Crypt::SSLeay::CTX PerlIO::encoding
Crypt::SSLeay::Err PerlIO::scalar
Crypt::SSLeay::MainContext PerlIO::via
Crypt::SSLeay::X509 PerlIO::via::QuotedPrint
Cwd Pod::Checker
Data::Dumper Pod::Coverage
Date::Format Pod::Coverage::CountParents
Date::Language Pod::Coverage::ExportOnly
Date::Language::Afar Pod::Coverage::Overloader
Date::Language::Amharic Pod::Escapes
Date::Language::Austrian Pod::Find
Date::Language::Brazilian Pod::Functions
Date::Language::Chinese_GB Pod::Html
Date::Language::Czech Pod::InputObjects
Date::Language::Danish Pod::LaTeX
Date::Language::Dutch Pod::Man
Date::Language::English Pod::ParseLink
Date::Language::Finnish Pod::Parser
Date::Language::French Pod::ParseUtils
Date::Language::Gedeo Pod::Plainer
Date::Language::German Pod::Select
Date::Language::Greek Pod::Simple
Date::Language::Italian Pod::Text
Date::Language::Norwegian Pod::Text::Color
Date::Language::Oromo Pod::Text::Overstrike
Date::Language::Sidama Pod::Text::Termcap
Date::Language::Somali Pod::Usage
Date::Language::Swedish POSIX
Date::Language::Tigrinya re
Date::Language::TigrinyaEritrean REST::Client
Date::Language::TigrinyaEthiopian Safe
Date::Manip Scalar::Util
Date::Parse Scalar::Util
DB Scalar::Util::PP
DBD::DBM SDBM_File
DBD::ExampleP Search::Dict
DBD::File SelectSaver
DBD::Gofer SelfLoader
DBD::Gofer::Policy::Base Shell
DBD::Gofer::Policy::classic sigtrap
DBD::Gofer::Policy::pedantic SOAP::Constants
DBD::Gofer::Policy::rush SOAP::Lite
DBD::Gofer::Transport::Base SOAP::Lite::Deserializer::XMLSchema1999
DBD::Gofer::Transport::null SOAP::Lite::Deserializer::XMLSchema2001
DBD::Gofer::Transport::pipeone SOAP::Lite::Deserializer::XMLSchemaSOAP1_1
DBD::Gofer::Transport::stream SOAP::Lite::Deserializer::XMLSchemaSOAP1_2
DBD::NullP SOAP::Lite::Packager
DBD::Proxy SOAP::Lite::Utils
DBD::Sponge SOAP::Packager
DBI SOAP::Test
DBI::Const::GetInfo::ANSI SOAP::Transport::FTP
DBI::Const::GetInfo::ODBC SOAP::Transport::HTTP
DBI::Const::GetInfoReturn SOAP::Transport::IO
DBI::Const::GetInfoType SOAP::Transport::JABBER
DBI::DBD SOAP::Transport::LOCAL
DBI::DBD::Metadata SOAP::Transport::LOOPBACK
DBI::FAQ SOAP::Transport::MAILTO
DBI::Gofer::Execute SOAP::Transport::MQ
DBI::Gofer::Request SOAP::Transport::POP3
DBI::Gofer::Response SOAP::Transport::TCP
DBI::Gofer::Serializer::Base SOAP::WSDL
DBI::Gofer::Serializer::DataDumper Socket
DBI::Gofer::Serializer::Storable sort
DBI::Gofer::Transport::Base Storable
DBI::Gofer::Transport::pipeone strict
DBI::Gofer::Transport::stream subs
DBI::Profile Switch
DBI::ProfileData Symbol
DBI::ProfileDumper Sys::Hostname
DBI::ProfileDumper::Apache Sys::Syslog
DBI::ProfileSubs TAP::Base
DBI::ProxyServer TAP::Formatter::Base
DBI::SQL::Nano TAP::Formatter::Color
DBI::Util::_accessor TAP::Formatter::Console
DBI::Util::CacheMemory TAP::Formatter::Console::ParallelSession
Devel::DProf TAP::Formatter::Console::Session
Devel::Peek TAP::Formatter::File
Devel::PPPort TAP::Formatter::File::Session
Devel::SelfStubber TAP::Formatter::Session
Devel::Symdump TAP::Harness
Devel::Symdump::Export TAP::Object
diagnostics TAP::Parser
Digest Template
Digest::base Template::Base
Digest::CRC Template::Config
Digest::file Template::Constants
Digest::HMAC Template::Context
Digest::HMAC_MD5 Template::Directive
Digest::HMAC_SHA1 Template::Document
Digest::MD5 Template::Exception
Digest::SHA1 Template::Filters
DirHandle Template::Grammar
Dumpvalue Template::Iterator
DynaLoader Template::Namespace::Constants
Email::Date::Format Template::Parser
Encode Template::Plugin
Encode::Alias Template::Plugin::Assert
Encode::Byte Template::Plugin::CGI
Encode::CJKConstants Template::Plugin::Datafile
Encode::CN Template::Plugin::Date
Encode::CN::HZ Template::Plugin::Directory
Encode::CN::HZ Template::Plugin::Dumper
Encode::Config Template::Plugin::File
Encode::EBCDIC Template::Plugin::Filter
Encode::Encoder Template::Plugin::Format
Encode::Encoding Template::Plugin::HTML
Encode::GSM0338 Template::Plugin::Image
Encode::Guess Template::Plugin::Iterator
Encode::JP Template::Plugin::Math
Encode::JP::H2Z Template::Plugin::Pod
Encode::JP::H2Z Template::Plugin::Procedural
Encode::JP::JIS7 Template::Plugin::Scalar
Encode::JP::JIS7 Template::Plugin::String
Encode::KR Template::Plugin::Table
Encode::KR::2022_KR Template::Plugin::URL
Encode::KR::2022_KR Template::Plugin::View
Encode::MIME::Header Template::Plugin::Wrap
Encode::MIME::Header Template::Plugins
Encode::MIME::Header::ISO_2022_JP Template::Provider
Encode::MIME::Name Template::Service
Encode::Symbol Template::Stash
Encode::TW Template::Stash::Context
Encode::Unicode Template::Stash::XS
Encode::Unicode::UTF7 Template::Test
encoding Template::View
English Template::VMethods
Env Term::ANSIColor
Errno Term::Cap
Exporter Term::Complete
Exporter::Heavy Term::ReadKey
  Term::ReadLine
ExtUtils::Command Test
ExtUtils::Command::MM Test::Builder
ExtUtils::Constant Test::Builder::IO::Scalar
ExtUtils::Embed Test::Builder::Module
ExtUtils::Install Test::Builder::Tester
ExtUtils::Installed Test::Builder::Tester::Color
ExtUtils::Liblist Test::Harness
ExtUtils::MakeMaker Test::Harness::Assert
ExtUtils::MakeMaker::Config Test::Harness::Iterator
ExtUtils::Manifest Test::Harness::Straps
ExtUtils::Miniperl Test::More
ExtUtils::Mkbootstrap Test::Pod::_parser
ExtUtils::Mksymlists Test::Pod::Coverage
ExtUtils::MM Test::Simple
ExtUtils::MY Text::Abbrev
ExtUtils::Packlist Text::Balanced
ExtUtils::testlib Text::ParseWords
Fatal Text::Soundex
Fcntl Text::Tabs
fields Text::Wrap
File::Basename Thread
File::CheckTree Thread::Queue
File::Compare Thread::Semaphore
File::Copy threads
File::DosGlob threads::shared
File::Find Tie::Array
File::Glob Tie::File
File::GlobMapper Tie::Handle
File::Listing Tie::Hash
File::Path Tie::Memoize
File::Spec Tie::RefHash
File::stat Tie::Scalar
File::Temp Tie::SubstrHash
FileCache Time::gmtime
FileHandle Time::HiRes
filetest Time::Local
Filter::Simple Time::localtime
FindBin Time::tm
GDBM_File Time::Zone
Getopt::Long UDDI::Lite
Getopt::Std Unicode::Collate
Hash::Util Unicode::Normalize
HTML::Entities Unicode::UCD
HTML::Filter UNIVERSAL
HTML::Form URI
HTML::HeadParser URI::_foreign
HTML::LinkExtor URI::_generic
HTML::Parser URI::_ldap
HTML::PullParser URI::_login
HTML::Tagset URI::_query
HTML::TokeParser URI::_segment
HTTP::Config URI::_server
HTTP::Cookies URI::_userpass
HTTP::Cookies::Microsoft URI::data
HTTP::Cookies::Netscape URI::Escape
HTTP::Daemon URI::file
HTTP::Date URI::file::Base
HTTP::Headers URI::file::FAT
HTTP::Headers::Auth URI::file::Mac
HTTP::Headers::ETag URI::file::OS2
HTTP::Headers::Util URI::file::QNX
HTTP::Message URI::file::Unix
HTTP::Negotiate URI::file::Win32
HTTP::Request URI::ftp
HTTP::Request::Common URI::gopher
HTTP::Response URI::Heuristic
HTTP::Status URI::http
I18N::Collate URI::https
I18N::Langinfo URI::ldap
I18N::LangTags URI::ldapi
I18N::LangTags::List URI::ldaps
if URI::mailto
integer URI::mms
IO URI::news
IO::Compress::Base::Common URI::nntp
IO::Compress::Gzip::Constants URI::pop
IO::Compress::Zip::Constants URI::QueryParam
IO::Compress::Zlib::Extra URI::rlogin
IO::Dir URI::rsync
IO::File URI::rtsp
IO::Handle URI::rtspu
IO::Pipe URI::sip
IO::Poll URI::sips
IO::Seekable URI::snews
IO::Select URI::Split
IO::SessionData URI::ssh
IO::SessionSet URI::telnet
IO::Socket URI::tn3270
IO::Socket::INET URI::URL
IO::Socket::UNIX URI::urn
IO::Uncompress::Adapter::Bunzip2 URI::urn::isbn
IO::Uncompress::Adapter::Identity URI::urn::oid
IO::Uncompress::Adapter::Inflate URI::WithBase
IO::Uncompress::Unzip User::grent
IO::Zlib User::pwent
IPC::Msg utf8
IPC::Open2 vars
IPC::Open3 version
IPC::Semaphore version::vxs
IPC::SysV vmsish
less warnings
lib WWW::RobotRules
JSON::PP WWW::RobotRules::AnyDBM_File
List::Util XML::NamespaceSupport
List::Util XML::Parser
List::Util::PP XML::Parser::Expat
List::Util::XS XML::Parser::Lite
locale XML::Parser::Style::Debug
Locale::Constants XML::Parser::Style::Objects
Locale::Country XML::Parser::Style::Stream
Locale::Currency XML::Parser::Style::Subs
Locale::Language XML::Parser::Style::Tree
Locale::Maketext XML::SAX
Locale::Script XML::SAX::Base
LWP XML::SAX::DocumentLocator
LWP::Authen::Basic XML::SAX::Exception
LWP::Authen::Digest XML::SAX::Expat
LWP::Authen::Ntlm XML::SAX::ParserFactory
LWP::ConnCache XML::SAX::PurePerl
LWP::Debug XML::SAX::PurePerl
LWP::DebugFile XML::SAX::PurePerl
LWP::MediaTypes XML::SAX::PurePerl
LWP::MemberMixin XML::SAX::PurePerl
LWP::Protocol XML::SAX::PurePerl
LWP::Protocol::cpan XML::SAX::PurePerl
LWP::Protocol::data XML::SAX::PurePerl::DebugHandler
LWP::Protocol::file XML::SAX::PurePerl::Exception
LWP::Protocol::ftp XML::SAX::PurePerl::Productions
LWP::Protocol::GHTTP XML::SAX::PurePerl::Reader
LWP::Protocol::gopher XML::SAX::PurePerl::Reader
LWP::Protocol::http XML::SAX::PurePerl::Reader
LWP::Protocol::http10 XML::SAX::PurePerl::Reader::Stream
LWP::Protocol::https XML::SAX::PurePerl::Reader::String
LWP::Protocol::https10 XML::SAX::PurePerl::Reader::URI
LWP::Protocol::loopback XML::Simple
LWP::Protocol::mailto XMLRPC::Lite
LWP::Protocol::nntp XMLRPC::Test
LWP::Protocol::nogo XMLRPC::Transport::HTTP
LWP::RobotUA XMLRPC::Transport::POP3
LWP::Simple XMLRPC::Transport::TCP
LWP::UserAgent XS::APItest
Mail::Address XS::Typemap
Mail::Cap XML::LibXML
Mail::Field XML::XPath
Mail::Field::AddrList XML::XPath::XMLParser

Windows Only

ExtUtils::CBuilder Win32::Event Win32::OLE::TypeInfo
ExtUtils::ParseXS Win32::EventLog Win32::OLE::Variant
OLE::Variant Win32::File Win32::PerfLib
Thread::Signal Win32::FileSecurity Win32::Pipe
Thread::Specific Win32::Internet Win32::Process
Tie::Registry Win32::IPC Win32::Registry
Win32 Win32::Job Win32::Semaphore
Win32::API Win32::Mutex Win32::Service
Win32::API::Callback Win32::NetAdmin Win32::Shortcut
Win32::API::Struct Win32::NetResource Win32::Sound
Win32::API::Test Win32::ODBC Win32::TieRegistry
Win32::API::Type Win32::OLE Win32::WinError
Win32::ChangeNotify Win32::OLE Win32API::File
Win32::Clipboard Win32::OLE::Const Win32API::Net
Win32::Console Win32::OLE::Enum  
Win32::DirSize Win32::OLE::NLS  

Understanding Component Keywords

Workflow Component Automation Keywords allow you to create scripts that you can save for repeated use in multiple templates. You can enhance the functionality of Job Template Libraries through the use of special keywords. The tables in this section describe these keywords and provide examples of their use.
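
Keywords are referenced by enclosing the keyword name in percent signs; the values are substituted into the command text before the script runs. The following is a minimal sketch, assuming the dds_perl scripting environment used elsewhere in this guide and the %dds_msg_info% notification keyword described later in this chapter; the script body is illustrative only:

   #!dds_perl
   #
   # Illustrative sketch: log two substituted keyword values.
   my $jobName = '%job_name%';
   my $jobId   = '%job_id%';

   print STDOUT ("%dds_msg_info%Running job $jobName (id $jobId)\n");
   exit 0;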

Property Keywords

 

Value Description

agent_connect_name

Enables agent processes to know what name was used to make the connection by which they were instantiated. This keyword provides support for "upgrade-in-place" for clustered agents and enables the use of agent host name aliases as well. The "%agent_connect_name%" keyword is substituted with the network node name that was used to originate the connection when spawning that particular agent process.

agent_restarted

Use in the event of an agent restart to avoid repeating commands that have already been performed. For example, if an agent restarts partway through a transfer, a command that removes a directory on the target machine should not run again if files have already been transferred.

authentication_mode

Specifies the way agents verify their credentials. Values are “none”, “server only” or “mutual”. By default, the agents use mutual authentication (the client identifies itself to the server, and the server is identified as the one the client intends to connect to).

bandwidth_throttle

Indicates the maximum bandwidth (in bytes per second) that the job should use. Bandwidth limiting is done on each stream connection, so the value specified here is passed to the controlling agent for each template executed and divided among the streams as follows:

  1. It is divided by the number of streams the agent will use for each remote agent (typically four).
  2. It is further divided by the potential number of concurrent remote agents, which is the lesser of the maximum allowed number of concurrent agents and the number of remote agents specified in the template.

For example, a throttle of 8,000,000 bytes per second with four streams per remote agent and two concurrent remote agents yields 1,000,000 bytes per second per stream. Note that bandwidth throttles may also be imposed by other network devices and policies (e.g., QoS), so a bandwidth throttle (or target maximum) defined here may not be achievable. If you are having difficulty achieving a particular bandwidth target, ensure that other policies are not limiting the desired throughput.

components_complete (does not appear in the Manager Web interface but can be added manually)

Indicates the number of components in a group that completed successfully.

component_exit_code (does not appear in the Manager Web interface but can be added manually)

Displays the exit code generated after the component runs.

components_in_error (does not appear in the Manager Web interface but can be added manually)

Components In Error indicates the number of components in a group that ended with an error condition. When no components are in error, the variable is set to an empty string.

component_path (does not appear in the Manager Web interface but can be added manually)

The full path of the relevant job component, which is the job component name qualified with the names of any job "parent" groups (e.g., Group1.Group2.Component1).

components_skipped (does not appear in the Manager Web interface but can be added manually)

Components Skipped indicates the number of components in a group that were skipped due to link triggers. Normally this variable is used in post-commands. When no components are skipped, the variable is set to an empty string.

components_to_execute (does not appear in the Manager Web interface but can be added manually)

Indicates the number of components remaining to run.

ctl_agent

The name of the controlling host in a remote transfer.

ctl_platform

Returns the architecture and the operating system of the controlling agent in a remote transfer.

ctl_platform_full

Returns the architecture, operating system and version of the operating system of the controlling agent in a remote transfer. (For example, "i686-Linux-RH3".) This applies only to Linux systems.

ctl_user

The user ID under which the controlling agent runs in a remote transfer.

default_unavailable_action

Indicates the default action (skip or error) that occurs when an agent is unavailable.

encryption_level

Indicates the level of encryption (high, medium, low or the system default). Note that mutual authentication is always used regardless of the encryption level specified.

exit_code

Indicates the exit code number generated by the Manager when the job is completed.

file_disposition

Indicates the disposition of a file; valid values are in use, transferring, skipping, deleting and error_skipping (for pre-file commands) and transferred, deleted, skipped, error, skipped error and skipped delete (for post-file commands). This keyword is substituted only in non-continuous pre-file and post-file commands. (An example post-file command follows this table.)

file_gmtime

Indicates the date/time stamp of the file sent by the job template, using the following format: Mmm dd, yyyy hh:mm:ss (for example: Jul 11, 2002 17:11:49). This keyword is substituted in non-continuous pre-file commands, non-continuous post-file commands and filter commands.

file_name

Indicates the name of the file sent by the job template; typically used in the Post-File Transfer Command field. This keyword is substituted in non-continuous pre-file commands, non-continuous post-file commands and filter commands.

file_size

Indicates the size in bytes of the file sent by the job template. This keyword is substituted in non-continuous pre-file commands, non-continuous post-file commands and filter commands.

file_type

Indicates the type of the entity sent by the job template. Returned values are F for file, D for directory, and L for symbolic link. The value of file_type will be S (denoting "something else") for anything that is not a file (F), directory (D) or symbolic link (L).

group

The name of the relevant group element.

grpntfy_aggregate_stats

see: %grpntfy_aggregate_stats%

grpntfy_const_stats

see: Understanding Component Keywords

grpntfy_report_stats

see: %grpntfy_report_stats%

grpntfy_summary_stats

see: %grpntfy_summary_stats%

job_id

A unique identifier for the job.

job_group_id

The ID of the group in which the current job is a member.

job_name

The current job name.

job_run_number

The current run number (execution number) of the job.

job_template

Job Template is set to the name of the automation package being executed.

job_template_library

Indicates the name of the job template library.

job_template_library_owner

Indicates the name of the user who created the job template library.

link

The name of the relevant link element.

links_complete

The number of links executed.

links_in_error

The number of links that had errors.

links_skipped

Indicates the number of links that were not executed.

links_to_execute

Indicates the number of links remaining to execute.

logged_in_user (does not appear in the Manager Web interface but can be added manually)

The username with which the user authenticated. The value for this keyword is added/set when a job is created or edited via the Manager Web interface or SOAP. Agents do not know of or set this type of keyword. Users cannot add or set it.

logged_in_user_email (does not appear in the Manager Web interface but can be added manually)

The e-mail address associated with the logged-in user. The value for this keyword is added/set when a job is created or edited via the Manager Web interface or SOAP. Agents do not know of or set this type of keyword. Users cannot add or set it.

logged_in_user_id (does not appear in the Manager Web interface but can be added manually)

The numeric ID of the logged-in user in the Signiant database. The value for this keyword is added/set when a job is created or edited via the Manager Web interface or SOAP. Agents do not know of or set this type of keyword. Users cannot add or set it.

mngr_name

Represents the host name of the Manager as it is known to dds.

req_agent

The name of the agent on which the Data Manager is running.

req_inception

The date on which the job template was initiated.

req_user

The user ID the job template executes under on the Manager.

remote_bandwidth_limit

The bandwidth limit with which the peer agent is operating. For the controlling agent, the remote bandwidth limit is the bandwidth limit of the non-controlling agent, for each agent that has a bandwidth limit specified. For the non-controlling agent, it is the limit communicated to it by the controlling agent. Note that bandwidth throttles may also be imposed by other network devices and policies (e.g., QoS), so a bandwidth throttle (or target maximum) defined here may not be achievable. If you are having difficulty achieving a particular bandwidth target, ensure that other policies are not limiting the desired throughput.

remote_system_type

Identifies the operating system available on the non-controlling agent.

src_agent

The host name of the source agent.

src_directory

The parent directory from which files are transferred.

src_platform

Returns the architecture and the operating system of the source agent.

src_platform_full

Returns the architecture, operating system and version of the operating system of the source agent. (For example, “i686-Linux-RH3”.) This applies only to Linux systems.

src_user

The user ID the agent runs as on the source agent.

stream

A number between 0 and 3 that indicates the stream connection over which a particular file is being transferred.

system_root_directory

On Windows systems, the value will be that of the "Windows Directory" (e.g., C:\WINNT). On UNIX systems, the value will be "/".

tgt_agent

The host name of the target agent.

tgt_directory

The parent directory to which files are transferred.

tgt_platform

Returns the architecture and the operating system of the target agent.

tgt_platform_full

Returns the architecture, operating system and version of the operating system of the target agent. (For example, “i686-Linux-RH3”.) This applies only to Linux systems.

tgt_user

The user ID the agent runs as on the target agent.

transfer_type

Indicates whether a file transfer job template is a push (initiated by the source) or a pull (initiated by the target).

unavailable_agent

The name of the agent that was not available.
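
The file_* keywords above (for example %file_name%, %file_size% and %file_disposition%) are substituted only in non-continuous pre-file, post-file and filter commands. The following minimal sketch of a post-file command logs the outcome of each file; the keyword names come from the table, while the script body is illustrative only:

   #!dds_perl
   #
   # Illustrative sketch of a non-continuous post-file command.
   my $name        = '%file_name%';
   my $size        = '%file_size%';
   my $disposition = '%file_disposition%';

   if ("$disposition" eq "transferred") {
       print STDOUT ("%dds_msg_info%Transferred $name ($size bytes)\n");
   } else {
       print STDOUT ("%dds_msg_warning%$name ended as $disposition\n");
   }
   exit 0;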

Continuous Pre-File Command Notes

The input record placed on the standard input has the following space-separated field layout:

“<file_name>” <file_disposition> <file_type> <file_size> <file_date> <file_time>

 where

  • <file_name> is the name of the filesystem entity
  • <file_disposition> indicates the entity's current transfer status, with values equivalent to those of the %file_disposition% keyword
  • <file_type> indicates the type of the entity, with values equivalent to those of the %file_type% keyword (see above), as well as the value N (denoting "non-existent"), which is used in the source pre-file command input record for an entity that no longer exists
  • <file_size> is the entity's size, in bytes (note: not supplied if <file_type> is N)
  • <file_date> is the entity's last modification date, in YYYY/MM/DD format, and expressed in GMT (note: not supplied if <file_type> is N)
  • <file_time> is the entity's last modification time, in HH:MM:SS format and expressed in GMT (note: not supplied if <file_type> is N)

The corresponding output record should be one of the following tokens:

  • ok: directive to proceed with operations consistent with the <file_disposition> value
  • error: directive to indicate any operation be skipped (in error)
  • skip: directive to indicate any operation be skipped
  • transfer: directive to override any skip “disposition” with a transfer operation
  • preserve=<filename>: directive to rename any target entity to <filename> before performing a transfer or deletion

    This overrides any skip "disposition" with a transfer operation. It is valid only for the target pre-file command.
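
The following is a minimal sketch of a continuous pre-file command, assuming the dds_perl environment; it answers "ok" for every entity except files over an arbitrary size threshold, which it skips. The threshold and the parsing details are illustrative only:

   #!dds_perl
   #
   # Illustrative sketch of a continuous pre-file command.
   select STDOUT;
   $| = 1;                  # directives must not be buffered

   while (my $record = <STDIN>) {
       chomp $record;
       # Layout: "<file_name>" <disposition> <type> [<size> <date> <time>]
       my ($name, $disposition, $type, $size) =
           $record =~ /^"(.*)"\s+(\S+)\s+(\S+)\s*(\S*)/;
       if (defined $type && $type eq 'F' && $size && $size > 1_000_000_000) {
           print "skip\n";  # example policy: skip files over ~1 GB
       } else {
           print "ok\n";    # proceed as the disposition indicates
       }
   }
   exit 0;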

Continuous Post-File Command Notes

The input record placed on the standard input has the following space-separated field layout:

"<file_name>" <file_disposition> <file_type> <file_size> <file_date> <file_time>

 where

  • <file_name> is the name of the filesystem entity
  • <file_disposition> indicates the entity's current transfer status, with values equivalent to those of the %file_disposition% keyword
  • <file_type> indicates the type of the entity, with values equivalent to those of the %file_type% keyword
  • <file_size> is the entity's size, in bytes
  • <file_date> is the entity's last modification date, in YYYY/MM/DD format and expressed in GMT
  • <file_time> is the entity's last modification time, in HH:MM:SS format and expressed in GMT
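
Unlike the pre-file case, no output directives are defined for the continuous post-file command, so a typical use is bookkeeping. The following is a minimal sketch, assuming the dds_perl environment; the audit log path is illustrative only:

   #!dds_perl
   #
   # Illustrative sketch of a continuous post-file command: append each
   # input record to an audit file.
   open(my $log, '>>', '/tmp/transfer_audit.log') or exit 1;
   while (my $record = <STDIN>) {
       print $log $record;   # records already end with a newline
   }
   close($log);
   exit 0;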

Variable Definition Keywords

Note that variables for which there are prompts are substituted at the start of job execution. If you use the dds_set keyword to change these variables later on, the new values are not re-substituted in subsequent job templates; in effect, prompted variables are read-only unless you change them in the Manager GUI. Variables created in the template appear in the drop-down list, as well as the following variable:

  • dds_set

    Sets a variable that will be available in the job template, or resets the value of a variable.

    Example: echo %dds_set%FileType=tar
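
The same keyword can be emitted from a script body. The following minimal sketch, assuming the dds_perl environment, sets a variable that later commands in the job template can reference as %FileType% (the variable name is illustrative):

   #!dds_perl
   #
   # Illustrative sketch: set a template variable from a command script.
   select STDOUT;
   $| = 1;
   print "%dds_set%FileType=tar\n";
   exit 0;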

Notification Keywords

  • dds_active_status

    Prints a message to the active Status Message field for the current job. If you do not use this keyword, the status message (for example, "Starting Job") simply shows up in the log file generated by the job.

    Example: print ("%dds_active_status% Starting job");

  • dds_msg_info

    Prints the message in the job log with informational severity.

    Example: print STDOUT ("%dds_msg_info%$LogMessage\n");

  • dds_msg_debug

    Prints the message in the job log with debug severity.

    Example: print STDOUT ("%dds_msg_debug%$LogMessage\n");

  • dds_msg_error

    Prints the message in the job log with error severity.

    Example: print STDOUT ("%dds_msg_error%$LogMessage\n");

  • dds_msg_warning

    Prints the message in the job log with warning severity.

    Example: print STDOUT ("%dds_msg_warning%$LogMessage\n");
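
Taken together, a command script uses these keywords by prefixing them to lines written to standard output. A minimal sketch, assuming the dds_perl environment, with illustrative message text:

   #!dds_perl
   #
   # Illustrative sketch combining the notification keywords.
   select STDOUT;
   $| = 1;

   print "%dds_active_status% Copying files\n";
   print "%dds_msg_info%Transfer starting\n";
   print "%dds_msg_warning%Target disk space is low\n";
   print "%dds_msg_error%Unable to open manifest\n";
   exit 0;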

 

Statistics Keywords

The statistics keyword substitution functionality is provided for group notify commands. The text substituted by each keyword takes the form of one or more statistics messages, each terminated by a semicolon character. The keywords within a single record are space-separated. The following section lists the available keywords and a description of the message returned.

Note: Each entry below lists the keyword, the placeholder for the value it carries, and a description. In the substituted messages each field takes the form keyword=value.

Keyword Value Description

%grpntfy_const_stats%

JOBID

<job_identifier>

scheduler-assigned job identifier string

JOBNAME

<job_name>

designated job name string

JOBGRPID

<job_group_identifier>

designated job group number (contract_id in DTM_DB terms)

JOBGRPNAME

<job_group_name>

designated job group name string (contract_name in DTM_DB terms)

PKGNAME

<package_name>

fully-qualified package name string (job template name) for the transfer

PKGTYPE

<package_type>

indicates the package type for the particular transfer; one of: FILE_TRF, PROCESS_TRF, STREAM_TRF, REMOTE_CMD

PKGFLOW

<package_data_flow_direction>

indicates the package data flow direction/disposition; one of: PUSH, PULL

TRANSTYPE

<transport_type>

indicates the transport type used for the transfer; one of: TCP, UDP

The following optional keywords are also substituted in const_stats messages:

AGTSTRTTM

<agent_start_time>

specifies the controlling agent process start-up time (system epoch time in microseconds) non-zero int64_t value

AGGLVL

<udp_aggressiveness>

indicates the UDP aggressiveness level; one of: HIGH, MEDIUM, LOW

PLDSZ

<udp_payload_size>

specifies the number of payload bytes in a UDP transport packet (for use in UDP byte count computation) non-zero int32_t value

HDRSZ

<udp_header_size>

specifies the number of header bytes in a UDP transport packet (for use in UDP byte count computation) non-zero int32_t value

%grpntfy_report_stats%

SRC

<source_agent>

specifies the network host name of the transfer source agent

TGT

<target_agent>

specifies the network host name of the transfer target agent

RMTSTRTTM

<remote_start_time>

specifies the remote agent start-up time (system epoch time in microseconds) non-zero int64_t value

XFERSTRTTM

<transfer_start_time>

specifies the start time for transfers with the remote agent (system epoch time in microseconds) non-zero int64_t value

INTVLSTRTTM

<interval_start_time>

specifies the start time of the reporting interval (system epoch time in microseconds) non-zero int64_t value

INTVLENDTM

<interval_end_time>

specifies the end time of the reporting interval (system epoch time in microseconds) non-zero int64_t value

The following optional keywords are substituted with report_stats messages:

INITBWTHRTL

<initial_bandwidth_throttle>

specifies the bandwidth throttle applied at transfer start-up (in bytes/seconds) non-zero int32_t value

INITUDPCEIL

<initial_udp_ceiling>

specifies the UDP ceiling applied at transfer start-up (in bytes/seconds) non-zero int32_t value

INITUDPFLOOR

<initial_udp_floor>

specifies the UDP floor applied at transfer start-up (in bytes/seconds) non-zero int32_t value

CRNTBWTHRTL

<current_bandwidth_throttle>

specifies the bandwidth throttle being applied at the end of the reporting interval (in bytes/seconds) non-zero int32_t value

CRNTUDPCEIL

<interval_udp_ceiling>

specifies the UDP ceiling being applied at the end of the reporting interval (in bytes/seconds) non-zero int32_t value

CRNTUDPFLOOR

<interval_udp_floor>

specifies the UDP floor being applied at the end of the reporting interval (in bytes/seconds) non-zero int32_t value

TE

<total_errors>

general error counter for all exceptions detected by or reported to the controlling agent with respect to the remote agent non-zero int32_t value

PE

<unrecovered_errors>

error counter for all unrecoverable exceptions declared by the controlling agent with respect to the remote agent non-zero int32_t value

TF

<file_count>

specifies the known number of files to transfer (i.e., the number of non-directory entries from the source names command) non-zero int32_t value

TD

<directory_count>

specifies the known number of directories (as files) to transfer (i.e., the number of directory entries from the source names command) non-zero int32_t value

FT

<files_transferred>

specifies the number of files successfully transferred non-zero int32_t value

DT

<directories_transferred>

specifies the number of directories (as files) successfully transferred non-zero int32_t value

FS

<files_skipped>

specifies the number of files skipped non-zero int32_t value

DS

<directories_skipped>

specifies the number of directories (as files) skipped non-zero int32_t value

FD

<files_deleted>

specifies the number of files explicitly deleted by the target agent non-zero int32_t value

DD

<directories_deleted>

specifies the number of directories explicitly deleted by the target agent non-zero int32_t value

FE

<failed_files>

specifies the number of files that failed to transfer non-zero int32_t value

DE

<failed_directories>

specifies the number of directories (as files) that failed to transfer non-zero int32_t value

RMTENDTM

<remote_end_time>

specifies the remote agent termination time (system epoch time in microseconds) non-zero int64_t value

XFERENDTM

<transfer_end_time>

specifies the end time for transfers with the remote agent (system epoch time in microseconds) non-zero int64_t value

NMCMPLTTM

<names_cmd_complete_time>

specifies the time that the names command completed (file transfers) non-zero int64_t value

TB

<byte_count>

specifies the total number of data bytes in the known files to transfer (i.e., the sum of the data sizes of TF files) non-zero int64_t value

FBDATA

<file_data_bytes>

specifies the number of uncompressed file data bytes transferred non-zero int64_t value

FBDATCPR

<file_data_bytes_comp>

specifies the number of compressed file data bytes transferred non-zero int64_t value

FBATTR

<file_attr_bytes>

specifies the number of uncompressed file attribute bytes transferred non-zero int64_t value

FBATRCPR

<file_attr_bytes_comp>

specifies the number of compressed file attribute bytes transferred non-zero int64_t value

FBDATSKP

<file_bytes_skipped>

specifies the number of file data bytes not transferred because the files were skipped non-zero int64_t value

FBRSYNCSKP

<rsync_bytes_skipped>

specifies the number of file data bytes not transferred because the RSYNC algorithm deemed them unchanged non-zero int64_t value

FBDATDEL

<file_bytes_deleted>

specifies the number of file data bytes deleted (i.e., the sum of the data sizes of FD files)

non-zero int64_t value

EB

<effective_bytes>

specifies an aggregate value equal to (“value of FBDATA” + “value of FBDATSKP” + “sum of the data counts of all currently-active streams”) non-zero int64_t value

OHSRCRSYNC

<ovhd_src_rsync>

specifies the number of non-data RSYNC overhead bytes flowing from source agent to target agent non-zero int64_t value

OHTGTRSYNC

<ovhd_tgt_rsync>

specifies the number of non-data RSYNC overhead bytes flowing from target agent to source agent non-zero int64_t value

OHSRCMNFST

<ovhd_src_mnfst_comp>

specifies the number of compressed source-to-target file manifest overhead bytes non-zero int64_t value

OHTGTMNFST

<ovhd_tgt_mnfst_comp>

specifies the number of compressed target-to-source file manifest overhead bytes non-zero int64_t value

OHSRCPRTCL

<ovhd_src_prtcl>

specifies the number of source-to-target transfer protocol overhead bytes   non-zero int64_t value

OHTGTPRTCL

<ovhd_tgt_prtcl>

specifies the number of target-to-source transfer protocol overhead bytes   non-zero int64_t value

OHSRCCHNL

<ovhd_src_cchnl>

specifies the number of source-to-target bytes sent over the control channel   non-zero int64_t value

OHTGTCHNL

<ovhd_tgt_cchnl>

specifies the number of target-to-source bytes sent over the control channel

non-zero int64_t value

SFRECVD

<sf_bytes_recvd>

specifies the number of bytes received by the controlling agent from the Security Framework (SF) layer non-zero int64_t value

SFSENT

<sf_bytes_sent>

specifies the number of bytes sent by the controlling agent to the Security Framework (SF) layer non-zero int64_t value

NWRECVD

<net_bytes_recvd>

specifies the number of bytes received by the controlling agent from the network non-zero int64_t value

NWSENT

<net_bytes_sent>

specifies the number of bytes sent by the controlling agent on the network non-zero int64_t value

UPKTRECVD

<udp_pkts_recvd>

specifies the total number of UDP packets received by both the controlling and remote agents non-zero int64_t value

UPKTRJCTD

<udp_pkts_rjctd>

specifies the total number of UDP packets rejected by both the controlling and remote agents non-zero int64_t value

UPKTSENT

<udp_pkts_sent>

specifies the total number of UDP packets sent by both the controlling and remote agents non-zero int64_t value

UPKTRSNT

<udp_pkts_resent>

specifies the total number of UDP packets resent by both the controlling and remote agents non-zero int64_t value

PBDATA

<process_data_bytes_sent>

specifies the number of bytes (a) sent by the controlling (source) agent of a process push transfer or (b) received by the controlling (target) agent of a process pull transfer non-zero int64_t value

PBUNCNSMD

<process_data_bytes_sent>

specifies the number of bytes received by the target agent of a process transfer that were not delivered to the target data sink command non-zero int64_t value

PBSENT

<stream_data_bytes_sent>

specifies the number of bytes sent by the controlling (source) agent across all channels of a streaming transfer non-zero int64_t value

PBRECVD

<stream_data_bytes_recvd>

specifies the number of bytes received by the controlling (source) agent across all channels of a streaming transfer non-zero int64_t value

%grpntfy_summary_stats%

AGTENDTM

<agent_end_time>

specifies the controlling agent process termination time (system epoch time in microseconds) non-zero int64_t value

Report_Stats and Summary_Stats Keywords

JOBID

<job_identifier>

scheduler-assigned job identifier string

PKGNAME

<package_name>

fully-qualified package name string (job template name) for the transfer

STATUS

<transfer_status>

for a stats_report message, indicates the transfer status between a particular source/target pair; one of: COMPLETE, PENDING, RUNNING, PAUSED. Always set to COMPLETE for stats_summary messages.

%grpntfy_aggregate_stats%

Substitutes clusters of “const_stats/report_stats/summary_stats” records that are available for all transfer components organized under the group component. The following is an example of a group notify command script:

   #!dds_perl 
   #

   select STDOUT;
   $| = 1;      # enable autoflush so output is not buffered

   my $constStats = '%grpntfy_const_stats%';
   my $reportStats = '%grpntfy_report_stats%';
   my $summaryStats = '%grpntfy_summary_stats%';
   my $aggregateStats = '%grpntfy_aggregate_stats%';

   print "START -- group notify cmd for '%component_path%'\n\n";
   print "=== CONST ===\n\n" . $constStats . "\n";
   print "=== REPORT ===\n\n" . $reportStats . "\n";
   print "=== SUMMARY ===\n\n" . $summaryStats . "\n";
   print "=== AGGREGATE ===\n\n" . $aggregateStats . "\n";

   exit 0;
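
The script above prints the substituted text verbatim. Because each record is terminated by a semicolon, a group notify command can also parse the text; the following minimal sketch additionally assumes that the fields within a record are space-separated keyword=value pairs:

   #!dds_perl
   #
   # Illustrative sketch: parse substituted statistics records into
   # keyword=value fields (the field form is an assumption, not normative).
   my $reportStats = '%grpntfy_report_stats%';

   foreach my $record (split /;/, $reportStats) {
       next unless $record =~ /\S/;          # ignore empty fragments
       my %field;
       foreach my $pair (split ' ', $record) {
           my ($key, $value) = split /=/, $pair, 2;
           $field{$key} = $value if defined $value;
       }
       print "files transferred: $field{FT}\n" if exists $field{FT};
   }
   exit 0;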
         

Keyword Availability

Due to the "asynchronous" nature of template execution, keywords may not always be available. Please consult the appropriate table to determine if the keyword is available. An "X" indicates commands in which the value is substituted. A "D" indicates commands in which the value is substituted, if it is defined in the template, agent or at scheduling time.

File Transfer Template: Push - Source

The following table details the keywords associated with File Transfer Template (Push - Source). An "X" indicates commands in which the value is substituted. A "D" indicates commands in which the value is substituted, if it is defined in the template, agent or at scheduling time.

 

Keywords Unavailable Initialization Setup Mapping Filter Pre Post Completion Error Finalize
agent_restarted   X X X X X X X X X
authentication_mode X X X X X X X X X X
bandwidth_throttle X X X X X X X X X X
default_unavailable_action X                  
encryption_level X X X X X X X X X X
exit_code               X X X
file_disposition           X X      
file_gmtime         X X X      
file_name         X X X      
file_size         X X X      
file_type         X X X      
job_id X X X X X X X X X X
job_name X X X X X X X X X X
job_template X X X X X X X X X X
job_template_library X X X X X X X X X X
job_template_library_owner X X X X X X X X X X
remote_bandwidth_limit       D D D D D D  
remote_system_type_available       X X X X X X  
req_agent X X X X X X X X X X
req_inception X X X X X X X X X X
req_user X X X X X X X X X X
src_agent   X X X X X X X X X
src_directory   X X X X X X X X X
src_platform   X X X X X X X X X
src_user   X X X X X X X X X
stream         X          
system_root_directory   X X X X X X X X X
tgt_agent     X X X X X X X X
tgt_directory   X X X X X X X X X
tgt_platform       X X X X X X X
tgt_user     X X X X X X X X
transfer_type   X X X X X X X X X
unavailable_agent X                  

File Transfer Template: Push - Target

The following table details the keywords associated with File Transfer Template (Push - Target). An "X" indicates commands in which the value is substituted. A "D" indicates commands in which the value is substituted, if it is defined in the template, agent or at scheduling time.

Keywords Unavailable Initialization Setup Mapping Filter Pre Post Completion Error Finalize
agent_restarted X X X X X X X X X X
authentication_mode X X X X X X X X X X
bandwidth_throttle X X X X X X X X X X
default_unavailable_action X                  
encryption_level X X X X X X X X X X
exit_code     X         X X X
file_disposition           X X      
file_gmtime         X X X      
file_name         X X X      
file_size         X X X      
file_type         X X X      
job_id X X X X X X X X X X
job_name X X X X X X X X X X
job_template X X X X X X X X X X
job_template_library X X X X X X X X X X
job_template_library_owner X X X X X X X X X X
remote_bandwidth_limit   D D D D D D D D D
remote_system_type_available   X X X X X X X X X
req_agent X X X X X X X X X X
req_inception X X X X X X X X X X
req_user X X X X X X X X X X
src_agent X X X X X X X X X X
src_directory X X X X X X X X X X
src_platform X X X X X X X X X X
src_user X X X X X X X X X X
stream         X          
system_root_directory X   X X X X X X X X
tgt_agent X X X X X X X X X X
tgt_directory X X X X X X X X X X
tgt_platform   X X X X X X X X X
tgt_user X X X X X X X X X X
transfer_type X X X X X X X X X X
unavailable_agent X                  

File Transfer Template: Pull - Source

The following table details the keywords associated with File Transfer Template (Pull - Source). An "X" indicates commands in which the value is substituted. A "D" indicates commands in which the value is substituted, if it is defined in the template, agent or at scheduling time.

Keywords Unavailable Initialization Setup Mapping Filter Pre Post Completion Error Finalize
agent_restarted X X X X X X X X X X
authentication_mode X X X X X X X X X X
bandwidth_throttle X X X X X X X X X X
default_unavailable_action X                  
encryption_level X X X X X X X X X X
exit_code     X         X X X
file_disposition           X X      
file_gmtime         X X X      
file_name         X X X      
file_size         X X X      
file_type         X X X      
job_id X X X X X X X X X X
job_name X X X X X X X X X X
job_template X X X X X X X X X X
job_template_library X X X X X X X X X X
job_template_library_owner X X X X X X X X X X
remote_bandwidth_limit   D D D D D D D D D
remote_system_type_available   X X X X X X X X X
req_agent X X X X X X X X X X
req_inception X X X X X X X X X X
req_user X X X X X X X X X X
src_agent X X X X X X X X X X
src_directory X X X X X X X X X X
src_platform X X X X X X X X X X
src_user X X X X X X X X X X
stream         X          
system_root_directory X   X X X X X X X X
tgt_agent X X X X X X X X X X
tgt_directory X X X X X X X X X X
tgt_platform   X X X X X X X X X
tgt_user X X X X X X X X X X
transfer_type X X X X X X X X X X
unavailable_agent X                  

File Transfer Template: Pull - Target

The following table details the keywords associated with File Transfer Template (Pull - Target). An "X" indicates commands in which the value is substituted. A "D" indicates commands in which the value is substituted, if it is defined in the template, agent or at scheduling time.

Keywords Unavailable Initialization Setup Mapping Filter Pre Post Completion Error Finalize
agent_restarted   X X X X X X X X X
authentication_mode X X X X X X X X X X
bandwidth_throttle X X X X X X X X X X
default_unavailable_action X                  
encryption_level X X X X X X X X X X
exit_code               X X X
file_disposition           X X      
file_gmtime         X X X      
file_name         X X X      
file_size         X X X      
file_type         X X X      
job_id X X X X X X X X X X
job_name X X X X X X X X X X
job_template X X X X X X X X X X
job_template_library X X X X X X X X X X
job_template_library_owner X X X X X X X X X X
remote_bandwidth_limit       D D D D D D D
remote_system_type_available     X X X X X X X X
req_agent X X X X X X X X X X
req_inception X X X X X X X X X X
req_user X X X X X X X X X X
src_agent   X X X X X X X X X
src_directory   X X X X X X X X X
src_platform   X X X X X X X X
src_user   X X X X X X X X X
stream         X          
system_root_directory    X X X X X X X X X
tgt_agent     X X X X X X X X
tgt_directory   X X X X X X X X X
tgt_platform       X X X X X X X
tgt_user     X X X X X X X X
transfer_type   X X X X X X X X X
unavailable_agent X                  

Process Transfer Template: Push

The following table details the keywords associated with Process Transfer Template (Push). An "X" indicates commands in which the value is substituted. A "D" indicates commands in which the value is substituted, if it is defined in the template, agent or at scheduling time.

Note: Src=source, Tgt=target
Keywords Unavailable Initialization Setup Process Completion Error Finalize
  Src Tgt Src Tgt Src Tgt Src Tgt Src Tgt Src Tgt Src Tgt
agent_restarted   X X X X X X X X X X X X X
authentication_mode X X X X X X X X X X X X X X
bandwidth_throttle X X X X X X X X X X X X X X
default_unavailable_action X X                        
encryption_level X X X X X X X X X X X X X X
exit_code          X       X X X X X X
job_id X X X X X X X X X X X X X X
job_name X X X X X X X X X X X X X X
job_template X X X X X X X X X X X X X X
job_template_library X X X X X X X X X X X X X X
job_template_library_owner X X X X X X X X X X X X X X
remote_bandwidth_limit       D   D D D D D D D   D
req_agent X X X X X X X X X X X X X X
req_inception X X X X X X X X X X X X X X
req_user   X X X X X X X X X X X X X
src_agent   X X X X X X X X X X X X X
src_directory   X X X X X X X X X X X X X
src_platform   X X X X X X X X X X X X X
src_user   X X X X X X X X X X X X X
system_root_directory       X X X X X X X X X X X
tgt_agent   X   X X X X X X X X X   X
tgt_directory   X X X X X X X X X X X X X
tgt_platform       X   X X X X X X X   X
tgt_user   X   X X X X X X X X X   X
transfer_type   X X X X X X X X X X X X X
unavailable_agent X X                        

Process Transfer Template: Pull

The following table details the keywords associated with Process Transfer Template (Pull). An "X" indicates commands in which the value is substituted. A "D" indicates commands in which the value is substituted, if it is defined in the template, agent or at scheduling time.

Note: Src=source, Tgt=target
Keywords Unavailable Initialization Setup Process Completion Error Finalize
  Tgt Src Tgt Src Tgt Src Tgt Src Tgt Src Tgt Src Tgt Src
agent_restarted   X X X X X X X X X X X X X
authentication_mode X X X X X X X X X X X X X X
bandwidth_throttle X X X X X X X X X X X X X X
default_unavailable_action X X                        
encryption_level X X X X X X X X X X X X X X
exit_code                 X X X X X X
job_id X X X X X X X X X X X X X X
job_name X X X X X X X X X X X X X X
job_template X X X X X X X X X X X X X X
job_template_library X X X X X X X X X X X X X X
job_template_library_owner X X X X X X X X X X X X X X
remote_bandwidth_limit       D   D D D D D D D   D
req_agent X X X X X X X X X X X X X X
req_inception X X X X X X X X X X X X X X
req_user   X X X X X X X X X X X X X
src_agent   X X X X X X X X X X X X X
src_directory   X X X X X X X X X X X X X
src_platform   X X X X X X X X X X X X X
src_user   X X X X X X X X X X X X X
system_root_directory       X X X X X X X X X X X
tgt_agent   X   X X X X X X X X X   X
tgt_directory   X X X X X X X X X X X X X
tgt_platform       X   X X X X X X X   X
tgt_user   X   X X X X X X X X X   X
transfer_type   X X X X X X X X X X X X X
unavailable_agent X X                        

Remote Command Template

The following table details the keywords associated with Remote Command Template. An "X" indicates commands in which the value is substituted. A "D" indicates commands in which the value is substituted, if it is defined in the template, agent or at scheduling time.

Keywords Unavailable Initialization Finalize Setup Completion Process Error
agent_restarted X X X X X X X
authentication_mode X X X X X X X
bandwidth_throttle X X X X X X X
ctl_agent X X X X X X X
ctl_platform X X X X X X X
ctl_user X X X X X X X
default_unavailable_action X            
encryption_level X X X X X X X
exit_code     X   X   X
job_id X X X X X X X
job_name X X X X X X X
job_template X X X X X X X
job_template_library X X X X X X X
job_template_library_owner X X X X X X X
remote_bandwidth_limit   D D D D D D
req_agent X X X X X X X
req_inception X X X X X X X
req_user X X X X X X X
system_root_directory X X X X X X X
tgt_agent X X X X X X X
tgt_directory X X X X X X X
tgt_platform   X X X X X X
tgt_user X X X X X X X
transfer_type X X X X X X X
unavailable_agent X            

Component Group

The following table details the keywords associated with Component Group. An "X" indicates commands in which the value is substituted. A "D" indicates commands in which the value is substituted, if it is defined in the template, agent or at scheduling time.

Variables Group Notify Command Link Trigger Command Push Job Template Error Pull Job Template Error
authentication_mode X X X X
bandwidth_throttle X X X X
components_complete X   X X
component_exit_code     X X
components_in_error X   X X
component_path X X X X
components_skipped X   X X
components_to_execute   X    
encryption_level X X X X
group X      
job_id X X X X
job_name X X X X
job_template     X X
job_template_library X X X X
job_template_library_owner X X X X
link   X    
links_complete X   X X
links_in_error X   X X
links_skipped X   X X
links_to_execute   X
req_agent X X X X
req_inception X X X X
req_user X X X X
src_agent     X  
tgt_agent       X

Sequence Diagrams

The following diagrams illustrate the sequence of file transfer command execution for push and pull file transfer templates.

Push Transfer Model

Pull Transfer Model

Tutorial

This tutorial provides an example of creating a simple “Hello World” component based on an existing component and using it in a simple Workflow or Job Template.

The tutorial also describes:

  • creating hierarchical properties
  • using values of input properties in a Perl script
  • setting the values of output properties so that they are available to other workflow components

Creating a Component

In this procedure, you will learn how to create a simple “Hello World” component by completing the following tasks:

  • Create a new Job Template Library
  • Customize an existing component (a remote command component)
  • Create inputs and outputs for the component
  • Create a command

To create a "Hello World" workflow component, follow these steps:

  1. In the Manager, select Jobs > Templates.
  2. Click Add.
  3. In the Add Job Template Library page, type a name and description of the new Job Template Library.
  4. Click Create to open the Workflow Canvas.
  5. Expand the Foundation item to display the foundation components.

     

  6. Select the Command icon and drag it to the canvas.
  7. Click in the "Command name field and rename the command to:

    "Hello_World"

    .

    Component names can contain only alphanumeric characters; use an underscore between words.

  8. Right-click the command icon and choose Edit Component.

    You may be prompted to save the Workflow or Job Template before continuing. You will also be prompted that your Workflow or Job Template does not contain a start component. Since you are creating a component and not a Workflow, click Yes to continue. (You can also enable Do not show this message again to permanently disable this message.) If you are using Internet Explorer, you may receive a warning about Internet Explorer running slowly and causing the computer to be unresponsive. You can ignore and permanently disable this message.

  9. Right-click Inputs and click Add.
  10. Fill in the prompts as follows:
    • Name: PrintHelloWorld
    • Type: Boolean
    • Options: Is Input and Is Visible
    • Value: No

      Note that customized property names are case sensitive when they appear in scripts, and any spaces in the displayed property name are removed from the internal property name (which the scripts use). The internal property names are also camel-cased (an upper case character for the first letter of each word, lower case for the remainder), with the first character of the name in lower case. For example, if you created an input called "My Test Input", it must be referenced in a script as "myTestInput" (note the lower case m). Note that it is not necessary to click OK to save your changes, since you will be adding more properties; clicking OK saves all changes made during the current session and exits the component editor screen.

  11. Right-click Outputs and click Add.
  12. Create a property called "My Custom Properties" as shown below.
  13. Fill in the prompts as follows:
    • Name: Was Hello World Printed
    • Type: Boolean
    • Options: Is Output and Is Visible
    • Value: No
  14. Right-click the Inputs property and click Add.
  15. Create a property called "My Custom Properties".

     

  16. Drag the "PrintHelloWorld" property onto "My Custom Properties" to group it.

    The group name gets embedded in the input name when you insert that property into a script (and spaces in the group name are stripped out). If you move a property to another group, the script you created when the property was in that group will no longer reference the correct property (because the property is now in a different group).

  17. Scroll down to the Commands list and click Target Command to open the script edit dialog, then enter the sample script text shown below.
  18. Click OK when complete.
  19. In the Workflow Canvas, click Save.

    Your Workflow Canvas should now appear similar to the following. Note that a "starburst" icon on a component indicates that the component has been edited/specialized. A specialized component is a published component that a user has modified but not yet published; it is a single instance of the component, available only in the workflow canvas in which it was created.

     

"Hello World" Sample Script Text

         # The PrintHelloWorld input (grouped under My Custom Properties)
         # is substituted into the script before it runs.
         my $printHelloWorld = '%Inputs.myCustomProperties.printHelloWorld%'; 

         if ("$printHelloWorld" eq "yes") { 
             print "Hello World\n"; 
             # Record the result in the output property so that other
             # components (for example, the Email component) can use it.
             print "%dds_property_set%Outputs.wasHelloWorldPrinted=Yes\n"; 
         } else { 
             print "%dds_property_set%Outputs.wasHelloWorldPrinted=No\n"; 
         }
         

Creating a Workflow

A Workflow or Job Template is a series of related tasks that together form a business process. Signiant enables users to create customized, automated workflows by combining and linking Signiant-designed Workflow or Job Template components (and components created by Signiant users). Now that you have created a “Hello World” component, you will want to test it, by incorporating it in a Workflow or Job Template and running it.

In this procedure, you will learn how to create a simple “Hello World” Workflow or Job Template by completing the following tasks:

  • Adding a start component and linking it to the Hello World Component
  • Adding an Email notification component and linking it to the Hello World Component
  • Mapping input and output properties among the components

To create a "Hello World" Workflow or Job Template, do the following:

  1. From the Manager, select Jobs > Templates.
  2. In the toolbar, click Add.
  3. Type a name in the Name field and click Create. The Job Template Library canvas will appear. On the left is the Components toolbox.  On the right is the canvas.
  4. Expand the Foundation component menu, click the ScheduledStart component and drag it to the canvas.
  5. Click the Command component and drag it to the canvas.
  6. On the canvas, select the Command component, right-click, choose Rename, type Hello_World and press the Enter key.

     

  7. On the canvas, select the Hello_World component, right-click and choose Edit Component.
  8. In the Component Editor, click Add and do the following:
    • In Name type: Print_Hello_World.
    • From the Type drop-down list, select Boolean.
    • Enable the Is Input and Is Visible options.
  9. Click Apply.
  10. In the Component Editor, expand Commands and select Target Command.
  11. Type Perl code similar to the "Hello World" sample script shown above.
  12. Link the HelloWorld component to the ScheduledStart component by clicking the arrow on the right of the ScheduledStart component and dragging the link line to the arrow on the left of the HelloWorld component:

     

  13. Expand the Notification area in the Components toolbox.
  14. Drag and drop the Email component onto the canvas.
  15. Link the HelloWorld component to the Email component.
  16. Double-click the HelloWorld component to see its properties that are exposed for mapping.

     

  17. Drag and drop the PrintHelloWorld property on the right side of the dialog to the ScheduledStart property on the left to create prompts.
  18. Drag and drop the Target Agents property on the right to the ScheduledStart property on the left.

    Not all inputs need to be mapped to a prompt. Values can be provided in the input screen on the right, in which case users are NOT prompted for these values.

  19. Click OK to close the mapping window.
  20. Double-click the Email component.
  21. Drag the Email To property from the right onto the ScheduledStart property on the left.
  22. Click in the Value field beside the Email Subject property on the right side of the screen and click the pencil/edit icon.
  23. Type the words “Was Hello World Printed:”
  24. Drag the "Was Hello World Printed" output on the left into the expression editor.

     

  25. Click Update.
  26. Click OK to close the Mapping Editor dialog.
  27. Click Save to save the workflow.

 

Running the Workflow

To run a workflow, follow these steps:

  1. Right-click on the ScheduledStart component in the workflow.
  2. Select Create Job.
  3. Select a job group to run the job in (for example, Default).
  4. Specify a name for the job in the Job Name field (for example, "HelloWorldTest").
  5. Accept the default “None” for the job schedule frequency.
  6. Choose "Yes" from the Print Hello World drop-down.
  7. Select the agent on which you want the workflow to run, from the Target Agents drop-down.
  8. Type your e-mail address in the Email To field.
  9. Click OK.
  10. Select the job and click Run.
  11. Choose Yes when prompted about running the job.
  12. When the job has finished running, click the Logs tab.
  13. Click the magnifying glass beside the Job Log item.
  14. Check your e-mail. You should have received a customized e-mail message.