19B Otm WN Workbench
GTM How To/Configuration Topic - License Screening Enhancements
AES Enhancements
Order Release to Trade Transaction
Campaign Management
Determine Trade Program Eligibility and Qualification Based on Item Origin
Specify Item Type
Rename Trade Item to Item
Origin Management
Party Site
Tariff Rate Management
Trading Partner Item
Trade Compliance
Support Automated Exceptions to Other Types of Controls
Trade Agreements
Trade Agreement Management
SmartLinks on Trade Agreements
Global Trade Intelligence (GTI)
License and License Line Facts and Dimensions Available
New Count and Cost Facts Available
UPDATE 19B
REVISION HISTORY
This document will continue to evolve as existing sections change and new information is added. All updates
appear in the following table:
19 APR 2019 | Agent Logging and Statistics | Updated document. Delivered feature in 19B.
OVERVIEW
This guide outlines the information you need to know about new or improved functionality in Oracle
Transportation & Global Trade Management Cloud Update 19B. Each section includes a brief description of
the feature, the steps you need to take to enable or begin using the feature, any tips or considerations that you
should keep in mind, and the resources available to help you.
GIVE US FEEDBACK
We welcome your comments and suggestions to improve the content. Please send us your feedback at otm-doc_us@oracle.com, and indicate that you are inquiring or providing feedback regarding the Oracle Transportation & Global Trade Management What’s New in Update 19B.
UPDATE TASKS
This section gives you information to help you plan, deploy, and validate your update. We make frequent
additions to this document, so don’t forget to check back and get the latest information before your update
starts.
Use the following resources to prepare for and validate your Oracle Transportation and Global Trade Management Cloud update.
Doc ID 2508854.1
Oracle Cloud Applications - Transportation and Global Trade Management Cloud: Quarterly
Updates - Preparation and Testing Recommendations
Doc ID 2095528.1
Oracle Cloud Applications - Transportation and Global Trade Management Cloud: Quarterly
Update Planning
Doc ID 2096782.1
Oracle Cloud Applications - Transportation and Global Trade Management Cloud: Quarterly
Update Planning FAQs
Doc ID 2098110.1
Oracle Cloud Applications - Transportation and Global Trade Management Cloud: Update Policy
FEATURE SUMMARY
Column Definitions:
Customer Action Required = You MUST take action before these features can be used by END USERS. These features are delivered disabled and you choose if and when
to enable them. For example, a) new or expanded BI subject areas need to first be incorporated into reports, b) Integration is required to utilize new web services, or c) features
must be assigned to user roles before they can be accessed.
Ready for Use by End Users (Feature Delivered Enabled): Reports plus Small Scale UI or Process-Based new features will have minimal user impact after an update. Therefore, customer acceptance testing should focus on the Larger Scale UI or Process-Based* new features.
Customer Must Take Action before Use by End Users (Feature Delivered Disabled): Not disruptive, as action is required to make these features ready to use. As you selectively choose to leverage them, you set your test and roll out timing.
Column headings: Feature | Report | UI or Process-Based: Small Scale | UI or Process-Based: Larger Scale* | Customer Action Required
TRANSPORTATION AND GLOBAL TRADE PLATFORM
Architecture
REST Services
Default Notify Stylesheets Available as Public Content
Export File Usability Improvements
Agent Logging and Statistics
User Interface Refresh
Accessibility - Keyboard Navigation
Accessibility - Skip Navigation Menu
Accessibility - Validate Usage of Color
Accessibility - Screen Reader
Accessibility - Documentation
User Experience - General - New Indicators
Home Experience Improvements - Default Colors and Theme Management Enhancements
Manager Layout - Support Removal of Reference Number Grid
Screen Set Result Improvements
Item Unified UI
Workbench
Workbench - Additional GTM Objects Supported in the Workbench
Workbench - Additional OTM Objects Supported in the Workbench
Workbench - Export to Excel Support for Workbench Tables
Workbench - Layout Messages
Workbench - Splitter Configuration - Split Existing Region
Workbench - Layout Display Format
Workbench - Manager Layout as a Region
Workbench - Selected Rows Totals for Workbench Tables
Workbench - Mass Update Support for Workbench Tables
Workbench - Multiple Masters to One Detail Table
Workbench - View Only Access
Workbench - Refresh All
Workbench - Refresh After Action
Workbench - Refresh Detail Tables When Master Table is Refreshed
Workbench - Saved Query No Longer Runs During Creation or Edit of a Workbench Table
Other Improvements
Document Management - Add Document Multi-Select Document Option
Query Based Dynamic Drop List for Reports
Support for Document Context as Pseudo Field
Oracle Transportation Management (Base)
Multi-Threading for Rating Engine
Running Manifest
Additional Release Method Order Configuration Options
Document Actions Added to Tracking Events, Document Element Added to Shipment Status Interface
External Distance Engine and Map Enhancements
Simplified External Distance Engine Configuration UI
Screen Set - Configure Map Hover Fields
Workbench Map - Expose Vendor Map Controls
Workbench - Configure Map Hover Text in Screen Set
Workbench Map - Support Multiple Maps in Workbench Layout
Workbench Map - External Distance Engine and Map - Consider Traffic Between Stops
Workbench Map - Consider Hazmat for Each Pair of Stops
Workbench Map - Lock Zoom Level and Lock View on Map
Workbench Map - Additional HERE Supported Parameters
Workbench Map - ALK Rail Routes
Workbench Map - Additional ALK Supported Parameters
Workbench Map - Map Filters
External Distance Engine and External Service Engine Consider Equipment Restrictions
Transportation Operational Planning
Clustering Merge Algorithm
Multi-Stop Consolidation for Co-Located Stops
Load Configuration - Scoring Algorithm Load Bearing
Consider Service Provider Capacity Across Days
Honor Location Inactive Flag for Intermediate Locations in Network Routing
Rule 11 and Network Routing
Tracking Event Ahead/Late Calculation Based on ETA
Ability to Turn Off Rating Within Network Routing
Center of Gravity Out of Bounds Reporting
Top-Off Orders
Out of Gauge Load Building
Network Routing - Allow Order to Start and End at Through Point
Network Routing - Cross Leg Consolidation
Oracle Fleet Management
Combination Equipment Group Usability - Return Set Scenario
Stand Alone Work Assignment Process
Solution Quality Improvement for Round-Trip Shipment Sequence vs. One-Way Shipment Sequences
Estimate Hours of Service When Tracking Events Are Received
Combination Equipment Group Usability - Support Multi-Stop Scenarios
Freight Payment, Billing, and Claims
Invoice Adjustment Cost Behavior Enhancement
Logistics Network Modeling
Logistics Network Modeling
Global Trade Management (Base)
Flex Fields for Grouping and Aggregating Data
Copy Flex Fields Using Data Configuration
Report to Show License Assignment and Balances
Display Stoplight for Restricted Party Screening on Transaction and Declaration
Shipment Group View Related Trade Transaction SmartLink
Approve or Decline Classification at the Classification Type/Code Level on an Item
Customs Description for Classification Code on Item
Workbench - Work Queue Support
GTM How To/Configuration Topic - Supplier Solicitation
GTM How To/Configuration Topic - Product Classification Process
Review Match Factor Action to Use Inverse Index
Rename Tariff Preference Types to Trade Preferences
Track Supplier Information
Accessibility Improvement for Party Screening Results
SmartLinks Between Product Classification Type and Trade Programs
SmartLinks Between Product Classification Code and Tariff Rates
GTM How To/Configuration Topic - License Screening Enhancements
AES Enhancements
Order Release to Trade Transaction
Campaign Management
Determine Trade Program Eligibility and Qualification Based on Item Origin
Specify Item Type
Rename Trade Item to Item
Origin Management
Party Site
Tariff Rate Management
Trading Partner Item
Trade Compliance
Support Automated Exceptions to Other Types of Controls
Trade Agreements
Trade Agreement Management
SmartLinks on Trade Agreements
Global Trade Intelligence (GTI)
License and License Line Facts and Dimensions Available
New Count and Cost Facts Available
TRANSPORTATION AND GLOBAL TRADE PLATFORM
ARCHITECTURE
REST SERVICES
This feature provides you with the next chapter in Oracle's adoption of REST APIs. It delivers an additional set of REST resources and supported operations, as well as completely revamped and enhanced REST API documentation on docs.oracle.com.
Resource Operations
Appointment GET
Bill GET
Claim POST, GET, PATCH, DELETE
Consol POST, GET, PATCH, DELETE
Contact POST, GET, PATCH, DELETE
Contact (Trade Parties) POST, GET, PATCH, DELETE
Corporation POST, GET, PATCH, DELETE
Driver POST, GET, PATCH, DELETE
PowerUnit POST, GET, PATCH, DELETE
Equipment POST, GET, PATCH, DELETE
EquipmentGroup POST, GET, PATCH, DELETE
EquipmentType POST, GET, PATCH, DELETE
GtmShipment POST, GET, PATCH, DELETE
GtmTransaction POST, GET, PATCH, DELETE
GtmLicense POST, GET, PATCH, DELETE
Invoice GET
PackagedItem POST, GET, PATCH, DELETE
Item POST, GET, PATCH, DELETE
Itinerary POST, GET, PATCH, DELETE
Location POST, GET, PATCH, DELETE
Order GET
OrLine GET
OrderBase GET
OrderMovement GET
Quote POST, GET, PATCH, DELETE
ServiceProvider GET, PATCH
Shipment GET, POST
SellSideShipment GET, POST
Voucher GET
Voyage GET
WorkInvoice GET
GtmCampaign POST, GET, PATCH, DELETE
NEW DOCUMENTATION
The new REST API documentation provides you with comprehensive documentation for each available resource and operation in a standard Swagger format. The documentation provides the request and response syntax, examples, and detailed field-level descriptions of all the REST API resources.
STEPS TO ENABLE
Review the REST service definition in the REST API guides, available from the Oracle Help Center > your
apps service area of interest > REST API. If you're new to Oracle's REST services you may want to begin with
the Quick Start section.
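As a minimal sketch of calling one of these REST resources, the following builds an authenticated GET request for the Shipment resource (which supports GET per the table above). The host name, resource path, and credentials below are hypothetical placeholders; confirm the actual endpoint in the REST API guide for your instance.

```python
import base64
import urllib.request

# Hypothetical instance URL and resource path -- confirm the real endpoint
# in the REST API guide on docs.oracle.com before use.
BASE_URL = "https://otmgtm-example.oraclecloud.com/logisticsRestApi/resources/v2"

def build_get_request(resource: str, query: str = "") -> urllib.request.Request:
    """Build a Basic-authenticated GET request for an OTM/GTM REST resource."""
    req = urllib.request.Request(f"{BASE_URL}/{resource}{query}")
    # Placeholder credentials in DOMAIN.USER form.
    token = base64.b64encode(b"MYDOMAIN.INTEGRATION_USER:secret").decode("ascii")
    req.add_header("Authorization", f"Basic {token}")
    req.add_header("Accept", "application/json")
    return req

# GET a page of shipments; urllib.request.urlopen(req) would return
# the JSON collection when pointed at a live instance.
req = build_get_request("shipments", "?limit=10")
```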
Almost all resources in OTM/GTM use a Unique Global Identifier, or 'GID', as the primary key for records in the database. The GID value is a concatenation of an External Identifier, or XID, and a Domain Name. Sub-resources can also have their own GID field as well as their parent GID field. Prior to this update, the data returned for a specific resource request contained all GID, XID, and Domain Name attributes, which meant there was redundancy between those values and in most of the sub-resources returned, since the parent GID was repeated in each sub-resource even though it is implicit within the enclosing context.
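The GID structure described above can be sketched as a simple split; the example domain and XID values here are hypothetical:

```python
def split_gid(gid: str):
    """Split a GID into its Domain Name and XID parts.

    A GID is the Domain Name and the External Identifier (XID)
    joined by a dot, e.g. "MYDOMAIN.ORDER001".
    """
    domain_name, xid = gid.split(".", 1)
    return domain_name, xid

print(split_gid("MYDOMAIN.ORDER001"))  # ('MYDOMAIN', 'ORDER001')
```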
NOTE: Starting with this update, the intent is to hide primary key values for requested resources and to hide parent primary keys in sub-resources. However, due to backward compatibility requirements, the default REST API configuration still returns all attributes. To enable the preferred approach (hide primary key values for requested resources and hide parent primary keys in sub-resources), set the following configuration properties:

glog.fusion.cil.restapi.config.hidePks=true
glog.fusion.cil.restapi.config.hideParentPks=true

It is highly recommended to use these property settings as soon as backward compatibility is no longer a concern. These settings are likely to become the default in a future update.
DOCUMENTATION
The old REST API Getting Started Guide has been replaced by the new interactive REST API Guide.
STEPS TO ENABLE
DEFAULT NOTIFY STYLESHEETS AVAILABLE AS PUBLIC CONTENT

TIPS AND CONSIDERATIONS

The new content can be found at Business Process Automation > Power Data > Event Management > Stylesheet Profiles.
EXPORT FILE USABILITY IMPROVEMENTS

This feature provides a number of usability improvements for users who export data as a CSV export or DBML export, or who use the Perform Integration Command to obtain the outbound transmission for a specific object. You now have the option to save the output locally via File on Local, and a new browser text panel with a Copy Text option provides a cleaner view of the exported values rendered in your browser.
STEPS TO ENABLE
If the text displayed in the browser is very large, it is recommended that you use "File to Local" as the Output Destination rather than "Copy Text". The "Browser" Output Destination should be used primarily when the goal is to take a quick glance at the data.
AGENT LOGGING AND STATISTICS

This feature introduces logging and statistics for automation agents. Cumulative statistics for each agent and agent action are captured, including the average and maximum time for agent and agent action completion. With this enhancement, agent logging generates a log entry even if the agent fails and rolls back.

The following tables are introduced:
AGENT_LOG - a log of each agent execution. Each record represents an agent initiation, completion,
error or customer-specified log instruction.
AGENT_ACTION_LOG - a log of each agent action execution. Each record represents an action
initiation, completion or error.
AGENT_STATS - cumulative statistics for each agent. This includes average and maximum time for
agent completion.
AGENT_ACTION_STATS - cumulative statistics for each agent action. This includes average and
maximum values for queue time, execution time and completion time.
AGENT_LOG Content
LOG_SEQUENCE - a unique sequenced ID for the log record. Log search results are sorted by this
sequence in ascending order. This allows agent log statements to be viewed chronologically.
AGENT_RUN_SEQUENCE - a unique ID for the execution instance of the agent. Each time an agent is
run, it generates a run sequence. This allows log statements for a particular run to be logically grouped
together.
PARENT_AGENT_RUN_SEQUENCE - if an agent execution was triggered by activity from another
agent (e.g. the RAISE EVENT agent action or a mod lifetime event after modifying data), this holds the
execution ID of the parent agent.
AGENT_GID - the agent ID
STATE - the type of log message, usually reflecting agent state:
STARTED - the first agent action has been published.
NOTE - a user LOG action has specified an informational message
WARNING - a user LOG action has specified a warning message
ERROR - either an exception occurred during an agent action, a RAISE ERROR action was run
or a user LOG action has specified an error message
COMPLETED - the last agent action has completed.
TIME - the UTC time for the log event
LOG_PROCESS_ID - a link to the System Log process ID for the agent's execution instance. Note that
process IDs are not guaranteed to be unique outside of a 24 hour period. Searches should include agent
start time and process ID.
APP_MACHINE_GID - the server the agent ran on. This is to allow for System log retrieval.
LIFETIME_EVENT - the agent event that triggered the agent execution
AGENT_DATA_QUERY_TYPE_GID - the data query type of the business object triggering the agent
execution
AGENT_BUSINESS_OBJECT - the ID of the business object triggering the agent execution
RUN_TIME - for COMPLETED records, the time spent on agent execution. Note that this could be
derived by subtracting the STARTED TIME from the COMPLETED TIME. This column is de-normalized
to simplify searching for long agents in OTM finders.
NOTES - for records created by the LOG agent action, any notes specified by the user
ERROR_AGENT_ACTION_GID - for ERROR records caused by an agent action, the action that failed
ERROR_CAUSE - for records created by the RAISE ERROR agent action, the error message specified
in the action; for agent action exceptions, the first line of the exception
ERROR_OBJECT - the business object that caused the error. This may differ from
AGENT_BUSINESS_OBJECT if the error occurred in a DTA or FOR loop.
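As the RUN_TIME note above suggests, a run's duration can be derived by subtracting the STARTED TIME from the COMPLETED TIME of one run sequence. A minimal sketch, using hypothetical AGENT_LOG rows limited to the relevant columns:

```python
from datetime import datetime, timezone

# Hypothetical AGENT_LOG rows for one agent run instance.
rows = [
    {"AGENT_RUN_SEQUENCE": 101, "STATE": "STARTED",
     "TIME": datetime(2019, 4, 19, 12, 0, 0, tzinfo=timezone.utc)},
    {"AGENT_RUN_SEQUENCE": 101, "STATE": "COMPLETED",
     "TIME": datetime(2019, 4, 19, 12, 0, 42, tzinfo=timezone.utc)},
]

def derive_run_time(rows, run_sequence):
    """Derive RUN_TIME in seconds: COMPLETED TIME minus STARTED TIME."""
    times = {r["STATE"]: r["TIME"] for r in rows
             if r["AGENT_RUN_SEQUENCE"] == run_sequence}
    return (times["COMPLETED"] - times["STARTED"]).total_seconds()

print(derive_run_time(rows, 101))  # 42.0
```

The RUN_TIME column stores this value directly so that OTM finders can search for long-running agents without computing the difference.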
AGENT_ACTION_LOG Content
LOG_SEQUENCE - a unique sequence ID for the log record. Log search results are sorted by this
sequence in ascending order. This allows agent action log statements to be viewed chronologically.
ACTION_RUN_SEQUENCE - a unique ID for the execution instance of the action. Each time the action
is run, it generates a run sequence.
AGENT_RUN_SEQUENCE - a link back to the execution instance of the agent that invoked the action
AGENT_GID - the agent ID
ACTION_FLOW - the agent block holding the action (Norm or Error)
ACTION_SEQUENCE - the sequence number of the action in the action flow of the agent. Note that (AGENT_GID, ACTION_FLOW, ACTION_SEQUENCE) uniquely identifies the action within the AGENT_ACTION_DETAILS table. If, however, the agent is modified after the log record is written, this identification may no longer be accurate.
STATE - the type of log message, reflecting action state:
STARTED - the action has begun execution
ERROR - the action threw an exception
COMPLETED - the action (and any triggered work) has completed
TIME - the UTC time for the log event
LOG_PROCESS_ID - a link to the System Log process ID for the action's execution instance
APP_MACHINE_GID - the server the action ran on
AGENT_DATA_QUERY_TYPE_GID - the data query type of the business object triggering the agent
execution
AGENT_BUSINESS_OBJECT - the ID of the business object triggering the agent execution
AGENT_ACTION_GID - the agent action ID
ACTION_DATA_QUERY_TYPE_GID - the data query type of the business object processed by the
action. This may differ from AGENT_DATA_QUERY_TYPE_GID if in a DTA or FOR loop.
ACTION_BUSINESS_OBJECT - the ID of the business object processed by the action. This may differ from AGENT_BUSINESS_OBJECT if in a DTA or FOR loop.
RUN_TIME - for COMPLETED records, the time between the start of action execution and the
completion of all related activity
ERROR_MSG - for records created by the RAISE ERROR agent action, the error message specified in
the action; for action exceptions, the first line of the exception
AGENT_STATS Content
AGENT_ACTION_STATS Content
Agent and Agent Action log records can be accessed from OTM via any Process Management menu.
Agent Logs
View Agent Log Finder Error Tab
Note that the finder assumes the Agent and Application Server exist in the database, so these criteria cannot be used to search for older data after the agent has been deleted.
On the finder result page, two smartlinks are available for Agent Log records:
Action Log - displays all Agent Action Log records related to the run instance of the current agent.
System Log - displays all System Log records from the time of the Agent Log record, using the System Log ID of the record and retrieving lines from the specified Application Server. Note that system logs cycle frequently, so this information is likely to be unavailable for historical analysis.
Agent Action Finder
Note that the finder assumes the Agent and Application Server exist in the database, so these criteria cannot be used to search for older data after the agent has been deleted.
On the finder results page, there is one smartlink: Agent Log. This brings up all agent log records for the agent run sequence that ran the action.
Agent and Agent Action statistics can be viewed directly on the Agent viewer or Agent manager. For example:
STEPS TO ENABLE
Agent logging and statistics can be controlled globally or per-agent. The following properties control default
behavior for all agents:
glog.agent.defaultLogLevel = [NONE | AGENT | ACTIONS] - the default logging level for agents that don't explicitly set their logging in AGENT.LOG_LEVEL
NONE = no agent logging is performed. There should be no performance overhead for agent
logging when it is turned off.
AGENT = agent activity is logged. This includes only AGENT_LOG records.
ACTIONS = agent and agent action activity is logged. This includes both AGENT_LOG and
AGENT_ACTION_LOG records.
glog.agent.defaultStatsLevel = [NONE | AGENT | ACTIONS] - the default statistics collection for agents that don't explicitly set their statistics collection in AGENT.STATS_LEVEL
NONE = no agent statistics are collected. There should be no performance overhead for statistics
collection when it is turned off.
AGENT = agent statistics are collected. This includes inserts/updates to the AGENT_STATS
records.
ACTIONS = agent and agent action statistics are collected. This includes inserts/updates to both
AGENT_STATS and AGENT_ACTION_STATS records
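For example, to collect full logging and statistics for every agent that does not override the defaults, both properties might be set together. This is a sketch of a property-file fragment using the two property names described above:

```
glog.agent.defaultLogLevel=ACTIONS
glog.agent.defaultStatsLevel=ACTIONS
```

Note that ACTIONS is the most detailed (and most expensive) level; NONE incurs no overhead at all.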
On the agent header, these defaults can be overridden for a specific agent:
Agent Header
Customers can add information to the agent and agent action logs via the LOG agent action. This action has
been updated to:
allow a message to be written to the System Log, the Agent Log, or both
assign a severity to the logged message
assign a specific Log ID to messages written to the System Log.
By using the ASSIGN VARIABLE action with the LOG action, customers can add the result of ad-hoc queries to
agent logs.
Note that only the Message field is required. The following defaults are used if selection is not explicitly made:
Thus, the action is backward compatible with v18 AGENT_ACTION_DETAILS records. No data migration is
needed.
Note that the finder assumes the Agent and Application Server exist in the database, so these criteria cannot be used to search for older data after the agent has been deleted.
If the Agent statistics level is AGENT or ACTIONS, the agent header will show the average and maximum Run Time since the Since date. This run time is measured from the publish of the first agent action to the completion of the agent process. Note that it does not include the time to evaluate any agent saved condition.

If the Agent statistics level is ACTIONS, each action will show the average and maximum completion time for that action. This time is measured from the time the action begins execution to the completion of the action process.
A new action on the Agent finder / manager, Reset Statistics, allows agent statistics to be reset. This can be used to test agent performance under a particular scenario (which may exercise a particular control flow path).
Unlike the Agent Log, Agent and Action Statistics are tied to an existing agent and/or agent action via
foreign keys. Any modification of an agent (including a change in log or statistics level) will reset the
statistics.
Note that the improvements made to support accessibility in this update could require modifications to your existing automated UI test scripts.

ACCESSIBILITY - KEYBOARD NAVIGATION

This feature improves keyboard navigation throughout the application. For example, it:

Eliminates keyboard traps, so you don't get stuck when navigating the application.
Makes it easier to jump between sections and tabs.
Lets you use various keyboard keys to navigate the application.
Here is a summary of the keyboard navigation for a few areas of the application.
NAVIGATOR MENU
The first level menu groups are always expanded and cannot be collapsed.
GLOBAL HEADER
Use the Tab or Shift + Tab keys to navigate the Global Header functions.
SPRINGBOARD MENU
Move to the left through the menu groups or links at the same level - Left arrow
Move to the right through the menu groups or links at the same level - Right arrow
Close a menu group and place the focus back on the higher-level menu group - Up arrow
Open a menu group and place the focus on the first item in the next-level menu group - Down arrow
Open the page when focused on a menu link - Enter
FINDER CRITERIA
Use the Tab or Shift + Tab keys to navigate the fields on the Finder Criteria page.
FINDER RESULTS
STEPS TO ENABLE
ACCESSIBILITY - SKIP NAVIGATION MENU

If you are on the home screen/springboard, the Skip Navigation menu is the following:

If you are on a page within the application where tabs are found (Search Criteria and Manager Layout), you can navigate the tabs using the Skip Navigation menu.
When you are on the Tab option you see all of the available tabs:
The Skip Navigation menu always has a Home option, which allows you to return to the Springboard.
STEPS TO ENABLE
USER EXPERIENCE - GENERAL - NEW INDICATORS

Mass Update in the legacy finder showed whether saving each object succeeded by coloring it green for success and red for failure; indicators are now included as well to convey this success or failure more vividly.
Links are displayed as underlined, light blue text.
STEPS TO ENABLE
ACCESSIBILITY - SCREEN READER
This feature provides you with a screen readable navigation path through OTM and GTM. Screen readers
convert digital text to synthesized speech and are used to help people who are blind or who have low vision to
use information technology with the same level of independence and privacy as anyone else. Screen readers
depend on a consistent UI format and layout to provide a usable and seamless navigation process.
To improve the OTM and GTM screen reader navigation experience, the following improvements were made:
Table Format
Add proper headings and alternative labels to allow the screen reader to successfully read a finder
results table to the end user
Add Roles
To allow the user to take advantage of additional benefits of using a screen reader, proper roles need to be assigned to various areas of the application:
Buttons have the role of button
Menus have the role of menu
Sections/Headings
Verify all types of fields have the proper sections/headings to ensure the screen reader can
navigate properly
Alternative Text
Add alternative text for certain areas, including images, to be read by the screen reader
STEPS TO ENABLE
ACCESSIBILITY - DOCUMENTATION
A new topic, Accessibility Features Guide, has been added to the OTM/GTM online help. This guide outlines how to navigate OTM/GTM without the use of a mouse or with a screen reader, covering:
Springboard
Navigator Menu
Finder Results Pages
Inline Edit
Tree Control
Automation Agent Actions or Error Handler
On all pages with the Unified Global Header, the first tab stop on the OTM/GTM page opens a skip
navigation list that allows you to skip the global header and jump straight to the main page.
STEPS TO ENABLE
Default colors for the OTM/GTM header and springboard have been changed to light sky blue background with
dark gray font and header icons as seen in the below image.
THEME MANAGEMENT ENHANCEMENTS - NEW COLOR SETTINGS
New features have been added to Theme Management to allow you to manage your home experience font and
icon colors. New fields added include:
Main Font Color - The font color used for the top level springboard menu items. This color is also used for
third level springboard menu items.
Springboard Submenu Font Color - The font color used for the second level springboard menu items.
Springboard Submenu Background Color - The background color used for the second level springboard
menu items. The default is white.
Header Icon Color - The color used for the Unified Global Header icons.
Header Background Color - The color used as the background color for the Unified Global Header.
Select a Color Scheme to see/choose a pre-selected grouping of colors and images to use as the basis for your
theme. There are several color schemes available by default:
Autumn Red
Crisp Green
Dark Blue
Dark Gray
Midnight Blue
Sky Blue (Default): this is the default color scheme automatically used by OTM/GTM.
STEPS TO ENABLE
Move this grid to another tab
Remove the grid from your Manager Layout
Here is a list of objects that support this feature when configuring their Manager Layout:
OTM OBJECTS
Driver
Freight Forwarding
Invoice
Invoice Line
Item
Item Qualification
Item Remarks
Location
Order Base
Order Base Line
Order Base Ship Unit
Order Release
Order Release Line
Order Release Ship Unit
Packaged Item
Power Unit
Rapid Order
Ready to Ship OB Line
Ready to Ship OB Ship Unit
Release Instructions
Ship Unit Line
Shipment
Shipment Actuals
Shipment Ship Unit
Shipment Ship Unit Line
Shipment Stop
GTM OBJECTS
Bond
Campaign
Campaign Line
Compliance Rule
Contact
License
License Line
Location
Registration
Shipment
Shipment Line
Structure
Structure Component
Trade Agreement
Transaction
Transaction Line
STEPS TO ENABLE
To remove or move a reference number grid from a supported manager layout you should follow the steps
used to modify manager layouts.
1. Go to the Manager Layout manager found in Configuration and Administration > User Configuration >
Manager Layout.
2. Select the supported manager.
3. Then select the Detail tab.
4. In the Manager Layout Detail page you can configure the manager including modifying or deleting
Reference Number Grids.
The primary reason to remove this grid is if you have grid flattened one or more reference numbers into fields
on your Manager Layout and don't want them to display in this grid as well.
Show multiple values for a single remark or reference number qualifier in the Finder Results in a comma
separated string
Reference Numbers and Remarks can display as active links in the finder results
You'll notice the first Order Release has multiple values for the Buyer Number reference number qualifier
and they are displayed here in a comma separated string
You'll notice the second Order Release has a link displayed for the Buyer Number, you can click this
active link directly from these finder results. This is configured in screen set results using the "Display as
Link" setting.
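The comma-separated display described above amounts to a simple grouping step. The following sketch illustrates the idea only; the data shape is hypothetical, not OTM's actual schema:

```python
from collections import defaultdict

def collapse_refnums(ref_rows):
    """Join multiple values per reference number qualifier into one
    comma-separated display string (data shape is hypothetical)."""
    grouped = defaultdict(list)
    for qualifier, value in ref_rows:
        grouped[qualifier].append(value)
    return {q: ", ".join(vals) for q, vals in grouped.items()}

# An order release with two Buyer Number values and one PO value:
rows = [("BUYER NUMBER", "B-100"), ("BUYER NUMBER", "B-200"), ("PO", "PO-1")]
print(collapse_refnums(rows))
# -> {'BUYER NUMBER': 'B-100, B-200', 'PO': 'PO-1'}
```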
STEPS TO ENABLE
Setting up Reference Numbers and Remarks so they display as active links is configured in screen set
results using the "Display as Link" setting.
ITEM UNIFIED UI
This feature provides one user interface for Item across both OTM and GTM. The unified item includes all the
information applicable to a user’s transportation or trade needs. The new unified item can be accessed from
the same OTM menu and GTM menu as in previous releases. In addition, the Trade Item in GTM has been
renamed to Item.
STEPS TO ENABLE
This is the new default Item Manager for both OTM and GTM.
WORKBENCH
Campaigns
Campaign Lines
Campaign Line Documents
STEPS TO ENABLE
WORKBENCH - ADDITIONAL OTM OBJECTS SUPPORTED IN THE WORKBENCH
When adding a table to the workbench, the following are now available:
Claims
Shipment Cost Object
Work Assignment Bulk Plan
STEPS TO ENABLE
Export
STEPS TO ENABLE
1. For the workbench table you want to export records from, select the Export icon to initiate the
process. In the next step you will be given the option to export all the records or only your selected
records.
2. Select the option to export either all the records in the table or the selected records.
3. Wait for the file to export and download the file.
4. Open the exported xls file.
Date: Displays the date and time at which the message occurred. This will be the server date and time.
Tab Name: Displays the name of the Workbench tab that generated the message.
Workbench Message: Displays the message generated by the Workbench.
Map Vendor Message: Displays the map vendor specific message if there is one.
After reviewing these messages you can export or clear the messages. The Messages icon continues to
display on the workbench layout toolbar until the messages have been cleared.
STEPS TO ENABLE
Upon performing the split function, the first pane contains the contents of the original and the second one is
empty and available for you to add content. In this example, the map pane was split horizontally and the
original map is on the left and the new pane is on the right.
To reverse the split operation, you can delete the empty region using the delete button that is available in the
pane.
STEPS TO ENABLE
Default Largest Font
Super Compact Smallest Size Font
STEPS TO ENABLE
You can assign a layout format to a specific workbench layout when you create, copy, or edit a layout using
Create/Copy/Edit Layout.
The layout format specified by clicking Layout Display overrides the layout format selected when the layout was
created/edited.
Tab Name
Indicate if the Edit or View manager layout is to occupy this region
For the Associated Master Table, the manager layout used is the one associated with the master
table's screen set (configured on the General tab of the screen set).
As with most detail regions within a Workbench Layout, there is a Lock/Unlock function on the region.
Click the Lock View icon to lock that manager layout and keep the results; then, if you click another row
in the master table the results displayed in the manager layout do not change.
This is useful, for example, when you have a shipment master table and a shipment manager layout
since it allows you to select a shipment to view and then lock the manager layout. With the shipment
manager layout locked, you can click around in the shipment table and the manager layout will not
change.
STEPS TO ENABLE
Keep in mind that it is suggested to use simplified/configured Manager Layouts where possible,
since the default/PUBLIC Manager Layouts display a vast amount of data and could be overwhelming
within a Workbench Layout.
Total Display
STEPS TO ENABLE
Configure the relevant result columns in the screen set to total, then use that screen set when configuring your
workbench table.
Here's an example of a screen set result column being set to total, this setting is available via the "more" button
in the screen set results configuration.
This allows you to make the same edit to multiple records at the same time.
STEPS TO ENABLE
To initiate a mass update, select the records in the table and then click the mass update icon:
STEPS TO ENABLE
1. Configure a workbench table
WORKBENCH - VIEW ONLY ACCESS
User Access has been enhanced to allow users to be limited to view only access for selected workbench
layouts. If a user has view only access to a Workbench Layout they will not be able to edit or delete the layout.
STEPS TO ENABLE
An administrator with User Access permissions can configure which users have access to specific Workbench
Layouts and in this release can indicate if access should be limited to "View Only" access by setting the flag
accordingly next to the Workbench Layout in the User Access Manager.
WORKBENCH - REFRESH ALL
The OTM data displayed in a workbench layout can become stale because other OTM users or
backend processes might have modified the data. To refresh the data, you now have 2 options, auto refresh or
manual refresh all, each of which refreshes every (non-child) component in the layout. A refresh all process (either manual or
automatic) retrieves the latest data for the existing objects on each non-child component; selection may not be
retained because the selected object might no longer exist.
There are 2 new Refresh All functions available for your workbench.
1. Auto Refresh - An ADMIN user (a user with the ADMIN role) can configure a workbench layout to refresh
automatically after a set duration (between 5 and 120 minutes), this is suitable for monitoring data with
minimal intervention. This setting is available when creating the layout or available in layout details when
editing the layout.
2. Refresh All Data (button/icon) on the Toolbar - You can initiate the workbench refresh directly within your
workbench using the new "Refresh All Data" icon/button on the workbench layout toolbar.
STEPS TO ENABLE
An ADMIN user (a user with the ADMIN role) can configure a workbench layout to refresh automatically after a
set duration (between 5 and 120 minutes). This setting is available when creating the layout or available in
layout details when editing the layout. When the duration passes a Refresh All will initiate and will do a
complete refresh for the entire workbench including a re-querying of the data displayed.
Setting a workbench layout to auto refresh can be done when creating the workbench layout or via the layout
details icon on the layout toolbar.
If a workbench is set to auto refresh, a calendar icon appears at the top of the workbench layout. Hovering
over this icon shows the time of the last refresh and the time of the next refresh.
You can initiate the workbench refresh directly within your workbench using the new refresh all icon/button on
the workbench layout toolbar.
Refresh all will do a complete refresh for the entire workbench including a re-querying of the data displayed. If a
manual refresh is performed on a workbench layout that is configured for auto refresh, the auto refresh timer is
reset. For example, if your auto refresh interval is 20 minutes and the last refresh time is 9:00, you manually
refreshed the layout at 9:15, then the next refresh time would be updated to 9:35 (20 minutes after your manual
refresh).
An important thing to consider when using either the manual refresh all function or the auto refresh
capability: the Refresh All function (either manual or auto) should not be run while a component is in Inline Edit
mode. If a change has been made to the data but not yet saved, the change will be lost upon refresh.
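The timer-reset rule above can be expressed compactly. This is a sketch of the rule as described, not OTM's implementation:

```python
from datetime import datetime, timedelta

def next_refresh(last_auto_refresh, interval_minutes, last_manual_refresh=None):
    """A manual refresh resets the auto-refresh timer, so the next refresh
    is the interval counted from whichever refresh happened last."""
    base = last_auto_refresh
    if last_manual_refresh is not None and last_manual_refresh > base:
        base = last_manual_refresh
    return base + timedelta(minutes=interval_minutes)

# Interval of 20 minutes, auto refresh at 9:00, manual refresh at 9:15:
print(next_refresh(datetime(2024, 1, 1, 9, 0), 20,
                   datetime(2024, 1, 1, 9, 15)))  # next refresh at 9:35
```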
STEPS TO ENABLE
While the saved query no longer runs automatically during creation or edit, you do have the option of
manually running the saved query when creating or editing a workbench.
OTHER IMPROVEMENTS
1. Create As New - allows you to add a new document to one or more selected objects in your finder
results.
2. Create As Link - allows you to link a selected/existing document to one or more of your selected finder
results.
3. Create As Copy - allows you to copy a selected/existing document to one or more of your selected finder
results.
STEPS TO ENABLE
The new Business Process Automation > Documents > Add option supports the same functionality (and
more) provided by the Attach Documents, Generate Document, and Upload Document actions. Where
the new Add option is available, you can reduce the number of actions provided to your users to help avoid
confusion.
STEPS TO ENABLE
A text area is available when defining the report parameter where a SQL query can be written. When the
report is run, the query is executed and a drop list is generated based on its output.
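As a sketch of the mechanism, the entered SQL is run and its rows become the drop-list entries. This example uses an in-memory SQLite stand-in with an invented table; OTM runs the query against its own schema:

```python
import sqlite3

def build_drop_list(conn, sql):
    """Run the parameter's SQL and return (value, label) pairs for a drop list.
    Assumes the query's first column is the stored value and the second
    (if present) the display label; that convention is an assumption here."""
    rows = conn.execute(sql).fetchall()
    return [(r[0], r[1] if len(r) > 1 else r[0]) for r in rows]

# Hypothetical data standing in for OTM tables:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE service_provider (id TEXT, name TEXT)")
conn.executemany("INSERT INTO service_provider VALUES (?, ?)",
                 [("SP1", "Fast Freight"), ("SP2", "Blue Line")])
print(build_drop_list(conn, "SELECT id, name FROM service_provider ORDER BY id"))
# -> [('SP1', 'Fast Freight'), ('SP2', 'Blue Line')]
```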
Document Context Pseudo Field
STEPS TO ENABLE
This feature is enabled by following the standard Screen Set Manager steps used for any other Pseudo Field.
1. Go to Configuration and Administration > User Configuration > Screen Set Manager and copy the Public
Document screen set.
2. Go to the Results tab. Use this page to configure the columns that appear on the Results page of the
business object assigned to this screen set. Enter an ID column width which consistently appears as the
first column on all Results pages.
3. The Document Context field is identified as a Pseudo and is marked with the letter P.
4. Add the Document Context field to your results.
5. Select the Document Context Qualifier to use for the field.
STEPS TO ENABLE
The initial thread count (the default value) is 1, which means rate record evaluation is performed in
sequence.
If you notice a large Backlog and a high (average) Queue Size, it is suggested to increase the number
of threads in steps of 2, checking the throughput (Backlog and Queue Size) after each new setting.
You can change, tune, and review the impact of different thread settings through
the Event Queues page. Note that changing the settings in the Event Queue is a temporary change that is
lost when the server is restarted. Once the desired thread setting has been determined, the approach is to set
the thread group before starting the server.
The Event Queues page can be accessed via Configuration and Administration > Technical Support >
Diagnostics and Tools > EventManagement > Event Queues.
You must be logged into OTM as DBA.ADMIN to access the Event Queues page.
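The step-of-2 tuning approach above can be sketched as a simple loop. The measurement function and threshold here are illustrative placeholders, not OTM APIs:

```python
def tune_threads(measure_backlog, start=1, max_threads=9, backlog_ok=100):
    """Raise the thread count in steps of 2 and re-check throughput after
    each change; return the first setting whose backlog is acceptable.
    measure_backlog(threads) -> observed backlog (hypothetical hook)."""
    threads = start
    while threads <= max_threads:
        if measure_backlog(threads) <= backlog_ok:
            return threads
        threads += 2
    return max_threads

# Toy measurement: backlog shrinks as threads grow.
print(tune_threads(lambda t: 600 // t))  # -> 7
```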
KEY RESOURCES
Review the 'Workflow Thread Tuning' section of the Cloud Getting Started Guide: https://docs.oracle.com/cd/E60665_01/otmcs_gs/OTMCG/OTMCG.pdf
RUNNING MANIFEST
This feature provides you with a shipment action to generate a running or rolling manifest view of your
shipments. The running manifest provides stop level detail from stop to stop capturing the details of the freight
being transported.
The Running Manifest action is accessed via Shipment Management > Shipment Management > Buy
Shipments > Actions > Shipment Management > View > Running Manifest.
Header
The header provides the basic shipment information about the geography, reference numbers, and
equipment. The equipment section provides the only reference to the Shipment-Equipment ID since the
equipment will subsequently be referenced by a sequence number. The equipment information also includes
the number painted on the actual truck and the type of truck, if this has been configured.
The result screen is then segmented by the stop to stop details. These sections can be expanded to provide
the information between the stops.
Running Manifest Stops
The expanded sections are each arranged to provide a header which explains the details associated with the
next stop. There is also a detailed listing of each ship unit on board between those stops. There are also 2
sections that show summaries of the equipment and compartments in terms of utilization. Each section needs
to be expanded to show the details.
Expanding the Ship Unit details provides information on the freight being carried between stops, organized
by equipment, compartment, and stop number. This allows the user to understand the inventory on board and
where it is destined. For users familiar with bulk shipping, the ship unit count will be of great utility. The
report is intended to serve a commercial purpose, as it contains details about the quantities and weights.
Ship Units - Stop 1 to Stop 2 (Partial List)
STEPS TO ENABLE
You must set the parameter RUN HAZMAT QUALIFICATION PROCESS to true to view the Hazmat Icons on
the Running Manifest.
1. NEVER
2. IF NULL (default value)
3. ALWAYS
Order Release Line Weight and Volume Options
STEPS TO ENABLE
glog.business.order.orLineVolumeCalcType
glog.order.line.alwaysRecalcWeightVolume
Tracking Events now support all of the standard Business Process Automation Document Management actions:
Attach Documents
Generate Documents
Limited Documents
Upload Documents
In addition, the Document element has been added to the ShipmentStatus (also known as the Tracking Event)
interface. This gives you the ability to load documents directly from within the ShipmentStatus interface
rather than having to load your documents using the Document interface.
STEPS TO ENABLE
KEY RESOURCES
Review the Integration Guide on the Oracle Help Center for more information regarding OTM/GTM's
integration capabilities: https://docs.oracle.com/cloud/latest/otmcs_gs/docs.htm
When the user selects an External Engine Type, the UI only displays the attributes associated with the
selected External Engine Type and automatically fills in the correct Java Class.
A new option has also been provided to turn off the Cache.
The HERE engine has been updated with new parameters as well. Additionally, the user can fix the
limited weight by configuring it on the EDE/ESE; otherwise it is dynamically calculated. This Limited Weight is an
additional parameter that is only used when the user would like to set a fixed Limited Weight and should not be
confused with the dynamically calculated method, which does not require the parameter.
The ALK Engine has been updated with new parameters as well.
STEPS TO ENABLE
This feature is all about configuration of the external engines. What is specified for the EDE (External Distance
Engine) also applies for the ESE (External Service Engine).
The user is directed through the setup with a "smart" UI. First the user selects the engine to be used. The
options that are available are based on that engine. The user does not have to remember the Java Class
either. This is done for the user.
It was decided to provide logic for specific engines based on the attributes that they supported. This way a
user would not be able to specify an incompatible attribute nor an incompatible value where there is a specified
list of values.
The downside of this is that any new engine must be on-boarded in the same manner. Since it is rare to find
new vendors for such extensive applications, it is unlikely that there will be timing issues once a new product is
announced.
It was decided that the benefits of ease of use far outweighed any considerations for generic solutions.
Vendor solutions are expensive, so it is likely that an implementation will go with only one vendor, but OTM
can handle multiple engine configurations. It is common for users to have more than one engine
configuration for the same vendor, which is why the effort was made to simplify the configuration.
STEPS TO ENABLE
Use the "Include in Hover" setting in the Screen Set to configure your map hover fields in the Workbench
Layout.
HERE MAP
Configurable Controls:
Zoom to Area
Overview Map
Measure Distance
ALK MAP
Configurable Controls:
Mouse Coordinates
Overview Map
Navigation Toolbar
Geolocation Toolbar
ORACLE ELOCATION MAP
Configurable Controls:
Magnify Area
Toolbar
STEPS TO ENABLE
In each section there is a Show Default Hover check box. This check box controls which default fields are
shown in the map hover text pop-up. This check box is selected by default. To remove the default hover fields
deselect this check box.
STEPS TO ENABLE
1. If you want to change which fields appear by default in the map hover text, expand the Hover Screen
Sets section.
2. Select user-defined screen sets for one or more objects for which you want to change the hover text.
A Hover Screen Set controls the fields which appear in the hovering pop-up window when you
click the Show Details Hover icon on the map.
Within the Hover Screen Sets section, you can select the screen sets for various objects
supported by the map component
To configure the fields displayed in the hovering pop-up window on the screen set, select
the Include in Hover Text option for fields in the Results tab via the More button.
3. Once expanded, the Hover Screen Sets section is grouped as follows:
Shipment: includes shipment-related objects such as buy shipment and shipment stop as well as
logistics network modeling related objects such as modeling shipment and modeling shipment
stop.
Order: includes order-related objects such as order movement and order release.
Driver: includes the driver object.
Network: includes network-related objects such as location, network leg, region, and region details.
STEPS TO ENABLE
The additional map regions can be added following the steps used to add a single map region in previous
releases.
WORKBENCH MAP - EXTERNAL DISTANCE ENGINE AND MAP - CONSIDER TRAFFIC
BETWEEN STOPS
The previous implementation provided only the origin's departure time to the map vendor (ALK or HERE) for
calculating the route based on historic traffic conditions. This change allows the logic to consider the estimated
departure times at all stops and pass them on to the vendors.
New logic parameter USE TRAFFIC PER STOP is added to control the use of traffic per stop pair.
You can still use the existing USE TRAFFIC parameter if the idea is to use historic traffic, but not at each
stop.
If the new parameter is turned on, the planned departure time of each stop will be calculated and the
origin stop departure time of each stop pair will be passed on to the vendors. For example, if the shipment
contains 3 stops, A-B-C, it contains 2 stop pairs, A-B and B-C. For the first pair the departure time of A will
be passed, and for the second the departure time of B will be provided.
If both USE TRAFFIC and USE TRAFFIC PER STOP are turned on, USE TRAFFIC PER STOP will take
precedence.
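The stop-pair behavior above can be sketched as follows (the data shape is illustrative, not OTM's actual objects):

```python
def stop_pair_departures(stops, departures):
    """Pair consecutive stops and attach the origin stop's planned
    departure time. For stops A-B-C this yields pairs (A, B) and (B, C),
    passing A's and B's departure times respectively to the map vendor."""
    pairs = []
    for i in range(len(stops) - 1):
        pairs.append((stops[i], stops[i + 1], departures[stops[i]]))
    return pairs

deps = {"A": "08:00", "B": "11:30"}
print(stop_pair_departures(["A", "B", "C"], deps))
# -> [('A', 'B', '08:00'), ('B', 'C', '11:30')]
```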
STEPS TO ENABLE
Previously, the external distance engine parameters for HAZMAT on the map did not vary based on the
number of stops on the shipment. This feature expands the functionality to consider Hazmat between each
pair of shipment stops. In past releases, multi-stop shipments with HAZMAT were plotted similarly to 2-stop
shipments, resulting in sub-optimal routing since the route was decided by the most hazardous item on the
shipment. With the implementation at the stop-pair level, the route is calculated based on the hazmat items
specific to each stop pair.
WORKBENCH MAP - LOCK ZOOM LEVEL AND LOCK VIEW ON MAP
This feature provides you with two new lock functions on the Workbench Map toolbar. The new lock functions
improve the usability of the map display by allowing you to control the view and zoom more directly.
Click the Lock Zoom Level icon to lock that map's zoom level and maintain the current zoom view of the
map.
For example, if you select shipment A which starts and ends on the west coast of the US and you click
the Lock Zoom Level icon, the map will stay zoomed into that shipment. Next, if you select a second
shipment, shipment B which starts and ends on the east coast of the US, the map will not redraw to show
that shipment. Instead the map zoom will remain on shipment A which starts and ends on the west coast
of the US.
If you manually zoom in or out using the mouse or map controls, the zoom level icon is automatically
unlocked.
LOCK VIEW
Click the Lock View icon to lock that map and keep the current items on the map; then, if you click
another object such as a shipment or order release to be added to the map, the map does not change.
This is useful when you have two maps, because you can add a shipment to the map and then lock that
map. With one map locked, you can add a second shipment to the second map.
STEPS TO ENABLE
By default, the Lock Zoom Level icon is shown as unlocked. When you click the Lock Zoom Level icon to lock
a map, the icon is shown as locked.
Click the Lock Zoom Level icon to lock that map's zoom level and maintain the current zoom view of the
map.
For example, if you select shipment A which starts and ends on the west coast of the US and you click
the Lock Zoom Level icon, the map will stay zoomed into that shipment.
Next, if you select a second shipment, shipment B which starts and ends on the east coast of the US, the
map will not redraw to show that shipment. Instead the map zoom will remain on shipment A which starts
and ends on the west coast of the US.
If you manually zoom in or out using the mouse or map controls, the zoom level icon is automatically
unlocked.
BOAT_FERRIES
0 - Normal
1 - Avoid
2 - Soft Exclude
3 - Strict Exclude
MOTORWAYS
If you specify motorway:-2, OTM will exclude motorways (highways) from the route calculation if it is
possible to find a route that avoids them. Otherwise, the route calculation will include motorways.
Valid values are:
0 - Normal
-1 - Avoid
-2 - Soft Exclude
-3 - Strict Exclude
TUNNELS
0 - Normal
1 - Avoid
2 - Soft Exclude
3 - Strict Exclude
RAIL_FERRIES
0 - Normal
1 - Avoid
2 - Soft Exclude
3 - Strict Exclude
DIRT_ROAD
0 - Normal
1 - Avoid
2 - Soft Exclude
3 - Strict Exclude
FIXED_LIMITED_WEIGHT
If this is set, OTM will use the weight here rather than the OTM shipment calculation.
Values entered must be in tons.
For example, if the shipment limit is 8 tons then the parameter must be set to 8.
TUNNEL_CATEGORY
Truck routing only, specifies the tunnel category to restrict certain route links. The route will pass only
through tunnels of a less strict category.
The HAZMAT_ROUTING parameter needs to be set to Y for the routing engine to calculate the value
for tunnel category. See the HAZMAT_ROUTING parameter details as well.
The valid values are B, C, D and E with B being the least restrictive.
EQUIPMENT_RESTRICTIONS
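The avoidance-style parameters above (BOAT_FERRIES, MOTORWAYS, TUNNELS, RAIL_FERRIES, DIRT_ROAD) share a four-level scale, with MOTORWAYS using negated values. A minimal sketch of assembling such a request, assuming illustrative names rather than the engine's exact request format:

```python
# Shared four-level avoidance scale described in the parameter list above.
AVOIDANCE = {"NORMAL": 0, "AVOID": 1, "SOFT_EXCLUDE": 2, "STRICT_EXCLUDE": 3}

def routing_params(boat_ferries="NORMAL", motorways="NORMAL",
                   tunnels="NORMAL", rail_ferries="NORMAL", dirt_road="NORMAL"):
    """Build a parameter dict; MOTORWAYS uses the negated scale (0/-1/-2/-3)."""
    return {
        "BOAT_FERRIES": AVOIDANCE[boat_ferries],
        "MOTORWAYS": -AVOIDANCE[motorways],
        "TUNNELS": AVOIDANCE[tunnels],
        "RAIL_FERRIES": AVOIDANCE[rail_ferries],
        "DIRT_ROAD": AVOIDANCE[dirt_road],
    }

# Soft-exclude motorways, avoid tunnels, everything else normal:
print(routing_params(motorways="SOFT_EXCLUDE", tunnels="AVOID"))
```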
STEPS TO ENABLE
RAIL is for use with ALK only. A separate license is required for ALK RAIL.
ALK's RAIL routing API is called for a RAIL shipment only if the RAIL mode's line style is set to RAIL.
To plot rail shipments, ALK's API needs these parameters for each stop location.
The following maps the above parameters to the corresponding OTM object from which the values can be
retrieved.
railroad = BuyShipment -> Mode -> Air/Rail Route Code -> Rail Route Code ID -> Carrier SCAC (from
the list of Sequences)
When the Rail Route Code has more than one sequence with the "Rail Junction Code", all the
intermediate junctions are used for fetching the routes between the stop locations.
The rail junction's Carrier SCAC code comes from the next sequence's SCAC code.
format = ALK takes in five parameters but OTM supports the following 4 parameters.
name = The actual value retrieved from OTM objects.
(Table: Format | Description | OTM Object Value | Notes)
Since there could be multiple format values available for a specific location, the preferred order in which we
pass in these format values is based on the Location Role of that location.
STEPS TO ENABLE
FERRY_DISCOURAGE
Indicates whether to discourage the use of ferries when creating the route. Default: false
ELEVATION_LIMIT
EQUIPMENT_RESTRICTIONS
STEPS TO ENABLE
WORKBENCH MAP - MAP FILTERS
Use the Map Filters hovering popup to specify criteria by which to limit the objects that display on the map. The
objects you can filter are shipments, modeling scenarios, or a combination of both.
Shipments and Scenarios can be filtered on the map. For example, you can map all of the shipments displayed
in the buy shipment table and then use the map filters to control which shipments are displayed on the map;
for instance, you can select a transport mode of LTL to limit the shipments on the map to only LTL shipments.
Transport Mode
Service Provider
Equipment Group
Rate Offering
Total Weight - Between a Low/High value
Total Volume - Between a Low/High value
Total Ship Unit Count - Between a Low/High value
Number of Stops - Between a Low/High value
Scenario Filtering Criteria:
Modeling Scenario ID
Scenario Name
Scenario Number
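The shipment filtering criteria above amount to attribute and range checks. A minimal sketch, assuming hypothetical field names rather than OTM's real data model:

```python
def apply_map_filters(shipments, transport_mode=None,
                      weight_low=None, weight_high=None):
    """Keep only shipments matching the selected mode and weight range
    (field names are illustrative assumptions)."""
    def keep(s):
        if transport_mode and s["transport_mode"] != transport_mode:
            return False
        if weight_low is not None and s["total_weight"] < weight_low:
            return False
        if weight_high is not None and s["total_weight"] > weight_high:
            return False
        return True
    return [s for s in shipments if keep(s)]

shipments = [
    {"id": "S1", "transport_mode": "LTL", "total_weight": 900},
    {"id": "S2", "transport_mode": "TL", "total_weight": 15000},
]
# Limit the map to LTL shipments only:
print(apply_map_filters(shipments, transport_mode="LTL"))
```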
There is also a Manage Map Filters function available in the Workbench Layout toolbar (using the same icon).
This allows you to create, edit, or delete layout filters.
Once a Map Filter is saved, you can assign it to a specific map when you add a map to a workbench layout or
edit an existing map region. If a map has a Map Filter assigned, the mapping criteria is applied automatically
when the workbench layout opens.
STEPS TO ENABLE
To use a Map Filter within a Workbench Layout follow the below steps.
1. Select an existing layout with a Buy Shipment or Modeling Scenario table and a map.
2. Click the (Map Filters) icon in the map toolbar.
3. In the popup, select the criteria by which to limit the objects to be mapped; for example, you can specify a
weight range to limit the shipments to those of a certain weight.
4. Click the Apply button.
5. The map will refresh and display only the objects that meet the specified criteria. The Shipment # of
# section is refreshed to display the number of shipments that meet the mapping criteria, and the
Modeling Scenarios section is also refreshed.
The first step is to provide an attribute on the Equipment Group for the TRUCK TYPE. This is only applicable
to truck equipment and is designed to categorize the general type of the equipment based on certain
characteristics. The trailer types are not user configurable, nor are the drop-downs for characteristics like roof
type; these values are used internally in the code and so cannot be user defined. In general, all
equipment will fall into these categories.
The user is able to create their own TRUCK TYPE and define the related attributes. This also helps to define
the Equipment Group.
Trailer Type is used to describe the various generic ways that equipment is combined. This is primarily used to
calculate the vehicle length including the tractor.
Self Contained is a vehicle with a motor. It is NOT a trailer. It does not need a tractor.
Semi-trailer is a trailer with a king pin and rear axles. This trailer requires equipment with a fifth wheel
hitch to pull it.
Pony Trailer is a trailer that requires a pulling vehicle with a bumper style hitch.
Drawbar Trailer is a trailer whose front king pin rests on a rotating, hitch-equipped piece of equipment,
commonly called a dolly. It also requires a bumper style hitch.
B-Train Leader is a unique semi-trailer that comes equipped with a rear extension that includes a fifth
wheel hitch for pulling another trailer.
Dromedary is a type of tractor that has a cargo carrying space between the cab and the fifth wheel
hitch. It can also be loaded with freight.
Each of these types of equipment can be categorized by the roof type and by the ability to have freight extend
beyond the equipment.
Roof Type is used to describe the characteristics necessary to correctly calculate the height.
Closed - This fixes the height as the configured external height of the Equipment Group.
Flatbed - This assumes the cargo area itself has no height; the max height of the cargo is
added to the floor height to determine the external height.
Open Top - Open top equipment has a side height; the external height is the maximum of the flatbed
calculation and the calculation based on the side height.
Overhang permitted indicates whether freight may extend beyond the basic defined equipment.
PULLING LENGTH
The length of the equipment, except for self contained equipment, does not represent the actual overall
length. HERE wants the total length while ALK wants the trailer length. While the equipment length is
easy, the total length presents a challenge since equipment with a king pin and a fifth wheel will overlap,
hence the definition of pulling length.
The power unit now has two critical attributes: TARE WEIGHT and PULLING LENGTH.
BE SURE to configure your favorite tractor as the Power Unit to Use with the parameter DEFAULT POWER
UNIT FOR LENGTH CALCULATION.
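A minimal sketch of the total-length idea, assuming each pulling unit's pulling length already nets out the fifth-wheel overlap with the unit behind it (the names and the simple summation rule are illustrative):

```python
def total_vehicle_length(pulling_lengths, last_unit_length):
    """Hypothetical total-length calculation for a combination vehicle.

    pulling_lengths: pulling length of each unit that pulls another
    (tractor, B-train leader, ...), net of king pin / fifth wheel overlap.
    last_unit_length: full length of the final trailer.
    """
    return sum(pulling_lengths) + last_unit_length
```

For a self contained vehicle there are no pulling units, so the total is simply the vehicle's own length.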
Pulling Length
EDE MAPPING
The concept of a "generic API" for the EDE and ESE is not possible because each vendor approaches the
attributes quite differently. In order to provide a robust connection, a mapping table was required. The vehicle
type column is our internal column; one vendor calls this a vehicle type and the other calls it a mode. The
mapping is done internally. This table allows the user to specify the category of the vehicle for the purposes of
input to the EDE. When combined with weight, the vendor software will use different velocities for the
vehicles. This is a very important consideration.
Width Calculation
Length Calculation
Combo Length Calculation
Dromedary Length Calculation
Height Calculation
Running Dimensions are persisted at the stop level as they will likely change between stops. The EDE is called
for distance between stops and will use the running dimensions and weights between those stops.
STEPS TO ENABLE
The logic is invoked when Equipment Restrictions is set to Y on the external engine.
TRUCK TYPE
PULLING LENGTH
TARE WEIGHT
EXTERNAL DIMENSIONS
FLOOR HEIGHT
You must select your standard tractor or power unit with the parameter.
The power unit must have TARE WEIGHT and PULLING LENGTH defined.
The Clustering Merge algorithm is appropriate for the following kinds of scenarios:
Clustering - Shipments in Each Cluster
STEPS TO ENABLE
Set your MULTISTOP CONSOLIDATION ALGORITHM TYPE to 7.Clustering Merge
The MULTISTOP CONSOLIDATION ALGORITHM TYPE option is found in the Logic
Configuration Type = MULTISTOP in the MULTISTOP CONSOLIDATION Group
The Clustering Merge algorithm is intended for solving outbound routing and scheduling problems where the
following conditions exist:
In this version:
Equipment capacity is checked for weight, volume, and ERU. Load Config logic is not used in the
algorithm to determine equipment capacity.
Fallback Consolidation:
It is possible that the consolidation logic will be unable to create a single shipment for the cluster (i.e.,
"clustering failure"). If that happens, then a "fallback" consolidation algorithm is used to create shipments
for the cluster. (Most likely, this fallback algorithm will create multiple shipments for the cluster.)
The particular consolidation algorithm used as the fallback is determined by the new property:
glog.business.consolidation.multistop.fallbackMergeAlgorithm
By default, this is set to "0" (Concurrent Savings). The other option is "1" (Sequential Savings).
NOTE: Users are encouraged to use the Clustering Merge algorithm for scenarios where clustering
failure is unlikely.
Clustering Merge only supports scenarios where there is a single source and multiple destinations.
Clustering Merge does not support cases where the consolidation decisions depend upon a range of
different equipment group capacities.
Clustering Merge does not consider load configuration logic when determining equipment capacity.
Clustering Merge does not check hours of service during the consolidation step and assumes a "full"
shipment based on equipment capacity (weight, volume, and ERUs).
Clustering Merge assumes that any order consolidation will not be made infeasible by any of these
constraints:
time window constraints,
commodity constraints,
order constraints (e.g., service provider, equipment group), or other non-capacity constraints.
Clustering Merge does not consider order priority.
Multi-stop logic was enhanced to recognize and consolidate pickup and delivery stops that are close to each
other, so as to avoid having more than one shipment service the area. The area is defined by a radius, and the intent
is that the radius is small. Examples of co-located stops are multiple stops within a large building, the same
location with multiple IDs, and locations within a very short distance of each other. This "business rule" is
stronger than the economics that would likely perform this function, since not all rates are designed to support
this policy. The process is designed to work like a "first pass" with the multi-stop logic.
Co-Located Example
1. Avoid repeat visits with multiple shipments to either the same location or to locations nearby.
2. Idea is that one shipment, if possible, should handle all the work.
3. Applies to pickups or deliveries from locations that are defined as co-located.
Examples:
Co-Located City Example
STEPS TO ENABLE
To use this feature you will need to provide a value for the MAX DISTANCE BETWEEN CO-LOCATED
STOPS parameter, which is in the Logic Configuration Type = MULTISTOP under the General group. The
default value of this parameter is 0, in which case the new co-located multistop logic is NOT used.
This parameter specifies the maximum distance within which two locations can be considered co-located.
Group input shipments with co-located stops and perform group-wise multi-stop consolidation first: In this
step we first create groups of input shipments with co-located stops. To do this, a new
multi-stop logic parameter, MAX DISTANCE BETWEEN CO-LOCATED STOPS, is introduced.
If the value of this parameter is set to 1 mile, Shipment 1 (S1 -> D1) and Shipment 2 (S2 -> D2) will be
put in the same group if one of the following two conditions is true:
Distance between S1 and S2 is less than 1 mile, and distance between D1 and D2 is less than 1
mile
Distance between S1 and D2 is less than 1 mile, and distance between D1 and S2 is less than 1
mile
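The pairing rule above can be sketched as a small predicate (the `dist` function and all names are placeholders; real distances should be computed from validated LAT/LON, as noted in the tips below):

```python
def co_located(s1, d1, s2, d2, max_dist, dist):
    """Illustrative grouping rule for MAX DISTANCE BETWEEN CO-LOCATED STOPS.

    Shipments S1->D1 and S2->D2 are grouped when their endpoints pair
    up within max_dist, in either orientation.
    """
    # Condition 1: pickup near pickup AND delivery near delivery.
    same_ends = dist(s1, s2) < max_dist and dist(d1, d2) < max_dist
    # Condition 2: pickup near the other delivery, and vice versa.
    crossed = dist(s1, d2) < max_dist and dist(d1, s2) < max_dist
    return same_ends or crossed
```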
Group co-located stops in sequencing: Again, based on the value of the parameter MAX DISTANCE
BETWEEN CO-LOCATED STOPS, we group stops that are co-located. Then we have the following two
options for generating stop sequences that have co-located stops in succession:
Generate new stop-sequences by treating co-located stop groups as one stop
Generate new stop sequences by moving co-located stops close to each other in current
sequences
TIPS AND CONSIDERATIONS
You should use LAT and LON for distance calculations. Validate the LAT and LON of the
location, and be careful NOT to use the LAT and LON of the postal code or the zip code.
The following criteria have been introduced for scoring placement combinations in 3D load configuration:
a. Put boxes to eliminate fragment spaces: This criterion encourages placements that reduce the number of
fragment spaces, which impose constraints on further placement.
b. Consider load bearing abilities during evaluation: This criterion encourages the algorithm to put strong
ship units with high load-bearing weight capacity lower in the stacking layers, and put weaker ship units
that have lower load-bearing weight capacity in the topmost layers.
With Scoring Turned On and the New Parameters in Use
STEPS TO ENABLE
There are many tuning parameters related to the use of 3D Scoring. To turn on this functionality with the
default settings provided, set the parameter CONSIDER SCORING LOGIC IN 3D PLACEMENT to TRUE.
This parameter is located in the Logic Configuration Type = CONTAINER OPTIMIZATION and is in the
CONTAINER OPTIMIZATION 3D SCORING group.
3D Based Load Configuration with 3D Scoring involves taking advantage of some extremely advanced
planning features. The setup and tuning of this feature should only be done by a user with significant OTM
planning logic experience.
Shippers who would like to take advantage of lower cost carriers with limited capacity, while still managing the
priority of the shipments (High, Medium, Low), now have a place to configure a policy on how shipping may be
delayed based on priority and how long a delay is allowed.
New Parameters
shipment to tomorrow - assuming that the change is feasible - to use tomorrow's capacity -
or does "delaying" the start time by a day add an unacceptable service time failure risk by
eliminating the available slack time in the shipment timing?
The available SPA TIME CHANGE POLICY options are:
3. Encourage Time Change.
2. Discourage Time Change For High & Medium Priority Shipments
1. Discourage Time Change For High Priority Shipments
0. Discourage Time Change
STEPS TO ENABLE
Assuming that you are already planning with capacity limits, to extend capacity limits to consider using
capacity availability on different days you will want to configure the following parameters, which can be found in
the Service Provider Assignment group:
SPA MAX DAYS TO CONSIDER - When using this parameter, it is best to set this no larger than needed, as it
can have a performance impact. For example, if two extra days are sufficient to find the right resources, then
there is no reason to set it higher than 2.
STEPS TO ENABLE
0 (default for upgrading clients): inactive locations can be used as throughpoints/via points
1 (default for new clients): inactive locations cannot be used as throughpoints in Network Routing nor as
via points (Network Routing or Cost-based Routing)
The default in glog.base.properties is 1 (inactive locations cannot be used as throughpoints in
Network Routing, etc.).
Existing clients will want to set the glog.business.location.inactiveLocationSetting to 1 to take advantage of the
new capability.
The Honor Location Inactive Flag only works for the Order Routing Method of Network Routing; the feature is
not supported with the Order Routing Method of Cost-Based Routing, nor will it work with legacy Pool-XDock
logic.
Rule 11 governs what rail shipments can be built (for which rail carriers) on consecutive legs of a multi-leg
itinerary. If a rail shipment is arriving at a rail hub using a certain rate record, the subsequent rail shipment
coming out of that hub is allowed to use only certain rate records based on the configuration of a combo route
code and the beyond flags on the rates. In other words, for consecutive rail shipments, only certain
combinations of rate records are allowed to "link up" together. The combo route code signifies that a valid
interchange between carriers exists at that junction. The rate records contain a route code and the flags "For
Beyond" and "From Beyond", which signify that the rates are qualified to be used in combination with
other rates in the overall trip. The logic checks that the rate into the hub has the FOR BEYOND flag
checked and that the rate from the hub has the FROM BEYOND flag checked. The rules about which rate
records are allowed to link up are specified in the ROUTE_CODE_COMBINATION and
ROUTE_CODE_COMBINATION_D tables ("combo tables").
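The link-up check described above can be sketched as follows, using hypothetical dict records for the inbound and outbound rates and a plain set of allowed route-code pairs standing in for the combo tables:

```python
def can_link(rate_in, rate_out, combo_codes):
    """Illustrative Rule 11 link-up check at a rail hub.

    rate_in / rate_out are hypothetical dicts for the inbound and
    outbound rate records; combo_codes stands in for the
    ROUTE_CODE_COMBINATION ("combo") tables as a set of allowed
    (inbound route code, outbound route code) pairs.
    """
    if not rate_in.get("for_beyond"):
        return False   # inbound rate not qualified to link forward
    if not rate_out.get("from_beyond"):
        return False   # outbound rate not qualified to link back
    # Finally, the pair of route codes must appear in the combo tables.
    return (rate_in["route_code"], rate_out["route_code"]) in combo_codes
```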
RULE 11 EXAMPLE
The Rule 11 Route code logic in the Network Routing Logic provides the checking that is necessary to only
permit valid combinations of rates across junctions. Without this checking, behavior would be like that of
ordinary routing logic that favors the cheapest combinations. It also assigns the Combination Route Code to
each shipment as this route code is required for tender. In Rule-11, only the first shipment is tendered but both
are required for settlement.
The following shipment attributes should be checked for Rule 11 to assure that each shipment is valid.
Route Codes.
Local - Each Shipment should have the route code from the Rate Record
Combo – this should be the same on all rule-11 shipments
Reference Numbers
BM – Bill of Lading Number - This must be the same on all rule-11 shipments for the same order.
Next Rule-11 Shipment
Previous Rule-11 Shipment
Tender Instruction
Each shipment will have a tender instruction that matches the leg that was used to build the
shipment.
STEPS TO ENABLE
The Rule11 logic is built into the planning logic and only invoked under the conditions where it is needed. This
logic has existed in the non-network process but was only recently added to the Network Routing process for
both building shipments and for the Network Rate & Route RIQ.
It should be noted that for NR R&R RIQ, the user does NOT need to have the itineraries configured for Network
Routing.
It should also be noted that for NR R&R RIQ, the user does NOT have to specify a PPS that contains the NR
logic as the option to select NR R&R RIQ will automatically invoke NR logic as does SNRO, which is a
shipment building action.
This feature provides you with the ability to set indicators based on the Tracking Event Ahead – Late Calculation
Based on ETA.
This enhancement contains saved queries and is provided within the context of a sample agent that sets the
Tracking Event indicator color depending on how late an arrival is.
The concept of Ahead/Late is introduced.
As shipments progress, the vendor will provide event data at specified points along the route.
The user will want to establish the criteria for how late an arrival is, using the Planned Arrival Time and the ETA
from each event, and visually set the indicator on the Tracking Event (Red, Yellow, Green).
Event data always contains the SPLC of the car destination (195249)
Agent Variable: Fetch the planned arrival time at the car destination
Based on 1) Common Ref Number 2) Car Destination SPLC - only 1 stop on 1 shipment will match
The enhancement will provide a sample Agent with the agent actions and the related saved queries. The user
should use this as a template to add these steps to their agents. This is the logic that is used to configure the
agent.
The user will make an IF statement with a saved condition
The saved condition will be used to determine the ahead/late time condition
The user will then use an agent action to set the indicator color.
The user will make IF/ELSE logic to set all 3 colors.
The reason this is an agent type of configuration is that the user can set the criteria for any number of
categories and is not limited to 3. The agent can then be configured through IF/ELSE logic to send
notifications or to set Shipment Indicators as well as Tracking Event indicators depending on the severity OR
the event status value.
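The IF/ELSE structure described above can be sketched as a small function; the threshold hours and the three color names here are made up, since the actual criteria are whatever the user configures in the agent:

```python
from datetime import datetime, timedelta

def lateness_indicator(planned_arrival, eta, yellow_hrs=2, red_hrs=6):
    """Illustrative sketch of the agent's IF/ELSE logic: compare the
    event ETA to the planned arrival time and map the delay to an
    indicator color. Thresholds are hypothetical, user-chosen values.
    """
    delay_hrs = (eta - planned_arrival).total_seconds() / 3600
    if delay_hrs >= red_hrs:
        return "RED"
    if delay_hrs >= yellow_hrs:
        return "YELLOW"
    return "GREEN"   # on time or ahead
```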
STEPS TO ENABLE
A caveat is that we currently have it set up to use the SPLC for rail.
This is clearly designed for the user to "build it themselves" because that allows the user to specify the criteria
for the seriousness of the delay and the level of notifications.
problems. The ability to configure the Network Routing logic used for your scenarios provides you with the
opportunity to reduce run-time for scenarios where route optimization (determining the path through the
network) does not require full network routing optimization. A good example where this parameter can be used
effectively is in any bulk plan scenarios where network routing logic is required, but where there are
no network routing decisions to be made - e.g., a single leg network that uses Work Assignment logic.
The Network Routing logic parameter ROUTING SOLUTION METHOD can have one of three values:
Optimize (the default) - Rate the network and go through the Network Routing optimization.
Simple Solve With Rating - Rate the network, but do not optimize.
Simple Solve With No Rating - Do not rate the network, and do not optimize. The logic will create a single
NRLegOption for each NRLeg, with a container cost = 1 (or the Leg Estimated Cost, if any).
STEPS TO ENABLE
Based on ship unit length, width, height and coordinate information, the EVALUATE CENTER OF GRAVITY
agent, when configured, will do the following:
1. Read the Center of Gravity values in each axis for the freight from the s_equipment where the 3-D has
persisted them.
2. Send notifications based on thresholds that the user sets when configuring the EVALUATE CENTER OF
GRAVITY agent action.
The action behaves like an IF statement. If the center of gravity is out of the defined threshold,
only then are the child actions triggered.
The Utilization Factor is provided to prevent false alarms when shipments are partially loaded. The packing
algorithm always starts in the front, so partial loads will naturally be "un-balanced" and the length threshold would
always be violated. The utilization factor allows only "full" shipments to receive the evaluation for notification.
As part of the enhancement, the 3-D logic will populate s_equipment fields below.
STEPS TO ENABLE
1. Use 3-D Load configuration. The EVALUATE CENTER OF GRAVITY agent action depends on having
the length, width, height and origin coordinates (lower left hand corner coordinate) for each ship unit on
a shipment to perform the required calculations. To provide this information you must use one of the 3-D
load configuration engines.
2. Create an agent and configure the agent action EVALUATE CENTER OF GRAVITY. EVALUATE
CENTER OF GRAVITY parameters:
Weight Utilization Factor (0-1)
Minimum height allowed (along y-axis)
Maximum height above ground level allowed (along y-axis)
Offset threshold from center to left side of equipment (along x-axis)
Offset threshold from center to right side of equipment (along x-axis)
Offset threshold from center to front of equipment (along z-axis)
Offset threshold from center to rear of equipment (along z-axis)
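The action's IF-style behavior might be sketched as follows. The axis conventions follow the parameter list above (height along y, left/right along x, front/rear along z), but the function and its arguments are hypothetical:

```python
def cog_out_of_threshold(cog, equip_center, thresholds, weight_util,
                         min_util=0.9, min_h=0.0, max_h=120.0):
    """Illustrative EVALUATE CENTER OF GRAVITY check.

    cog: (x, y, z) center of gravity of the cargo.
    equip_center: (x, z) center of the equipment floor plan.
    thresholds: (left, right, front, rear) offset thresholds.
    weight_util: actual weight utilization of the shipment (0-1).
    Returns True when child actions should be triggered.
    """
    if weight_util < min_util:
        return False   # partially loaded: skip to avoid false alarms
    x, y, z = cog
    cx, cz = equip_center
    left, right, front, rear = thresholds
    if not (min_h <= y <= max_h):          # height band along y-axis
        return True
    if x < cx - left or x > cx + right:    # lateral offsets along x-axis
        return True
    if z < cz - front or z > cz + rear:    # front/rear offsets along z-axis
        return True
    return False
```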
CONSIDERATIONS
This enhancement uses “moment calculations” to determine the Center of Gravity (C of G) of the cargo
based on the placement of the freight by 3-D Load Configuration. The standard 3-D populates this
information on the shipment-equipment and an agent action is configured to allow the user to set the
threshold in each axis of the Center of Gravity so that an alert can be issued to review the shipment.
Only the freight (cargo) Center of Gravity is evaluated and this calculation is only done using 3-D
placement.
This new capability is also useful for the evaluation of multi-stop shipments where there is no re-use of
equipment capacity. The solution supports scenarios with multiple pickups preceding multiple deliveries;
the center of gravity solution does not support interleaved pickup and delivery multi-stop scenarios. Note
that if the center of gravity feature is being used with a multi-stop scenario you can re-sequence stops to
balance the shipment. An action must be run to re-pack the shipment, and then the EVALUATE CENTER
OF GRAVITY agent can be re-run to re-calculate the center of gravity.
1. Obtain the center of gravity dimensions based on the lower left corner of the object - for perfect cuboids,
this is half of each dimension. For each cuboid object, the assumption is that the center of gravity is at
the geometric center.
2. Obtain the starting point of the cuboid relative to the “packing envelope” of the Equipment Group
/Equipment Origin.
3. Calculate the “adjusted” center of gravity of the cuboid based on the Equipment Origin.
4. For each dimension, calculate the “weighted average” of weight times the distance to the plane of the
desired axis.
5. Divide by the total weight; the result is the C of G relative to that axis.
6. Augment the center of gravity for height with the Floor Height of the Equipment.
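The numbered steps above can be sketched as a moment calculation, assuming each ship unit is a cuboid described by its weight, lower-left origin relative to the equipment origin, and its dimensions (all names are illustrative):

```python
def center_of_gravity(ship_units, floor_height=0.0):
    """Moment-based C of G sketch following the steps above.

    Each ship unit is (weight, (x, y, z) origin, (l, h, w) dims), with
    the origin at the lower-left corner relative to the equipment
    origin. Assumes each cuboid's own C of G is at its geometric center.
    """
    total_w = sum(w for w, _, _ in ship_units)
    cog = [0.0, 0.0, 0.0]
    for weight, origin, dims in ship_units:
        for axis in range(3):
            # adjusted C of G of this cuboid, weighted by its weight
            cog[axis] += weight * (origin[axis] + dims[axis] / 2)
    cog = [c / total_w for c in cog]       # divide by total weight
    cog[1] += floor_height                 # augment height (y) with floor height
    return tuple(cog)
```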
TOP-OFF ORDERS
This feature provides you with the ability to use OTM's planning logic to fill available capacity on a shipment
with product from a designated Top-off order. When configured, the process works in two steps:
1. Step 1 involves building the best shipment considering the regular (or non-Top-off) orders;
2. Step 2 involves using the designated Top-off order to fill any remaining capacity in the shipment coming
out of Step 1.
The top-off concept works for any industry where the shipper has the option to top-off or fill a shipment with
additional product after all the orders placed by the customer are planned.
For example, a shipper supplying seasonal products to a DIY retailer will often have the option to add additional
product to their shipments if the customer's orders do not fill the capacity of the shipment already planned for
that customer. Given a DIY retailer order for 42 pallets of black mulch and a shipment with the capacity to hold
44 pallets - the mulch supplier has the option to fill or top-off the shipment with two additional pallets to make a
full shipment even though the retailer did not order them.
STEPS TO ENABLE
1. You need to create a Top-off order. A Top-off order is an order release where you have checked the
Top-off Order flag on the order release.
2. You need to set the USE CONOPT MERGE IN TOP-OFF PASS parameter to TRUE.
This parameter can be found in the MULTISTOP Logic Configuration Type and is in the General
Parameter Group.
The multi-stop logic parameter USE CONOPT MERGE IN TOP-OFF PASS provides an option to
use the Conopt Merge algorithm for shipment consolidation in the top-off pass. The main objective
in the top-off pass is to utilize the equipment as much as possible by filling the shipment with Top-off
order freight.
This feature also supports a unique use case where the topping off opportunity occurs because the legal
capacity of the truck increases in the middle of a shipment. In this scenario, the shipment starts out with a street
legal limit of 40 units, but the shipment's destination is in an area (most likely a port) where the street legal
capacity increases to 50 units - so once the truck enters this area the available capacity increases by 10 units
and thus can be used for a Top-off. In this case OTM needs to understand the increased capacity of the
equipment once the shipment is in this area. This is accomplished in a straightforward and intuitive way.
1. As above the USE CONOPT MERGE IN TOP-OFF PASS parameter must be set to TRUE.
2. For the equipment groups involved you will need to configure your Capacity Override to be the lower
legal limit for the equipment group(s) involved, and the capacity of your equipment group(s) should
be set to the higher capacity limit associated with the special area where the higher capacity is allowed.
Using the example above, the Capacity Override for the equipment group would be 40 units and the
equipment group capacity would be 50 units.
3. To effectively use that additional capacity in the Top-off planning step you will need to set the IGNORE
CAPACITY OVERRIDES IN TOP-OFF PASS parameter to TRUE.
This parameter can be found in the MULTISTOP Logic Configuration Type and is in the General
Parameter Group.
This parameter allows planning to ignore the weight capacity override in the second step, which is
where the Top-off orders are used to fill the equipment to its stated capacity. The use of the
override allows the placement of a lower limit for the core orders. For example, if the container
can hold 50,000 lbs on the ocean leg but only 40,000 lbs on the highway, the equipment capacity
is configured as 50,000 lbs. The leg, which has an override of 40,000 lbs, will limit packing. This
new parameter only takes effect in the second step and ignores the override for that step. The
override is not always needed when there are orders that are designated to be planned onto one
equipment where the core order will not fill the equipment; the Top-off order will do that in the
second pass. This can happen where customers provide an order that they know will not fill a
truck and then designate an item to fill the remaining space.
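The two-step behavior can be sketched as follows, using weights only and a greedy fill; all names are invented and OTM's actual consolidation logic is far richer:

```python
def plan_with_top_off(core_orders, top_off_units, capacity,
                      capacity_override=None,
                      ignore_override_in_top_off=True):
    """Illustrative two-step top-off pass.

    Step 1 packs core orders against the override (the lower limit)
    when one is set; step 2 fills the remaining space with top-off
    units, against the full capacity when IGNORE CAPACITY OVERRIDES
    IN TOP-OFF PASS is TRUE.
    """
    step1_limit = capacity_override if capacity_override is not None else capacity
    loaded = 0
    for units in core_orders:            # step 1: core orders only
        if loaded + units <= step1_limit:
            loaded += units
    step2_limit = capacity if ignore_override_in_top_off else step1_limit
    top_off = min(top_off_units, step2_limit - loaded)   # step 2: fill up
    return loaded, top_off
```

With the mulch example above: a 42-pallet order in 44-pallet equipment leaves room for 2 top-off pallets.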
OTM works by splitting off entire ship units but is not able to split the count within an individual ship
unit. In addition, this feature only works for weight, volume, and ERU. You cannot use this functionality
with 3-D packing in this release.
Out of Gauge Diagram
The 3-D packing always starts in the lower left front corner, and this point is the origin for all equipment
dimensions. Any compartment will also start here, since negative coordinates are not allowed. An
adjustment factor was developed to allow the oversized compartment to be shifted so that the compartment
is centered with respect to the equipment.
There is another step to be taken. With this feature, it is advised to create a User Defined Pattern, since it is
likely that the load will be one large item or a few items. The UD Pattern allows the load to be centered in the
compartment or shifted as desired. The significance of this is that the OOG statistics are based on the
item that is packed and not the compartment; the item does not have to consume the entire
compartment. OOG is therefore based on what is packed, measured against the base equipment
dimensions.
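A sketch of the overhang arithmetic, assuming a centered load measured against the base equipment dimensions (the names and the centering formula are illustrative, not the actual OTM calculation):

```python
def oog_stats(load_w, load_l, equip_w, equip_l):
    """Illustrative out-of-gauge statistics for a centered load.

    Returns the per-side width overhang, the length overhang, and
    the offset that shifts the oversized compartment so it is
    centered on the equipment.
    """
    width_overhang = max(0.0, (load_w - equip_w) / 2)   # per side
    length_overhang = max(0.0, load_l - equip_l)
    # Negative offset shifts the compartment left so the load centers
    # on the equipment despite the shared (0,0,0) origin.
    start_offset = -width_overhang
    return width_overhang, length_overhang, start_offset
```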
OOG dimensions are kept in memory to be applied to the calculations for the EDE.
Adjustment for Starting Point
Example
Load Adjusted
Length Example
Out of Gauge information is persisted to the stop with the other running information.
Out Of Gauge Data on Shipment Stop
STEPS TO ENABLE
Only a very accomplished user of 3-D should attempt to use this feature.
One must first define an equipment such as a flatbed trailer. This is because oversized loads will extend
beyond the walls.
The OTM packing logic will ALWAYS respect the boundary of the container being packed; therefore it is
necessary to provide a compartment above an equipment so that the width of the equipment is preserved
AND an overhang can be calculated.
The second caveat is that the origin of the equipment (0,0,0) is the common reference point. If you place an
oversized compartment on the equipment, their lower left hand corners would coincide. Therefore we have
developed OFFSETS to allow the user to display the compartment as it would be visualized on the equipment.
The third caveat is that the load to be packed is ALWAYS started in the lower left hand corner. In UD patterns,
we have allowed the load to start in other positions such as being centered.
The end result is that one will be able to define an equipment setup and a load configuration that allows the load
to be centered on the equipment as it should be, without violating any of the existing OTM packing
paradigms. It is very important that this enhancement be compatible with the existing packing logic.
The amount of the overhang is now persisted to the stop with the other running information.
Please review the steps to configure. The use of this capability requires a user skilled in 3-D, equipment
setup, and User Defined Patterns.
We were asked why we wouldn't just allow the item to overhang. The answer is that such a change would
violate the packing boundary logic for 3-D, and that change would not be trivial. This approach is much cleaner,
although complicated.
Originate at Throughpoint
This is an example of a Network Detail with four legs. The last one is no longer needed.
Example of Network
STEPS TO ENABLE
This feature is invoked through the configuration of the Network. Prior to the enhancement, the user was
encouraged to set up separate, independent, parallel legs to start from a hub or to terminate at a hub. These
are no longer required.
NOTE: You must constrain the order with an itinerary in order to use this feature.
In Network Routing, when multiple network legs have the same leg consolidation group, OTM can consolidate
the shipments that have been created on separate legs. Leg consolidation groups allow OTM to consolidate
orders across multiple legs to save costs. This feature extends this capability so that OTM considers these
cross-leg consolidation options while making the network routing decision (which involves the
generation of the order movements that are then built into shipments). With this new capability OTM can take
full advantage of possible cross-leg consolidation options in shipment building.
SET TO TRUE
1. New logic config parameter for Network Routing: USE LEG CONSOLIDATION GROUPS IN ROUTE
SELECTION
2. Parameter: ALLOW DIFF ORIG LEG GID OMS CONSOLIDATE
3. Logic config parameter for Network Routing: PERFORM DYNAMIC CLUSTER LOGIC FOR SOURCE
REGIONS / PERFORM DYNAMIC CLUSTER LOGIC FOR DEST REGIONS
EXAMPLE 1
As shown in the following figure, we use a 4-leg network in this case to plan orders originating in the state of
Ohio and going to either the New York City or Philadelphia areas. Notice that orders going to the Philadelphia area
have only one route, through the cross-dock location in Philadelphia. New York City area orders, however, can go
direct or through the Philadelphia cross-dock location. Legs 1 and 2 have the same leg consolidation group.
This means that shipments on these legs can be consolidated together as a multi-stop shipment if it is cheaper
to do so.
In this network, suppose we plan two orders: Order 1 (NEO-SD1-0001) from Cleveland to New York City,
and Order 2 (NEO-SD2-002) from Cleveland to Philadelphia. On all network legs, consider only one equipment
group that has sufficient capacity to carry both orders together. In this case, the optimal planning solution
should have a total cost of $2,100 with the following 2 shipments: (1) Shipment 1: CLEVELAND ->
PHILADELPHIA-XDOCK -> NEW YORK CITY, and (2) Shipment 2: PHILADELPHIA-XDOCK ->
PHILADELPHIA. This can be achieved if the orders are routed that way.
However, with the old logic, planning resulted in a total cost of $3,100 with the following three shipments: (1)
Shipment 1: CLEVELAND -> PHILADELPHIA-XDOCK , (2) Shipment 2: PHILADELPHIA-XDOCK -> NEW
YORK CITY, and (3) Shipment 3: PHILADELPHIA-XDOCK -> PHILADELPHIA. The orders are currently routed
in that manner.
Take current same-leg consolidated shipment options for all the legs under the same leg consolidation
group, and combine options only if they are created on different network legs and contain
non-overlapping orders. In the given example, PRIOR to the enhancement, Dynamic Clustering logic (at
source) created the following consolidated shipment options on Legs 1 and 2.
With our enhanced changes, we will obtain the following consolidated shipment options in Dynamic
Clustering.
EXAMPLE 2
Orders:
3 Orders to DC region
Shipment 3: multistop to PHILLY x-dock and WILMINGTON x-dock, and with 1 NYC order and 1 DC
order ($2,000)
Cross Leg Consolidation Example 2
STEPS TO ENABLE
1. New logic config parameter for Network Routing: USE LEG CONSOLIDATION GROUPS IN ROUTE
SELECTION
2. Parameter: ALLOW DIFF ORIG LEG GID OMS CONSOLIDATE
3. Logic config parameter for Network Routing: PERFORM DYNAMIC CLUSTER LOGIC FOR SOURCE
REGIONS / PERFORM DYNAMIC CLUSTER LOGIC FOR DEST REGIONS
It is assumed that the problem being solved is like the one in the description section, where there is the classical
scenario of routing through a hub or direct, AND that the user is utilizing Network Routing.
Since the routing decision is economic, it helps that the rates are compatible with the problem to be solved.
The adjacent legs where the consolidation can happen must be configured with the same Leg Consolidation
Group.
Specific use case - one of many - a single driver does all 4 shipments:
Shipment 1 - Line haul, DC "A" to Location "B", with a combination equipment group out with
two loaded trailers: Equipment Group 1 (48 foot) and Equipment Group 2 (pup). Structurally, the shipment has
two S_Equipments.
Shipment 2 - At Location "B", drop Trailer 1, Equipment Group 1 (48 foot), and then have another
shipment - Shipment 2 - from Location B delivering to all the locations with Equipment Group 2 (pup), then
return back to Location B.
Shipment 3 - Then from Location B, swap Equipment Group 2 with Equipment Group 1 (48 foot), make
deliveries, and return to Location B.
Shipment 4 - [This is the shipment created by this feature.] At Location B, hook up empty Equipment
Group 2 (pup) with empty Equipment Group 1 (48 foot) and return to DC "A" with a combo equipment
group of empty equipment. Structurally, the shipment has two S_Equipments.
There are many variants on this scenario; the underlying goal is to simplify the creation of shipment 4, the
return shipment with the combo equipment with two equipment groups. In the above scenario, shipments 1, 2,
and 3 are freight-related and are, accordingly, natively created by OTM planning. This enhancement
addresses the need for the 4th shipment, where there is no freight, but the driver must return home with the
same, or similar, equipment that they left their domicile with to maintain equipment balancing.
Other use cases include variations on whether the exact same equipment is used on the line haul out and
the return leg, or other, similar equipment with different IDs. It is also possible that the work assignment is built
for both, one, or even neither of the 'local delivery' shipments (#2 and #3 from our scenario above). It is
primarily the stand-alone work assignment building logic, or manual driver assignment, that is intended to 'stitch
together' shipments 1, 2, and 3 with shipment 4, in our scenario.
While it is the traditional planning engine that is responsible for creating all freighted shipments, it is the agent
logic, which runs after bulk plan shipment building, that creates shipment 4 in our scenario above. If work
assignment logic is enabled in bulk planning, OTM planning will create shipments 1, 2, and 3, but it will be the
agent action that creates shipment 4. It will also be possible, and generally recommended, to initially build the
freighted shipments in planning without work assignment logic enabled, then have the agent action create the
return shipment, and then run stand-alone work assignment building, which in turn will output all 4 shipments
(or at least shipments 1 and 4) planned into a single work assignment.
It is worth mentioning that the return shipment created by the agent action is effectively a mirror image of the
input shipment. In many ways this new agent action is similar to the previously delivered empty reposition
equipment action. So, shipment 4 in our scenario will have the same service provider and Equipment Group/
Combo Equipment Group, and have reversed source and destination from the outbound line haul
shipment (shipment 1 in our scenario). Also, the return shipment will be created with a start time immediately
after the end time of the outbound line haul shipment. Subsequent shipment insertions into work assignments
and/or driver assignment(s) will allow the return shipment to be re-driven to slide forward to accommodate
these other potentially inserted shipments, such as shipments 2 and/or 3 in our scenario.
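The mirroring described above can be sketched as follows. The `Shipment` structure and its fields are hypothetical simplifications; the real agent action operates on full OTM buy shipments.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Shipment:
    """Minimal stand-in for an OTM buy shipment (hypothetical fields)."""
    source: str
    dest: str
    service_provider: str
    equipment_group: str
    start: datetime
    end: datetime

def build_return_shipment(outbound: Shipment, duration: timedelta) -> Shipment:
    """Mirror the outbound line haul: same service provider and equipment
    group, reversed source/destination, starting immediately after the
    outbound ends. (Later insertions into the work assignment may re-drive
    this start time forward.)"""
    return Shipment(
        source=outbound.dest,
        dest=outbound.source,
        service_provider=outbound.service_provider,
        equipment_group=outbound.equipment_group,
        start=outbound.end,
        end=outbound.end + duration,
    )
```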
STEPS TO ENABLE
An agent saved query will be required to decide which shipments the agent is to run against (shipment #1 in
our scenario) and, at least as important, which shipments to exclude (shipments 2 and 3 in our scenario).
Deciding the saved query criteria for the agent will be important. To model our scenario, at a minimum, the
criteria should specify that the agent only be run against shipments that have combo
equipment groups and contain freight. It may also be helpful to add a certain minimum distance to the
agent's saved query as well. Individual implementations of this will likely vary.
It will likely also be helpful to use the rate basis item Shipment Bobtail Distance as a penalty, even as a
weighted cost if required, to dissuade work assignment building, or optimize driver assignment, from sending
the driver directly back to the domicile without appending the return shipment. In this way, by making bobtail
less appealing, the fleet planning assignments will naturally consider appending the return shipment as a more
attractive option.
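The bobtail penalty idea can be illustrated with a toy cost comparison. The function and the dollar figures below are purely illustrative assumptions; actual costs come from OTM rate records with Shipment Bobtail Distance as a rate basis item.

```python
def assignment_cost(base_cost: float, bobtail_distance: float,
                    penalty_per_mile: float) -> float:
    """Total cost of a work assignment option, with a weighted per-mile
    bobtail penalty applied (illustrative only)."""
    return base_cost + bobtail_distance * penalty_per_mile

# Option A: driver bobtails 120 miles straight back to the domicile.
direct_bobtail = assignment_cost(base_cost=900.0, bobtail_distance=120.0,
                                 penalty_per_mile=5.0)
# Option B: append the empty return shipment; no bobtail distance.
with_return = assignment_cost(base_cost=1100.0, bobtail_distance=0.0,
                              penalty_per_mile=5.0)
```

With the penalty weighted high enough, appending the return shipment evaluates cheaper than bobtailing home, which is exactly the behavior the text recommends encouraging.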
The 'stitching together' of the set of shipments (1, 2, 3, and 4 in our example) must be done either manually or in
a batch process. As a batch process, this can be accomplished by using either stand-alone work assignment
building, also available in this update, or optimize driver assignments. Either of these
functionalities will accomplish that stitching. Also, using driver assignment to assign all 4 shipments
together will of course stitch these 4 shipments together. Note that until one of these three actions has been
undertaken, there will be no inherent relationship between shipment 4 and the freighted shipments 1-3 from our
scenario.
It is also worth noting that while this feature was added with combo equipment groups in mind, there is nothing,
other than the saved query definition, that will prevent this action from running against shipments with single/
conventional equipment groups.
STAND ALONE WORK ASSIGNMENT PROCESS
This feature will allow the Work Assignment Process to be run separately from the Bulk Plan. Previously,
automated work assignment creation was done exclusively within bulk planning. This feature will allow the
user to select a set of pre-built shipments to consider for stringing, using work assignment logic, as a separate
stand-alone process.
This new functionality will allow users to select shipments from, potentially, multiple previously run bulk plans to
submit for work assignment creation. Clients that wish to use the bulk plan to make the fleet
versus common carrier decision can do so, and then select only those fleet shipments for submission into work
assignment creation. Clients that wish to intercede between shipment creation and work assignment building
can now do so.
Also, as a result of this feature, there is a new work assignment building planning results screen that will
provide users the ability to monitor stand-alone and in-the-bulk-plan work assignment plans, terminate stand-
alone work assignment plans, and review rich captured metrics about each stand-alone work assignment plan.
STEPS TO ENABLE
Prior to this update, work assignments were created in one of two ways:
1. Manual action: the 'Create Work Assignment' action allows users to create work assignments by manually
selecting the shipments to add to them.
Or,
2. In bulk plan: the user could enable the fleet-aware bulk plan functionality and run bulk plan. This would
then create optimized work assignments (a set of shipments strung together, meant for a single resource,
from a given location, for a defined amount of time) for the selected order releases.
As a result of this feature, OTM has been enhanced with a third way. A new action, 'Optimize Work
Assignments', has been added on the buy shipment manager UI, which provides users the ability to create work
assignments from already planned shipments.
From the OTM shipment manager, the user will select multiple shipments and run the action: Optimize
Work Assignments.
On the action input screen, the user provides the Parameter Set ID to be used to execute this action, and
optionally adds a description.
This action will utilize the RESOURCE SCHEDULER CONFIG ID configured in the selected parameter
set to create resource schedule instances (RSIs), if they are not already created, and to continue with
work assignment building.
Using these RSIs, work assignments will be created based on the constraints specified on the RSIs, for
the given shipments.
On the resulting page, OTM will create a work assignment planning results page.
Additionally, the work assignment planning results page can be viewed independently, similar to
bulk plan results, once the plan has been started.
(Independent navigation to the work assignment planning results: Fleet Management ->
Planning Results -> Work Assignment Planning)
Users can invoke the optimize work assignment functionality from two different places.
1. Shipment management -> Buy Shipment -> Actions -> Fleet management -> Manage Work Assignment -
> Optimize Work Assignments
2. Fleet management -> Process management -> Optimize Work Assignments; this allows the process to be
scheduled, etc.
Lastly, it is now possible to terminate a long-running work assignment bulk plan. This can be accomplished
from the work assignment bulk plan screen.
A work assignment bulk plan results page is always created. This is true whether the user uses standard bulk
plan to create work assignments or uses this feature. In a regular bulk plan of orders, when create work assignment is
enabled, OTM will generate work assignments at the end of the bulk plan. At this time, OTM will create a
work assignment bulk plan results page to collect all the work assignments created from that bulk plan,
alongside the results page from that standard bulk plan.
The following are the values from the planning result page:
Shipment Details:
Distance and Duration:
Resources:
Work Assignments:
Number of WA created
Maximum Shipments
Minimum Shipments
Average Shipments
Resource Schedule Instances Available: the number of resource instances created plus the number of
resource instances pulled in from the database.
Resource Schedule Instances Newly Generated: the number of resource instances created during planning.
From the Work Assignment Planning Metrics tab:
Resource
Available
Used
Work Assignment
Minimum Shipments
Maximum Shipments
Average Shipments
Minimum Cost
Maximum Cost
Average Cost
Minimum Total Duration
Maximum Total Duration
Average Total Duration
Minimum Slack Duration
Maximum Slack Duration
Average Slack Duration
Minimum Rest Duration
Maximum Rest Duration
Average Rest Duration
Minimum Total Distance
Maximum Total Distance
Average Total Distance
Average Resource Utilization
Average Total Time Utilization
Average Slack to Total Time Utilization
Average Rest to Total Time Utilization
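The min/max/average figures in the metrics list above are straightforward aggregates over the created work assignments. A sketch of how a few of them could be computed (the dict keys `shipments` and `cost` are hypothetical, not OTM field names):

```python
from statistics import mean

def work_assignment_metrics(assignments):
    """Compute a subset of the figures shown on the work assignment planning
    results page from a list of work assignments, where each assignment is a
    dict with hypothetical 'shipments' and 'cost' entries."""
    counts = [len(wa["shipments"]) for wa in assignments]
    costs = [wa["cost"] for wa in assignments]
    return {
        "number_of_wa": len(assignments),
        "minimum_shipments": min(counts),
        "maximum_shipments": max(counts),
        "average_shipments": mean(counts),
        "minimum_cost": min(costs),
        "maximum_cost": max(costs),
        "average_cost": mean(costs),
    }
```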
NOTE: The shipment's transport mode should match the RS mode; shipments that do not match will
not be considered in work assignment creation, and their count will not be included in the shipments assignable
values on the results page. The following shipments are excluded:
1. Sell Shipments
2. Shipments with different RS mode
3. Consol shipments.
4. CM shipments
5. Shipment Schedule Type is specified and it's not 'GROUND SERVICE', or Schedule Type is
'DETACHABLE TRIP' and shipment status value is 'RESERVATION_OPEN'.
6. Buy shipments that have a driver already assigned.
None of these shipments will be part of a work assignment, and the count of all the other, valid shipments will be
displayed under shipments assignable.
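A sketch of an eligibility filter implementing the exclusion rules above. The field names are hypothetical stand-ins, and rule 5 is implemented in its literal reading (any specified schedule type other than GROUND SERVICE excludes the shipment); the actual OTM check may differ.

```python
def is_assignable(shipment: dict, rs_mode: str) -> bool:
    """Decide whether a buy shipment counts toward 'shipments assignable'
    (hypothetical field names, not actual OTM columns)."""
    if shipment.get("direction") == "SELL":          # rule 1: sell shipments
        return False
    if shipment.get("transport_mode") != rs_mode:    # rule 2: RS mode mismatch
        return False
    if shipment.get("is_consol"):                    # rule 3: consol shipments
        return False
    if shipment.get("is_cm"):                        # rule 4: CM shipments
        return False
    schedule = shipment.get("schedule_type")
    # Rule 5, read literally: excluded when a schedule type is specified and it
    # is not GROUND SERVICE (the DETACHABLE TRIP / RESERVATION_OPEN case
    # named above falls under this reading).
    if schedule is not None and schedule != "GROUND SERVICE":
        return False
    if shipment.get("driver_assigned"):              # rule 6: driver already assigned
        return False
    return True
```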
As the user selects buy shipments as input for the work assignment creation, some of these shipments may
already be a part of existing work assignments.
Any shipments that exist on an existing work assignment are eligible for re-planning in a new work
assignment build.
If, however, the work assignment status is WA_DRIVER_ASSIGNMENT_ASSIGNED, any and all shipments
from that work assignment will be excluded from the work assignment build.
If all shipments from a given existing work assignment are included, the entire work assignment is
disbanded prior to new work assignments being built.
If one or some, but not all, shipments from a given work assignment are included in a new work
assignment build, the original work assignment will remain and be re-driven, and the selected shipments
will be removed from that work assignment prior to new work assignments
being built.
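The treatment of existing work assignments described above can be sketched as a pre-processing pass. The structures here (a `status` string and a `shipments` set per assignment) are hypothetical simplifications.

```python
def preprocess_existing_assignments(existing, selected):
    """Classify each existing work assignment before a new build, per the
    rules above. `existing` maps work assignment IDs to dicts with a
    'status' and a set of 'shipments'; `selected` is the set of shipment
    IDs submitted to the new build."""
    plan = {}
    for wa_id, wa in existing.items():
        overlap = wa["shipments"] & selected
        if not overlap:
            plan[wa_id] = "untouched"
        elif wa["status"] == "WA_DRIVER_ASSIGNMENT_ASSIGNED":
            plan[wa_id] = "excluded"        # none of its shipments replan
        elif overlap == wa["shipments"]:
            plan[wa_id] = "disbanded"       # all of its shipments are included
        else:
            plan[wa_id] = "shipments_removed_and_redriven"
    return plan
```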
This enhancement addresses the proper sequencing of stops during multi-stop shipment creation. The goal of
this enhancement is to allow the best “round-trip” fleet rate to be considered for a stop sequence that minimizes
distance inclusive of depot stops, as well as the best “straight line” common carrier rate to be considered for a
stop sequence that minimizes distance exclusive of depot stops. These two potential stop sequence solutions
can then be compared to allow the selection of the best overall rate offering/stop sequence combo. In this way
round-trip aware shipments, or those with depots, can be properly contrasted with common carrier-type
shipment stop sequences, where returning back to a depot is not a consideration.
The previous algorithm combined shipments by generating the best stop sequences of the combined shipments
and choosing the best sequence that is found to be feasible via various checks, rating, and driving processes.
The new approach will be more rate centric. For a set of shipments being combined, it will generate a list of
applicable rates. These will be gathered and evaluated. Each applicable rate will go through a combining
process that is similar to the current process (determine best sequences and find best sequence that is feasible
for various checks, rating and driving). The best solution from across all rate solutions will then be
chosen. Crucially, the consideration of depot profiles within the combination logic will now be dependent upon
the depot applicable status of the rate.
Note that only the depot applicable attribute of the rate geo (actually the rate offering associated with the rate
record) is considered within this new multi-stop logic. This allows runtime improvement by sequencing the
shipment stops for only one rate that is depot applicable and one rate that is not depot applicable, instead of
going through the stop sequencing logic for every possible rate. The assignment of a rate to the resulting
shipment stop sequence is not restricted to just the rate that was used to determine the stop sequence.
Opening the rating process to all rates allows just two passes through the sequencing logic, one pass using the
depot profile and one pass without the depot profile.
A new Multistop Logic Configuration parameter was introduced: MULTISTOP USE RATE CENTRIC SEQUENCING.
When true, the multi-stop logic will loop through each rate that is compatible with the shipments being merged. Stop
sequencing will behave differently for a rate that is depot applicable than for a rate that is not depot
applicable. The new rate-centric sequencing applies to the 3-opt, 2-opt, and MIP sequencers.
The existing MULTISTOP MAXIMUM NUMBER OF SEQUENCES parameter will be applied to the sequencing
processing for each rate geo, not across all rates collectively.
Special consideration will be given to the MULTISTOP START OR END AT DEPOT REWARD and
MULTISTOP USE RETURN MILES IN SEQUENCING LOGIC parameters. When the rate is determined to be
depot applicable, then these parameters will be considered, otherwise they will be treated as False.
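The rate-centric approach above can be sketched as a two-pass loop: one sequencing pass per depot-applicability value, with every compatible rate then evaluated against the matching sequence. The callables `sequence_stops` and `rate_cost` below are hypothetical stand-ins for OTM's sequencing and rating engines.

```python
def rate_centric_best_solution(rates, sequence_stops, rate_cost):
    """Two-pass stop sequencing: build one stop sequence with the depot
    profile (for depot-applicable rates) and one without, cache them, then
    rate every compatible rate against the matching sequence and keep the
    cheapest (cost, rate id, sequence) solution."""
    sequences = {}   # at most one sequencing pass per depot-applicability value
    best = None
    for rate in rates:
        depot = rate["depot_applicable"]
        if depot not in sequences:
            sequences[depot] = sequence_stops(use_depot=depot)
        cost = rate_cost(rate, sequences[depot])
        if best is None or cost < best[0]:
            best = (cost, rate["id"], sequences[depot])
    return best
```

Note how the sequencing logic runs at most twice regardless of how many rates are evaluated, which is the runtime improvement the text describes.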
STEPS TO ENABLE
When using this functionality, a depot profile should exist and parameters should indicate that depots should be
used. The Rate Offering Depot Applicable flag must be set to True (checked) in order to distinguish between
rates that are depot applicable, meant for the fleet, and those that are not, meant for common carriers.
The data should create a multi-stop shipment using depot stops during sequencing, so that stops fit a petal route.
As a result of this feature, when OTM receives a CAT/CAL (current available time/current available location),
which may or may not be [shipment] stop related, OTM will go back to the most recent NAT/NAL (next
available time/next available location) for that driver, and drive forward from that time and place. Further, OTM will
use known shipment data to further inform this HOS calculation.
This new functionality drives forward from the most recent known NAT/NAL information or known shipment
information. Inside the inner workings of OTM, this information comes from the most recent
record that has been inserted into the DRIVER_ASSIGNMENT table. Events that insert records into that table
are: assignment of a shipment, completion of a shipment with a tracking event indicating it, or a Next Available
Time (NAT)/Next Available Location (NAL) override entry, either from the UI or via tracking event.
By way of example, suppose a driver had a NAT/NAL of, say, Tuesday, February 26th at 8:00 am in Philadelphia,
PA, USA, representing the start of their shift. Let's assume that after a shipment assignment for that day, that driver,
with fresh hours, picked up a load at stop 1 at 9:00 am and began doing their work. Let's say that at 12:30 pm,
3.5 hours of driving later, the driver arrives at the second stop, in Scranton, PA, USA, at which point they
submit a tracking event to OTM indicating that they arrived at stop 2 at 12:30. Let's assume that this tracking
event does not contain any hours remaining/hours consumed information.
Previously, OTM's driving-forward HOS calculations from stop 2 for this driver would assume the driver had fresh
hours at stop 2, simply because OTM did not receive any hours consumed or hours remaining information on
the tracking event. As a result of this feature, OTM will now go back to the most recent, non-current-shipment
record in the DRIVER_ASSIGNMENT table (Tuesday, February 26th at 8:00 am in this case) and drive forward
from there to obtain its HOS calculation. Note that in this scenario, the most recent, non-current-shipment HOS
record was Tuesday, February 26th at 8:00 am, yet the driver actually started at 9:00 am with 3.5 hours of
driving. The OTM HOS calculation will actually use the shipment start time to estimate the driver's consumed/
remaining hours. So the driver's hours consumed, in this case, will be 3.5 hours, not 4.5 hours. OTM will now
perform HOS re-drives for downstream shipment stops, and even future shipment assignments, with this more
accurate HOS information.
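The worked example reduces to a simple calculation. This sketch captures only the point being made (work is assumed to begin at the shipment start, not at the NAT); real HOS re-drives model duty segments, breaks, and rest in far more detail.

```python
from datetime import datetime

def hours_consumed(last_nat: datetime, shipment_start: datetime,
                   current_time: datetime) -> float:
    """Drive forward from the most recent DRIVER_ASSIGNMENT record, but use
    known shipment data where available: the time between the NAT and the
    shipment start is not counted as consumed (simplified illustration)."""
    effective_start = max(last_nat, shipment_start)
    return (current_time - effective_start).total_seconds() / 3600.0
```

For the scenario above (NAT 8:00 am, shipment start 9:00 am, arrival 12:30 pm), this yields 3.5 consumed hours rather than the 4.5 a naive drive-forward from the NAT would give.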
NOTE: There will be no behavioral change in the OTM HOS calculation if users continue to enter tracking
events with hours remaining information. This feature will, however, make HOS calculations inherently better and
more informed for those users that do not submit CAT/CAL tracking events that include hours remaining
information.
STEPS TO ENABLE
If users wish to prevent this new behavior, submitting Current Available Time (CAT)/Current
Available Location (CAL) tracking events with hours remaining or hours consumed information will continue to
work as it has.
There are three use cases using LIFO (last in, first out) multi-stop shipment:
1. Single pick, multiple drops: Pick-> Drop -> Drop ->Drop etc.,
2. Multiple picks with a single drop: Pick-> Pick-> Pick-> Drop
3. Multiple picks and multiple drops: Pick-> Pick-> Pick-> Drop-> Drop ->Drop etc.
It is a combination of the freight-related stops and special services that determine where equipment is either
picked up (using the special service PICKLOADED), or dropped off, (using the special service
DROPLOADED). There is also a scenario using non-LIFO. These scenarios are explained in further detail in
the new feature summary.
STEPS TO ENABLE
Note that the behaviors of OTM described in the 3 outlined use-cases are all within the context of a bulk plan.
Example: P(1) --> D(2) --> D(3) [drop equipment 1 at this stop] --> D(4)
In this use case, let's assume we have a set of s ship units for each drop-off stop (each P/D pair). Suppose stop
3 has a DROPLOADED special service and the other drop-off stops don't have such a special service (in which case,
OTM will assume they are LIVE UNLOAD). Now, calculating backwards from the last drop-off stop, OTM finds
the next stop with the DROPLOADED special service (stop 3 in this case). All s ship units for stop 4
are sent to conopt along with all the child equipment groups (as defined on the combination equipment
definition). Conopt will then pack the s ship units into one or more child equipment and return the result to
planning. For the next stop, moving backwards, stop 3, OTM finds that it has the DROPLOADED special
service. Now OTM will send the s ship units that are dropped at these stops (i.e., dropped off at stops 2 and 3)
as packing items, and the child equipment groups that were not yet used, to conopt. Conopt will pack the s ship
units into these child equipment, and then return. Planning continues this way until all the s ship units are
packed.
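The backward sweep in this case can be sketched as follows: walk the drop-off stops from last to first, cutting a new conopt packing group at each DROPLOADED stop (exclusive). This is a simplified illustration of the grouping only; the actual packing of s ship units into child equipment is done by conopt.

```python
def backward_packing_groups(drop_stops, droploaded):
    """Group drop-off stops into conopt calls for the single-pick,
    multi-drop case: iterate from the last stop backward and start a new
    group whenever a DROPLOADED stop is reached (exclusive). Returns the
    stop groups in the order they would be sent to conopt."""
    groups, current = [], []
    for stop in reversed(drop_stops):
        if stop in droploaded and current:
            groups.append(current)
            current = []
        current.append(stop)
    if current:
        groups.append(current)
    return groups
```

For the example above (drop-off stops 2, 3, 4 with DROPLOADED on stop 3), the first conopt call covers stop 4 and the second covers stops 2 and 3, matching the walkthrough.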
Example: Multiple pickups and single drop-off P(1) --> P(2) [pickup equipment 2 at this stop] --> P(3) --->
D(4)
In this case, OTM would have a set of s ship units for each pickup stop and one delivery stop. Suppose in this
case stop 2 has a PICKLOADED special service, and the other pickup stops don't have such a special service
(which again indicates that they are LIVE LOAD). Now, from the first pickup stop forward, OTM finds the next stop
with the PICKLOADED special service, in this case stop 2. OTM sends the s ship units that are picked up
at all the stops from the first stop to the pick-loaded stop (exclusive), in this case stop 1, as the packing items, and all the
child equipment groups as the packing resources, to conopt. Conopt will pack the s ship units into one or more
child equipment and return the result to planning. Next, from stop 2 forward, OTM looks for the next stop
with the PICKLOADED special service among the remaining pickup stop(s); none exists in this case. Next,
OTM will send the s ship units that are picked up at these stops (i.e., picked up at stop 3) as packing items, and the child
equipment groups that were not yet used as packing resources, to conopt. Conopt will pack the s ship units into
these child equipment, and then return that result to planning. OTM continues this until all the s ship units are
packed.
Example: Multiple pickup and multiple drop-off P(1) ---> P(2) (PICKLOADED) ---> P(3) ----> D(4) --> D(5)
--> D(6) (DROPLOADED) --> D(7)
The packing of combo equipment will happen in the following fashion:
1. From the first pickup stop, go forward to the first stop with the PICKLOADED special service (exclusive); if
no such PICKLOADED stop exists, go to the last pickup stop. Collect all the s ship units that are picked up at
these stops. Then OTM does the following:
a. From the last stop going backward, find the previous drop-off stop with DROPLOADED (exclusive); if no
such drop-off stop exists, use the first drop-off stop. Send the s ship units that are dropped off at all these
stops as packing items, and all child equipment of the combo as packing resources, to conopt. One or more
child equipment will be packed and returned to planning. If one or more s ship units cannot be packed, stop
the process.
b. If there are unpacked s ship units, then from the DROPLOADED stop in the above step, go backward and
find the previous drop-off stop with DROPLOADED (exclusive); if no such drop-off stop exists, use the first
drop-off stop. Send the s ship units that are dropped off at all these stops as packing items, and the
remaining child equipment of the combo as packing resources, to conopt. One or more child equipment will
be packed and conopt will return a result to planning. Again, if some s ship units cannot be packed, stop the
process. OTM repeats this step until all the s ship units are packed.
2. Go forward to the next pickup stop with the PICKLOADED special service (exclusive); if no such
PICKLOADED stop exists, use the last pickup stop. Collect all the s ship units that are picked up at these
stops and repeat the above two steps to pack them into the remaining child equipment.
3. Repeat the above two steps until all pickup stops are processed; either all s ship units are packed, or there
is no solution.
For the above example, the packing process happens in the following fashion for a combo equipment with 3
child equipment ('CE'):
SSU1: pickup at stop 1, drop off at stop 7
SSU2: pickup at stop 2, drop off at stop 6
SSU3: pickup at stop 2, drop off at stop 5
SSU4: pickup at stop 3, drop off at stop 4
From the first stop to the first PICKLOADED stop (2, exclusive), we have SSU1, which is packed as follows:
from the last stop back to its previous DROPLOADED stop (6), exclusive, SSU1 is dropped at these stops, so
it is sent as the packing item, with all child equipment (CE1, CE2, and CE3) as packing resources, to conopt.
Suppose OTM packs SSU1 into CE1. Going forward, there are no more PICKLOADED stops, so the range
runs to the last pickup stop; SSU2, SSU3, and SSU4 are picked up at these stops, and OTM packs them using
the remaining child equipment, CE2 and CE3. From the last stop back to its previous DROPLOADED stop (6),
exclusive, SSU2, SSU3, and SSU4 are not dropped off at stop 7, so they remain unpacked. From the
DROPLOADED stop (6), going backward, OTM does not find another DROPLOADED stop, so it uses the first
drop-off stop; SSU2, SSU3, and SSU4 are dropped at these stops, so OTM sends them as packing items, with
the remaining child equipment CE2 and CE3 as packing resources, to conopt. Suppose SSU2, SSU3, and
SSU4 are packed into CE2 and CE3.
Child equipment groups are sent into conopt based on their sequence number. In the case of LIFO multistop,
tSShipUnitGroups are sent into conopt based on their pickup stop numbers. TSShipUnits that are picked up
early will be packed into the child equipment with a lower sequence number. Given LIFO, child equipment picked
up early will be dropped off later.
For non-LIFO cases, the TSShipUnitGroups are also sent based on their pickup stop numbers; the earlier they
are picked up, the earlier they are sent to conopt, to be packed into child equipment with a lower sequence number.
But the child equipment picked up early may be dropped off early as well.
PLANNING PARAMETER
A new planning parameter, CHECK STOP SPECIAL SERVICE IN EQUIPMENT PACKING, was added, defaulting
to false.
When this parameter is false, the equipment packing logic remains as is, i.e., equipment packing will
not happen for combo equipment on multistop shipments, and packing for single equipment
remains unchanged.
When this parameter is true, the logic of packing multistop LIFO shipments into combo equipment
described above will be invoked. If a pickup stop has the PICKLOADED special service, it is considered pick
loaded. Otherwise, if it has LOAD or no LOAD/PICKLOADED special service, it is considered live load. If a drop-off stop
has the DROPLOADED special service, it is considered drop loaded; otherwise, if it has UNLOAD or no
UNLOAD/DROPLOADED special service, it is considered live unload. We discussed above how the special
services, and the SSUs picked up/dropped at different locations, can make a multistop shipment infeasible to
pack into a combo equipment. This can also happen when packing into a regular equipment. For
example, P ---> P (PICKLOADED) ---> D is an invalid shipment for a regular equipment group:
because the second pickup stop is PICKLOADED, we need one equipment for stop 1 and another
equipment for stop 2. We need a combo equipment for this shipment.
ONE THING TO NOTE: From the combination equipment packing perspective, the PICKLOADED special
service is ignored if it is on a drop-off stop, and the DROPLOADED special service is ignored if it is on a pickup stop.
Current multistop logic does not form a multistop shipment with 2 or more pickup stops with the
PICKLOADED special service, or 2 or more drop-off stops with the DROPLOADED special service, when the
parameter CHECK STOP SPECIAL SERVICE IN EQUIPMENT PACKING is set to false.
With glog.invoice.adjustments.createNewInvoiceForAdjustments=true:
Given an adjustment to an Approved Invoice, the adjustment amount will be captured in a new invoice.
Given an adjustment to an Unapproved Invoice, the adjustment amount will be added to the existing
invoice as an invoice line.
With glog.invoice.adjustments.createNewInvoiceForAdjustments=false:
Given an adjustment to an Approved Invoice the adjustment amount will be added to the existing invoice
as a new invoice line.
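The routing of an adjustment amount can be summarized in one small function. This is an illustrative sketch that reduces the glog.invoice.adjustments.createNewInvoiceForAdjustments property to a boolean flag; the actual invoice processing in OTM is of course richer.

```python
def route_adjustment(invoice_approved: bool,
                     create_new_for_adjustments: bool) -> str:
    """Where an adjustment amount lands, per the property described above:
    a new invoice is created only for adjustments to approved invoices
    when the property is true; otherwise the amount is added to the
    existing invoice as a new line."""
    if invoice_approved and create_new_for_adjustments:
        return "new invoice"
    return "new line on existing invoice"
```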
STEPS TO ENABLE
Given an adjustment to an Approved Invoice, the adjustment amount will be captured in a new invoice.
Given an adjustment to an Unapproved Invoice, the adjustment amount will be added to the existing
invoice as an invoice line.
Stay ahead of the game with Oracle Logistics Network Modeling Cloud (LNM), a simple and convenient way to
perform strategic and tactical analyses of your transportation network using real-world operational data.
Whether you are determining the impact of routing via a cross-dock vs direct, quantifying potential savings with
adjusting shipping and receiving hours at the distribution center, or trying to understand the impact of an
increase in rates to your transportation budget, LNM is an intuitive tool that allows you to perform detailed what-
if scenarios within the context of your operational environment, offering a richer and more accurate set of results.
OVERVIEW
Supply chains and their associated logistics networks are becoming increasingly complex as we respond to
business challenges such as globalization, omnichannel fulfillment, supply chain risk mitigation, and mergers
and acquisitions. Many companies deploy point-based solutions that only solve specific operational issues,
generating data that can be difficult to effectively consolidate and analyze from an overall network perspective.
In this new demand-driven environment, strategic and tactical analysis is essential in creating a robust, resilient,
and profitable logistics network.
Supply chain managers constantly deal with change and disruption. Some situations where they would like to
determine the impact of a planned change include:
Current applications that are available to perform such analysis often prove inadequate, as they typically utilize
simplified models of the logistics network and operate on estimated aggregate costs using historical data. The
optimization and planning algorithms used in the analysis can also be different from your actual transportation
operations, and usually result in policies that cannot be implemented because they do not translate effectively
to real-world conditions.
Oracle Logistics Network Modeling Cloud provides a simple, intuitive and convenient way to perform strategic
and tactical analyses of your transportation network using real-world operational data – all within the context of
your operational environment. LNM allows you to perform detailed what-if scenario analysis using the
operational details of your existing transportation network. It uses the same rules, policies, and planning
algorithms that you normally employ in your transportation operations. This leads to highly accurate results that
show the actual impact of the changes to your operations. Since everything is in the context of your actual
operational network, identified changes and responses can be easily deployed as needed.
SCENARIO MANAGEMENT
Logistics Network Modeling Cloud allows you to quickly define the different scenarios you wish to analyze in the
context of your current operational environment. Multiple types of analyses can be performed simultaneously as
independent projects. Each project can contain multiple different scenarios to capture specific variations of data,
rules, and policy changes that you may want to compare. All changes are isolated to the particular scenario and
do not impact the actual operational data in any way. LNM makes it easy to analyze the results and compare the
different scenarios side-by-side. The Scenario Analysis Workbench is a multi-pane configurable view that
provides an easy way to view the resulting shipment plan, drill to the associated details, or view them on the
Map.
Scenario Analysis Workbench
The Scenario Analytics Dashboard is another useful view that allows you to compare scenarios using multiple
common metrics across many dimensions such as cost, utilization, etc. You can define your own custom
metrics and visualizations, and compare multiple scenarios using the actual shipment data. By using the same
metrics you use to measure your operational performance, you’ll be able to understand the impact of potential
changes and determine the best response.
When analyzing the scenarios, Oracle Logistics Network Modeling Cloud performs the same planning steps
that you would in your operational plans, and uses your actual operational data to produce results showing you
the impact to your operations. There are no approximations or aggregations of any kind, and your actual
operational data is used, overlaid with the changes specific to the scenario. Your daily operational planning is
performed to determine the impact to your operations.
Support for both strategic and tactical scenarios. Users can model and analyze quick-running tactical
what-if scenarios right in their operational environments to optimize operations. A separate modeling
environment can be employed for longer-running strategic analyses, ensuring no impact to operations
from these performance-intensive analyses, if desired.
Specify key criteria such as order sets, time duration, etc., to constrain and shape the analysis to better
reflect real-world operational conditions.
Use actual operational data, overlay changes and add additional data, as needed, to analyze scenarios
as accurately as possible.
Replicate operational planning processes exactly, including the ability to run multiple linked daily/weekly
plans in sequence or in parallel.
Use advanced visualization capabilities in Oracle Transportation Management Cloud to view and
analyze the shipment plan details, including stop-level details, for each scenario. Compare scenario
results side-by-side.
Use packaged and custom key performance metrics and associated dashboards that support a variety of
slice-and-dice, drill-down and ad-hoc query mechanisms to better understand and compare multiple
scenarios side-by-side. LNM’s analytical capabilities are delivered using Oracle’s best-in-class business
intelligence technology.
Store scenario analyses for future reference. This allows for past analyses to be referenced and utilized
when similar risks or scenarios arise elsewhere in the network, allowing for continuous learning and
improvement of strategies.
Strategic scenario management allows you to optimize your logistics operations long term. It typically involves
modeling changes in key business conditions and then analyzing the impact to the logistics network over a
longer period. Resulting policies may require a network configuration change or a response strategy that could
have a high impact to the network. But this often leads to significant and high-value changes resulting in
considerable savings.
Tactical scenario management involves analyzing different options to determine the optimal approach to fulfill
the current operational demand and to improve your current network. Typically, operational systems present the
user with a single shipment plan based on a single objective – usually to minimize costs. Oracle Logistics
Network Modeling Cloud allows analysis, within the context of the daily business process, of different logistics
strategies simultaneously for the same operational data to determine what the best strategy is today. This
provides a new way to optimize operations that is simply not available from other systems. Executing the
resulting solution typically requires little to no network change while providing significant savings over current
operations.
Since LNM is already available in your Oracle Transportation Management Cloud environment, there is no
additional setup or integration needed to immediately use its capabilities. It uses the same entities and concepts
as OTM and is intuitive and easy to use, requiring no special training.
The ability to perform strategic and tactical scenario analyses using real-world operational data can lead to
significant savings. Modeling the impact of potential changes to your logistics network using actual
operational data, and determining the optimal response, ensures that your network is always operating at its
best and that you meet your service levels most efficiently.
Since LNM is an intrinsic part of OTM, you can quickly analyze and easily implement changes to your network
settings, design and policies. This leads to a resilient supply chain that can easily adapt to the constant change
you face. You can easily perform supply chain risk analyses, such as Network Disruption Risk or Supplier
Change Risk, and ensure that you are always prepared with the optimal response plan for each possible risk.
STEPS TO ENABLE
CONFIGURATION
LNM can be run in a number of different configurations. How and where you choose to run LNM will be mostly
determined by the types of modeling you intend to do. Below are just a few options you might consider for
running LNM:
1. Run LNM on the same Production or Test instances you run OTM – either in a separate LNM domain
(with no access to your operational data) or in a separate domain with read access to your existing
operational data.
a. This is a good option if you plan to use LNM to do quick daily simulations of today's or yesterday's
operation.
2. Run LNM on a separate test instance – typically this setup will involve performing a Production to Test
(PtT) move of your production data to your LNM instance.
a. This is a good option if you intend to use LNM for more involved projects that involve large
amounts of data and require the running of a large number of bulk plans. For example, running a
project that involves a daily bulk plan simulation run covering a year’s worth of order data.
Using LNM involves three basic steps - Setup, Solve, and Analyze.
1. Setup
a. Define your Modeling Project and the Modeling Project Level Data that will be used for all the
related Modeling Scenarios.
b. Define the various Modeling Scenarios related to your project.
c. For each Modeling Scenario define and configure the data source and any data changes required
to model the scenarios.
2. Solve
a. Run Bulk Plans for all of the Modeling Scenarios in your Project.
3. Analyze
a. Compare Modeling Scenario Bulk Plan Results.
b. Perform Workbench Review of your Modeling Project and the related Modeling Scenarios.
c. Analyze your Modeling Project using Logistics Network Modeling Intelligence:
1. Move (Modeling Project) Data to Analytics
2. Build Ad Hoc Reports and Dashboards
3. Analyze Results
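The Setup, Solve, and Analyze phases above can be sketched as a small driver. This is a purely hypothetical illustration: none of these class or method names exist in the actual OTM/LNM product, and the "solve" step is stubbed; it only shows how a project, its scenarios, and the three phases relate.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch only - not an OTM/LNM API.

@dataclass
class ModelingScenario:
    scenario_id: str
    data_changes: list = field(default_factory=list)  # data rule instances, overrides
    results: Optional[dict] = None

@dataclass
class ModelingProject:
    project_id: str
    scenarios: list = field(default_factory=list)

    def solve(self):
        # Phase 2: run a Bulk Plan for every scenario (stubbed here).
        for s in self.scenarios:
            s.results = {"scenario": s.scenario_id, "status": "PLANNED"}

    def analyze(self):
        # Phase 3: compare scenario results side by side.
        return [s.results for s in self.scenarios]

# Phase 1: Setup - define the project and its scenarios.
project = ModelingProject("DC1 VS DC2", [
    ModelingScenario("DC1 AS SOURCE LOCATION"),
    ModelingScenario("DC2 AS SOURCE LOCATION"),
])
project.solve()
print(project.analyze())
```

The point of the sketch is the containment: a project holds scenarios, solving runs per scenario, and analysis reads results back across all of them.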
Before you start working with LNM it's important to clarify the intention of your project and the different modeling
scenarios you wish to compare. Mapping out your project, the various scenarios, and any related data,
parameters, and data modifications will help streamline the setup required in LNM.
For this example, the LNM project is designed to compare the cost of sourcing a representative set of daily order
releases from two different distribution center locations: DC1, located in Ontario, CA, USA, and DC2, located in
Compton, CA, USA. The two DC operations are roughly 50 miles apart and both are viable locations for this DC.
The goal is to model and compare the different transportation costs - in terms of miles, hours, and dollars -
associated with servicing the representative set of orders from these two source locations.
Setup - Involves setting up the Modeling Project and related Modeling Scenarios.
Modeling Project and Modeling Scenarios - Once you have defined your project objective, the first step is to
create your Modeling Project. The Modeling Project describes the overall project and links to the different
Modeling Scenarios you want to compare within your Modeling Project. The Modeling Project object allows you
to define defaults that can be used for all of the related Modeling Scenarios - this can help to simplify your
Modeling Scenario setup.
Modeling Project
Example Modeling Project - DC1 v DC2
Modeling Project ID - Add a meaningful Modeling Project ID for your project. There is a strong possibility
that you will generate many projects with slightly different objectives, so having a good way to identify and
differentiate between your many projects based on the ID will be very beneficial.
Modeling Project ID = DC1 VS DC2
Modeling Project Name - Provide a name for your project.
Modeling Project Name = COMPARE ORDER SOURCING FROM DC1 V DC2 - like the ID - the
more descriptive the better.
Description - Provide a good description that captures the purpose of this project and the scenarios
involved.
Description = In this project there will be two scenarios - one Modeling Scenario using DC1 as the
source location and a second scenario using DC2 as the source location. The same set of orders
will be used for both scenarios.
Parameter Set ID - Allows you to set a Parameter Set ID at the project level to be used by all of the
related scenarios.
Parameter Set ID = Null - In this project the default parameter set will be used - so no default
required.
Saved Query Type and "Order Saved Query ID" - allow you to select a different set of order releases (or
order movements) to be used in all of your project scenarios.
For this project all of the Modeling Scenarios will run against the same order release saved query.
Saved Query Type = Order Release
Order Saved Query ID = DC1 V DC2 SOURCING PROJECT.
Itinerary ID and Itinerary Profile ID - allows you to set the default values at the project level for the
Itinerary or Itinerary Profile that will be used by all your project's Modeling Scenarios.
In this project all available itineraries will be considered - so no default used/required.
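The project-level defaults described above can be summarized as a plain mapping. This is illustrative only: the keys mirror the UI field labels and the values are taken from the example in the text; it is not an OTM data structure.

```python
# Illustrative summary of the example Modeling Project header - not an OTM object.
modeling_project = {
    "modeling_project_id": "DC1 VS DC2",
    "modeling_project_name": "COMPARE ORDER SOURCING FROM DC1 V DC2",
    "description": ("Two scenarios: one using DC1 as the source location, "
                    "a second using DC2. The same set of orders is used for both."),
    "parameter_set_id": None,          # default parameter set is used, so no default required
    "saved_query_type": "Order Release",
    "order_saved_query_id": "DC1 V DC2 SOURCING PROJECT",
    "itinerary_id": None,              # all available itineraries are considered
    "itinerary_profile_id": None,
}

# Project-level values act as defaults for every scenario in the project.
print(modeling_project["order_saved_query_id"])
```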
Modeling Scenario - Create the Modeling Scenarios for your Project. LNM provides you with a number of
extremely powerful data selection, data change, and Bulk Plan simulation tools that you can use to configure
your Modeling Scenarios. The six major capabilities you have at your disposal to configure your Modeling
Scenarios are:
1. Saved Query Type/Order Saved Query - this feature allows you to easily select a different set of orders
to consider in each of your Modeling Scenarios.
2. Parameter set, Itinerary or Itinerary profile - setting different values for any one of these fields allows
you to easily consider different options in each of your Modeling Scenarios.
For example - by simply changing the Itineraries to consider in each Modeling Scenario you can
model different mode options, different equipment group options, or different network options.
3. Parameter Overrides - this feature allows you to set different parameter values for each of your Modeling
Scenarios without having to create different Planning Parameter Sets or Logic Configurations to model
the different settings.
For example - you can run multiple Modeling Scenarios using the same base Parameter Set, but in
one scenario you can override the default value of the HOLD AS LATE AS POSSIBLE parameter to
see which value provides the best result.
4. Scenario Data Changes - this feature allows you to alter a specific value in your Modeling Scenarios.
For example - you can run different Modeling Scenarios where you change the number of Stops
Included In a Rate.
5. Data Rule/Data Rule Instance - Data Rule/Data Rule Instances allow you to do a virtual mass updates
to your data. Data Rules allow you to set values, increase values, decrease values or clear values in
your data.
For example - you can use a data rule to set the locations on a set of order releases, or to increase
or decrease the weight and volume on a set of orders.
6. Bulk Plan Specification - a Bulk Plan Specification provides you with the ability to run multiple bulk
plans, all within the context of one Modeling Scenario. With this feature you can simulate the running of
Bulk Plans starting at different times of the day, based on different order grouping options (like group by
source or destination location), or based on different saved queries - or you can combine a set of these
capabilities to define the Bulk Plans to run based on time of day, by saved query, and by groups.
For example - you can set up a Bulk Plan Specification that runs different Bulk Plans in one
Modeling Scenario, where each Bulk Plan is based on a different Order Saved Query and the
different saved queries simulate the running of a day's worth of orders split into 3 separate
regional Bulk Plans. This scenario can then be compared to another Modeling Scenario where
the same set of orders is set up using a Bulk Plan Specification with 2 Bulk Plans instead of 3.
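Capability 3 above (Parameter Overrides) can be sketched as a simple overlay of per-scenario values on a shared base parameter set. The parameter name comes from the example in the text; the base value and the merge function are assumptions for illustration, not OTM's implementation.

```python
# Hedged sketch of Parameter Overrides: scenarios share one base Parameter
# Set and overlay only the values they change. The base value below is an
# assumed default for illustration.
base_parameters = {"HOLD AS LATE AS POSSIBLE": False}

def effective_parameters(base, overrides):
    """Return the base parameter set with scenario-specific overrides applied."""
    merged = dict(base)
    merged.update(overrides)
    return merged

scenario_a = effective_parameters(base_parameters, {})
scenario_b = effective_parameters(base_parameters, {"HOLD AS LATE AS POSSIBLE": True})
print(scenario_a, scenario_b)
```

The base set is never mutated, which mirrors the point in the text that scenario changes are isolated and do not touch operational data.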
Scenario UI
Example Modeling Scenario For DC1
The available Data Rule objects include: Capacity Usage, Itinerary, Itinerary Leg, Location, Order
Movement, Order Release, Rate Geo, Rate Offering, Rate Quality, Routing Network, Routing Network
Detail, Routing Network Leg.
For this project there will be one Modeling Scenario Data Rule with two Data Rule Instances. The
Modeling Data Rule is configured to allow for the definition of the Source Location on the Order
Release. The two Data Rule Instances - one per scenario - will be used to set the source
location of the Order Releases: to "DC1" for the DC1 AS SOURCE LOCATION Modeling
Scenario, and to "DC2" for the second Modeling Scenario.
Data Rule
Modeling Scenario Data Rule and Data Rule Instance - the Modeling Scenario Data Rule allows you
to establish the definition of the elements that will be configurable in the Data Rule Instance. The Data
Rule Instance defines the changes you wish to make to the base data as part of a specific
scenario. The additional level of configurability provided by the Modeling Scenario Data Rule allows
your LNM Super User to define the available Data Rules that your LNM modeling users can then use to
configure their Data Rule Instances on their Modeling Scenarios. By predefining the Data Rules, the
options and decisions required of the LNM users when setting up their various scenarios can be
greatly simplified.
Sequence Number - defines the sequence in which the data rule instances are applied on the scenario
data during the execution of the scenario bulk plan.
Sequence Number = 1 - only 1 Data Rule Instance
Data Rule Instance ID - a data rule instance is one component of data rules. You create a Data Rule
Instance using a data rule definition as a basis. Data Rule instances are then assigned to a scenario to
perform data changes for that scenario.
Data Rule Instance ID = CHANGE TO DC1 - the Data Rule Instance for this Modeling Scenario
Defining a Data Rule Instance
Enter a unique Data Rule Instance ID.
CHANGE TO DC1
Enter a Description of the Data Rule Instance.
Change source location to DC1
Enter a Data Rule Definition ID - this links the Instance to a previously Defined Data
Rule.
CHANGE SOURCE LOCATION
The Table Name will be populated based on the object defined in the data rule
definition.
ORDER_RELEASE
A Rule Group provides you with an optional way to group rules, which can be used
to retrieve records in the finder.
Null
Enter the data rule instance parameters:
Enter the Parameter Name. The Column Name will display.
DC CHANGE
SOURCE_LOCATION_GID
Enter the Operand. The operand tells OTM how to modify the column value.
Operands are pre-defined and public. The list of available operand values
varies based on the Column Type (String, Date, UOM). For example, for
String/Gid, the public operands are CLEAR, SET, PREPEND, APPEND. The
CLEAR operand will not be shown for any non-nullable columns in the user
interface.
SET
Enter a Value. The value fields are populated based on the operand. For the CLEAR
operand, a value is not expected, so the Value field is hidden when CLEAR is
selected; for all other operands a value is required. A formula is associated
with each operand and is defined in the LNM_OPERAND table. For example,
the operand INCREASE BY DAYS is available when the column type is
DATE/TIME and the expected value is a number.
OOTB.DC1
Saved Query Filter - a Saved Query Filter ID which returns the list of objects on which the data rules
defined on the instance will be applied.
DC1 V DC2 SOURCING PROJECT
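The operand behavior described above can be sketched as a small transformation function. The operand names (CLEAR, SET, PREPEND, APPEND) come from the text; the function and the sample starting value are illustrative assumptions, not the LNM_OPERAND formulas themselves.

```python
# Sketch of how a Data Rule Instance operand might transform a column value.
# Illustrative only - not the actual LNM_OPERAND implementation.

def apply_operand(current, operand, value=None):
    if operand == "CLEAR":
        return None          # not offered for non-nullable columns
    if operand == "SET":
        return value
    if operand == "PREPEND":
        return value + current
    if operand == "APPEND":
        return current + value
    raise ValueError(f"Unknown operand: {operand}")

# The DC1 scenario's instance: SET SOURCE_LOCATION_GID to OOTB.DC1.
# The starting value below is hypothetical.
order_release = {"SOURCE_LOCATION_GID": "OOTB.OLD_DC"}
order_release["SOURCE_LOCATION_GID"] = apply_operand(
    order_release["SOURCE_LOCATION_GID"], "SET", "OOTB.DC1")
print(order_release)
```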
DC2 Scenario
Modeling Scenario Data Rule Instance DC2
The setup for the second scenario in this project is exactly the same as the steps above for the first
scenario with the only difference being that the source location specified on the Data Rule Instance
assigned to the second scenario will be DC2 instead of DC1.
Solve - Once the Modeling Project and Modeling Scenarios have been set up, the next step is to run a
Scenario Bulk Plan for all the Modeling Scenarios in your project.
Scenario Bulk Plan - Running the Scenario Bulk Plan action from the project will allow you to kick off Bulk Plan
runs for all (or some) of the scenarios in the project.
Scenario Bulk Plans Running
Scenario Bulk Plan Output – As the Bulk Plans are running for your project the Scenario Bulk Plan
Output screen provides you with visibility into the status of each of the scenario bulk plans as they are
running.
Analyze - Once the Scenario Bulk Plans for your project are complete, the next, and final, step is to analyze
the results. LNM provides you with a number of powerful tools to interpret the results of your modeling
simulations.
Comparing the Modeling Scenario Bulk Plan Results - provides a quick overview of the key scenario
metrics - total cost, number of miles, number of shipments etc.
Scenario Bulk Plan Compare
Perform Workbench Review - review the solution in a Workbench to do side-by-side detailed solution
analysis of the Modeling Scenarios - review the shipments at a detailed level - using table views for the
LNM shipment data or geographically on Modeling Scenario specific map displays.
Scenario Workbench Analysis - Side-by-Side Modeling Scenario Map View
Moved Data to Analytics - Tables Moved
Logistics Network Modeling Intelligence - Analyze Results: create Ad Hoc Queries, dashboards, and
reports to analyze your project(s) and scenario(s)
Ad Hoc Query with Logistics Network Modeling Intelligence
Dashboard
The flex field option is available for the following Constraint Sets:
Aggregation
Grouping
STEPS TO ENABLE
The standard steps required for defining a Logic Configuration and Constraint Sets are required to take
advantage of this feature.
2.
a. Declaration-Line Aggregation
b. Declaration-Line Grouping
3. Select one of the Flex Field options:
a. Date Flex Field
b. Number Flex Field
c. String Flex Field
4. Then based on the option selected above define your Constraint Details:
a. Date Flex Fields
b. Number Flex Fields
c. String Flex Fields
The flex field option is available for the following Data Configuration Association Types:
Transaction to Declaration
Shipment to Transaction
Order Release to Trade Transaction
STEPS TO ENABLE
The standard steps required for defining Data Configuration are required to take advantage of this feature.
TIPS AND CONSIDERATIONS
The following steps are required/should be considered when setting up the corresponding Manager Layout:
The report is accessed via Business Process Automation > Reporting > Report Manager > License Assignment
Report.
You will be prompted to enter parameters such as the License Line ID, the Start and End Date and the Delivery
Report Format (e.g. PDF, CSV). The system will use these parameters to limit the range of data to be included
in the report.
License Identification
License ID
License Category, Type and Code
Jurisdiction and Regime
Effective Date
License Line Identification
License Line ID
Line Type
Line Description
Authorized Quantity
Authorized Value
Opening Quantity and Value Balances
For each trade transaction line or adjustment, the following information is provided
Date
User
Document (Transaction Line or Adjustment)
Quantity
Quantity Balance
Value
Value Balance
STEPS TO ENABLE
DISPLAY STOPLIGHT FOR RESTRICTED PARTY SCREENING ON TRANSACTION AND
DECLARATION
This feature provides you with the ability to visualize the Restricted Party List Screening (RPLS) Status of the
involved parties in a trade transaction using an indicator. The application picks the RPLS status from the party
master and reflects it with different indicators.
White solid circle: RPLS_NOT STARTED. The RPLS process hasn't been executed on the party.
Yellow triangle with an exclamation symbol inside: Status is either RPLS_REQUIRES REVIEW or
RPLS_ESCALATED
RPLS_REQUIRES REVIEW: The RPLS has been executed on the party and there is a potential
match.
RPLS_ESCALATED: The RPLS has been executed on the party and one or more denied parties
are set as Escalated Match.
Red circle with an exclamation symbol inside: RPLS_FAILED. The RPLS has been executed on the party
and there is a confirmed/verified match.
Green circle with a check symbol inside: RPLS_PASSED. The RPLS has been executed on the party
and either no matches were found or matches were found and all were marked as 'Not a Match'.
Stoplight Indicator
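The status-to-indicator mapping described above can be expressed as a simple lookup table. The status values come from the feature description; the dictionary and function are illustrative, not GTM code.

```python
# The RPLS status -> stoplight indicator mapping from the feature above.
# Illustrative lookup only - not GTM's implementation.
RPLS_INDICATORS = {
    "RPLS_NOT STARTED": "white solid circle",
    "RPLS_REQUIRES REVIEW": "yellow triangle with exclamation",
    "RPLS_ESCALATED": "yellow triangle with exclamation",
    "RPLS_FAILED": "red circle with exclamation",
    "RPLS_PASSED": "green circle with check",
}

def stoplight(status):
    """Return the indicator shown for a party's RPLS status."""
    return RPLS_INDICATORS[status]

print(stoplight("RPLS_PASSED"))
```

Note that RPLS_REQUIRES REVIEW and RPLS_ESCALATED deliberately share the same yellow-triangle indicator, as the feature description states.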
STEPS TO ENABLE
From the shipment group, you can use the SmartLink ‘View Related Trade Transactions’ to view the related
trade transactions.
STEPS TO ENABLE
You can use standard Screen Set Manager: SmartLinks functionality to remove the View Trade Transaction
smart link from the Shipment Group manager.
STEPS TO ENABLE
TIPS AND CONSIDERATIONS
If you do not specify a Product Classification Type ID and hit OK, then the previous behavior will be provided,
i.e., all the current Product Classification Types and Codes for the Item will be displayed.
STEPS TO ENABLE
Item
Trade Item Structure
Trade Item Structure Component
Party
Trade Transaction
Trade Transaction Line
Customs Shipment (Declaration)
Customs Shipment Line (Declaration Line)
Contact
In addition, you can now configure a workbench for your restricted party list screening work queue. The
capabilities specifically support:
STEPS TO ENABLE
7.
7. Select Default Work Queue (e.g.: PARTY_WQ_REQUIRES REVIEW_WORK_QUEUE)
8. Complete the process by clicking the OK button
9. Then select the Work Queue PARTY_WQ_REQUIRES REVIEW_WORK_QUEUE at the top of the
screen. The application will bring in the information of the parties based on the work queue
10. Add the content for the bottom part of the workbench
a. Select Component Type 'table'
b. Select Object Type 'Party Matched Restricted Party'
c. Select the Tab Name (e.g.: Restricted Parties Potential Matches)
d. Select the Screen Set. There is a public screen set GTM_PARTY_SCREENING
PUBLIC that you can use for this purpose
e. Check the Detail table check box
f. Select the Parties To Be Reviewed Saved Search PARTY SCREENING MATCH
QUERY PUBLIC. This is a public query.
11. Finalize the definition of the layout by clicking the DONE button at the top right of the
screen. The application will populate the information on the screen
The system will assign the quantity of records for the time indicated in the work queue parameters (e.g.: 10
records for 5 minutes). If the user comes back to a record after the time threshold has passed and tries to run
an action on it, the system will throw an error indicating that the user no longer holds the record.
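The work-queue lease behavior just described can be modeled as a small class: records are assigned for a fixed duration, and actions after expiry are rejected. This is a behavioral sketch only - the class, its names, and the injectable clock are illustrative assumptions, not GTM's implementation.

```python
import time

# Hedged sketch of the work-queue record lease described above - not GTM code.
class WorkQueueLease:
    def __init__(self, records, duration_seconds, now=time.time):
        self._now = now                              # injectable clock for determinism
        self.expires_at = now() + duration_seconds
        self.records = list(records)

    def run_action(self, record):
        # Reject actions once the user's time threshold has passed.
        if self._now() > self.expires_at:
            raise RuntimeError("User no longer holds the record")
        return f"action executed on record {record}"

clock = [0.0]                                        # fake clock so the example is deterministic
lease = WorkQueueLease(range(10), 300, now=lambda: clock[0])  # 10 records for 5 minutes
print(lease.run_action(3))
clock[0] = 301.0                                     # past the threshold: further actions fail
```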
When the user tries to run an action on a restricted party that is not in the current content version, the system will
throw an error indicating that the Restricted Party is invalid.
About Supplier Solicitation - Introduction to the campaign management and supplier solicitation process
in GTM
Supplier Solicitation: Business Process - Details around the business process for supplier solicitation
Supplier Solicitation: Configuration Steps - Steps to configure GTM for campaign management and
supplier solicitation
Supplier Solicitation: Process Steps - Information on the major steps in the campaign management and
supplier solicitation process
STEPS TO ENABLE
If you plan to have your suppliers use the supplier portal to manage and respond to a solicitation, review the
Supplier Access Configuration details section of Configuration Steps to configure supplier access.
GTM HOW TO/CONFIGURATION TOPIC - PRODUCT CLASSIFICATION PROCESS
This feature provides you with a new GTM How To/Configuration topic 'Product Classification Process' that
covers product classification. This How To topic will provide you with the information you need to properly use
Product Classification in GTM.
STEPS TO ENABLE
STEPS TO ENABLE
The standard steps required for defining a Service Configuration are required to take advantage of this feature.
2.
The complete Service Preference Configurations considering the previous parameters would be:
dataVersionList=null, dataSource=null, matchEngine=InverseIndex, threshold=0.5,
excludeWords=null, listCode=null, screeningFieldParameter=RPLS.SERVPARA-INVERSEINDEX
When executing the Review Match Factor on a Party, you will be able to select the Service Preference that
includes the Inverse Index as a parameter.
Companies might have to tune the threshold and weights for the service parameters on the Service
Preference Configuration according to their business needs.
STEPS TO ENABLE
Item Origin Detail
STEPS TO ENABLE
The Suppliers grid on the Item has been deprecated and will be removed in a future release.
The Review Screening Results action from the Parties action menu
Within the details of the View Party Screening Results link associated with involved parties on the Trade
Transaction and Declaration
STEPS TO ENABLE
From product classification type, you can use the SmartLink ‘View Related Trade Programs’ to view all trade
programs related to a particular product classification type.
STEPS TO ENABLE
SMARTLINKS BETWEEN PRODUCT CLASSIFICATION CODE AND TARIFF RATES
This feature enables you to view the tariff rates related to product classification codes.
From the product classification code, you can use the SmartLink ‘Related Tariff Rates’ to view all tariff rates
related to a particular product classification code.
STEPS TO ENABLE
About License Screening - Introduction to the license management and license screening process in GTM
License Screening Business Process - Information on the business process for license screening
License Screening Configuration Steps - Steps to configure GTM for license management and license
screening
License Screening Process Steps - Details on the major steps in the license screening process
STEPS TO ENABLE
AES ENHANCEMENTS
This feature provides enhancements to the AES Filing capability. Based on regulatory changes, the AES
Template related to the X12.601 Customs & Border Protection (CBP) - Export Shipment Information
specification has been updated as follows:
Support regulatory changes for 9X515 codes - ECCN US codes beginning with 9A515, 9B515, and
9C515 are now treated similarly to the Series 600 codes in the X107 element
Enable EIN (Employer Identification Number) to be used as Filer Type ID - The ISA05 element supports
EIN as well as DUNS
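The 9X515 rule above amounts to a prefix check. The code prefixes come from the feature description; the function itself is a sketch, not the AES template logic.

```python
# Sketch of the 9X515 rule: ECCN US codes beginning with 9A515, 9B515, or
# 9C515 get Series-600-style handling in the X107 element. Illustrative only.
SERIES_600_LIKE_PREFIXES = ("9A515", "9B515", "9C515")

def treated_as_series_600(eccn_code):
    """True if the ECCN code receives Series-600-style handling in X107."""
    return eccn_code.startswith(SERIES_600_LIKE_PREFIXES)

print(treated_as_series_600("9A515.a"))   # a 9X515 code
print(treated_as_series_600("7A005"))     # an unrelated code
```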
STEPS TO ENABLE
You can:
Create a new GTM Trade Transaction from an OTM Order Release
Propagate changes from an OTM Order Release to an existing GTM Trade Transaction
Remove an OTM Order Release from an existing GTM Trade Transaction
STEPS TO ENABLE
The standard steps required for defining a Data Configuration are required to take advantage of this feature.
The standard steps required for executing Actions are necessary to run this process.
This feature differs from the OTM Shipment to GTM Trade Transaction feature in that it does not require you to
set up Data Configuration Rules. The application will ask directly for the Data Configuration at the time of
executing the process.
CAMPAIGN MANAGEMENT
This feature enables you to manage campaigns and solicit documents and other information from your trading
partners. You can collect information from many suppliers simultaneously. As a campaign administrator, you
can:
Define the details of the campaign including parameters for administering a campaign, documents you
want to request, and processing details such as data configuration and logic configuration
Initiate a campaign for collecting documents and data including qualification information
Notify trading partners when they need to respond to a campaign
Monitor a campaign and review responses from your trading partners
Approve or decline a submission from a trading partner
Copy data from the campaign line to the item origin on the item upon approval
Data Configuration – determines which data to copy among GTM objects. For example, when a supplier
responds to a campaign in the campaign line, they will provide important information like country of origin
and a Certificate of Origin document that you want to copy back to the item origin on the item
When creating a campaign, you can use the Association Type = Partner Item to Campaign to copy
information from your Item or Partner Item to your Campaign Line.
When approving a campaign, you can use the Association Type = Campaign Line to Item to copy
information from your Campaign Line to your Item.
Logic Configuration – defines the details of the campaign workflow configuration. The Logic Configuration
Type you want to use is GTM CAMPAIGN CONFIGURATION, and it includes details such as:
General – specify the involved party qualifiers to be used for various stakeholders such as the
owner contact, receiving contact, and sending legal entity. You can also specify whether the
qualification details should be displayed to the trading partner.
Campaign Creation – specify information to be used when a campaign is created including the
data configuration and business number generator for the campaign ID.
Campaign Notification – specify regulation references and whether you want to send a document
template with the notification.
Campaign Approval – specify information to be used when a campaign is approved including the
data configuration.
To create a campaign, use the ‘Create Campaign’ action on a trading partner item. When you create a
campaign, certain information must be specified and is then copied onto the Campaign including:
Campaign Type – specify the type of campaign being created. Since logic configuration is identified in the
campaign type, this required field helps to drive the workflow of the campaign.
Product Classification Type – specify the product classification type for which you are creating a
campaign.
Campaign Administrator – identify the person who is managing the campaign.
Reminder Duration – this field is reserved for future use.
Effective and Expiration Date – enables you to specify the start and end dates of a campaign.
Purpose – enter details regarding the purpose of the campaign.
Trade Agreement – specify if the campaign is to solicit information related to a trade agreement.
Required Documents – specify if the campaign is to solicit specific documents from trading partners.
You can manage a campaign from the Campaign Manager. Each campaign will include information such as:
Campaign Type
Effective and Expiration Date
Reminder Duration (reserved for future use)
Perspective
Product Classification Type
Trade Agreement
Involved Parties
Reference Numbers, Remarks and Flex Fields
Each Campaign Line specifies basic item information for a particular trading partner and stores the response.
The campaign line includes information such as:
There are certain actions a Campaign Administrator may use to manage a campaign. A campaign administrator
can:
The Campaign Line Manager is available, which enables you to trigger actions that are specific to a campaign
line, including:
You can also create a workbench to enable users to easily manage the details of a Campaign and its
associated Campaign Lines. Saved queries have been provided to easily create a workbench for Campaign and
Campaign Line.
Once a campaign is created, a supplier or other trading partners can see campaign lines specific to their trading
partner items and party sites. GTM can notify a supplier when campaign lines are available. The notification
includes:
Instructions to complete the required data and whether they need to provide a certificate or other
documents
A link to the campaign
A deadline for submitting responses
A supplier portal is available to help a supplier or trading partner provide the requested information.
Configuration is required for the supplier portal. The supplier portal enables:
When logged into the supplier portal, a supplier has the ability to view both campaign and campaign line
information. In the Campaign Line, the trading partner or supplier can add or update certain information such as:
If a certificate or document is required, the supplier can use the Add Document action on the campaign line.
Once all the data and documents have been added to the campaign line, the supplier can use the Respond to
Campaign Line action to send the information back to the campaign administrator for review and approval.
STEPS TO ENABLE
GTM has reused the existing service provider portal that is managed via the SERVPROV domain. To use the
supplier portal, you need to perform certain configuration steps including:
VPD Profile
Access Control
User Roles
User Menu and Access
As part of the Create Campaign action, the supplier contact is not copied to the campaign line. The
recommended workaround is to use a Direct SQL agent action to copy this information at the time of
campaign creation.
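The recommended Direct SQL workaround follows a familiar pattern: an UPDATE that copies the partner's contact onto any campaign line missing one. The sketch below illustrates only the shape of such an update; the table and column names are hypothetical and are not the actual GTM schema.

```python
import sqlite3

# Illustrative sketch only: table and column names are hypothetical,
# not the GTM schema. A Direct SQL agent action would run a similar
# UPDATE when the Create Campaign action fires.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE trading_partner (partner_id TEXT PRIMARY KEY, contact_gid TEXT);
CREATE TABLE campaign_line (line_id TEXT PRIMARY KEY, partner_id TEXT,
                            contact_gid TEXT);
INSERT INTO trading_partner VALUES ('TP1', 'CONTACT.SUPPLIER_A');
INSERT INTO campaign_line VALUES ('CL1', 'TP1', NULL);
""")

# Copy the supplier contact onto each campaign line that is missing one.
cur.execute("""
UPDATE campaign_line
SET contact_gid = (SELECT contact_gid FROM trading_partner
                   WHERE trading_partner.partner_id = campaign_line.partner_id)
WHERE contact_gid IS NULL
""")
conn.commit()
contact = cur.execute(
    "SELECT contact_gid FROM campaign_line WHERE line_id = 'CL1'"
).fetchone()[0]
print(contact)  # CONTACT.SUPPLIER_A
```

The correlated subquery keys the copy off the campaign line's own partner, so the agent action can run once per campaign regardless of how many lines were created.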
KEY RESOURCES
For more details on setting up the supplier portal, please see the ‘Supplier Access Configuration’ page in Help.
For an item, you can run the Tariff Eligibility Screening action from the Item Manager to determine whether your
item, based on the associated item origin data, can take advantage of trade programs. When you trigger this
manual action, GTM displays all potential trade agreements that match the criteria.
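At its core, this kind of eligibility screening matches an item origin's country against the member nations and effective dates of each agreement. A minimal sketch, using invented sample data rather than GTM's actual matching logic:

```python
from datetime import date

# Hypothetical data, not the GTM schema: screening here simply checks
# that the item origin's country is a member nation of an agreement
# that is in effect on the screening date.
trade_agreements = [
    {"short_name": "USMCA", "members": {"US", "CA", "MX"},
     "effective": date(2020, 7, 1), "expiration": date(2099, 12, 31)},
    {"short_name": "CPTPP", "members": {"CA", "JP", "AU", "VN"},
     "effective": date(2018, 12, 30), "expiration": date(2099, 12, 31)},
]

def screen_eligibility(origin_country, on_date):
    """Return the trade agreements an item origin could potentially use."""
    return [ta["short_name"] for ta in trade_agreements
            if origin_country in ta["members"]
            and ta["effective"] <= on_date <= ta["expiration"]]

print(screen_eligibility("CA", date(2021, 1, 1)))  # ['USMCA', 'CPTPP']
print(screen_eligibility("MX", date(2021, 1, 1)))  # ['USMCA']
```

Eligibility is only the first gate; qualification, covered next, applies the agreement's rules of origin to the item itself.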
Once you have determined which item origins are eligible to use trade programs, you can determine which trade
programs your items qualify for. To determine qualification, certain criteria must be considered, such as:
Within the item, you can see the item origin details on the Trade Details tab. Within each item origin, you can
determine whether your company qualifies for the trade programs listed. You can mark each trade program for which
your company qualifies and set a status. GTM ships with an out-of-the-box status that is set for each trade
program associated with an item origin.
The trade program information is assigned to the appropriate item origin and includes supporting information
such as:
Product Classification Type and Code – defines the product classification type and code for this particular
item origin/trade program pairing.
Trade Program – specifies the trade program for a particular item origin.
Trade Agreement – if a trade agreement is associated with the trade program, it will be displayed.
Status – displays the status of a particular item origin/trade program pairing. GTM ships with the following
out-of-the-box statuses:
ELIGIBLE
DISCONTINUED
QUALIFIED
NOT QUALIFIED
Is Qualified flag – specifies if the item origin qualifies for a particular trade program.
Effective and Expiration Date – enables you to specify the start and end dates of the qualification details.
Preference Criteria – defines the origin criteria used to determine whether a good qualifies for a trade
program.
Regional Value Content Method – specifies the regional value content method used to determine if an
item origin qualifies for a trade program. This is part of a trade program's rules of origin.
Producer – identifies whether the supplier is the producer of the goods, such as the manufacturer.
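The Regional Value Content methods mentioned above follow the standard formulas from agreements such as NAFTA/USMCA: under the transaction value method, RVC = (TV − VNM) / TV × 100, and under the net cost method, RVC = (NC − VNM) / NC × 100, where VNM is the value of non-originating materials. A worked sketch (the 60% threshold is illustrative, not a statement of any agreement's actual rule):

```python
def rvc_transaction_value(tv, vnm):
    """Regional value content, transaction value method:
    RVC = (TV - VNM) / TV * 100, where TV is the transaction value of
    the good and VNM is the value of non-originating materials."""
    return (tv - vnm) / tv * 100

def rvc_net_cost(nc, vnm):
    """Regional value content, net cost method:
    RVC = (NC - VNM) / NC * 100, where NC is the net cost of the good."""
    return (nc - vnm) / nc * 100

# A good with a $1,000 transaction value containing $350 of
# non-originating materials:
print(round(rvc_transaction_value(1000, 350), 1))  # 65.0
# Against an illustrative 60% transaction-value threshold, it qualifies:
print(rvc_transaction_value(1000, 350) >= 60)      # True
```

Which method applies, and the threshold it must meet, come from the specific trade program's rules of origin, which is why GTM records the method on the item origin/trade program pairing.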
STEPS TO ENABLE
SPECIFY ITEM TYPE
To support customers with varied global trade processes that depend on whether an Item is purchased or
manufactured, Item Type is being introduced. This feature provides a new data element on the Item for
identifying the Item Type as defined in the ERP. Examples of Item Types are Purchased Item, Manufactured
Item, and Service Item. This provides users with consistency across the Oracle product suite and informs a user's
decision about which information to track for Country of Origin.
STEPS TO ENABLE
ORIGIN MANAGEMENT
This feature provides enhanced support for Origin Management capabilities on the Item. On the new Trade
Details tab of the Item, you can see an Item Origins grid. You can differentiate your item origins by inventory
organizations, partners and/or partner sites. Also, if you do not need to track item origin at a detailed level, you
can continue to use the item-level Default Country of Origin on the Trade Details tab.
You can create and manage the item origin details such as:
Inventory Organization, Partner and Partner Site – enables you to track your item origin at an inventory
organization level or at a partner/partner site level.
Country of Origin – specify the country of origin for a particular Item Origin.
Country of Manufacture – specify the country of manufacture for a particular Item Origin.
Effective and Expiration Date – enables you to specify the start and end dates of an item origin.
Values – specify value information such as purchase price for a particular Item Origin.
Flex Fields – Use the standard flex field capability to model your company’s specific needs such as
tracking origin by serial number, a range of serial numbers, lot number, and so on.
Trade Programs – Once you determine if your item origin is eligible for a trade program, the trade
program grid is populated. In addition, you can mark the Is Qualified checkbox to indicate that the item
origin qualifies for a particular trade program.
STEPS TO ENABLE
There are two ways to populate the Origin Management information on the Item:
Manually enter Item Origin data via the UI
Collect origin data from a partner using a campaign and supplier solicitation. The origin data is entered by
a supplier on a campaign line. The origin information for a particular supplier and trading partner item can
then be copied back to the item from the campaign line.
NOTE: Due to the introduction of Item Origin, the Suppliers grid on the Item has been deprecated and will be
removed from the Item in a future release.
PARTY SITE
This feature enhances the Party functionality to include the definition of Party Sites. You can
associate party sites with a parent party at the time of creation.
Party Sites are defined as branches or offices of a Party of type Organization. A Party Site links together a Party
and a Location.
STEPS TO ENABLE
TARIFF RATE MANAGEMENT
Trade Preferences – Used to model the tariff treatment of the goods. Only certain regimes, such as the
European Union, support this data. Examples include “reduced import duty rates under the GSP” or
“under arrangements with ACP countries”. In previous versions of GTM, trade preference was populated
by the user; you can now download this data from a third-party data content provider. This is an existing
data structure that was previously named Tariff Preference Types.
Trade Program Types – Enables you to model broad types of tariff rates such as General, Special, Import
Control, and so on.
Trade Programs – Model the different programs that define tariff rates such as General Rate, GSP-A
Rate, Caribbean Basin Economic Recovery Act Rate, and so on. Included in the Trade Program is a list of
Member Nations. By default, trade programs are not linked to Trade Agreements in GTM, but you can
add this information within a Trade Program.
Tariff Rates – Defines the actual tariff rate associated with a product classification type/code and a trade
program.
The following figure shows the relationship between tariff data in GTM:
STEPS TO ENABLE
The following existing functionality continues to use a real-time external call to third party data content
providers to obtain the duty and tax information:
Landed Cost Simulator
Estimate Duties & Taxes action on Trade Transaction and Declaration
View Duties and Taxes action on Product Classification Code
TRADING PARTNER ITEM
This feature enables you to manage the details of an item for a particular supplier or customer in a new object
called Trading Partner Item. You can then add a Trading Partner Item to your Item so that you can easily see
an Item and all of the Trading Partner Items associated with it.
You can create and manage your trading partner items including details such as:
Trading Partner Type – enables you to specify the type of trading partner, such as Supplier or Customer.
Trading Partner – identifies the ID and details of the trading partner associated with the trading partner
item.
Trading Partner Identifier – specifies the identifier of the trading partner item in the external system of the
partner.
Trading Partner Name – specifies the name of the trading partner item in the external system of the
partner.
Effective and Expiration Date – enables you to specify the start and end dates of a trading partner item.
Remark, Reference Number, and Flex Fields – use remarks, reference numbers, and the standard flex
field capability to model additional information related to trading partner items.
There are certain manual actions you can trigger against your trading partner items including:
STEPS TO ENABLE
TRADE COMPLIANCE
STEPS TO ENABLE
TRADE AGREEMENTS
GTM is introducing Trade Agreements as a new area in the product. Trade Agreements and other supporting
information enable you to proactively leverage the trade agreements your business can take advantage of to
reduce duties and taxes. There are many Trade Agreements in effect globally, each with different parameters
but all following a similar structure.
To take advantage of the preferential duties and taxes associated with a Trade Agreement, companies must:
In GTM, users can take advantage of the Trade Agreement capability including:
Download and store duties, taxes, trade programs and other information from third party data content
providers
Manage trade agreement information
Perform trade agreement eligibility screening and qualification on your items
Create a campaign which enables you to solicit qualification data, documents and other information from
suppliers to support the use of trade programs
Suppliers and other users can respond to a campaign they have received
You can create and manage trade agreement details such as:
Trade Agreement Type – enables you to group your trade agreements. For example, you may want to
group your trade agreements into preferential (i.e., free trade or reduced duty) and non-preferential trade
agreements, or some other grouping that makes sense to you.
Short Name – enables you to specify the acronym or short name by which a trade agreement is known.
For example, the North American Free Trade Agreement has a short name of NAFTA.
Active flag – indicates if a trade agreement is currently in use.
Effective and Expiration Date – enables you to specify the start and end dates of a trade agreement.
Data Version – specify the version of the trade agreement where applicable.
Member Nations – define the member nations participating in a trade agreement using the existing region
capability.
Remarks, Reference Numbers and Flex Fields – use remarks, reference numbers, and the standard flex
field capability to model additional information related to trade agreements.
STEPS TO ENABLE
From the trade agreement, SmartLinks are available which enable you to:
View Related Trade Programs - this SmartLink enables you to view all trade programs related to a
particular trade agreement.
View Related Campaigns - this SmartLink enables you to view all campaigns related to a particular trade
agreement.
STEPS TO ENABLE
The License Analysis facts include License Count. The License Analysis dimensions include date, involved
party, and detailed dimensions including regime, compliance data, location, reference number, and user-defined
code.
The License Line Analysis facts include License Line Count, Authorized Quantity/Value, Available Quantity
/Value, Reserved Quantity/Value, and Used Quantity/Value. The License Line Analysis dimensions include
date, domain, and detailed dimensions including regime, compliance data, product, remark, and reference
number.
STEPS TO ENABLE
Copyright © 2019, Oracle and/or its affiliates. All rights reserved.
This software and related documentation are provided under a license agreement containing restrictions on use and disclosure and are protected by intellectual property laws. Except as
expressly permitted in your license agreement or allowed by law, you may not use, copy, reproduce, translate, broadcast, modify, license, transmit, distribute, exhibit, perform, publish, or
display any part, in any form, or by any means. Reverse engineering, disassembly, or decompilation of this software, unless required by law for interoperability, is prohibited.
The information contained herein is subject to change without notice and is not warranted to be error-free. If you find any errors, please report them to us in writing.
If this is software or related documentation that is delivered to the U.S. Government or anyone licensing it on behalf of the U.S. Government, then the following notice is applicable:
U.S. GOVERNMENT END USERS: Oracle programs, including any operating system, integrated software, any programs installed on the hardware, and/or documentation, delivered to U.
S. Government end users are "commercial computer software" pursuant to the applicable Federal Acquisition Regulation and agency-specific supplemental regulations. As such, use,
duplication, disclosure, modification, and adaptation of the programs, including any operating system, integrated software, any programs installed on the hardware, and/or documentation,
shall be subject to license terms and license restrictions applicable to the programs. No other rights are granted to the U.S. Government.
This software or hardware is developed for general use in a variety of information management applications. It is not developed or intended for use in any inherently dangerous
applications, including applications that may create a risk of personal injury. If you use this software or hardware in dangerous applications, then you shall be responsible to take all
appropriate fail-safe, backup, redundancy, and other measures to ensure its safe use. Oracle Corporation and its affiliates disclaim any liability for any damages caused by use of this
software or hardware in dangerous applications.
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.
Intel and Intel Xeon are trademarks or registered trademarks of Intel Corporation. All SPARC trademarks are used under license and are trademarks or registered trademarks of SPARC
International, Inc. AMD, Opteron, the AMD logo, and the AMD Opteron logo are trademarks or registered trademarks of Advanced Micro Devices. UNIX is a registered trademark of The
Open Group.
This software or hardware and documentation may provide access to or information about content, products, and services from third parties. Oracle Corporation and its affiliates are not
responsible for and expressly disclaim all warranties of any kind with respect to third-party content, products, and services unless otherwise set forth in an applicable agreement between
you and Oracle. Oracle Corporation and its affiliates will not be responsible for any loss, costs, or damages incurred due to your access to or use of third-party content, products, or
services, except as set forth in an applicable agreement between you and Oracle.