IM 36J04A11-01E
Yokogawa February 21st 2014
14th Edition Issue 1
Exaquantum/PIMS Users Manual
Highlights
The Highlights section gives details of the changes made since the previous issue of this
document.
Summary of Changes
This is the 14th Edition of this document.
Detail of Changes
The changes are as follows.
Chapter/Section/Page  Change
Section 1.3           New chapter added. Minor text updates.
Section 1.10          Section added.
Section 2.2           Added Exaopc-CAMS. Versions of Exaopc for logon security.
Section 5.6           Minor text updates.
Section 7.1           Minor text updates, relating to Operating System support.
Section 14.3          Restoring archives backed up at SQL Server 2000.
Chapter 15            New chapter: Rearchive Tool.
Section 17.1          Minor text updates.
Chapter 1 Introduction
1.1 Exaquantum Overview
Exaquantum (Figure 1-1) is a Plant Information Management System (PIMS) combined with
a powerful user interface.
Figure 1-1 Exaquantum Functional Overview
[Figure: OPC servers feed raw data into the Exaquantum/PIMS server, whose data is accessed by Exaquantum/Explorer and Exaquantum/Web clients.]
The primary functions of Exaquantum/PIMS are to gather, store and aggregate process and
other business data. This data can then be accessed using Exaquantum/Explorer,
Exaquantum/Web and other tools such as Microsoft Excel. The data flow can be summarized
as follows:
Raw process data is gathered from the PCS, via OPC servers, into a proprietary Real-time
Database. Here, the raw data can be combined in real-time calculations to produce
derived, higher value information.
Both derived and raw values are historized by the Historian, so that their data can be accessed over long time periods.
In addition to normal historization, the Exaquantum/PIMS Historian calculates aggregation
values. This is a process of data reduction (such as mean value calculation,
minimum/maximum tracking, standard deviation, summation) over user-defined periods.
The aggregations are themselves stored; this allows them to be retrieved over long time
periods.
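As an illustration of this data reduction, the sketch below computes aggregation values for raw samples falling inside one user-defined period. The function and field names here are illustrative, not Exaquantum's; the real Historian also handles data quality, gaps and event-driven recalculation.

```python
import statistics
from datetime import datetime, timedelta

def aggregate_period(samples, period_start, period_length):
    """Reduce raw (timestamp, value) samples falling inside one
    user-defined period to a set of aggregation values."""
    period_end = period_start + period_length
    values = [v for t, v in samples if period_start <= t < period_end]
    if not values:
        return None
    return {"mean": statistics.fmean(values),
            "min": min(values),
            "max": max(values),
            "stdev": statistics.pstdev(values),
            "sum": sum(values)}

# Four raw samples, 10 minutes apart, aggregated over one hour.
start = datetime(2014, 2, 21, 8, 0)
raw = [(start + timedelta(minutes=10 * i), v)
       for i, v in enumerate([10.0, 12.0, 11.0, 13.0])]
hourly = aggregate_period(raw, start, timedelta(hours=1))
```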
Real-time, historized and aggregation information is made available to users' desktops via Exaquantum/Explorer; this software can present the data in a wide variety of formats.
Exaquantum/PIMS is event-driven. When new data arrives, any calculations that use the new
data values are triggered and the Historian is informed by an event. Any user applications
(such as Exaquantum/Explorer displays) which have a registered interest in the data are
informed of the change.
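The event-driven flow described above can be pictured as a simple observer pattern: a new value triggers every registered listener. The names below are illustrative only; this is not the Exaquantum API.

```python
class Tag:
    """Minimal event-driven tag: when a new value arrives, all
    registered listeners (e.g. a display, the Historian) are notified."""
    def __init__(self, name):
        self.name = name
        self.value = None
        self._listeners = []

    def subscribe(self, callback):
        self._listeners.append(callback)

    def update(self, value):
        self.value = value
        for callback in self._listeners:
            callback(self.name, value)

events = []
flow = Tag("FIC100.PV")  # hypothetical tag name
flow.subscribe(lambda name, v: events.append(("historian", name, v)))
flow.subscribe(lambda name, v: events.append(("display", name, v)))
flow.update(42.5)
```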
1.2 Exaquantum System
The components of the Exaquantum system are shown in Figure 1-2.
Figure 1-2 Exaquantum System
[Figure: an OPC server interfaces to the PCS and feeds the Exaquantum/PIMS server, which comprises the Real-time Database, Historian, Configuration Tools and Role Based View Server. The server's data is accessed by Exaquantum/Explorer and Exaquantum/Web clients, and by third party clients via the OLE DB server.]
The system forms a multi-tier architecture, with OPC servers providing the raw process data.
This data is first managed and accumulated within the Exaquantum/PIMS server. It then
becomes available to Exaquantum/Explorer (a powerful analysis and reporting facility). The
data can also be accessed by other third party clients via the OLE DB interface or open
interfaces (application programming).
The main components of Exaquantum are described below.
Exaquantum/PIMS
The Exaquantum/PIMS server has three main functions, which are outlined below.
Real-time Database
The Real-time Database is a high performance real-time store for process and plant data that
also provides flexible, user-defined calculations, and aggregations (mean, max, min etc.) over
multiple user-defined time periods.
Historian
An optimized, long-term Historian provides efficient storage and fast retrieval of vast
amounts of plant data, over very long time periods.
Configuration Tools
A suite of easy-to-use tools is provided to build, deploy and manage the Exaquantum/PIMS
environment.
Exaquantum/Explorer
Exaquantum/Explorer is a data utilization client that can be run on user PCs. It is a powerful,
flexible analysis and reporting environment through which business information may be
presented in graphical displays and reports. For further information, see the Technical
Information (Ref. TI 36J04A10-01E) and Exaquantum/Explorer Users Manual (Ref IM
36J04A12-01E).
Exaquantum/Web
Exaquantum/Web enables users to deploy thin clients over the Internet/Intranet. The scope of
supply is similar to that provided by Exaquantum/Explorer. Exaquantum/Web components
can handle multiple Exaquantum servers transparently.
Interfaces
OLE DB Interface
The industry-standard OLE DB interface (which includes ODBC) allows Exaquantum/PIMS
to present PIMS information to other third party clients. Since many of these tools provide
users with non-programming access (making use of such techniques as drag and drop), this
interface is suitable for users who have a requirement for the data, but may not have in-depth
programming skills.
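As a sketch of how a programmatic client might use such an interface, the snippet below builds SQL text of the kind submitted over OLE DB/ODBC to read historized values. The table and column names are placeholders, not Exaquantum's published schema; a real client (Excel, a pyodbc script, etc.) would use the schema from the Exaquantum OLE DB documentation, and parameterized queries rather than string formatting.

```python
def history_query(tag_name, start, end):
    """Build a history-retrieval query. 'History' and its columns
    are placeholder names, assumed for illustration only."""
    return ("SELECT Timestamp, Value, Quality FROM History "
            f"WHERE TagName = '{tag_name}' "
            f"AND Timestamp BETWEEN '{start}' AND '{end}' "
            "ORDER BY Timestamp")

sql = history_query("FIC100.PV", "2014-02-01 00:00", "2014-02-02 00:00")
```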
Automation Interfaces
Exaquantum also provides a set of open interfaces by which application programs can browse,
read and write configuration and/or data. These interfaces are COM interfaces and therefore
suitable for application programmers.
OPC Interface
With the addition of the OPC server software, an Exaquantum/PIMS server can make its data available as an OPC server. The Exaquantum OPC client (distributed with Exaquantum as an MSI package) is available for installation on suitable equipment from other manufacturers, to allow them to access an OPC interface on an Exaquantum/PIMS server. Once installed, no configuration is required for the OPC server. Refer to Chapter 12, Cross-server Calculations, to see how it is used by the Exaquantum system. For technical details of the interface, see the Exaquantum API Reference Manual.
Chapter 17: HIS Tag Generation describes how the HIS tags, and the groups they belong to, can be imported into Exaquantum using this tag generation function.
Management Console
A Management Console is a user interface that provides an environment for running system
management and administration applications. It provides a common environment for all user
interface applications, and is used to view management and administration tools. The
Microsoft Management Console has been customized for use with Exaquantum.
Console Components
There are four main components of the Management Console (Figure 1-3):
Console Tree
Configuration Screen
Management Console Menu Bar
Configuration Screen Menu Bar
These are described below.
Console Tree
The console tree shows a hierarchical structure that represents the available objects in the
Management Console. The console tree is always in the left pane of the window. It can be
fully expanded to allow the selection of the objects within it.
Note: When changes to a configuration are made using the MMC they must be saved using
the controls available on that page before you change to another page, or close the
MMC, otherwise the changes will be lost.
Note: Although the Management Console allows the use of multiple configuration tools in a single tree (for example, Exaquantum Configuration Tools and SQL Server Management Studio), experience has shown that this can adversely affect the behavior of DCOM. As this usually leads to intermittent OPC communication failures, it is strongly recommended that only one tool set be used at a time.
Configuration Screen
The screen is used to configure and administer the Exaquantum/PIMS system. The
administrator specifies which ordinary users have access to the various screens.
The configuration workflow runs, in outline, from start to finish as follows:
1 Define OPC Gateways
2 Define Production Calendar
3 Set up Multi-server Environment
4 Define Tag Templates
5 Define Function Block Templates
6 Tag Generation, Function Block Generation, OPC Equalisation, OPC File Import or Tag File Import
7 Configure RBNS Views
History Catch-up
History catch-up is a process that allows Exaquantum to recover data for the period between a
system shutdown and a restart. There are several options available. For a full description of
History Catch-up see the Exaquantum Engineering Guide Volume 1 (ref IM 36J04A15-01E),
Chapter 4, Section 4.10.
Template Versioning
When a template (tag or function block) is created and later changed, each version of the template definition is maintained in Exaquantum. Version control matters because tags and function blocks are created by copying the definition of a template; they do not refer to the template, but keep a copy of its definition. This mechanism allows templates to be changed without affecting running tags and function blocks. Tags and function blocks are changed by first changing the associated templates, and then applying those changes to the existing tags and function blocks. In this way, any changes can be analyzed and reported to the user before they are actually applied. The analysis is made by comparing the original template with the updated template, not by comparing templates with tags or function blocks; this is why the analysis performs well.
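The analysis step can be pictured as a straightforward comparison of two template definitions. A sketch, with illustrative field names only:

```python
def template_changes(original, updated):
    """Report the fields that differ between an original template
    definition and an updated one, as (old, new) pairs. Templates
    are compared with templates, never with the tags copied from
    them, which keeps the comparison cheap."""
    fields = set(original) | set(updated)
    return {f: (original.get(f), updated.get(f))
            for f in fields
            if original.get(f) != updated.get(f)}

# Two versions of a hypothetical tag template.
v1 = {"DataType": "Float", "AggregationPeriods": ["Hour", "Day"]}
v2 = {"DataType": "Float",
      "AggregationPeriods": ["Hour", "Day", "Month"],
      "Description": "Flow tag"}
changes = template_changes(v1, v2)
```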
1.7 Native Language Support Compatible
All of the configuration tools are Native Language Support (NLS) compatible; they are thus
capable of:
Showing all the captions and messages in the native language.
Allowing input variables to be in the native language, whenever applicable. In this manual,
all of the NLS fields are clearly indicated.
Supported native languages are:
English for the English environment and any other languages not listed below
Japanese for the Japanese environment.
If the system requires connections to any other types of OPC servers (or configuration of commonly used OPC servers in a non-standard way), users are required to configure the OPC server types before using this tool. For information about configuring OPC server types, refer to the Exaquantum Engineering Guide, Volume 3 (IM 36J04A15-03E).
Note: Exaquantum is tested and supported with Matrikon OPC servers for Honeywell
TDCS3000, Foxboro IA series and MODBUS. The necessary OPC type
configuration for these OPC servers should also be made with the tool mentioned
above.
There is an option for the OPC gateway to be subject to Alarm and Events (A&E) gathering.
This can be selected through a check box on the OPC Gateway Configuration screen. The
box can only be checked when the selected OPC gateway type allows A&E access.
There is an option for the OPC Gateway to automatically recover OPC data after a loss of
connection. The Enable Automatic OPC Data Recovery checkbox allows this option. The
box can only be checked when the selected OPC Gateway allows Historical Data or Historical
A&E access.
There is an option to recalculate aggregations and calculations following a change to their dependent data. The Enable Automatic Recalculation checkbox allows this option; the Enable Automatic OPC Data Recovery option must already be enabled to allow it.
The R3.70 and later releases of ExaOPC support enhanced connection security. The optional
Logon Check configuration allows a User and Password to be specified to connect to the
ExaOPC server.
Once configured, the connection credentials for the selected OPC Gateway can be tested by
clicking on the Test button.
If a user defines an OPC gateway that uses an OPC gateway computer, then the detail is
populated with the settings already defined for this OPC gateway computer:
Automatic OPC Recovery (enabled / disabled)
Automatic Recalculation (enabled / disabled)
OPC gateway security (enabled / disabled, user and password)
If there are multiple OPC Gateways configured for the same OPC Server, the logon
credentials, Enable Automatic OPC Data Recovery and Enable Automatic Recalculation for
each affected OPC Gateway are updated after user confirmation.
It is possible to apply filtering to Alarm and Events collected from the OPC gateway, in order
to reduce the quantity of data being stored in Exaquantum. To define one or more of these
filters, click on the Alarm and Event Filter button.
For Exaopc security details, refer to "10. Setting the Exaopc Product Security function" in the NTPF100 Exaopc OPC Interface Package Installation Manual (IM 36J02A12-01E).
2.3 Associated Screen
Advanced Screen
This screen is displayed when the Advanced button on the OPC Gateway Configuration
screen is pressed. Part of this screen is a read-only table that lists Property and Value for
the OPC server selected in the OPC Gateway Type field. The values displayed change
depending on the server selected. In the Advanced Option part, a check box Use
Exaquantum Time allows the user to allocate an Exaquantum timestamp to any OPC data
received.
Note: This screen is for information only. The information is an advanced topic and only
trained developers may ever need it.
If you do have to configure the values, use a separate tool (see section 2.2).
Alarm and Event Filter Screen
Filter Settings
The Filter Settings screen is displayed by clicking on the Alarm and Event Filter button.
This screen is used to specify the alarm and event filters. These settings define exclusive filters; that is, the events that a user wishes to exclude from saving to the QHistorianData database. The maximum number of definable filter conditions is 20.
The Alarm and Event Filter settings can be specified after defining the OPC Gateway type.
Detail Settings
In the Detail Settings window it is possible to define new filter conditions and edit existing
filter condition parameters. Filters that are defined are exclusive filters; that is they define
alarm and event conditions that are not to be saved to the Exaquantum database.
One or more filters can be defined in the Condition Definition filter table. A maximum of 100 conditions can be specified.
Attribute - select the alarm and event message field to filter on
Operator - define the operator
Value - specify the value to exclude (this can include wildcards)
And/Or - link fields to exclude with And/Or logical operators (note: And has higher priority than Or).
Select the Disabled check box to temporarily disable the filter conditions.
Figure 2-2 Detail Settings Screen
Value
Define a value consistent with the data type, up to a maximum of 256 characters.
Wildcards can be specified for character data types.
AND/OR
Use And/Or to link filters. At the end of conditions specify a period (.).
New
Defines a new filter condition.
Delete
Deletes the selected attribute condition.
OK
Saves the filter conditions, and closes the window.
Cancel
Cancels the changes to filter conditions and closes the window.
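The exclusive-filter evaluation described above, including the rule that And binds more tightly than Or, that Value may contain wildcards, and that the last condition is terminated with a period (.), can be sketched as follows. This is a simplification showing only an equals-with-wildcards operator; the attribute and message names are illustrative.

```python
from fnmatch import fnmatchcase

def excluded(message, conditions):
    """Evaluate exclusive filter conditions against one alarm/event
    message. Each condition is (attribute, pattern, connector), the
    connector ("And", "Or" or the terminating ".") linking it to the
    next condition. And-runs are evaluated first, then Or-ed."""
    or_groups, current = [], []
    for attribute, pattern, connector in conditions:
        current.append(fnmatchcase(str(message.get(attribute, "")), pattern))
        if connector != "And":                 # "Or" or final "."
            or_groups.append(all(current))
            current = []
    return any(or_groups)

msg = {"Source": "FIC100", "Severity": "Low"}
# Exclude (Source like FIC* AND Severity = Low) OR (Source like TIC*).
rules = [("Source", "FIC*", "And"),
         ("Severity", "Low", "Or"),
         ("Source", "TIC*", ".")]
```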
2.4 Scenarios
The OPC Gateway Configuration tool is primarily used:
During initial system installation and set-up
Whenever OPC gateways are added
Whenever OPC server details change, for example, change the OPC server type from Exaopc-
HIS to Exaopc-STN
To enable/disable OPC Alarms and Events for the Exaquantum server.
To reduce the number of alarms and events stored by Exaquantum.
When any new custom periods are created, they are added to the Name/Dependency list. The
dependency is the data on which the production calendar period is based, e.g. a production
calendar period of a day can be derived from hour data.
Production calendar periods will always follow each other immediately. For example, a
period cannot be defined as 8 am to 5 pm each day, i.e. 9 hours followed by a 15-hour gap.
They will also be for an unvarying time period; it is not possible to define 3 shifts of 8, 10 and
6 hours in a day (00:00 to 08:00, 08:00 to 18:00, 18:00 to 24:00).
The duration of system periods is fixed. The user can select the duration of custom
periods from a drop-down list. The default value is 8 hours.
Day periods and custom periods over one hour in length support Daylight Saving Time (DST). For example, the Day period and a Shift period (i.e. a custom period of 8 hours) become 1 hour shorter when DST begins, and 1 hour longer when DST ends. DST adjustment takes place when the Windows Date/Time properties select a time zone that observes DST and the option to adjust the clock automatically for DST is checked.
The production calendar has a concept of dependency, which determines the production period from which another period's aggregation calculations (statistical calculations for the production period, such as mean, maximum and summation) are made. For example, an hour may be derived from (be dependent on) raw data, a day derived from an hour, and a month derived from a day. This allows aggregation calculations for longer periods to be more consistent with shorter ones.
The dependencies that can be selected for a particular production calendar period change
depending on which period is displayed in the Name field. The options are as follows:
Table 3-2 Production Calendar Period Dependencies
Period Possible Selection (Dependency)
(Name)
Hour Raw (default) or one of the Custom aggregations
Day Hour (default), Raw or one of the Custom aggregations
Month Day (default), Raw, Hour or one of the Custom aggregations
Custom Raw (fixed)
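The dependency scheme in Table 3-2 amounts to deriving each period's aggregations from its dependency's values. A sketch with a subset of the real aggregations (note that mean-of-means equals the true mean only when each child period holds the same number of samples, which is one reason uniform period lengths help consistency):

```python
from statistics import fmean

def aggregate(values):
    """Aggregations for one period computed directly from raw values."""
    return {"mean": fmean(values), "min": min(values),
            "max": max(values), "sum": sum(values)}

def roll_up(children):
    """Derive a longer period (e.g. Day) from its dependency's
    aggregations (e.g. Hours), as Table 3-2 describes."""
    return {"mean": fmean(c["mean"] for c in children),
            "min": min(c["min"] for c in children),
            "max": max(c["max"] for c in children),
            "sum": sum(c["sum"] for c in children)}

# Three hypothetical hours of raw data, two samples per hour.
raw_per_hour = [[10.0, 12.0], [14.0, 16.0], [18.0, 20.0]]
hours = [aggregate(v) for v in raw_per_hour]   # Hour derived from Raw
day = roll_up(hours)                           # Day derived from Hour
```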
The periods that are set in the Production Calendar are also used in the Tag Template (see
Chapter 5). However, in the Tag Template, the production calendar periods are listed under
the heading Aggregation Periods.
3.3 Scenarios
Some typical scenarios associated with the Production Calendar are:
Initially set up the production calendar and dependency
Add an additional production calendar
Adjust the start of the production calendar
Change the dependency (to change the calculation of the aggregations for the dependent
period(s)).
The permitted operations for the two types of production calendar period are summarized
below:
Table 3-3 Production Calendar permitted operations
For a more detailed explanation of the RBNS refer to Chapter 5 (Role Based Namespace
Configuration).
Caution: When a change is made to the RBNS on a server other than the master server, that server may be selected for publication of the Master Server's RBNS, but there is some risk involved.
4.3 Scenarios
The Servers tool can be used to modify Server details such as Server Name, Server
Description and Server Computer. These changes are reflected in the RBNS immediately,
without needing to reconfigure the RBNS contents.
In very exceptional circumstances a server may be deleted. This operation will not be
permitted if any RBNS references the server that is to be deleted.
1 When user A changes the content by creating, editing or removing a namespace, user B
cannot make any changes to the affected namespace until he refreshes the content; which
will apply the changes made by user A.
2 Until the content is refreshed, attempts to make changes by user B may be invalid. For
example, user B cannot add an element to a namespace if user A has already removed
that namespace. Changes can be made by user B to a different namespace, but the
content should be refreshed anyway to apply all changes.
If user A wants to publish a server's RBNS content to other Exaquantum servers and user B
is currently updating the RBNS contents on another server, user B is presented with a dialog
stating that the RBNS content has been updated and that a refresh is required before making
any changes. Once the refresh has been completed, user B may continue the updates;
although the new content may have changed considerably. For this reason, frequent refreshing
by concurrent users is encouraged.
To keep concurrent updates to a minimum it is recommended that, for RBNS purposes, one
server be designated as a Master. All changes should be made to the Master and then
published to all the other servers.
5.4 User access to data
There are two types of access to views, and the data they represent:
Read access - users with read access can only view the data.
Write access - users with write access can change some data values.
Allocating access to users takes place at several levels:
Access to individual data source, such as Tag, Function Block, etc. can be configured
using RBNS Builder (see Security in section 5.9)
Access to Folders and their contents can be configured using RBNS Builder.
Views are assigned to Windows user groups using RBNS Builder.
The allocation of users to Windows user groups can be managed within the Windows
system.
Note: Some objects can also be hidden from view by using the facilities available in the
Filters page (see Filters in section 5.9 ).
5.5 Making views available to users
Role Based Namespace employs the Windows user group feature to manage users with
common workplace roles, and to distribute the views accordingly. Each view can be
associated with two Windows user groups.
RBNS views - A RBNS view will be created for each role, or specific part of a role.
View groups - Each RBNS view will be allocated to either one or two Windows user
groups; one for read only, and one for read and write, depending on requirements.
Role groups - A Windows user group will be created for each workplace role or in
some cases just a specific part of a role.
Users - Users will be added to a group, or groups, consistent with their role.
Thus everyone performing a particular role can be given access to the appropriate view, by
making their role group a member of a suitable view group. See Figure 5-1 Simple RBNS
Example.
As users and groups can be members of more than one group:
A view may be shared across more than one role by making other role groups
members.
A role group can have access to more than one view by adding it to more view groups
Users can have access to more than one role's views by adding them to other roles.
Using different combinations of groups provides a flexible and configurable access control
system.
Group access rules
In situations where everyone will have read or write access, it is possible not to allocate specific group access by leaving either or both of the read and write groups blank. The following table shows how the access is affected in the different combinations.
Table 5-1 Group access rules
[Figure 5-1 Simple RBNS Example: Users 1 to 3 and Users 4 to 6, in two role-specific work groups, access View 1 through its Read only Group and its Read and Write Group.]
In this example there are two sets of users in different role specific work groups.
There is a single view, View 1, which provides two types of access by being associated with
two view groups; the Read Only Group and the Read and Write Group.
To give users in the Role 1 Group read only access to View 1, using the facilities provided by the Windows operating system, their role group is added to the View 1 Read only Group.
To give users in Role 2 Group both read and write access to View 1, using the facilities
provided by the Windows operating system, Role 2 Group is added to the View 1 Read
and Write Group.
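The group resolution in this example can be sketched as follows. This models the rules stated in this section, not the Windows security API; the group and user names match the example above.

```python
def access_for(user, role_groups, view_groups):
    """Resolve a user's access to each view via nested group
    membership: users sit in role groups, and role groups are
    members of a view's read or write group."""
    result = {}
    for (view, access), member_roles in view_groups.items():
        if any(user in role_groups.get(g, ()) for g in member_roles):
            # Write membership wins over read-only membership.
            if access == "write" or view not in result:
                result[view] = access
    return result

role_groups = {"Role 1 Group": {"User 1", "User 2", "User 3"},
               "Role 2 Group": {"User 4", "User 5", "User 6"}}
view_groups = {("View 1", "read"): ["Role 1 Group"],
               ("View 1", "write"): ["Role 2 Group"]}
```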
A more complicated example that builds upon this scenario is demonstrated in Figure 5-2.
Figure 5-2 More Complicated RBNS Example
[Figure: Role 3 Group and Role 4 Group users access both View 1 and View 2 through each view's Read only Group and Read and Write Group.]
In this case there are two additional workplace roles with their associated users and Windows
user groups.
The Role 3 Group users have read only access to View 2, but both read and write access to
View 1.
Role 4 users have read and write access to View 2, but only read access to View 1.
Thus allocating users to Role Groups, and Role Groups to View Groups, allows a very flexible, convenient and precise way of administering users' access to various sets of data. When role responsibilities change, or users change roles, the system is easy to manage.
5.6 RBNS Builder
The RBNS Builder is available as part of the Exaquantum Administration Tools.
To open the RBNS Builder:
1 Run the Exaquantum Administration Tools.
2 When the main Administration Tools window opens, select the Tree tab on the left-hand side.
3 In the hierarchy, open the System Configuration branch.
4 Select the Role Based Namespace icon.
The RBNS Builder window has two areas:
Namespace selection and display on the left-hand side (section 5.7)
Properties and settings - on the right hand side (section 5.9)
Note: If Role Based Namespace is used, the following must be consistent on all referenced Exaquantum Servers (as defined in the Exaquantum Server Window in Administration Tools):
- Windows Group names and Windows User names
- Passwords for same-named Windows Users
- Membership of same-named Windows Users in same-named Windows Groups
5.7 Namespace selection and display
The namespace selection on the left-hand side of the RBNS Builder window comprises two
parts:
Current namespace - the drop-down list at the top left-hand side allows you to select the namespace to be displayed in both the hierarchy below and the properties and settings section on the right. It can be changed by opening the drop-down list and selecting a new one. If there are no currently defined namespaces then it will be blank.
Namespace hierarchy - a hierarchical list of the namespace selected in the current namespace box above. Highlighting an item in the list will populate the properties and settings tabs on the right-hand side of the window.
The following controls are available at the bottom of the window:
Refresh
Can be used at any time to refresh the current status of the local RBNS database. In a multi-
server environment, making changes has implications for maintaining the concurrency of the
RBNS across the different servers. Refer to section 5.3 Publishing views across multiple
servers.
Publish
Opens the Publish dialog box for controlling how the changes are made across the available
servers. The dialog displays a list of available servers. Their status is indicated by the color of
the text and icon.
Icon color green - The server is available, and changes can be published to it.
Icon color red - The server is not available, so changes cannot be published to it.
Text color grey - There are no changes that need to be published to the server.
Text color black - A change has been made locally that needs to be published to the server.
Text color red - A change has been made on that server that needs to be updated on the local server.
Note: If any of the servers have changes that need to be updated locally, then the Refresh
dialog box will be automatically displayed. This informs you that a refresh operation must be
performed before you can continue.
To publish changes to the servers, highlight them in the list and click on the OK button. Refer
to section 5.3 Publishing views across multiple servers.
New
Opens the Role Based Namespace New dialog box for creating a new namespace view.
The following features are available.
New - Enter a unique name for the view. Use up to 32 characters (but avoid some punctuation such as full stop, comma and colon).
Read Group - This defines the Windows user group that will have only read access to the view. Either type in a valid path, or use the browse button on the right to open a network browser to locate and select the appropriate Windows user group. Paths are limited to 512 characters. Leaving the Read Group not allocated has security implications; refer to section 5.4.
Write Group - This defines the Windows user group that will have read and write access to the view. Either type in a valid path, or use the browse button on the right to open a network browser to locate and select the appropriate Windows user group. Paths are limited to 512 characters. Leaving the Write Group not allocated has security implications; refer to section 5.4.
Apply - Click on this button to create the new view.
Cancel - Click on this button to close the dialog box. Any changes will be lost.
The root of a new namespace view will be added to the hierarchy on the left-hand side of the
window.
Remove
Only available when a namespace is selected in the list. When clicked, opens a confirmation
dialog box before you can remove the view from the list.
Help
Opens the help system.
5.8 Adding and deleting objects from the namespace hierarchy
To add, remove or modify objects in the namespace:
1 In the current namespace dropdown list, choose the namespace to work on.
2 In the hierarchy, identify the object to be modified.
All functionality within the hierarchy is available from the context menu. This is revealed by
right-clicking on an existing object. The choices available will depend on the nature of the
original object. For a list of objects refer to section 5.2 Creating views.
To add an object to the new namespace, right click on it in the hierarchy. From the context
menu that appears, select the type of object you want to add. Fill in the appropriate
information in the tabbed pages to the right and click on the apply button to finish.
In the case of a folder, you must enter a name in the General tab. In other cases, enough basic
information will automatically be provided. The information can be changed, and other
information can be added, at a later stage. When finished, click on the Apply button to add
the new entry to the list.
Add -> Reference Folder - Opens the Intrinsic Data selector for choosing a folder that resides on another server.
Add -> Folder - Adds a virtual folder.
Add -> Function Block - In the General tab on the right, either type in a Path or use the browse button to locate the required function block.
Add -> Tag - In the General tab on the right, either type in a Path or use the browse button to locate the required tag.
Remove - Opens a confirmation box before you can remove the object from the list.
5.9 Properties and settings
This area to the right-hand side of the RBNS Builder window comprises a set of tabbed pages.
The contents are associated with the object currently highlighted in the hierarchy. The number
of tabs shown will depend on the type of the object. The following sections describe the
features of each tab.
General
These are the basic configuration items. The details shown will depend on the type of object
highlighted in the namespace list on the left.
Name - A unique name for the object. Use up to 32 characters (but avoid some punctuation such as full stop, comma and colon).
Item Type - Displays the type of object. This field is automatically generated and cannot be changed.
Server - Shows the server from which the object originates. This field is automatically generated from the path below, and cannot be changed.
Path - Configure the location of the object. Either type in the path, up to 512 characters, or use the browse button on the right to open the Intrinsic Data Selector. Use this to locate the required item from the available sources.
Filters
Allows you to configure the filters that can be used to restrict the number of objects users will see in the data browser. The type of filtering available will depend on the type of object highlighted in the namespace list on the left. However, in most cases the filters follow a similar pattern.
The filter compares the set of characters you provide against the contents of the object being filtered. The items that match will either be the only ones shown, or not shown, depending on the choice you make:
Include ... that match - displays only objects that match the characters provided.
Include ... that do not match - displays only objects that do not match the characters provided.
A box is provided where you can enter up to 32 characters you want to match. The usual
wildcards can be used:
? - matches any single unknown character
* - matches zero or more unknown characters
For virtual folders, you can choose to include sub-folders.
For Tags and Function Blocks you can choose to show aggregations they contain.
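The wildcard matching described above behaves like shell-style pattern matching; a sketch using Python's fnmatch, with hypothetical tag names:

```python
from fnmatch import fnmatchcase

def apply_filter(names, pattern, include_matching=True):
    # ? matches any single character, * matches zero or more characters.
    matched = [n for n in names if fnmatchcase(n, pattern)]
    if include_matching:
        return matched                              # "... that match"
    return [n for n in names if n not in matched]   # "... do not match"

tags = ["FIC100", "FIC101", "TIC100", "PIC200"]
shown = apply_filter(tags, "FIC10?")
hidden = apply_filter(tags, "FIC10?", include_matching=False)
```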
Security
This page is for setting the type of write access allowed. The options will vary with the type of object highlighted in the namespace list on the left. Write access is also controlled by the user's group membership. Refer to section 5.5 Making views available to users.
The first two options provide blanket coverage:
All The selected object and everything within, will be write enabled.
None The selected object and everything within can be viewed but cannot be written
to.
All except The selected object and everything within will be write enabled, except for
the objects in the Security list (see below), which cannot be written to.
None except The selected object and everything within cannot be written to, except for
the objects in the Security list (see below), which will be write enabled.
Selecting either of the last two options opens two additional panes in the window:
Security hierarchy A tree-like representation of the contents of the selected object.
Security list A list of all the objects, selected from the security hierarchy, which are
to be specially treated for write access.
Objects are added to the Security list by clicking on them in the security hierarchy and
dragging them into the list.
Objects are removed from the list by selecting them and clicking on the Remove button.
Note: Only the top level objects in the security hierarchy can be selected.
5.10 Scenarios
The following scenarios are likely to be encountered during the configuration of a namespace:
1 Server selection. From the multiple servers, select one Exaquantum server to be the
master for the RBNS configuration.
2 Configure Windows groups. Where to configure groups, i.e. the domain controller or
local Exaquantum server, depends on the network policy. For information on Exaquantum
network configuration, refer to the Engineering Guide Volume 2 (ref: IM 36J04A15-02E).
3 Assign users to groups. A user may belong to more than one group. This, however, can
lead to confusion as to which namespace to use, so it should be avoided if possible.
4 Create Role Based Namespaces. The name of the namespace is independent of the group,
but it is a good practice to name it in such a way that the group and its role are obvious.
This scenario is expanded below.
Note 1: It is possible to re-assign a group to a namespace. However, this can result in
confusion where some users may no longer have access to resources that used to
be accessible.
Note 2: The Namespace name appears at the beginning of the access path, e.g.
Namespace 1.Folder 1.tag.value. It is possible to change it later; however this
should be avoided wherever possible, as performance may be affected.
5 Publication of Namespaces. Use the Publish option of the RBNS Builder to copy
configured namespaces to multiple servers. If the RBNS contents (and also Exaquantum
server information) have been updated in several servers, there may be some risk involved
in distributing changes to other servers. This is why one server should be selected as the
master server.
Detailed Scenarios
RBNS Creation
1 Design the RBNS Views:
Define roles such as Operator Area 1, Maintenance Area 1, Site Manager, etc.
Allocate Windows groups for these roles.
Roughly decide what information each group should have access to.
2 Create the namespace:
a Use the RBNS builder tool to create a new namespace.
b Select a user group.
c Allocate a name to the view.
Note: The name appears at the top of the path name.
Note: User Account Control applies to all Windows operating systems supported by
Exaquantum.
The same user may also appear as logged in twice if they are running both the
Exaquantum Administration Tools and another Exaquantum client application, for
example, Exaquantum Explorer, on a client PC.
ID Start Duration User
1 25/05/2009 3d Fred
2 24/05/2009 2d Daphne
3 27/05/2009 2d Wilma
4 26/05/2009 1d Wayne
5 22/05/2009 10d Chris
6 24/05/2009 2d Justin
7 28/05/2009 2d Barrie
(In the original figure these records are drawn as bars on a time axis; blue bars mark
records returned by the function, red bars mark records not returned.)
7.4 Viewing a list of users who were logged in at some point between
two date/times
Figure 7-2 describes which login records are returned by this function (the blue bar denotes a
record where a user was logged in at some point during the specified time period; the red bar
denotes user login records not returned by the function).
Figure 7-2 Example login records which correspond to users who were logged in at some
point between two date/times
ID Start Duration User
1 25/05/2009 3d Fred
2 24/05/2009 2d Daphne
3 27/05/2009 2d Wilma
4 26/05/2009 1d Wayne
5 22/05/2009 10d Chris
6 22/05/2009 2d Justin
7 29/05/2009 2d Barrie
(In the original figure these records are drawn as bars on a time axis running from 19 May
to 3 June 2009; blue bars mark records returned by the function, red bars mark records not
returned.)
To view a list of users who were logged in at some point between two date/times:
1. Click on the button labeled Show users logged in between.
2. Specify the start of the period in the first date/time field.
3. Specify the end of the period in the second date/time field.
4. Uncheck the box marked For whole duration.
5. Select the number of login records to show in the data grid by altering the Max Events
field or accept the default of 1000.
Note: If the set of matching records is greater than the Max Events field then the oldest
records are omitted.
6. Click the Refresh button.
Note: The results can be sorted by clicking on the appropriate column header in the data
grid.
7.5 Viewing a list of users who were logged in for the whole period
spanning two date/times
Figure 7-3 describes which login records are returned by this function (the blue bar denotes a
record where a user was logged in for the whole time period specified; the red bar denotes
user login records not returned by the function).
Figure 7-3 Example login records which correspond to users who were logged in for the
whole period spanning two date/times
ID Start Duration User
1 25/05/2009 3d Fred
2 24/05/2009 2d Daphne
3 27/05/2009 2d Wilma
4 26/05/2009 1d Wayne
5 22/05/2009 10d Chris
6 24/05/2009 4d Justin
7 25/05/2009 4d Barrie
(In the original figure these records are drawn as bars on a time axis running from 19 May
to 3 June 2009; blue bars mark records returned by the function, red bars mark records not
returned.)
To view a list of users who were logged in for the whole period spanning two date/times:
1. Click on the button labeled Show users logged in between.
2. Specify the start in the first date/time field.
3. Specify the end in the second date/time field.
4. Check the box marked For whole duration.
5. Select the number of login records to show in the data grid by altering the Max Events
field or accept the default of 1000.
Note: If the set of matching records is greater than the Max Events field then the oldest
records are omitted.
6. Click the Refresh button.
Note: The results can be sorted by clicking on the appropriate column header in the data
grid.
Data Types
When tag templates are created, the data type must be specified in accordance with the
expected data and the potential range of values that will need to be stored. The available
options are:
Table 8-1 Tag Template Data Types
Name Description
Description Descriptive text for the tag
LowEng Lower limit of engineering range for the tag
HighEng Upper limit of engineering range for the tag
Units Engineering units for the tag
For OPC tags the reference data information is read from the PCS when tags are first created,
unless the associated OPC server does not support property access. For manual and
calculated tags the reference data information is entered via the Tag Editor (Chapter 11), and
Tag Generation (Chapter 10).
OPC Communication
This comprises the Update Rate and the Percent Deadband, and is only pertinent to OPC tags.
The Update Rate is used to specify the OPC update rate. The Percent Deadband is used to
specify the OPC Deadband as a percentage of the engineering range. OPC items can be
written to the OPC server, if the user has the authority. The OPC server uses the Update Rate
and Percent Deadband values to determine, for each tag scanned, if Exaquantum needs to be
notified of any change.
The value in the Update Rate field specifies the minimum time interval between OPC
notifications, i.e. the fastest rate at which data changes may be sent to Exaquantum. If the
value changes many times within the update period, only one value scanned by the OPC
server at an appropriate time is sent to Exaquantum.
Note: When the update rate of an OPC Tag is changed from Low Scan (more than 10
seconds) to High Scan (less than 5 seconds), please monitor the QHistorianData
database space used (before and after the update rate change). This is because the
rate at which database space is used may increase after changing the update rate. An
alternative is to delete the OPC tag, change the update rate, and then re-add the tag.
The disadvantage of this alternative is that the tag history will be lost once the tag is
deleted.
The Percent Deadband is an upper and lower threshold pair, expressed as a percentage of the
tag's engineering range. A value is only notified to Exaquantum if, at the end of the
update period, it has changed by more than the Deadband.
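As a rough sketch of the deadband test an OPC server applies (an illustration only; the
function and parameter names below are invented and do not belong to any Exaquantum or OPC
API):

```python
def exceeds_deadband(last_sent, current, eng_low, eng_high, percent_deadband):
    """Return True if the change since the last notified value exceeds the
    deadband, expressed as a percentage of the engineering range."""
    eng_range = eng_high - eng_low
    threshold = eng_range * percent_deadband / 100.0
    return abs(current - last_sent) > threshold

# With a 0-100 engineering range and a 5% deadband, only changes
# larger than 5 engineering units are notified to Exaquantum.
print(exceeds_deadband(50.0, 53.0, 0.0, 100.0, 5.0))  # False: within deadband
print(exceeds_deadband(50.0, 56.0, 0.0, 100.0, 5.0))  # True: exceeds deadband
```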
A value can be selected from the Update Rate list by the user, but this may be modified by the
OPC Server when data is acquired using the OPC Communication setting. Such a
modification may occur if the OPC Server cannot support the value selected or the additional
sampling load. This behavior may vary from one OPC server to another. For Yokogawa
Exaopc, see the Yokogawa document NTPF100 Exaopc OPC Interface Package Guide
Manual, ref. IM 36J02A11-01E.
When a tag is configured to write to an OPC server, the Deadband value is automatically set
to 100, meaning that the default is then set to not read the value back from the OPC server.
This setting is made because it is strongly recommended to separate reading from writing,
due to a potential problem that occurs when:
The Exaquantum application writes more frequently than the update rate
The OPC server notifies data with a timestamp from an earlier update (as is the case with
Yokogawa Exaopc), resulting in obsolete data being received in error.
Users can change the Deadband value manually, at their own responsibility.
Note: Setting the Deadband to 100% may not necessarily prevent the OPC server from
notifying data to Exaquantum, if the OPC server does not fully support the
Deadband feature.
Aggregations
Aggregation Calculated Tags
A template can be created for Aggregation Calculated Tags. This allows the base
aggregation results to be generated by the tag's calculation script, rather than by the
automatic aggregation calculation. For example, to calculate the hourly Mean as half the
hourly Mean of Tag1 multiplied by the value of a manual entry tag ManTag, a new
aggregation calculated tag (CalcTag) is required with the following script:
If ([Root.Tag1.Aggregations.Hour.Mean.Value].Changed)
Then
[Result.Aggregations.Hour.Mean.Value]=
[Root.Tag1.Aggregations.Hour.Mean.Value]/2 *
[Root.ManTag.Value]
End if
Note 1: This script is special because it contains a result that refers to the calculated
tag's own aggregation result [Result.Aggregations.Hour.Mean.Value].
Note 2: The If statement is required to ensure that the result of the calculation is only
calculated on the hour boundary. Without it, the script would fire and generate
the aggregation result at an arbitrary time, whenever ManTag changed.
To use this option, specify the aggregation data creation method during each aggregation
period on the Aggregation tab sheet. If the creation method is Auto (default), aggregation
data is automatically calculated and created. If the creation method is Calculation, data
during the relevant aggregation period is created from the script. The creation method can be
changed to Calculation only if the selected Tag type is Calculation and the relevant
aggregation period has been selected.
The calculation formula of an aggregation calculated tag needs to calculate and create all
the selected basic aggregation result data. For example, if a tag template has the
following definition:
Generate From Calculation is selected
Hour, Day and Month production calendar periods are selected
Mean and Summation aggregations are selected
The correct script would be:
If [Root.Tag1.Aggregations.Hour.Mean.Value].Changed then
[Result.Aggregations.Hour.Mean.Value] = (some value)
[Result.Aggregations.Hour.Summation.Value] = (some value)
End if
In the above example, if only Mean is written (i.e. without the second [Result.] line),
the Mean aggregation result of Day (and Month) would be 0, with Bad quality.
Aggregation Calculations
The options vary depending on which Aggregation type is selected.
Table 8-3 Aggregation Types
These aggregation values are calculated and stored for each Aggregation period.
Note: A PercentTimeGood aggregation is always created if any aggregation calculation is
selected.
Continuous
There are six aggregation calculations associated with Continuous:
Minimum calculates and stores the minimum value achieved during each aggregation period
selected. The timestamp indicates when the minimum value was recorded.
Maximum calculates and stores the maximum value achieved during each aggregation
period selected. The timestamp indicates when the maximum value was recorded.
Mean calculates and stores the arithmetic mean achieved during each aggregation period
selected
The calculation of the Mean is based on the difference between the timestamp of the
current input notification and the equivalent timestamp from the previous input
notification, as follows:
Mean = ( ( Mean * Ttotal ) + ( Vn-1 * ( Tn - Tn-1 ) ) ) / ( Ttotal + ( Tn - Tn-1 ) )
Where:
Mean Is the accumulating mean.
Ttotal Is the accumulated time for which the input was valid, i.e. its data quality was
either GOOD or Uncertain.
Vn-1 Is the value of the input at the last time notification.
Tn-1 Is the timestamp of the input at the last time notification.
Tn Is the timestamp of the input at this time notification.
Standard Deviation calculates and stores the standard deviation achieved during each
aggregation period selected
Note: Mean must also have been selected for Standard Deviation to work.
The standard deviation builds upon the Mean calculation shown above:
SumOfXSquared = SumOfXSquared + ( ( Vn-1 )^2 * ( Tn - Tn-1 ) )
StdDev = Sqrt( ( SumOfXSquared - ( Ttotal * Mean^2 ) ) / ( Ttotal - 1 ) )
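The incremental Mean and Standard Deviation formulas above can be sketched as follows (a
simplified illustration only; quality handling, aggregation period boundaries and the
100-nanosecond time units are omitted, and the sample data is invented):

```python
import math

def time_weighted_stats(samples):
    """samples: (timestamp, value) pairs in ascending time order.
    Applies the incremental Mean/StdDev formulas shown above:
    each value Vn-1 is weighted by the time it was held, Tn - Tn-1."""
    mean = 0.0
    t_total = 0.0
    sum_x_squared = 0.0
    for (t_prev, v_prev), (t_now, _) in zip(samples, samples[1:]):
        dt = t_now - t_prev
        mean = (mean * t_total + v_prev * dt) / (t_total + dt)
        sum_x_squared += v_prev ** 2 * dt
        t_total += dt
    std_dev = math.sqrt((sum_x_squared - t_total * mean ** 2) / (t_total - 1))
    return mean, std_dev

# A constant value of 10 held for 10 time units gives Mean = 10, StdDev = 0
m, s = time_weighted_stats([(0, 10.0), (5, 10.0), (10, 10.0)])
print(m, s)  # 10.0 0.0
```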
Spot Value reads the value at the end of the period and stores it at the end of each
aggregation period selected. The value at the beginning of the period may be selected by
setting the following registry value to 1 (the default value is 0)
HKLM\Software\Quantum\Server\AggSpotValueAtPeriodStart = 1
The Exaquantum server must be restarted for a change in this value to take effect.
Summation calculates and stores the value totalized over each aggregation period selected
There are two ways used to calculate the Summation that depend on the Summation
TimeFactor configured for the aggregation calculation:
When Summation TimeFactor > 0, Integration summation is used.
When Summation TimeFactor = 0 , Simple summation is used.
These two methods are explained below.
Summation TimeFactor > 0 (Integration Summation)
The Summation Time Factor (STF) option is used to correct the units of measurement in a
summation calculation. For example, if a flow rate is measured in Tonnes per hour, and
the raw value represents this summation value, then the Hour option should be selected in
the STF list to get the correct values (in Tonnes) for the aggregations. The sum of the
values over the aggregation period is divided by the STF to correct the final value. The
STF may be set to 1 Day for rates measured per day; this is a fixed 24-hour period.
Setting the STF to 0 gives a simple summation of individual new values over the period.
The time difference between the timestamp of the current input notification and the
timestamp of the previous input notification is used to calculate the contribution towards
the summation.
Sum = Sum + Vn-1 * ( Tn - Tn-1 )
Where:
Sum Is the accumulating summation.
Vn-1 Is the value of the input at the previous notification.
Tn-1 Is the timestamp of the previous notification, or the timestamp of the start of
the current aggregation period if no notifications have been received during the
current period. T is measured in units of 100 nanoseconds.
Tn Is the timestamp of the current notification.
At the end of the aggregation period, the summation is calculated as:
Summation = Sum / (10,000,000 * TimeFactor)
Summation TimeFactor = 0 (Simple Summation)
At the beginning of the aggregation period the internal summation is reset to zero. As
each input notification arrives, it is added to the summation. The summation result is the
accumulated summation at the end of the aggregation period.
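The two summation methods can be sketched as follows (an illustration only, using seconds
rather than the 100-nanosecond units described above; quality handling and period
boundaries are omitted, and the sample data is invented):

```python
def integration_summation(samples, time_factor_seconds):
    """Sketch of the STF > 0 case: each value is weighted by the time it
    was held, and the total is divided by the Summation Time Factor.
    samples: (timestamp_seconds, value) pairs in ascending time order."""
    total = 0.0
    for (t_prev, v_prev), (t_now, _) in zip(samples, samples[1:]):
        total += v_prev * (t_now - t_prev)
    return total / time_factor_seconds

def simple_summation(values):
    """Sketch of the STF = 0 case: each notified value is simply added."""
    return sum(values)

# A flow of 6 tonnes/hour held for one hour (STF = Hour) totals 6 tonnes
print(integration_summation([(0, 6.0), (3600, 6.0)], 3600))  # 6.0
```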
Differential summation A difference (current value - last value) during each selected
aggregation period is calculated and saved, taking the counter reset value into
consideration.
A summation reset value is a reset value on a summation counter used for calculation in
differential summation. In the case where the current value is less than the previous value,
then the result (of the aggregation) would be negative. In this instance, the aggregation value
is calculated by adding the summation reset value to the negative result.
Depending on the data type, the value is checked as follows: (In the case of out of range data,
an error message is stored in the Application Event log.)
Integer : 1 - 32767
Long : 1 - 2147483647
Float : 1 - 1000000
Double : 1 - 10000000000000
Some special instruments may collect values that differ from normal values with a certain
range of deviation. This data deviation can be ignored by setting a valid percent difference.
If a data deviation larger than the valid percent difference occurs during periodic data
collection, Exaquantum judges the data as invalid and does not perform the differential
summation process. The valid percent difference default value is 100 (%). Set a valid
percent difference between 0 and 100.00. The differential summation data may be rounded to
a number of decimal places. The default number of decimal places is 0.
Depending on the data type the number of decimal places is checked as follows:
Integer - 0
Float - between 0 and 6
Double - between 0 and 13
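A single step of the counter wrap-around handling described above might be sketched as
follows (an illustration only; the valid percent difference check, rounding and range
checks are omitted, and the sample values are invented):

```python
def differential_increment(previous, current, reset_value):
    """Sketch of one differential-summation step: the increment is
    (current - previous); if the counter has wrapped around
    (current < previous), the summation reset value is added to the
    negative difference to recover the true increment."""
    diff = current - previous
    if diff < 0:
        diff += reset_value
    return diff

# A counter with a reset value of 1000 wrapping from 990 back to 15:
# true increment is (1000 - 990) + 15 = 25
print(differential_increment(990, 15, 1000))  # 25
print(differential_increment(10, 40, 1000))   # 30 (no wrap)
```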
Discrete
There are three options associated with Discrete:
On State this field must contain a numeric value or text; this is used in the aggregation
calculations for State Count and State On Time.
State Count when checked the aggregation calculation will count the number of times
during the period that the tag changes from a value not equal to the On State value, to a
value equal to the On State value.
State On Time when checked, the cumulative time (in seconds) that the tag spent during
the period at the value (or state) specified in the On State field is calculated.
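The two Discrete calculations can be sketched as follows (an illustration only; quality
handling and period alignment are omitted, and the sample data is invented):

```python
def discrete_aggregations(samples, on_state, period_end):
    """samples: (timestamp_seconds, value) pairs in ascending time order.
    State Count counts transitions from a value not equal to the On State
    to a value equal to it; State On Time accumulates the seconds the tag
    spent in the On State up to the end of the period."""
    state_count = 0
    state_on_time = 0.0
    prev_t, prev_v = samples[0]
    # A sentinel at period_end closes the final interval
    for t, v in samples[1:] + [(period_end, None)]:
        if prev_v == on_state:
            state_on_time += t - prev_t
        if v == on_state and prev_v != on_state:
            state_count += 1
        prev_t, prev_v = t, v
    return state_count, state_on_time

# Tag is ON from t=10 to t=20 and again from t=30 to t=40 in a 0-60 s period
samples = [(0, "OFF"), (10, "ON"), (20, "OFF"), (30, "ON"), (40, "OFF")]
print(discrete_aggregations(samples, "ON", 60))  # (2, 20.0)
```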
Aggregation Periods
The aggregation periods related to the Tag Template are those that were set in the Production
Calendar (see Chapter 3).
8.3 Associated Screen
Version Labeling Dialog
This pop-up dialog is displayed when the Label Version button is pressed and contains a two-
column table. It allows the user to add more information to the Tag Template Version field
on the Tag Template.
8.4 Scenarios
The Tag Template is used to define the structure and property of a tag. The left section of the
Tag Template contains the list of templates that are available for viewing and/or selection.
Note 1: Any invalid tag templates are shown in red. (If changes are made to the Production
Calendar, for example, after tag templates have been created, some of the tag
templates may become invalid.)
Note 2: The tag template names starting with C3TC:: are used with the HIS Trend Tag
Generation function. Therefore, do not use any of them for a newly-created tag
template name.
For detailed information about the tag templates starting with C3TC::, refer to
Chapter 17 HIS Tag Generation.
These are some of the scenarios in which the Tag Template could be used:
Create a Tag Template, initially filling all of the fields.
Copy an existing Tag Template.
Change an existing template when:
The OPC data update rate needs to be tuned, for example there is a need for finer
monitoring; change the default rate of 1 minute to 5 seconds.
The Raw historian interval needs to be adjusted; for example On Change was
originally selected, but it consumes more disc space than is practical; change the rate
to 5 minutes.
Other aggregation calculation(s) and/or period(s) are required.
Note: Occasionally, when a tag modification to add an aggregation calculation (e.g.
mean, min, max) to an existing aggregation period is made, the following error
message will be reported in the Event Log:
Event ID = 62101 The timestamp of the triplets are out of range
This is a timing issue and does not cause any operational problems. The message
can be ignored.
Derive a new template based on an existing template when:
The Data Type was improperly defined, for example a typical numeric type is used to
get MODE data (string type).
Note 2: Altering the order alone does not create a new version of a template.
Function Block Template Version
Function Block templates are maintained by versions. When a template is created, a version
label is automatically assigned as an incremental number, starting from 1. This number is
displayed in the Function Block Template Version field (which is a drop-down list). When
any version of a template (either the latest version or an earlier one) is modified, a new
version is created and becomes the latest one at that point. If a version other than the latest is
modified, it is always automatically assigned the next incremental number. If the latest
version is modified, it is also assigned the next incremental number, unless no tags have
yet been built against the current version and no Tag Generation activities are currently
in progress, in which case the modification is applied to the current version.
9.3 User Interface and Associated Screens
User Interface
A list of the currently defined Function Block templates is displayed on the screen. Any new
Function Block templates that are created are added to this list. The screen contains various
control buttons. It also contains a table that shows the tags that are associated with the
currently selected function block template. New tags can be added, and existing tags can be
deleted or modified. Tags are identified by an internal number called the Tag ID. When a
new tag is created, Exaquantum automatically allocates the number, starting from one.
Associated Screens
Function Block Detail Screens
There are three Function Block Detail screens associated with the Function Block Template
tool; they allow the user to specify additional information about a tag. These three screens are
for OPC, Manual and Calculated tag types.
There are two types of values:
Initial values - copied and used only when function blocks are being created. They are:
Description
Units
Engineering High/Low
Value (for Manual tag type only).
Controlled values - affect function blocks both when they are created and afterwards; any
later changes are reflected in the relevant function blocks. This category includes
the calculation expression (Calculated tag type).
Tag details are defined in accordance with the associated tag templates; units and engineering
ranges are editable only when the template has these values selected.
The reference data of OPC tags is handled differently from that of other tag types. When you
define the OPC tag detail, the tag is not yet associated with a particular OPC gateway. An
OPC gateway, hence the OPC tags accessed through it, may or may not support the reference
data access (or property access). If it does, whatever is defined as the tag detail will be
overwritten by the OPC values when a tag is created. If it does not, the values defined here
will be used as the initial values.
In a single file, tag/function block names must be unique. If one name is made up of
full-size (zen-kaku in Japanese) katakana characters and another of half-size (han-kaku in
Japanese) katakana characters, they are regarded as identical names. Such a file is
therefore invalid, although the names are treated as distinct once they have been created
separately.
There is also an individual set of keywords that are applicable to each of the files; these are
described in Section 8.5. The file formats associated with OPC File Import, Function Block
Generation and Tag File Import are also described in Section 10.5.
10.3 User Interface
Associated with each Tag Generation program is a wizard that guides the user through the
various stages of the process.
Common Screens
Three screens are common to all four programs: the Job Viewer, Analysis Report and
Summary screens.
Whichever program is selected, the first screen displayed is a Job Viewer screen. This allows
the user to choose whether to rerun an existing job, or start a new job. There is a separate Job
Viewer screen for each process. If a job is cancelled part way through the wizard, the user is
returned to the appropriate Job Viewer screen.
Before generation actually takes place, an Analysis Report is displayed. This shows the
settings selected, providing a summary of what the outcome of the operation would be, if the
user chose to proceed. A table of detailed actions and reasons is also available. This is
important because it gives the user a chance to make a detailed check of what will take place
and, if required, change the operation or even cancel it.
When the job has finished, a Summary of the operation is displayed. This information is
retained for future reference.
Specific Screens
In addition to the common screens described above, each of the tag generation wizards
contains some specific screens:
Inclusion Definition
Filters File
Exclusion
Filters
When the Tag Editor is used to delete a tag, users are prompted to check for any existing
references (the Cross Reference feature). However, the tag generation programs do not
provide this feature, so users should check the references in advance. The practical way to
perform this check is to examine carefully the analysis report that is presented whenever
tag generation is attempted. When a broken shortcut is being restored, the analyzer
generates two actions to recreate the calculation. For example:
FB1
!shortcut
!calc : [Result] = [Parent.shortcut.value]
Assign FB1.shortcut to Root.Tag1, then remove Root.Tag1, and restore it by assigning it to
Root.Tag2.
10.5 File Formats
The format of the files differs for each of the file-based generation programs. These three
formats are described below.
OPC File Import File format
The file consists of two parts: a common definition and a candidates list.
Note: In this file format, each keyword should appear on a separate line.
Common Definition
There are a number of keywords that define how to generate:
Template
To specify the function block template from which function block(s) should be generated.
Item
To associate the OPC tags of the function block with the data item part of the OPC item ID.
See the figure below.
This keyword should appear as many times as the number of OPC tags in the function block.
Note: An item delimiter character should precede the data item string; in the case of
Exaopc, a dot (.) character. Also, make sure the case matches the OPC server's
specification.
For example, Exaopc requires a keyword like Item = tag 1, .PV
OPC Server
To select one of the OPC gateways defined in the OPC Gateway screen.
OPCitemIDPrefix
To put a string before the PCS tag name. See Figure 10-1 below.
Note: This is an optional keyword, unlike other mandatory keywords.
TargetPath
To specify the folder where the function blocks are to be created.
Candidates List
Place the keyword Candidates, followed by a list of PCS tags for which the function blocks
will be created.
Note: The list should not contain any duplicate PCS tag names.
To summarize, Figure 10-1 below shows how a function block name is assigned, and how an
OPC item ID is constructed.
Figure 10-1 Function block name assignment and OPC item ID construction
Example file contents:
Template =
Item = tag 1, .PV
Item = tag 2, .SV
Candidates
TEST001
TEST002
For candidate TEST001, the function block tag "tag 2" is created with the name
Folder_path . TEST001 . tag 2, and its OPC item ID is constructed as Area1/TEST001.SV
(where Area1/ is the OPCitemIDPrefix).
Function Block Generation File Format
FBItem
To specify an OPC tag name of a function block.
FBItemID
To specify an OPC item ID, including the data item name.
Note: Make sure the case matches the OPC gateway expectation.
Or
Shortcut
To specify an Exaquantum OPC, Manual or Calculated tag full path name to which a shortcut
is assigned.
OPCGateway
To specify an OPC gateway to which the OPC item ID is allocated.
Note 1: This is the required syntax, even when you use Shortcut.
Note 2: You cannot specify a shortcut that references another tag within the same file that is
going to be created together. The shortcut target must exist before running the
generation job.
Tag File Import File Format
A file can have definitions of as many tags as required. One definition of a tag consists
of a set of keywords, each of which is delimited by a comma (,), and may be split across
multiple lines by placing a delimiter at the end of a continued line. Keywords can be
placed in any order.
Note: When a definition spreads across lines, do not place a comment line, which begins
with a hash (#) character, between continued lines.
Tag
To specify the name of a tag to be created. A maximum of 32 characters, and NLS
compatible.
TagType
To select the type of a tag from: MAN for manual tag, CALC for calculated tag, or OPC for
OPC tag.
TagTempl
To specify a tag template based on which a tag will be created.
Description
To supply a description of a tag. A maximum of 256 characters, and NLS compatible.
<Optional>
FullPath
To specify where to create the tag. A maximum of 256 characters, and NLS compatible.
<Optional>
Note: If FullPath is not specified then the tag will be stored in the folder selected from the
Job Definition screen.
HighEng
To supply a high engineering range. <Optional>
LowEng
To supply a low engineering range. <Optional>
Units
To supply engineering units. A maximum of 256 characters, and NLS compatible.
<Optional>
Value
To specify the initial value of a Manual tag. <MAN only>
Script
To specify the calculation expression. If the expression must spread across lines, enclose the
expression by a pair of double quotes. <CALC only>
ItemID
To specify the OPC item ID. Make sure that the case matches the OPC server's expectation. <OPC
only>
OPCGateway
To specify the OPC gateway. <OPC only>
Note 1: Description, HighEng, LowEng, and Units (often called reference data) may or may
not be specified regardless of the tag template option selection. If the template does
not select some of this reference data, corresponding keywords will be ignored. If
the associated OPC gateway supports the reference data access, values specified in
the file will be ignored, and values read from the OPC gateway will be used.
Note 2: When reference data is selected in the tag template and the file does not supply
values for it, the created reference items will have the quality code BAD-Offline
until values are explicitly set with the Tag Editor or by some other means.
Note 3: Some values are classified as initial values and are used only when tags are first
created. Therefore the Tag File Import program will report no action when these
values are changed in a file and a job is re-run. Such values are:
Description, HighEng, LowEng, Units
Value (MAN only)
Note 4: Calculations must have all their dependent tags created before them inside the
import file. If not, errors are generated stating that some of the calculation's
inputs cannot be found. An example of this is:
TAG A (MAN)
TAG B (MAN)
TAG C (CALC) = TAG D + TAG B
TAG D (CALC) = TAG A + 1
The order of the tags in the file must be: Tag A, Tag B, Tag D, and then Tag C.
* Tag Editor is an interactive method and does not record the operation.
** Tag Editor does not provide a way to directly delete a function block tag. It is always
done through applying a modified function block template.
Offline Flag
There are situations when the input source(s) of OPC tags may be unstable (e.g. during
calibration), and the user needs to disconnect the input from Exaquantum to prevent the
unstable data being stored and affecting calculations. The Offline flag allows this to be done.
Through the Tag Editor the user can set both OPC and calculated tags offline. Manual tags
and shortcut tags cannot be set offline. Shortcut tags automatically reflect the offline status of
the original tag, as they simply point to it. Tags may be set offline at Function Block level as
well as at individual Tag level, and their status is displayed through the Tag Editor.
When a tag is offline, aggregations for that tag continue to run. Unless the user overrides the
tag quality, the Assumed data quality is treated as Uncertain by aggregations. There is no
check to prevent Aggregation Calculated Tags from being set offline. The user should be aware
that if an Aggregation Calculated Tag is set offline, dependent aggregations will be calculated
based on the previous values written to the base aggregation results.
Setting tags offline and online is a one-phase operation (the user does not have to confirm the
Offline or Online operation after selecting it). When an OPC tag is set offline, both raw data
and reference data items for the tag are set offline. Changing the configuration of a tag whilst
it is offline is allowed. Any changes made take effect when the tag is set online.
When created, an offline tag has a default quality code value of Assumed, which will persist
until it is explicitly changed. When the primary quality code is overwritten, the value of the
sub quality code becomes Replaced. The value of the sub quality code cannot be
overwritten.
When Exaquantum is shut down, the offline status of tags is preserved.
11.3 User Interface and Associated Screens
User Interface
The Tag Editor consists of two main sections:
Data Browser
Information pane.
Data Browser
The Data Browser displays a hierarchical structure that represents the Exaquantum process
data available. The user can navigate through the structure and display details of existing tags
and function blocks in the Information Pane.
Duplicate names are not permitted at the same level in the structure. Duplicate tag names
cannot be used in the same folder, nor duplicate subfolder names in the same folder. A tag
and a subfolder within a folder cannot have the same name either (e.g. if a tag and a subfolder
are added to a folder, they cannot both be called Calc1).
A pop-up menu is displayed if you right-click in the Data Browser. The options available to
the user depend on which node was selected.
Information Pane
The Information Pane displays details of the function block or tag selected in the Data
Browser. When the user selects either a function block or a tag in the Data Browser, the
following details are displayed in the Information Pane:
When a Function Block is selected, the Function Block Information and the Summary of
Function Block Tags are displayed in the information pane. The Summary of Function
Block Tags is a list of all the tags associated with the selected Function Block.
When a Tag is selected, the Tag Information and Function Block Information are displayed in
the Information Pane. (In this case the Tag Information is detailed.)
If a folder (or sub-folder) is selected, nothing will be shown in the Information Pane.
Associated Screens
The associated screens are:
Add Exaquantum Object (displays the objects that can be added using the Tag Editor):
    Add Folder
    Add Function Block
    Add Manual Tag
    Add Calculated Tag
    Expression Editor
    Expression Tester
    Add OPC Tag
Edit Exaquantum Object (no corresponding screen for this activity):
    Edit Function Block
    Impact Preview
    Edit Manual Tag
    Edit Calculated Tag
    Edit OPC Tag
Remove
Upgrade To Latest
Update Reference Data.
Some of these screens and processes are described below.
The Upgrade status of each object will be displayed; all successfully upgraded objects will be
highlighted in grey and will no longer be selectable.
Update Reference Data Screen
OPC tags can be supplied with reference data (description, units and engineering ranges) by
the OPC gateway, either when tags are initially created or when the OPC gateway and/or OPC
item ID are changed after the tags have been created.
Note: This feature is available only when the OPC gateway supports the property access
feature, which is the case for most servers, with the exception of PLCs and other simple devices.
Filling reference data with OPC values can be useful initially. As use of Exaquantum
continues, however, users are likely to set their own reference data to suit their
information utilization purposes. For this reason, the ability to update reference data
is provided as an advanced function. The screen allows you to select an OPC gateway
from which reference data is to be read, and updates all of the OPC tags associated with
that OPC server. A command line version of this facility is available with the OPC Reference
update tool; see Section 11.4.
Note: All OPC tags of the selected OPC server are updated. Any changes that have been
made locally in Exaquantum will be overwritten.
11.4 OPC Reference update tool
The OPC Reference update is a command line tool for updating Exaquantum OPC tag
reference data with the current values from one or more OPC gateways. It provides the same
functionality as the Update Reference Data screen, described above. The advantage of the
command line version is that it can be scripted, so it can conveniently be run to a schedule
that keeps the OPC tag reference data current.
If the command is issued alone, with no following names, then all the gateways that support
OPC Property Access will be updated.
The command line invocation is:
QOPCReferenceUpdate.EXE
The command should be followed by a white space delimited list of the gateway names of all
the OPC gateways that should be updated, as in the following example:
QOPCReferenceUpdate.EXE [OPC Server1] [OPC Server2] [OPC Servern]
Note: The actual OPC Gateway name should be specified, not the OPC Gateway computer
name. The names of currently configured OPC Gateways can be established by
using the OPC Gateways Configuration tool; see Chapter 2, OPC Gateway
Configuration.
11.5 Scenarios
The Tag Editor is used to view, create and edit folders, function blocks and tags. This
section gives some typical scenarios associated with the Tag Editor.
Folders
The folder mechanism can be used to group tags, for ease of reference. For example, manual
tags based on a particular user criterion can be placed in one folder, and OPC tags in another.
A user can add, rename and remove a folder.
Function Blocks
A user can add, rename, edit and remove a function block.
Tags
A user can add, edit and remove manual, calculated and OPC tags.
Note: Certain actions can only be carried out on flat tags (not on function block tags).
Caution
If an OPC tag is created for an OPC Gateway machine that does not exist on the network, and
this tag is then deleted, it will take up to 3 minutes for the Delete operation to complete. This
is much longer than normal.
Offline Flag
Tags may be set Offline at both Function Block level and individual Tag level. From the
Function Block edit screen, if Offline is selected all OPC and Calculated Tags within it are set
Offline. If Online is selected, all OPC and Calculated Tags within the Function Block are set
Online. The Online/Offline status is associated with each Tag. If a Function Block template
is updated and a new tag is added, the tag will be added Online, even if all other Tags in the
Function Block have been set Offline.
Note: The concept of the Offline Flag is described in Section 11.2.
[Diagrams: Exaquantum/PIMS Server 2, with Cross-server Calculations, reads Tag A from
OPC Server 1 and Tag B (Tag 2B) from OPC Server 2 to evaluate calculated tags such as
Tag 1A + Tag 1B = Tag 1C, and reads Tag D to evaluate Tag 2D + Tag 2A = Tag E.]
This feature provides an extremely versatile and powerful way of exploiting tag and function
block data across the system.
Writing back values
When the ability to write back values to the originating server is added to the above scenarios
then the usefulness and flexibility increases, but so does the complexity. In this sort of
environment it is important to be able to keep track of the data flows to avoid the
administrative task getting out of control.
The following two diagrams illustrate the point:
Figure 12-3 Example using read operation
[Diagram: Exaquantum/PIMS Server 1, with Cross-server Calculations, reads Tag A from
OPC Server 1 and Tag B (Tag 2B) from Exaquantum/PIMS Server 2 to evaluate
Tag 1A + Tag 1B = Tag X. Server 2, also with Cross-server Calculations, reads Tag D from
OPC Server 2 and Tag Y to evaluate Tag 2D + Tag Y = Tag E.]
In the example above, Tag Y on Server 2 is read from Tag X on Server 1.
[Diagram: as above, but Server 1 evaluates Tag 1A + Tag 1B = Tag X and writes the result
to Tag Y on Server 2 using a write operation; Server 2 evaluates Tag 2D + Tag Z = Tag E.
The two diagrams contrast a Grid architecture with a Hierarchy architecture.]
As shown above, in terms of the flow of data, the complexity of the Grid architecture is far
greater and more difficult to track and manage. However, the best choice in any given
situation will depend on many other factors as well, such as network topology, geographical
location, operational responsibilities, etc.
History catch up
As in the case of other OPC servers supporting Historical Data Access, when data is missed
because of a scheduled or unscheduled shutdown, History Catch-up (if enabled) will
automatically recover as much data as possible. This involves the computer that has missed
data actively fetching the missing items from the source. However, the server has no
knowledge of what data would have been transferred to it using data write back;
consequently, that form of missing data will not be recovered by History Catch-up. For a full
description of History Catch-up, see the Exaquantum Engineering Guide Volume 1
(ref: IM 36J04A15-01E).
Parameter                 Value
OPC Item ID               Fully qualified path of the tag on the remote Exaquantum server.
                          For example: Root.FunctionBlock.Tag.Value,
                          Root.FunctionBlock.Tag.Aggregations.Hour.Maximum.Value.
                          NOTE: This is not case sensitive.
Engineering Units         Set by system; derived from the OPC Server if specified by the
                          Tag Template.
Engineering Range Low     Set by system; derived from the OPC Server if specified by the
                          Tag Template.
Engineering Range High    Set by system; derived from the OPC Server if specified by the
                          Tag Template.
Parameter                           Value
Type                                Exaquantum
Enable Exaquantum Alarm and Events  Not available, as Exaquantum does not support Alarm
                                    and Events.
Chapter 13 Expressions
13.1 General
The Expression Builder provides a means to create, edit and test expressions for Exaquantum
calculated tags. The building of an expression utilizes two screens:
Expression Editor, for creating or editing an expression
Expression Tester, for testing an expression.
These screens are used in conjunction with the Tag Editor and the Function Block Template.
Note: See the API Reference Manual (ref: IM 36J04A14-01E) for further details of some of
the functions referenced in this manual.
13.2 Concepts
Defining Expressions
Expressions are defined using Microsoft's Visual Basic (VB) script; a good knowledge of VB
script is therefore required to carry out these procedures. Refer to Microsoft documentation
for details of VB script, and to the API Reference Manual (ref: IM 36J04A14-01E).
Expressions in Exaquantum must conform to the following rules:
Expressions should contain at least one source tag. Tags are identified in the script by their
fully qualified path enclosed in square brackets:
[Root.Folder1.Foldern.FBName.Tagname.Value]
Note: Paths used in calculations must refer to an Item and not a Tag. This is typically
.Value, but other possibilities are the reference items .Description, .Units,
.High_eng or .Low_eng, as well as .Quality and .Timestamp. The Tag Editor
will add .Value when Tags are dragged from the Data Selector for use in
expressions.
Expressions can use relative addressing as well as absolute addressing. This is useful when
configuring a function block containing an expression, so that the block can be located
anywhere in the folder hierarchy.
Expressions should return a single value, and this should be assigned to the special keyword
[Result]. Additionally, expressions can write to tags other than the one they are associated
with, provided such tags are either OPC tags or Manual tags.
Long lines in expressions may be broken up and continued on the next line. To do this, end a
line with an underscore character.
Expressions cannot be recursive. In some cases such calculations are easily recognized; in
other cases, however, the outcome of a long reference chain may not be easy to identify.
Expressions can contain references to aggregation results.
Expressions can have a self-reference; a typical example is [Result] = [Result] + 1. More
complex referencing is also available.
Expressions can send user-defined events to the Event Management program so that particular
application programs are run to perform tasks.
When defining expressions, the user can make use of time functions and user library functions.
It is also possible for the user to create an aggregation calculated tag. This has an expression
that generates aggregation values in place of the default aggregation logic (see
Chapter 6 for further information).
Event-driven
Calculated tags are event-driven; when any of the inputs changes, the expression is evaluated.
At that point, inputs other than the one that fired are evaluated at their latest values.
Note 1: When an aggregation value is used in an expression, the expression is evaluated
when the aggregation period is due and the value has just been updated. If another
input fires the expression in the meantime, the aggregation within the expression
returns the last calculated value.
Note 2: When any one of the inputs is invalidated (which typically happens when a tag is
removed from Exaquantum), the expression will not fire until the expression is
edited and the reference is resolved. Invalidation also takes place when the
expression is detected as recursive.
For the expression to identify the firing input, a statement such as the following is allowed in
the expression:
If [Root.Tag1.Value].Changed Then
End If
As Not [].Changed always returns True in these circumstances, if you want to check for
the negative the following form should be used:
If Not CBool([Root.Tag.Value].Changed) Then
Data Quality
The expression is re-evaluated when any input value (or quality) changes; the rules for setting
the quality are:
The default quality is determined as follows:
If all source item values are Good then the result is Good
If one or more source item value is Bad then the result is Bad
In all other cases the result is Uncertain.
Unless the expression explicitly overrides this default, and no error occurs during
execution, the default value is used as the Quality Code.
Note: If an error occurs while executing the expression (e.g. division by zero), the quality
is always Bad (sub quality Bad calculation).
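The default quality rule above can be sketched as follows; the string constants stand in for Exaquantum's actual quality codes and the function name is illustrative:

```python
# Sketch of the default quality rule for calculated tags:
# all inputs Good -> Good; any input Bad -> Bad; otherwise Uncertain.
# The strings stand in for Exaquantum's quality codes; the function
# name is a hypothetical example.
def default_result_quality(input_qualities):
    if all(q == "Good" for q in input_qualities):
        return "Good"
    if any(q == "Bad" for q in input_qualities):
        return "Bad"
    return "Uncertain"

print(default_result_quality(["Good", "Good"]))       # Good
print(default_result_quality(["Good", "Bad"]))        # Bad
print(default_result_quality(["Good", "Uncertain"]))  # Uncertain
```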
The expression can explicitly specify the quality code. To set the quality code, the
expression should make use of the available helper functions, such as:
[Result].Quality = RawQuality (qdaQualityGood)
[Result.[Parent.Tag1.Value]].Quality = RawQuality (qdaQualityBad,
qdaSecondaryQualityBadCalc)
Data Timestamp
The result tag (or tags) is assigned the timestamp of the source tag that caused the expression
to be evaluated.
Under certain circumstances the result tag or tags may be updated based on the current system
clock. This may happen:
If the expression has no source tags, e.g. it is a constant value
When the calculated tag is initially created.
In the Expression Tester (where the timestamp of the source tags may be adjusted by the user)
the result tag or tags are assigned the most recent timestamp of the source tags. If the
expression has no source tags, for example it is a constant value, then the result tag or tags are
assigned a default timestamp of 4th May 1957.
Event Generator
Event Generator is a mechanism by which user application programs are triggered by various
source events. Expressions allow users to explicitly generate such events, possibly depending
on the condition. In order to generate events, call a function within expressions as follows:
Qcalc.SendEvent BSTR_parameter, optional_parameters
where BSTR_parameter is a required string parameter and optional_parameters are
up to eight parameters of any type.
Typical examples are:
If [root.tag1.value] > 10 Then
    Qcalc.SendEvent "State on", [root.tag2.value], "comment"
Else
    Qcalc.SendEvent "State off", [root.tag2.value], "comment"
End If
Note: In this example, the second parameter is resolved into the item ID (numeric).
For more information about Event Generator, refer to Chapter 15 Event Generator.
Timer Functions
Types of Timer
There are two types of timer available for use within scripting. Only one timer can be active
at a time.
Simple Timer
This requests that the calculation be notified once only after the specified time has elapsed.
Any existing timer is cancelled before a new one is started.
For example:
QCalc.StartTimer 60
This will result in the calculation being fired one minute after the current time.
Periodic Timer
This requests that the calculation be notified periodically at each period boundary. It takes
two parameters (Interval and optional Offset) that are specified in minutes; intervals greater
than 60 minutes are DST-aware.
For example:
QCalc.StartPeriodicTimer 60, 30
This will start notifications every hour on the half-hour.
QCalc.StartPeriodicTimer 480
This will start notifications at 08:00, 16:00 and 24:00, local time.
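The boundary arithmetic for the periodic timer can be sketched as follows. This is an illustration only: boundaries fall at offset, offset + interval, ... minutes past local midnight, and the sketch ignores the DST handling that QCalc.StartPeriodicTimer applies for intervals over 60 minutes:

```python
# Sketch of periodic timer boundaries: with an interval and an optional
# offset in minutes, notifications fall at offset, offset + interval, ...
# past local midnight. DST handling is deliberately ignored here.
from datetime import datetime, timedelta

def next_boundary(now, interval_min, offset_min=0):
    midnight = now.replace(hour=0, minute=0, second=0, microsecond=0)
    elapsed = (now - midnight).total_seconds() / 60.0
    # Number of whole periods already completed since the offset.
    periods = int((elapsed - offset_min) // interval_min) + 1
    return midnight + timedelta(minutes=offset_min + periods * interval_min)

# StartPeriodicTimer 60, 30 -> every hour on the half-hour:
print(next_boundary(datetime(2024, 1, 1, 10, 40), 60, 30))  # 11:30
# StartPeriodicTimer 480 -> 08:00, 16:00 and 24:00:
print(next_boundary(datetime(2024, 1, 1, 10, 40), 480))     # 16:00
```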
Column    Comments
Name      For each object configured within the CalcLibrary table, ScriptedCalcs loads the
          type library details to pass to the Script Engine, and adds each object as a
          named item.
If the IsGlobal flag is set, a single instance is created that is shared by all calculations.
Otherwise each calculation has its own instance, created when the calculation is started and
released when the calculation is shut down.
If the IsNamed flag is set, the ObjectName must be used to reference the object from script,
e.g. QQualityHelper.IsGood(lQual). Otherwise object properties and methods may be
referenced like intrinsic functions; however, care must be taken to avoid namespace conflicts,
e.g. IsGood(lQual). In the case of a namespace conflict, the behavior is undefined.
Note: It is recommended that IsNamed be set to TRUE to avoid namespace conflicts.
After registering libraries in the CalcLibrary table, the Exaquantum server must be restarted
for the changes to take effect.
Library functions should be set up carefully, as any function failure may cause serious
damage to the Exaquantum server. In some cases a function might take a significant amount
of time to execute, which could cause the script to time out, for example a complex history
query. When employing a library function of this nature, consideration should be given to
executing the function asynchronously, for example by using Event Generator.
13.4 Scenarios
Using Aggregation Results in Expressions
Expressions can use aggregations to update their own value. For example, if a raw OPC tag
value is shown in L/hour, the expression shown below can be used to convert it into kg/hour
by multiplying by a density value. Use the following syntax:
[Result] = [Root.Folder1.Tag1.Aggregations.Hour.Mean.Value]
* [Root.Folder1.Tag2.Value]
Root.Folder1.Tag2.Value is the constant, non-changing, density value by which the Tag1
aggregation is multiplied.
The expression will be updated each time the value of the Tag1 aggregation result is updated.
Note: If a calculated tag is configured to generate its own aggregation values, such as a
summation, the aggregation may not be calculated properly. This is because of
potential mismatches in the timing of aggregation calculations, which could result
in a new aggregation becoming available for a calculated tag just after its own
aggregation calculation has been performed. To generate a calculated tag's
aggregation(s) properly, use the Aggregation Calculated Tag mechanism.
See Section 8.2 Concepts, Aggregations, Aggregation Calculated Tags.
Using the QualityHelper in Expressions
The QualityHelper library provides functions that allow simple, clear checking of the quality
values of items, and these can be used directly in expressions. The API Reference Manual
(ref: IM 36J04A14-01E) provides details of these methods. See the example below for an
illustration.
Using Quality and Timestamp in Expressions
Expressions can use the quality or timestamp attributes of an item instead of, or as well as,
the value. This applies to both raw and aggregation items, as illustrated in the following
examples.
If IsBad([Root.Folder1.Tag1.value].Quality) then
[Result] = [Root.Folder1.Tag1.Value]
else
[Result] = [Root.Folder2.Tag2.Value]
End if
[Result] = [Root.Folder1.Tag2.Value].Timestamp
+ Root
    + Folder1
        OPCTag
        CalcTag
        + FunctionBlock1
            FBOPCTag
            FBManTag
            FBCalcTag
When CalcTag is defined as a calculated tag, it may contain the following expression:
[Result] = [Parent.OPCTag.Value]
When FBCalcTag is defined as a calculated tag, it may contain the following expression:
[Result] = [Parent.FBOPCTag.Value]
If you move the parent folder or function block to a new location, expressions containing
such relative paths will remain valid.
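The resolution of a relative reference against the calculated tag's own location can be sketched as follows; resolve() is a hypothetical helper for illustration, not part of the Exaquantum API:

```python
# Sketch: resolving a [Parent.X] relative reference against the location of
# the calculated tag that contains it. resolve() is a hypothetical helper,
# not part of the Exaquantum API.
def resolve(reference, own_path):
    # own_path is the full path of the calculated tag, e.g.
    # "Root.Folder1.FunctionBlock1.FBCalcTag".
    parts = reference.split(".")
    if parts[0] == "Parent":
        container = own_path.rsplit(".", 1)[0]  # drop the tag's own name
        return container + "." + ".".join(parts[1:])
    return reference                             # already an absolute path

print(resolve("Parent.FBOPCTag.Value",
              "Root.Folder1.FunctionBlock1.FBCalcTag"))
# -> Root.Folder1.FunctionBlock1.FBOPCTag.Value
```

Because the reference is rebased on the tag's current container, moving the container leaves the expression valid, as described above.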
When you write to another tag within the same function block, the expression will look like:
[Result.[Parent.FBManTag.Value]] = [Parent.FBOPCTag.Value]
Note: In function blocks, all tags referenced by a calculated tag in the same function
block should appear before the calculated tag. To adjust the order, the Function
Block Template program provides an Evaluation Order button.
Self-referencing Expressions
Expressions allow self-referencing calculations. Initially this looks like a recursive
calculation, which is not allowed, but the difference is that an output designation is used on
the right-hand side of the expression as well.
One of the simplest self-reference expressions is:
[Result] = [Result] + 1
Another example is a differential summation, where a raw value represents an
accumulated value and, by keeping the previous value in an Exaquantum Manual tag, the
expression calculates the difference between the previous value and the current one.
Dim temp
Dim diff
temp = [Parent.Source.Value]
diff = temp - [Result.[Parent.Previous.Value]]
If diff >= 0 Then
    [Result] = diff
Else
    [Result] = diff + [Parent.Previous.HighEng]
End If
[Result.[Parent.Previous.Value]] = temp
Note: If you use [Parent.Previous.Value] instead of [Result.[Parent.Previous.Value]], the
expression becomes a recursive calculation.
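The same rollover logic can be sketched outside the expression language. This is plain Python for illustration: the value stored in the Manual tag is passed in explicitly, and high_eng stands for the counter's engineering high (rollover) range:

```python
# Sketch of the differential-summation logic in plain Python.
# prev is the value held in the Manual tag; high_eng is the counter's
# engineering high (rollover) range.
def differential(current, prev, high_eng):
    diff = current - prev
    if diff < 0:              # the accumulator wrapped around
        diff += high_eng
    return diff

print(differential(150.0, 120.0, 1000.0))  # 30.0 (normal step)
print(differential(20.0, 980.0, 1000.0))   # 40.0 (rollover)
```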
Updating Other Tags
Expressions can update their own value and send a result to other tags. This is most useful in
these two cases:
To write a value to the control system via writing to an OPC tag
To store a temporary value in a Manual tag.
To direct a result to another tag, use:
[Result.[<Path>]]
where <Path> specifies the path to an existing tag.
For Example:
If [Root.Folder1.Tag1.Value] > 100 then
[Result.[Root.Folder1.Tag2.Value]] = 1
Else
[Result.[Root.Folder1.Tag2.Value]] = 0
End if
[Result] = [Root.Folder1.Tag1.Value] / 100
Chapter 14 Archiving
14.1 General
History Archiving in Exaquantum allows users to manage the way that their system's history
is accumulated and then stored on backup devices. This is done through the History
Archiving tool, which frees up disk space by allowing historical data (which may be tag
trend data or Alarm and Event history data) to be transferred to a backup device. This
backed up data may later be restored to the system, making it available again to be viewed
by clients.
Functionality is also provided to permanently delete data from the system.
14.2 Concepts
The Archiving tool allows the user to control which portions of historical data should be
backed up or restored. Each type of data that may be archived is allocated to a particular
Archive Group, as follows:
Table 14-1 Archive Groups
Raw Data:
Raw Fast / Raw Slow    Tag raw value history data (the value of a manual, calculated or
                       OPC tag). The OPC update rate is used to determine which group a
                       particular item belongs to. The threshold is set by a Registry key;
                       if the OPC update rate is less than or equal to the Registry key
                       value, the item belongs to the Raw Fast group (the default is 5
                       seconds).
Reference Data         Tag reference value history data, e.g. the units, description, low or
                       high engineering range for tags.
Hourly and Custom      Tag aggregation history data for Hourly or other user-defined
Aggregations           periods, e.g. 10 minute mean history.
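The Raw Fast / Raw Slow decision can be sketched as a simple threshold rule; the function name is a hypothetical illustration, with the default 5-second threshold taken from the description above:

```python
# Sketch of the Raw Fast / Raw Slow classification: an item whose OPC update
# rate is less than or equal to the Registry threshold (default 5 seconds)
# belongs to Raw Fast, otherwise Raw Slow. Function name is illustrative.
def raw_group(update_rate_s, threshold_s=5.0):
    return "Raw Fast" if update_rate_s <= threshold_s else "Raw Slow"

print(raw_group(1.0))   # Raw Fast
print(raw_group(10.0))  # Raw Slow
```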
The user can create an archive, create a backup, restore archived data and obtain information
about previously archived data. No modifications can be made to data held in an archive;
archives are read-only.
Raw Fast/Raw Slow               Archive candidates composed of Daily chunks starting at midnight.
Alarm & Events                  Archive candidates composed of Daily chunks starting at midnight.
Hourly and Custom Aggregations  Archive candidates composed of Daily chunks starting at midnight.
System Audit Trails             Archive candidates composed of Daily chunks starting at midnight.
When Archive candidates have been configured they may be turned into actual Archives.
Archiving activities are driven by first selecting an Archive Group, after which a reverse time
ordered list of Archives is displayed in the Archive List. Each archive covers a specific
period of history data. For users of Exaquantum history data, each of these Archives is in
one of the following basic states:
Online: the history data for the specified period is available for use by clients, e.g. it may be
trended or queried.
Offline: the history data for the specified period is not available for use by clients.
Note: For the Alarm & Event Archive Group, the Maximum number of online archives is 63.
Note: The Exaquantum installation sets the default maximum archive size to 640 MB, and
defines the maximum number of online archives to be 30. If a user attempts to create an
archive greater than the maximum size, or when there are already the maximum number of
archives online, a warning dialog is presented. It is recommended that users accept this
warning and continue with the create archive operation; the archive will be created if there is
sufficient disk space. If the maximum number of online archives needs to be increased, run
the System Configuration Tool and change the number of online archives in the Historian
tab. For details, refer to Chapter 13 System Configuration Tool in Engineering Guide Vol.3
Support Tools.
An Archive can exist in one of the following states:
Table 14-3 Archive States
State Description
Operation Example
Example 1: Indicated below are the time intervals for the auto archiving, auto backup,
and auto offline features.
Archive Period: 30 days
Retention Period: 90 days
Time interval until auto offline: 30 days
Figure 14-1 Archiving time intervals
[Timeline diagram showing the 90-day retention period, the 30-day archive period, and the
actual 30-day time interval until auto offline.]
IMPORTANT
When creating the SQL Backup Device, do not use the name Temporary Device. This is
because this name is used internally in the Archive Tool.
Contents button
When this button is clicked, the Contents dialog box is displayed if Device is selected.
This dialog can be used to see the list of archives in the selected device; for details, refer
to the Contents dialog box. The Server File Path dialog box is displayed if File is selected;
for details, refer to the section Define a file when archiving manually dialog box.
IMPORTANT
An error message is displayed in the following situation when Defining a file when archiving
manually is selected:
No file exists on the defined file path on the Exaquantum/PIMS server.
For each archive group selected from the Archive Groups list, a six-column, read-only table
shows information for the:
Candidates available for archiving
Existing archives.
An entry must first be selected from this table before archiving operations can be performed.
Operations
The Save operation can be performed on both archive candidates and On-line archives. In the
case of archive candidates, an archive will be created before the backup takes place.
Note 1: If the Automatic unrestore after save option is checked on the Advanced
Configuration dialog, the resultant status will become Offline.
Note 2: Only the oldest archive candidate is available for backing up.
The Restore operation may be performed on Off-line archives. The restore operation
reads the archive data from the currently selected device and creates an On-line archive. See
Note 1 below.
The Unrestore operation may only be performed on On-line archives that have been backed-
up. The operation removes the archive from disk, freeing up disk space.
The Delete button is used to remove the selected candidate from the archive. When the user
selects an archive candidate and clicks on Delete, a prompt to confirm the deletion is
displayed. Once confirmed, the candidate will be deleted from the archive.
The user can find archives that store data within a selected time range by using the controls in
the Search section.
NOTES:
1. If a restore is attempted for an archive that was backed up in SQL Server 2000 or earlier,
   it cannot be restored directly to Exaquantum. If such an operation is attempted, the user
   will be presented with an error dialog: The Database is incompatible with the current
   version of SQL Server. This is because the database format is incompatible with
   SQL Server 2012. Restoring these archives first requires use of the Rearchive Tool;
   see Chapter 15.
Associated Screens
Advanced Dialog
When the Advanced button on the Archiving screen is clicked, the Advanced dialog is
displayed. This dialog contains archiving tool settings that are infrequently changed and
that apply to the system as a whole (not to a particular archive group).
Maximum archive size
The Maximum archive size field defines the maximum archive size in megabytes. The default
is 640 MB.
Automatic unrestore after backup
When Automatic unrestore after backup is checked, an archive is offlined after being saved.
Default Folder for Manual Archive
The Default Folder for Manual Archive allows specification of the default folder where
manual archive files are saved.
Contents Dialog
When the Contents button on the Archiving screen is clicked, the Contents dialog is
displayed. It contains a list of archives available on the currently selected device (by reading
the backup media).
Group Advanced Settings Dialog
When the Group Advanced button on the Archiving screen is clicked, the Group Advanced
Settings dialog is displayed. It is used to configure the auto archiving functionality on a per
Archive Group basis.
The Retention Period is a multiple of the archive period; it is used to determine the amount
of time that data will be maintained online, before it can be considered as a candidate for
auto-archiving or auto-deletion.
The Automatic offline period is the delay following the retention period. This allows:
Backed up archives to be automatically offlined. (Auto-Backup set)
OR
Non-backed up archives to be automatically deleted. (Auto-Backup not set)
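Using the figures from Example 1 above (90-day retention period and a 30-day automatic offline period), the schedule arithmetic can be sketched as follows. The exact anchoring of these periods is an assumption for illustration; the sketch simply adds the configured periods to a daily chunk's date:

```python
# Sketch of the auto-archiving schedule arithmetic: a daily chunk of history
# becomes a candidate for auto-archiving once it is older than the retention
# period, and is auto-offlined (or auto-deleted) after a further automatic
# offline period. How the periods are anchored is an assumption here.
from datetime import date, timedelta

def schedule(chunk_day, retention_days=90, offline_delay_days=30):
    candidate_from = chunk_day + timedelta(days=retention_days)
    offline_from = candidate_from + timedelta(days=offline_delay_days)
    return candidate_from, offline_from

cand, off = schedule(date(2024, 1, 1))
print(cand)  # 2024-03-31
print(off)   # 2024-04-30
```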
IMPORTANT
Do not specify any writing medium such as DVD-R, DVD-RW as a destination folder.
Save File
A default save filename is presented for manual archives. The default is to use the archive ID
to identify the file, the filename format being <archive ID>.bak. If Use Archive ID is
unchecked, the user can specify their own file.
Use of Archive ID setting
This allows the archive backup file to be named using the archive ID. Use Archive ID is
checked by default; if it is unchecked, the user can specify the save filename for manual
archives.
Restore mode
Save folder
This specifies the folder containing the archive backup file to restore. It is initially set to
the Default folder for manual archive, as set in the Advanced dialog.
The save folder can be modified by either:
Direct entry into the field, or
Using the browse button [...]
Save File
The save file is the name of the archive file to restore. The default is to use the archive ID to
identify the file; the filename format being <archive ID>.bak. The user can specify their own
file, by direct entry into the Save File field.
Contents mode
Save folder
This specifies the folder containing the archive backup file whose contents are to be
displayed. It is initially set to the Default folder for manual archive, as set in the Advanced dialog.
The save folder can be modified by either:
Direct entry into the field, or
Using the browse button [...]
Save File
The save file is the name of the archive file for contents display. The default is to use the
archive ID to identify the file; the filename format being <archive ID>.bak. The user can
specify their own file, by direct entry into the Save File field.
14.4 Scenarios
The following are situations in which the Archiving tool would need to be used.
Back up a Section of History and Take it Offline
To free up disk space, the historian should be administered regularly by backing up old
history data and taking it offline. This backed-up data may subsequently be restored to the
system if it is necessary to make the data available again.
Restore a Backed-up Section of History
When data has been backed up and taken offline, it can be restored to the system.
Remove a Restored Section of History
A section of previously restored history can be removed, taking it offline again.
Make an extra Backup of a Section of History
An archive that has been backed up can be backed up additional times for added data integrity.
Back up an Archive that Failed to Back up
If the initial backup of an archive failed while it was being written to the backup device, the
archive will remain online with the status Online, not backed up. These archives must be
backed up again to take the history offline.
Mode Description
Check The check mode either displays or outputs to a CSV file the list of archives
within a backup device along with the compatibility with SQL Server 2012.
Rearchive This mode reads one archive from a SQL Server backup device, updates its
compatibility level to the version of SQL Server currently supported by Exaquantum,
and then backs up the modified archive database to a second backup device.
15.3 Security
The user must be a member of the local Administrators group, and one of the following:
QAdministratorsGroup (Legacy Security mode)
QTM_MAINTENANCE (Standard Security mode)
NOTES:
1. If an argument contains a space then it must be enclosed in double quotes.
2. If an output directory is specified, then a CSV file with the name of the source device is
written. If the filename already exists then it will be overwritten.
Output
The output of the check command is tabular data with the following columns:
ID: the ID of the archive
Compatible with SQL Server 2012: indicates whether the archive is compatible with SQL
Server 2012
The output can be displayed on screen as shown in Figure 15-1 or to CSV file as shown in Figure
15-2. In this example, archive 1 is incompatible with SQL Server 2012.
Figure 15-1 Rearchive console output
REARCHIVE /check Device A
Arguments
Table 15-3 Rearchive Tool rearchive mode arguments
Message Code
No Error (Success) 0
Exception (Unhandled error) 1
Source archive not found 2
Destination archive not found 4
Archive ID does not exist 8
Output directory does not exist 16
No archives found 64
Archive ID Invalid 128
Cannot start SqlExpress 256
Stored procedures not created 512
Access Denied 1024
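When scripting around the Rearchive tool, the exit codes above can be translated back to their message text. The helper below is a hypothetical Python illustration: the dictionary values are taken from the table, but the function name is invented, and since the manual does not state whether the tool ever combines codes bitwise, the sketch treats each code as a single value.

```python
# Message codes from the Rearchive tool's exit-code table.
REARCHIVE_CODES = {
    0: "No Error (Success)",
    1: "Exception (Unhandled error)",
    2: "Source archive not found",
    4: "Destination archive not found",
    8: "Archive ID does not exist",
    16: "Output directory does not exist",
    64: "No archives found",
    128: "Archive ID Invalid",
    256: "Cannot start SqlExpress",
    512: "Stored procedures not created",
    1024: "Access Denied",
}

def describe_exit_code(code):
    # Translate a Rearchive exit code into its message text; codes not
    # listed in the table are reported as unknown.
    return REARCHIVE_CODES.get(code, "Unknown code %d" % code)
```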
Screen Components
The Filter Condition Summary screen comprises two fields and a list, and several control
buttons.
Fields and Lists
Event Source
This drop-down list allows the user to select an OPC Gateway that has alarm & events
enabled or the Exaquantum Server.
Event Category
This field displays the available event categories for the selected event source. It allows the
user to select one of the event categories, to see the currently defined event conditions
(displayed in the event conditions table).
Event Conditions Table
This multi-column table displays all the currently defined event conditions for the currently
selected event category under the following columns:
Order: A number indicating the order in which the event conditions were created.
Application Name: Name of the application that is monitoring a message queue that the
condition will send messages to.
Description: A user-supplied description of the event condition, NLS compatible, maximum
256 characters.
Run mode: 'Enabled' is displayed when the event filter condition is checked for matching;
'Disabled' is displayed when the event filter condition has been registered but is not checked
for matching.
The entry in this column depends on whether or not the Disabled box has been checked in the
Conditions section of the Event Filter Condition Definition screen. The default is to display
Enabled.
To input information in any of these columns, the Event Filter Condition Definition screen
must be used (where these column headings are fields, lists or check boxes). Where no filter
conditions are displayed, click on the New button to input details; to change any details, click
on the Edit button.
Control Buttons
The control buttons that appear within the main list box, and apply to a single event filter
condition, are:
New
When clicked, displays the Event Filter Condition Definition screen. This allows the user to
specify a new event filter condition. New is enabled only after an Event Category has been
selected and highlighted.
Edit
When clicked, displays the Event Filter Condition Definition screen. This allows the user to
edit the configuration of a selected event filter. Edit is enabled only when an event filter has
been selected.
Delete
Allows a selected event filter to be deleted. A pop-up is displayed to prompt for confirmation.
Delete is enabled only when an event filter has been selected.
Re-load
Requests that event filter conditions be sent to the running Event Handler program and starts
screening alarm and event messages with the new event filter conditions.
Note: This is disabled when MSMQ is either not installed or not running.
Trace On or Trace Off
Toggles the message trace feature on and off. When Trace On is clicked, a pop-up is
displayed that allows a log file name and path to be specified. Event filter alarm & event
messages are then copied to this log.
The log file is also useful for test purposes. When the 'Send' button is selected in the
Simulation Screen, a test message is sent to the log file.
Note: This is disabled when MSMQ is either not installed or not running.
Screen Components
This screen comprises a list and several control buttons.
Fields and Lists
Application Definition Table
This multi-column table displays all the currently defined Applications under the following
columns:
Application Name: The name to be used for the defined application
Queue Name: The name of the MSMQ private queue that the application will be monitoring
for new event messages
Application Path: The path to the application executable to be used to monitor messages.
Control Buttons
The control buttons associated with the Application Definition section:
New
Creates a new application.
Delete
Allows the selected application to be deleted.
OK
Accepts any changes and closes the screen.
Cancel
Cancels any changes and closes the screen.
Apply
Accepts any changes.
Event Filter Condition Definition
This screen (Figure 16-3) is displayed when either the New button or the Edit button on the
Filter Condition Summary screen is clicked. It is used either to create a new event filter
condition or to edit an existing one.
Figure 16-3 Event Filter Condition Definition Screen
Screen Components
Control Buttons
OK
Accepts any changes and closes the screen.
Cancel
Cancels any changes and closes the screen.
Simulation
Click on this button to display the Simulation screen.
Note: This button is enabled only when MSMQ is installed and running properly.
Disabled
Selecting the 'Disabled' option means that the event condition will not be active and reports
will not be generated. This feature is useful when the event condition configuration cannot be
completed in one session.
Note: The text displayed in the Run Mode column of the Event Condition Summary screen is
determined by whether or not the Disabled box is checked. If it is not checked, the default
is to display Enabled.
Continue
If an event is expected to match further conditions, and so to generate more than one event
report, the Continue check box must be checked. This means that the search continues until
the next match is found, or the end of the conditions is reached.
Fields and Lists
Order, Event Source & Event Category
These fields display the Event Filter Order Number and the Event Source and Event Category.
Application Name
Select the application to receive notification when the defined event is matched.
Description
This field should contain a meaningful description of the event condition.
Condition Definition Table
This multi-column table displays and allows the definition of the event filter condition.
Opening Parenthesis (Column header is blank): The user can group conditions using
parentheses to form complex logical expressions. This field can contain zero or more
opening parentheses.
Attribute: This allows an event attribute or an Exaquantum Tag against which a match
condition is defined to be selected. When an attribute has been selected, a data type will
be shown adjacent to the name so that the user can properly enter the matching value.
OPE: An operator for comparison can be selected from: equal to (=), less than (<), less than
or equal to (<=), greater than (>), greater than or equal to (>=), not equal to (!=).
Note: A value of particular data type allows only a limited set of operators; e.g. the
Boolean type allows only equal to or not equal to.
Value: This specifies the matching value against the attribute. The value must be entered
properly in terms of the data type. A maximum of 32 characters is allowed; the field is
NLS compatible. If the value is a string then the wildcards * (to match zero or more
characters) and ? (to match any single character) may be used.
Note: A logical comparison is executed on a binary basis. A lowercase character is
considered to be different from its uppercase counterpart, a wide character alphanumeric
(full size, or zen-kaku in Japanese) is different from a narrow character alphanumeric
(half-size, or han-kaku in Japanese), and so on.
Closing Parenthesis (Column header is blank): The user can group conditions using
parentheses to form complex logical expressions. This field can contain zero or more
closing parentheses.
And/Or: This specifies the logical concatenation with a following condition, if required, of
either And, Or or a full stop (.) to signify that there are no further conditions. The order
of evaluation is always the order of appearance, unless specified with parentheses.
Control Buttons: The control buttons associated with the Conditions section:
If an Attribute is selected, it is entered into the selected condition. If the <Select Tag>
option is selected, the Tag Selector shown in Figure 16-5 is displayed.
Figure 16-5 Tag Selector Panel
From this form, the user can select an Exaquantum tag, which is then configured for the
current condition after OK is clicked.
Note: An event condition must also include one or more Alarm and Event fields.
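The wildcard and binary (case-sensitive) matching rules described for the Value column can be illustrated with a short Python sketch. The function is hypothetical and only mirrors the documented behaviour: * matches zero or more characters, ? matches any single character, and comparison is byte-for-byte, so case differs.

```python
import re

def matches(value, pattern):
    # Translate the filter pattern into a regular expression:
    # '*' -> '.*' (zero or more characters), '?' -> '.' (any single
    # character); all other characters are matched literally.
    regex = "".join(
        ".*" if ch == "*" else "." if ch == "?" else re.escape(ch)
        for ch in pattern
    )
    # fullmatch is case-sensitive, mirroring the binary comparison
    # described in the manual.
    return re.fullmatch(regex, value) is not None
```

For example, a pattern such as PIC*.ALM matches PIC100.ALM but not pic100.alm, because lowercase characters are considered different from their uppercase counterparts.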
Simulation Screen
The Simulation screen (Figure 16-6) is displayed when the Simulation button is clicked on
the Event Condition Definition screen. It allows users to manually set values for attributes of
an event condition report to test the process by sending a test message to the log specified in
the Event Summary screen, provided the Trace On option has been selected.
Figure 16-6 Simulation Data Screen
Fields
Attributes
This field displays a list of parameters associated with an event report when its condition has
matched.
Value
This field allows users to set values for the parameters to be included in the test message.
Control Buttons
Send
Sends the specified event with values to the MSMQ.
Close
Closes this screen.
Period Relationships
Collection Periods
HIS Exaquantum
1 second 10 seconds
1 second 1 minute
Setup
Figure 17-1 HIS Tag Generation window
Tags
The following tag details are displayed in the Tags section:
Licensed tags: The number of licenses corresponding to the current key code
Current tags: The number of licensed tags currently defined
Details
The following numeric values are displayed.
Number of current tags
The number of tags already generated for each collection period (update rate), for the
selected OPC gateway.
Number of tags to be generated
The number of CENTUM trend pens at each collection period.
(The tags to generate are checked in the tree structure showing tag candidates.)
Total
Total number of current tags and tags to generate
HIS Station
Enter the HIS computer name that holds the tag information. The HIS name can be typed
in, or the [...] button can be clicked to select a computer from the dialog.
Advanced button
Clicking the Advanced button will display the Minimum Collection Period window.
Please see the section, Collection Period for further details.
Browse HIS button
Reads tag information from the specified HIS and generates a list of candidates in a tree
view structure.
Generate button
Generates the selected HIS tags in Exaquantum.
Operation
This section lists the steps to generate Exaquantum tags from HIS tags.
1 Enter the HIS computer name.
2 Click the Browse HIS button. This will retrieve the list of HIS tags.
3 Select the HIS block, group and tags to be generated.
4 In the Tags section of the display, the Tags to be generated value reflects the number of
tags selected in step 3. If this amount plus the number of Current tags exceeds the licensed
tag count, a warning message will be displayed. This will not stop the tag generation
operation, but tags should be deselected until the total number of generated tags falls
within the licensed tag count.
5 If required, the Collection period can be adjusted.
6 Select the OPC Gateway which the tags to be generated will collect data from.
7 Click the Generate button to complete the tag generation process.
Note1: Tags with 1-second or 10-second collection periods will be collected at a 1-minute
period if the Minimum Collection Period is set to 1 minute.
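The effect of the Minimum Collection Period in Note1 amounts to taking the larger of the tag's configured period and the minimum. The following Python sketch is illustrative only; the function name and the use of seconds as the unit are assumptions.

```python
def effective_period_seconds(tag_period, minimum_period):
    # Tags configured faster than the Minimum Collection Period are
    # collected at the minimum instead; slower tags are unaffected.
    return max(tag_period, minimum_period)

# With a 1-minute minimum, 1 s and 10 s tags are collected at 60 s:
rates = [effective_period_seconds(p, 60) for p in (1, 10, 600)]
```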
Note2: If QHistorianData free space is less than 10% or 500 MB, whichever is the greater, no
new folders, tags, or function blocks can be created.
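The free-space rule in Note2 can be sketched as follows. This hypothetical helper simply checks free space against the larger of 10% of the volume size and 500 MB; the function name and byte-based units are assumptions for illustration.

```python
def creation_allowed(free_bytes, total_bytes):
    # Creation of new folders, tags, or function blocks is blocked when
    # QHistorianData free space falls below 10% of the volume size or
    # 500 MB, whichever is the greater.
    threshold = max(0.10 * total_bytes, 500 * 1024 * 1024)
    return free_bytes >= threshold
```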
Information about how many tags are generated at each collection period is displayed in the
To Generate column of the Details display fields for each OPC gateway. Note that this
number includes the number of existing tags (for example, if the number of existing tags is 5
and the number of non-generated tags is 6, then the number of tags to generate is 11).
How changes to tags on the HIS are reflected in Exaquantum
The following points describe how HIS tag changes will be reflected in the Exaquantum tags:
The HIS tag tree does not differentiate between generated tags and tags that have not been
generated; the selection behaves identically for both. Tags that remain deselected will not
be deleted from Exaquantum.
If the selected HIS tag already exists in Exaquantum no changes will occur during the tag
generation process.
Note: Changes to the HIS tag attributes (such as display range) or CENTUM tag attributes
(such as engineering unit, range and tag comments) will not be reflected in the
Exaquantum tags during generation. To reflect these changes, use the Tag Editor from
the Administration Tools.
Note: Care should be exercised when changing the collection period value. It will only
affect database space if the collection period is increased in frequency.