303.938.8282 x115
720.219.3773 (mobile)
jameskoopmann@confio.com
www.confio.com
James F. Koopmann
Director of Technical Consulting
James F. Koopmann
Where to Find Me on the NET
Problems
Cumulative statistics views only show totals since instance startup:

SELECT * FROM v$buffer_pool_statistics;

To see what happened between two points in time, snapshot the view and diff the snapshots:

SELECT b.name,
       1 - ((e.physical_reads - b.physical_reads) /
            ((e.db_block_gets - b.db_block_gets) +
             (e.consistent_gets - b.consistent_gets))) buffhit
FROM beg_buffer_pool_statistics b, end_buffer_pool_statistics e
WHERE b.name=e.name AND b.block_size=e.block_size;
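The beg_/end_ tables referenced in the diff query can be taken as simple CTAS copies of the v$ view; a sketch (the timing of the two snapshots is up to you):

```sql
-- Snapshot at time T1
CREATE TABLE beg_buffer_pool_statistics AS
  SELECT * FROM v$buffer_pool_statistics;

-- ... workload runs ...

-- Snapshot at time T2
CREATE TABLE end_buffer_pool_statistics AS
  SELECT * FROM v$buffer_pool_statistics;
```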
How to diff
SELECT estd_physical_reads
FROM V$DB_CACHE_ADVICE
WHERE advice_status = 'ON';

 Estd Phys
     Reads
----------
 343669940
 280783364
 240091867
 220733606
 208181172
 197842182
 190052917
 182180544
 176884743
 172420984
 165812231
 162626093
 158797352
 152735392
 149879874
 146571255
 143928671
 141908868
 139972381
Bottom line
1. Startup fallacy
2. Everyone is in a rush
3. Most don't want to do the job the right or proper way the first time
2. Take Snapshot at T1, Take Snapshot at T2, Compare T1 to T2
3. Use LogMiner: Search and Extract DDL from Current Redo Logs & Archived Logs
4. Streams: Set of database structures to capture and replicate changes from a source database to a destination database
View             Date Field?  What
---------------  -----------  ----------------------
DBA_OBJECTS      YES          CREATED, LAST_DDL_TIME
DBA_TABLES       Yes          LAST_ANALYZED
DBA_INDEXES      Yes          LAST_ANALYZED
DBA_TAB_COLUMNS  NO
DBA_IND_COLUMNS  NO
Compare the snapshots column by column, e.g. with DECODEs or predicates of the form t1.column != t2.column.
The Good - Build Yourself. No reliance on 3rd party software or database vendor
The Bad - Complicated code (you had better be good)
The Ugly - Oracle could change or add tables that are important to object change
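A sketch of the build-it-yourself comparison (the snapshot table names here are illustrative, not the presentation's):

```sql
-- Assumed: objects_t1 and objects_t2 are CTAS snapshots of DBA_OBJECTS
-- taken at T1 and T2. Objects whose LAST_DDL_TIME changed in between:
SELECT t2.owner, t2.object_name, t2.object_type,
       t1.last_ddl_time old_ddl_time, t2.last_ddl_time new_ddl_time
FROM   objects_t1 t1, objects_t2 t2
WHERE  t1.owner       = t2.owner
AND    t1.object_name = t2.object_name
AND    t1.object_type = t2.object_type
AND    t1.last_ddl_time != t2.last_ddl_time;

-- Objects created between the snapshots (present at T2, absent at T1):
SELECT owner, object_name, object_type FROM objects_t2
MINUS
SELECT owner, object_name, object_type FROM objects_t1;
```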
Create a dictionary
EXECUTE DBMS_LOGMNR_D.BUILD(
dictionary_filename => 'dictionary.log',
dictionary_location => '/ora/oradata/hcmc/log');
Start LogMiner
EXECUTE DBMS_LOGMNR.START_LOGMNR(
  DictFileName => '/ora/oradata/hcmc/log/dictionary.log');
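Between building the dictionary and starting LogMiner, the redo logs to mine must be registered; the DDL is then read from V$LOGMNR_CONTENTS. A sketch (the log file path is illustrative):

```sql
-- Register the first redo log (NEW resets the list); repeat with
-- Options => DBMS_LOGMNR.ADDFILE for additional logs.
EXECUTE DBMS_LOGMNR.ADD_LOGFILE( -
  LogFileName => '/ora/oradata/hcmc/redo01.log', -
  Options     => DBMS_LOGMNR.NEW);

-- After DBMS_LOGMNR.START_LOGMNR, pull the DDL out of the mined redo:
SELECT username, timestamp, sql_redo
FROM   v$logmnr_contents
WHERE  operation = 'DDL';

-- Release the session's LogMiner resources when done:
EXECUTE DBMS_LOGMNR.END_LOGMNR;
```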
Oracle Streams
Overview
Oracle Streams
A feature within the Oracle database that allows for the replication of database structures and
information between two separate databases
Oracle Streams
Directed Networks
A directed network is a networked system of hosts that allows information to be passed to a destination database whose host is not directly accessible from the source host.
Two methods of information propagation:
1. Queue forwarding
2. Apply forwarding
Streams Environment
[Diagram] On HCMC (source), a CAPTURE process (SET_UP_QUEUE, ADD_GLOBAL_RULES) picks up user DDL changes; PROPAGATION (ADD_GLOBAL_PROPAGATION_RULES) carries them to SAIGON (destination), where an APPLY process (SET_UP_QUEUE, CREATE TABLE history_ddl_lcrs) records them.
Streams Environment
Data Definition Language (DDL): SQL statements that affect the structure of database objects, such as CREATE TABLE, ALTER TABLE, DROP TABLE, and RENAME TABLE.
Streams Environment
Source database: the database that originates information to be shared within the Oracle Streams environment.
The Good - Able to report on every DDL statement issued without intervention
The Bad - Learning curve is a bit high
The Ugly - Intensive & cumbersome setup
Streams Environment
Software Requirements
Run catalog.sql & catproc.sql after you have upgraded to version 9.2.0.2
Streams Environment
Archive Log Requirement
1. The source database must run in ARCHIVELOG mode so the capture process can mine all redo with no data loss
Streams Environment
Parameter Requirements

Parameter                        Setting
-------------------------------  ----------------------------------------
COMPATIBLE                       9.2.0 or higher
JOB_QUEUE_PROCESSES              2 or higher
LOG_PARALLELISM                  1
LOGMNR_MAX_PERSISTENT_SESSIONS   equal to or higher than the number of
                                 capture processes
OPEN_LINKS                       4 or higher
PARALLEL_MAX_SERVERS             current value + (3 * capture processes)
                                 + (3 * apply processes)
PROCESSES                        current value + ((capture processes +
                                 apply processes) * 10)
SHARED_POOL_SIZE                 current size + ((capture processes +
                                 apply processes) * 10M); should be at
                                 least 100M
GLOBAL_NAMES                     TRUE
Intermission
Streams Setup
Create Administrator
Streams Setup
Grant Privileges to Administrator
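A rough sketch of creating and privileging a 9.2 Streams administrator like DDLMAN (the tablespace, password, and exact grant list are assumptions, not the presentation's):

```sql
-- Create the Streams administrator (names and password illustrative)
CREATE USER ddlman IDENTIFIED BY ddlman
  DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;

-- Basic privileges plus EXECUTE on the packages used in the setup below
GRANT CONNECT, RESOURCE, SELECT ANY DICTIONARY TO ddlman;
GRANT EXECUTE ON DBMS_STREAMS_ADM     TO ddlman;
GRANT EXECUTE ON DBMS_CAPTURE_ADM     TO ddlman;
GRANT EXECUTE ON DBMS_APPLY_ADM       TO ddlman;
GRANT EXECUTE ON DBMS_PROPAGATION_ADM TO ddlman;
GRANT EXECUTE ON DBMS_AQADM           TO ddlman;
GRANT EXECUTE ON DBMS_FLASHBACK       TO ddlman;
```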
Streams Setup
Grant Privileges to Administrator to Create Rules
2.
BEGIN
  DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege    => DBMS_RULE_ADM.CREATE_RULE_SET_OBJ,
    grantee      => 'DDLMAN',
    grant_option => FALSE);
END;
/
3.
BEGIN
  DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege    => DBMS_RULE_ADM.CREATE_RULE_OBJ,
    grantee      => 'DDLMAN',
    grant_option => FALSE);
END;
/
Streams Setup
Switch LogMiner out of the SYSTEM Tablespace
Why
1. When you create a capture or apply process, Oracle will create a subset of the data dictionary to keep track of changes to structures.
2. The SYSTEM tablespace may not have enough room for these tables.
How
EXECUTE DBMS_LOGMNR_D.SET_TABLESPACE('LOGMINER');
Streams Setup
LogMiner / Streams Issues
What
1. If you move LogMiner after you have captured or applied, you will lose the Streams dictionary changes that have been recorded.
2. The Streams data dictionary is not kept clean by Oracle, which puts greater strain on it and lets it grow uncontrollably.
To Do
1. When an object is no longer being used, clean out the Streams dictionary with DBMS_STREAMS_ADM.PURGE_SOURCE_CATALOG for that object.
Streams Setup
Database Link from source to destination
Why
1. For transporting the captured DDL from the source database to the destination database
How
1. CONNECT ddlman/ddlman@hcmc
2. CREATE DATABASE LINK saigon CONNECT TO ddlman IDENTIFIED BY ddlman USING 'saigon';
Streams Setup
Capture
1. CONNECT ddlman/ddlman@hcmc
2.
BEGIN
  DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'ddl_cap_table',
    queue_name  => 'ddl_cap_q',
    queue_user  => 'ddlman');
END;
/
3.
BEGIN
  DBMS_STREAMS_ADM.ADD_GLOBAL_RULES(
    streams_type       => 'capture',
    streams_name       => 'cap_ddl',
    queue_name         => 'ddl_cap_q',
    include_dml        => false,
    include_ddl        => true,
    include_tagged_lcr => false,
    source_database    => 'hcmc');
END;
/
Streams Setup
Propagation Rules
1. CONNECT ddlman/ddlman@hcmc
2.
BEGIN
  DBMS_STREAMS_ADM.ADD_GLOBAL_PROPAGATION_RULES(
    streams_name           => 'prop_ddl',
    source_queue_name      => 'ddl_cap_q',
    destination_queue_name => 'ddlman.ddl_apply_q@saigon',
    include_dml            => false,
    include_ddl            => true,
    include_tagged_lcr     => false,
    source_database        => 'hcmc');
END;
/
Streams Setup
Create Queue
1. CONNECT ddlman/ddlman@saigon
2.
BEGIN
  DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'ddl_apply_table',
    queue_name  => 'ddl_apply_q',
    queue_user  => 'ddlman');
END;
/
Streams Setup
Create Table to hold DDL
1. CONNECT ddlman/ddlman@saigon
2.
CREATE TABLE ddl_history(
  change_date          DATE,
  source_database_name VARCHAR2(128),
  command_type         VARCHAR2(30),
  object_owner         VARCHAR2(32),
  object_name          VARCHAR2(32),
  object_type          VARCHAR2(18),
  ddl_text             CLOB,
  logon_user           VARCHAR2(32),
  current_schema       VARCHAR2(32),
  base_table_owner     VARCHAR2(32),
  base_table_name      VARCHAR2(32),
  tag                  RAW(10),
  transaction_id       VARCHAR2(10),
  scn                  NUMBER);
Streams Setup
Logical Change Records (LCRs)
When the capture process mines information from the redo log, it reformats that information into LCRs. These LCRs are specific to the type of information captured and completely define the change that has occurred.
SYS.ANYDATA
This is an overloaded object type that can hold any scalar (NUMBER, VARCHAR2, CHAR, DATE) or user-defined data type. It has methods that let us query the true data type it holds, as well as methods to retrieve the values.
$ORACLE_HOME/rdbms/admin/dbmsany.sql
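A minimal sketch of pulling a typed value back out of a SYS.ANYDATA instance (a toy NUMBER here, standing in for the LCRs the DDL handler below receives):

```sql
SET SERVEROUTPUT ON
DECLARE
  d  SYS.ANYDATA;
  n  NUMBER;
  rc PLS_INTEGER;
BEGIN
  d := SYS.ANYDATA.CONVERTNUMBER(42);   -- wrap a NUMBER in an ANYDATA
  DBMS_OUTPUT.PUT_LINE(d.GETTYPENAME);  -- reports the held type, SYS.NUMBER
  rc := d.GETNUMBER(n);                 -- unwrap the value back into n
  DBMS_OUTPUT.PUT_LINE(n);
END;
/
```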
Streams Setup
Create Procedure to handle DDL
1. CONNECT ddlman/ddlman@saigon
2.
CREATE OR REPLACE PROCEDURE history_ddl(in_any IN SYS.ANYDATA) IS
  lcr      SYS.LCR$_DDL_RECORD;
  rc       PLS_INTEGER;
  ddl_text CLOB;
BEGIN
  rc := in_any.GETOBJECT(lcr);
  DBMS_LOB.CREATETEMPORARY(ddl_text, TRUE);
  lcr.GET_DDL_TEXT(ddl_text);
  INSERT INTO ddlman.ddl_history
  VALUES(SYSDATE, lcr.GET_SOURCE_DATABASE_NAME(), lcr.GET_COMMAND_TYPE(),
         lcr.GET_OBJECT_OWNER(), lcr.GET_OBJECT_NAME(), lcr.GET_OBJECT_TYPE(),
         ddl_text, lcr.GET_LOGON_USER(), lcr.GET_CURRENT_SCHEMA(),
         lcr.GET_BASE_TABLE_OWNER(), lcr.GET_BASE_TABLE_NAME(), lcr.GET_TAG(),
         lcr.GET_TRANSACTION_ID(), lcr.GET_SCN());
  COMMIT;
  DBMS_LOB.FREETEMPORARY(ddl_text);
END;
/
Streams Setup
Create Rules
1. CONNECT ddlman/ddlman@saigon
2.
BEGIN
  DBMS_STREAMS_ADM.ADD_GLOBAL_RULES(
    streams_type       => 'apply',
    streams_name       => 'apply_ddl',
    queue_name         => 'ddl_apply_q',
    include_dml        => false,
    include_ddl        => true,
    include_tagged_lcr => false,
    source_database    => 'hcmc');
END;
/
Streams Setup
Hook in the DDL handler
1. CONNECT ddlman/ddlman@saigon
2.
BEGIN
  DBMS_APPLY_ADM.ALTER_APPLY(
    apply_name  => 'apply_ddl',
    ddl_handler => 'ddlman.history_ddl');
END;
/
Streams Setup
Instantiate the Stream Environment
Definition
Before we can start capturing, propagating, and applying within our Streams
environment we must instantiate the destination database. This is nothing more
than registering the source SCN with the destination database so it knows the point
in time it can start applying captured information.
Streams Setup
Instantiate the Stream Environment
1. CONNECT ddlman/ddlman@hcmc
2. EXECUTE DBMS_CAPTURE_ADM.PREPARE_GLOBAL_INSTANTIATION;
3.
DECLARE
  iscn NUMBER;
BEGIN
  iscn := DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER();
  DBMS_APPLY_ADM.SET_GLOBAL_INSTANTIATION_SCN@saigon(
    source_database_name => 'hcmc',
    instantiation_scn    => iscn,
    apply_database_link  => 'saigon');
END;
/
Streams Setup
Start the Apply Process
1. CONNECT ddlman/ddlman@saigon
2.
BEGIN
  DBMS_APPLY_ADM.START_APPLY(
    apply_name => 'apply_ddl');
END;
/
To stop the apply process later:
BEGIN
  DBMS_APPLY_ADM.STOP_APPLY(
    apply_name => 'apply_ddl');
END;
/
Streams Setup
Start the Capture Process
1. CONNECT ddlman/ddlman@hcmc
2.
BEGIN
  DBMS_CAPTURE_ADM.START_CAPTURE(
    capture_name => 'cap_ddl');
END;
/
To stop the capture process later:
BEGIN
  DBMS_CAPTURE_ADM.STOP_CAPTURE(
    capture_name => 'cap_ddl');
END;
/
OEM
[Screenshots: OEM views of the Streams environment and its Capture, Propagate, and Apply processes]
Sample contents of the ddl_history table:

SOURCE LOGON_USER COMMAND_TYPE      OWNER  NAME             TYPE
------ ---------- ----------------- ------ ---------------- -----
HCMC   SYS        CREATE USER              AA               USER
HCMC   SYS        CREATE TABLESPACE
HCMC   SYS        CREATE USER              TEMPUSER         USER
HCMC   SYS        ALTER USER               SCOTT            USER
HCMC   SCOTT      CREATE TABLE      SCOTT  DDL_CHECK_TABLE  TABLE
HCMC   SCOTT      ALTER TABLE       SCOTT  DDL_CHECK_TABLE  TABLE
HCMC   SCOTT      TRUNCATE TABLE    SCOTT  DDL_CHECK_TABLE  TABLE
HCMC   SCOTT      ALTER TABLE       SCOTT  DDL_CHECK_TABLE  TABLE
HCMC   SCOTT      DROP TABLE        SCOTT  DDL_CHECK_TABLE  TABLE

DDL_TEXT
---------------------------------------------------------------------------
CREATE user aa identified by VALUES '1468620FBA6271E8'
create temporary tablespace temp01
create user tempuser identified by VALUES '2B4C9C62A2919AEF'
alter user scott identified by VALUES 'A7E7E0150C6D5EF3'
CREATE TABLE DDL_CHECK_TABLE (COL1 NUMBER)
ALTER TABLE DDL_CHECK_TABLE ADD (COL2 VARCHAR2(500))
TRUNCATE TABLE DDL_CHECK_TABLE
ALTER TABLE DDL_CHECK_TABLE DROP COLUMN COL2
DROP TABLE DDL_CHECK_TABLE
[Chart: CREATE INDEX wait time over time]
Running Streams
DDL Types Captured
CREATE/ALTER/DROP Tables includes table comments
CREATE/ALTER/DROP Tablespace (requires global rules to be set)
CREATE/ALTER/DROP Indexes
CREATE/ALTER/DROP Triggers
CREATE/ALTER/DROP Views
CREATE/ALTER/DROP Synonyms
CREATE/ALTER/DROP Sequences
Creation of PL/SQL packages, procedures and functions
Changes to users/roles
GRANT or REVOKE on users/roles
COMMIT
ROLLBACK
AUDIT (can be done on user objects)
Running Streams
DDL Types Captured But NOT Applied
CREATE, ALTER, or DROP MATERIALIZED VIEW LOG
CREATE, ALTER, or DROP MATERIALIZED VIEW
CREATE or ALTER TABLE for index-organized tables
CREATE SCHEMA AUTHORIZATION
CREATE or DROP DATABASE LINK
RENAME (use ALTER TABLE instead)
CREATE TABLE ... AS SELECT for clustered tables
Running Streams
DDL Types NOT Captured
CREATE or ALTER DATABASE
ALTER SESSION
ALTER SYSTEM
TRUNCATE
CREATE/ALTER/DROP ROLLBACK SEGMENT
CREATE/ALTER/DROP TYPE
CREATE/ALTER/DROP PROFILE
CREATE/DROP LIBRARY
CREATE/DROP DIRECTORY
SET ROLE
SET TRANSACTION
SET CONSTRAINT
CREATE CONTROLFILE
CREATE SPFILE
CREATE PFILE
ANALYZE
EXPLAIN
CALL
PL/SQL procedural calls
LOCK TABLE
Running Streams
Problems You May Encounter
Setup / Running Streams
1. Status of DBA_CAPTURE & DBA_APPLY were ABORTED
2. ORA-01925: maximum of 30 enabled roles exceeded
3. What object is that?
4. Mixed-case GLOBAL_NAME causing a mismatch between Streams and LogMiner
Remedy
1. Stop the capture and apply processes and start them again
2. Increase the current value of MAX_ENABLED_ROLES
3.
4.
Running Streams
Problems You May Encounter
Bugs
1. ANALYZE statements are not propagated to the target database in a Streams environment
2.
3.
4.
Remedy
1. Can use DBMS_STREAMS_ADM.ADD_TABLE_PROPAGATION_RULES.
2. Do not use dynamic DDL
3. Ensure that no DDL has been issued around scheduled shutdowns.
4. Get it right the first time or choose a version naming scheme.
Running Streams
Monitoring
Just some of the data dictionary views useful for monitoring:
DBA_QUEUES
DBA_QUEUE_TABLES
DBA_APPLY
DBA_APPLY_PARAMETERS
DBA_CAPTURE
DBA_CAPTURE_PARAMETERS
DBA_PROPAGATION
DBA_APPLY_ERROR
DBA_RULES
DBA_RULE_SETS
DBA_RULE_SET_RULES
DBA_JOBS
DBA_QUEUE_SCHEDULES
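A few sketch queries against these views (the column choices are typical; verify against your release's reference):

```sql
-- Is each piece of the Streams environment alive?
SELECT capture_name, status FROM dba_capture;
SELECT apply_name, status FROM dba_apply;
SELECT propagation_name, destination_dblink FROM dba_propagation;

-- What went wrong on the apply side?
SELECT apply_name, local_transaction_id, error_message
FROM   dba_apply_error;
```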