Issue with Oracle Autonomous Database


Kurt Aquilina

Feb 21, 2024, 7:47:10 AM
to debezium
Dear Debezium Team,

We are currently testing the Debezium Oracle source connector against an Oracle Autonomous Database environment to capture data changes. However, we are encountering an error related to insufficient privileges or an incorrect configuration. First of all, is Oracle Autonomous Database supported by Debezium?

We are using the following database configuration:

"database.user": "",
"database.password": "",
"database.url": "",
"database.dbname": "",
"database.pdb.name": ""


When attempting to connect, we encountered the following error:

Caused by: Error : 1031, Position : 0, SQL = alter session set container=cdb$root, Original SQL = alter session set container=cdb$root, Error Message = ORA-01031: insufficient privileges.


Our concern is that on an Oracle Autonomous Database, 'ALTER SESSION SET CONTAINER' cannot be executed because access to the root container (CDB$ROOT) is restricted.

Thanks in advance.

Chris Cranford

Feb 21, 2024, 9:21:37 AM
to debe...@googlegroups.com
Hi Kurt,

When using Oracle's pluggable database multi-tenancy solution, the connector must be able to access both the PDB and the CDB$ROOT (root) container. The PDB is accessed during the snapshot to read the tenant data based on the connector configuration, and the CDB$ROOT is accessed during streaming to query instance data about the logs and to execute the LogMiner packages that capture real-time changes.

Unfortunately, the views needed for instance data, such as V$LOGFILE, V$LOG, and V$ARCHIVED_LOG (including their GV$ variants), are accessible only from the CDB$ROOT, not from within the PDB. Because Oracle removed continuous mining in Oracle 19c, Debezium must query these views itself to manage the logs used by LogMiner, so this access is mandatory. I'm afraid that without access to these views, using Debezium won't be possible.
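To illustrate, the connector issues statements along these lines from the root container (a sketch, not the connector's exact SQL); on Autonomous, the first statement already fails as you observed:

```sql
-- Illustrative only, not the connector's exact statements.
-- On Autonomous, CDB$ROOT is not accessible, so this raises ORA-01031.
ALTER SESSION SET CONTAINER = CDB$ROOT;

-- Instance data the connector needs for managing LogMiner's logs:
SELECT GROUP#, STATUS, FIRST_CHANGE# FROM V$LOG;
SELECT NAME, FIRST_CHANGE#, NEXT_CHANGE# FROM V$ARCHIVED_LOG;
```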

Thanks,
Chris

Kurt Aquilina

Feb 23, 2024, 3:25:35 AM
to debezium

Dear Debezium Team,

As explained in this document "https://docs.oracle.com/en/database/oracle/oracle-database/19/sutil/steps-in-a-typical-logminer-session.html#GUID-6609EBA2-B2D7-4EAE-8344-A1F6C0A24760",
in the section "Understanding How to Run LogMiner Sessions", we do not have access to CDB$ROOT on Autonomous Database. In this case, we must use the per-PDB capture method.
In this mode, we have to provision a local user with a predefined set of privileges in the source PDB whose logs we want to review. All LogMiner processing is restricted to that PDB only.

The same section also explains how to run LogMiner on individual PDBs: "To run LogMiner on individual PDBs, the procedures are slightly different. Instead of using DBMS_LOGMNR.ADD_LOGFILE,
you specify a period in which you want to review log files for the PDB. Specify the SCN value of the log that you want to query, with either startScn and, if you choose, endScn, or startTime
and, if you choose, endTime. You then start LogMiner with DBMS_LOGMNR.START_LOGMNR. DBMS_LOGMNR.START_LOGMNR automatically adds the redo logs for you to analyze."

The privileges required to run LogMiner on individual PDBs are explained at "Requirements for Running LogMiner for Individual PDB" section. Here we can find that a new database procedure
is used to grant the required privileges. This procedure is DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE.
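For example, the grant described there would look something like this (the user name is hypothetical, and the call would be run by an administrator inside the PDB):

```sql
-- Hypothetical local user; run as an administrative user in the source PDB.
BEGIN
  DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE(grantee => 'DEBEZIUM');
END;
/
```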

Another difference we can notice is at "24.12.4 Typical LogMiner Session Task 3: Specify Redo Log Files for Analysis" section. To query logs for an individual PDB, you use a slightly
different procedure. After you connect to the PDB, you query DBA_LOGMNR_DICTIONARY_BUILDLOG, identify a START_SCN value, and then start LogMiner with DBMS_LOGMNR.START_LOGMNR, specifying
the SCN value of the log that you want to review. DBMS_LOGMNR.START_LOGMNR automatically adds the redo logs for you.
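Putting the quoted steps together, a per-PDB LogMiner session would look roughly like this (a sketch based on our reading of the docs; the SCN is a placeholder that would come from the build log query):

```sql
-- Connected to the PDB: identify a START_SCN value from the build log.
SELECT * FROM DBA_LOGMNR_DICTIONARY_BUILDLOG;

-- Start LogMiner at that SCN; the redo logs are added automatically.
BEGIN
  DBMS_LOGMNR.START_LOGMNR(startScn => 1406310);  -- placeholder SCN
END;
/
```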

Do you have plans to develop support for this capture method?

Regards,
Kurt

Chris Cranford

Feb 23, 2024, 9:16:43 AM
to debe...@googlegroups.com
Hi Kurt,

First, thanks for the link.

With Oracle Autonomous, it would appear from the documentation that you likely have an automatic license for GoldenGate, correct? Assuming my understanding is accurate, is there any particular reason you would not consider using the Debezium Oracle connector with the XStream adapter instead? That adapter is specifically designed to integrate with the lower-level plumbing of GoldenGate and generates change events identical to LogMiner's.
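For reference, switching adapters is largely a configuration change; something along these lines (a sketch, and the outbound server name is an example):

```json
{
  "database.connection.adapter": "xstream",
  "database.out.server.name": "dbzxout"
}
```

These would be added to the connector configuration you already have; the outbound server itself is created on the database side with DBMS_XSTREAM_ADM.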

That said, the documentation for integrating LogMiner with Autonomous doesn't look overly complicated; in fact, it resembles the integration you would have used with Oracle 12 and earlier, when continuous mining was still available. But the main differentiator so far between our LogMiner and XStream adapters rests solely on Oracle's licensing requirements: LogMiner doesn't require GoldenGate while XStream does. This integration point would likely trigger a GoldenGate licensing requirement because of the DBMS_GOLDENGATE_AUTH usage.

So would the XStream adapter work for you in this case?

Thanks,
Chris

Kurt Aquilina

Mar 7, 2024, 5:51:37 AM
to debezium
Hi Douglas,

Thanks a lot for your response.

We tested the Debezium Oracle connector with the XStream adapter on Autonomous Database.

Unfortunately, it failed on the Autonomous Database side when we tried to create the outbound server by running the statements below with different arguments that fit our environment:

DECLARE
 tables  DBMS_UTILITY.UNCL_ARRAY;
 schemas DBMS_UTILITY.UNCL_ARRAY;
BEGIN
 tables(1)  := NULL;
 schemas(1) := 'debezium';
 DBMS_XSTREAM_ADM.CREATE_OUTBOUND(
  server_name     =>  'dbzxout',
  table_names     =>  tables,
  schema_names    =>  schemas);
END;
/

DECLARE
 tables  DBMS_UTILITY.UNCL_ARRAY;
 schemas DBMS_UTILITY.UNCL_ARRAY;
BEGIN
 tables(1)  := NULL;
 schemas(1) := 'debezium';
 DBMS_XSTREAM_ADM.CREATE_OUTBOUND(
  server_name     =>  'dbzxout',
  table_names     =>  tables,
  schema_names    =>  schemas,
  source_container_name => 'container_name');
END;
/

The error message we got indicates that the above commands must be executed from cdb$root container. As we already know, the cdb$root container is not accessible on Autonomous Database.

Do you have any suggestions on how we can make it work?

Many thanks.

Kurt Aquilina

Mar 7, 2024, 5:53:17 AM
to debezium
Hi Chris,

Sorry, I addressed the above message incorrectly.

Chris Cranford

Mar 7, 2024, 8:25:42 AM
to debe...@googlegroups.com
Hi Kurt -

So I finally found a list of all the Autonomous Database features that are not available [1], and unfortunately XStream is on that list. I also reconfirmed that the LogMiner package DBMS_LOGMNR throws "ORA-65040: operation not allowed from within a pluggable database" when the user attempts to perform LogMiner operations from within the PDB. I'm afraid both of these Oracle limitations rule out Debezium's XStream and LogMiner adapters.

The other alternative to try might be the OpenLogReplicator adapter; however, I cannot guarantee there aren't corner cases where we still attempt to access the CDB$ROOT, either in Debezium's code or in the replicator process's code. This specific database deployment model isn't one we've tested, so I just want to be transparent. OpenLogReplicator is an open-source tool that you configure to read the redo and archive logs directly from the file system. Debezium interfaces with it over a network connection to generate change events, in a networking model similar to the one XStream uses with the GoldenGate backend processes. You can find details about the adapter in the documentation [2].
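If you want to try it, the adapter is selected through the connector configuration along these lines (the source name, host, and port are placeholders for wherever the OpenLogReplicator process runs):

```json
{
  "database.connection.adapter": "olr",
  "openlogreplicator.source": "ORACLE",
  "openlogreplicator.host": "olr-host",
  "openlogreplicator.port": "9000"
}
```

The `openlogreplicator.source` value must match the logical source name in the replicator's own configuration file.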

Unfortunately, if the OpenLogReplicator adapter does not work either, then I'm afraid GoldenGate may be the only remaining option.

Thanks,
Chris

[1]: https://docs.oracle.com/en/cloud/paas/autonomous-database/dedicated/adbdg/#articletitle
[2]: https://debezium.io/documentation/reference/stable/connectors/oracle.html#oracle-openlogreplicator-support