Tales From A Lazy Fat DBA

It's all about databases, their performance, troubleshooting & much more …. ¯\_(ツ)_/¯

Posts Tagged ‘goldengate’

Parquet, Hadoop, and a quietly dying process: lessons from a migration test using GoldenGate 23ai DAA

Posted by FatDBA on February 8, 2026

I was doing some hands-on testing with Oracle GoldenGate 23ai DAA, trying to move data from an old but reliable Oracle 11g database into Microsoft Fabric. The idea was simple enough. Capture changes from Oracle 11g, push them through GoldenGate 23ai, and land them in Fabric OneLake so they could be used by a Lakehouse or a Mirrored Database. On paper, it sounded clean. In real life… well, it took a bit of digging.

The source side was boring in a good way. Oracle 11g behaved exactly as expected. Extracts were running, trails were getting generated, no drama there. The real work was on the target side. I configured a Replicat using the File Writer with Parquet output, since Parquet is the natural fit for Microsoft Fabric. Fabric loves Parquet. Lakehouse loves Parquet. Mirrored databases too. So far, so good.
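For context, a File Writer Replicat like this is driven by Java adapter properties. The sketch below shows the general shape of such a configuration, not my exact file: the handler name, paths, and intervals are illustrative placeholders, so treat it as a starting point against the official parameter reference rather than a drop-in config.

```properties
# Illustrative File Writer + Parquet Replicat properties (placeholder values).
gg.handlerlist=filewriter
gg.handler.filewriter.type=filewriter
gg.handler.filewriter.mode=op
gg.handler.filewriter.pathMappingTemplate=./dirout/staging
gg.handler.filewriter.fileRollInterval=5m
# Chain the Parquet event handler onto the File Writer output
gg.handler.filewriter.eventHandler=parquet
gg.eventhandler.parquet.type=parquet
gg.eventhandler.parquet.pathMappingTemplate=./dirout/parquet
gg.eventhandler.parquet.writeToHDFS=false
```

The key design point is the chaining: the File Writer stages files locally, and the Parquet event handler converts them, which is exactly why the Parquet (and, as it turned out, Hadoop) libraries must be on the classpath.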

I started the Replicat and GoldenGate politely told me it had started. That tiny moment of relief you get when a command doesn’t fail right away. But then I checked the status… and it was STOPPED. No lag, no progress, nothing. That’s usually when you know something went wrong very early, before any real work even started.

So I opened the report file. And there it was. A Java error staring right back at me:

OGG (http://192.168.10.10:9001 OGG23AIDAA as BigData@) 18> START REPLICAT FATD11D
2025-12-12T21:25:18Z  INFO    OGG-00975  Replicat group FATD11D starting.
2025-12-12T21:25:18Z  INFO    OGG-15445  Replicat group FATD11D started.


OGG (http://192.168.10.10:9001 OGG23AIDAA as BigData@) 20> info replicat FATD11D

Replicat   FATD11D    Initialized  2025-12-12 16:24   Status STOPPED
Checkpoint Lag       00:00:00 (updated 00:00:55 ago)
Log Read Checkpoint  File dirdat/i1000000000
                     First Record  RBA 0
Encryption Profile   LocalWallet

OGG (http://192.168.10.10:9001 OGG23AIDAA as BigData@) 21> view report FATD11D

***********************************************************************
     Oracle GoldenGate for Distributed Applications and Analytics
                   Version 23.10.0.25.10 (Build 001)

                      Oracle GoldenGate Delivery
 Version 23.10.1.25.10 OGGCORE_23.10.0.0.0OGGRU_LINUX.X64_251018.0830
    Linux, x64, 64bit (optimized), Generic on Oct 18 2025 14:00:54

Copyright (C) 1995, 2025, Oracle and/or its affiliates. All rights reserved.

                    Starting at 2025-12-12 16:25:18
***********************************************************************

2025-12-12 16:25:19  INFO    OGG-15052  Using Java class path: /testgg/app/ogg/ogg23ai/ogg23aidaa_MA//ggjava/ggjava.jar:/testgg/app/ogg/ogg23ai/ogg23aidaa_DEPLOYMENT/etc/conf/ogg:/u01/app/ogg/ogg23ai/ogg23aidaa_MA/.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/parquet/hadoop/metadata/CompressionCodecName
        at oracle.goldengate.eventhandler.parquet.ParquetEventHandlerProperties.<init>(ParquetEventHandlerProperties.java:43)
        at oracle.goldengate.eventhandler.parquet.ParquetEventHandler.<init>(ParquetEventHandler.java:53)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
        at java.base/java.lang.Class.newInstance(Class.java:587)
        at oracle.goldengate.datasource.eventhandler.EventHandlerFramework.instantiateEventHandler(EventHandlerFramework.java:219)
        at oracle.goldengate.datasource.eventhandler.EventHandlerFramework.initEventHandler(EventHandlerFramework.java:163)
        at oracle.goldengate.datasource.eventhandler.EventHandlerFramework.init(EventHandlerFramework.java:58)
        at oracle.goldengate.handler.filewriter.FileWriterHandlerEO.init(FileWriterHandlerEO.java:627)
        at oracle.goldengate.datasource.AbstractDataSource.addDataSourceListener(AbstractDataSource.java:602)
        at oracle.goldengate.datasource.factory.DataSourceFactory.getDataSource(DataSourceFactory.java:164)
        at oracle.goldengate.datasource.UserExitDataSourceLauncher.<init>(UserExitDataSourceLauncher.java:45)
        at oracle.goldengate.datasource.UserExitMain.main(UserExitMain.java:109)
Caused by: java.lang.ClassNotFoundException: org.apache.parquet.hadoop.metadata.CompressionCodecName
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
        at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:526)
        ... 15 more
2025-12-12 16:25:22  WARNING OGG-00869  java.lang.ClassNotFoundException: org.apache.parquet.hadoop.metadata.CompressionCodecName.

Source Context :
  SourceFile              : [/ade/aime_phxdbifa87/oggcore/OpenSys/src/gglib/ggdal/Adapter/Java/JavaAdapter.cpp]
  SourceMethod            : [HandleJavaException]
  SourceLine              : [350]
  ThreadBacktrace         : [19] elements
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/../lib/libgglog.so(CMessageContext::AddThreadContext())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/../lib/libgglog.so(CMessageFactory::CreateMessage(CSourceContext*, unsigned int, ...))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/../lib/libgglog.so(_MSG_String(CSourceContext*, int, char const*, CMessageFactory::MessageDisposition))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/../lib/libggjava.so()]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/../lib/libggjava.so(ggs::gglib::ggdal::CJavaAdapter::Open())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::gglib::ggdal::CDALAdapter::Open())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(GenericImpl::Open())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(GenericImpl::GetWriter())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(GenericImpl::GetGenericDBType())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::er::ReplicatContext::ReplicatContext(ggs::gglib::ggapp::ReplicationContextParams const&, bool, ggs::gglib::
ggmetadata::MetadataContext*, ggs::er::ReplicatContext::LogBSNManager*))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::er::ReplicatContext::createReplicatContext(ggs::gglib::ggapp::ReplicationContextParams const&, ggs::gglib::
ggdatasource::DataSourceParams const&, ggs::gglib::ggmetadata::MetadataContext*))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat()]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::gglib::MultiThreading::MainThread::ExecMain())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::gglib::MultiThreading::Thread::RunThread(ggs::gglib::MultiThreading::Thread::ThreadArgs*))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::gglib::MultiThreading::MainThread::Run(int, char**))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(main)]
                          : [/lib64/libc.so.6()]
                          : [/lib64/libc.so.6(__libc_start_main)]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(_start)]

2025-12-12 16:25:22  ERROR   OGG-15051  Java or JNI exception:
java.lang.NoClassDefFoundError: org/apache/parquet/hadoop/metadata/CompressionCodecName.

2025-12-12 16:25:22  ERROR   OGG-01668  PROCESS ABENDING.

At that point it clicked. GoldenGate itself was fine. Oracle 11g was fine. Fabric wasn’t even in the picture yet. The problem was simpler. The Parquet libraries were missing.

All of the prerequisites are handled through the DependencyDownloader directory, which ships with download scripts for everything: Parquet, Hadoop, OneLake, Kafka, and more. Before touching anything, I checked Java; Java 17 was already installed. I ran the Parquet dependency script, Maven kicked in, downloaded a bunch of JARs, and finished successfully. I restarted the Replicat, feeling pretty confident. And… it failed again. A different error this time, though, which honestly felt like progress.

[oggadmin@D-ADON-01-CC-VM bin]$
[oggadmin@D-ADON-01-CC-VM bin]$ find /u01/app/ogg/ogg23ai -name "*.properties" | egrep -i "sample|example|handler|parquet|filewriter" | head -n 20
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/oci.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/kafka.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/hbase.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/parquet.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/kafkaconnect.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/azureservicebus.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/mongo.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/filewriter.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/bigquery.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/nosql.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/hdfs.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/synapse.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/redshift.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/pubsub.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/s3.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/redis.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/elasticsearch.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/jdbc.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/adw.properties
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/AdapterExamples/templates/jms.properties
[oggadmin@D-ADON-01-CC-VM bin]$




[oggadmin@D-ADON-01-CC-VM bin]$
[oggadmin@D-ADON-01-CC-VM bin]$ ls -ltrh /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/ggjava
total 60K
-rwxrwxr-x. 1 oggadmin ogg  34K Jun  5  2024 NOTICES.txt
-rwxrwxr-x. 1 oggadmin ogg   95 Oct 21 10:50 ggjava-version.txt
-rwxrwxr-x. 1 oggadmin ogg 9.5K Oct 21 10:50 ggjava.jar
drwxr-xr-x. 5 oggadmin ogg 4.0K Jan 29 16:51 resources
drwxr-xr-x. 6 oggadmin ogg 4.0K Jan 29 16:51 maven-3.9.6



[oggadmin@D-ADON-01-CC-VM bin]$ find /u01/app/ogg/ogg23ai -iname "onelake.sh" -o -iname "*parquet*.sh" -o -iname "*dependency*.sh"
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/onelake.sh
/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/parquet.sh
[oggadmin@D-ADON-01-CC-VM bin]$ /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/onelake.sh


[oggadmin@D-ADON-01-CC-VM bin]$ cd /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/
[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$ ls
aws.sh                    cassandra_dse.sh          gcs.sh                    hbase_hortonworks.sh         kafka.sh             orc.sh             snowflake.sh
azure_blob_storage.sh     cassandra.sh              googlepubsub.sh           hbase.sh                     kinesis.sh           parquet.sh         snowflakestreaming.sh
bigquery.sh               config_proxy.sh           hadoop_azure_cloudera.sh  internal_scripts             mongodb_capture.sh   project            synapse.sh
bigquerystreaming.sh      databricks.sh             hadoop_cloudera.sh        kafka_cloudera.sh            mongodb.sh           redis.sh           velocity.sh
cassandra_capture_3x.sh   docs                      hadoop_hortonworks.sh     kafka_confluent_protobuf.sh  onelake.sh           redshift.sh        xmls
cassandra_capture_4x.sh   download_dependencies.sh  hadoop.sh                 kafka_confluent.sh           oracle_nosql_sdk.sh  s3.sh
cassandra_capture_dse.sh  elasticsearch_java.sh     hbase_cloudera.sh         kafka_hortonworks.sh         oracle_oci.sh        snowflake-fips.sh
[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$

[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$ java -version
openjdk version "17.0.18" 2026-01-20 LTS
OpenJDK Runtime Environment (Red_Hat-17.0.18.0.8-1.0.1) (build 17.0.18+8-LTS)
OpenJDK 64-Bit Server VM (Red_Hat-17.0.18.0.8-1.0.1) (build 17.0.18+8-LTS, mixed mode, sharing)
[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$

[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$ ./onelake.sh
openjdk version "17.0.18" 2026-01-20 LTS
Java is installed.
Apache Maven 3.9.6 (bc0240f3c744dd6b6ec2920b3cd08dcc295161ae)
Maven is accessible.
Root Configuration Script
INFO: This is the Maven binary [../../ggjava/maven-3.9.6/bin/mvn].
INFO: This is the location of the settings.xml file [./docs/settings_np.xml].
INFO: This is the location of the toolchains.xml file [./docs/toolchains.xml].
INFO: The dependencies will be written to the following directory[../dependencies/onelake].
INFO: The Maven coordinates are the following:
INFO: Dependency 1
INFO: Group ID [com.azure].
INFO: Artifact ID [azure-storage-file-datalake].
INFO: Version [12.20.0]
INFO: Dependency 2
INFO: Group ID [com.azure].
INFO: Artifact ID [azure-identity].
INFO: Version [1.13.1]
[INFO] Scanning for projects...
[INFO]
[INFO] ---------------< oracle.goldengate:dependencyDownloader >---------------
[INFO] Building dependencyDownloader 1.0
[INFO]   from pom_central_v2.xml
[INFO] --------------------------------[ pom ]---------------------------------
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-clean-plugin/3.2.0/maven-clean-plugin-3.2.0.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-clean-plugin/3.2.0/maven-clean-plugin-3.2.0.pom (5.3 kB at 24 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-plugins/35/maven-plugins-35.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-plugins/35/maven-plugins-35.pom (9.9 kB at 431 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/maven/maven-parent/35/maven-parent-35.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/maven/maven-parent/35/maven-parent-35.pom (45 kB at 1.7 MB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/apache/25/apache-25.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/apache/25/apache-25.pom (21 kB at 1.0 MB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-clean-plugin/3.2.0/maven-clean-plugin-3.2.0.jar
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-clean-plugin/3.2.0/maven-clean-plugin-3.2.0.jar (36 kB at 1.4 MB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-dependency-plugin/2.9/maven-dependency-plugin-2.9.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-dependency-plugin/2.9/maven-dependency-plugin-2.9.pom (13 kB at 602 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/maven/plugins
.........
...............
...................
[INFO] Copying netty-tcnative-boringssl-static-2.0.65.Final-windows-x86_64.jar to /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/onelake/netty-tcnative-boringssl-static-2.0.65.Final-windows-x86_64.jar
[INFO] Copying reactive-streams-1.0.4.jar to /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/onelake/reactive-streams-1.0.4.jar
[INFO] Copying oauth2-oidc-sdk-11.9.1.jar to /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/onelake/oauth2-oidc-sdk-11.9.1.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  8.334 s
[INFO] Finished at: 2025-12-12T16:45:52-05:00
[INFO] ------------------------------------------------------------------------
[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$

[oggadmin@D-ADON-01-CC-VM templates]$ cd /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader
[oggadmin@D-ADON-01-CC-VM templates]$   ./parquet.sh 1.13.1
openjdk version "17.0.18" 2026-01-20 LTS
Java is installed.
Apache Maven 3.9.6 (bc0240f3c744dd6b6ec2920b3cd08dcc295161ae)
Maven is accessible.
Root Configuration Script
INFO: This is the Maven binary [../../ggjava/maven-3.9.6/bin/mvn].
INFO: This is the location of the settings.xml file [./docs/settings_np.xml].
INFO: This is the location of the toolchains.xml file [./docs/toolchains.xml].
INFO: The dependencies will be written to the following directory[../dependencies/parquet_1.13.1].
.....
...........
.................
.....
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/parquet/parquet-hadoop/1.13.1/parquet-hadoop-1.13.1.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/parquet/parquet-hadoop/1.13.1/parquet-hadoop-1.13.1.pom (15 kB at 69 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/parquet/parquet/1.13.1/parquet-1.13.1.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/parquet/parquet/1.13.1/parquet-1.13.1.pom (25 kB at 790 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/parquet/parquet-column/1.13.1/parquet-column-1.13.1.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/parquet/parquet-column/1.13.1/parquet-column-1.13.1.pom (6.0 kB at 238 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/parquet/parquet-common/1.13.1/parquet-common-1.13.1.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/parquet/parquet-common/1.13.1/parquet-common-1.13.1.pom (3.4 kB at 143 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/parquet/parquet-format-structures/1.13.1/parquet-format-structures-1.13.1.pom
......
..............
...............
[INFO] Copying jackson-annotations-2.12.7.jar to /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/parquet_1.13.1/jackson-annotations-2.12.7.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  2.119 s
[INFO] Finished at: 2025-12-12T16:52:03-05:00
[INFO] ------------------------------------------------------------------------

Once again the Replicat on the target side failed to start, this time with a different error.

OGG (http://192.168.10.10:9001 OGG23AIDAA as BigData@) 8>  info REPLICAT FATD11D

Replicat   FATD11D    Initialized  2025-12-12 16:24   Status STOPPED
Checkpoint Lag       00:00:00 (updated 00:34:28 ago)
Log Read Checkpoint  File dirdat/i1000000000
                     First Record  RBA 0
Encryption Profile   LocalWallet


OGG (http://192.168.10.10:9001 OGG23AIDAA as BigData@) 9> view report FATD11D

***********************************************************************
     Oracle GoldenGate for Distributed Applications and Analytics
                   Version 23.10.0.25.10 (Build 001)

                      Oracle GoldenGate Delivery
 Version 23.10.1.25.10 OGGCORE_23.10.0.0.0OGGRU_LINUX.X64_251018.0830
    Linux, x64, 64bit (optimized), Generic on Oct 18 2025 14:00:54

Copyright (C) 1995, 2025, Oracle and/or its affiliates. All rights reserved.

                    Starting at 2025-12-12 16:58:47
***********************************************************************

2025-12-12 16:58:47  INFO    OGG-15052  Using Java class path: /testgg/app/ogg/ogg23ai/ogg23aidaa_MA//ggjava/ggjava.jar:/testgg/app/ogg/ogg23ai/ogg23aidaa_DEPLOYMENT/etc/conf/ogg:/u01/app/ogg/ogg23ai/ogg23aidaa_MA/:/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/onelake/*:/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/parquet_1.13.1/*.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
        at oracle.goldengate.eventhandler.parquet.GGParquetWriter.init(GGParquetWriter.java:72)
        at oracle.goldengate.eventhandler.parquet.ParquetEventHandler.init(ParquetEventHandler.java:219)
        at oracle.goldengate.datasource.eventhandler.EventHandlerFramework.initEventHandler(EventHandlerFramework.java:168)
        at oracle.goldengate.datasource.eventhandler.EventHandlerFramework.init(EventHandlerFramework.java:58)
        at oracle.goldengate.handler.filewriter.FileWriterHandlerEO.init(FileWriterHandlerEO.java:627)
        at oracle.goldengate.datasource.AbstractDataSource.addDataSourceListener(AbstractDataSource.java:602)
        at oracle.goldengate.datasource.factory.DataSourceFactory.getDataSource(DataSourceFactory.java:164)
        at oracle.goldengate.datasource.UserExitDataSourceLauncher.<init>(UserExitDataSourceLauncher.java:45)
        at oracle.goldengate.datasource.UserExitMain.main(UserExitMain.java:109)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
        at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:526)
        ... 9 more

2025-12-12 16:58:48  WARNING OGG-00869  java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration.

Source Context :
  SourceFile              : [/ade/aime_phxdbifa87/oggcore/OpenSys/src/gglib/ggdal/Adapter/Java/JavaAdapter.cpp]
  SourceMethod            : [HandleJavaException]
  SourceLine              : [350]
  ThreadBacktrace         : [19] elements
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/../lib/libgglog.so(CMessageContext::AddThreadContext())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/../lib/libgglog.so(CMessageFactory::CreateMessage(CSourceContext*, unsigned int, ...))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/../lib/libgglog.so(_MSG_String(CSourceContext*, int, char const*, CMessageFactory::MessageDisposition))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/../lib/libggjava.so()]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/../lib/libggjava.so(ggs::gglib::ggdal::CJavaAdapter::Open())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::gglib::ggdal::CDALAdapter::Open())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(GenericImpl::Open())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(GenericImpl::GetWriter())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(GenericImpl::GetGenericDBType())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::er::ReplicatContext::ReplicatContext(ggs::gglib::ggapp::ReplicationContextParams const&, bool, ggs::gglib::
ggmetadata::MetadataContext*, ggs::er::ReplicatContext::LogBSNManager*))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::er::ReplicatContext::createReplicatContext(ggs::gglib::ggapp::ReplicationContextParams const&, ggs::gglib::
ggdatasource::DataSourceParams const&, ggs::gglib::ggmetadata::MetadataContext*))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat()]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::gglib::MultiThreading::MainThread::ExecMain())]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::gglib::MultiThreading::Thread::RunThread(ggs::gglib::MultiThreading::Thread::ThreadArgs*))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(ggs::gglib::MultiThreading::MainThread::Run(int, char**))]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(main)]
                          : [/lib64/libc.so.6()]
                          : [/lib64/libc.so.6(__libc_start_main)]
                          : [/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/bin/replicat(_start)]

2025-12-12 16:58:48  ERROR   OGG-15051  Java or JNI exception:
java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration.

2025-12-12 16:58:48  ERROR   OGG-01668  PROCESS ABENDING.

That one made me pause for a second. The target wasn’t HDFS. I wasn’t running Hadoop. This was Microsoft Fabric. But here’s the catch. Parquet depends on Hadoop, even when you’re not using Hadoop directly. Some core Parquet classes expect Hadoop configuration classes to exist. No Hadoop libs, no Parquet writer.
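In practice that means the Java classpath has to cover both downloaded dependency directories, not just the Parquet one. Roughly, using the paths from this environment (the version suffixes will vary with whatever versions you pass to the download scripts):

```properties
# gg.classpath must include BOTH dependency sets - Parquet alone still fails
# with NoClassDefFoundError on org/apache/hadoop/conf/Configuration.
gg.classpath=/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/parquet_1.13.1/*:/testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/hadoop_3.4.2/*
```

Note the trailing `/*` on each entry: these directories hold dozens of JARs, and a wildcard is what puts all of them on the classpath.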

So back to the DependencyDownloader I went, this time running the Hadoop script. More downloads, more JARs, more waiting.

[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$
[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$ cd /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader

[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$ ./hadoop.sh 3.4.2
openjdk version "17.0.18" 2026-01-20 LTS
Java is installed.
Apache Maven 3.9.6 (bc0240f3c744dd6b6ec2920b3cd08dcc295161ae)
Maven is accessible.
Root Configuration Script
INFO: This is the Maven binary [../../ggjava/maven-3.9.6/bin/mvn].
INFO: This is the location of the settings.xml file [./docs/settings_np.xml].
INFO: This is the location of the toolchains.xml file [./docs/toolchains.xml].
INFO: The dependencies will be written to the following directory[../dependencies/hadoop_3.4.2].
[INFO] ---------------< oracle.goldengate:dependencyDownloader >---------------
[INFO] Building dependencyDownloader 1.0
[INFO]   from pom_central_v2.xml
[INFO] --------------------------------[ pom ]---------------------------------
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/hadoop/hadoop-client/3.4.2/hadoop-client-3.4.2.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/hadoop/hadoop-client/3.4.2/hadoop-client-3.4.2.pom (11 kB at 58 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/hadoop/hadoop-project-dist/3.4.2/hadoop-project-dist-3.4.2.pom
Downloaded from central: https://repo.maven.apach
..........
................
.....................
[INFO] Copying netty-codec-stomp-4.1.118.Final.jar to /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/hadoop_3.4.2/netty-codec-stomp-4.1.118.Final.jar
[INFO] Copying dnsjava-3.6.1.jar to /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/hadoop_3.4.2/dnsjava-3.6.1.jar
[INFO] Copying netty-transport-native-unix-common-4.1.118.Final.jar to /testgg/app/ogg/ogg23ai/ogg23aidaa_MA/opt/DependencyDownloader/dependencies/hadoop_3.4.2/netty-transport-native-unix-common-4.1.118.Final.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  7.627 s
[INFO] Finished at: 2025-12-12T18:02:30-05:00
[INFO] ------------------------------------------------------------------------
[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$
[oggadmin@D-ADON-01-CC-VM DependencyDownloader]$

Once that finished, I restarted the Replicat again, with no big expectations. This time it stayed up.

OGG (http://192.168.10.10:9001 OGG23AIDAA as BigData@) 2> START REPLICAT FATD11D
2025-12-12T23:07:54Z  INFO    OGG-00975  Replicat group FATD11D starting.
2025-12-12T23:07:54Z  INFO    OGG-15445  Replicat group FATD11D started.

OGG (http://192.168.10.10:9001 OGG23AIDAA as BigData@) 3> info FATD11D
No Extract groups exist.

Replicat   FATD11D    Last Started 2025-12-12 18:07   Status RUNNING
Checkpoint Lag       00:00:00 (updated 00:00:02 ago)
Process ID           47420
Log Read Checkpoint  File dirdat/i10000000001
                     First Record  RBA 167873
Encryption Profile   LocalWallet

The big takeaway from this whole exercise is pretty simple. When you’re replicating from an Oracle database to Microsoft Fabric using GoldenGate 23ai DAA, the tricky part is not Oracle, and not Fabric. It’s the middle layer. Parquet is the bridge, and Parquet brings Hadoop with it, whether you like it or not. If those dependencies aren’t staged correctly, the OGG processes will start, smile at you, and then quietly fall over 😀
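One habit this incident left me with: before even starting the Replicat, scan the downloaded JARs for the exact classes the handler chain will load. A small illustrative Python sketch (this is my own helper, not part of GoldenGate; the two class names come straight from the NoClassDefFoundError messages above):

```python
import zipfile
from pathlib import Path

# Class entries the File Writer + Parquet chain loads at startup; both
# failures in this post were one of these missing from the classpath.
REQUIRED = [
    "org/apache/parquet/hadoop/metadata/CompressionCodecName.class",
    "org/apache/hadoop/conf/Configuration.class",
]

def find_missing(dep_dir, required=REQUIRED):
    """Return the required class entries not present in any JAR under dep_dir."""
    entries = set()
    for jar in Path(dep_dir).rglob("*.jar"):
        with zipfile.ZipFile(jar) as zf:
            entries.update(zf.namelist())
    return [cls for cls in required if cls not in entries]
```

Point it at your DependencyDownloader dependencies directory; an empty result means both the Parquet and Hadoop classes are staged before GoldenGate ever tries to load them.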

Once everything was in place, though, the setup worked exactly the way it should. A clean path from a legacy Oracle 11g database into a modern Microsoft Fabric Lakehouse. No magic. Just the right pieces, in the right order… and a bit of patience.

Hope It Helped!
Prashant Dixit


Lessons from Integrating Oracle 11g with GoldenGate – ORA-06512 ORA-06502

Posted by FatDBA on February 7, 2026

This one took time. More than I’d like to admit.

I was trying to run a GoldenGate 21c Integrated Extract from a remote extraction server, sourcing redo from an Oracle 11g (11.2.0.4.201020, October 2020 PSU) database. Everything worked fine at first: I could create the Extract and log in to the database, but I hit a weird issue when I tried to register the Extract with the 11g source database from the remote extraction server.

GGSCI (fatdbatestlab1) 7> dblogin useridalias ogg11g
Successfully logged into database.

GGSCI (fatdbatestlab1) GGREPAZUR@boom1) 8> REGISTER EXTRACT ext11g DATABASE

xxxx:xxx:xxxx   ERROR   OGG-08221  Cannot register or unregister Extract group EXT11G because of the following SQL error: OCI Error ORA 
(status = 6502-ORA-06502: PL/SQL: numeric or value error: character string buffer too small
ORA-06512: at "SYS.DBMS_XSTREAM_GG_ADM", line 145
ORA-06512: at "SYS.DBMS_XSTREAM_GG_ADM", line 186
ORA-06512: at line 1).

So Integrated Extract was the plan from day one. Even with that clarity, things didn’t go smoothly. Integrated Extract should work with 11g, and the remote-extraction option was the one we wanted for security and performance reasons. The docs say so (provided you are on a compatible 11.2.0.4), and the compatibility matrices agree.

Now, just to be clear upfront: I already knew that if I tried a classic (non-integrated) Extract remotely, I would hit “OGG-02022 Logmining server does not exist on this Oracle database.” That part wasn’t a surprise. Classic Extract + remote server + 11g… yeah, that’s expected.

Yet I kept running into weird behavior that just didn’t add up. That’s when the doubt started creeping in … “Am I missing something?” “Is Integrated Extract actually usable with 11g in real life… not just on paper?”

Before touching the source server, I paused and went deeper into Oracle notes and bugs. That’s when I landed on the real issue. This wasn’t a GoldenGate 21c problem. And it wasn’t a remote extraction limitation either. It was an Oracle 11g RDBMS bug.

The actual root cause was database Bug 28367006. Once I applied Patch 21683400 and ran datapatch, things finally started behaving like a sane system again.

GGSCI (fatdbatestlab1 as GGREPAZUR@boom1) 8> REGISTER EXTRACT ext11g DATABASE
xxxx:xxx:xxx INFO    OGG-02003  Extract group EXT11G successfully registered with database at SCN 189381938103811.
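Once the patch is in, it is worth confirming it actually registered inside the database before retrying. A hedged check, using the standard dictionary view (nothing specific to this lab): on 11g, patch and PSU apply actions are recorded in DBA_REGISTRY_HISTORY, and `opatch lsinventory` from the Oracle home lists the one-off itself.

```sql
-- Patch/PSU actions recorded inside the 11g database
SELECT action_time, action, version, comments
  FROM dba_registry_history
 ORDER BY action_time;
```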
 

It’s worth calling out that while Oracle 11g is technically supported for Integrated Extract, it is still a very old database release, and expectations need to be set accordingly. Running the latest available PSU is not optional in this kind of setup, and being aware of known bugs, defects, and architectural limitations is part of the job when working with legacy versions. In practice, if you stumble into a new or undocumented issue on 11g, Oracle Support is unlikely to engage development for a fresh bug fix, which means the only real options are workarounds, existing patches, or architectural adjustments. That reality alone makes proactive patching and careful design choices even more critical when pairing modern GoldenGate versions with older database platforms.

Hope It Helped!
Prashant Dixit


Demo: Oracle XStream Replication Setup Using a Java Client

Posted by FatDBA on June 15, 2025

Oracle XStream is a feature of Oracle Database that allows for real-time data replication between databases. Unlike Oracle GoldenGate, which is more GUI-driven and feature-rich, XStream is more code-centric. This makes it highly customizable, especially suitable when you want a Java or C-based client to control how data flows.

XStream vs GoldenGate

  • XStream is older and less frequently updated compared to Oracle GoldenGate.
  • XStream is limited to Oracle-to-Oracle replication and does not support heterogeneous environments.
  • GoldenGate supports a wide range of sources and targets, is easier to manage with its GUI-based tools, and has better support for DDL replication, integrated capture, and coordinated replication.

When and Why to Use XStream

Use XStream when:

  • You need fine-grained control using custom applications written in Java or C.
  • Your environment is Oracle-to-Oracle only.
  • You want a lightweight, programmatic replication tool that can be embedded in your application logic.
  • Licensing or infrastructure limitations do not permit GoldenGate, or licensing fees are a concern.

Do not use XStream if:

  • You require GUI-driven replication monitoring and setup.
  • You need heterogeneous replication (e.g., Oracle to PostgreSQL, MySQL, etc.).
  • Your use case demands continuous support and new feature releases.

Let’s do a quick demo on how to use it. I have set up a lab environment for the test.

  • Source Database: dixitdb on 192.168.68.79:1521
  • Target Database: targetdb on 192.168.68.85:1521

Java Application (xio.java) will:

  • Connect to XStream Outbound on the source DB
  • Connect to XStream Inbound on the target DB
  • Transfer changes using Logical Change Records (LCRs)

Source Database Setup (dixitdb)

Step 1: Enable Replication Features

ALTER SYSTEM SET enable_goldengate_replication = TRUE SCOPE=BOTH;

Step 2: Ensure Archive Logging is Enabled

archive log list;

Must show: Database log mode              Archive Mode
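If `archive log list` reports NOARCHIVELOG instead, capture will not work until archiving is switched on. A minimal sketch of enabling it (standard commands, but note this needs a short outage):

```sql
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE ARCHIVELOG;
ALTER DATABASE OPEN;
-- confirm
ARCHIVE LOG LIST;
```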

Step 3: Create and Grant User for XStream

CREATE USER xstream_admin IDENTIFIED BY Welcome123;
GRANT CONNECT, RESOURCE, DBA TO xstream_admin;

Step 4: Enable Supplemental Logging

ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;

Step 5: Create Demo Table

CREATE TABLE xstream_admin.demo_table (
  id    NUMBER PRIMARY KEY,
  name  VARCHAR2(100),
  value NUMBER
);

Step 6: Grant XStream Admin Privilege

BEGIN
  DBMS_XSTREAM_AUTH.GRANT_ADMIN_PRIVILEGE(
    grantee => 'XSTREAM_ADMIN',
    privilege_type => 'CAPTURE',
    grant_select_privileges => TRUE
  );
END;
/

Step 7: Create Outbound Server

BEGIN
  DBMS_XSTREAM_ADM.CREATE_OUTBOUND(
    server_name => 'XOUT_SRV',
    connect_user => 'XSTREAM_ADMIN'
  );
END;
/

Check status:

SELECT capture_name, status FROM dba_capture;
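The capture can take a little while to reach ENABLED after CREATE_OUTBOUND. If it sits in ABORTED, the error columns of the same view usually say why, and the outbound server itself is visible in its own view (a hedged sketch using standard dictionary columns):

```sql
-- capture status plus the last error, if any
SELECT capture_name, status, error_number, error_message
  FROM dba_capture;

-- the outbound server created by DBMS_XSTREAM_ADM.CREATE_OUTBOUND
SELECT server_name, capture_name, queue_name
  FROM dba_xstream_outbound;
```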


Target Database Setup (targetdb)

Step 1: Create the Same User

CREATE USER xstream_admin IDENTIFIED BY Welcome123;
GRANT CONNECT, RESOURCE, UNLIMITED TABLESPACE TO xstream_admin;

Step 2: Setup Queue

BEGIN
  DBMS_XSTREAM_ADM.SET_UP_QUEUE(
    queue_table => 'xstream_admin.XIN_SRV',
    queue_name  => 'xstream_admin.XIN_SRV'
  );
END;
/

Step 3: Create Inbound Server

BEGIN
  DBMS_XSTREAM_ADM.CREATE_INBOUND(
    server_name =>'XIN_SRV',
    queue_name  =>'xstream_admin.XIN_SRV',
    apply_user  =>'xstream_admin',
    comment     =>'xstream_admin in'
  );
END;
/

Step 4: Create Target Table

CREATE TABLE xstream_admin.demo_table (
  id    NUMBER PRIMARY KEY,
  name  VARCHAR2(100),
  value NUMBER
);

Step 5: Add Table Rules

VAR dml_rule VARCHAR2(30);
VAR ddl_rule VARCHAR2(30);
BEGIN
  DBMS_XSTREAM_ADM.ADD_TABLE_RULES(
    table_name => 'xstream_admin.demo_table',
    streams_type => 'APPLY',
    streams_name => 'XIN_SRV',
    queue_name => 'xstream_admin.XIN_SRV',
    source_database => 'dixitdb',
    dml_rule_name => :dml_rule,
    ddl_rule_name => :ddl_rule
  );
END;
/

Step 6: Start Apply Process

BEGIN
  DBMS_APPLY_ADM.START_APPLY(apply_name => 'XIN_SRV');
END;
/

Check status:
SELECT apply_name, status FROM dba_apply;
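If the apply process later aborts, the same view plus DBA_APPLY_ERROR are the first places to look (a hedged sketch using the standard Streams/XStream views):

```sql
-- apply status and the last reported error
SELECT apply_name, status, error_number, error_message
  FROM dba_apply;

-- individual failed transactions, if any
SELECT apply_name, local_transaction_id, error_message
  FROM dba_apply_error;
```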

Java Application: xio.java

The xio.java program acts as a custom replication engine. It:

  • Connects to XStream Inbound and Outbound
  • Receives LCRs (Logical Change Records) from the source
  • Sends them to the target

Pre-Requisites:

  • ojdbc8-19.15.0.0.jar (Oracle JDBC driver)
  • xstreams.jar (from $ORACLE_HOME/rdbms/jlib/)
The full xio.java source:

import oracle.streams.*;
import oracle.jdbc.internal.OracleConnection;
import oracle.jdbc.*;
import oracle.sql.*;
import java.sql.*;
import java.util.*;

public class xio
{
  public static String xsinusername = null;
  public static String xsinpasswd = null;
  public static String xsinName = null;
  public static String xsoutusername = null;
  public static String xsoutpasswd = null;
  public static String xsoutName = null;
  public static String in_url = null;
  public static String out_url = null;
  public static Connection in_conn = null;
  public static Connection out_conn = null;
  public static XStreamIn xsIn = null;
  public static XStreamOut xsOut = null;
  public static byte[] lastPosition = null;
  public static byte[] processedLowPosition = null;

  public static void main(String args[])
  {
    // get connection URLs for the inbound and outbound servers
    in_url = parseXSInArguments(args);
    out_url = parseXSOutArguments(args);

    // create connections to the inbound and outbound servers
    in_conn = createConnection(in_url, xsinusername, xsinpasswd);
    out_conn = createConnection(out_url, xsoutusername, xsoutpasswd);

    // attach to the inbound and outbound servers
    xsIn = attachInbound(in_conn);
    xsOut = attachOutbound(out_conn);

    // main loop to get LCRs
    get_lcrs(xsIn, xsOut);

    // detach from the inbound and outbound servers
    detachInbound(xsIn);
    detachOutbound(xsOut);
  }

  // parse the arguments to get the connection URL for the inbound db
  public static String parseXSInArguments(String args[])
  {
    String orasid, host, port;

    if (args.length != 12)
    {
      printUsage();
      System.exit(0);
    }

    orasid = args[0];
    host = args[1];
    port = args[2];
    xsinusername = args[3];
    xsinpasswd = args[4];
    xsinName = args[5];

    System.out.println("xsin_host = " + host);
    System.out.println("xsin_port = " + port);
    System.out.println("xsin_ora_sid = " + orasid);

    String in_url = "jdbc:oracle:oci:@" + host + ":" + port + ":" + orasid;
    System.out.println("xsin connection url: " + in_url);

    return in_url;
  }

  // parse the arguments to get the connection URL for the outbound db
  public static String parseXSOutArguments(String args[])
  {
    String orasid, host, port;

    if (args.length != 12)
    {
      printUsage();
      System.exit(0);
    }

    orasid = args[6];
    host = args[7];
    port = args[8];
    xsoutusername = args[9];
    xsoutpasswd = args[10];
    xsoutName = args[11];

    System.out.println("xsout_host = " + host);
    System.out.println("xsout_port = " + port);
    System.out.println("xsout_ora_sid = " + orasid);

    String out_url = "jdbc:oracle:oci:@" + host + ":" + port + ":" + orasid;
    System.out.println("xsout connection url: " + out_url);

    return out_url;
  }

  // print the sample program usage message
  public static void printUsage()
  {
    System.out.println("");
    System.out.println("Usage: java xio " + "<xsin_oraclesid> " + "<xsin_host> "
        + "<xsin_port> ");
    System.out.println("                " + "<xsin_username> " + "<xsin_passwd> "
        + "<xsin_servername> ");
    System.out.println("                " + "<xsout_oraclesid> " + "<xsout_host> "
        + "<xsout_port> ");
    System.out.println("                " + "<xsout_username> " + "<xsout_passwd> "
        + "<xsout_servername> ");
  }

  // create a connection to an Oracle database
  public static Connection createConnection(String url,
                                            String username,
                                            String passwd)
  {
    try
    {
      DriverManager.registerDriver(new oracle.jdbc.OracleDriver());
      return DriverManager.getConnection(url, username, passwd);
    }
    catch (Exception e)
    {
      System.out.println("fail to establish DB connection to: " + url);
      e.printStackTrace();
      return null;
    }
  }

  // attach to the XStream inbound server
  public static XStreamIn attachInbound(Connection in_conn)
  {
    XStreamIn xsIn = null;
    try
    {
      xsIn = XStreamIn.attach((OracleConnection) in_conn, xsinName,
                              "XSDEMOINCLIENT", XStreamIn.DEFAULT_MODE);

      // use the last position to decide where we should start sending LCRs
      lastPosition = xsIn.getLastPosition();
      System.out.println("Attached to inbound server:" + xsinName);
      System.out.print("Inbound Server Last Position is: ");
      if (null == lastPosition)
      {
        System.out.println("null");
      }
      else
      {
        printHex(lastPosition);
      }
      return xsIn;
    }
    catch (Exception e)
    {
      System.out.println("cannot attach to inbound server: " + xsinName);
      System.out.println(e.getMessage());
      e.printStackTrace();
      return null;
    }
  }

  // attach to the XStream outbound server
  public static XStreamOut attachOutbound(Connection out_conn)
  {
    XStreamOut xsOut = null;

    try
    {
      // when attaching to an outbound server, the client needs to tell
      // the outbound server the last position
      xsOut = XStreamOut.attach((OracleConnection) out_conn, xsoutName,
                                lastPosition, XStreamOut.DEFAULT_MODE);
      System.out.println("Attached to outbound server:" + xsoutName);
      System.out.print("Last Position is: ");
      if (lastPosition != null)
      {
        printHex(lastPosition);
      }
      else
      {
        System.out.println("NULL");
      }
      return xsOut;
    }
    catch (Exception e)
    {
      System.out.println("cannot attach to outbound server: " + xsoutName);
      System.out.println(e.getMessage());
      e.printStackTrace();
      return null;
    }
  }

  // detach from the XStream inbound server
  public static void detachInbound(XStreamIn xsIn)
  {
    byte[] processedLowPosition = null;
    try
    {
      processedLowPosition = xsIn.detach(XStreamIn.DEFAULT_MODE);
      System.out.print("Inbound server processed low Position is: ");
      if (processedLowPosition != null)
      {
        printHex(processedLowPosition);
      }
      else
      {
        System.out.println("NULL");
      }
    }
    catch (Exception e)
    {
      System.out.println("cannot detach from the inbound server: " + xsinName);
      System.out.println(e.getMessage());
      e.printStackTrace();
    }
  }

  // detach from the XStream outbound server
  public static void detachOutbound(XStreamOut xsOut)
  {
    try
    {
      xsOut.detach(XStreamOut.DEFAULT_MODE);
    }
    catch (Exception e)
    {
      System.out.println("cannot detach from the outbound server: " + xsoutName);
      System.out.println(e.getMessage());
      e.printStackTrace();
    }
  }

  // main loop: receive LCRs from the outbound server and forward
  // them to the inbound server
  public static void get_lcrs(XStreamIn xsIn, XStreamOut xsOut)
  {
    if (null == xsIn)
    {
      System.out.println("xstreamIn is null");
      System.exit(0);
    }

    if (null == xsOut)
    {
      System.out.println("xstreamOut is null");
      System.exit(0);
    }

    try
    {
      while (true)
      {
        // receive an LCR from the outbound server
        LCR alcr = xsOut.receiveLCR(XStreamOut.DEFAULT_MODE);

        if (xsOut.getBatchStatus() == XStreamOut.EXECUTING) // batch is active
        {
          assert alcr != null;
          // send the LCR to the inbound server
          xsIn.sendLCR(alcr, XStreamIn.DEFAULT_MODE);

          // also get chunk data for this LCR, if any
          if (alcr instanceof RowLCR)
          {
            // receive each chunk from the outbound server, then send it inbound
            if (((RowLCR) alcr).hasChunkData())
            {
              ChunkColumnValue chunk = null;
              do
              {
                chunk = xsOut.receiveChunk(XStreamOut.DEFAULT_MODE);
                xsIn.sendChunk(chunk, XStreamIn.DEFAULT_MODE);
              } while (!chunk.isEndOfRow());
            }
          }
          processedLowPosition = alcr.getPosition();
        }
        else // batch has ended
        {
          assert alcr == null;
          // flush the network
          xsIn.flush(XStreamIn.DEFAULT_MODE);
          // get the processed low position from the inbound server
          processedLowPosition = xsIn.getProcessedLowWatermark();
          // update the processed low position at the outbound server
          if (null != processedLowPosition)
            xsOut.setProcessedLowWatermark(processedLowPosition,
                                           XStreamOut.DEFAULT_MODE);
        }
      }
    }
    catch (Exception e)
    {
      System.out.println("exception when processing LCRs");
      System.out.println(e.getMessage());
      e.printStackTrace();
    }
  }

  // print a byte array as two-digit hex
  public static void printHex(byte[] b)
  {
    for (int i = 0; i < b.length; ++i)
    {
      System.out.print(
          Integer.toHexString((b[i] & 0xFF) | 0x100).substring(1, 3));
    }
    System.out.println("");
  }
}


  • Download the compatible ojdbc driver, ojdbc8-19.15.0.0.jar, from Oracle’s website.
  • Copy xstreams.jar from $ORACLE_HOME/rdbms/jlib/.
  • Compile the Java code to create the classes:

javac -cp "ojdbc8-19.15.0.0.jar:xstreams.jar:." xio.java

Once compiled, run the Java replication app. It connects to the outbound server, reads LCRs (Logical Change Records), sends them to the inbound server, and confirms the position/flush status.

[oracle@oracleontario lib]$ java -cp "ojdbc8-19.15.0.0.jar:xstreams.jar:." xio targetdb
192.168.68.85 1521 XSTREAM_ADMIN Welcome123 XIN_SRV dixitdb 192.168.68.79 1521
XSTREAM_ADMIN Welcome123 XOUT_SRV
xsin_host = 192.168.68.85
xsin_port = 1521
xsin_ora_sid = targetdb
xsin connection url: jdbc:oracle:oci:@192.168.68.85:1521:targetdb
xsout_host = 192.168.68.79
xsout_port = 1521
xsout_ora_sid = dixitdb
xsout connection url: jdbc:oracle:oci:@192.168.68.79:1521:dixitdb
Attached to inbound server:XIN_SRV
Inbound Server Last Position is: null
Attached to outbound server:XOUT_SRV
Last Position is: NULL

Now that the code is running and both source and target are ready, let’s do an insert on the source and see if it arrives at the target.

-- on source database
SQL> select * from xstream_admin.demo_table;

        ID NAME                 VALUE
---------- --------------- ----------
        10 Maine                  800


SQL> insert into xstream_admin.demo_table (ID, NAME, VALUE) values (101, 'Calgary', 100);

1 row created.

SQL> commit;

Commit complete.

SQL> /

        ID NAME                 VALUE
---------- --------------- ----------
        10 Maine                  800
       101 Calgary                100



-- on target database
SQL> select * from xstream_admin.demo_table;

        ID NAME                 VALUE
---------- --------------- ----------
        10 Maine                  800
       101 Calgary                100

Hope It Helped!
Prashant Dixit
