3

I'm looking for a tool to export data from a PostgreSQL DB to an Oracle data warehouse. I'm really looking for a heterogeneous DB replication tool, rather than an export->convert->import solution.

Continuent Tungsten Replicator looks like it would do the job, but PostgreSQL support won't be ready for another couple of months.

Are there any open-source tools out there that will do this? Or am I stuck with some kind of scheduled pg_dump/SQL*Loader solution?

4 Answers

3

You can create a database link from Oracle to Postgres (Oracle calls this heterogeneous connectivity). This makes it possible to select data from Postgres with a SELECT statement issued in Oracle. You can then use materialized views to schedule and store the results of those selects.
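For illustration, a rough sketch of what that can look like on the Oracle side. It assumes a Heterogeneous Services/ODBC gateway alias for the Postgres database (called PGLINK here) has already been configured in the listener and tnsnames.ora; all object names and credentials below are placeholders:

    -- Assumes an ODBC gateway alias PGLINK for the Postgres database is already
    -- configured (initPGLINK.ora, listener.ora, tnsnames.ora); names are placeholders.
    CREATE DATABASE LINK pglink
      CONNECT TO "pg_user" IDENTIFIED BY "pg_password"
      USING 'PGLINK';

    -- Postgres identifiers are usually lower case, so quote them through the link.
    SELECT COUNT(*) FROM "orders"@pglink;

    -- Snapshot the table into Oracle and refresh it every hour.
    CREATE MATERIALIZED VIEW orders_snapshot
      BUILD IMMEDIATE
      REFRESH COMPLETE
      START WITH SYSDATE
      NEXT SYSDATE + 1/24
    AS
      SELECT * FROM "orders"@pglink;

Note that fast (incremental) refresh relies on materialized view logs on the source, which a non-Oracle database can't provide, so refreshes over a gateway link are complete refreshes.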


1 Comment

I'd like a solution that is more about transferring deltas rather than periodically re-querying, but this is definitely a better alternative to dump/import, thanks.

2

It sounds like SymmetricDS would work for your scenario. SymmetricDS is web-enabled, database-independent data synchronization/replication software. It uses web and database technologies to replicate tables between relational databases in near real time.
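For a rough idea of how it is configured, replication is driven by rows inserted into its sym_* configuration tables. This is only a sketch loosely following the SymmetricDS quick-start, with made-up node group and table names; check the exact table and column names against the documentation for your version:

    -- Hypothetical node groups for the Postgres source and Oracle warehouse.
    insert into sym_node_group (node_group_id) values ('pg_source');
    insert into sym_node_group (node_group_id) values ('ora_dw');
    insert into sym_node_group_link (source_node_group_id, target_node_group_id, data_event_action)
        values ('pg_source', 'ora_dw', 'P');  -- 'P' = push changes to the target

    -- Capture changes on the "orders" table and route them to the warehouse group.
    insert into sym_trigger (trigger_id, source_table_name, channel_id, last_update_time, create_time)
        values ('orders', 'orders', 'default', current_timestamp, current_timestamp);
    insert into sym_router (router_id, source_node_group_id, target_node_group_id, router_type, create_time, last_update_time)
        values ('pg_to_ora', 'pg_source', 'ora_dw', 'default', current_timestamp, current_timestamp);
    insert into sym_trigger_router (trigger_id, router_id, initial_load_order, last_update_time, create_time)
        values ('orders', 'pg_to_ora', 100, current_timestamp, current_timestamp);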


0

Sounds like you want an ETL (extract, transform, load) tool. There are a lot of open-source options; Enhydra Octopus and Talend Open Studio are a couple I've come across. In general, ETL tools offer you more flexibility than straight replication. Some also offer scheduling, data-quality, and data-lineage features.


0

Consider using the Confluent Kafka Connect JDBC sink and source connectors if you'd like to replicate data changes across heterogeneous databases in near real time. The source connector can select the entire database, particular tables, or the rows returned by a provided query, and send the data as Kafka messages to your Kafka broker. The source connector can calculate the diffs based on an incrementing ID column, a timestamp column, or be run in bulk mode where the entire contents are recopied periodically. The sink can read these messages, optionally check them against an Avro or JSON schema, and populate the target database with the results. It's all free, and sink and source connectors exist for many relational and non-relational databases.
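As a rough sketch (standalone-mode property files; the host names, credentials, table and column names below are made up, and the exact options should be checked against the Confluent JDBC connector docs), a timestamp+incrementing source on Postgres feeding an upsert sink on Oracle could look like:

    # jdbc-source-postgres.properties -- poll the Postgres table for new/updated rows
    name=jdbc_source_postgres
    connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
    connection.url=jdbc:postgresql://pg-host:5432/appdb?user=replicator&password=secret
    mode=timestamp+incrementing
    incrementing.column.name=id
    timestamp.column.name=updated_at
    table.whitelist=orders
    topic.prefix=pg-
    poll.interval.ms=5000

    # jdbc-sink-oracle.properties -- upsert those messages into the Oracle warehouse
    name=jdbc_sink_oracle
    connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
    connection.url=jdbc:oracle:thin:@ora-host:1521/DWPDB
    connection.user=dw_loader
    connection.password=secret
    topics=pg-orders
    insert.mode=upsert
    pk.mode=record_value
    pk.fields=id
    auto.create=true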

One major caveat: some JDBC Kafka connectors cannot capture hard deletes, since a deleted row simply stops showing up in the polling query's results.

To get around that limitation, you can use a log-based change-data-capture connector such as Debezium (http://www.debezium.io); see also Delete events from JDBC Kafka Connect Source.

