Dear all, I am trying to create two pipelines from Oracle to Postgres. The first is an initial load from Oracle to Postgres; it worked very well, and the schema conversion tool migrated two example tables to Postgres. However, I have an issue with Oracle CDC.

I have a LogMiner user on Oracle named striim, a table owner named scale_user, and two tables, scale_data and scale_data2. I am not using a container or PDB on 21c; striim and scale_user are common users.

I am receiving the error below when I insert data into the tables after a commit. I can see that the reader is looking for tables like CDB$ROOT.TEST.SCALE_DATA, but I am not using a container, so how can I resolve the issue? There was no problem with the initial load; the problem is with the CDC process. Thanks in advance.

[{ "_id" : null, "timeStamp" : 1707166962232, "originTimeStamp" : 1707166964000, "key" : null, "sourceUUID" : { "uuidstring" : "01eec465-3101-b301-b854-42010a9c000a" }, "data" : [ "862412", "1", "1", "460" ], "metadata" : { "RbaSqn" : "50
Hey! I have the following setup: MySQL → Striim → Kafka. This all works as expected, but if I have multiple MySQL tables, all changes are streamed into one single Kafka topic. Is there a way to configure the KafkaWriter to stream the messages to a dedicated topic for each table? For example: events from table `users` should go to a topic `users`, and events from table `items` should go to a topic `items`. As far as I can see in the docs, you can only specify one topic per KafkaWriter.
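For context, the per-table routing being asked about looks like the sketch below when written as a plain routing step. Whether KafkaWriter exposes an equivalent dynamic-topic setting is exactly the open question here, so this is only a generic illustration of the pattern and every name in it is hypothetical.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PerTableTopics {
    // Hypothetical change event: which table it came from, plus a payload.
    record Change(String table, String payload) {}

    // Route each event to a topic named after its source table.
    static Map<String, List<String>> route(List<Change> events) {
        Map<String, List<String>> topics = new HashMap<>();
        for (Change e : events) {
            topics.computeIfAbsent(e.table(), t -> new java.util.ArrayList<>())
                  .add(e.payload());
        }
        return topics;
    }

    public static void main(String[] args) {
        var routed = route(List.of(
                new Change("users", "u1 created"),
                new Change("items", "i9 updated")));
        System.out.println(routed.keySet()); // topic per table (order may vary)
    }
}
```

A real implementation would replace the in-memory map with one producer that sets the topic on each record from the event's table metadata; one producer can serve many topics.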
I’ve got the following setup: MySQL → Striim → Kafka + CSR with Avro encoding. The table that I’m replicating is pretty simple:

CREATE TABLE `items` (
  `id` bigint unsigned NOT NULL AUTO_INCREMENT,
  `name` varchar(100) DEFAULT NULL,
  `category` varchar(100) DEFAULT NULL,
  `price` decimal(7,2) DEFAULT NULL,
  `inventory` int DEFAULT NULL,
  `inventory_updated_at` timestamp NULL DEFAULT CURRENT_TIMESTAMP,
  `created_at` timestamp NULL DEFAULT CURRENT_TIMESTAMP,
  `updated_at` datetime DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`id`),
  UNIQUE KEY `id` (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=2001 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci

However, the bigint seems to be causing the following error when Striim starts ingesting messages:

Message: KafkaWriter is unable to produce events to shop. Component Name: Kafka_2_1_0_mysql_kafka_csr_Target. Component Type: TARGET. Cause: KafkaWriter is unable to produce events to shop. Cause: Suitable Avro type not found fo
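One plausible culprit (an assumption; the truncated error doesn't confirm it): MySQL's `bigint unsigned` holds values up to 2^64 − 1, while Avro's `long` primitive is a signed 64-bit integer capped at 2^63 − 1, so there is no lossless Avro primitive for that column. A quick arithmetic check:

```java
import java.math.BigInteger;

public class UnsignedBigintCheck {
    // Largest value a MySQL `bigint unsigned` column can store: 2^64 - 1
    static final BigInteger MYSQL_BIGINT_UNSIGNED_MAX =
            BigInteger.TWO.pow(64).subtract(BigInteger.ONE);

    // Largest value an Avro `long` (signed 64-bit) can represent: 2^63 - 1
    static final BigInteger AVRO_LONG_MAX = BigInteger.valueOf(Long.MAX_VALUE);

    public static void main(String[] args) {
        System.out.println("MySQL bigint unsigned max: " + MYSQL_BIGINT_UNSIGNED_MAX);
        System.out.println("Avro long max:             " + AVRO_LONG_MAX);
        // The unsigned range overflows the signed one, so a schema mapper that
        // only considers Avro primitives has no "suitable" type to pick.
        System.out.println("Fits in Avro long? "
                + (MYSQL_BIGINT_UNSIGNED_MAX.compareTo(AVRO_LONG_MAX) <= 0));
    }
}
```

If that is indeed the cause, the usual workarounds are mapping the column to a signed type or a string on the source side; whether Striim offers a setting for this here I can't confirm from the post.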
Hi Team, can we create a PostgreSQL-to-Oracle CDC pipeline to make sure Oracle stays in sync with PostgreSQL post-migration? Regards, Mahesh
We’ve observed a number of lost updates when running CDC. The state after the initial load seems fine, but later on we came across NO_OP_UPDATES. I’ve ignored the exception for the CDC, but that resulted in some of the updates being lost and records left in a stale state. It seems that it might be connected with updates being in the same batch as the initial insert. How are events ordered within a single batch? Have you observed such scenarios? Source is SQL Server and target is Postgres, in case that’s relevant.
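I can't speak for how Striim actually orders events inside a batch, but here is a minimal, hypothetical sketch of the failure mode described: if an UPDATE that shares a batch with its row's INSERT is applied before the INSERT, the update finds no matching row, becomes a no-op, and the row is left stale. All names are illustrative.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BatchOrderingSketch {
    // Apply a batch of ("INSERT"/"UPDATE", id, value) events to a key-value "table".
    static Map<Integer, String> apply(List<String[]> batch) {
        Map<Integer, String> table = new HashMap<>();
        for (String[] ev : batch) {
            int id = Integer.parseInt(ev[1]);
            if (ev[0].equals("INSERT")) {
                table.put(id, ev[2]);
            } else if (table.containsKey(id)) {
                table.put(id, ev[2]);          // normal update
            }
            // UPDATE with no matching row is silently dropped: a "no-op update"
        }
        return table;
    }

    public static void main(String[] args) {
        // Correct order: insert then update -> final value "v2"
        List<String[]> ordered = List.of(
                new String[]{"INSERT", "1", "v1"},
                new String[]{"UPDATE", "1", "v2"});
        // Reordered batch: update applied before its insert -> stale "v1"
        List<String[]> reordered = List.of(
                new String[]{"UPDATE", "1", "v2"},
                new String[]{"INSERT", "1", "v1"});
        System.out.println(apply(ordered));    // {1=v2}
        System.out.println(apply(reordered));  // {1=v1}  <- lost update
    }
}
```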
I’d like to convert a timestamp from a location-specific one (in areas that observe daylight saving time) to UTC. What would be the best way to do it within Striim? I assume it’s doable using the Joda API, and I can see in the reference that “Striim supports all date functions natively associated with Joda-Time,” but I didn’t find an example of how to call those Joda-Time functions.
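As a sketch of the conversion itself: in Joda-Time the idiom is `new DateTime(..., DateTimeZone.forID("Europe/Berlin")).withZone(DateTimeZone.UTC)`, with DST handled by the zone database. The same conversion with the JDK's `java.time` (whose design Joda-Time inspired) looks like this; the zone ID and timestamps are examples, not values from the post.

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

public class ToUtc {
    // Interpret a wall-clock timestamp in a DST-observing zone, then convert to UTC.
    static ZonedDateTime toUtc(LocalDateTime local, String zoneId) {
        return local.atZone(ZoneId.of(zoneId))
                    .withZoneSameInstant(ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        // Summer: Berlin is UTC+2 (CEST), so 12:00 local is 10:00 UTC
        System.out.println(toUtc(LocalDateTime.of(2024, 7, 1, 12, 0), "Europe/Berlin"));
        // Winter: same wall-clock time, but Berlin is UTC+1 (CET) -> 11:00 UTC
        System.out.println(toUtc(LocalDateTime.of(2024, 1, 1, 12, 0), "Europe/Berlin"));
    }
}
```

The key point for DST zones is converting via a named region zone (`Europe/Berlin`) rather than a fixed offset, so the summer/winter shift is applied automatically.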
I am trying to write a CQ with a MODIFY statement that replaces all “Not Available” values with NULL, but I am getting the error “Left expression do not refer to an object”. Please refer to the screenshot for reference.
Hi All, greetings! We need to migrate data from Oracle to PostgreSQL using Striim. In this case, does Striim provide a data reconciliation report after loading the data into the target system? Regards, Mahesh
Trying to set up: Source: MySQL replication hosted in RDS, which works on its own. Target: Kafka (works on its own) + CSR (untested) hosted on Confluent Cloud, using the AvroFormatter. The CSR is untested because it doesn't seem like CSR is supported with the built-in load generators, though maybe I'm wrong. When I deploy it, I get: Error Deploy failed! java.util.concurrent.ExecutionException: java.lang.reflect.InvocationTargetException. I’ve tested the connection to the CSR and it works, so I don’t believe it’s a credential issue.
Note: To follow this best practices guide, you must have the Persisted Streams add-on in Striim Cloud or Striim Platform.

Introduction

Change Data Capture (CDC) is a critical methodology, particularly in scenarios demanding real-time data integration and analytics. CDC is a technique designed to efficiently capture and track changes made in a source database, thereby enabling real-time data synchronization and streamlining the process of updating data warehouses, data lakes, or other systems.

Change Data Capture to Multiple Subscribers

It is common for organizations to stream transactional data from a database to multiple consumers – whether it be different lines of the business or separate technical infrastructure (databases, data warehouses, and messaging systems like Kafka). However, a common anti-pattern in CDC implementation is creating a separate read client for each subscriber. This might seem intuitive but is actually inefficient due to competing I/O and additional overhead created on t
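The pattern recommended above can be sketched abstractly: instead of N readers each tailing the source's transaction log, one reader publishes each change once to a durable stream, and N subscribers consume it independently. This is a generic illustration of the fan-out idea, not Striim's persisted-stream implementation; all names are made up.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class FanOutSketch {
    // A single CDC reader appends each change once; every subscriber gets a copy.
    static class ChangeStream {
        private final List<Consumer<String>> subscribers = new ArrayList<>();

        void subscribe(Consumer<String> subscriber) {
            subscribers.add(subscriber);
        }

        // One read from the source serves every subscriber: no competing I/O
        // against the database's transaction log.
        void publish(String changeEvent) {
            subscribers.forEach(s -> s.accept(changeEvent));
        }
    }

    public static void main(String[] args) {
        ChangeStream stream = new ChangeStream();
        List<String> warehouse = new ArrayList<>();
        List<String> kafkaFeed = new ArrayList<>();
        stream.subscribe(warehouse::add);   // e.g. a data-warehouse writer
        stream.subscribe(kafkaFeed::add);   // e.g. a Kafka topic writer

        stream.publish("UPDATE orders SET status='shipped' WHERE id=7");
        System.out.println(warehouse.size() + " " + kafkaFeed.size()); // both received it
    }
}
```

A durable (persisted) stream adds replay on top of this, so a new or lagging subscriber can catch up without re-reading the source.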