I receive the error "Please upload a local file" when I try to upload the MySQL driver in the Developer Edition. I have tried uploading both the MySQL and MariaDB drivers and I always receive the same error.
I am trying to modify an existing Striim app that routes events to different target writers based on the table name. I would like to know whether two different WHEN clauses can route to the same stream. Based on the Striim Router documentation, can the sample code be modified to route the events of two WHEN clauses to `stream_one`, as shown below?

```sql
CREATE ROUTER myRouter INPUT FROM mySourceStream AS src
CASE
  WHEN TO_INT(src.data[1]) < 150 THEN ROUTE TO stream_one,
  WHEN TO_INT(src.data[1]) >= 150 THEN ROUTE TO stream_two,
  WHEN meta(src,"TableName").toString() like 'QATEST.TABLE_%' THEN ROUTE TO stream_one,
ELSE ROUTE TO stream_else;
```

Exploring other possible ways to resolve this: is it possible to write the above as Boolean logic?

```sql
CREATE OR REPLACE ROUTER myRouter INPUT FROM mySourceStream AS src
CASE
  WHEN meta(src,"TableName").toString() like 'QATEST.USERS' THEN ROUTE TO stream_one
  OR WHEN meta(src,"TableName").toString() like 'QATEST.ACCOUNTS' THEN ROUTE
```
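For what it's worth, a minimal sketch of the Boolean-logic idea, assuming a router WHEN clause accepts a compound condition (so `OR` combines predicates inside a single WHEN rather than joining two WHEN clauses; this syntax is illustrative and should be checked against the Router documentation):

```sql
CREATE OR REPLACE ROUTER myRouter INPUT FROM mySourceStream AS src
CASE
  -- one WHEN whose condition ORs the two table-name tests,
  -- so events from both tables land on the same output stream
  WHEN meta(src,"TableName").toString() like 'QATEST.USERS'
       OR meta(src,"TableName").toString() like 'QATEST.ACCOUNTS'
    THEN ROUTE TO stream_one,
ELSE ROUTE TO stream_else;
```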
Hey! I have the following setup: MySQL → Striim → Kafka. This all works as expected, but if I have multiple MySQL tables, all changes are streamed into one single Kafka topic. Is there a way to configure the KafkaWriter to stream the messages to a dedicated topic for each table? For example: events from table `users` should go to a topic `users`, and events from table `items` should go to a topic `items`. As far as I can see in the docs, you can only specify one topic per KafkaWriter.
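One workaround sketch, assuming each KafkaWriter does indeed take a single topic: use a router to split the CDC stream by table name, then attach a dedicated KafkaWriter per stream (stream, target, and table names below are illustrative):

```sql
-- fan the CDC stream out into one stream per table
CREATE ROUTER tableRouter INPUT FROM cdcStream AS src
CASE
  WHEN meta(src,"TableName").toString() like '%.users' THEN ROUTE TO usersStream,
  WHEN meta(src,"TableName").toString() like '%.items' THEN ROUTE TO itemsStream,
ELSE ROUTE TO otherStream;

-- one writer per stream, each with its own topic
CREATE TARGET usersTarget USING KafkaWriter (
  brokerAddress: 'broker:9092',
  topic: 'users'
)
FORMAT USING JSONFormatter ()
INPUT FROM usersStream;
```

The same pattern repeats for `itemsStream` with `topic: 'items'`.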
I’ve got the following setup: MySQL → Striim → Kafka + CSR with Avro encoding. The table I’m replicating is pretty simple:

```sql
CREATE TABLE `items` (
  `id` bigint unsigned NOT NULL AUTO_INCREMENT,
  `name` varchar(100) DEFAULT NULL,
  `category` varchar(100) DEFAULT NULL,
  `price` decimal(7,2) DEFAULT NULL,
  `inventory` int DEFAULT NULL,
  `inventory_updated_at` timestamp NULL DEFAULT CURRENT_TIMESTAMP,
  `created_at` timestamp NULL DEFAULT CURRENT_TIMESTAMP,
  `updated_at` datetime DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`id`),
  UNIQUE KEY `id` (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=2001 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci
```

However, the `bigint unsigned` column seems to cause the following error when Striim starts ingesting messages:

Message: KafkaWriter is unable to produce events to shop. Component Name: Kafka_2_1_0_mysql_kafka_csr_Target. Component Type: TARGET. Cause: KafkaWriter is unable to produce events to shop. Cause: Suitable Avro type not found fo
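A possible workaround sketch, assuming the mapping failure comes from `bigint unsigned` (which JDBC drivers often surface as `BigInteger`, a type with no direct Avro equivalent): insert a CQ between the source and the target that rewrites the column as a string before the AvroFormatter sees it. The `replaceData` call and the column name are illustrative and should be verified against the WAEvent CQ documentation:

```sql
-- rewrite the id column as a string so the AvroFormatter
-- sees a type it knows how to map
CREATE CQ CastIdToString
INSERT INTO avroSafeStream
SELECT replaceData(src, 'id', TO_STRING(data[0]))
FROM cdcStream src;
```

The KafkaWriter would then read from `avroSafeStream` instead of the raw CDC stream.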
I’d like to convert a timestamp from a location-specific one (in areas that observe summer time changes) to UTC. What would be the best way to do it within Striim? I assume it’s doable using the Joda API, and I can see in the reference that “Striim supports all date functions natively associated with Joda-Time,” but I didn’t find an example of how to call those Joda-Time functions.
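A sketch of what such a CQ might look like, assuming `TO_DATE` returns a Joda-Time `DateTime` and that Joda methods can be chained directly in a CQ expression (the data-array index and the chaining syntax are illustrative; Joda's `withZone` re-expresses the same instant in another zone, which is what handles the summer/winter offset correctly):

```sql
CREATE CQ ToUtc
INSERT INTO utcStream
SELECT
  -- interpret the source value as a DateTime, then re-express it in UTC;
  -- withZone keeps the same instant, so DST offsets are handled
  TO_DATE(data[5]).withZone(org.joda.time.DateTimeZone.UTC) AS eventTimeUtc
FROM localStream;
```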
Trying to set up:
Source: MySQL replication hosted in RDS, which works on its own.
Target: Kafka (works on its own) + CSR (untested) hosted on Confluent Cloud, using the AvroFormatter.
The CSR is untested because it doesn't seem like CSR is supported with the built-in load generators, though maybe I'm wrong. When I deploy, I get:
Error Deploy failed! java.util.concurrent.ExecutionException: java.lang.reflect.InvocationTargetException
I’ve tested the connection to the CSR and it works, so I don’t believe it’s a credential issue.
I am building a CQ to inject NULL values for string data fields that contain “Not Available” in the source database.

```sql
CASE WHEN META(o,"TableName").toString()=="<schema-name>.<table-name>" then
  CASE
    WHEN TO_STRING(data[12]) = "Not Available" THEN putUserData(o, 'DENOMINATOR', "NULL")
    WHEN TO_STRING(data[13]) = "Not Available" THEN putUserData(o, 'SCORE', NULL)
    WHEN TO_STRING(data[14]) = "Not Available" THEN putUserData(o, 'LOWER_ESTIMATE', NULL)
    WHEN TO_STRING(data[15]) = "Not Available" THEN putUserData(o, 'HIGHER_ESTIMATE', NULL)
    ELSE O
```

I am receiving the following compile error:

Error Saving Component: Syntax error at: CASENULL) Please fix the errors and click "Save" again.

I also tried the following syntax:

```sql
CASE
  WHEN TO_STRING(data[12]) = "Not Available" THEN NULL
  WHEN TO_STRING(data[13]) = "Not Available" THEN NULL
  WHEN TO_STRING(data[14]) = "Not Available" THEN NULL
  WHEN TO_STRING(data[15]) = "Not Available" THEN NULL
```

Reference docs: https://www.striim.com/docs/en/handling-null
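For comparison, a minimal sketch of a well-formed alternative, assuming the goal is one complete `CASE ... END` expression per output column rather than one CASE spanning several columns (the column indexes and names are carried over from the snippet above and remain illustrative):

```sql
CREATE CQ NullNotAvailable
INSERT INTO cleanedStream
SELECT
  -- each column gets its own complete CASE ... END expression,
  -- emitting NULL for "Not Available" and the original value otherwise
  CASE WHEN TO_STRING(data[12]) = "Not Available" THEN NULL ELSE data[12] END AS DENOMINATOR,
  CASE WHEN TO_STRING(data[13]) = "Not Available" THEN NULL ELSE data[13] END AS SCORE,
  CASE WHEN TO_STRING(data[14]) = "Not Available" THEN NULL ELSE data[14] END AS LOWER_ESTIMATE,
  CASE WHEN TO_STRING(data[15]) = "Not Available" THEN NULL ELSE data[15] END AS HIGHER_ESTIMATE
FROM sourceStream o;
```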
Are there any examples of using Striim’s KafkaWriter to stream data to a topic in Confluent Cloud?
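Not an official example, but a sketch of what a Confluent Cloud target might look like, assuming KafkaWriter passes extra client settings through a KafkaConfig-style property. The security settings below are standard Kafka client properties for Confluent Cloud's SASL_SSL authentication; the Striim-side property names, and the separator conventions inside KafkaConfig, should be verified against the KafkaWriter documentation:

```sql
CREATE TARGET ConfluentCloudTarget USING KafkaWriter (
  -- Confluent Cloud bootstrap endpoint (placeholder)
  brokerAddress: 'pkc-xxxxx.us-east-1.aws.confluent.cloud:9092',
  topic: 'my_topic',
  -- standard Kafka client settings for Confluent Cloud SASL_SSL auth
  KafkaConfig: 'security.protocol=SASL_SSL;sasl.mechanism=PLAIN;sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";'
)
FORMAT USING JSONFormatter ()
INPUT FROM mySourceStream;
```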