Confluent Kafka Connect JDBC connector showing hexadecimal data in the Kafka topic

April 2019

I'm trying to copy the data from a table in an Oracle DB into a Kafka topic. I'm using the following JDBC source connector configuration:

name=JDBC-DB-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.password=*******
connection.url=jdbc:oracle:thin:@1.1.1.1:1111/ABCD
connection.user=*****
table.types=TABLE
query=select * from (SELECT * FROM JENNY.WORKFLOW where ID = '565231')

key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081

value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081

mode=timestamp+incrementing
incrementing.column.name=ID
timestamp.column.name=MODIFIED

topic.prefix=workflow_data12
poll.interval.ms=6000
timestamp.delay.interval.ms=60000

transforms=createKey
transforms.createKey.type=org.apache.kafka.connect.transforms.ValueToKey
transforms.createKey.fields=ID

So far so good. I'm able to get the data into my Kafka topic, but the output looks like this:

key - {"ID":"\u0001"}   
value - {"ID":"\u0001","MODIFIED":1874644537368}
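By default the JDBC source connector serializes Oracle NUMBER columns as the Avro `decimal` logical type, whose raw value is a big-endian two's-complement byte array; the console consumer prints those bytes as escaped characters like `\u0001`. A minimal Python sketch (a hypothetical helper, not part of the connector) showing how such bytes decode:

```python
from decimal import Decimal

def decode_avro_decimal(raw: bytes, scale: int = 0) -> Decimal:
    """Decode Avro 'decimal' logical-type bytes: a big-endian
    two's-complement unscaled integer, shifted right by `scale` digits."""
    unscaled = int.from_bytes(raw, byteorder="big", signed=True)
    return Decimal(unscaled).scaleb(-scale)

# "\u0001" is the single byte 0x01, i.e. the number 1
print(decode_avro_decimal(b"\x01"))          # -> 1
# an ID like 565231 would serialize as the bytes 0x08 0x9F 0xEF
print(decode_avro_decimal(b"\x08\x9f\xef"))  # -> 565231
```

(The full payload bytes for the real ID would be longer; the escaped output above may be truncated by the console.)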

You can see that my key "ID" is being printed in hexadecimal/escaped format, even though I'm using Avro in my JDBC properties file.

(I'm using kafka-avro-console-consumer to view the data on the command line.)

(The column "ID" is of type NUMBER in the Oracle DB.)

Could anyone point out whether I'm missing some property needed to print the data properly in Avro format?

Thanks in advance!!

1 Answer

0

Add this property to your .properties file, e.g. before query:

numeric.mapping=best_fit

A detailed explanation can be found here.
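For context, a sketch of where the line would sit in the source config above (assuming the rest of the file is unchanged; `...` elides the lines shown in the question):

```
name=JDBC-DB-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
...
numeric.mapping=best_fit
query=select * from (SELECT * FROM JENNY.WORKFLOW where ID = '565231')
...
```

With `best_fit`, the connector maps NUMBER columns to the narrowest matching primitive type (int, long, or double) instead of the Avro decimal byte-array encoding, so the key should then print as a plain number.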