Commit 1f7c9c1: 1.0.0 release
Parent: 2a31885

26 files changed (+29 / -21 lines)
File renamed without changes.

Dockerfile (+1 / -1)

```diff
@@ -1,6 +1,6 @@
 FROM maven:3.8.1-openjdk-11 as build
 
-COPY connector/ /connector/
+COPY . /connector/
 
 RUN cd /connector && mvn clean package
 
```

Makefile (+2 / -2)

```diff
@@ -1,6 +1,7 @@
 SHELL=/bin/bash
 
-FQIN:=asaintsever/kafkaconnect-httpsinkconnector
+VERSION:=1.0.0
+FQIN:=asaintsever/kafkaconnect-httpsinkconnector:$(VERSION)
 CONTAINER_RUNTIME:=$(shell command -v docker 2> /dev/null || echo podman) # Use docker by default if found, else try podman
 
 .SILENT: ; # No need for @
@@ -11,7 +12,6 @@ CONTAINER_RUNTIME:=$(shell command -v docker 2> /dev/null || echo podman) # Use docker by default if found, else try podman
 
 connector-archive:
 	echo "Building Kafka Connect connector archive ..."
-	cd connector
 	mvn clean package
 
 connector-image:
```
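The `CONTAINER_RUNTIME` line above uses a common shell fallback pattern: `command -v` prints the path of `docker` and exits non-zero when it is not on `PATH`, in which case `||` substitutes `podman`. A standalone sketch of the same pattern (script and message text here are illustrative, not part of the commit):

```shell
#!/bin/sh
# Pick a container runtime: prefer docker if found on PATH, else fall back to podman.
# This mirrors the Makefile's $(shell command -v docker 2> /dev/null || echo podman).
CONTAINER_RUNTIME=$(command -v docker 2> /dev/null || echo podman)
echo "Using runtime: ${CONTAINER_RUNTIME}"
```

On a machine without docker installed this prints `Using runtime: podman`; with docker installed it prints the resolved binary path.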

README.md (+5 / -5)

````diff
@@ -2,7 +2,7 @@
 
 The HTTP Sink Connector is a sample implementation of a Kafka Connect connector. This is a `sink` connector, reading events from kafka topics to send them to some HTTP endpoint.
 
-Using connector's configuration, you can set the list of Kafka topics to read from and the target HTTP endpoint (only one supported is this implementation). You can also provide your own event formatters (see default [PassthroughStringEventFormatter](connector/src/main/java/asaintsever/httpsinkconnector/event/formatter/PassthroughStringEventFormatter.java) as an example) and HTTP authentication providers (see default [NoAuthenticationProvider](connector/src/main/java/asaintsever/httpsinkconnector/http/authentication/NoAuthenticationProvider.java) and [ConfigAuthenticationProvider](connector/src/main/java/asaintsever/httpsinkconnector/http/authentication/ConfigAuthenticationProvider.java) as examples).
+Using connector's configuration, you can set the list of Kafka topics to read from and the target HTTP endpoint (only one supported is this implementation). You can also provide your own event formatters (see default [PassthroughStringEventFormatter](src/main/java/asaintsever/httpsinkconnector/event/formatter/PassthroughStringEventFormatter.java) as an example) and HTTP authentication providers (see default [NoAuthenticationProvider](src/main/java/asaintsever/httpsinkconnector/http/authentication/NoAuthenticationProvider.java) and [ConfigAuthenticationProvider](src/main/java/asaintsever/httpsinkconnector/http/authentication/ConfigAuthenticationProvider.java) as examples).
 
 This connector will batch events before sending them in order to reduce the number of calls and not overwhelm the HTTP endpoint.
 
@@ -137,8 +137,8 @@ ${TOOLSPOD} "curl -s -X GET http://httpsinkconnector-connect-api:8083/connectors
 ```sh
 # Post single events
 # (do not forget the 'jq . -c' part for the JSON sample as we must provide a one line string otherwise kcat will interpret each line of the file as a new value ...)
-cat test/sample.txt | ${KCATPOD} "kcat -b local-kafka-brokers:9092 -P -t raw-events"
-cat test/sample.json | jq . -c | ${KCATPOD} "kcat -b local-kafka-brokers:9092 -P -t json-events-1"
+cat samples/sample.txt | ${KCATPOD} "kcat -b local-kafka-brokers:9092 -P -t raw-events"
+cat samples/sample.json | jq . -c | ${KCATPOD} "kcat -b local-kafka-brokers:9092 -P -t json-events-1"
 ```
 
 Look at the HTTP Sink connector's log (you should see outputs from our test HTTP endpoint):
@@ -152,8 +152,8 @@ ${TOOLSPOD} "curl -s -X GET http://httpsinkconnector-connect-api:8083/connectors
 ```sh
 # Post batches of events
 # (each event must be on one line in the batch file)
-cat test/sample.txt.batch | ${KCATPOD} "kcat -b local-kafka-brokers:9092 -P -t raw-events"
-cat test/sample.json.batch | ${KCATPOD} "kcat -b local-kafka-brokers:9092 -P -t json-events-1"
+cat samples/sample.txt.batch | ${KCATPOD} "kcat -b local-kafka-brokers:9092 -P -t raw-events"
+cat samples/sample.json.batch | ${KCATPOD} "kcat -b local-kafka-brokers:9092 -P -t json-events-1"
 ```
 
 Look at the HTTP Sink connector's log (you should see outputs from our test HTTP endpoint with events sent in batch):
````
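The `jq . -c` step in the README snippets matters because kcat in producer mode emits one record per input line. A quick standalone illustration of the flattening (the sample JSON content here is made up):

```shell
# jq's -c (compact) flag re-serializes pretty-printed JSON onto a single line,
# so the whole document becomes exactly one kcat record instead of one per line.
printf '{\n  "id": 1,\n  "msg": "hello"\n}\n' | jq -c .
# → {"id":1,"msg":"hello"}
```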

deploy/k8s/httpsink-kafkaconnect.yaml (+1 / -1)

```diff
@@ -8,7 +8,7 @@ metadata:
     # needing to call the Connect REST API directly
     strimzi.io/use-connector-resources: "true"
 spec:
-  image: asaintsever/kafkaconnect-httpsinkconnector
+  image: asaintsever/kafkaconnect-httpsinkconnector:1.0.0
   replicas: 1
   bootstrapServers: "PLAINTEXT://local-kafka-brokers:9092"
   config:
```

connector/pom.xml renamed to pom.xml (+20 / -12)

(The first hunk is a whitespace-only re-indentation; the exact indentation was lost in this view and is approximated below.)

```diff
@@ -14,18 +14,18 @@
         <maven.surefire.plugin.version>2.22.0</maven.surefire.plugin.version>
         <maven.failsafe.plugin.version>2.22.0</maven.failsafe.plugin.version>
         <maven.kafka.connect.plugin.version>0.12.0</maven.kafka.connect.plugin.version>
-	<java.version>11</java.version>
+        <java.version>11</java.version>
 
-	<slf4j.version>1.7.30</slf4j.version>
-	<kafka.version>2.7.0</kafka.version>
-	<!-- Not needed except if our connector has to deal with AVRO directly
-	<kafka.avro.version>6.0.1</kafka.avro.version>
-	<avro.version>1.9.2</avro.version>
-	-->
-	<jackson.version>2.10.2</jackson.version>
+        <slf4j.version>1.7.30</slf4j.version>
+        <kafka.version>2.7.0</kafka.version>
+        <!-- Not needed except if our connector has to deal with AVRO directly
+        <kafka.avro.version>6.0.1</kafka.avro.version>
+        <avro.version>1.9.2</avro.version>
+        -->
+        <jackson.version>2.10.2</jackson.version>
 
-	<junit.version>5.6.2</junit.version>
-	<mockito.version>3.3.3</mockito.version>
+        <junit.version>5.6.2</junit.version>
+        <mockito.version>3.3.3</mockito.version>
     </properties>
 
     <repositories>
@@ -35,7 +35,15 @@
             <url>https://packages.confluent.io/maven/</url>
         </repository>
     </repositories>
-
+
+    <licenses>
+        <license>
+            <name>Apache License, Version 2.0</name>
+            <url>https://www.apache.org/licenses/LICENSE-2.0.txt</url>
+            <distribution>manual</distribution>
+        </license>
+    </licenses>
+
     <dependencies>
         <!--
         Kafka Connect API & Runtime *must* not be part of our plugin/package (see Kafka Connect doc):
@@ -176,7 +184,7 @@
                     <sourceUrl>https://github.com/asaintsever/kafka-connect-http-sink/blob/main/connector</sourceUrl>
                     <dockerNamespace>asaintsever</dockerNamespace>
                     <dockerName>kafkaconnect-httpsinkconnector</dockerName>
-                    <dockerTag>latest</dockerTag>
+                    <dockerTag>${project.version}</dockerTag>
                     <componentTypes>
                         <componentType>sink</componentType>
                     </componentTypes>
```
4 files renamed without changes.
