java - When does Kafka connection require ZooKeeper config? -


The Kafka console consumer seems to require specifying a ZooKeeper instance to connect to:

./kafka-console-consumer.sh --zookeeper myzk.example.com:2181 --topic mytopic 

But it is possible to connect to a Kafka broker directly via the Java API:

import java.util.Properties;

import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class KafkaClient {
  public static void main(String[] args) {
    String topic = "mytopic";

    Properties props = new Properties();
    props.put("bootstrap.servers", "kafka.example.com:9092");
    props.put("acks", "all");
    props.put("retries", 0);
    props.put("batch.size", 16384);
    props.put("linger.ms", 1);
    props.put("buffer.memory", 33554432);
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    Producer<String, String> producer = new KafkaProducer<>(props);

    Callback cb = new Callback() {
      @Override
      public void onCompletion(RecordMetadata rdata, Exception exc) {
        if (exc != null) {
          // onCompletion cannot throw a checked exception; wrap it
          throw new RuntimeException(exc);
        }
      }
    };

    producer.send(new ProducerRecord<String, String>(topic, "somekey", "someval"), cb);
    producer.close();
  }
}

Is there a way to run a consumer without specifying a ZooKeeper node? If not, why?

This depends on the version of the consumer API being used. As of the latest Kafka release, 0.10.1, the new consumer API, which targets the brokers directly, is used by default by the console consumer. In versions prior to 0.10.1, the console consumer defaults to the older API, which targets ZooKeeper, but it can be switched to the new consumer API by passing parameters such as --new-consumer and --bootstrap-server somebroker:9092 on the command line.
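Concretely, on a pre-0.10.1 console consumer the ZooKeeper argument can be replaced with a broker address like this (the broker host somebroker:9092 is a placeholder; substitute one of your own brokers):

```shell
./kafka-console-consumer.sh --new-consumer --bootstrap-server somebroker:9092 --topic mytopic
```

On 0.10.1 and later, --bootstrap-server alone is enough, since the new consumer is already the default.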
