Provide a java-friendly API #59
Hi Ben. Can I use this lib from Java at the moment?
It should be possible by instantiating your own `Function1`:

```java
Function1<String, ProducerRecord<String, String>> f =
    new AbstractFunction1<String, ProducerRecord<String, String>>() {
        public ProducerRecord<String, String> apply(String s) {
            return new ProducerRecord<String, String>(topic, s);
        }
    };
```

but I think it's a bit clumsy, which is why I want to provide an API that is more Java-friendly.
How do I call `writeToKafka` when all I have is a `JavaDStream`?
My bad, I didn't take into account the fact that you had a `JavaDStream`. Can't you call `dstream()` and then write from the resulting `DStream`?
I can call `dstream()`, but the implicit conversion that adds `writeToKafka` isn't available from Java.
True, forgot about the implicits and Java. You might want to try:

```java
import static com.github.benfradet.spark.kafka010.write.dStreamToKafkaWriter;

KafkaWriter<String> w = dStreamToKafkaWriter<String, String, String>(javaDStream.dStream());
w.writeToKafka(...);
```
This is the code I wrote:

```java
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("value.serializer", StringSerializer.class);

KafkaWriter<String> kafkaWriter = new DStreamKafkaWriter<>(myStream.dstream(),
    scala.reflect.ClassTag$.MODULE$.apply(String.class));

Function1<String, ProducerRecord<String, String>> f =
    new AbstractFunction1<String, ProducerRecord<String, String>>() {
        @Override
        public ProducerRecord<String, String> apply(final String s) {
            return new ProducerRecord<>("my-topic", s);
        }
    };

kafkaWriter.writeToKafka(props, f, Option.empty());
```

However, it complains that the function is not serializable and throws a serialization stack trace.
Have you tried having the function extend `Serializable`? Something like:

```java
Function1<String, ProducerRecord<String, String>> f =
    new AbstractFunction1<String, ProducerRecord<String, String>>() extends Serializable {
        @Override
        public ProducerRecord<String, String> apply(final String s) {
            return new ProducerRecord<>("my-topic", s);
        }
    };
```
Hey, thanks. It works now. I'll leave my complete code with proper Java syntax here for anyone who's interested:

```java
abstract class MyFunc<T, U> extends AbstractFunction1<T, U> implements Serializable {}

public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("key.serializer", StringSerializer.class);
    props.put("value.serializer", StringSerializer.class);

    KafkaWriter<String> kafkaWriter = new DStreamKafkaWriter<>(myStream.dstream(),
        scala.reflect.ClassTag$.MODULE$.apply(String.class));

    Function1<String, ProducerRecord<String, String>> f =
        new MyFunc<String, ProducerRecord<String, String>>() {
            @Override
            public ProducerRecord<String, String> apply(final String s) {
                return new ProducerRecord<>("my-topic", s);
            }
        };

    kafkaWriter.writeToKafka(props, f, Option.empty());
}
```
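Side note, not from this thread: if your build is on Scala 2.12 or later, `scala.Function1` is a functional interface, so a Java lambda can replace the `MyFunc` wrapper entirely; the intersection cast keeps it serializable for Spark. This is a hedged sketch assuming a 2.12 build; on Scala 2.11 stick with the wrapper class above.

```java
import java.io.Serializable;
import org.apache.kafka.clients.producer.ProducerRecord;
import scala.Function1;

// Requires Scala 2.12+: Function1 is a SAM type there, so a lambda works.
// The "& Serializable" cast lets Spark serialize the closure.
Function1<String, ProducerRecord<String, String>> f =
        (Function1<String, ProducerRecord<String, String>> & Serializable)
                s -> new ProducerRecord<>("my-topic", s);
```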
Great stuff 👍 Feel free to open a PR adding a section to the README with your code.
I've created pull request #60. Please have a look. Thanks.
Does it take care that ProducerRecord is created only once and used across all executors?
What do you mean? The producer record is created in the function.
Most of the write-ups I've seen suggest using a single Producer across all executors.
Yeah, you wrote "ProducerRecord is created only once and used across all executors", that's what confused me. To answer your question, only one Producer is created per executor, since you can't share them across executors.
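For readers wondering what "one Producer per executor" looks like in practice, here is a minimal, hypothetical sketch of the usual pattern: a lazily initialised, JVM-wide producer that every task running on that executor reuses. This illustrates the pattern only; it is not the library's actual internals.

```java
import java.util.Map;
import org.apache.kafka.clients.producer.KafkaProducer;

// Hypothetical illustration of the per-executor producer pattern:
// one KafkaProducer per JVM (i.e. per executor), created lazily and reused.
public final class ProducerHolder {
    private static volatile KafkaProducer<String, String> producer;

    private ProducerHolder() {}

    public static KafkaProducer<String, String> getOrCreate(Map<String, Object> config) {
        if (producer == null) {
            synchronized (ProducerHolder.class) {
                if (producer == null) {
                    // Created at most once per executor JVM, then shared by all tasks.
                    producer = new KafkaProducer<>(config);
                }
            }
        }
        return producer;
    }
}
```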
I have tried this Java code with the latest Maven artifact, using `com.github.benfradet.spark.kafka.writer.DStreamKafkaWriter`. The line

```java
kafkaWriter.writeToKafka(producerConfig, f, Option.empty());
```

gives me a compile error in the Eclipse IDE. Any help appreciated.
You need to turn your Java map into a Scala one:

```
import scala.collection.JavaConverters._
// ...
kafkaWriter.writeToKafka(producerConfig.asScala(), f, Option.empty());
```
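In case the implicit `asScala` decoration trips anyone else up from plain Java (there are no implicits there), the conversion can be spelled out explicitly. A minimal sketch, assuming `producerConfig` is a `java.util.Map<String, Object>`; the exact Scala map type required depends on the `writeToKafka` signature in the version you are using.

```java
import java.util.HashMap;
import java.util.Map;
import scala.collection.JavaConverters;

Map<String, Object> producerConfig = new HashMap<>();
producerConfig.put("bootstrap.servers", "localhost:9092");

// No implicit conversions in Java, so call the converter explicitly.
scala.collection.Map<String, Object> scalaConfig =
        JavaConverters.mapAsScalaMapConverter(producerConfig).asScala();
```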
Hello Ben, doesn't it work on Java RDDs? @BenFradet the above works on a `JavaDStream`; do you have any working sample for plain Java RDDs, please? I've got a mismatch error in the function with my sample code. Cheers!
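Purely as a hedged sketch (verify against the version you are on): assuming the library exposes an `RDDKafkaWriter` mirroring `DStreamKafkaWriter`, the RDD case from Java would follow the same shape, unwrapping the `JavaRDD` and passing a `ClassTag`; `props` and `f` are the config and function from the earlier example.

```java
import com.github.benfradet.spark.kafka.writer.KafkaWriter;
import com.github.benfradet.spark.kafka.writer.RDDKafkaWriter;
import org.apache.spark.api.java.JavaRDD;

// Assumes RDDKafkaWriter exists with an (RDD, ClassTag) constructor,
// analogous to DStreamKafkaWriter above.
JavaRDD<String> javaRdd = /* your RDD of messages */ null;
KafkaWriter<String> rddWriter = new RDDKafkaWriter<>(
        javaRdd.rdd(),                                        // unwrap to the underlying Scala RDD
        scala.reflect.ClassTag$.MODULE$.apply(String.class)); // ClassTag, spelled out from Java

rddWriter.writeToKafka(props, f, scala.Option.empty());
```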