I am using a Spring Boot application to send a JSON data array to a Kafka topic, but I get the following error:
Error:
org.apache.kafka.common.config.ConfigException: Invalid value org.apache.kafka.common.serialization.StringSerializer; for configuration key.serializer: Class org.apache.kafka.common.serialization.StringSerializer; could not be found.
I tried changing the serializer configuration from:
props.put("key.serializer", org.apache.kafka.common.serialization.StringSerializer;");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer;");
to:
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer;");
props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer;");
Configuration class and service class code:
@Configuration
public class KafkaProducerConfig {

    @Bean
    private static ProducerFactory<String, String> producerConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer;");
        props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer;");
        // The following properties are used by LiKafkaProducerImpl
        props.put("large.message.enabled", "true");
        props.put("max.message.segment.bytes", 1000 * 1024);
        props.put("segment.serializer", DefaultSegmentSerializer.class.getName());
        props.put("auditor.class", LoggingAuditor.class.getName());
        return new DefaultKafkaProducerFactory(props);
    }
}
@Service
public class KafkaSender {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaSender.class);

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Value("${kafka.topic.name}")
    private String topicName;

    public void sendData(List<Student> student) {
        System.out.println("Inside Student" + student.toString());
        System.out.println("Inside Student" + student);
        // TODO Auto-generated method stub
        Map<String, Object> headers = new HashMap<>();
        headers.put(KafkaHeaders.TOPIC, topicName);
        System.out.println("\nStudent= " + headers);
        // Construct a JSONObject from a Map.
        JSONObject HeaderObject = new JSONObject(headers);
        System.out.println("\nUsing new JSONObject() ==> " + HeaderObject);
        final String record = HeaderObject.toString();
        final int recordSize = record.length();
        kafkaTemplate.send(new GenericMessage<>(student, headers));
        LOGGER.info("Data - " + student + " sent to Kafka Topic - " + topicName);
    }
}
POST JSON:
{ "studentId":"Q45678123", "firstName":"abc", "lastName":"xyz", "age":"12", "address":{ "apartment":"apt123", "street":"street info", "state":"state", "city":"city", "postCode":"12345" } },
{ "studentId":"Q45678123", "firstName":"abc", "lastName":"xyz", "age":"12", "address":{ "apartment":"apt123", "street":"street info", "state":"state", "city":"city", "postCode":"12345" } }
Posted on 2019-09-14 21:51:05
You need to remove the semicolons from the end of the values:
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");
Alternatively, you can use the class.getName() method, as you already do for the segment serializer. I would suggest that as the safer option, because it guarantees the serializer classes you need are available at compile time.
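For example, a minimal sketch of that approach inside producerConfig() (using the ProducerConfig constants for the keys is optional, but it removes the hand-typed "key.serializer"/"value.serializer" strings as another source of typos):

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.ByteArraySerializer;
import org.apache.kafka.common.serialization.StringSerializer;

// Class references fail at compile time if the serializer class is missing,
// and class.getName() can never produce a stray semicolon in the value.
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());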
https://stackoverflow.com/questions/57934903