Streaming Audio: Apache Kafka® & Real-Time Data

Handling Message Errors and Dead Letter Queues in Apache Kafka ft. Jason Bell


If you’ve ever wondered what exactly dead letter queues (DLQs) are and how to use them, Jason Bell (Senior DataOps Engineer, Digitalis) has an answer for you. A dead letter queue is a Kafka Connect feature that serves as the destination for messages that fail processing due to errors such as improper deserialization or improper formatting. Much of Jason’s work involves Kafka Connect and the Kafka Streams API, and in this episode he explains the fundamentals of dead letter queues, how to use them, and the parameters that control them.
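
For context on how a sink connector opts into this behavior, here is a minimal sketch of the relevant Kafka Connect error-handling settings; the topic name and replication factor are illustrative values, not details from the episode:

    # Sink connector error-handling settings (illustrative values)
    errors.tolerance=all                                  # keep the task running when a record fails
    errors.deadletterqueue.topic.name=dlq-orders          # hypothetical DLQ topic name
    errors.deadletterqueue.topic.replication.factor=3
    errors.deadletterqueue.context.headers.enable=true    # attach error context as record headers
    errors.log.enable=true
    errors.log.include.messages=true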

For example, when deserializing an Avro message, deserialization can fail if the message passed through is not Avro or doesn’t match the expected wire format, at which point the message is rerouted into the dead letter queue for reprocessing. A connector configured with the appropriate converter can then consume the message from the dead letter queue topic and send it on to the sink. So if the failed messages turn out to be JSON, you’ll need another connector with a JSON converter to process them out of the dead letter queue before they can be sent back to the sink.
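
As a sketch of that recovery path, the configuration below shows a second sink connector that re-reads the dead letter queue topic with a JSON converter instead of Avro; the connector name, class, and topic here are hypothetical stand-ins, not details from the episode:

    # Hypothetical recovery connector: re-reads the DLQ with a JSON converter
    name=orders-dlq-reprocessor
    connector.class=io.confluent.connect.jdbc.JdbcSinkConnector   # assumption: same sink type as the original pipeline
    topics=dlq-orders                                              # the DLQ topic from the sketch above
    value.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable=false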

The dead letter queue is configurable for handling deserialization exceptions and producer exceptions. When deciding whether this topic is necessary, consider whether the messages are important and whether there’s a plan to read the queue and investigate why the errors occur. In some scenarios, it’s important to handle the messages manually, or to have a manual process in place for error messages if reprocessing continues to fail. Payment messages, for example, should be dealt with in parallel for a better customer experience.
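
One lightweight way to do that investigation is to read the dead letter queue topic directly and print the error context headers that Kafka Connect attaches when errors.deadletterqueue.context.headers.enable is set; a sketch with the console consumer, using the hypothetical topic name from above:

    kafka-console-consumer --bootstrap-server localhost:9092 \
      --topic dlq-orders --from-beginning \
      --property print.headers=true   # prints the __connect.errors.* headers, e.g. the exception class and message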

Jason also shares some key takeaways on the dead letter queue: 

  • If the message is important, such as a payment, you need to deal with it if it goes into the dead letter queue 
  • To minimize message routing into the dead letter queue, it’s important to ensure successful data serialization at the source
  • When implementing a dead letter queue, you need a process to consume the messages and investigate the errors (see the consumer sketch after this list) 
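
As a minimal sketch of such a process, assuming the confluent-kafka Python client and the hypothetical dlq-orders topic used in the examples above, a small consumer can read the dead letter queue and surface the error context headers for investigation:

    from confluent_kafka import Consumer

    # Minimal DLQ investigation consumer (broker, group, and topic names are illustrative)
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "dlq-investigators",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["dlq-orders"])

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None:
                continue
            if msg.error():
                print(f"Consumer error: {msg.error()}")
                continue
            # Kafka Connect attaches error context as __connect.errors.* headers
            # when errors.deadletterqueue.context.headers.enable=true
            headers = dict(msg.headers() or [])
            source_topic = headers.get("__connect.errors.topic", b"").decode("utf-8", "replace")
            exception = headers.get("__connect.errors.exception.message", b"").decode("utf-8", "replace")
            print(f"Failed record from {source_topic}: {exception}")
            # Decide here whether to fix and re-produce, alert a human, or discard
    finally:
        consumer.close()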


EPISODE LINKS: 

  • Kafka Connect 101: Error Handling and Dead Letter Queues
  • Capacity Planning your Kafka Cluster
  • Tales from the Frontline of Apache Kafka DevOps ft. Jason Bell
  • Tweet: Morning morning (yes, I have tea)
  • Tweet: Kafka dead letter queues 
  • Watch the video version of this podcast
  • Join the Confluent Community
  • Learn more with Kafka tutorials, resources, and guides at Confluent Developer
  • Live demo: Intro to Event-Driven Microservices with Confluent
  • Use PODCAST100 to get an additional $100 of free Confluent Cloud usage (details)
Streaming Audio: Apache Kafka® & Real-Time Data, by Confluent, founded by the original creators of Apache Kafka®
