🎄 Twelve Days of SMT 🎄 - Day 11: Filter and Predicate
Apache Kafka 2.6 added support for defining predicates against which transforms are conditionally executed, as well as a Filter Single Message Transform for dropping messages. In combination, these let you conditionally drop messages.

The predicates that ship with Apache Kafka are:

- RecordIsTombstone: matches if the value part of the message is null (denoting a tombstone message)
- HasHeaderKey: matches if a header exists with the given name
- TopicNameMatches: matches based on the topic name

There is also a Filter transform that ships as part of Confluent Platform, which is discussed in this video as well.

---

👾 Demo code and details: https://github.com/confluentinc/demo-...

🗒️ SMT reference:
https://docs.confluent.io/platform/cu...
https://docs.confluent.io/platform/cu...

Learn more about Kafka Connect here:
🏃♂️ Quick: • Kafka Connect in 60 seconds
🚶 More detail: https://rmoff.dev/kafka-connect-zero-...

---

⏱️ Time codes:
00:00:00 Introducing Filter and Predicate
00:01:11 Examining the source data
00:02:11 Unifying the topic names (writing two topics to the same target object)
00:02:36 Conditionally renaming fields in messages in Kafka Connect
00:02:47 Configuring the Single Message Transform to conditionally rename fields based on topic name
00:03:43 Using predicates with Single Message Transforms
00:04:43 Renaming topics with RegexRouter
00:05:06 How to define a predicate
00:05:33 The three predicates included with Apache Kafka
00:06:13 Configuration recap
00:08:34 Tangent: listing the topics used by a connector
00:10:23 Dropping null messages (tombstones) as they pass through Kafka Connect
00:11:59 Examining null records in a Kafka topic with kafkacat
00:12:52 Default behaviour of the JDBC Sink with null records
00:13:21 Diagnosing the cause of a Kafka Connect task failure
00:14:26 Excluding null records (tombstones) from a Kafka Connect pipeline
00:16:42 Using the Filter transform from Confluent Platform
00:17:48 Tangent: what is kafkacat?
00:18:48 Piping output from kafkacat to jq
00:19:22 Filtering messages in Kafka Connect based on the contents of a field
00:20:02 Apache Kafka "Filter" and Confluent Platform "Filter" transformations
00:22:17 Filtering messages in Kafka Connect based on a field's numeric value
00:26:29 Recap

---

☁️ Confluent Cloud: https://confluent.cloud/signup?utm_so...
💾 Download Confluent Platform: https://www.confluent.io/download/?ut...
📺 Kafka Connect connector deep-dives: • Kafka Connect
✍️ Kafka Connect documentation: https://docs.confluent.io/current/con...
🧩 Confluent Hub: https://www.confluent.io/hub/?utm_sou...
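To illustrate the predicate-plus-Filter combination the video walks through, here is a sketch of the relevant fragment of a connector configuration. The transform and predicate class names are the ones that ship with Apache Kafka 2.6+; the alias names (`dropTempTopics`, `isTempTopic`) and the topic pattern `temp-.*` are illustrative placeholders, and the rest of the connector config (connector class, connection details) is omitted.

```json
{
  "transforms": "dropTempTopics",
  "transforms.dropTempTopics.type": "org.apache.kafka.connect.transforms.Filter",
  "transforms.dropTempTopics.predicate": "isTempTopic",

  "predicates": "isTempTopic",
  "predicates.isTempTopic.type": "org.apache.kafka.connect.transforms.predicates.TopicNameMatches",
  "predicates.isTempTopic.pattern": "temp-.*"
}
```

Setting `"transforms.dropTempTopics.negate": "true"` inverts the predicate so the transform applies to every record except the matching ones. Swapping the predicate type for `org.apache.kafka.connect.transforms.predicates.RecordIsTombstone` drops null-value (tombstone) records instead, which is how the video keeps tombstones out of the JDBC Sink pipeline.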
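The Confluent Platform Filter transform shown from 00:16:42 onwards filters on message contents rather than on a predicate. A sketch, assuming Confluent Platform's `io.confluent.connect.transforms.Filter$Value` class with its `filter.condition`/`filter.type` options; the field name `temperature`, the threshold, and the alias are illustrative placeholders:

```json
{
  "transforms": "filterLowReadings",
  "transforms.filterLowReadings.type": "io.confluent.connect.transforms.Filter$Value",
  "transforms.filterLowReadings.filter.condition": "$[?(@.temperature > 20)]",
  "transforms.filterLowReadings.filter.type": "include"
}
```

With `filter.type` set to `include`, only records whose value satisfies the condition pass through; `exclude` drops the matching records instead. The condition is evaluated against the record value, so this transform needs structured values (e.g. JSON or Avro with a declared schema) rather than opaque bytes.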