This video is part 1 of the Snowpipe streaming implementation demonstration. In this video we complete the setup of Kafka and Python and see how Kafka works in a real project scenario. In part 2 we will use this setup to ingest the Kafka topic events into Snowflake in real time. Below is the Kafka setup file I used.

-------------------------------------------------------------------------------------------------------------------------------------

Set up Kafka:
--------------------------------------------------
Download and extract the Kafka binaries: https://kafka.apache.org/downloads

Create the logs directories:
------------------------------------------------
kafka_logs/zookeeper
kafka_logs/server_logs

Change zookeeper.properties:
------------------------------------------------------
dataDir=E:/kafka_logs/zookeeper
maxClientCnxns=1

Change server.properties:
----------------------------------------------------
uncomment listeners
log.dirs=E:/kafka_logs/kafka_server
zookeeper.connection.timeout.ms=90000

Start Zookeeper (ensure Java is installed):
---------------------------------------
E:/kafka_2.12-3.6.0/bin/windows/zookeeper-server-start.bat E:/kafka_2.12-3.6.0/config/zookeeper.properties

ERROR: The system cannot find the path specified.
Solution: set JAVA_HOME to your JDK installation path, e.g. in PowerShell: $env:JAVA_HOME = ""

Start the Kafka server:
-----------------------------------------
E:/kafka_2.12-3.6.0/bin/windows/kafka-server-start.bat E:/kafka_2.12-3.6.0/config/server.properties

Create a topic:
------------------------------------
E:/kafka_2.12-3.6.0/bin/windows/kafka-topics.bat --create --topic hello_world --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1

Start a producer:
--------------------------------------
E:/kafka_2.12-3.6.0/bin/windows/kafka-console-producer.bat --topic hello_world --bootstrap-server localhost:9092

Start a consumer:
-------------------------------------
E:/kafka_2.12-3.6.0/bin/windows/kafka-console-consumer.bat --topic hello_world --from-beginning --bootstrap-server localhost:9092

Install kafka-python:
--------------------------------------------------
pip install kafka-python

Python code:
----------------------------------
from time import sleep
from json import dumps
from kafka import KafkaProducer

topic_name = 'hello_world'

# serialize each dict to JSON bytes before sending
producer = KafkaProducer(
    bootstrap_servers=['localhost:9092'],
    value_serializer=lambda x: dumps(x).encode('utf-8'),
)

# send one JSON event every two seconds
for e in range(500):
    data = {'message_counter': e}
    print(data)
    producer.send(topic_name, value=data)
    sleep(2)

-------------------------------------------------------------------------------------------------------------------------------------
#snowflake #snowflaketutorial #snowflakedeveloper #snowpipe #snowflakeStreaming #snowpipeStreaming #python #kafka #database #datascience #dataengineering #tutorial #dataintegration #datawarehouse #datawarehousing #realtime #dbt #datamodeling #datamodelling #streamlit #plsql #oracle #postgresql