Building ETL and Data Pipelines with Bash, Airflow and Kafka
IBM

This course provides you with practical skills to build and manage data pipelines and Extract, Transform, Load (ETL) processes using shell/python scripts, Airflow and Kafka.

3 hrs/week · 5 weeks · English · 9,787 enrolled

About this Course

Well-designed, automated data pipelines and ETL processes are the foundation of a successful Business Intelligence platform. Defining your data workflows, pipelines, and processes early in the platform design ensures that the right raw data is collected, transformed, loaded into the desired storage layers, and available for processing and analysis as and when required.

This course is designed to provide you with the critical knowledge and skills that Data Engineers and Data Warehousing specialists need to create and manage ETL, ELT, and data pipeline processes. Upon completing this course, you will have a solid understanding of Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) processes; practice extracting data, transforming data, and loading transformed data into a staging area; create an ETL data pipeline using Bash shell scripting; build a batch ETL workflow using Apache Airflow; and build a streaming data pipeline using Apache Kafka.

You will gain hands-on experience with practice labs throughout the course and work on a real-world-inspired project, building data pipelines with several technologies that you can add to your portfolio to demonstrate your ability to perform as a Data Engineer. This course assumes prior experience working with datasets, SQL, relational databases, and Bash shell scripts.
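To make the extract, transform, and load steps above concrete, here is a minimal sketch of that cycle in Python. The sample data, function names, and staging table are hypothetical illustrations, not the course's lab materials; an in-memory SQLite database stands in for the staging area.

```python
import csv
import io
import sqlite3

# Hypothetical raw source data standing in for an extracted file.
RAW = "id,name,amount\n1,alice,10.5\n2,bob,3.25\n"

def extract(text):
    """Extract: parse raw CSV text into rows of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalize names and cast types."""
    return [(int(r["id"]), r["name"].title(), float(r["amount"])) for r in rows]

def load(rows, conn):
    """Load: write transformed rows into a staging table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging (id INTEGER, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO staging VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT name, amount FROM staging ORDER BY id").fetchall())
# [('Alice', 10.5), ('Bob', 3.25)]
```

In a production pipeline each step would typically be a separate script or task so that an orchestrator such as Apache Airflow can schedule and retry them independently.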

What You'll Learn

  • Describe and differentiate between Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) processes
  • Define data pipeline components, processes, tools and technologies
  • Create ETL processes using Bash shell scripts
  • Develop batch data pipelines using Apache Airflow
  • Create streaming data pipelines using Apache Kafka
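The streaming pattern in the last bullet can be sketched without a running Kafka broker. In this toy stand-in, a `queue.Queue` plays the role of a topic and a `None` sentinel marks end-of-stream; the producer/consumer names mirror Kafka's client model only conceptually, and none of this uses the actual Kafka libraries.

```python
import queue
import threading

topic = queue.Queue()  # broker-free stand-in for a Kafka topic (illustration only)

def producer(events):
    # Produce each event to the "topic"; None signals end-of-stream.
    for e in events:
        topic.put(e)
    topic.put(None)

def consumer(out):
    # Consume until the sentinel, applying a per-message transform.
    while True:
        msg = topic.get()
        if msg is None:
            break
        out.append(msg.upper())

results = []
worker = threading.Thread(target=consumer, args=(results,))
worker.start()
producer(["click", "view", "purchase"])
worker.join()
print(results)  # ['CLICK', 'VIEW', 'PURCHASE']
```

A real Kafka pipeline replaces the queue with a partitioned, replicated topic on a broker, which is what lets producers and consumers scale and fail independently.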

Prerequisites

  • Computer and IT literacy.

Instructors

Rav Ahuja

Global Program Director

Yan Luo

Ph.D., Data Scientist and Developer

Jeff Grossman

Data Science and Engineering SME

Topics

Extract Transform Load (ETL)
SQL (Programming Language)
Apache Airflow
Business Intelligence
Scripting
Data Warehousing
Staging Area
Shell Script
Python (Programming Language)
Bash (Scripting Language)
Apache Kafka
Relational Databases

Course Info

Platform: edX
Level: Beginner
Pacing: Unknown
Certificate: Available
Price: Free to Audit

Skills

Extract, Transform, Load (ETL)
SQL (Programming Language)
Apache Airflow
Business Intelligence
Scripting
Data Warehousing
Staging Area
Shell Script
Python (Programming Language)
Bash (Scripting Language)
