PySpark and AWS: Master Big Data with PySpark and AWS
Master Spark, PySpark, AWS, Spark applications, the Spark ecosystem, and Hadoop
Course Description
Python and Apache Spark are two of the hottest technologies in the Big Data analytics industry, and PySpark brings them together as the Python API for Apache Spark. In this course, you’ll start right from the basics and proceed to advanced levels of data analysis. From cleaning data to building features and implementing machine learning (ML) models, you’ll learn how to execute end-to-end workflows using PySpark.
Throughout the course, you’ll use PySpark to perform data analysis. You’ll explore Spark RDDs, DataFrames, and Spark SQL queries, along with the transformations and actions that can be performed on data through RDDs and DataFrames. You’ll also explore the Spark and Hadoop ecosystems and their underlying architecture, and use the Databricks environment to run your Spark scripts.
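To give a flavor of the style of code you’ll write, here is a minimal PySpark sketch of a DataFrame transformation followed by an action. The SparkSession setup, sample rows, and column names are illustrative only, not taken from the course material:

```python
from pyspark.sql import SparkSession

# Illustrative local session; in the course you will also run code on Databricks
spark = SparkSession.builder.appName("dataframe-demo").getOrCreate()

# Hypothetical sample data
df = spark.createDataFrame(
    [("alice", 34), ("bob", 45), ("carol", 29)],
    ["name", "age"],
)

# Transformations (filter, select) are lazy: they only describe the computation
adults = df.filter(df.age > 30).select("name")

# Actions (show, count) trigger the actual execution
adults.show()
print(adults.count())
```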
Finally, you’ll get a taste of Spark on the AWS cloud. You’ll see how to leverage AWS storage, database, and compute services, and how Spark communicates with different AWS services to fetch the data it needs.
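As one hedged example of how Spark can talk to AWS, the snippet below reads a CSV file from S3 into a DataFrame. The bucket name and path are hypothetical, and it assumes the S3A connector and AWS credentials are configured on your cluster:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-demo").getOrCreate()

# Hypothetical bucket and key; adjust to your own S3 location
df = spark.read.csv(
    "s3a://my-example-bucket/data/events.csv",
    header=True,
    inferSchema=True,
)

df.printSchema()
df.show(5)
```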
By the end of this course, you’ll be able to understand and implement the concepts of PySpark and AWS to solve real-world problems.
The code bundles are available here: https://github.com/PacktPublishing/PySpark-and-AWS-Master-Big-Data-with-PySpark-and-AWS
Audience:
This course requires Python programming experience as a prerequisite.
Goals
What will you learn in this course?
- Learn the importance of Big Data.
- Explore the Spark and Hadoop architecture and ecosystem.
- Learn about PySpark DataFrames and DataFrame actions.
- Use PySpark DataFrame transformations.
- Apply collaborative filtering to develop a recommendation system using ALS models (see the sketch after this list).
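As a taste of the collaborative-filtering topic above, here is a minimal sketch using Spark MLlib’s ALS estimator; the ratings data and column names are made up for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.ml.recommendation import ALS

spark = SparkSession.builder.appName("als-demo").getOrCreate()

# Hypothetical (user, item, rating) triples
ratings = spark.createDataFrame(
    [(0, 10, 4.0), (0, 11, 2.0), (1, 10, 5.0), (1, 12, 3.0), (2, 11, 4.0)],
    ["userId", "itemId", "rating"],
)

als = ALS(
    userCol="userId",
    itemCol="itemId",
    ratingCol="rating",
    rank=10,
    maxIter=5,
    coldStartStrategy="drop",  # avoid NaN predictions for unseen users/items
)
model = als.fit(ratings)

# Top 3 item recommendations for every user
model.recommendForAllUsers(3).show(truncate=False)
```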
Prerequisites
What are the prerequisites for this course?
- Python programming experience.

Curriculum
Check out the detailed breakdown of what’s inside the course
Introduction
5 Lectures
- Why Big Data (03:11)
- Applications of PySpark (03:12)
- Introduction to Instructor (00:46)
- Introduction to Course (01:49)
- Projects Overview (03:25)
Introduction to Hadoop, Spark Ecosystems and Architectures
11 Lectures

Spark RDDs
36 Lectures

Spark DFs
40 Lectures

Collaborative Filtering
11 Lectures

Spark Streaming
9 Lectures

ETL Pipeline
12 Lectures

Project - Change Data Capture / Replication Ongoing
25 Lectures

Instructor Details

Packt Publishing
Founded in 2004 in Birmingham, UK, Packt's mission is to help the world put software to work in new ways, through the delivery of effective learning and information services to IT professionals.
Working towards that vision, we have published over 6,500 books and videos so far, providing IT professionals with the actionable knowledge they need to get the job done - whether that's specific learning on an emerging technology or optimizing key skills in more established tools.
As part of our mission, we have also awarded over $1,000,000 through our Open Source Project Royalty scheme, helping numerous projects become household names along the way.
Course Certificate
Use your certification to make a career change or to advance in your current career. Salaries in this field are among the highest in the world.

Feedback
Really Helpful.