Apache Spark 2 for Beginners
by Rajanarayanan Thottuvaikkatumana
- Publisher: Packt Publishing
- Year: 2016
- Language: English
- Pages: 322
- Category: Library
Synopsis
Key Features
- This book offers an easy introduction to the Spark framework, based on the latest version, Apache Spark 2
- Perform efficient data processing, machine learning and graph processing using various Spark components
- A practical guide aimed at beginners to get them up and running with Spark
Book Description
Spark is one of the most widely used large-scale data processing engines and runs extremely fast. It is a framework whose tools are equally useful to application developers and data scientists.
This book starts with the fundamentals of Spark 2 and covers the core data processing framework and API, installation, and application development setup. Then the Spark programming model is introduced through real-world examples followed by Spark SQL programming with DataFrames. An introduction to SparkR is covered next. Later, we cover the charting and plotting features of Python in conjunction with Spark data processing. After that, we take a look at Spark's stream processing, machine learning, and graph processing libraries. The last chapter combines all the skills you learned from the preceding chapters to develop a real-world Spark application.
By the end of this book, you will have all the knowledge you need to develop efficient large-scale applications using Apache Spark.
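The Spark programming model the description refers to is a functional map/reduce flow over distributed collections. As an illustration only, here is a plain-Python sketch of Spark's classic word-count example; no Spark installation is assumed, and the sample data and helper names are invented for this sketch, not taken from the book.

```python
from collections import Counter

# Plain-Python analogue of Spark's word-count pipeline:
# flatMap (split lines into words) -> map (word -> (word, 1)) -> reduceByKey (sum per word).
lines = ["spark makes big data simple", "big data needs spark"]

words = [w for line in lines for w in line.split()]   # flatMap
pairs = [(w, 1) for w in words]                       # map
counts = Counter()                                    # reduceByKey
for word, n in pairs:
    counts[word] += n

print(counts["spark"], counts["big data".split()[0]])
```

In real Spark code the same steps run in parallel across a cluster, with `flatMap`, `map`, and `reduceByKey` called on an RDD instead of on Python lists.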
What you will learn
- Get to know the fundamentals of Spark 2 and the Spark programming model using Scala and Python
- Know how to use Spark SQL and DataFrames using Scala and Python
- Get an introduction to Spark programming using R
- Perform Spark data processing, charting, and plotting using Python
- Get acquainted with Spark stream processing using Scala and Python
- Be introduced to machine learning using Spark MLlib
- Get started with graph processing using Spark GraphX
- Bring together all that you've learned and develop a complete Spark application
About the Author
Rajanarayanan Thottuvaikkatumana, Raj, is a seasoned technologist with more than 23 years of software development experience at various multinational companies. He has lived and worked in India, Singapore, and the USA, and is presently based in the UK. His experience includes architecting, designing, and developing software applications. He has worked on various technologies, including major databases, application development platforms, web technologies, and big data technologies. Since 2000, he has been working mainly in Java-related technologies and does heavy-duty server-side programming in Java and Scala. He has worked on highly concurrent, highly distributed, high-transaction-volume systems. Currently, he is building a next-generation Hadoop YARN-based data processing platform and an application suite built with Spark using Scala.
Raj holds a master's degree in Mathematics and a master's degree in Computer Information Systems, and has many certifications in ITIL and cloud computing to his credit. Raj is the author of Cassandra Design Patterns - Second Edition, published by Packt.
When not working on the assignments his day job demands, Raj is an avid listener of classical music and watches a lot of tennis.
Table of Contents
- Spark Fundamentals
- Spark Programming Model
- Spark SQL
- Spark Programming with R
- Spark Data Analysis with Python
- Spark Stream Processing
- Spark Machine Learning
- Spark Graph Processing
- Designing Spark Applications
Detailed Table of Contents
Cover
Copyright
Credits
About the Author
About the Reviewer
www.PacktPub.com
Table of Contents
Preface
Chapter 1: Spark Fundamentals
An overview of Apache Hadoop
Understanding Apache Spark
Installing Spark on your machines
Python installation
R installation
Spark installation
Development tool installation
Optional software installation
IPython
RStudio
Apache Zeppelin
References
Summary
Chapter 2: Spark Programming Model
Functional programming with Spark
Understanding Spark RDD
Spark RDD is immutable
Spark RDD is distributable
Spark RDD lives in memory
Spark RDD is strongly typed
Data transformations and actions with RDDs
Monitoring with Spark
The basics of programming with Spark
MapReduce
Joins
More actions
Creating RDDs from files
Understanding the Spark library stack
Reference
Summary
Chapter 3: Spark SQL
Understanding the structure of data
Why Spark SQL?
Anatomy of Spark SQL
DataFrame programming
Programming with SQL
Programming with DataFrame API
Understanding Aggregations in Spark SQL
Understanding multi-datasource joining with SparkSQL
Introducing datasets
Understanding Data Catalogs
References
Summary
Chapter 4: Spark Programming with R
The need for SparkR
Basics of the R language
DataFrames in R and Spark
Spark DataFrame programming with R
Programming with SQL
Programming with R DataFrame API
Understanding aggregations in Spark R
Understanding multi-datasource joins with SparkR
References
Summary
Chapter 5: Spark Data Analysis with Python
Charting and plotting libraries
Setting up a dataset
Data analysis use cases
Charts and plots
Histogram
Density plot
Bar chart
Stacked bar chart
Pie chart
Donut chart
Box plot
Vertical bar chart
Scatter plot
Enhanced scatter plot
Line graph
References
Summary
Chapter 6: Spark Stream Processing
Data stream processing
Micro batch data processing
Programming with DStreams
A log event processor
Getting ready with the Netcat server
Organizing files
Submitting the jobs to the Spark cluster
Monitoring running applications
Implementing the application in Scala
Compiling and running the application
Handling the output
Implementing the application in Python
Windowed data processing
Counting the number of log event messages processed in Scala
Counting the number of log event messages processed in Python
More processing options
Kafka stream processing
Starting Zookeeper and Kafka
Implementing the application in Scala
Implementing the application in Python
Spark Streaming jobs in production
Implementing fault-tolerance in Spark Streaming data processing applications
Structured streaming
References
Summary
Chapter 7: Spark Machine Learning
Understanding machine learning
Why Spark for machine learning?
Wine quality prediction
Model persistence
Wine classification
Spam filtering
Feature algorithms
Finding synonyms
References
Summary
Chapter 8: Spark Graph Processing
Understanding graphs and their usage
The Spark GraphX library
GraphX overview
Graph partitioning
Graph processing
Graph structure processing
Tennis tournament analysis
Applying the PageRank algorithm
Connected component algorithm
Understanding GraphFrames
Understanding GraphFrames queries
References
Summary
Chapter 9: Designing Spark Applications
Lambda Architecture
Microblogging with Lambda Architecture
An overview of SfbMicroBlog
Getting familiar with data
Setting the data dictionary
Implementing Lambda Architecture
Batch layer
Serving layer
Speed layer
Queries
Working with Spark applications
Coding style
Setting up the source code
Understanding data ingestion
Generating purposed views and queries
Understanding custom data processes
References
Summary
Index
Subjects
Computers & Technology, Databases & Big Data, Data Processing, Programming, Algorithms, Programming Languages, Software, Databases, Enterprise Applications, Business
SIMILAR VOLUMES
Beginning Apache Spark 2
Develop applications for the big data landscape with Spark and Hadoop. This book also explains the role of Spark in developing scalable machine learning and analytics applications with Cloud technologies. Beginning Apache Spark 2 gives you an introduction to Apache Spark and shows you how to work wi