This Apache Spark - Application Performance Tuning training course delivers the key concepts and expertise that developers need to improve the performance of their Apache Spark applications. You will learn how to identify common sources of poor performance in Spark applications, techniques for avoiding or solving them, and best practices for Spark application monitoring.
By attending the Apache Spark - Application Performance Tuning workshop, delegates will learn to:
- Understand Apache Spark's architecture, job execution, and how techniques such as lazy execution and pipelining can improve runtime performance
- Evaluate the performance characteristics of core data structures such as RDDs and DataFrames
- Select the file formats that will provide the best performance for your application
- Identify and resolve performance problems caused by data skew
- Use partitioning, bucketing, and join optimizations to improve SparkSQL performance
- Understand the performance overhead of Python-based RDDs, DataFrames, and user-defined functions
- Take advantage of caching for better application performance
- Understand how the Catalyst and Tungsten optimizers work
- Understand how Workload XM can help troubleshoot and proactively monitor Spark application performance
- Understand new features in Spark, specifically how the Adaptive Query Execution engine improves performance
Prerequisites:
- Spark examples and hands-on exercises are presented in Python, so the ability to program in this language is required.
- Basic familiarity with the Linux command line is assumed.
- Basic knowledge of SQL is helpful.
The Apache Spark - Application Performance Tuning class is ideal for:
- Software developers, engineers, and data scientists who have experience developing Spark applications and want to learn how to improve the performance of their code.