This Hadoop Operations training course will cover cluster planning, installation, administration, resource management, and monitoring.
Apache Hadoop is an open-source software project that enables distributed processing of large data sets across clusters of commodity servers. It is designed to scale up from a single server to thousands of machines, with a very high degree of fault tolerance.
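Much of that fault tolerance comes from HDFS replicating each file's blocks across several DataNodes. As a rough illustration only, the minimal Java sketch below uses the standard Hadoop FileSystem API to write a file and read back its replication factor; the `hdfs://namenode:8020` address and 3-way replication are placeholder assumptions, not settings from this course.

```java
// Minimal sketch -- assumes a reachable NameNode at hdfs://namenode:8020
// and a replication factor of 3; adjust both for a real cluster.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020"); // placeholder NameNode address
        conf.setInt("dfs.replication", 3);                // each block stored on 3 DataNodes

        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("/tmp/replication-demo.txt");

        // Write a small file; HDFS replicates its blocks across DataNodes,
        // so the data survives the loss of individual commodity servers.
        try (FSDataOutputStream out = fs.create(path, true)) {
            out.writeUTF("hello from a replicated file");
        }

        short replication = fs.getFileStatus(path).getReplication();
        System.out.println("Replication factor for " + path + ": " + replication);

        fs.close();
    }
}
```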
By attending the Hadoop Operations workshop, delegates will learn about:
- Designing Hadoop Clusters
- Hadoop in the Cloud
- Deploying Hadoop Clusters
- Hadoop Cluster Availability
- Securing Hadoop Clusters
- Operating Hadoop Clusters
- Stabilizing Hadoop Clusters
- Capacity Management for Hadoop Clusters
- Performance Tuning of Hadoop Clusters
- Cloudera Manager and Hadoop Clusters
The Hadoop Operations class is ideal for:
- Developers interested in expanding their knowledge of Hadoop from an operations perspective.