Austin Summit Preview: Architecture and Best Practices to Deploy Hadoop and Spark Clusters with Sahara
Hadoop is the de facto standard for big data platforms, built with performance and data consistency in mind. But while the Hadoop ecosystem is used today by major enterprises all over the world, that doesn't mean it's easy to deploy. Hadoop has traditionally been treated as a bare-metal solution, and building large clusters demands considerable patience and expertise from DevOps teams.
Deploying Hadoop to an OpenStack private cloud instead of bare metal raises questions and concerns, including:
- Can Hadoop keep its performance on VMs?
- How reliable is virtualized storage for HDFS data?
- Will the cloud reduce deployment complexity, or will it bring its own caveats?
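The session will dig into these questions. For a taste of what Sahara automates, here is a minimal sketch of launching a Hadoop cluster with python-saharaclient; the endpoint, credentials, and all template, image, and network IDs below are placeholders, not values from the session:

```python
# Minimal sketch: provisioning a Hadoop cluster through Sahara.
# All names, IDs, and versions are illustrative placeholders.
from keystoneauth1.identity import v3
from keystoneauth1 import session
from saharaclient import client as sahara_client

# Authenticate against Keystone (placeholder credentials).
auth = v3.Password(
    auth_url="http://controller:5000/v3",
    username="demo",
    password="secret",
    project_name="demo",
    user_domain_id="default",
    project_domain_id="default",
)
sess = session.Session(auth=auth)

sahara = sahara_client.Client("1.1", session=sess)

# Launch a cluster from a pre-built cluster template; Sahara
# provisions the VMs, installs Hadoop, and wires up HDFS.
cluster = sahara.clusters.create(
    name="demo-hadoop",
    plugin_name="vanilla",                      # plain Apache Hadoop plugin
    hadoop_version="2.7.1",
    cluster_template_id="CLUSTER_TEMPLATE_ID",  # placeholder UUID
    default_image_id="SAHARA_IMAGE_ID",         # placeholder UUID
    net_id="PRIVATE_NET_ID",                    # placeholder UUID
)
print("Cluster status:", cluster.status)
```

A single API call replaces the manual work of booting instances, installing Hadoop packages, and distributing configuration, which is precisely the deployment complexity the questions above are about.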
Add this session to your summit calendar: Architecture and Best Practices to Deploy Hadoop and Spark Clusters with Sahara
If you haven't registered for the OpenStack Summit yet, register here.