Rackspace Technology is seeking a highly skilled Senior Big Data Hadoop ML Engineer (GCP) for a remote position in Canada. The ideal candidate will have extensive experience with the Apache Hadoop ecosystem and with developing batch-processing systems.
Key responsibilities include:
- Developing scalable code for large-scale batch processing using Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase
- Managing and maintaining batch pipelines that support Machine Learning workloads
- Leveraging GCP for big data processing and storage solutions
- Implementing DevOps best practices for CI/CD and Infrastructure as Code
Requirements:
- Proficiency in the Hadoop ecosystem (MapReduce, Oozie, Hive, Pig, HBase, Storm)
- Strong programming skills in Java, Python, and Spark
- Experience with GCP and other cloud services
- 10+ years of experience in customer-facing software/technology or consulting
- 5+ years of experience with on-premises-to-cloud migrations
- Technical degree in Computer Science, Software Engineering, or related field
This role offers the opportunity to work on cutting-edge big data and machine learning projects while collaborating with a dynamic team in a remote setting. Join Rackspace Technology, consistently recognized as a best place to work, and be part of a mission to embrace technology, empower customers, and deliver the future.