We are seeking a highly skilled and experienced Senior Big Data Engineer to join our dynamic team. The ideal candidate will have a strong background in Java, experience developing and scaling both stream and batch processing systems, and a solid understanding of public cloud technologies, especially GCP. This is a remote role that requires excellent communication skills and the ability to solve complex problems independently and creatively.
Key Responsibilities:
- Design and develop scalable data processing solutions, leveraging advanced Java techniques to ensure performance, reliability, and maintainability across streaming and batch workflows.
- Build reusable and reliable code for stream and batch processing systems at scale.
- Work with technologies such as Pub/Sub, Kafka, Dataflow, Flink, Hadoop, Pig, Oozie, Hive, and Spark.
- Implement automation and DevOps best practices, including CI/CD, infrastructure as code (IaC), and containerization.
Requirements:
- Strong experience with Java 11-17
- Proficiency in asynchronous programming paradigms and reactive programming techniques
- Expertise in public cloud services, particularly in GCP
- Experience with GCP managed services (e.g., Dataproc, Cloud Composer, GCS, Dataflow)
- Knowledge of containerization technologies such as Docker and Kubernetes
- Strong programming abilities in Java and Python
- Google Cloud Associate Cloud Engineer certification or another Google Cloud Professional-level certification
- 4+ years of experience in customer-facing software/technology or consulting
- 4+ years of experience with on-premises-to-cloud migrations or IT transformations
- 4+ years of experience building and operating solutions on GCP (ideally) or AWS/Azure
- Technical degree in Computer Science, Software Engineering, or a related field
Rackspace Technology offers a competitive salary and benefits package. Join us on our mission to embrace technology, empower customers, and deliver the future.