Acquia empowers the world's most ambitious brands to create digital customer experiences that matter. With open source Drupal at its core, the Acquia Digital Experience Platform (DXP) enables marketers, developers, and IT operations teams at thousands of global organizations to rapidly compose and deploy digital products and services that engage customers, enhance conversions, and help businesses stand out.
We are seeking an experienced Staff Software Engineer with strong technical skills to join our team and help build and enhance the robust software that engages customers through our marketing cloud platform.
As a Staff Software Engineer, you will:
- Be a leader and mentor to the local team
- Partner with other distributed, agile team members to build beautiful and powerful data pipelines, data stores, and applications
- Build metadata-driven solutions that are reusable and highly configurable
- Automate testing and deployment in Snowflake across AWS, Azure and GCP
- Write clean, SOLID, testable code, and mentor others in doing the same
- Participate in peer code reviews
- Design modules using industry-proven best practices and present them to the team
- Take complete ownership of the modules
- Work on our fully cloud-based infrastructure, developing far-reaching modules with scalability and availability at their core
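As a small sketch of what "clean, SOLID, testable code" can mean in this role (all class and method names below are hypothetical, invented for illustration, and not taken from any Acquia codebase), a pipeline step written against narrow interfaces can be unit-tested with in-memory fakes instead of a live Snowflake connection:

```java
import java.util.List;
import java.util.stream.Collectors;

/** Abstraction over where records come from (dependency inversion). */
interface RecordSource {
    List<String> fetch();
}

/** Abstraction over where records go. */
interface RecordSink {
    void write(List<String> records);
}

/**
 * A pipeline step that depends only on the two abstractions above,
 * so it can be exercised in tests with simple lambdas rather than
 * real cloud resources.
 */
final class DedupStep {
    private final RecordSource source;
    private final RecordSink sink;

    DedupStep(RecordSource source, RecordSink sink) {
        this.source = source;
        this.sink = sink;
    }

    /** Reads records, drops duplicates while preserving order, and writes the result. */
    void run() {
        List<String> deduped = source.fetch().stream()
                .distinct()
                .collect(Collectors.toList());
        sink.write(deduped);
    }
}
```

Because `DedupStep` sees only interfaces, a unit test can supply a lambda as the source and capture output in an in-memory list, which is the kind of testability the responsibilities above call for.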
Skills:
- 6-10 years of expert, hands-on experience with Snowflake, SQL, data pipelines, data modeling, and query optimization
- 4-5 years of development experience with Java
- Strong hands-on experience building and operating distributed systems and/or service-oriented architectures
- Experience with Agile (e.g. Scrum) and TDD
- Strong working experience with Git
- Familiarity with principles of domain-driven design, clean code best practices, SOLID principles
- Strong understanding of API design and REST fundamentals
- Knowledge of software testing best practices (unit testing, integration testing, functional testing, etc.)
- Experience with big data analytics or real-time analytics solutions is preferred
Additional Advantages:
- Experience with high-volume data pipelines (e.g., Spark, Kafka, Hive)
- Hands-on Cloud hosting experience (AWS, GCP, Azure)
- Experience with infrastructure-as-code tools such as Terraform
- Snowflake certification
- Experience with Jenkins and Jenkins Pipelines
Must be available during normal US Eastern Time business hours for a minimum of 40 hours a week. The interview process includes a programming and technical assessment.