Appier is a software-as-a-service (SaaS) company that uses artificial intelligence (AI) to power business decision-making. Founded in 2012 with a vision of democratizing AI, Appier's mission is turning AI into ROI by making software intelligent. As a Software Engineer, Data Backend, you will help build critical components of Appier's AI-powered platform.
Responsibilities include:
- Designing, developing, and maintaining data pipelines
- Building, managing, and optimizing data platforms (e.g., Spark clusters, Kafka clusters)
- Profiling and tuning performance of critical components
- Providing expert advice and solutions to enhance the performance of big data systems and applications
- Establishing and improving the foundational architecture for platforms
Minimum qualifications:
- BS/MS degree in Computer Science
- 2+ years of experience in building and operating large-scale distributed systems or applications
- Experience in developing Java/Scala projects
- Experience in building data pipelines using Apache Spark
- Experience in managing a data lake or data warehouse
- Expertise in developing data structures and algorithms on top of big data platforms
Preferred qualifications:
- Experience in developing Golang/Python projects
- Experience in profiling and optimizing JVM performance
- Experience in managing data platforms (Hadoop, Kafka, Flink, Trino/ClickHouse, etc.)
- Experience in cloud services (AWS, GCP, Azure)
- Experience in contributing to open source projects
- Experience in open table formats (Apache Iceberg, Delta Lake, Hudi)
Join Appier to be part of a dynamic team working on cutting-edge AI technologies and big data solutions. This role offers the opportunity to tackle challenging problems at scale and contribute to the development of innovative AI-powered business solutions.