Edelman, a trusted name in communications, is seeking a Data Engineer with 3-5 years of experience to join its team. This role offers the opportunity to work in an Agile environment focused on cloud infrastructure and tools such as Apache Airflow, Databricks, and Snowflake.
The position involves designing and implementing modern data pipelines, optimizing workflows, and integrating data across platforms. You'll work with tools for data ingestion, transformation, storage, and analysis, with an emphasis on data quality and reliability.
The tech stack includes ETL/ELT pipelines, distributed computing frameworks, data lakes, and data warehouses. The team is actively exploring Generative AI techniques for data enrichment and automated reporting. You'll have the chance to work on both batch processing and streaming data pipelines, with opportunities to collaborate on AI-driven solutions.
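To give a flavor of the batch pipeline work described above, here is a minimal illustrative sketch of a daily Airflow DAG with extract, transform, and load steps. The task logic, names, and data are hypothetical placeholders for illustration only, not Edelman systems or requirements.

```python
# Illustrative sketch only: a minimal daily batch ETL pipeline in Apache Airflow.
# All task logic and data below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    """Pull raw records from a source system (placeholder)."""
    return [{"id": 1, "value": 42}]


def transform(ti, **context):
    """Apply a simple cleaning/enrichment step to the extracted rows (placeholder)."""
    rows = ti.xcom_pull(task_ids="extract")
    return [{**row, "value": row["value"] * 2} for row in rows]


def load(ti, **context):
    """Write the transformed rows to a warehouse table (placeholder)."""
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="daily_batch_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the steps in sequence: extract, then transform, then load.
    extract_task >> transform_task >> load_task
```

In practice, the load step would typically hand off to a warehouse connector (for example, a Snowflake or Databricks integration) rather than printing, but the structure of a scheduled, dependency-ordered pipeline is the same.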
Edelman values diversity, equity, inclusion, and belonging, creating an environment where every team member's voice matters. The company offers a collaborative atmosphere where data engineers can thrive while building robust, scalable systems that power insightful decision-making.
The ideal candidate has strong experience with AWS, Python, SQL, and related data processing tools, and can work independently while collaborating effectively with cross-functional teams. This role is well suited to someone passionate about data engineering who wants to make a significant impact in a forward-thinking environment.