Data Engineer I, Global Engineering Insights & Software Tools (GEIST)

Global technology company leading in e-commerce, cloud computing, and artificial intelligence
Data
Entry-Level Software Engineer
In-Person
5,000+ Employees
1+ year of experience
Enterprise SaaS · AI

Description For Data Engineer I, Global Engineering Insights & Software Tools (GEIST)

Join Amazon's Global Engineering Insights & Software Tools (GEIST) team as a Data Engineer, where you'll play a crucial role in developing technology that redefines the customer experience within Global Engineering Services (GES). This position offers the opportunity to build integrated data tools and platforms supporting Amazon's vast network of buildings. You'll be responsible for developing comprehensive data platforms using the AWS big data stack, including Python, Redshift, QuickSight, and more.

The role involves creating and managing ETL pipelines, implementing data warehouse infrastructure, and developing real-time data solutions. You'll work in a fast-paced environment where innovation and experimentation are encouraged, collaborating closely with Engineering, Product, and Technical Program teams to support key decisions for Global Engineering Services.

The GEIST team serves as the software solution provider for all Global Engineering needs, working on a large product suite ranging from Planning & Forecasting applications to robust in-flight management systems. These solutions are used by thousands of internal and external users globally. This is an excellent opportunity to be part of a revolutionary vision from its earliest stages, making a significant impact on Amazon's engineering services infrastructure.

The ideal candidate will have experience with big data technologies, ETL tools, and a strong foundation in data engineering principles. You'll be joining a diverse and inclusive workplace that values innovation and customer-centric solutions. This role offers the unique opportunity to shape the future of Amazon's engineering services while working with cutting-edge technologies and a global team of experts.


Responsibilities For Data Engineer I, Global Engineering Insights & Software Tools (GEIST)

  • Design, implement, and support data warehouse/data lake infrastructure using the AWS big data stack
  • Develop and manage ETL pipelines to source data from various systems
  • Create a unified data model for analytics and reporting
  • Create and support real-time data pipelines
  • Research the latest big data technologies to provide new capabilities
  • Manage numerous requests concurrently and strategically
  • Partner and collaborate across teams and roles to deliver results
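As a loose illustration of the ETL responsibilities above, here is a minimal extract-transform-load sketch in Python. All names (`orders`-style records, the `daily_totals` table) are hypothetical, and an in-memory SQLite database stands in for the AWS targets (such as Redshift) the posting actually names:

```python
# Minimal ETL sketch. Table and field names are hypothetical; a real pipeline
# on this team would load into AWS services such as Redshift, not SQLite.
import sqlite3
from collections import defaultdict


def extract(rows):
    """Yield raw records from a source system (here: an in-memory list)."""
    yield from rows


def transform(records):
    """Aggregate order amounts per day -- a unified-model style rollup."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["date"]] += rec["amount"]
    return sorted(totals.items())


def load(rows, conn):
    """Write aggregated rows into a reporting table."""
    conn.execute("CREATE TABLE IF NOT EXISTS daily_totals (day TEXT, total REAL)")
    conn.executemany("INSERT INTO daily_totals VALUES (?, ?)", rows)
    conn.commit()


if __name__ == "__main__":
    source = [
        {"date": "2024-01-01", "amount": 10.0},
        {"date": "2024-01-01", "amount": 5.5},
        {"date": "2024-01-02", "amount": 7.0},
    ]
    conn = sqlite3.connect(":memory:")
    load(transform(extract(source)), conn)
    print(conn.execute("SELECT * FROM daily_totals ORDER BY day").fetchall())
    # → [('2024-01-01', 15.5), ('2024-01-02', 7.0)]
```

The three-stage split keeps each step independently testable, which is the usual shape of the pipelines this role would build and maintain.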

Requirements For Data Engineer I, Global Engineering Insights & Software Tools (GEIST)

  • 1+ years of data engineering experience
  • Experience with data modeling, warehousing and building ETL pipelines
  • Experience with one or more query languages (SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
  • Experience with one or more scripting languages (Python, KornShell)

