Data Engineer

Global leader in business payments and cash management, moving more than $10 trillion in payments annually
Portsmouth, NH 03801, USA
Data
Mid-Level Software Engineer
Hybrid
1,000 - 5,000 Employees
6+ years of experience
Finance · Enterprise SaaS

Description For Data Engineer

Bottomline, a global powerhouse in business payments and cash management, is seeking a Data Engineer to join their dynamic team in a hybrid work environment. With over 30 years of experience and processing more than $10 trillion in annual payments, Bottomline stands at the forefront of financial technology innovation.

The ideal candidate will be instrumental in designing and maintaining the company's data infrastructure, creating efficient pipelines, and ensuring seamless data integration across the enterprise. This role combines technical expertise with business acumen, requiring someone who can not only build robust data solutions but also collaborate effectively with stakeholders across all levels of the organization.

Key technical requirements include proficiency in languages like Java, Python, and SQL, along with experience in modern data architectures including cloud services (AWS, Azure, GCP) and data warehouse tools like Snowflake. The position demands a strong background in ETL processes, database technologies, and Apache ecosystem tools such as Kafka, Airflow, and Spark.
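
As a rough, hypothetical illustration of the kind of pipeline work this stack implies, a minimal Airflow DAG in Python (one of the listed languages) might look like the sketch below; the DAG name, schedule, and toy transformation are illustrative assumptions, not Bottomline's actual code.

# Minimal, hypothetical daily ETL job expressed as an Airflow DAG.
# All names and the toy transform are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_transform_load(**context):
    # Placeholder ETL step: a real pipeline would pull from a source system
    # (database, API, Kafka topic), cleanse and aggregate the records, and
    # load them into a warehouse such as Snowflake.
    raw_rows = [{"amount": "100.50"}, {"amount": None}, {"amount": "42.00"}]
    cleaned = [float(r["amount"]) for r in raw_rows if r["amount"] is not None]
    print(f"Loaded {len(cleaned)} rows, total amount {sum(cleaned):.2f}")


with DAG(
    dag_id="payments_etl_example",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ argument; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(
        task_id="extract_transform_load",
        python_callable=extract_transform_load,
    )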

This role offers the opportunity to work with cutting-edge technologies while solving complex data challenges for a company that processes trillions in payments. The hybrid work environment provides flexibility while maintaining collaborative opportunities at the Portsmouth, NH office. Bottomline's commitment to innovation, customer delight, and professional growth makes this an ideal position for a data engineer looking to make a significant impact in the fintech space.

The company promotes an inclusive culture and welcomes talent at all career stages, demonstrating their commitment to creating an open and supportive work environment. This role represents a chance to be part of a company that's not just processing transactions, but transforming how businesses handle payments globally.

Responsibilities For Data Engineer

  • Design and develop data pipelines for data extraction, transformation, and loading
  • Collaborate with data scientists and analysts to optimize models and algorithms
  • Integrate data from different sources including databases, data warehouses, APIs, and external systems
  • Ensure data consistency and integrity during the integration process
  • Transform raw data into a usable format through cleansing, aggregation, filtering, and enrichment
  • Optimize data pipelines and processing workflows for performance
  • Monitor and tune data systems and resolve performance bottlenecks
  • Implement data quality checks and validations (a rough sketch follows this list)
  • Establish governance of data and algorithms
  • Collaborate with leaders on the data management vision
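
The transformation and data-quality responsibilities above can be made concrete with a short, self-contained Python sketch; the Payment fields, default currency, and validation rules are assumptions for illustration, not the company's actual schema or rules.

# Hypothetical example of the cleansing/validation work listed above,
# written as a small standalone function.
from dataclasses import dataclass


@dataclass
class Payment:
    payment_id: str
    amount: float
    currency: str


def validate_and_clean(records: list[dict]) -> list[Payment]:
    """Drop malformed rows and normalize the rest (a basic quality gate)."""
    cleaned: list[Payment] = []
    for rec in records:
        # Validation: required fields present and amount is positive.
        if not rec.get("payment_id") or rec.get("amount") is None:
            continue
        amount = float(rec["amount"])
        if amount <= 0:
            continue
        # Normalization/enrichment: default currency, uppercase currency codes.
        currency = (rec.get("currency") or "USD").upper()
        cleaned.append(Payment(rec["payment_id"], amount, currency))
    return cleaned


if __name__ == "__main__":
    sample = [
        {"payment_id": "p-1", "amount": "250.00", "currency": "usd"},
        {"payment_id": "", "amount": "10.00"},                    # fails validation
        {"payment_id": "p-2", "amount": -5, "currency": "EUR"},   # negative amount
    ]
    print(validate_and_clean(sample))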

Requirements For Data Engineer

Java · Python · Kafka
  • Bachelor's degree in computer science, data science, software engineering, information systems, or related field (master's preferred)
  • 6+ years of work experience in data management disciplines
  • Experience with Snowflake and big data solutions
  • Strong programming skills in Java, Python, or C/C++
  • Experience with SQL and ETL processes
  • Proficiency in cloud services (AWS, Azure, GCP)
  • Experience with database technologies (SQL, NoSQL, Oracle, Hadoop, Teradata)
  • Knowledge of Apache technologies (Kafka, Airflow, Spark)
  • Strong problem-solving and debugging skills
  • Excellent business acumen and interpersonal skills
  • Ability to translate technical concepts across different stakeholders
