Principal Data Engineer (US, Security & IT)

Jobgether

About Jobgether

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Principal Data Engineer in the United States.

Job Summary

This role offers the opportunity to lead the development and evolution of a high-performance, large-scale data ecosystem in a fast-growing fintech environment. You will take ownership of data infrastructure, pipelines, and analytics platforms, ensuring reliable, scalable, and efficient processing of high volumes of data.

Key Responsibilities

  • Own the end-to-end design, development, and optimization of data warehousing infrastructure, including its KPIs and SLAs.
  • Architect, build, and maintain large-scale, high-performance data pipelines to support analytics, product insights, and operational workflows.
  • Lead ETL/ELT development using Redshift, dbt, AWS DMS, and other modern data tools.
  • Collaborate with cross-functional teams (engineering, analytics, product, finance, risk) to gather requirements and deliver robust, high-quality datasets.
  • Evaluate, integrate, and guide the adoption of new technologies to evolve the data ecosystem.
  • Optimize warehouse performance through query tuning, data modeling improvements, and cost management.
  • Mentor and provide technical guidance to other engineers, fostering best practices in data engineering and architecture.

Requirements

  • 10+ years of experience in data engineering, building and scaling production-grade systems.
  • Expert-level knowledge of Amazon Redshift or similar platforms, including performance tuning, table design, and workload management.
  • Hands-on experience with ETL/ELT frameworks such as dbt, AWS DMS, or equivalent tools.
  • Strong SQL skills and proficiency in at least one programming language (Python, Scala, or Java).
  • Experience building and maintaining AWS-based data platforms, including S3, Lambda, Glue, and EMR.
  • Proven experience designing scalable, fault-tolerant pipelines that process 100 GB–1 TB of data per day.
  • Deep understanding of data modeling, distributed systems, and data warehouse/lake design patterns.
  • Ability to work in a fast-paced, collaborative environment with strong communication and documentation skills.

Preferred Qualifications

  • Experience in high-growth, data-intensive fintech environments.
  • Familiarity with lakehouse architectures (Snowflake, Databricks, Iceberg, Delta Lake).
  • Exposure to streaming technologies (Kafka, Kinesis, Flink).
  • Experience with machine learning pipelines.
  • Experience with infrastructure as code (Terraform, CloudFormation).

Benefits

  • Competitive on-target earnings (OTE) of $170,000–$250,000 USD, including base and variable components.
  • Unlimited PTO.

To apply for this job, please visit jobs.lever.co.