Data Developer

Trendyol

About Trendyol

We’re shaping the future of financial technology at Trendyol. As Trendyol’s technology teams, we’re not only building for today; we’re designing the financial experiences of tomorrow. From payment infrastructure and digital wallets to smart credit systems and personalized financial services, we create solutions that empower millions of users across our ecosystem. With Trendyol Pay, we enable fast, secure, and seamless payment journeys. Through Trendyol Finance, we develop inclusive and accessible products that simplify financial decisions. We are united by a shared purpose: to create a positive impact in our ecosystem by enabling commerce through technology.

About The Role

As a Data Developer, you will act as the bridge between raw data and meaningful business insights. You will create the infrastructure that powers our data-driven initiatives, building pipelines and systems that connect disparate data sources and ensure the seamless flow of information. You will play a key role in how we leverage data to make informed decisions by transforming, loading, and managing data so that it is reliable, accessible, and ready for analysis.

Key Responsibilities

  • Architect and maintain efficient, reusable, and reliable data systems.
  • Design data models and data warehousing solutions that support analytics and reporting across the business.
  • Collaborate with data scientists and analysts to understand data needs and deliver high-quality, trusted data products.
  • Optimize the performance of data processes, including monitoring and troubleshooting jobs and managing resource consumption.
  • Enforce governance, security, and compliance standards, including data quality, lineage, and reliability.
  • Stay current with data technologies and recommend improvements to existing infrastructure and architecture.

Expected Qualifications

  • Bachelor’s degree in Computer Science, Engineering, Management Information Systems, Mathematics, or a related field.
  • Proficiency in developing data pipelines and ETL/ELT workflows using Python, Spark, or SQL.
  • Advanced SQL skills and familiarity with structured and unstructured data formats (Parquet, Avro, JSON).
  • Deep understanding of data processing and optimization in cloud environments (GCP preferred).
  • Experience with workflow management platforms (Airflow preferred).
  • Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes is preferred.
  • Experience with real-time data processing and streaming technologies.
  • Proficiency in at least one programming language (Python preferred).
  • Familiarity with modern data architectures, including data lakes, data lakehouses, and table formats (Apache Iceberg, Hudi, etc.).
  • Experience with CI/CD pipelines, Git, and Infrastructure as Code (e.g., Terraform) preferred.
  • Experience working with AI technologies, including building and experimenting with AI agents, LLMs, and automation.
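To give candidates a concrete sense of the ETL/ELT work the qualifications above describe, here is a minimal, self-contained sketch of an extract-transform-load step in Python using only the standard library. The field names, amounts, and SQLite target are purely illustrative assumptions, not part of Trendyol's actual stack; in practice a pipeline like this would run under a scheduler such as Airflow and load into a cloud warehouse.

```python
import json
import sqlite3
from decimal import Decimal

# Extract: parse raw JSON event records (all field names and values are illustrative).
raw = '''[
  {"order_id": 1, "amount": "19.90", "currency": "TRY"},
  {"order_id": 2, "amount": "5.00",  "currency": "TRY"}
]'''
records = json.loads(raw)

# Transform: convert amounts to integer cents via Decimal, so downstream
# aggregation avoids binary floating-point rounding error.
rows = [
    (r["order_id"], int(Decimal(r["amount"]) * 100), r["currency"])
    for r in records
]

# Load: insert into a local SQLite table standing in for a warehouse target.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER, amount_cents INTEGER, currency TEXT)"
)
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

total = conn.execute("SELECT SUM(amount_cents) FROM orders").fetchone()[0]
print(total)  # 2490
```

The same extract/transform/load shape scales up directly: swap the JSON literal for files in a data lake, the list comprehension for a Spark job, and the SQLite connection for a warehouse client.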

To apply for this job please visit jobs.lever.co.