Trendyol
About Trendyol
We’re shaping the future of financial technology at Trendyol. As Trendyol’s technology teams, we’re not only building for today; we’re designing the financial experiences of tomorrow. From payment infrastructure and digital wallets to smart credit systems and personalized financial services, we create solutions that empower millions of users across our ecosystem. With Trendyol Pay, we enable fast, secure, and seamless payment journeys. Through Trendyol Finance, we develop inclusive and accessible products that simplify financial decisions. We are united by a shared purpose: to create a positive impact in our ecosystem by enabling commerce through technology.
Job Summary
As a Data Engineer, you will act as the bridge between raw data sources and actionable business intelligence. You will create the foundational architecture that enables our data-driven decisions, building robust, scalable pipelines and systems that connect, transform, and store data. You will play a key role in how we unlock value from our data by ensuring it is clean, reliable, and easily accessible for data scientists, analysts, and business teams.
Key Responsibilities
- Build and maintain scalable data pipelines for batch and real-time data processing.
- Work with cloud-native data platforms to ensure reliable, cost-effective data processing solutions.
- Build and maintain REST APIs to serve processed and aggregated data to downstream applications and teams.
- Collaborate with cross-functional teams to define data architecture and infrastructure requirements.
- Monitor and improve pipeline performance, scalability, and resilience.
Required Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related Information Technologies field.
- 2+ years of experience as a Data Engineer or in a similar data-focused role.
- Strong software engineering skills.
- Experience with the Scala or Java programming languages.
- Experience with the Go programming language is a plus.
- Solid experience with at least one big data processing framework such as Apache Spark or Apache Flink.
- Experience building reliable, cost-effective solutions on cloud-native data infrastructure (GCP preferred).
- Familiarity with real-time data processing and streaming architectures for low-latency analytics.
- Familiarity with modern data architectures including data lakes, data warehouses (BigQuery, Snowflake), and lakehouse table formats (Apache Iceberg, Delta Lake).
- Experience with workflow orchestration tools such as Apache Airflow, Prefect, or Temporal.
- Strong proficiency in SQL for data manipulation, querying, and optimization.
- Experience with RESTful APIs.
- Experience with both SQL and NoSQL databases.
- Experience with testing frameworks, including unit testing and data quality validation for data pipelines.
- Experience with containerization technologies such as Docker and Kubernetes.
- Knowledge of CI/CD pipelines for data engineering.
To apply for this job, please visit jobs.lever.co.