
Data Engineer – REMOTE

Jobgether

About Jobgether

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer – REMOTE.

Job Summary

In this role, you will design and implement scalable data solutions that enable the business to make data-driven decisions. You will play a key role in developing data pipelines and working with modern technologies on cloud platforms. This position offers an opportunity to improve our client's operations by delivering data insights that enhance customer experiences.

Key Responsibilities

  • Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark.
  • Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
  • Work on cloud platforms (Azure, AWS) to build and manage data lakes and scalable architectures.
  • Utilize cloud services like Azure Data Factory and AWS Glue for data processing.
  • Use Databricks for big data processing and analytics.
  • Leverage Apache Spark for distributed computing and data transformations.
  • Create and manage SQL-based data solutions ensuring scalability and performance.
  • Develop and enforce data quality checks and validations.
  • Collaborate with cross-functional teams to deliver impactful data solutions.
  • Leverage CI/CD pipelines to streamline development and deployment of workflows.
  • Maintain clear documentation for data workflows and optimize data systems.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
  • 3–6 years of experience in Data Engineering or related roles.
  • Hands-on experience with big data processing frameworks and data lakes.
  • Proficiency in Python, SQL, and PySpark for data manipulation.
  • Experience with Databricks and Apache Spark.
  • Knowledge of cloud platforms like Azure and AWS.
  • Familiarity with ETL tools (Alteryx is a plus).
  • Strong understanding of distributed systems and big data technologies.
  • Basic understanding of DevOps principles and CI/CD pipelines.
  • Hands-on experience with Git, Jenkins, or Azure DevOps.

Benefits

  • Flexible remote working conditions.
  • Opportunities for professional growth and training.
  • Collaborative and inclusive company culture.
  • Access to modern technologies and tools.
  • Health and wellness benefits.
  • Work-life balance.
  • Participation in innovative projects.
  • Dynamic and fast-paced working environment.

To apply for this job, please visit jobs.lever.co.