Shield AI
About Shield AI
Founded in 2015, Shield AI is a venture-backed deep-tech company with the mission of protecting service members and civilians with intelligent systems. Its products include the V-BAT aircraft, Hivemind Enterprise, and Hivemind Vision product lines. Shield AI has nine offices and facilities across the U.S., Europe, the Middle East, and the Asia-Pacific, and its technology actively supports operations worldwide.
Job Summary
Shield AI is seeking a Data Engineer to join our Finance team in support of our financial enterprise reporting platform. In this role, you will help establish secure, automated, and scalable data pipelines that connect finance, compliance, and operations data to actionable insights, enabling Finance to improve forecasting, budgeting, and reporting accuracy while meeting government and cybersecurity compliance requirements.
Key Responsibilities
- Develop ETL pipelines (Python, SQL, PySpark) to integrate data from ERP, FP&A, and other enterprise systems into Microsoft Fabric (see the pipeline sketch after this list).
- Automate ingestion, transformation, and validation workflows to ensure accuracy, timeliness, and compliance of finance and accounting data.
- Build and maintain Lakehouse structures (raw, processed, curated) to support efficient data modeling and reporting.
- Curate datasets for end users and analysts to consume in their own reporting and dashboarding tools.
- Design and maintain orchestration frameworks and dataflows to support higher-frequency refresh schedules while optimizing Fabric compute unit (CU) usage.
- Support development and deployment of machine learning workflows (e.g., GL/expense classification models) as part of the financial reporting platform (see the classification sketch after this list).
- Collaborate with Finance, FP&A, and Product stakeholders to evolve data models and reporting capabilities.
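To illustrate the kind of medallion-style (raw → processed → curated) pipeline described above, here is a minimal PySpark sketch. The paths, schema names, and columns are hypothetical, and it assumes a Spark environment with Delta Lake support (such as a Fabric or Databricks notebook); it is not a description of Shield AI's actual implementation.

```python
# Minimal medallion-style ETL sketch in PySpark (illustrative only).
# Paths, table names, and columns are assumptions; the target schemas
# ("processed", "curated") are assumed to already exist in the lakehouse.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("finance_etl_sketch").getOrCreate()

# Raw layer: land ERP general-ledger extracts as-is (path is illustrative).
raw_gl = spark.read.option("header", True).csv("Files/raw/erp_gl_extract/")

# Processed layer: enforce types, drop malformed rows, add load metadata.
processed_gl = (
    raw_gl
    .withColumn("posting_date", F.to_date("posting_date", "yyyy-MM-dd"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("account_code").isNotNull() & F.col("amount").isNotNull())
    .withColumn("_loaded_at", F.current_timestamp())
)
processed_gl.write.mode("overwrite").format("delta").saveAsTable("processed.gl_transactions")

# Curated layer: summarize by cost center and month for FP&A reporting.
curated_summary = (
    spark.table("processed.gl_transactions")
    .groupBy("cost_center", F.date_trunc("month", "posting_date").alias("fiscal_month"))
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)
curated_summary.write.mode("overwrite").format("delta").saveAsTable("curated.gl_monthly_summary")
```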
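Similarly, the GL/expense classification work mentioned above might resemble the following minimal scikit-learn sketch. The column names, labels, and model choice are illustrative assumptions, not Shield AI's production workflow.

```python
# Minimal GL/expense classification sketch using scikit-learn (illustrative only).
# The labeled history, category names, and model choice are assumptions.
import pandas as pd
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical labeled history: free-text expense descriptions and their GL categories.
history = pd.DataFrame({
    "description": [
        "Dell laptop for engineering", "Hotel stay - customer site visit",
        "AWS monthly invoice", "Team lunch during sprint review",
    ],
    "gl_category": ["Equipment", "Travel", "Cloud Services", "Meals"],
})

# TF-IDF features over descriptions, then a simple linear classifier.
model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(history["description"], history["gl_category"])

# Predicted categories could be written back to the curated layer for analyst review.
print(model.predict(["Uber to airport for conference"]))
```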
Required Qualifications
- 3+ years of experience in data engineering or analytics with hands-on ETL pipeline development.
- Proficiency in SQL, Python, and PySpark, with experience in Microsoft Fabric, Databricks, or Snowflake.
- Understanding of data warehousing and Lakehouse architectures, medallion models, and orchestration workflows.
- Experience with AWS cloud services (S3, EC2, IAM, Databricks) or their Azure equivalents.
- Familiarity with compliance-driven data handling, including audit readiness and government reporting requirements.
Preferred Qualifications
- Experience in a defense or government technology environment.
- Exposure to finance systems (Costpoint, Vena, ERP/FP&A platforms) and accounting data structures.
- Knowledge of compliance frameworks and data governance/security policies (DCAA, FAR/DFARS, CUI).
- Background in automation or machine learning techniques to support data validation and compliance.
- Experience delivering scalable data solutions in Agile environments.
To apply for this job, please visit jobs.lever.co.