Employment Type :
Full-time
Experience :
4-7 Years
Overview :
We are seeking a Data Engineer with strong DevOps and analytical skills to design, build, and maintain scalable data pipelines for financial data. This role supports the Engineering, Operations, and Analytics teams by delivering high-quality, reliable datasets and automated workflows.
Key Responsibilities :
- Build and maintain ETL/ELT pipelines for financial datasets.
- Develop data pipelines and optimized structures for analytics and reporting.
- Implement CI/CD, infrastructure-as-code, and pipeline monitoring practices (DevOps/DataOps).
- Ensure data quality, lineage, and governance across platforms.
- Collaborate with Engineering, Operations, and Analytics teams to support reporting and insights.
- Optimize pipeline performance and reliability, and manage cloud resource usage to control costs.
Required Qualifications :
- Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 4–7 years of hands-on experience in data engineering.
- Strong SQL and Python; experience with Spark or equivalent.
- Experience with cloud data platforms (Spanner, BigQuery, Redshift, or similar).
- Proficiency with DevOps tools (Git, CI/CD pipelines, Terraform/IaC).
- Experience with orchestration tools (Airflow, Pentaho).
- Understanding of financial datasets and domain concepts.
- Knowledge of data quality/observability tools.