Senior Data Engineer (EU-Based)
We are looking for a Senior Data Engineer based in the EU to design, build, and scale data pipelines and lakehouse architectures across AWS and Databricks. You will work on enterprise-grade data foundations, integrate diverse data sources (SQL Server, Dynamics 365, Azure Storage, APIs), and ensure robust, secure, and compliant data operations.
This role requires strong engineering fundamentals, excellent knowledge of AWS data services, and practical experience with Databricks and Spark.
Key Responsibilities
• Design and implement scalable data ingestion pipelines using Spark, Databricks, and
AWS-native services (S3, Glue, Lambda, Step Functions).
• Develop bronze/silver/gold lakehouse architectures and reusable
ingestion/transformation frameworks.
• Integrate data from SQL Server, Dynamics 365, Azure Blob/ADLS, REST APIs, and
heterogeneous operational sources.
• Own end-to-end ETL/ELT workflows including orchestration, versioning, and
monitoring.
• Implement data quality checks, schema evolution handling, and data reconciliation.
• Work alongside architects to define secure, high-performance data patterns aligned
with enterprise security and compliance requirements.
• Collaborate with DevOps on CI/CD for Databricks, Terraform IaC, and automated
environment provisioning.
• Optimize Spark workloads for cost and performance, including cluster tuning and
storage layout optimization (Z-ORDER, OPTIMIZE, AQE).
Required Skills & Experience
• Strong hands-on experience with AWS data ecosystem: S3, Glue, Lambda, Step
Functions, IAM, KMS.
• Deep knowledge of Databricks (Spark SQL, PySpark, Delta Lake, cluster/runtime
configuration).
• Proven experience ingesting data from diverse sources such as SQL Server, Dynamics
365, Azure Blob/ADLS, and REST APIs.
• Strong SQL skills (analytical and transactional) and query optimization abilities.
• Understanding of data architecture: medallion/lakehouse, warehouse concepts,
schema modeling.
• Solid grounding in security: IAM, RBAC, encryption at rest/in transit, VPC basics.
• Experience with data migration patterns, reconciliation strategies, and handling
large-scale datasets.
• Familiarity with Terraform, Databricks provider, and CI/CD is a plus (not mandatory).
• Excellent communication and stakeholder collaboration skills.
Preferred Qualifications
• Hands-on experience with Amazon Web Services (AWS), ideally 3+ years.
• Practical experience with Databricks, ideally 4+ years.
What we offer
• Competitive salary based on skills and experience.
• 20 days of paid vacation and 3 sick leave days per year.
• Corporate accountant support.