Data Engineer (Python + AWS)
Retail Analytics Firm
Remote (UK) · Full-time · Mid-level
£60k – £80k / year
About the role
Design and maintain data pipelines that process 50 million retail transactions per day. You'll work with Kafka, Spark, S3, Redshift, and dbt to build reliable analytical infrastructure.
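For illustration only, here is a minimal pandas sketch of the kind of batch aggregation step such a pipeline might run before loading results into Redshift. The column names (`store_id`, `amount`, `ts`) and the daily-revenue logic are hypothetical examples, not taken from this posting.

```python
# Hypothetical sketch: aggregate daily revenue per store from retail
# transaction records. Column names and logic are illustrative only.
import pandas as pd


def daily_revenue(transactions: pd.DataFrame) -> pd.DataFrame:
    """Sum transaction amounts per store per calendar day."""
    with_day = transactions.assign(day=transactions["ts"].dt.date)
    return (
        with_day.groupby(["store_id", "day"], as_index=False)["amount"]
        .sum()
        .rename(columns={"amount": "revenue"})
    )


# Small in-memory demo; a real job would read from S3 or Kafka instead.
demo = pd.DataFrame(
    {
        "store_id": ["A", "A", "B"],
        "amount": [10.0, 5.0, 7.5],
        "ts": pd.to_datetime(
            ["2026-04-15 09:00", "2026-04-15 18:30", "2026-04-15 12:00"]
        ),
    }
)
print(daily_revenue(demo))
```

At the scale described (50M transactions/day), the same groupby-sum shape would typically be expressed in PySpark rather than pandas, but the logic is identical.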
Requirements
- 3+ years data engineering
- Python (PySpark or pandas)
- AWS (S3, Glue, Redshift, Lambda)
- SQL and data modelling
- Airflow or similar orchestration
Nice to have
- dbt
- Kafka / Kinesis
- Terraform
- Great Expectations for data quality
Posted April 15, 2026