We are looking for candidates who meet the following requirements:
- Experience with Databricks (PySpark), Apache Airflow, Snowflake (SQL), and Jenkins
- Experience with object-oriented or functional scripting languages
- Building and optimising ETL/ELT data pipelines
- Working with AWS cloud services in production
- Working with a Data Lake in AWS S3
And, by the way:
- Knowledge of Scrum, Kanban, or other agile frameworks
Want to apply?
Upload your CV here* (max. 4MB)
Upload your photo or video here (max. 4MB)