Data Engineer

Porto

Technologies

AWS
PySpark
AWS S3
Apache Airflow
Snowflake
Jenkins

Job description

Requirements

We are looking for talent that meets the following requirements:
- Experience with Databricks (PySpark), Apache Airflow, Snowflake (SQL) and Jenkins
- Object-oriented/functional scripting languages
- Building and optimising ETL/ELT data pipelines
- Working with AWS cloud services in production
- Working with a Data Lake in AWS S3


And, by the way:
- Knowledge of Scrum, Kanban or other agile frameworks

Want to apply?

Position
Name*
Email*
Phone number*
LinkedIn
Upload your CV here* (max. 4MB)
Upload your photo or video here (max. 4MB)
Submit