Data Engineer

Job description

  • International project
  • English: mandatory
  • German: nice to have
  • On-site presence at the Lisbon office will be required on some days


Requirements

This role, part of the RISE project, focuses on building efficient and durable data pipelines, with a strong emphasis on the Spark framework.
  • Data Pipeline Development: Design and build scalable ETL/ELT pipelines on GCP using Dataflow, Dataproc (Spark), and BigQuery.
  • Data Engineering & Processing: Develop distributed data transformations for batch and streaming workloads.
  • Infrastructure as Code: Automate GCP infrastructure provisioning using Terraform and ensure environment consistency.
  • Data Integration & Storage: Implement data ingestion from diverse sources into BigQuery and Cloud Storage.
  • CI/CD & Optimization: Integrate pipelines into CI/CD workflows, monitor performance, and optimize cost and throughput.
  • Collaboration & Documentation: Work with cross-functional teams, document solutions, and follow best practices.
  • Start Date: Mid November 2025

