Senior Data Engineer | GCP


Job description

As a part of your job, you will:

  • Design and implement data pipelines, applying best practices for cloud deployment with attention to performance and cost optimization;
  • Process large volumes of data, in real time or in batch, applying common data processing operations such as filtering, aggregation, data standardization, and data schema harmonization;
  • Be a cloud enthusiast, seeking complementary knowledge and positively influencing others with your motivation;
  • Take responsibility for the deliverables, working with different stakeholders including Business Analysts, Product Owners, Scrum Masters, Data Scientists, and other teams that may assist in the implementation of specific tasks (e.g., data infrastructure teams);
  • Perform data modeling on relational and NoSQL databases.


What are we looking for?

  • Minimum of 4 years of experience as a Data Engineer;
  • Experience as a Tech Lead;
  • Strong expertise in SQL and experience working with large-scale databases and data warehousing solutions;
  • In-depth knowledge of GCP services, including BigQuery, Cloud Storage, Composer, Dataflow, Pub/Sub, and Data Fusion;
  • Familiarity with batch processing and stream processing frameworks and tools (e.g., Apache Beam, Apache Kafka, Apache Spark Streaming);
  • Solid understanding of BI concepts, data modeling, star schemas, and ETL processes;
  • Proficiency in Python programming for data manipulation and automation;
  • Experience with Git or other version control systems;
  • Strong problem-solving and troubleshooting skills in a complex data environment;
  • Fluent in English.

Nice to have:

  • Experience in Terraform.

Want to apply?
Phone number*
Upload your CV* (max. 4MB)
Upload your photo or video (max. 4MB)