Data Engineer – SQL & Data Modeling

Lisboa

Job description

Our client is a data-driven organization building scalable, reliable, and high-performance data platforms to power analytics, reporting, and machine learning initiatives. They are looking for a skilled Data Engineer with strong expertise in SQL and data modeling to join their growing data team.

As a Data Engineer, you will design, develop, and maintain robust data pipelines and data models to ensure high-quality, accessible, and performant data for business intelligence and analytics. You will work closely with data analysts, scientists, and software engineers to transform raw data into structured, optimized data assets. Expertise in SQL and data modeling is essential; experience with Snowflake is a significant advantage.

Key Responsibilities:

  • Design and implement dimensional and relational data models (e.g., star schema, snowflake schema) to support analytics and reporting.
  • Write complex, optimized SQL queries for data extraction, transformation, and loading (ETL/ELT).
  • Build and maintain scalable data pipelines using modern data orchestration tools.
  • Ensure data quality, consistency, and integrity through validation, testing, and monitoring.
  • Collaborate with stakeholders to translate business requirements into technical data solutions.
  • Optimize data storage, query performance, and cost efficiency in cloud data warehouses.
  • Document data models, pipelines, and transformation logic for team knowledge sharing.
  • Support the migration and modernization of legacy data systems to cloud-native platforms.
  • Participate in code reviews, data architecture discussions, and continuous improvement initiatives.

Requirements

Education: Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).

Experience:

  • 3–5 years of professional experience in data engineering or a related role.
  • Proven hands-on experience with SQL for complex querying, performance tuning, and ETL development.
  • Strong expertise in data modeling (conceptual, logical, and physical models), including normalization, denormalization, and dimensional modeling.
  • Experience with cloud data platforms; Snowflake experience is a strong plus.
  • Familiarity with ETL/ELT frameworks and data integration patterns.


Technical Skills:

  • Advanced proficiency in SQL (window functions, CTEs, query optimization, indexing strategies).
  • Expertise in data modeling tools (e.g., ERwin, Lucidchart, or dbt).
  • Hands-on experience with Snowflake (virtual warehouses, time travel, zero-copy cloning, SnowSQL) is highly desirable.
  • Knowledge of data warehousing concepts (Kimball, Inmon, Data Vault).
  • Proficiency with scripting languages (e.g., Python or Bash) for automation.
  • Familiarity with version control (Git) and CI/CD for data pipelines.
  • Experience with orchestration tools (e.g., Apache Airflow, dbt Cloud, Prefect) is a plus.


Soft Skills:

  • Strong analytical thinking and problem-solving abilities.
  • Excellent communication skills to explain technical concepts to non-technical stakeholders.
  • Detail-oriented with a focus on data accuracy and system reliability.
  • Ability to work collaboratively in cross-functional agile teams.

Nice-to-Have:

  • Snowflake SnowPro Core or SnowPro Advanced Certification.
  • Experience with dbt (data build tool) for transformation and modeling.
  • Knowledge of real-time data processing (e.g., Kafka, Flink).
  • Familiarity with BI tools (e.g., Tableau, Power BI, Looker).
  • Experience with other cloud platforms (AWS Redshift, Google BigQuery, Azure Synapse).
