PKN Data Architect


Job description

The service to be provided focuses on the data architect and data engineer roles for the PKN product (Product Knowledge Network), including conceptualization, development, and advisory activities. The application is customized standard software built on a triplestore (Eccenca's Corporate Memory & GraphDB), including integration of different source and target systems. PKN is a tool to manage interdependencies between products, and between products and related assets (e.g. video links). These relationships can be built implicitly, based on rules, or explicitly, based on connections between two products.
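The distinction between explicit and rule-derived (implicit) relationships can be sketched with plain subject-predicate-object triples. The product names, predicates, and the shared-platform rule below are illustrative assumptions, not PKN's actual data model; in PKN itself such triples would live in the triplestore and rules would typically be expressed in SPARQL or the platform's own machinery:

```python
# Minimal sketch of PKN-style relationships as (subject, predicate, object) triples.
# All names (products, predicates, the shared-platform rule) are hypothetical.

# Explicit relationships: stated directly, like a manually created link
# between two products or between a product and an asset.
explicit = {
    ("ProductA", "relatedTo", "ProductB"),
    ("ProductA", "hasAsset", "video:intro-a"),
}

# Facts consumed by a rule: which platform each product is built on.
facts = {
    ("ProductA", "builtOn", "Platform1"),
    ("ProductC", "builtOn", "Platform1"),
}

def derive_implicit(facts):
    """Rule: two distinct products built on the same platform are implicitly related."""
    implicit = set()
    for (s1, p1, o1) in facts:
        for (s2, p2, o2) in facts:
            if p1 == p2 == "builtOn" and o1 == o2 and s1 != s2:
                implicit.add((s1, "relatedTo", s2))
    return implicit

# The queryable graph is the union of explicit and derived triples.
graph = explicit | derive_implicit(facts)
```

This mirrors, at toy scale, the way a triplestore materializes or infers links on top of explicitly asserted ones.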


To achieve this, we expect the following competences:

Verifying and consolidating application architectural requirements
Gathering and clarifying data requirements
Conceptualizing functional change requests / new requirements into architectural concepts (‘data architect’) regarding
      o Knowledge graph-based applications (CMEM by Eccenca, Web frontend)
      o Interfaces to consumer/provider systems based on Kafka streaming technology
      o Workflows for ETL automation based on third party software
Implementing or adjusting workflows in an enterprise knowledge graph platform

To deliver the above competences, we expect the outsourcing company to provide consultants with a skillset and experience covering:

Proven experience in a Data Architect role, with exposure to graph databases (e.g. Neo4j, GraphDB)
Data Engineering: Strong foundation in traditional data engineering concepts, including ETL processes, data streaming, and database management
Basic knowledge in AWS (S3, ECS, EKS, Cloudwatch)
Deep knowledge of SQL
Ontology Design and Development: Ability to create and manage ontologies that accurately represent the knowledge in a specific domain
Intermediate knowledge of Git (e.g. CI/CD)
Basic knowledge of cyber security concepts
Basic knowledge of programming languages such as Python, Java, or JavaScript

We give preference to consultants that also demonstrate competences in:

Proficiency in semantic technologies: RDF, OWL, RDFS, semantic web vocabularies (DC, FOAF, SKOS), and graph validation (SHACL)
Experience with RDF query languages such as SPARQL
At least 3 years of proven relevant professional experience in global IT operations, preferably in the semantic web environment and with SaaS and cloud solutions
Advanced knowledge of Agile methodologies and DevOps approach
Knowledge of event-driven architectures and experience with Apache Kafka (e.g. Confluent)
