Data Engineering Tech Lead - Porto
Porto, Porto District, Portugal

We are seeking a skilled Tech Lead with a strong foundation in AWS and extensive experience in solution architecture, database management, and data engineering. The ideal candidate will possess deep expertise in AWS services such as Lambda, Glue, Step Functions, and CloudFormation, alongside proficiency in Python, SQL, and database technologies. Familiarity with Iceberg tables for managing large datasets is highly advantageous, and experience with Azure platform technologies is a nice-to-have.

As a Tech Lead (at least 2-3 years of previous experience in similar roles and 4-5 years in data engineering technical roles), you will be responsible for leading the design, implementation, and maintenance of scalable data solutions on the AWS or Azure cloud platforms, aligned with the defined data architecture. You will collaborate closely with cross-functional teams to develop and optimise data pipelines, ETL processes, and orchestration workflows, utilising tools like Apache Airflow and AWS Step Functions. Your role will involve translating business requests into technical tasks and distributing these tasks effectively among team members. Additionally, you will provide technical leadership, mentorship, and guidance to the engineering team, ensuring best practices in production awareness and troubleshooting are upheld.

HYBRID: 3 DAYS PER WEEK ONSITE

Key Responsibilities

Utilise expert knowledge of AWS services like Lambda, Glue, and Step Functions to design, implement, and maintain scalable data solutions.

Develop robust solution architectures considering scalability, performance, security, and cost optimisation.

Demonstrate proficiency in cloud networking, including VPCs, subnets, security groups, and routing tables.

Design efficient data models for optimal query performance.

Write and optimise SQL queries and identify performance bottlenecks.

Manage ETL processes and data integration into Redshift, MySQL, and PostgreSQL.

Create documentation and provide training to team members.

Set up and manage logging and tracing mechanisms in AWS using services like AWS CloudTrail and AWS X-Ray.

Implement orchestration solutions using Apache Airflow and AWS Step Functions.

Utilise Athena for interactive query analysis of large datasets in Amazon S3.

Provide technical leadership and guidance, acting as a subject matter expert in AWS and data engineering technologies.

Write comprehensive solution documents and technical documentation.

Stay updated on emerging technologies and industry trends.

Challenge business requirements and propose innovative solutions for efficiency and performance improvement.

The base skills we are looking for are:

Expertise in AWS: Extensive experience with AWS services, including Lambda, Glue, Step Functions, CloudFormation, and CloudWatch.

Strong Solution Architecture Knowledge: Ability to design scalable and efficient data solutions on AWS, adhering to best practices for cloud architecture and infrastructure.

Proficiency in Python and Databases: Strong programming skills in Python and experience with relational databases (MySQL, PostgreSQL, Redshift) and NoSQL databases.

Orchestration and Workflow Management: Experience with orchestration tools such as Apache Airflow and AWS Step Functions for automating and managing data workflows.

ETL Tools and Big Data Experience: Knowledge of ETL tools and experience working with large volumes of data, with a preference for experience with Kafka.

Experience with Iceberg Tables: Familiarity with Iceberg tables for managing large datasets efficiently, ensuring data consistency, and supporting ACID transactions.

Production Awareness and Troubleshooting: Proactive approach to production monitoring and troubleshooting, with the ability to anticipate and mitigate potential issues.

Technical Leadership and Communication: Capability to evolve into a technical lead role, with excellent communication and teamwork skills for effective collaboration with cross-functional teams.

Strong Analytical and Problem-Solving Skills: Ability to analyse requirements, define technical approaches, and propose innovative solutions to complex problems.

Documentation and Requirements Analysis: Experience in writing solution documents and technical documentation, with the ability to challenge and refine business requirements.

Exposure to Azure Databricks: Familiarity with Databricks for data engineering and analytics tasks.

CANDIDATE PROFILE

– Must have:

BS/MS degree in Computer Science, Engineering or equivalent working experience

English (B2 or higher level)

Passion for technology and support.

Problem-solving skills.

– Nice to have:

Knowledge of Apache Flink, Kafka, and other streaming data technologies.

Experience with cloud-native architectures and serverless computing.

Certification in AWS, Azure or relevant technologies.
