AWS Lead Data Engineer
Location: Porto, Portugal
Work Regime: Full-time & Hybrid (3x per week at the office)
Overview / Summary:
We are looking for a Senior Data Engineer to act as a Tech Lead (with at least 2-3 years of experience in similar roles and 4-5 years in data engineering technical roles) who will be responsible for:
- Leading the design, implementation, and maintenance of scalable data solutions on the AWS or Azure cloud platforms, aligned with the defined data architecture;
- Collaborating closely with cross-functional teams to develop and optimise data pipelines, ETL processes, and orchestration workflows, utilising tools like Apache Airflow and AWS Step Functions;
- Translating business requests into technical tasks and distributing these tasks effectively among team members;
- Providing technical leadership, mentorship, and guidance to the engineering team, ensuring best practices in production awareness and troubleshooting are upheld.
- Utilise expert knowledge of AWS services like Lambda, Glue, and Step Functions to design, implement, and maintain scalable data solutions;
- Develop robust solution architectures considering scalability, performance, security, and cost optimisation;
- Demonstrate proficiency in cloud networking, including VPCs, subnets, security groups, and routing tables;
- Design efficient data models for optimal query performance;
- Write and optimise SQL queries and identify performance bottlenecks;
- Manage ETL processes and data integration into Redshift, MySQL, and PostgreSQL;
- Create documentation and provide training to team members;
- Set up and manage logging and tracing mechanisms in AWS using services like AWS CloudTrail and AWS X-Ray;
- Implement orchestration solutions using Apache Airflow and AWS Step Functions;
- Utilise Athena for interactive query analysis of large datasets in Amazon S3;
- Provide technical leadership and guidance, acting as a subject matter expert in AWS and data engineering technologies;
- Write comprehensive solution documents and technical documentation;
- Stay updated on emerging technologies and industry trends;
- Challenge business requirements and propose innovative solutions for efficiency and performance improvement.
Requirements
Mandatory Requirements:
- Production Awareness and Troubleshooting: Proactive approach to production monitoring and troubleshooting, with the ability to anticipate and mitigate potential issues;
- Technical Leadership and Communication: Capability to evolve into a technical lead role, with excellent communication and teamwork skills for effective collaboration with cross-functional teams;
- Strong Analytical and Problem-Solving Skills: Ability to analyse requirements, define technical approaches, and propose innovative solutions to complex problems;
- ETL Tools and Big Data Experience: Knowledge of ETL tools and experience working with large volumes of data, with a preference for experience with Kafka;
- Expertise in AWS: Extensive experience with AWS services, including Lambda, Glue, Step Functions, CloudFormation, and CloudWatch;
- Strong Solution Architecture Knowledge: Ability to design scalable and efficient data solutions on AWS, adhering to best practices for cloud architecture and infrastructure;
- Proficiency in Python and Databases: Strong programming skills in Python and experience with relational databases (MySQL, PostgreSQL, Redshift);
- Orchestration and Workflow Management: Experience with orchestration tools such as Apache Airflow and AWS Step Functions for automating and managing data workflows;
- Fluent in written and spoken English;
- Degree in Computer Engineering or similar.
Complementary Requirements:
- Knowledge in Apache Flink, Kafka, and other streaming data technologies;
- Experience with cloud-native architectures and serverless computing;
- Certification in AWS, Azure, or relevant technologies;
- Exposure to Azure Databricks: Familiarity with Databricks for data engineering and analytics tasks;
- Experience with Iceberg Tables: Familiarity with Iceberg tables for managing large datasets efficiently, ensuring data consistency, and supporting ACID transactions.
Important:
- Our company does not sponsor work visas or work permits. All applicants must have the legal right to work in the country where the position is based.
- Only candidates who meet the required qualifications and match the profile requested by our clients will be contacted.
Future - Build the future, join our living ecosystem!
Detailed information about the job offer
Company: LUZA Group Portugal
Location: Porto, Porto District, Portugal
Published: 11 August 2025