Data Engineer
Syffer is an inclusive consulting company focused on talent, tech, and innovation. We exist to elevate companies and humans around the world, making change from the inside out.
We believe that technology combined with human kindness positively impacts every community worldwide. Our approach is simple: we see a world without borders and believe in equal opportunities. We are guided by our core principles of spreading positivity, good energy, and promoting equality and care for others.
Our hiring process is unique! People are selected based on their value, education, talent, and personality. We do not consider ethnicity, religion, national origin, age, gender, sexual orientation, or identity.
It's time to burst the bubble, and we will do it together!
What You'll Do:
- Analyze user problems, ensure a clear understanding of architecture, and maintain open communication with Data Architects, peers, and Project Managers;
- Design and implement data pipelines and infrastructure (e.g., with Terraform), follow data best practices, and manage interface contracts with version control and code reviews;
- Apply strong knowledge of data warehousing, ETL/ELT processes, data lakes, and modeling throughout development;
- Define, execute, and document functional and technical tests in collaboration with the Project Manager, sharing regular updates on results;
- Participate in Deployment Reviews, monitor deployment behavior, log errors, and ensure proper use of deployment and monitoring strategies.
What You Are:
- Proficiency with PySpark and Spark SQL for data processing;
- Experience with Databricks using Unity Catalog;
- Knowledge of Delta Live Tables (DLT) for automated ETL and workflow orchestration in Databricks;
- Familiarity with Azure Data Lake Storage;
- Experience with orchestration tools (e.g., Apache Airflow or similar) for building and scheduling ETL/ELT pipelines;
- Knowledge of data partitioning and data lifecycle management on cloud-based storage;
- Familiarity with implementing data security and data privacy practices in a cloud environment;
- At least one year of experience with Terraform and good GitOps practices;
- Additional knowledge and experience, such as Databricks Asset Bundles, Kubernetes, Apache Kafka, or Vault, is a plus.
What You'll Get:
- Wage according to your professional experience;
- Remote work whenever possible;
- Health insurance from the beginning of employment;
- Delivery of work equipment suited to your role;
- And more.
Work together with expert teams on large, long-term projects with industry-leading clients.
Are you ready to step into a diverse and inclusive world with us?
Together we will promote uniqueness!
Required Skills:
Collaboration, Data Warehousing, Apache Kafka, Data Modeling, Data Processing, Azure, Spark, Pipelines, Version Control, Kubernetes, Security, Infrastructure, SQL, Design, Communication, Management, and more.
Details about the job offer:
Company: Syffer
Location: Lisboa, Portugal
Published: 18.7.2025