Publicis Media | Data Engineering Data Engineer (Mid/Senior)
Feeling Ingenious? We can't wait to see what you bring to the team! Apply now!
This is Us
Ingenious Lion is a Digital Tech Hub designed exclusively for Publicis Groupe, one of the world’s leading marketing agencies.
As the marketing industry continues to drive digital transformation, Publicis Groupe empowers its clients with innovative data solutions and media ecosystems. Ingenious Lion has one mission: to partner with Publicis’ global teams in creating innovative data and digital tech products that will shape the future.
What makes us successful? Two key ingredients: we pursue excellence and aim for the best, and we cultivate a thriving company culture. We focus on creating an environment where people don't just work; they grow, challenge themselves, and are recognized for their immense value to the team.
If you're curious to learn more about us, here are four things you should know:
- Our company culture is anchored in five powerful values: Joy, Honesty, Connectivity, Own Challenger, and Greater World. Do these values resonate with you?
- We are committed to diversity and inclusion because we believe our unique differences make us stronger.
- We encourage open communication and create an environment where everyone’s voice is heard and every contribution is valued.
- Finally, we’re somewhat “obsessed” with quality. We believe in continuous learning and knowledge sharing; when we hit a roadblock, we don’t just complain—we create solutions to overcome it.
In one sentence, we are a group of people with an unusual aptitude for generating great ideas, groundbreaking concepts, and infusing them with positive energy and good fun.
A Sneak Peek at Your Future Role
As a Data Engineer, your mission is to design, build, and optimise data solutions that support the transition towards a modern Lakehouse architecture. You'll work on creating efficient data workflows, ensuring that information is reliable, structured, and ready to power decision-making across the organisation.
This role focuses on enabling seamless data movement and transformation, combining the flexibility of a data lake with the performance of a data warehouse. You’ll play a key part in shaping the platform, strengthening data foundations, and unlocking advanced analytics opportunities.
You'll take ownership of data solutions end to end, from modelling and transformation to workflow automation, ensuring performance and quality. Collaborating with an international team, you'll align technical delivery with business priorities, while promoting best practices in scalability, governance, and reliability.
Here Are the Key Tasks You'll Be Working on Every Day
- Design, build, and optimise data workflows that support large-scale migrations and transformations, ensuring accuracy, reliability, and performance across different environments.
- Develop and maintain processes that orchestrate end-to-end data operations, enabling efficient movement and integration of datasets for analytical and operational purposes.
- Contribute to the evolution of a modern data platform that combines the scalability of a data lake with the structure and governance of a data warehouse, supporting the Lakehouse architecture.
- Collaborate with colleagues in an agile setting, aligning technical delivery with business priorities and ensuring solutions are fit for purpose.
- Take ownership of monitoring, troubleshooting, and improving data workflows, continuously enhancing processes for efficiency, scalability, and resilience.
We're Thrilled to Have You Consider Joining Our Team. Here Are the Qualifications We're Seeking
- At least three years of experience with databases and building scalable data pipelines.
- Experience designing and maintaining scalable data pipelines with Databricks, including notebooks and jobs to support ETL processes.
- Proficiency with SQL, Python (with PySpark), and Databricks.
- Working experience with AWS Redshift, including proficiency with Redshift SQL.
- Strong understanding of modern data architectures, particularly the Delta Lakehouse model.
- Knowledge of data warehouse and data lake concepts, with practical experience in preparing and managing data tables.
- Strong communication skills in English, both written and spoken.
Skills That Would Make You Shine Even Brighter!
- You are familiar with Terraform, including workflow automation and configuration (YAML-based).
- You know infrastructure-as-code concepts and structured workflow management.
- You have previous involvement with Airflow and Dagster.
Department Publicis Media | Data Engineering
Location Portugal
Remote status Fully remote
Yearly salary 56000 - 64400 EUR
Employment type Full Time
We look forward to chatting with you!
Company: JAyDHubMaker
Location: Lisboa, Portugal
Published: 12.10.2025