Analytics Engineer (M/F/D)
Havi Tech Hub | Intelligent Automation, Data & Analytics | Data Products & Analytics
Your new Company
Havi, a global leader since 1974, employs over 10,000 people and serves customers in more than 100 countries. Specializing in the foodservice industry, Havi provides innovative supply chain and logistics solutions, including analytics, planning, distribution, and freight management.
Havi’s diverse teams collaborate seamlessly across locations and functions, embodying a spirit of integrity and creativity to serve their customers in the best way possible.
Your new Role
The Analytics Engineer will design, build, and operate trusted, documented, and performant analytics datasets and the metrics layer that power decision-making, Intelligent Automation, and AI (via the enterprise AI gateway/RAG).
This person will use SQL + dbt and software-engineering practices (tests, CI/CD and code review) to turn raw platform data into certified domain data products for Supply Chain (e.g., Distribution & Freight, GBS/Control Tower, HR).
The Analytics Engineer will:
Modeling & Semantic Layer:
- Design structured dbt projects (staging → intermediate → marts) with a trusted semantic layer for certified KPIs (illustrated below);
- Enforce naming standards, ownership tags, and versioning to prevent ad-hoc SQL drift.
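For illustration only, a minimal sketch of the staging layer described above, assuming a hypothetical logistics source and column names (none of these identifiers come from Havi's actual platform):

```sql
-- models/staging/stg_shipments.sql (hypothetical source and columns)
-- Staging layer: rename, cast, and lightly clean raw platform data
-- before intermediate and mart models build on it.
with source as (
    select * from {{ source('logistics', 'shipments') }}
)
select
    cast(shipment_id as varchar)        as shipment_id,
    cast(order_id as varchar)           as order_id,
    cast(ship_date as date)             as ship_date,
    lower(trim(origin_site))            as origin_site,
    cast(freight_cost_usd as numeric)   as freight_cost_usd,
    cast(updated_at as timestamp)       as updated_at
from source
```

Mart models would then join and aggregate staging models into the certified KPIs exposed through the semantic layer.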
Data Quality & Contracts:
- Build robust schema and custom tests for Tier-1 models (e.g., not null, unique, relationships), as sketched below;
- Establish data contracts with producers/consumers, integrate checks into CI, and ensure fast failure on breaking changes.
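As a hedged illustration of such a test, a dbt singular test written in SQL (the generic not-null/unique/relationships tests are normally declared in YAML; the model and column names here are assumptions):

```sql
-- tests/assert_shipments_have_valid_orders.sql (hypothetical names)
-- Singular dbt test: the test fails if this query returns any rows.
select s.shipment_id
from {{ ref('stg_shipments') }} s
left join {{ ref('stg_orders') }} o
    on s.order_id = o.order_id
where s.order_id is not null
  and o.order_id is null  -- orphaned shipments violate the relationship contract
```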
Performance & Cost Optimization:
- Optimize large fact models using incremental strategies, partitioning, clustering, and surrogate keys (see the incremental sketch below);
- Monitor query SLAs, caching, concurrency, and cost per workload – proactively resolving performance hotspots.
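A minimal sketch of the incremental strategy mentioned above, assuming a hypothetical fact model with an updated_at watermark column:

```sql
-- models/marts/fct_shipments.sql (hypothetical model; columns are assumptions)
{{ config(
    materialized = 'incremental',
    unique_key = 'shipment_id',
    on_schema_change = 'fail'
) }}

select
    shipment_id,
    order_id,
    ship_date,
    origin_site,
    freight_cost_usd,
    updated_at
from {{ ref('stg_shipments') }}

{% if is_incremental() %}
  -- On incremental runs, only process rows changed since the last build.
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```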
CI/CD & Reliability:
- Apply Git workflows with PR reviews, run dbt builds/tests in CI, and manage dev/QA/prod environments;
- Maintain runbooks for failures and track reliability metrics (change failure rate, MTTR) using observability tools.
Documentation & Lineage:
- Keep model documentation and column-level lineage up to date;
- Align glossary terms with Data Governance and publish clear, accessible docs (dbt docs + catalog).
BI Enablement:
- Expose certified data marts and metrics to BI tools (Power BI, Tableau and Looker);
- Define aggregates, row-level security, and refresh strategies (an aggregate sketch follows this list);
- Collaborate on dashboard performance and retire outdated sources.
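For illustration, a sketch of the kind of pre-aggregated, certified mart a BI tool might consume (hypothetical grain and measures, reusing the fictional shipment model above):

```sql
-- models/marts/agg_daily_freight_cost.sql (hypothetical BI aggregate)
-- Pre-aggregating to the reporting grain keeps dashboards fast and cheap.
select
    ship_date,
    origin_site,
    count(distinct shipment_id)  as shipments,
    sum(freight_cost_usd)        as total_freight_cost_usd
from {{ ref('fct_shipments') }}
group by ship_date, origin_site
```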
DS/AI Enablement:
- Deliver AI-ready feature tables with freshness SLAs and stability;
- Tag sensitivity and ownership;
- Partner with AI teams to meet safety and evaluation standards for data used in AI applications.
Stewardship, Privacy & Compliance:
- Apply masking and pseudonymization for sensitive domains (HR and Legal), as sketched below;
- Respect data retention and residency policies;
- Ensure audit-readiness through tests, lineage, and access controls.
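A hedged sketch of pseudonymization applied in the transformation layer (the actual approach would follow Havi's governance and privacy policies; the HR source and columns are assumptions):

```sql
-- models/staging/stg_hr_employees.sql (hypothetical HR source)
-- Pseudonymize direct identifiers before data reaches downstream marts;
-- direct identifiers such as employee_name are intentionally not selected.
select
    md5(cast(employee_id as varchar))  as employee_key,  -- deterministic pseudonym
    left(postal_code, 3)               as postal_area,   -- generalized location
    department,
    hire_date
from {{ source('hr', 'employees') }}
```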
Collaboration with Platform & Ingestion:
- Work closely with ingestion teams on specs, late-arriving data, backfills, and incident resolution;
- Validate handovers using clear acceptance criteria.
Continuous Improvement:
- Build reusable macros, packages, and templates (a macro sketch follows this list);
- Enhance developer experience and contribute to architectural best practices.
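An example of the kind of reusable macro referred to above – a small, hypothetical Jinja helper rather than anything Havi-specific:

```sql
-- macros/cents_to_usd.sql (hypothetical reusable macro)
-- Centralizes a repeated conversion so every model applies it identically.
{% macro cents_to_usd(column_name) %}
    round(cast({{ column_name }} as numeric) / 100, 2)
{% endmacro %}
```

A model would then call {{ cents_to_usd('freight_cost_cents') }} instead of repeating the cast-and-round logic.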
What you will need to succeed
To succeed as an Analytics Engineer you will need:
- Bachelor’s Degree in Computer Science, Information Systems, Data/Analytics, Statistics, Engineering or related (mandatory);
- Master’s Degree in Data/Analytics or Information Systems (would be a plus);
- Certifications in dbt (Fundamentals/Advanced), Databricks, Cloud (AWS/Azure/GCP data), Power BI (would be a plus);
- 4+ years of experience in Analytics Engineering / BI Data Modeling;
- Track record shipping tested dbt models, semantic layers/metrics, and high-adoption data marts;
- Strong SQL and warehouse performance tuning (partitioning, clustering, pruning and incremental models);
- Hands-on CI/CD (git, PRs, automated tests), orchestration (Airflow/Dagster/dbt Cloud), and documentation;
- Familiarity with data contracts, lineage/metadata, DQ tools, and privacy for sensitive domains (HR/Legal);
- Understanding of Supply Chain data – orders, shipments, lanes, cost-to-serve and OTIF (would be a plus);
- Innovative mindset – Thinks beyond the obvious, asks questions, adapts fast, learns from failure and anticipates risks and opportunities;
- API and Data Contract Collaboration – Understands API specs and works seamlessly with ingestion teams to define and maintain robust data contracts;
- Performance and Cost Awareness – Monitors query efficiency and dashboard performance with a FinOps mindset, ensuring cost-effective, scalable solutions;
- Collaboration skills – Builds trusting relationships, avoids isolation, promotes open dialogue, shares information, supports team decisions, embraces diverse perspectives, listens actively;
- Delivering Results – Sets ambitious goals, speaks up about risks, proposes solutions, and acts swiftly. Takes ownership and empowers others. Avoids complacency and blame-shifting;
- Pragmatic and automation-driven – Strikes the right balance between speed and governance. Automates repetitive tasks using macros, tests, and templates to boost productivity and reduce manual effort;
- Customer Focus – Builds partnerships, seeks feedback, anticipates needs, communicates transparently, and adapts to change;
- Excellent communication skills – Fluency in English.
What the Company can offer you
Have the opportunity to join a cross-functional team in an international company with a multicultural working environment!
Detailed information about this job offer
Company: Hays | Location: Lisboa, Lisboa, Portugal | Published: 18.9.2025