Data Platform Engineer (m/f/d)
Data Platform | AI / ML | GCP | Direct Hiring
The company you will work for
Enhance situational awareness and decision-making across land, sea, and air through advanced data processing and analysis.
Your New Mission
Architect and implement scalable infrastructure for data and machine learning workloads, leveraging modern cloud-native and on-premises distributed systems.
Build and maintain robust, high-performance data pipelines to ingest, process, and store large-scale datasets from diverse sources.
Administer and fine-tune databases and data warehouses to ensure high availability, data integrity, and optimal performance.
Consolidate data from various origins (APIs, internal systems, and third-party providers) into unified, analysis-ready datasets.
Develop tools and interfaces that empower business users and analysts to independently access insights and interact with ML models.
Contribute to the evolution and scaling of the Data & AI Platform, supporting new use cases and capabilities.
Uphold data governance standards, ensuring compliance with privacy regulations and implementing robust security practices.
Establish validation and monitoring mechanisms to maintain data accuracy, consistency, and reliability across systems.
Partner with data scientists, analysts, and engineering teams to understand platform needs and deliver tailored infrastructure solutions.
Implement observability tools and performance tuning strategies to ensure system reliability, scalability, and cost-efficiency.
Maintain clear, comprehensive documentation of data workflows, APIs, schemas, and ML systems to support onboarding and collaboration.
What you need to succeed
Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field (Mandatory).
5+ years of experience in data & ML platform engineering or a similar role.
Proficiency in programming languages such as Python, Go, Java, or Rust.
Strong knowledge of containerization technologies (e.g., Docker) and orchestration systems (e.g., Kubernetes) in production environments.
Experience with SQL, database management systems (e.g., MySQL, PostgreSQL, SQL Server), data modeling, and schema design.
Experience with cloud platforms and their data services, with a focus on Google Cloud.
Familiarity with big data technologies (e.g., Spark), data warehousing solutions (e.g., Redshift, Snowflake), data pipeline orchestration (e.g., Airflow), and database systems (SQL and NoSQL).
Experience designing and managing MLOps pipelines to support model training, deployment, and monitoring at scale.
Experience implementing observability stacks and logging pipelines (e.g., Prometheus, Grafana, Loki, ELK).
Experience with IaC frameworks (e.g., Ansible, Terraform).
Understanding of DevOps best practices and tools: Git, CI/CD, telemetry and monitoring, etc.
What the company can offer you
Meal Allowance
Health Insurance
Remote Work Model
Next Steps
If you are interested in this opportunity, please send us your updated CV. If you are looking for a different kind of professional challenge, feel free to contact us to discuss other career opportunities — always in complete confidentiality.
Company: Hays
Location: Setúbal, Portugal
Published: 7.6.2025