Staff Data Engineer/Data Architect
Transporeon is a SaaS company founded in 2000 in Ulm, Germany. The company provides logistics solutions across several areas, including:
- Buying & selling of logistics services
- Organizing shipment execution
- Organizing dock, yard, truck, and driver schedules
- Invoice auditing for logistics services
Your daily tasks...
- Lead the design and implementation of robust, scalable, and secure cloud-based data pipelines and architectures in AWS (later possibly migrating to MS Azure).
- Ensure best practices in code quality, architecture, and design.
- Design and implement secure, scalable, and high-performance cloud infrastructure.
- Manage cloud resources, optimize costs, and ensure high availability and disaster recovery.
- Automate infrastructure provisioning and deployment processes using Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, and ARM templates.
- Collaborate with cross-functional teams to understand data needs and deliver comprehensive cloud solutions.
- Oversee cloud infrastructure management, including monitoring, maintenance, and scaling of cloud resources.
- Ensure compliance with industry standards and regulatory requirements.
- Implement data governance policies and practices and ensure high data quality, integrity, and security across all cloud platforms.
- Identify and implement process improvements to enhance efficiency, quality, and scalability of data engineering and cloud operations.
- Stay current with emerging technologies and industry trends to drive innovation. Utilize AI to increase our efficiency.
Tech stack...
- Infrastructure: Glue, Lambda, Step Functions, Batch, ECS, QuickSight, Machine Learning, SageMaker, Dagster
- DevOps: CloudFormation, Terraform, Git, CodeBuild
- Database: Redshift, PostgreSQL, DynamoDB, Athena (Trino), Snowflake, Databricks
- Language: Bash, Python (PySpark, Pydantic, PyArrow), SQL
- Min. 8 years of proven experience as a Data Engineer, with a current focus on data architecture design.
- Extensive experience with AWS and Azure cloud platforms and their data services (e.g., AWS Redshift, AWS Glue, AWS S3, Azure Data Lake, Azure Synapse, Snowflake, Databricks).
- Strong understanding of ETL processes, data warehousing, and big data technologies.
- Proficiency in SQL, Python, and other relevant programming languages.
- Experience with Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or ARM templates.
- Knowledge of containerization and orchestration (e.g., Docker, Kubernetes).
- Understanding of cloud cost management and optimization strategies.
- Familiarity with CI/CD pipelines and DevOps practices.
- Excellent leadership, communication, and interpersonal skills.
- Ability to work in a fast-paced, dynamic environment.
- Strong problem-solving and analytical skills.
- Familiarity with data visualization tools (e.g., Power BI, QuickSight) is a plus.
- Openness to using AI, in our case Cursor, as your daily tool.
Remote role in the following countries: Estonia, Latvia, Lithuania, Poland, Slovakia, Hungary, Romania, Portugal, Spain, Italy, Croatia. We are not offering a freelancing contract, only a local employment contract.
Our Inclusiveness Commitment
We believe in celebrating our differences. That is why our diversity is our strength. To us, that means actively participating in opportunities to be inclusive. Diversity, Equity, and Inclusion have guided our current success while also driving our desire to improve. We actively seek to add members to our community who represent our customers and the places we live and work.
We have programs in place to make sure our people are seen, heard, and welcomed, and most importantly, that they know they belong, no matter who they are or where they are coming from.
Company: Trimble Inc. Location: Funchal, Madeira, Portugal. Published: 1.10.2025