Job description
Join Us in Shaping the Future of Smart Ports
Are you a seasoned Data Engineer ready to drive real impact with your skills? At Stena Line Ports & Terminals Digital, we're building the intelligent infrastructure behind one of Europe's most dynamic logistics networks. We're looking for a Senior Data Engineer to help us transform raw data into actionable insights that streamline operations, optimise the Freight customer journey, and power innovative technologies across our ports. If you're passionate about modern data architecture, cloud-native engineering, and making data the heartbeat of operational excellence, this is your opportunity to lead the charge.
Some of your key responsibilities:
- Model business entities and relationships in dbt, enforcing naming conventions, documentation and testing to guarantee data contract stability.
- Build and maintain scalable ELT pipelines in Azure Data Factory and Azure Functions, ingesting data from on-prem TOS, third-party port systems and SaaS sources into Azure Data Lake Storage Gen2.
- Publish curated data marts and analytics views via Dremio, optimising reflections for interactive BI and advanced analytics workloads.
- Design event-driven integrations over Azure Service Bus and Event Grid to distribute high-value data to micro-services running on Red Hat OpenShift.
- Implement data-quality, lineage and privacy controls; embed GDPR and PII-handling rules directly in pipelines and storage policies.
- Collaborate with Data Scientists to productionise features and model outputs, ensuring repeatable, automated data refresh and monitoring.
- Tune performance, capacity and cost of data workloads; propose partitioning, indexing and caching strategies that keep SLAs intact 24/7.
- Set up observability (logs, metrics, alerts) with Azure Monitor and Power BI dashboards to enable rapid incident response and root-cause analysis.
- Promote best-in-class DevOps practices: Infrastructure as Code, Azure DevOps for CI/CD, and automated unit and integration tests for every pipeline.
- Mentor other engineers, run code reviews, and contribute to internal standards and shared dbt packages.
What you will experience
This role is for you if you thrive in an environment that is not set in stone, where you can contribute to building solutions and implementing new products in a collaborative setting. You will work on forward-looking projects alongside a talented and growing team that is passionate about developing and rolling out new products across our networks.
We believe a hybrid set-up, combining time in the office with remote work, creates the best environment for us to be creative, productive and maintain a healthy work-life balance while creating magic today and tomorrow.
Who you are
At Stena Line, your personality matters as much as how good you are at what you do. Regardless of your role, our values of being welcoming, caring and reliable guide you in your everyday work and the challenges you face. We believe you are a people-oriented individual who excels in collaboration and communication with everyone you meet. You are a self-starter who enjoys sharing new ideas and ways of working.
You approach problems with a sharp analytical mindset, capable of translating complex business needs into efficient, maintainable data models and pipelines. With excellent communication skills, you thrive in cross-functional environments and enjoy collaborating across teams and time zones to deliver smart, data-driven solutions.
Qualifications:
- A university degree in Computer Science, Data Engineering, or a related field, with several years of hands-on experience designing and building scalable data platforms in the Microsoft Azure ecosystem.
- Deep expertise in dbt, including model development, testing, and macros; fluency in Azure Data Factory, particularly with mapping and wrangling data flows; strong command of SQL and/or Python for complex data transformations.
- Proven track record working with Dremio or equivalent lakehouse query engines (e.g., Presto/Trino, Azure Synapse Serverless), including performance optimization at scale.
- Strong understanding of event-driven architectures and messaging patterns using technologies like Azure Service Bus or Apache Kafka.
- Proficiency in containerization technologies such as Docker, along with a solid grasp of Kubernetes or OpenShift for orchestrating data workloads.
- Hands-on experience implementing data governance best practices, including security controls, Azure Key Vault, secrets management, and managed identities.
- Familiarity with CI/CD pipelines, Git workflows, test-driven development for data, and Infrastructure as Code using tools like Terraform, Bicep, or Pulumi.
- Experience in the shipping or logistics industry is a bonus—but not required. Your curiosity and drive to learn are what matter most.
Interested?
This is a full-time, permanent position based in Gothenburg within our Ports & Terminals Digital team. To apply, please register your profile and submit your application in English as soon as possible, but no later than May 28th, 2025. The selection process is ongoing and the position may be filled before the final application date.
Please note that due to GDPR we do not accept applications via e-mail or postal service. We have collective bargaining agreements with Unionen, among others, whom you can contact for more information.
If you have any questions regarding the position, you are welcome to contact Jessica van Osnabrugge, Engineering Manager, at [email protected]. If you have questions about the recruitment process, you are welcome to contact Hanna Gustavsson, Talent Acquisition Partner, at [email protected].
Please note that we kindly decline any offers from recruitment or staffing agencies regarding this recruitment.