Assignment description
Assignment Overview
We are looking for a Senior Data Engineer to join a high-impact data and analytics team within a global enterprise organization. The consultant will take end-to-end responsibility for developing and supporting cloud-native data products in line with modern data mesh architecture and DevOps practices.
The mission includes building scalable pipelines and infrastructure on Google Cloud Platform (GCP) using tools such as Dataflow, BigQuery, and dbt, while ensuring high standards of observability, security, and performance.
Key Responsibilities
- Build, optimize, and support new and existing data products
- Manage CI/CD pipelines and Terraform infrastructure in GCP
- Ensure non-functional requirements (security, scalability, observability) are met
- Collaborate with product owners and cross-functional teams on the data vision
- Support data mesh principles and continuous improvement efforts
- Contribute to reducing technical debt and driving cloud/data maturity
Must-Have Skills & Experience
- 4+ years of hands-on experience as a Data Engineer on GCP
- Strong experience with Dataflow, BigQuery, and dbt
- Proficient in SQL and data-centric programming (Python, Java, or Scala)
- Familiarity with data formats such as Avro and Parquet
- Experience with CI/CD, Terraform, and DevOps in a cloud context
- Good understanding of data modeling techniques
- Knowledge of both NoSQL and relational (RDBMS) databases
- Collaborative mindset and excellent communication skills
- Fluent in English (written and verbal)
Nice-to-Have
- Experience with data visualization tools
- Understanding of retail industry use cases
- Familiarity with data mesh architecture and cross-team data collaboration
Personal Attributes
- Passion for data, technology, and team collaboration
- Self-driven and comfortable making technical decisions
- Curious, proactive, and quality-focused
- Structured and delivery-oriented
Required skills – GCP, DevOps, CI/CD
Languages – English & Swedish (Proficient)