Competence
| Competence | Experience | Priority |
|---|---|---|
| Linux | 3-5 Years | Essential |
| Python | 3-5 Years | Essential |
| SOLID principles | 3-5 Years | Essential |
Description
For our client, we are looking for a software developer.
You will be part of the team building the most modern data platform, which is the foundation for all our machine learning and advanced analytics products. The team builds data processing pipelines and the accompanying services for data quality and data processing. We work in a truly agile way, and you will have an important voice in priorities, technical decisions and ways of working. Your work location is in central Stockholm, in a brand-new and laid-back office.
Work tasks
• Develop data processing pipelines using Python, SQL, Airflow and Spark.
• Develop reusable modules for data processing, using modern software engineering practices: unit testing, continuous integration, packaging and documentation.
• Be an active team player in proposing improvements, participating in code reviews and planning our roadmap.
Competences
Required skills/qualifications:
• You strive for excellence in code and in automation.
• Applied experience with software development principles such as DRY, SOLID and idempotency; you can explain them and refactor code for modularity, testability and overall robustness.
• Comfortable with scripting and development in a Linux-based environment.
• Experience in developing Python-based applications is required, but your overall experience with programming is equally valued.
• Experience with database querying and modelling in traditional relational databases, and preferably with modern columnar formats such as Parquet.
• Good writing skills in English. We like clear and easy-to-understand stories, comments, pull requests and reviews. Slides are not important.
• Good verbal communication skills. We want an active participant in our agile ceremonies, pair programming and technical workshops with the team.
Beneficial/“Nice to have” qualifications:
• Experience with Docker and Kubernetes
• Experience with batch and stream data processing using Spark.
Tools/software:
• Python
• Spark
• Databricks
• Git-based development
• Linux
Personal Competences
• Open minded, willing to learn new things and adapt.
• You are enthusiastic about the technologies we are using.
• Good verbal and written communication in English.