Fabric Data Engineer

Berner is looking for a Fabric Data Engineer to play a key role in shaping and building the Group’s modern data platform on Microsoft Fabric. You will design and implement scalable data pipelines, Lakehouse architectures, and integrations that serve multiple business units and international subsidiaries. You will work closely with business stakeholders and contribute to solutions that support decision-making, while developing both your technical and business expertise.

The role is located in Helsinki, and you can work flexibly in a hybrid model, with approximately three days per week at the office. The role may require a few business trips per year. The employment is full-time and permanent, and the preferred start date is as soon as possible.

Role and responsibilities
Your key responsibilities will be:

  • Designing and implementing Lakehouse architecture, data pipelines, and integrations on the Microsoft Fabric platform
  • Building ETL/ELT processes from various source systems into a unified data platform
  • Writing and optimizing Spark transformations (PySpark, Spark SQL) and SQL queries
  • Ensuring data quality, availability, and security
  • Building and maintaining CI/CD pipelines and version control (Git, deployment pipelines)
  • Collaborating with the business to identify data needs and translate them into functional solutions

The primary focus will be on developing the centralized Fabric platform. During the first year, you will focus on integrating key data sources into Microsoft Fabric, gaining a thorough understanding of the current implementation, and further developing and refining it based on evolving business needs.

Experience, skills and knowledge
What Berner expects from you:

  • Experience in data engineering roles, preferably in Microsoft Fabric or Azure environments
  • Strong expertise in key Fabric components: OneLake, Lakehouse, Data Factory, Data Pipelines, Spark Notebooks
  • Deep SQL skills – ability to design, write, and optimize queries
  • Python skills for data processing and automation
  • Experience with Delta Lake format, medallion architecture, data partitioning, and incremental loading
  • Ability to communicate clearly with both technical and non-technical stakeholders
  • A proactive and solution-oriented mindset
  • Business understanding and a genuine interest in how data supports decision-making
  • Fluency in English; Finnish is considered a benefit

It is considered a plus if you have:

  • Microsoft Certified: Fabric Data Engineer Associate certification
  • Experience with dbt, AWS, or Snowflake
  • Knowledge of Power BI and understanding of Direct Lake connectivity and semantic models
  • Relevant higher education degree (e.g. computer science, software engineering, or similar)

What Berner offers you
Berner is a well-established Finnish company with a long history and a strong track record of profitable growth across multiple business areas. The company invests systematically in technology, data, and AI, providing a strong foundation for long-term development. In this role, you will have the opportunity to build the Group’s data platform using modern technologies. This reflects Berner’s strong strategic commitment to data-driven operations and offers the chance to work with a modern, scalable architecture. The role itself is new, highlighting the company’s willingness to invest in this area. You will enjoy a hybrid working model, flexible working hours, and a low-hierarchy culture where everyone’s voice is heard.

Does this sound like a perfect opportunity for you?
Apply by Wednesday 22.4.2026 at the latest by sending your CV and salary request via the link below. The recruitment partner is Avila Oy; for more information about the position, please contact Avila’s Recruitment Consultant Loviisa Mäki, loviisa.maki@avila.fi.

Submit your application