We’re now looking for a hands-on Lead Data Platform Engineer to help kickstart our customer’s data platform journey. You’ll design and build key components, make pragmatic technology choices, and work closely with platform, infrastructure, and engineering teams to create a scalable, developer-friendly data platform.

We’re looking for someone who thrives on rolling up their sleeves — writing code, solving integration challenges, and delivering real value from day one — while also bringing a structured mindset, curiosity, and strategic thinking to help shape the platform long term.

What You’ll Do
• Design, implement, and integrate core platform components for the dataset catalog, storage standards, publishing SDKs, orchestration, and metadata tracking
• Evaluate and integrate open-source tools such as Flyte, Dagster, Amundsen, and DataHub
• Help shape how Databricks fits into the platform strategy and make it play well with the rest of the platform
• Establish conventions and tooling for dataset formats (e.g. Parquet), versioning, access control, and schema definitions
• Help define and build the golden path for producing and consuming datasets across teams
• Embed platform thinking into everything — making the platform extensible, observable, and pleasant to use

You Might Be a Fit If You…
• Have 7+ years of hands-on software engineering experience, including building data-intensive systems or platforms
• Are familiar with data pipeline orchestration (Airflow, Flyte, Dagster, etc.)
• Know your way around cloud platforms, ideally Azure, though GCP or AWS experience also works
• Are comfortable working with Databricks and its ecosystem – hands-on experience is a strong plus
• Understand the value of metadata, lineage, and data product ownership
• Have experience building or contributing to internal platforms, developer tools or shared services
• Are comfortable with Python and CLI tooling, and ideally know SQL as well
• Are pragmatic: you can balance speed and structure, you ship iteratively, and you’re not religious about tools
• Can work across team boundaries, explain your thinking clearly, and enjoy helping others succeed

Nice to Have
• Experience integrating with tools like Power BI, Backstage, or open metadata catalogs
• Familiarity with Delta Lake and lakehouse architectures
• Background in data engineering, operations, or developer experience roles

We recruit on a rolling basis, which means an assignment may close before the stated deadline. If you are interested, we recommend that you apply immediately.