Build the future of data. Join the Snowflake team.
Snowflake’s vision is to enable every organization to be data-driven. We’re at the forefront of innovation, helping customers realize the full potential of their data with our Data Cloud. We are now going beyond the traditional data warehouse, helping customers unlock the power of the open data lakehouse architecture with our Polaris project—an open-source implementation of the Iceberg REST catalog.
As a Senior Software Engineer on the Polaris and Data Lake Catalog team, you’ll play a key role in building and evolving our open and interoperable data lake ecosystem. You’ll work on some of the most complex and exciting challenges in distributed systems, contributing to Snowflake’s mission of providing a truly open data lake architecture, free from vendor lock-in.
AS A SENIOR SOFTWARE ENGINEER, YOU WILL:
- Design and implement scalable, distributed systems that support Iceberg DML/DDL transactions, schema evolution, partitioning, time travel, and more.
- Architect and build systems that integrate Snowflake queries with external Iceberg catalogs (e.g., AWS Glue, Databricks Unity Catalog) and various data lake architectures, enabling seamless interoperability across cloud providers.
- Develop high-performance, low-latency solutions for catalog federation, allowing customers to manage and query their data lake assets across multiple catalogs from a single interface.
- Collaborate with Snowflake’s open-source team and the Apache Iceberg community to contribute new features and enhance the Iceberg REST specification.
- Work on core data access control and governance features for Polaris, including fine-grained permissions such as row-level security, column masking, and multi-cloud federated access control.
- Contribute to our managed Polaris service, ensuring that external query engines like Spark and Trino can read from and write to Iceberg tables through Polaris in a way that’s decoupled from Snowflake’s core data platform.
- Build tooling and services that automate data lake table maintenance, including compaction, clustering, and data retention for enhanced query performance and efficiency.
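To make the interoperability work above concrete: external engines connect to an Iceberg REST catalog such as Polaris through standard Iceberg catalog properties. A minimal Spark configuration sketch is shown below—the endpoint URI, credential, and warehouse name are hypothetical placeholders, not real values:

```properties
# Register a Spark catalog named "polaris" backed by the Iceberg REST protocol.
spark.sql.catalog.polaris=org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.polaris.type=rest
# Hypothetical Polaris REST catalog endpoint and OAuth client credentials.
spark.sql.catalog.polaris.uri=https://example-polaris-endpoint/api/catalog
spark.sql.catalog.polaris.credential=<client-id>:<client-secret>
# Hypothetical catalog (warehouse) name managed by Polaris.
spark.sql.catalog.polaris.warehouse=example_catalog
```

With a configuration like this, Spark can read and write Iceberg tables through the REST catalog (e.g., `SELECT * FROM polaris.db.tbl`) without any dependency on Snowflake’s core data platform—the decoupling this role is responsible for.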
OUR IDEAL SENIOR SOFTWARE ENGINEER WILL HAVE:
- 8+ years of experience designing and building scalable, distributed systems.
- Strong programming skills in Java, Scala, or C++ with an emphasis on performance and reliability.
- Deep understanding of distributed transaction processing, concurrency control, and high-performance query engines.
- Experience with open-source data lake formats (e.g., Apache Iceberg, Parquet, Delta) and the challenges associated with multi-engine interoperability.
- Experience building cloud-native services and working with public cloud providers like AWS, Azure, or GCP.
- A passion for open-source software and community engagement, particularly in the data ecosystem.
- Familiarity with data governance, security, and access control models in distributed data systems.
BONUS POINTS FOR EXPERIENCE WITH:
- Contributing to open-source projects, especially in the data infrastructure space.
- Designing or implementing REST APIs, particularly in the context of distributed systems.
- Managing large-scale data lakes or data catalogs in production environments.
- Working on high-performance, scalable query engines such as Spark, Flink, or Trino.
WHY JOIN THE POLARIS & DATA LAKE CATALOG TEAM AT SNOWFLAKE?
- Be part of a pioneering effort to build the most open and interoperable data lake ecosystem in the industry.
- Work on a high-impact open-source project that solves real-world data challenges for enterprises such as Netflix and AWS.
- Collaborate with some of the brightest minds in the data ecosystem, including core contributors to Apache Iceberg.
- Have the opportunity to innovate in one of the fastest-growing and evolving areas in data infrastructure, where you can make a direct impact on Snowflake’s growth and the broader open-source community.
Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data, and must abide by the company’s data security plan as an essential part of their duties. Keeping customer information secure and confidential is every employee’s responsibility.