Own your future
Our culture isn't something people join; it's something they build and shape. We believe that every person deserves to be heard and empowered. If you're on the fence about whether you're a fit, we say go for it. Let's build something great together.
Key Responsibilities
- Design and build analytics-ready datasets and implement a semantic layer for efficient data consumption.
- Develop and maintain dbt models, ensuring modularity, reusability, and extensibility.
- Ensure data quality through robust testing, clear data lineage, and well-defined semantics.
- Optimize query performance and implement materialization strategies to enhance scalability.
- Collaborate with cross-functional teams to address data needs and maximize platform value.
- Document data models and establish best practices for long-term maintainability and usability.
Must-haves
- Proficient in SQL (complex queries, modularization, optimization for performance and readability)
- Familiarity with Snowflake
- Hands-on experience with dbt for transforming data and managing ELT pipelines
- Upper-Intermediate+ English and the ability to communicate effectively with international teams
Nice to have
- Experience with Python, Java, or other programming languages
- Experience with data orchestration tools (Dagster, Airflow)
- Familiarity with data visualization tools