Own your future:
Our culture isn't something people join; it's something they build and shape. We believe that every person deserves to be heard and empowered. If you're on the fence about whether you're a fit, we say go for it. Let's build something great together.
Key Responsibilities:
- Design and build efficient, robust, and reliable data processing pipelines.
- Support the software architecture, design, and implementation of the data platform.
- Optimize data processing by balancing performance, cost, and storage trade-offs using various cloud technologies.
- Contribute to architectural decisions, infrastructure setup, and production operations.
- Design and develop analytics-ready datasets, including implementing a semantic layer for data consumption.
- Create and maintain dbt models with a focus on modularity, reusability, and extensibility.
- Ensure data quality through comprehensive testing and by defining clear data lineage and semantics.
- Optimize query performance and implement materialization strategies to ensure scalability.
- Collaborate with cross-functional teams to address data needs and maximize platform value.
- Document data models and establish best practices for maintainability and usability.
Must-Haves:
- Proven experience in designing data models and processing pipelines.
- Proficiency in SQL (e.g., writing complex queries, modularization, and optimizing for performance and readability).
- Experience with a programming language such as Python (preferred) or Java.
- Familiarity with the modern data stack and cloud-native data platforms (e.g., Snowflake, BigQuery, Redshift).
- Hands-on experience with dbt or similar tools for data transformation and ELT pipeline management.
- Experience with data orchestration tools (e.g., Dagster, Airflow).
- Excellent communication skills, including experience collaborating directly with senior stakeholders.
- Analytical mindset with the ability to effectively communicate technical concepts to team members and stakeholders.
- Upper-Intermediate or higher level of English.
Nice to Have:
- Experience with GitOps and continuous delivery for data pipelines.
- Familiarity with Infrastructure-as-Code tools (e.g., Terraform).
- Knowledge of data visualization tools.