If you are passionate about music and data, this is for you! We've been exclusively retained for a unique opportunity: a role with direct impact on shaping the company's data infrastructure from the ground up.
Role Overview
As an Analytics Engineer, you'll be instrumental in the design, implementation, and optimization of the company's new Snowflake-based data warehouse. Working closely with data analysts, data scientists, and business teams, you'll develop scalable data models and efficient ETL/ELT processes that transform raw data into actionable insights. This role offers you the chance to shape data infrastructure that will drive the company's success.
Key Responsibilities
Data Warehouse Construction: Design, build, and optimize a scalable Snowflake data warehouse that enables cross-team data accessibility.
DBT Development: Lead the creation and maintenance of well-documented DBT data models, shaping the transformation pipeline.
CI/CD Deployment: Implement and manage CI/CD pipelines to ensure smooth testing, deployment, and monitoring of data pipelines and DBT models.
ETL/ELT Mastery: Architect scalable ETL/ELT processes to ensure a continuous flow of accurate and business-ready data.
Data Advocacy: Partner with data analysts, scientists, and other business teams to drive faster, smarter decisions by understanding and meeting their data needs.
Optimize for Scale: Maintain Snowflake's high performance, preparing it to handle growing data volumes with optimal query efficiency.
Data Quality & Governance: Establish rigorous data quality checks and governance standards to support informed decision-making.
Documentation & Knowledge Sharing: Create and maintain clear documentation for all pipelines, models, and processes to foster team collaboration and a data-driven culture.
Innovate & Improve: Stay updated on the latest trends in analytics engineering to drive creativity and continuous improvement in data architecture.
Skills & Qualifications
2-5 years in analytics engineering, data engineering, or a similar role with demonstrated success in building scalable data systems.
Advanced DBT skills for data modeling and transformation.
Deep experience with Snowflake (or similar cloud data warehouses), including performance tuning.
Proficiency in SQL and experience with complex query optimization.
Familiarity with Airflow and Python is a plus.
Strong experience with data modeling (e.g., star or snowflake schema) and ensuring data quality.
Ability to translate complex technical details into clear insights for non-technical stakeholders.
Proficiency in Git and version control practices for collaborative coding.
Preferred Qualifications
Familiarity with data visualization tools like Looker or Tableau.
Knowledge of data security best practices, especially for sensitive data.
Experience with ETL/ELT tools (e.g., Fivetran, Stitch) for efficient data movement.
Candidates must be authorized to work in the US to be considered. This client is unfortunately unable to sponsor visas.
*An advert never does a role justice, so if you're not sure, feel free to apply and one of our consultants will give you a call with a more detailed overview!*