You will work with technologies that include Python, AWS, Snowflake, Databricks, and Airflow.
What You'll Do As a Sr Data Engineer:
Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
Build and maintain APIs to expose data to downstream applications
Develop real-time streaming data pipelines using a tech stack that includes Airflow, Spark, Databricks, Delta Lake, and Snowflake
Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
Ensure high operational efficiency and quality of Core Data platform datasets so that our solutions meet SLAs and deliver reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
Qualifications:
You could be a great fit if you have:
7+ years of data engineering experience developing large data pipelines
Proficiency in at least one major programming language (e.g., Python, Java, Scala)
Hands-on production environment experience with distributed processing systems such as Spark
Hands-on production experience creating and maintaining data pipelines with orchestration systems such as Airflow
Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
Experience developing APIs with GraphQL
Advanced understanding of OLTP vs. OLAP environments
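
The OLTP vs. OLAP distinction in the last requirement can be sketched with a toy example: OLTP workloads make small, frequent, row-level transactional writes, while OLAP workloads run scan-heavy aggregates across many rows. The snippet below uses Python's built-in sqlite3 purely for illustration; the `orders` table and its columns are hypothetical, not part of this role's actual stack.

```python
import sqlite3

# Hypothetical orders table, used only to contrast access patterns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

# OLTP-style access: small, frequent, row-level writes inside a transaction.
with conn:
    conn.execute("INSERT INTO orders (region, amount) VALUES (?, ?)", ("EMEA", 120.0))
    conn.execute("INSERT INTO orders (region, amount) VALUES (?, ?)", ("APAC", 75.5))

# OLAP-style access: a scan-heavy aggregate over the whole table, the kind of
# query typically pushed to a warehouse (Snowflake, BigQuery) rather than a
# transactional store.
totals = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(totals)  # [('APAC', 75.5), ('EMEA', 120.0)]
```

In practice the two patterns are served by differently optimized systems: row-oriented databases for the transactional writes, column-oriented MPP warehouses for the aggregates.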