Senior Data Engineer at Kellton in Charlotte, North Carolina

Posted in Other about 3 hours ago.

Type: full-time

Job Description:

Position: Sr. Data Engineer

Work Location: Remote (working CST or EST hours), with quarterly travel for PI planning

Duration: 6 Months (Contract-to-Hire)

Manager Notes:

Sr. Data Engineer

Hands-on experience across several data engineering projects

• Strong Snowflake expertise, including administration
• Advanced SQL and Python skills
• Extracting data from APIs
• ETL processes and data ingestion
• AWS (Lambda, Airflow)
• DBT (nice to have)

Job Summary

We are looking for a hands-on Senior Data Engineer with expertise in developing data ingestion pipelines. This role is crucial in designing, building, and maintaining our data infrastructure, focusing on creating scalable pipelines, ensuring data integrity, and optimizing performance.

Key skills include strong Snowflake expertise, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using AWS Airflow is essential, and experience with Fivetran and DBT is a plus.
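To illustrate the API-extraction pattern described above, here is a minimal Python sketch (not part of the posting itself) of an AWS Lambda handler that pulls records from a REST API and lands them in S3 for Snowflake ingestion. The endpoint, bucket, and key names are hypothetical placeholders.

import json
import datetime

import boto3     # AWS SDK, available in the Lambda runtime
import requests  # assumed to be packaged with the function (not in the base runtime)

API_URL = "https://api.example.com/v1/orders"   # hypothetical source API
S3_BUCKET = "example-raw-landing"               # hypothetical landing bucket / Snowflake external stage

s3 = boto3.client("s3")


def handler(event, context):
    """Pull one batch of records from the source API and land it in S3 as JSON."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    records = response.json()  # assumes the API returns a JSON array

    key = f"orders/dt={datetime.date.today():%Y-%m-%d}/extract.json"
    s3.put_object(Bucket=S3_BUCKET, Key=key, Body=json.dumps(records))

    # Downstream, a Snowflake COPY INTO statement or Snowpipe would ingest the landed file.
    return {"records": len(records), "s3_key": key}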

Job Responsibilities
• Design, build, test, and implement scalable data pipelines using Python and SQL.
• Maintain and optimize our Snowflake data warehouse, including data ingestion and query performance.
• Extract data from APIs using Python and AWS Lambda, and automate workflows with AWS Airflow (a minimal illustrative DAG sketch follows this list).
• Perform analysis and critical thinking to troubleshoot data-related issues and implement checks/scripts to enhance data quality.
• Collaborate with other data engineers and architects to develop new pipelines and/or optimize existing ones.
• Maintain code via CI/CD processes as defined in our Azure DevOps platform.
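
For the workflow-automation responsibility above, a minimal Airflow 2.x DAG sketch (illustrative only; the dag_id, schedule, and task callables are hypothetical placeholders, not the team's actual pipeline):

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_api():
    """Placeholder for the API-extraction step (e.g., invoking the Lambda sketched above)."""


def load_to_snowflake():
    """Placeholder for loading the landed files into Snowflake (COPY INTO / Snowpipe)."""


with DAG(
    dag_id="orders_ingestion",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; earlier 2.x versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_from_api", python_callable=extract_from_api)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

    extract >> load  # run the Snowflake load only after extraction succeeds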

Job Qualifications
• 7+ years of experience in Data Engineering roles, with a focus on building and implementing scalable data pipelines for data ingestion.
• Expertise in Snowflake, including data ingestion and performance optimization.
• Strong SQL skills for writing efficient queries and optimizing existing ones.
• Proficiency in Python for data extraction from APIs using AWS Lambda, Glue, etc.
• Experience with AWS services such as Lambda, Airflow, Glue, S3, SNS, etc.
• Highly self-motivated and detail-oriented with strong communication skills.
• Familiarity with ETL/ELT processes.
• Experience with Fivetran and DBT is a plus.