Position: Sr. Data Engineer (AWS, PySpark/Snowflake/Glue, Tableau or MySQL)
Location: 100% Remote, EST Hours
Duration: 6+ month contract, with expected extension
Hourly Rate: $65/hr to $75/hr (W2 only)
Responsibilities:
Runs a strategic program or initiative and navigates the overall landscape/domain
Helps set the strategic agenda and defines near- and long-term priorities
Convenes leadership and associated committees/groups
Ensures adherence to Firm processes
Orchestrates and prioritizes the portfolio of projects/initiatives or topics that are critical to the agenda
Acts as a catalyst to drive and implement change management efforts
Partners closely with collaborating groups and stakeholders to enlist support and drive adoption
Serves as an adviser/coach to business sponsors to further enhance change programs as appropriate (e.g., capability building, role modelling, new formal mechanisms)
Works closely with leaders, acting as an action officer and advising with relevant domain expertise, such as shaping performance dialogues
Orchestrates the resources and colleagues/stakeholders within and associated with the efforts. Ensures connection to best thinking in the Firm on the topic.
What Gets You The Job:
5-7 years of experience as a Data Engineer, with extensive hands-on experience in PySpark, Spark, Snowflake, AWS Glue, and advanced, complex SQL.
AWS expertise: Lambda, Glue, S3, etc.
Experience in software development, CI/CD, Agile methodology
Experience with Big Data platforms such as Hadoop, HBase, CouchDB, Hive, Pig, etc. Experience with Teradata, Oracle, MySQL, Tableau, QlikView, or similar reporting and BI packages.
Proficiency in SQL, PL/SQL, and similar languages, as well as UNIX shell scripting.
Experience with data modeling, design patterns, and building highly scalable and secure solutions. Knowledge of the Agile software development process and familiarity with performance metric tools.
Experience working with JSON, YAML, XML, etc.
Data governance knowledge (metadata management, data catalog/access). Experience with ETL scheduling tools such as Apache Airflow.
Nice to have: SAP SuccessFactors experience.
Education:
Bachelor's or Master's degree from an accredited college/university in a business- or technology-related field.
Relevant AWS certifications, e.g., Cloud Practitioner or Solutions Architect Associate.
Migration and data integration strategies/certifications.
Experience with MPP data warehouses, e.g., Snowflake, BigQuery, Redshift.
Please send your resume to Dave Lim, Senior Technical Recruiter, for immediate consideration.
Irvine Technology Corporation (ITC) is a leading provider of technology and staffing solutions for IT, Security, Engineering, and Interactive Design disciplines, serving clients from startups to enterprises nationally. We pride ourselves on our ability to introduce you to our intimate network of business and technology leaders, bringing you opportunity coupled with personal growth and professional development! Join us. Let us catapult your career!
Irvine Technology Corporation provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Irvine Technology Corporation complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities.