Important note: Due to the nature of the role, please do not apply for this opportunity if you need to do so through or with a third party; no 1099 or C2C candidates will be considered.
Summary: Optomi, in partnership with a flagship institution in investment and finance, is seeking a Tech Lead. This player-coach will lead a high-impact team of software and data engineers building greenfield big-data and streaming applications to comply with a new SEC mandate.
Important note on location: Candidates must be either local to or within commuting distance of offices in either Rockville, Maryland (preferred) or Woodbridge, New Jersey.
Candidates will only be required to come into the office 2-4 times a month, plus as needed, depending on candidate flexibility and comfort.
Experience of the Right Candidate:
7+ years of professional experience in Software and/or Data Engineering roles
7+ years of professional experience supporting ETL-intensive/centric projects
5+ years of professional experience supporting big data projects
Past experience leading teams of engineers in a player-coach capacity
Believes in Scrum/Agile and has deep experience delivering software on teams that use Scrum/Agile methodology
Technical Expertise of the Right Candidate:
Strong experience in Java; Scala experience preferred
Strong experience with big data technologies such as AWS EMR, AWS EKS, and Apache Spark
Strong experience with serverless technologies such as Amazon DynamoDB and AWS Lambda
Strong experience processing JSON and CSV files
Must be able to write complex SQL queries
Experience in performance tuning and optimization
Familiar with columnar storage formats (ORC, Parquet) and various compression techniques
Experience with unit testing using JUnit or ScalaTest
Proficiency with Git, Maven, and Gradle
Experience conducting code reviews
Experience with CI/CD pipelines
What the Right Candidate Will Enjoy:
Understanding complex business requirements
Designing and developing ETL pipelines for collecting, validating, and transforming data according to specifications
Developing automated unit tests, functional tests, and performance tests
Maintaining optimal data pipeline architecture
Designing ETL jobs for optimal execution in an AWS cloud environment
Reducing the processing time and cost of ETL workloads
Leading peer reviews and design/code review meetings
Supporting the production operations team
Implementing data quality checks
Identifying areas where machine learning can be used to detect data anomalies