SDET (Software Development Engineer in Test) #TS-9035
Our client, an AWS-partnered analytics organization, owns the world's largest financial data store and runs continuous analytics on the movement of global stock data, with the goal of staying ahead of potential bad actors in the market. Highlights:
We own the world's largest financial data store (37 petabytes and growing) and analyze 155+ billion financial transactions daily, more than Twitter, Visa®, PayPal, and Facebook combined.
Leading innovator in Machine Learning/AI, Big Data, AWS, and trading algorithms.
AWS Select Partner: one of the largest and most distinctive partnerships formed with AWS.
Deep culture of internal upskilling
Named the 2020 #1 best place to work among US organizations with up to 5,000 employees.
Must-Haves:
Java or Python (Scala may be considered, but Java or Python is preferred):
The existing codebase is in Java; new pipelines are being written in Python.
Strong SQL (the majority of the code you'll be working with is SQL).
Production-quality data pipeline experience.
Bonus:
Big Data
About Us:
Unisys is at the forefront of innovation, dedicated to ensuring market integrity and developing secure, data-driven applications for our clients. Our Market Surveillance team is expanding, and we are looking for a talented SDET to build and maintain testing frameworks for market surveillance patterns and ensure data pipelines meet the highest standards of quality and performance.
Role Overview:
As an SDET on the Market Surveillance team, you'll play a critical role in developing automated testing solutions and ensuring the robustness of our surveillance and analytics systems. This position requires a combination of strong coding, testing, and data handling skills.
Key Responsibilities:
Develop and execute test plans and automated test scripts (unit, integration, end-to-end).
Design and extend testing frameworks to support new and existing data pipelines.
Write production-level code in Java or Python (Scala is also considered).
Perform thorough data validation using complex SQL queries and other database concepts.
Collaborate with engineers and analysts to maintain the reliability and performance of critical systems.
Build and maintain automation scripts for data pipelines and ETL processes.
Create and support testing solutions in cloud environments like AWS.
Must-Have Qualifications:
Proficiency in Java or Python (legacy code in Java; new pipelines in Python).
Strong SQL skills for data validation and querying.
Experience with production-quality data pipelines (or a standout skill like Machine Learning to make up for limited pipeline experience).
Solid foundation in testing principles, frameworks, and methodologies.
Competency in Bash/Perl scripting.
Ability to work effectively in a remote team environment with strong communication skills.
3-7 years of experience in software testing and development.
Bonus Skills:
Big Data tools (e.g., Spark, Hadoop) and distributed data experience.
Familiarity with AWS and the AWS CLI.
Understanding of supervised ML concepts.
Experience with event streaming tools like Kafka.
Why Join Us?
Competitive salary and benefits package.
Opportunity to work on high-impact, cutting-edge data surveillance technologies.
Collaborative, inclusive work culture with a focus on professional growth.
Flexibility for remote work if you bring specialized skills such as Big Data experience.
How to Apply:
If you're passionate about data quality, automated testing, and making a meaningful impact in market surveillance, we want to hear from you! Please apply directly through our LinkedIn job posting or send your resume and cover letter to Gaurav.gosavi@unisys.com.