We are seeking motivated, proactive Software Engineers with expertise in Java, Kafka and real-time messaging, Apache Flink, and MQ technologies to design, develop, and maintain a foundational data hub. This framework will enable seamless, event-driven communication and data streaming to support near-real-time risk reporting, analytics, and business reporting.
Responsibilities:
Develop and deploy real-time messaging applications using Kafka.
Design and implement Kafka producers and consumers for high-throughput, low-latency data processing in a trading environment.
Integrate Kafka with various trading platforms and financial systems.
Troubleshoot Kafka-related issues and optimize performance for high-frequency trading scenarios.
Leverage Apache Flink for real-time stream processing, including event-driven data transformations and aggregations.
Qualifications:
Bachelor's degree in Computer Science, Software Engineering, or a related field.
Strong Java development experience with expertise in real-time messaging and Kafka.
Experience integrating Kafka with trading systems and managing high-volume, low-latency data streams.
Proficiency in Apache Flink for stream processing and real-time data analytics.
Familiarity with event-driven architecture, distributed systems, and fault tolerance principles.
Proficiency with Apache messaging technologies (e.g., Apache ActiveMQ, Apache Kafka) and MQ systems (e.g., IBM MQ, TIBCO EMS).
Experience with Docker, Kubernetes, and microservices architecture is a plus.
Strong understanding of message queuing, reliability, and fault-tolerant systems.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, or national origin.