New York
Posted 5 months ago

A major global brokerage firm is looking for a Hadoop/Kubernetes engineer. You will work on a variety of financial and trading applications, ranging from commercial products to custom-built ones.

We have a large, complex environment built predominantly on current technology, with legacy systems largely phased out. You should have strong HDFS skills and a thorough understanding of Hadoop and how it handles Big Data. You should also have worked with Kubernetes on Azure, including AKS experience; native Kubernetes and a strong understanding of the Azure ecosystem are essential for this role. Data warehousing/data ingestion skills, preferably with Dremio, are highly desirable as well.

You will work as a mid-level engineer across the full life cycle, covering containerization, monitoring, deployment, scalability, and integration. Working between our complex proprietary tools and custom-built applications makes this an exciting and challenging environment. You should have strong automation skills together with a deep understanding of what sits under the hood, making it possible to get the best of both worlds: automation and manual customization. You should be able to manage your own workload and take on other Hadoop, Kubernetes, and cloud challenges. Any experience with market data, fixed income, or foreign exchange is a major plus.

Our work is exciting, and we focus on the latest technological advances. Our team members are all focused and knowledgeable. We promote extensive internal learning, with new skills applied immediately on the job. Full benefits, bonuses, and our industry-leading status add to employee satisfaction every step of the way.

Job Features

Job Category: Technology
Salary: $120,000–$150,000
Skills: Hadoop, Kubernetes, Azure, HDFS, AKS, Dremio, trading+
Type: Permanent
