GSET Data Platform Eng - GBM Public - Software Engineer - NYC

Goldman Sachs
New York, NY
$110,000 - $130,000

About The Position

As a Data Platform & Analytics Engineer, you will design, build, and maintain the large-volume, high-throughput, low-latency post-trade stack used to source, transform, persist, and distribute Equities trading data to consumers across Regulatory, Compliance, Billing, Risk, Analytics, and Client Reporting. This stack is being rebuilt as a cloud-native platform, presenting a once-in-a-generation opportunity to architect high-volume (multi-billion messages/day), low-latency streaming solutions. This is an analytics-heavy role: your work helps optimize the routing decisions of trading algorithms for exchange fees and rebates. We are currently integrating agentic AI capabilities into our Analytics platform to deepen our client outreach. You will also be responsible for developing integrated data quality controls and visibility/entitlement platforms that enforce fine-grained data access and validate data in real time for completeness and accuracy.
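
To give a flavor of the source-transform-distribute streaming work described above, here is a minimal sketch in Java using Kafka Streams (Kafka appears under Nice To Haves below). The topic names, string payloads, and the placeholder completeness check are illustrative assumptions for this posting, not a description of the actual platform.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

// Illustrative sketch only: consume raw post-trade messages, apply a simple
// completeness check (a stand-in for real data quality controls), lightly
// normalize the payload, and route records downstream. Topic names and the
// message format are hypothetical.
public class PostTradePipelineSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "post-trade-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("equities-trades-raw");

        raw.filter((key, value) -> value != null && value.contains("tradeId")) // completeness check
           .mapValues(String::trim)                                            // placeholder transform
           .to("equities-trades-enriched");                                    // distribute downstream

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

In a production post-trade platform the sourcing, transformation, persistence, and distribution stages would be separate, schema-aware services; this sketch only shows the general shape of a streaming topology.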

Requirements

  • Minimum 1 year of work experience building high-performance server-side or data-processing applications
  • Minimum 1 year of Java or 2 years of Python programming experience
  • Exposure to scripting languages, relational databases (OLTP/OLAP), and cloud services (Snowflake/Singlestore)
  • Clear understanding of algorithms and data structures
  • Familiarity with core programming concepts and techniques (e.g. concurrency, memory management, I/O), performance optimization, and high-volume, near-real-time processing
  • Comfortable with standard SDLC tools, e.g. version control systems, build and debugging tools
  • Strong written and oral communication skills
  • Highly motivated, committed and capable of working against timelines with minimal guidance

Nice To Haves

  • Finance domain expertise and exposure to trading or investment banking
  • Bachelor’s degree / Master’s degree in Computer Science, Computer Engineering or related field
  • Experience with several of the following:
      ◦ Large-scale, distributed enterprise systems (e.g. Flink, Kafka, Spark, Hadoop)
      ◦ High-performance, high-availability systems
      ◦ Databases/cloud services, including Snowflake/Singlestore
      ◦ Agentic AI workflows and exposure to working with generative AI models

Responsibilities

  • Implementing and/or enhancing the Data Platform ecosystem, which must address scale, resiliency, and performance/throughput while making optimal use of resources
  • Working with Trading and Post-Trade teams, as well as Regulatory and Compliance operations and coverage teams, to ensure smooth rollout and migration of upgrades and changes
  • Liaising with senior stakeholders across the firm to implement technology strategies
  • Working on the latest technologies in high-performance computing, cloud stack, big data, and distributed processing
  • Rolling out new agentic AI capabilities to drive operational efficiency and revenue growth