Data Engineer

MLG Capital
Goerke's Corners, WI

About The Position

About MLG Capital

MLG Capital is a private real estate investment manager focused on delivering long-term, tax-efficient, risk-adjusted returns through diversified real estate strategies across the United States. As the firm continues to scale its data capabilities, the Data Engineering team plays a critical role in enabling faster insights, stronger analytics, and better decision-making across acquisitions, asset management, investor operations, and portfolio management.

Role Overview

We are hiring a second Data Engineer to expand our Data Engineering capabilities and accelerate delivery across the organization. Within six months, this role is expected to take on approximately 50% of the current Data Engineering workload, materially increasing team capacity and enabling faster execution of our roadmap. The position is ideal for an engineer who wants ownership, greenfield build opportunities, and direct visibility into how their pipelines power the business. The role is 80% hands-on data engineering and 20% business analysis, partnering with stakeholders across multiple business lines.

Requirements

  • Proficiency in SQL.
  • Programming experience in Python, Java, and/or R.
  • Strong documentation skills, including code documentation and translating business needs into technical requirements.
  • Experience collaborating through Git-based workflows.
  • Strong communication skills with the ability to partner with senior developers, business users, executives, and other stakeholders.
  • Exposure to BI tools, with Power BI experience preferred.

Nice To Haves

  • Background in real estate, finance, or investment management.

Responsibilities

  • Design, build, and maintain robust data pipelines that ingest from new and existing sources; continuously optimize for performance, reliability, and scalability (a minimal ingestion sketch follows this list).
  • Model and expand schemas in the corporate Data Lake (DL) to support current workloads and future analytics/AI use cases.
  • Centralize and standardize business metrics across datasets, ensuring consistent definitions for downstream BI and analytics (see the metric-definition sketch after this list).
  • Enhance and refactor existing pipelines to reduce latency, improve fault tolerance, and simplify maintenance.
  • Work in a modern, collaborative development workflow built on Azure and Azure Data Factory (ADF); a pipeline-trigger sketch follows this list.
  • Create efficiencies that allow the data team to support more business initiatives simultaneously.
  • Translate business requirements into technical specs in partnership with Portfolio Management, Asset Management, Fund Accounting, and other stakeholders; validate source-to-metric logic end-to-end.
  • Own data quality and metric validation across reporting and BI workflows (including Power BI exposure), supporting reliable decision-making.
  • Increase overall Data Engineering throughput and team capacity, taking on a substantial share of pipeline ownership to accelerate roadmap delivery.
  • Contribute to a scalable data engineering foundation that enables faster analytics today and internal AI tooling tomorrow.
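
To make the pipeline work concrete, here is a minimal Python sketch of the kind of ingestion described above. It assumes a hypothetical REST source and a hypothetical parquet landing path in the Data Lake; every URL, path, and column name below is a placeholder, not part of MLG Capital's actual stack.

    import pandas as pd
    import requests

    SOURCE_URL = "https://example.com/api/leases"   # placeholder source endpoint
    LANDING_PATH = "lake/raw/leases.parquet"        # placeholder Data Lake landing path

    def ingest_leases() -> pd.DataFrame:
        """Pull raw lease records, normalize types, and land them as parquet."""
        resp = requests.get(SOURCE_URL, timeout=30)
        resp.raise_for_status()                     # fail fast so bad loads never land
        df = pd.DataFrame(resp.json())
        df["as_of_date"] = pd.to_datetime(df["as_of_date"])  # assumed date column
        df.to_parquet(LANDING_PATH, index=False)    # overwrite keeps the load idempotent
        return df

    if __name__ == "__main__":
        print(f"landed {len(ingest_leases())} rows")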
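
The ADF bullet above references a pipeline-trigger sketch. Orchestration itself lives in ADF, but runs can be started and monitored from Python with the azure-identity and azure-mgmt-datafactory packages; the subscription, resource group, factory, and pipeline names below are placeholders for illustration only.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    # Placeholder identifiers -- substitute real Azure resources.
    SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
    RESOURCE_GROUP = "rg-data"
    FACTORY_NAME = "adf-example"

    adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Kick off a pipeline run with a parameter, then fetch its status by run id.
    run = adf.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, "ingest_leases",
        parameters={"as_of_date": "2024-01-31"},
    )
    status = adf.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
    print(status.status)  # e.g. "InProgress" or "Succeeded"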
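
Finally, a sketch of what centralized metric definitions and end-to-end validation can look like. Net operating income (NOI) is used only as an illustrative metric; the column names and tolerance are assumptions, not MLG Capital's actual definitions.

    import pandas as pd

    def net_operating_income(df: pd.DataFrame) -> pd.Series:
        """One shared definition of the metric, so every downstream report agrees."""
        return df["revenue"] - df["operating_expenses"]  # assumed source columns

    def validate_noi(source: pd.DataFrame, reported: pd.DataFrame, tol: float = 0.01) -> None:
        """Recompute the metric from source data and reconcile it against BI output."""
        expected = net_operating_income(source).sum()
        actual = reported["noi"].sum()                   # assumed reported column
        if abs(expected - actual) > tol * max(abs(expected), 1.0):
            raise ValueError(f"NOI mismatch: source={expected:,.2f}, reported={actual:,.2f}")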