Junior Data Engineer

Superior Insurance Partners LLC, Chicago, IL
$70,000 - $85,000

About The Position

Superior Insurance Partners is a rapidly growing insurance brokerage platform focused primarily on providing commercial lines, personal lines, and employee benefit solutions to companies and individuals. Superior acquires and partners with leading independent insurance agencies, primarily in the Midwest and Eastern US. The company’s mission is to improve the lives of its agency partners. Superior does this by creating a highly tailored plan for each of its agency partners to help them achieve their goals, and by providing customized resources including accounting/finance, recruiting, HR, AMS/IT, marketing, and M&A support. Agency partners are aligned through long-term economic incentives while leveraging the benefits of best practices, scale, and resources across Superior’s shared platform.

Superior is backed by Tyree & D'Angelo Partners (“TDP”), a leading Chicago-based private equity firm that makes control ownership investments in, and partners with, lower middle market businesses with the goal of creating meaningful value for all involved. TDP is currently investing out of its third fund and has managed and created over $3 billion of capital and company enterprise value. TDP has significant experience investing in service businesses and has completed over 1,000 investment partnerships in its history.

Role Overview

We are hiring a Junior Data Engineer / Analytics Engineer to support and scale our data platform. This role focuses on building, maintaining, and improving the data ingestion and ETL pipelines that connect third-party systems (including insurance agency management and ERP systems) to our Azure-based analytics environment. This is a hands-on, backend-focused role: the successful candidate will own the operational reliability of our data pipelines and help modernize existing ETL processes to be more efficient, standardized, and scalable.

Requirements

  • Understanding of ETL concepts and data pipelines
  • Hands-on experience or strong exposure to Azure Data Factory or similar ETL tools
  • Experience working with APIs (REST), including reading documentation and handling JSON payloads
  • Familiarity with cloud-based databases such as Azure SQL, SQL Server, or similar
  • Experience writing SQL queries (joins, aggregations, filtering)
  • Strong attention to detail and data accuracy
  • Ability to document work & processes and follow defined standards

Nice To Haves

  • Experience with insurance systems such as Applied EPIC or Vertafore AMS360
  • Exposure to Power BI datasets or backend reporting models
  • Experience improving or refactoring existing ETL pipelines
  • Familiarity with basic authentication methods (API keys, OAuth, tokens)
  • Experience supporting production data environments

Responsibilities

  • Build, maintain, and enhance Azure Data Factory (ADF) pipelines
  • Own day-to-day reliability of data ingestion and ETL processes
  • Troubleshoot pipeline failures, performance issues, and data anomalies
  • Refactor and improve existing ETL processes to increase efficiency, reliability, and maintainability
  • Reduce manual data movement through automation and standard patterns
  • Support schema changes, incremental loads, and new data elements
  • Integrate data from third-party systems such as:
      ◦ Insurance agency management systems (e.g., Applied EPIC, Vertafore AMS360, or similar)
      ◦ CRM and ERP platforms
      ◦ File-based sources (CSV, Excel, SFTP)
      ◦ REST-based APIs
  • Work with API documentation to authenticate, extract, and ingest data
  • Handle pagination, rate limits, and incremental data extraction
  • Assist with vendor-provided APIs or SDKs when available
  • Write SQL queries to validate data completeness and accuracy
  • Reconcile ETL outputs to source system reports and extracts
  • Identify and resolve data quality issues
  • Assist with performance tuning, indexing, and schema cleanup
  • Create and maintain runbooks for data pipelines and ingestion processes
  • Document ETL logic, assumptions, and known limitations
  • Follow and reinforce standards for naming, structure, and deployment
  • Support change tracking and controlled releases