Sr. Data & ML Operations Engineer

Concora Credit Inc. · Beaverton, OR

About The Position

As a Sr. Data & ML Operations Engineer, you'll help drive Concora Credit's mission to enable customers to Do More with Credit – every single day.

The impact you'll have at Concora Credit: you'll ensure the reliability, stability, and operational excellence of enterprise Data and ML/AI platforms. You'll be responsible for supporting end-to-end data operations, including ETL/ELT pipelines, reporting and dashboarding workloads, and advanced analytics and machine learning workflows, ensuring they run consistently and efficiently across Azure services such as Databricks, Data Factory, Data Lake, SQL Server, and Power BI. This role requires close coordination with business stakeholders, data engineers, ML engineers, and BI engineers to deliver accurate, timely datasets and reports that align with business needs. You'll also provide operational support for Fivetran, MOVEit Automation, GoAnywhere, and enterprise file-movement processes.

We hire people, not positions. That's because, at Concora Credit, we put people first, including our customers, partners, and Team Members. Concora Credit is guided by a single purpose: to help non-prime customers do more with credit. Today, we have helped millions of customers access credit, and our industry leadership, resilience, and willingness to adapt ensure we can help our partners responsibly say yes to millions more. As a company grounded in entrepreneurship, we're expanding our team and looking for people who foster innovation, strive to make an impact, and want to Do More! We're an established company with over 20 years of experience, but now we're taking things to the next level. We're seeking someone who wants to impact the business and play a pivotal role in leading the charge for change.

Requirements

  • 5-7 years of experience in Data Operations, ML Operations, or Data Engineering roles.
  • Extensive hands-on experience with Azure Databricks (including Unity Catalog), ADF, and ADLS.
  • Strong experience with SQL, Python, and PySpark.
  • Experience working in regulated environments (e.g., finance).
  • Familiarity with secure file transfer tools.
  • Experience with monitoring tools (Azure Monitor, Log Analytics, etc.).
  • Solid understanding of data warehousing, ETL/ELT, and data modeling best practices.
  • Experience with version control and CI/CD pipelines (Azure DevOps, GitHub, etc.).
  • Knowledge of RBAC, data security, and governance in cloud environments.
  • Knowledge of Spark performance tuning, partitioning, and job orchestration.
  • Bachelor’s or Master’s degree (or equivalent) in Computer Science or related field.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration abilities across technical and non-technical teams.
  • Ability to work independently in a fast-paced, agile environment.
  • Passion for delivering clean, high-quality, and maintainable code.

Responsibilities

  • Manage and monitor ETL/ELT pipelines, analytics reporting jobs, and ML workflows across Azure services, ensuring SLA compliance and the timely resolution of incidents, failures, and performance bottlenecks.
  • Proactively monitor platform health, performance, and SLAs using Databricks monitoring, logging, and alerting to minimize downtime and data issues.
  • Perform root cause analysis and implement long-term fixes that prevent recurring issues and improve overall operational reliability.
  • Contribute to platform automation, CI/CD for data/ML pipelines, infrastructure as code, and disaster recovery planning.
  • Partner with data engineers, ML engineers, and BI engineers to troubleshoot issues, implement improvements, automate operations, and drive continuous enhancement of data/ML platform reliability.
  • Provide operational support for MOVEit Automation, GoAnywhere, and secure file transfer processes.
  • Operate and support Fivetran for data ingestion, ensuring connector health and sync reliability.
  • Manage tokenization processes to ensure secure handling of sensitive data, including tokenization, detokenization, and PCI compliance.
  • Audit PCI controls regularly to maintain compliance.
  • Develop and maintain code enhancements and automation scripts that support both data engineering and operational needs.
  • Provide after-hours support for critical systems and applications.
  • Develop and maintain documentation, runbooks, and operational standards.
  • These duties must be performed with or without reasonable accommodation.

Benefits

  • Medical, Dental and Vision insurance for you and your family
  • Relax and recharge with Paid Time Off (PTO)
  • 6 company-observed paid holidays, plus 3 paid floating holidays
  • 401k (after 90 days) plus employer match up to 4%
  • Pet Insurance for your furry family members
  • Wellness perks including onsite fitness equipment at both locations, an Employee Assistance Program (EAP), and access to the Headspace app
  • We invest in your future through Tuition Reimbursement
  • Save on taxes with Flexible Spending Accounts
  • Peace of mind with Life and AD&D Insurance
  • Protect yourself with company-paid Long-Term Disability and voluntary Short-Term Disability