About The Position

As a Senior Quality Engineer for the Enterprise Data Integration Framework (EDIF) at LPL Financial, you will play a critical role in ensuring the quality, reliability, and performance of our foundational data integration solutions. You will be responsible for designing, developing, and executing comprehensive test strategies for large-scale data pipelines and integration services, contributing to the integrity of LPL's core data assets.

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, Information Systems, or a related discipline.
  • 5+ years of experience in Quality Engineering, Data Quality, or Test Automation, with a strong focus on data-centric applications and data integration platforms.
  • 5+ years of experience validating data pipelines, ETL/ELT workflows, and distributed data systems, preferably within enterprise-scale data integration frameworks.
  • 5+ years of experience with SQL for complex data querying, reconciliation, and validation.
  • 5+ years of experience with Python for test automation, data validation, and scripting.
  • 3+ years of experience with AWS cloud services, including S3, Lambda, Glue, Step Functions, CloudWatch, IAM, and Athena; experience with other cloud platforms (Azure, GCP) is a plus.
  • Strong communication skills with the ability to explain complex technical issues to both technical and non-technical stakeholders.
  • Ability to work independently while thriving in a highly collaborative, fast-paced environment.

Nice To Haves

  • Strong understanding of data warehousing concepts, data modeling, and relational database management systems.
  • Solid knowledge of data quality and validation concepts, including schema evolution, referential integrity, reconciliation, negative testing, and edge-case handling.
  • Experience using Git-based version control and integrating tests into CI/CD pipelines.
  • Familiarity with Agile/Scrum development methodologies and working within iterative delivery models.
  • Excellent analytical, debugging, and problem-solving skills, with strong attention to detail.
  • Financial services industry experience is highly desirable.

Responsibilities

  • Design, develop, and implement robust test plans, test cases, and test scripts for complex EDIF data integration and ingestion workflows.
  • Perform functional, integration, system, regression, performance, and data validation testing across diverse data sources and targets.
  • Build and maintain automated test suites for AWS-based data pipelines using Python, PySpark, SQL, and AWS test harnesses.
  • Validate end-to-end ingestion workflows across Amazon S3, AWS Lambda, AWS Glue, Step Functions, EventBridge, and downstream data stores.
  • Verify schema evolution, data transformations, and enrichment logic applied in Glue ETL jobs.
  • Create and maintain reusable test datasets, mock files, and validation utilities supporting EDIF microbatch processing.
  • Perform in-depth data analysis and reconciliation to ensure data accuracy, completeness, and consistency throughout the EDIF platform.
  • Validate and monitor EDIF’s event-driven observability, including event lifecycle logging, CloudWatch metrics, error events, alerts, and recovery workflows.
  • Identify, document, prioritize, and track defects; partner with development teams for timely resolution and retesting.
  • Work closely with data engineers, developers, and architects to understand data models, data flows, and business requirements to ensure comprehensive test coverage.
  • Partner with Data Engineers to ensure code is testable, resilient, and aligned with QA automation best practices.
  • Participate in code reviews and design discussions, providing input on testability, quality, and potential risks.
  • Participate in release management, change control, and pre-deployment quality gates to ensure readiness across all ingestion pipelines.
  • Support nightly ingestion cycles by maintaining test coverage for new vendors, schema changes, schedules, and data-quality requirements.
  • Contribute to the ongoing enhancement of QA processes, methodologies, automation maturity, data quality controls, and QA standards within the team.
  • Mentor junior quality engineers and share best practices in data quality assurance.
  • Stay current with industry best practices in data quality, data governance, and data integration technologies.
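The reconciliation and data-validation duties above can be pictured as a key-based comparison between a source extract and its ingested target. The sketch below is illustrative only; the function, record layout, and field names (e.g. `account_id`) are hypothetical and not part of the EDIF platform:

```python
# Minimal sketch of a source-to-target reconciliation check (plain Python,
# no real EDIF dependencies). Field names are hypothetical.

def reconcile(source_rows, target_rows, key="account_id"):
    """Compare two extracts by primary key and report discrepancies."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    missing = sorted(set(src) - set(tgt))        # rows dropped during ingestion
    unexpected = sorted(set(tgt) - set(src))     # rows with no source origin
    mismatched = sorted(                         # values changed in flight
        k for k in set(src) & set(tgt) if src[k] != tgt[k]
    )
    return {
        "missing_in_target": missing,
        "unexpected_in_target": unexpected,
        "value_mismatches": mismatched,
    }

source = [{"account_id": 1, "balance": 100.0},
          {"account_id": 2, "balance": 250.0},
          {"account_id": 3, "balance": 75.0}]
target = [{"account_id": 1, "balance": 100.0},
          {"account_id": 2, "balance": 251.0}]  # row 3 dropped, row 2 altered

report = reconcile(source, target)
print(report)
```

In practice this kind of check would run against database query results or S3 extracts rather than in-memory lists, and would typically be wrapped in an automated test suite so it executes on every ingestion cycle.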

Benefits

  • 401(k) matching
  • Health benefits
  • Employee stock options
  • Paid time off
  • Volunteer time off
© 2024 Teal Labs, Inc