Quality Engineer II – Enterprise Data Integration Framework (EDIF)

LPL Financial – San Diego, CA (Hybrid)
Pay range: $37 - $62

About The Position

What if you could build a career where ambition meets innovation? At LPL Financial, we empower professionals to shape their success while helping clients pursue their financial goals with confidence. What if you could have access to cutting-edge resources, a collaborative environment, and the freedom to make an impact? If you're ready to take the next step, discover what’s possible with LPL Financial.

Job Overview: The Quality Engineer II, EDIF will play a crucial role in ensuring the quality, reliability, and performance of LPL Financial's Enterprise Data Integration Framework. This individual will design, develop, and execute comprehensive test strategies for data integration solutions, contributing to the integrity and accuracy of critical financial data across the organization.

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, Information Systems, or a related discipline.
  • 3+ years of experience in Quality Engineering, Data Quality, or Test Automation, with exposure to data-centric applications.
  • 3+ years of experience testing data pipelines, ETL/ELT workflows, or distributed data systems in a production or enterprise environment.
  • 3+ years of experience in SQL for complex data querying, reconciliation, and validation (a brief illustrative sketch follows this list).
  • 2+ years of experience in Python for test automation, data validation, and scripting.
  • 2+ years of experience with AWS cloud services, including S3, Lambda, Glue, Step Functions, CloudWatch, IAM, and Athena; experience with other cloud platforms (Azure, GCP) is a plus.
  • Ability to work independently while thriving in a highly collaborative, fast-paced environment.
  • Effective written and verbal communication skills for collaborating with technical team members.
  • Strong analytical and troubleshooting skills with attention to detail.
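
A minimal illustrative sketch of the SQL-and-Python reconciliation work referenced above. The table names, fixture data, and in-memory SQLite connection are hypothetical stand-ins for the real data stores and test framework used on the team, not a description of EDIF itself.

    # Illustrative only: a pytest-style row-count reconciliation check.
    import sqlite3

    RECONCILIATION_QUERY = """
        SELECT
            (SELECT COUNT(*) FROM staging_trades)   AS source_rows,
            (SELECT COUNT(*) FROM warehouse_trades) AS target_rows
    """

    def test_row_counts_reconcile():
        # Hypothetical in-memory database standing in for the environment under test.
        conn = sqlite3.connect(":memory:")
        cur = conn.cursor()

        # Seed minimal fixture data so the example runs end to end.
        cur.execute("CREATE TABLE staging_trades (id INTEGER)")
        cur.execute("CREATE TABLE warehouse_trades (id INTEGER)")
        cur.executemany("INSERT INTO staging_trades VALUES (?)", [(1,), (2,)])
        cur.executemany("INSERT INTO warehouse_trades VALUES (?)", [(1,), (2,)])

        # A load is considered complete only when every source row landed in the target.
        source_rows, target_rows = cur.execute(RECONCILIATION_QUERY).fetchone()
        assert source_rows == target_rows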

Nice To Haves

  • Experience with automated testing frameworks and data validation or comparison tools.
  • Foundational understanding of data warehousing concepts, data modeling, and relational database principles.
  • Knowledge of data quality concepts, including schema validation, data completeness, referential integrity, and negative testing (see the sketch after this list).
  • Experience using Git-based version control and contributing tests to CI/CD pipelines.
  • Familiarity with Agile/Scrum development methodologies and working within iterative delivery models.
  • Financial services industry experience is a plus.
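
As a sketch of the data quality concepts mentioned in this list (schema validation, completeness, negative testing), the snippet below checks one record against an expected schema. The column names and types are hypothetical examples, not an actual EDIF schema.

    # Illustrative only: schema and completeness checks on a single record.
    EXPECTED_SCHEMA = {"account_id": str, "trade_date": str, "amount": float}

    def validate_record(record: dict) -> list:
        """Return a list of data-quality violations for one record."""
        errors = []
        for column, expected_type in EXPECTED_SCHEMA.items():
            if column not in record:
                errors.append(f"missing column: {column}")  # completeness check
            elif not isinstance(record[column], expected_type):
                errors.append(f"bad type for {column}: {type(record[column]).__name__}")
        return errors

    # Negative test: a malformed record must be reported, not silently accepted.
    assert validate_record({"account_id": "A1", "amount": "10"}) == [
        "missing column: trade_date",
        "bad type for amount: str",
    ]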

Responsibilities

  • Design and execute test plans, test cases, and test scripts for EDIF data integration and ingestion workflows under guidance from senior team members.
  • Perform functional, integration, system, regression, performance, and data validation testing across diverse data sources and targets.
  • Develop and maintain automated tests for AWS-based data pipelines using Python, SQL, and basic PySpark.
  • Validate ingestion workflows across Amazon S3, AWS Lambda, AWS Glue, Step Functions, EventBridge, and downstream data stores (a minimal end-to-end sketch follows this list).
  • Verify schema evolution, data transformations, and enrichment logic applied in Glue ETL jobs.
  • Create and maintain test datasets, mock files, and validation utilities supporting EDIF microbatch processing.
  • Conduct data analysis and reconciliation to ensure data accuracy, completeness, and consistency across ingestion pipelines.
  • Validate EDIF event-driven processing by reviewing event logs, CloudWatch metrics, error events, and basic alerting behavior.
  • Identify, document, and track defects; partner with development teams to validate fixes and perform regression testing.
  • Work closely with data engineers and developers to understand data flows, schemas, and business rules to ensure appropriate test coverage.
  • Apply QA best practices to ensure test cases are maintainable, repeatable, and aligned with automation standards.
  • Participate in code reviews and design discussions with a focus on test coverage, data quality risks, and functional correctness.
  • Support release testing and pre-deployment validation activities for new vendors, schema updates, and ingestion changes.
  • Maintain and update existing automated test suites as pipelines evolve.
  • Follow established QA processes, data quality standards, and testing methodologies.
  • Continuously expand knowledge of data integration, cloud-native testing, and data quality techniques.
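
A minimal end-to-end sketch of the kind of event-driven pipeline validation described above, assuming an S3-triggered ingestion flow. The bucket names, key prefixes, file format, and timeout are hypothetical placeholders rather than EDIF's actual configuration.

    # Illustrative only: drop a mock file into a landing bucket and poll for output.
    import time
    import boto3

    LANDING_BUCKET = "edif-landing-example"      # hypothetical bucket name
    PROCESSED_BUCKET = "edif-processed-example"  # hypothetical bucket name

    def test_mock_file_is_processed():
        s3 = boto3.client("s3")

        # Upload a small mock vendor file to trigger the event-driven pipeline
        # (Lambda/Glue behind the scenes).
        s3.put_object(
            Bucket=LANDING_BUCKET,
            Key="incoming/mock_trades.csv",
            Body=b"account_id,trade_date,amount\nA1,2024-01-02,10.5\n",
        )

        # Poll the processed zone until the corresponding output appears.
        deadline = time.time() + 300
        while time.time() < deadline:
            resp = s3.list_objects_v2(Bucket=PROCESSED_BUCKET, Prefix="trades/")
            if resp.get("KeyCount", 0) > 0:
                return  # output produced; deeper data validation would follow here
            time.sleep(15)

        raise AssertionError("processed output did not appear within 5 minutes")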

Benefits

  • LPL's Total Rewards package is highly competitive and designed to support your success at work, at home, and at play. It includes 401(k) matching, health benefits, employee stock options, paid time off, volunteer time off, and more.