This position is for an Onshore Test Lead to support the client's Insight suite of products. The Test Lead will be responsible for leading testing efforts for the Insight products, including gathering requirements, test case creation, test execution, defect logging and retesting, and coordination with various stakeholders: technical teams, business teams, product owners, project managers, external vendors, and offshore teams. This is a data-centric product, and we are looking for a candidate who is a data enthusiast with a zeal to work with data and to analyze and understand the various metrics derived from it. Experience with testing AI prompting tools is a must.

Essential Job Functions:

- Test various dashboards and certify that the metrics on these dashboards are correct by comparing them with the underlying backend data.
- Test prompt-based AI tools to make sure the prompts return the right values to the UI.
- Focus on testing that verifies the accuracy of this data as various business rules are applied.
- Understand the data flow, validate the content on UI screens, and understand and test the business rules involved in data transformation and data aggregation.
- Create and execute scenarios to test various APIs, preparing request blocks and analyzing the responses in JSON/XML formats.
- Validate the flow of data from disparate sources ingested into multiple databases inside Databricks, after which the data is transformed by pipelines and workflows built within Azure Databricks and Azure Data Factory (ETL process).
- Thoroughly test the ETL rules built for data transformation and the complex business rules built for data aggregation.
- Demonstrate strong SQL skills, including the ability to understand and write complex queries.
- Execute tests using SQL, Python, or PySpark per the user stories to validate the data inside various databases within the Databricks environment.
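The data-validation work described above is typically scripted rather than checked by hand. A minimal sketch in Python, using an in-memory SQLite database as a stand-in for the Databricks tables (all table, column, and metric names here are hypothetical):

```python
import sqlite3

# In-memory stand-in for source and aggregate tables; names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE raw_claims (region TEXT, amount REAL);
    CREATE TABLE claims_agg (region TEXT, total_amount REAL);
    INSERT INTO raw_claims VALUES ('east', 100.0), ('east', 50.0), ('west', 75.0);
    INSERT INTO claims_agg VALUES ('east', 150.0), ('west', 75.0);
""")

def aggregation_mismatches(con):
    """Regions where the aggregate table disagrees with a re-derived SUM."""
    rows = con.execute("""
        SELECT a.region
        FROM claims_agg AS a
        JOIN (SELECT region, SUM(amount) AS total
              FROM raw_claims GROUP BY region) AS s
          ON s.region = a.region
        WHERE s.total <> a.total_amount
    """).fetchall()
    return [r[0] for r in rows]

print(aggregation_mismatches(con))  # → [] when the aggregation rule holds
```

In a real Databricks environment the same reconciliation pattern would run over `spark.table(...)` DataFrames with PySpark instead of SQLite, but the idea is identical: re-derive each metric from the source data and flag any row where the pipeline's output disagrees.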
- Test the different source and target tables in Azure Databricks that are sourced, cleansed, transformed, joined, and aggregated before the final data is sent to downstream applications.
- Automate recurring QA processes using languages such as Python, PySpark, or Java as needed.
- Design and build an automation framework using PyTest to validate different scenarios and their data; this includes automating new tests and updating existing scripts.
- Have prior exposure to code repository tools: creating branches, opening pull requests, and performing code merges.
- Have prior exposure to SonarQube: maintaining code quality, fixing code smells, etc.
- Create and execute detailed manual test cases from time to time using functional requirements and technical specifications within Jira to ensure quality and accuracy.
- Log appropriate defects within Jira when the product does not conform to specifications.
- Participate in daily stand-ups with the project team as part of the agile methodology.
- Coordinate with development team members on defect validation and assist them in re-creating defects.
- Create appropriate test cases within the TestRail test management tool.
- Update task information in Jira as appropriate to communicate progress with the onshore test lead.
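The PyTest-based automation framework mentioned above commonly leans on parametrization so one test body certifies many dashboard metrics. A minimal sketch, where the `compute_metric` helper, metric names, and expected values are all hypothetical stand-ins for real dashboard logic:

```python
# test_dashboard_metrics.py -- minimal PyTest sketch; metric names and the
# compute_metric helper are hypothetical, not part of any real product API.
import pytest

def compute_metric(name, records):
    """Stand-in for logic that derives a dashboard metric from backend data."""
    if name == "total":
        return sum(records)
    if name == "max":
        return max(records)
    raise ValueError(f"unknown metric: {name}")

BACKEND_DATA = [10, 20, 30]

@pytest.mark.parametrize(
    "metric, expected",
    [("total", 60), ("max", 30)],
)
def test_dashboard_metric_matches_backend(metric, expected):
    # Each parametrized case certifies one dashboard value against
    # the underlying backend data.
    assert compute_metric(metric, BACKEND_DATA) == expected
```

Adding a new metric to the suite is then just another `(metric, expected)` tuple, which keeps recurring QA checks cheap to extend.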
Job Type: Full-time
Career Level: Mid Level
Education Level: No Education Listed