Enterprise Data Consultant II

M&T Bank
Wilmington, DE · Onsite
$164,505 - $174,505

About The Position

Evaluate large datasets for data quality, identify trends, and implement quality checks. The role designs and implements ETL-based data quality assessment and controls, develops end-to-end DQ strategies using Informatica, Monte Carlo, and Power BI, and builds data quality processes across M&T data repositories in Snowflake, SQL Server, Oracle, Teradata, Azure, and AWS. Detailed duties are listed under Responsibilities below.

Requirements

  • Master’s degree in Computer Applications, Computer Science and Engineering, or a related technical field, plus three (3) years of experience in the job offered or as a Big Data Analyst, Data Quality Analyst, Project Lead, or related occupation. The employer will alternatively accept a Bachelor’s degree in Computer Applications, Computer Science and Engineering, or a related technical field, plus five (5) years of experience in the job offered or as a Big Data Analyst, Data Quality Analyst, Project Lead, or related occupation.
  • Requires experience with the following: Developing data profiles on large datasets in Snowflake, SQL Server, Hadoop, and Teradata.
  • Creating custom data quality frameworks using data quality tools including Power BI, Monte Carlo, and Informatica.
  • Developing robust data quality rules using DQ tools including Informatica and Monte Carlo.
  • Developing data observability alerts on data pipelines and tables in Snowflake, SQL Server, Hadoop, and Teradata using data quality tools including Monte Carlo and Informatica.
  • Developing API code to process and load data extracts, including JSON and CSV files, using integration tools like SOAP UI and Python scripts.
  • Developing and maintaining data quality dashboards to visualize key DQ metrics, using analytics tools including Power BI and Tableau.
  • Developing and automating data reconciliation tasks across data sources using data manipulation tools including Excel macros, Python, and SQL.
  • Developing machine learning models for detecting data anomalies using DQ tools including Monte Carlo, Power BI, and Informatica.
  • Performing data quality assessments across all datasets in Snowflake, SQL Server, and Teradata using Python and SQL, and feeding the results into data quality metrics.
  • Creating and maintaining an escalation framework using Python and Power BI to send automated reminder emails to stakeholders.
  • Performing data manipulation, data structuring, data flow design, and query optimization using programming languages including Python and SQL.
  • Developing and maintaining dynamic and interactive dashboards using Tableau and ETL automation.
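The rule-building work described above (scoring datasets against DQ dimensions such as completeness, validity, and timeliness) can be sketched in plain Python. The record fields and thresholds below are hypothetical illustrations, not taken from the posting; in practice these rules would run in tools like Informatica or Monte Carlo.

```python
from datetime import date, timedelta

# Hypothetical records; field names are illustrative only.
records = [
    {"account_id": "A1", "balance": 120.5, "updated": date.today()},
    {"account_id": None, "balance": 98.0,  "updated": date.today() - timedelta(days=2)},
    {"account_id": "A3", "balance": -5.0,  "updated": date.today()},
]

def dq_scores(rows, max_age_days=1):
    """Score a batch against three common DQ dimensions (0.0 to 1.0 each)."""
    n = len(rows)
    return {
        # Completeness: key field is populated.
        "completeness": sum(r["account_id"] is not None for r in rows) / n,
        # Validity: value falls inside its allowed domain (here, non-negative).
        "validity": sum(r["balance"] >= 0 for r in rows) / n,
        # Timeliness: record refreshed within the allowed window.
        "timeliness": sum((date.today() - r["updated"]).days <= max_age_days
                          for r in rows) / n,
    }

print(dq_scores(records))
```

Each dimension scores 2/3 for this sample batch; a production framework would persist such scores per dataset and surface them in the DQ dashboards.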

Responsibilities

  • Evaluate large datasets for data quality, identify trends, and implement quality checks.
  • Manage, design, and implement processes and strategies for Extract, Transform, and Load (ETL)/Data Load & Data Integration-based data quality assessment and controls.
  • Work with the Platform Administration team to ensure enterprise-level Data Quality (DQ) tools are properly maintained, upgraded, and meet performance expectations.
  • Collaborate with the data product team and leadership to ensure ongoing communication of stakeholder requirements for new and improved solutions.
  • Create, monitor, and maintain data quality performance indicators on Power BI DQ reports.
  • Develop end-to-end DQ strategies using Informatica, Monte Carlo, and Power BI.
  • Oversee DQ issue tracking, root cause analysis, remediation, and escalation as appropriate.
  • Use SoapUI to automate API testing.
  • Map workflows, design profiles, create scorecards, and perform performance tuning to optimize use of M&T's current Data Quality infrastructure.
  • Integrate ETL tools, Data Quality tools, and reporting tools using programming languages like Python.
  • Apply Data Quality dimensions such as completeness, validity, accuracy, and timeliness to create robust DQ rules and controls.
  • Use DQ observability and machine learning-based DQ tools.
  • Create Python scripts to extract DQ scores and create DQ dashboards using Power BI.
  • Utilize Informatica Data Quality Tools to automate data quality rules and controls.
  • Manage data in Snowflake, SQL Server, Oracle, Teradata, Azure, and AWS to build data quality processes across different M&T data repositories.
  • Create DQ reports in Power BI to track data quality KPIs, capturing DQ score, exception count, and trend over time.
  • Use unit testing and code reviews to meet business DQ requirements.
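The Power BI reporting duties above center on a small set of KPIs: DQ score, exception count, and trend over time. A minimal sketch of deriving those KPI rows from batch results, using made-up sample data (a Power BI report would read a table shaped like this output):

```python
# Hypothetical daily DQ check results; values are illustrative only.
daily = [
    {"day": "2024-05-01", "checked": 1000, "exceptions": 50},
    {"day": "2024-05-02", "checked": 1000, "exceptions": 30},
    {"day": "2024-05-03", "checked": 1000, "exceptions": 20},
]

def kpi_rows(results):
    """Derive DQ score and day-over-day trend for each batch of results."""
    rows, prev = [], None
    for r in results:
        # DQ score: fraction of checked records that passed.
        score = 1 - r["exceptions"] / r["checked"]
        # Trend: change in score versus the previous batch (None for the first).
        trend = None if prev is None else round(score - prev, 4)
        rows.append({"day": r["day"], "dq_score": round(score, 4),
                     "exception_count": r["exceptions"], "trend": trend})
        prev = score
    return rows

for row in kpi_rows(daily):
    print(row)
```

Emitting these rows to a table that Power BI refreshes on a schedule is one common pattern; the exact load mechanism would depend on the team's infrastructure.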