Data Analyst, Software Engineering II

Equitable
Charlotte, NC
$110,000 - $123,000 · Hybrid

About The Position

Equitable Financial Life Insurance Company seeks a Data Analyst, Software Engineering II for its Charlotte, NC location. The role's duties are listed in full under Responsibilities below.

Requirements

  • Requires a Bachelor’s or foreign equivalent degree in Computer Science, Electronic Engineering, Information Systems, or a related IT field plus at least 3 years of progressive post-Baccalaureate experience as a Data Analyst, Systems Analyst, Data Engineer or related position involving Data Analysis and exploration, Data Mining, Data Science and Big Data Technologies (Hive and Impala).
  • Full term of experience must include hands-on development utilizing SQL, Python, Enterprise Data Warehouse (EDW), and Oracle.
  • Must have at least 2 years of experience in Cloudera Hadoop Data Lake platform including Hive, HQL, Impala, Hue
  • Using optimization techniques for performance tuning of complex HQL queries
  • Working with actuarial and accounting models to understand and document data requirements, and creating data products to support them
  • Understanding of key insurance domain concepts such as LDTI, Reinsurance, Actuarial valuation, and assumption processes
  • Data analysis experience in Annuity and Life insurance data lifecycle including systematically collecting, cleansing, mapping, transforming, and modeling data
  • Developing LDTI financial cashflows using transactions related to Premiums, Claims, Commissions and Expenses
  • Using actuarial accounting transactions with their various dimensions, and working with ledger/subledger systems in SAP S4
  • Data Validation/Testing with experience in performing regression testing and creating test case documents
  • Experience in defining and executing data quality rules
  • Creating data documentation including Requirements specifications document, Source to target (STM) mapping documents, data dictionaries and key field tracing (lineage) documents
  • Experience in defining IT controls to ensure accuracy and completeness of Data Products
  • Business Intelligence and Data Visualization in Tableau or PowerBI
  • Agile Methodology
  • Change Management
  • Tools: Archer, ServiceNow
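Several of the requirements above (data validation, quality rules, and IT controls for accuracy and completeness) reduce to automatable checks. The sketch below, in Python, illustrates two such rules; the dataset, rule names, and tolerance are hypothetical, not taken from the posting:

```python
# Minimal sketch of two data-quality rules of the kind described above:
# completeness (every row carries a non-null key) and reconciliation
# (source and target totals agree). Data and tolerance are hypothetical.
source_rows = [
    {"policy_id": "P1", "premium": 1000.0},
    {"policy_id": "P2", "premium": 800.0},
]
target_rows = [
    {"policy_id": "P1", "premium": 1000.0},
    {"policy_id": "P2", "premium": 800.0},
]

def check_completeness(rows, key):
    """Rule: every row must carry a non-null, non-empty key field."""
    return all(r.get(key) not in (None, "") for r in rows)

def check_reconciliation(source, target, field, tolerance=0.01):
    """Rule: source and target totals for a field must agree within tolerance."""
    diff = sum(r[field] for r in source) - sum(r[field] for r in target)
    return abs(diff) <= tolerance

results = {
    "completeness": check_completeness(target_rows, "policy_id"),
    "reconciliation": check_reconciliation(source_rows, target_rows, "premium"),
}
print(results)
```

In practice such rules would run against Hive/Impala tables and their results would feed the test-case documentation the posting mentions; the structure, however, is the same.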

Responsibilities

  • Create and enhance data products especially for Annuity and Life insurance data for all finance consumers such as Business Users, Actuarial Analysts, and Modelers.
  • Collaborate directly with a team of Data Engineers, Data Analysts, and Business Stakeholders to assemble data from various data sources, develop process workflows (ETL) and automation to pipeline data for downstream analysis and consumption by Financial and Actuarial models and/or other related downstream Business Stakeholders or use cases.
  • Work with business and technical teams to convert business requirements into technical specifications.
  • Understand insurance domain key concepts such as Long Duration Targeted Improvements (LDTI) and Reinsurance to interpret the business requirements and complete impact analysis on existing data products.
  • Create and maintain LDTI financial cashflow data products based on various data cohorts and using actuarial accounting transactions such as premiums, claims, commissions, and expenses.
  • Utilize actuarial accounting and valuation processes knowledge to develop, to provide production support and to debug production issues related to data products.
  • Identify and define IT data controls to ensure the resiliency, accuracy, and completeness of data products.
  • Evaluate complex business problems and provide subject matter knowledge proficiency for technology initiatives.
  • Collaborate with technical personnel, members of the Data Analytics team, and business partners to assemble data from various sources into datasets to analyze the information to solve business problems and improve business efficiency.
  • Analyze Business Requirements through data mining, data quality checks, and data validation of raw data elements for the required enhancements for data product design.
  • Explore data to find its meaning and relationship to Annuity and Life insurance products for solving business problems, including analysis of complex data and large data sets using tools such as Hue, Hive, and Impala.
  • Develop complex SQL/HQL queries to join multiple high-volume datasets, apply complex data transformations and filtering, and then produce output datasets in Hadoop Data Lake platform or in Enterprise Data Warehouse (EDW) using tools such as Hive, Impala and SQL.
  • Use query optimization techniques to achieve high query output performance.
  • Develop data dashboards and reports using visualization tools such as Tableau or PowerBI.
  • Document technical and business definitions and the data lineage of automation processes.
  • Create source to target mapping documents to capture the transformation and filtering applied to produce new datasets.
  • Perform technical testing and support business acceptance testing as a member of development project teams.
  • Test data products to ensure they are accurate and complete. Identify data quality rules and test cases, and document test results.
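The query-development responsibilities above (joining high-volume datasets, applying transformations and filters, and producing cohort-level output) follow a common shape. The sketch below illustrates it; table names (`policy_txn`, `cohort_dim`), columns, and values are hypothetical, and SQLite stands in here for Hive/Impala:

```python
import sqlite3

# Minimal sketch of the join/transform/filter pattern described above.
# Table and column names are hypothetical; SQLite stands in for Hive/Impala.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE policy_txn (
    policy_id TEXT,
    txn_type  TEXT,   -- e.g. PREMIUM, CLAIM, COMMISSION, EXPENSE
    amount    REAL,
    txn_date  TEXT
);
CREATE TABLE cohort_dim (
    policy_id TEXT,
    cohort    TEXT    -- e.g. an issue-year cohort used for LDTI grouping
);
""")

cur.executemany("INSERT INTO policy_txn VALUES (?, ?, ?, ?)", [
    ("P1", "PREMIUM",    1000.0, "2024-01-15"),
    ("P1", "CLAIM",       250.0, "2024-02-10"),
    ("P2", "PREMIUM",     800.0, "2024-01-20"),
    ("P2", "COMMISSION",   80.0, "2024-01-21"),
])
cur.executemany("INSERT INTO cohort_dim VALUES (?, ?)", [
    ("P1", "2024-ISSUE"),
    ("P2", "2024-ISSUE"),
])

# Join the transaction fact table to the cohort dimension, filter to
# cashflow transaction types, and aggregate per cohort -- the shape of
# an LDTI cashflow output dataset.
rows = cur.execute("""
    SELECT d.cohort, t.txn_type, SUM(t.amount) AS total_amount
    FROM policy_txn t
    JOIN cohort_dim d ON d.policy_id = t.policy_id
    WHERE t.txn_type IN ('PREMIUM', 'CLAIM', 'COMMISSION', 'EXPENSE')
    GROUP BY d.cohort, t.txn_type
    ORDER BY t.txn_type
""").fetchall()

for cohort, txn_type, total in rows:
    print(cohort, txn_type, total)
conn.close()
```

On the actual platform the same query would be written in HQL and tuned with the optimization techniques the posting names (e.g. partition pruning and join-order choices), but the join/filter/aggregate structure carries over directly.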