Equitable Financial Life Insurance Company seeks a Data Analyst, Software Engineering II for its Charlotte, NC location. Responsibilities include:

- Create and enhance data products, especially for Annuity and Life insurance data, for all finance consumers such as Business Users, Actuarial Analysts, and Modelers.
- Collaborate directly with a team of Data Engineers, Data Analysts, and Business Stakeholders to assemble data from various sources and to develop ETL process workflows and automation that pipeline data for downstream analysis and consumption by Financial and Actuarial models and other downstream Business Stakeholders or use cases.
- Work with business and technical teams to convert business requirements into technical specifications.
- Apply insurance domain concepts such as Long Duration Targeted Improvements (LDTI) and Reinsurance to interpret business requirements and complete impact analysis on existing data products.
- Create and maintain LDTI financial cashflow data products based on various data cohorts, using actuarial accounting transactions such as premiums, claims, commissions, and expenses.
- Apply knowledge of actuarial accounting and valuation processes to develop data products, provide production support, and debug production issues.
- Identify and define IT data controls to ensure the resiliency, accuracy, and completeness of data products.
- Evaluate complex business problems and provide subject matter expertise for technology initiatives.
- Collaborate with technical personnel, members of the Data Analytics team, and business partners to assemble data from various sources into datasets, then analyze the information to solve business problems and improve business efficiency.
- Analyze business requirements through data mining, data quality checks, and data validation of raw data elements to support required enhancements to data product design.
- Explore data to determine its meaning and relationship to Annuity and Life insurance products, including analysis of complex data and large datasets using tools such as Hue, Hive, and Impala.
- Develop complex SQL/HQL queries to join multiple high-volume datasets, apply complex data transformations and filtering, and produce output datasets on the Hadoop Data Lake platform or in the Enterprise Data Warehouse (EDW) using tools such as Hive, Impala, and SQL.
- Use query optimization techniques to achieve high query performance.
- Develop data dashboards and reports using visualization tools such as Tableau or Power BI.
- Document technical and business definitions and the data lineage of automated processes.
- Create source-to-target mapping documents to capture the transformations and filtering applied to produce new datasets.
- Perform technical testing and support business acceptance testing as a member of development project teams.
- Test data products to ensure they are accurate and complete; identify data quality rules and test cases, and document test results.
Job Type
Full-time
Career Level
Entry Level