Duties: Lead the data engineering delivery roadmap for two products. Manage the migration of scrambled model training data and users from MTD to AWS, providing all required tools and utilities so that forecasting model training users can operate directly off AWS independently. Spearhead the card forecasting model serving migration to the cloud. Run an initiative to deliver Tableau reports at scale on the public cloud for stakeholders during CCAR cycles. Run an initiative to make all loss forecasting models cross-resilient and able to operate in AWS East/West regions.

QUALIFICATIONS:

Minimum education and experience required: Bachelor's degree in Computer Engineering, Computer Science, Information Technology, Data Analytics, Data Engineering, or a related field of study, plus ten (10) years of experience in the job offered or as Director/Manager of Software Engineering, Software Engineer/Developer, Application Architect, Technology Analyst, or a related occupation.

Skills Required: This position requires seven (7) years of experience with the following skills:
- Using Ab Initio, Unix scripts, SQL, and PL/SQL to develop and optimize ETL pipelines and real-time streaming solutions;
- Using SQL to design and optimize complex queries, manage Teradata and DB2 relational databases, and implement data models to support business intelligence and reporting needs;
- Designing and implementing comprehensive data architecture solutions with a focus on data modeling, integration, and governance to ensure data quality, accessibility, and scalability across the enterprise;
- Performing data integration, transformation, and loading to support enterprise data warehousing and analytics solutions; and
- Implementing Agile, Waterfall, and Hybrid SDLC methodologies, including Scrum and Kanban, to enhance team collaboration, streamline project delivery, and adapt to changing requirements.
This position also requires four (4) years of experience with the following skills:
- Using Apache Spark and Apache Kafka to develop and optimize data pipelines and real-time streaming solutions;
- Using Java, J2EE, Python, and Scala to develop scalable, high-performance applications;
- Utilizing frameworks such as Spring Boot and Spring to streamline development processes;
- Designing and implementing RESTful APIs with a focus on scalable architecture, efficient data exchange using JSON, and seamless integration with client applications;
- Using SQL to design and optimize complex queries, manage Teradata and DB2 relational databases, and implement data models to support business intelligence and reporting needs;
- Deploying and managing containerized applications using Kubernetes, specifically configuring clusters and optimizing resource utilization for scalable and resilient cloud-native environments;
- Designing and implementing comprehensive data architecture solutions with a focus on data modeling, integration, and governance to ensure data quality, accessibility, and scalability across the enterprise;
- Performing data integration, transformation, and loading to support enterprise data warehousing and analytics solutions;
- Using Apache Maven for project management and build automation, including dependency management, lifecycle management, and integration with continuous integration tools to streamline the development process;
- Leveraging AWS Cloud Services, including EC2, S3, RDS, and Lambda, to design and implement scalable, secure, and cost-effective cloud solutions that meet diverse business needs;
- Using MongoDB and HBase to develop and manage NoSQL databases, particularly focusing on schema design, indexing, and query optimization to support high-performance, scalable applications; and
- Using JUnit for unit testing Java applications, focusing on test-driven development (TDD) practices to ensure code quality, reliability, and maintainability.
Job Type
Full-time
Career Level
Director