About The Position

At U.S. Bank, we're on a journey to do our best, helping the customers and businesses we serve to make better and smarter financial decisions and enabling the communities we support to grow and succeed. We believe it takes all of us to bring our shared ambition to life, and each person is unique in their potential. A career with U.S. Bank gives you a wide, ever-growing range of opportunities to discover what makes you thrive at every stage of your career. Try new things, learn new skills, and discover what you excel at, all from Day One.

Job Description

U.S. Bank is seeking a full-time Sr. Software Engineer (Data Engineering-AWS) (multiple openings) in Charlotte, NC.

Requirements

  • This position requires a Bachelor's degree or equivalent in Computer Science or Computer Engineering, and 5 years of progressive, post-baccalaureate software development experience.
  • Must also have 24 months of experience with each of the following:
    1) Designing and building data systems using Amazon Web Services (AWS) core services, including AWS Glue, AWS DataSync, AWS EFS, AWS Step Functions, and AWS Data Pipeline.
    2) Building data models, data pipelines, and data storage solutions using AWS compute and storage services, including EC2, EMR, S3, EBS, EFS, RDS, and Lambda, and programming languages and frameworks including Python, Spark, and Spark Streaming.
    3) Building ETL (Extract, Transform, Load) pipelines and workflows, and developing and optimizing ETL/ELT processes to extract, transform, and load structured and unstructured data from on-premises databases, flat files, and mainframe sources into Snowflake and other cloud data warehouses.
    4) Automating and orchestrating migration workflows using CI/CD pipelines, containerization (Docker, Kubernetes, OpenShift), and scheduling/orchestration tools (Airflow or AWS Glue).
    5) Data warehousing, database technologies, and big-data ecosystem technologies, including AWS Redshift, Databricks, AWS RDS, Cassandra, and Hadoop.
    6) Leading and executing complex cloud migration projects specifically involving Snowflake as the target platform.
    7) Designing and deploying enterprise-grade data applications that meet stringent security, scalability, and compliance requirements, including supporting U.S. financial regulatory compliance by implementing secure data access controls and audit trails.
  • Employer will accept experience gained concurrently.
  • U.S. Bank is subject to and conducts background checks consistent with the regulatory requirements applicable to our industry and operations.

Responsibilities

  • Design and build data engineering solutions on Amazon Web Services (AWS) to migrate and integrate data from on-premises systems into Snowflake, Redshift, and Databricks.
  • Design, develop, and maintain scalable data pipelines for batch and real-time processing using Apache Kafka, Spark, AWS Glue, and ETL frameworks.
  • Architect and implement scalable, fault-tolerant data solutions on AWS that meet long-term business continuity and compliance needs, showcasing exceptional technical leadership.
  • Build ETL (Extract, Transform, Load) pipelines and workflows, ensuring data quality and consistency.
  • Migrate legacy on-premises systems, including mainframe data stores, to modern cloud-based platforms on AWS with Snowflake Data Warehouse as the target system.
  • Drive modernization of legacy systems to cloud-native platforms, contributing to the organization's competitiveness and operational resilience, both key indicators of exceptional ability.
  • Modernize existing on-premises data workloads by implementing cloud-native solutions using AWS services such as S3, Redshift, Databricks, Glue, Lambda, and Step Functions for integration and orchestration.
  • Design and implement Snowflake schemas, data models, and performance optimizations to support analytics, reporting, and enterprise applications.
  • Develop reusable frameworks and automation templates for cloud data migration, reducing onboarding time for new projects and increasing engineering efficiency across teams.
  • Automate and orchestrate migration workflows using CI/CD pipelines, containerization (Docker, Kubernetes, OpenShift), and scheduling/orchestration tools such as Airflow, AWS Glue, and AWS Step Functions.
  • Ensure data security and compliance with financial industry standards using AWS IAM and KMS, along with monitoring tools such as AWS CloudWatch, AWS SNS, and Datadog.
  • Collaborate with business stakeholders, data scientists, and application teams to validate migrated data and ensure accuracy, completeness, and business usability.
  • Ensure data governance, lineage, and compliance with banking regulatory and security standards during migration and in steady-state operations.
  • Mentor junior engineers and lead knowledge-sharing sessions on AWS data engineering best practices, demonstrating sustained leadership and influence in the field.
  • Author internal technical documentation and migration playbooks used across departments, establishing yourself as a subject matter expert in cloud data integration.

Benefits

  • Healthcare (medical, dental, vision)
  • Basic term and optional term life insurance
  • Short-term and long-term disability
  • Pregnancy disability and parental leave
  • 401(k) and employer-funded retirement plan
  • Paid vacation (from two to five weeks depending on salary grade and tenure)
  • Up to 11 paid holiday opportunities
  • Adoption assistance
  • Sick and Safe Leave accruals of one hour for every 30 hours worked, up to 80 hours per calendar year unless otherwise provided by law