Sr. Data Engineer, Data Innovation and Tools Rationalization

U.S. Bank | Minneapolis, MN
Posted 1 day ago | $119,765 - $140,900 | Hybrid

About The Position

We are seeking a highly skilled Senior Data Engineer to join the Data Innovation and Tools Rationalization team within the Enterprise Data Office. This role plays a key hands-on leadership role in advancing the modernization of enterprise data capabilities by designing, building, and scaling reusable data product engineering patterns aligned with the Enterprise Data Strategy.

About the Data Innovation and Tools Rationalization Team

We are the innovation and tooling engine for the Enterprise Data Office, focused on reusable patterns, accelerators, and tool rationalization that reduce friction and speed up delivery and adoption of governed data products.

Vision | Make data products and AI capabilities easier to build, safer to deploy, and faster to adopt across the bank.

Mission | Deliver reusable data product patterns, accelerators, and clear integration pathways that help teams ship data products faster while enabling safer AI adoption and reducing technology sprawl through disciplined tool evaluation and rationalization.

Values | In addition to U.S. Bank core values, we prioritize:

  • Head High: We build with excellence. Our work is intentional, high-quality, and designed to last, so we are always proud of what we deliver and comfortable standing behind it.
  • Accountability Over Activity: We take end-to-end accountability, from problem framing through delivery, adoption, and outcomes.
  • Strategic Intelligence: We think in systems, anticipate downstream impact, and collaborate to win as a pod, not as individuals.
  • Relentless Craft: We are passionate about the work we do. Our drive comes from curiosity, purpose, and a genuine love of building impactful solutions.

About the Role

We are seeking a highly skilled and forward-thinking Senior Data Engineer to join the Data Innovation and Tools Rationalization team.
This role is focused on building and scaling next-generation data product engineering patterns that enable faster, more consistent, and more reliable delivery across the enterprise. The ideal candidate will combine strong hands‑on engineering skills with a product mindset and the ability to provide technical leadership across teams. You will design and evolve reusable frameworks, influence engineering standards, evaluate emerging technologies, and partner closely with execution and enablement teams to operationalize modern data patterns at scale. This role plays a critical part in accelerating platform adoption, improving developer productivity, and reducing fragmentation across data and analytics solutions.

Requirements

  • Deep understanding of financial institution and banking concepts.
  • Strong understanding of modern data engineering concepts, including batch and streaming data processing, data modeling, and data product design.
  • Experience building scalable data solutions on cloud-based data platforms.
  • Familiarity with enterprise data ecosystems and shared platform models.
  • Ability to assess tradeoffs across tools, architectures, and implementation approaches.
  • Strong analytical and problem-solving skills with a focus on root cause analysis and optimization.
  • Proficiency with big data technologies (Spark, Airflow, Hadoop, Hive).
  • Hands-on experience with Snowflake and Databricks, including performance tuning.
  • Proficiency in SQL and Python, with experience building production-grade data pipelines.
  • Experience with CI/CD pipelines and infrastructure-as-code patterns for data platforms.
  • Familiarity with orchestration and workflow management tools.
  • Experience developing reusable libraries, templates, or internal frameworks.
  • Exposure to cloud platforms such as Azure, AWS, or GCP and cloud-native data services.
  • Understanding of data quality, observability, and monitoring practices.
  • Bachelor’s Degree in a quantitative field such as computer science, data science, mathematics, or statistics.
  • 6 to 8+ years of statistical and/or analytical experience.

Nice To Haves

  • Typically, 8+ years of experience in data engineering, analytics engineering, or platform engineering roles.
  • Demonstrated experience building and supporting data solutions in a cloud environment.
  • Proven track record of designing reusable components or standards adopted by multiple teams.
  • Experience working in regulated or large-scale enterprise environments preferred.
  • Strong organizational skills with the ability to manage multiple initiatives concurrently.
  • Deep understanding of banking and financial institution terms.
  • Knowledge of banking regulation and requirements for regulatory reporting.
  • Strong analytical, organizational, problem-solving, and project management skills.
  • Hands-on experience with programming languages such as Python and SQL.
  • Proficiency with big data technologies including Hadoop, Hive, and Spark.
  • Expertise in visual analytics tools such as Power BI, Tableau, or equivalent platforms.
  • Experience with Power Platform tools such as Power Automate and Power Apps.
  • Proven track record in automating and optimizing ETL processes at scale.
  • Hands-on experience with cloud platforms (e.g., Azure, AWS, GCP) and cloud-native data services.
  • Excellent written and verbal communication skills for documenting technical processes, engaging with cross-functional teams, and presenting to senior management.
  • Familiarity with AI and ML tooling as it relates to data engineering and platform enablement is a plus.

Responsibilities

  • Designing, building and owning next-generation data product engineering patterns on modern cloud platforms including Snowflake and Databricks.
  • Providing technical guidance and mentorship to other data engineers, promoting consistent engineering practices and high‑quality solutions.
  • Developing reusable engineering assets such as frameworks, build kits, CI/CD templates, and performance optimization approaches.
  • Partnering with Enablement and Execution teams to operationalize and scale data engineering patterns across delivery teams, serving as a technical point of reference for adoption and implementation.
  • Evaluating, testing, and experimenting with emerging data and AI tools, platforms, and services, including participating in technical proofs of concept, comparing alternative solutions, and making data-driven recommendations for platform and tool rationalization.
  • Documenting project outcomes, transition plans, adoption guides, and solution usage scripts to support enterprise rollout.
  • Supporting platform modernization efforts through hands-on development, tuning, and optimization.
  • Collaborating with data product owners, architects, and platform teams to align engineering solutions with enterprise data strategy.

Benefits

  • Healthcare (medical, dental, vision)
  • Basic term and optional term life insurance
  • Short-term and long-term disability
  • Pregnancy disability and parental leave
  • 401(k) and employer-funded retirement plan
  • Paid vacation (from two to five weeks depending on salary grade and tenure)
  • Up to 11 paid holiday opportunities
  • Adoption assistance
  • Sick and Safe Leave accruals of one hour for every 30 hours worked, up to 80 hours per calendar year unless otherwise provided by law