Data Engineer - Snowflake & Activation Platforms

Frontier Credit Union - Idaho Falls, ID

About The Position

The Data Engineer is responsible for designing, building, and operating the downstream data layer that powers analytics, Salesforce enablement, marketing activation, lending insights, automation, and AI initiatives at Frontier Credit Union. This role owns the transformation, modeling, validation, and operationalization of data after it has been ingested into Snowflake.

Upstream data sourcing, APIs, and ingestion pipelines are owned by a dedicated Software Engineering function; this role partners closely with that team to define data contracts and ensure ingested data is production-ready for downstream use. Activation platforms include Salesforce CRM, Salesforce Marketing Cloud, internal data applications, and AI-enabled workflows. The focus of this role is ensuring data is trustworthy, well-modeled, and usable by these systems.

This position is intended for an experienced Data Engineer who can contribute immediately. We are not seeking an entry-level candidate or someone who requires significant training. The ideal candidate brings strong SQL and Python skills, deep Snowflake experience, and a proven track record of owning production data pipelines and data products.

Requirements

  • Advanced SQL skills, including complex transformations, window functions, and performance tuning.
  • Strong Python experience for data pipelines, automation, and data tooling.
  • Hands-on experience with Snowflake in production environments.
  • Experience building downstream data pipelines and curated data layers.
  • Familiarity with orchestration, scheduling, and transformation frameworks.
  • Experience supporting downstream consumers such as BI tools, CRM platforms, or internal applications.
  • Strong ownership mindset for data products.
  • Ability to operate independently with clear accountability.
  • Strong problem-solving and systems-thinking skills.
  • Clear communication across technical and business audiences.
  • Comfort working with sensitive data in regulated environments.
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or equivalent professional experience.
  • 3–6 years of experience in data engineering, analytics engineering, or a closely related role.
  • Demonstrated experience building Snowflake-based transformations and data products.
  • Strong production experience with SQL and Python.

Nice To Haves

  • Experience supporting Salesforce CRM or Salesforce Marketing Cloud as downstream consumers.
  • Experience building internal data tools or applications (Streamlit preferred).
  • Experience in financial services or other regulated industries.
  • Experience supporting automation or AI-driven analytics workflows.

Responsibilities

  • Serve as the primary owner of Snowflake-based transformations and downstream data products.
  • Translate raw ingested data into trusted, governed, analytics-ready datasets.
  • Ensure downstream consumers (Salesforce, Marketing Cloud, BI tools, internal apps, and automation workflows) receive consistent and reliable data.
  • Identify and resolve data quality issues arising post-ingestion.
  • Reduce downstream friction by enforcing standards, documentation, and observability across data layers.
  • Design, build, and maintain Snowflake schemas, tables, views, and transformation layers.
  • Implement scalable modeling patterns to support CRM, marketing, lending, and operational analytics.
  • Write and optimize complex SQL transformations with a focus on correctness, performance, and cost efficiency.
  • Maintain consistent metric definitions and reusable logic across business domains.
  • Build and maintain transformation pipelines that convert ingested data into curated, business-ready datasets.
  • Implement dependency management, scheduling, and monitoring for downstream workflows.
  • Support incremental processing, backfills, and reprocessing as business needs evolve.
  • Partner with upstream engineers to define schema expectations, freshness SLAs, and data contracts.
  • Prepare and maintain datasets that support Salesforce CRM and Salesforce Marketing Cloud use cases.
  • Partner with Salesforce Administrators and Marketing teams to ensure downstream data supports segmentation, analytics, and reporting needs.
  • Support identity resolution, deduplication logic, and business rules at the data layer.
  • Ensure consistency between Snowflake-curated data and downstream operational systems.
  • Implement data quality checks, validation rules, and anomaly detection for critical datasets.
  • Monitor data freshness, volume, and schema stability for downstream tables.
  • Build visibility into data readiness for analytics, automation, and operational use.
  • Document data models, definitions, and ownership for key datasets.
  • Build internal data tools and lightweight applications (e.g., Streamlit) to support operations, analytics, and decision-making.
  • Develop tools for pipeline health monitoring, data readiness checks, and operational reporting.
  • Translate business questions into durable data products rather than one-off analyses.
  • Prepare feature-ready datasets to support automation and AI initiatives.
  • Partner with analytics and AI stakeholders to operationalize models and analytical outputs.
  • Ensure downstream pipelines support reproducibility, auditability, and long-term maintainability.
  • Follow established development, testing, and deployment standards for data transformations.
  • Use version control and documentation standards to support maintainability.
  • Participate in incident response related to downstream data failures.
  • Continuously improve reliability, performance, and clarity of data products.
  • Work closely with Software Engineers responsible for data ingestion and APIs.
  • Collaborate with BI, Salesforce, Marketing, Lending, and Operations teams.
  • Participate in requirements discovery, design reviews, and prioritization discussions.
  • Communicate technical tradeoffs and constraints clearly to non-technical stakeholders.