Senior Data Engineer

GLOBAL Recruitment Solutions | Dallas, TX | Onsite

About The Position

You will design, build, and operate secure, audited, and cost-efficient data pipelines on Snowflake, from raw ingestion through Data Vault 2.0 models to business-friendly consumption layers (mart/semantic). You'll use Qlik, Glue, and related ETL tooling for ingestion; dbt Cloud for modeling and testing; MWAA/Airflow and/or dbt Cloud's orchestration for scheduling; and Terraform (following HashiCorp practices) for infrastructure as code. The ideal candidate has hands-on experience with data ingestion frameworks and with Snowflake database/schema design, security, networking, and governance that satisfies regulatory and compliance audit requirements.
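
To make the Raw → Data Vault 2.0 flow concrete, here is a minimal sketch of a hub load in Snowflake SQL; every schema, table, and column name (dv.hub_customer, raw.crm.customers, customer_id) is a hypothetical illustration, not taken from this posting:

    -- Minimal DV 2.0 hub load (all names hypothetical).
    -- A hub holds only the business key, its hash key, and load metadata.
    CREATE TABLE IF NOT EXISTS dv.hub_customer (
        hub_customer_hk  VARCHAR(32)   NOT NULL,  -- MD5 hash of the business key
        customer_bk      VARCHAR       NOT NULL,  -- business key from the source
        load_dts         TIMESTAMP_NTZ NOT NULL,
        record_source    VARCHAR       NOT NULL
    );

    -- Idempotent load: insert only business keys the hub has not yet seen.
    MERGE INTO dv.hub_customer AS h
    USING (
        SELECT DISTINCT
            MD5(UPPER(TRIM(customer_id)))       AS hub_customer_hk,
            UPPER(TRIM(customer_id))            AS customer_bk,
            CURRENT_TIMESTAMP()::TIMESTAMP_NTZ  AS load_dts,
            'RAW.CRM.CUSTOMERS'                 AS record_source
        FROM raw.crm.customers
        WHERE customer_id IS NOT NULL
    ) AS s
    ON h.hub_customer_hk = s.hub_customer_hk
    WHEN NOT MATCHED THEN INSERT
        (hub_customer_hk, customer_bk, load_dts, record_source)
        VALUES (s.hub_customer_hk, s.customer_bk, s.load_dts, s.record_source);

In dbt Cloud the same pattern would usually live in an incremental model, with the hashing wrapped in a macro and uniqueness enforced by a test.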

Requirements

  • Bachelor's degree and 6 years of experience in advanced data engineering, enterprise architecture, and project leadership; OR high school diploma/GED and 10 years of the same experience
  • US citizenship or Green Card required

Nice To Haves

  • Snowflake platform (hands-on, production): secure account setup, including databases/schemas/stages, RBAC/ABAC role design, grants, network policies/rules, and storage integrations.
  • Data protection: Dynamic Data Masking, Row Access Policies, tag-based masking, and PII classification/lineage tagging (see the policy sketch after this list).
  • Workloads & features: Streams/Tasks, Snowpipe, external tables, file formats, copy options, and retry/dedupe patterns (see the pipeline sketch after this list).
  • Operations: warehouse sizing, multi-cluster warehouses, resource monitors, Time Travel & Fail-safe, and cross-region/account replication.
  • Networking concepts: AWS PrivateLink/S3 access patterns, external stages, and at least high-level familiarity with VPC/DNS/endpoint flows.
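
The data-protection items above translate directly into Snowflake DDL. A minimal sketch, assuming hypothetical schema and object names (sec, mart.dim_customer, mart.fct_orders, PII_READER):

    -- Mask PII strings for everyone except an authorized role.
    CREATE MASKING POLICY sec.pii_string_mask AS (val STRING)
    RETURNS STRING ->
        CASE
            WHEN IS_ROLE_IN_SESSION('PII_READER') THEN val
            ELSE '***MASKED***'
        END;

    -- Tag-based masking: any column carrying the tag inherits the policy.
    CREATE TAG sec.pii;
    ALTER TAG sec.pii SET MASKING POLICY sec.pii_string_mask;
    ALTER TABLE mart.dim_customer MODIFY COLUMN email SET TAG sec.pii = 'email';

    -- Row access: visibility driven by an entitlements mapping table.
    CREATE ROW ACCESS POLICY sec.region_rap AS (region STRING)
    RETURNS BOOLEAN ->
        EXISTS (
            SELECT 1 FROM sec.region_entitlements e
            WHERE e.role_name = CURRENT_ROLE() AND e.region = region
        );
    ALTER TABLE mart.fct_orders ADD ROW ACCESS POLICY sec.region_rap ON (region);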
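
Likewise, a common Snowpipe → Stream → Task chain for the workloads bullet (object names again hypothetical; the pipe assumes an external stage wired to cloud event notifications):

    -- Snowpipe: auto-ingest files landing in an external stage.
    CREATE PIPE raw.crm.customers_pipe AUTO_INGEST = TRUE AS
        COPY INTO raw.crm.customers
        FROM @raw.crm.customers_stage
        FILE_FORMAT = (TYPE = 'JSON')
        ON_ERROR = 'SKIP_FILE';

    -- A stream captures new rows; a serverless task drains it on a schedule.
    CREATE STREAM raw.crm.customers_stream ON TABLE raw.crm.customers;

    CREATE TASK raw.crm.load_hub_customer
        SCHEDULE = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('raw.crm.customers_stream')
    AS
        INSERT INTO dv.hub_customer_staging
        SELECT customer_id, CURRENT_TIMESTAMP()
        FROM raw.crm.customers_stream;

    ALTER TASK raw.crm.load_hub_customer RESUME;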

Responsibilities

  • Modeling & Warehousing Design and implement scalable data ingestion frameworks
  • Implement Raw ? DV 2.0 (Hubs/Links/Satellites) ? Consumption patterns in dbt Cloud with robust tests (unique/notnull/relationships/freshness).
  • Build performant Snowflake objects (tables, streams, tasks, materialized views) and optimize clustering/micro-partitioning.
  • Orchestration Author and operate Airflow (MWAA) DAGs and/or dbt Cloud jobs; design idempotent, rerunnable, lineage-tracked workflows with SLAs/SLOs.
  • Security & Governance Enforce RBAC/ABAC, network policies/rules, masking/row access policies, tags, data classification, and least-privilege role hierarchies.
  • Operationalize audit-ready controls (change management, approvals, runbooks, separation of duties, evidence capture).
  • IaC & DevOps Use CI/CD flows, Terraform, Git branching for code promotion.
  • Data Quality & Observability Bake tests into dbt; implement contract checks, reconciliations, and anomaly alerts.
  • Monitor with Snowflake ACCOUNT_USAGE/INFORMATION_SCHEMA, event tables, and forward logs/metrics to SIEM/APM (e.g., Splunk, Datadog).
  • Cost & Performance Right-size warehouses, configure auto-suspend/auto-resume, multi-cluster for concurrency, resource monitors, and query optimization.
  • Compliance Build controls and evidence to satisfy internal audit, SOX/GLBA/FFIEC/PCI-like expectations.
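
For the cost and performance responsibility, a typical warehouse and resource-monitor configuration might look like the following (sizes, quotas, and names are illustrative assumptions, not requirements from the posting):

    -- Suspend quickly when idle; scale out rather than up for concurrency.
    CREATE WAREHOUSE transform_wh
        WAREHOUSE_SIZE = 'MEDIUM'
        AUTO_SUSPEND = 60            -- seconds of inactivity before suspending
        AUTO_RESUME = TRUE
        MIN_CLUSTER_COUNT = 1
        MAX_CLUSTER_COUNT = 3        -- multi-cluster absorbs concurrency spikes
        SCALING_POLICY = 'STANDARD';

    -- Cap monthly spend and cut the warehouse off at the limit.
    CREATE RESOURCE MONITOR transform_rm WITH
        CREDIT_QUOTA = 500
        FREQUENCY = MONTHLY
        START_TIMESTAMP = IMMEDIATELY
        TRIGGERS
            ON 80 PERCENT DO NOTIFY
            ON 100 PERCENT DO SUSPEND_IMMEDIATE;

    ALTER WAREHOUSE transform_wh SET RESOURCE_MONITOR = transform_rm;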
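
And here is a sample ACCOUNT_USAGE query of the kind the monitoring responsibility implies, flagging slow or spilling queries before forwarding results to a SIEM/APM; the thresholds are assumptions:

    -- Flag long-running or heavily spilling queries over the past day.
    SELECT query_id,
           user_name,
           warehouse_name,
           total_elapsed_time / 1000 AS elapsed_s,
           bytes_spilled_to_remote_storage
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
      AND (total_elapsed_time > 600000            -- more than 10 minutes
           OR bytes_spilled_to_remote_storage > 0)
    ORDER BY total_elapsed_time DESC;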