Sr. Data Engineer

Come and See · Colorado Springs, CO
Onsite

About The Position

The Senior Data Engineer / Data Architect is responsible for designing, building, and governing the data infrastructure that powers Come and See Foundation’s analytics and business intelligence capabilities. This senior technical leader owns the architecture of our Snowflake-based data platform, ensures data flows reliably from source to insight, and partners with stakeholders across the organization to build a scalable, secure, and well-governed data ecosystem. Snowflake is the core of our stack, serving as both the data warehouse and the primary transformation layer. The ideal candidate is a seasoned engineer with deep Snowflake expertise, strong architectural instincts, and a passion for using technology in service of a mission-driven organization.

Requirements

  • Expert-level proficiency in Snowflake, including architecture, performance tuning, Snowpark, native transformation features (tasks, streams, dynamic tables), data sharing, and cost management
  • Advanced SQL skills with experience in complex data modeling, warehouse design, and query optimization at scale
  • Strong programming skills in Python for pipeline development, automation, and data engineering tasks
  • Experience with Airflow or similar orchestration tools for pipeline scheduling and management
  • Proficiency with cloud platforms (AWS, Azure, or GCP) and cloud-native data services
  • Experience integrating with CRM platforms such as Virtuous or HubSpot
  • Familiarity with digital fundraising platforms such as FundraiseUp
  • Experience with RudderStack or other customer data platforms (CDPs) for event stream ingestion
  • Experience with Google Analytics data ingestion and integration
  • Experience with Git/GitHub for version control and collaborative development
  • Ability to work within an Agile/Scrum environment with 2-week sprint cycles
  • Architectural Thinking: Designs solutions that are scalable, maintainable, and aligned with long-term organizational goals
  • Analytical Thinking: Methodical and diligent with outstanding planning and problem-solving capabilities
  • Communication: Clearly communicates technical concepts to both technical and non-technical audiences
  • Collaboration: Partners effectively across functional teams including IT, analytics, programs, and executive leadership
  • Data Evangelist: Enthusiastic about data and committed to building a data-driven culture
  • Integrity: Handles sensitive donor and constituent data with the highest standards of ethics and transparency
  • Independence: Self-directed and capable of leading technical work with minimal supervision
  • High Standards: Holds self and team to excellence in code quality, documentation, and architecture
  • Adaptability: Thrives in a dynamic environment and adjusts quickly to evolving priorities and technology
  • Bachelor’s degree in Computer Science, Data Science, Software Engineering, or a related technical field; advanced degree preferred
  • Minimum 5-7 years of experience in data engineering, data architecture, or a related role
  • Minimum 2-3 years of hands-on Snowflake experience in a production environment, including native transformation capabilities
  • Demonstrated experience designing and delivering enterprise-grade data platforms at scale
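The requirements above emphasize Snowflake's native incremental transformation features (streams, tasks, dynamic tables). As a rough, self-contained illustration of the pattern those features automate — capturing changed rows and merging them into a target table — here is a minimal Python sketch. The table shapes, field names, and change-capture logic are hypothetical stand-ins, not Snowflake APIs; in Snowflake this would be a `MERGE` statement driven by a stream and scheduled by a task.

```python
# Sketch of the change-capture-and-merge pattern behind Snowflake
# streams + tasks. "stream_rows" simulates the output of a stream:
# only rows changed since the last task run, including deletes.

def merge_changes(target: dict, changes: list[dict]) -> dict:
    """Upsert changed rows (keyed by 'id') into the target table."""
    for row in changes:
        if row.get("_deleted"):
            target.pop(row["id"], None)  # propagate deletes
        else:
            # Drop metadata fields (leading underscore) before loading.
            target[row["id"]] = {k: v for k, v in row.items()
                                 if not k.startswith("_")}
    return target

stream_rows = [
    {"id": 1, "donor": "A. Smith", "total_gifts": 250},
    {"id": 2, "donor": "B. Jones", "total_gifts": 90},
    {"id": 1, "donor": "A. Smith", "total_gifts": 300},  # later update wins
    {"id": 3, "_deleted": True},
]

donors = {3: {"id": 3, "donor": "C. Lee", "total_gifts": 40}}
donors = merge_changes(donors, stream_rows)
# donors now holds ids 1 and 2; id 3 has been deleted.
```

The point of the native approach the posting describes is that Snowflake maintains the change set and runs the merge on a schedule, so no external transformation tool has to replicate this bookkeeping.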

Nice To Haves

  • Experience with containerization technologies (Docker, Kubernetes)
  • Familiarity with Tableau or other BI tools, with the ability to support the analytics team's data requirements
  • Working knowledge of machine learning workflows and MLOps, including experience supporting data science teams (XGBoost, LightGBM, scikit-learn)
  • Experience managing large-scale data migrations or platform transitions
  • Exposure to data governance frameworks and data cataloging tools
  • Experience in nonprofit, faith-based, or mission-driven organizations

Responsibilities

Data Architecture & Platform Engineering

  • Design, implement, and continuously improve the organization’s Snowflake data warehouse architecture, including schema design, data modeling, performance optimization, and native transformation workflows
  • Own Snowflake as the primary transformation layer, leveraging native features (Snowpark, tasks, streams, dynamic tables) in place of external transformation tooling
  • Establish data architecture standards, patterns, and best practices across the data platform
  • Lead the evaluation and selection of new data technologies and tools that align with organizational strategy
  • Architect and maintain scalable, fault-tolerant data pipelines that reliably process high volumes of structured and unstructured data
  • Design and implement data models (dimensional, relational, or data vault approaches as appropriate) that support BI and analytics use cases at scale

Data Pipeline & ELT Development

  • Build, optimize, and maintain robust ELT pipelines to integrate data from Virtuous, FundraiseUp, HubSpot, RudderStack, Google Analytics, and other sources into Snowflake
  • Develop and maintain data ingestion frameworks using Python and SQL
  • Implement Airflow-based orchestration for pipeline scheduling, monitoring, alerting, and failure recovery
  • Manage platform migrations and technology transitions with minimal disruption to downstream analytics
  • Implement automation for routine data engineering tasks to improve reliability and reduce manual overhead

Cloud Infrastructure & Integration

  • Manage and optimize cloud data infrastructure on Snowflake, ensuring cost-efficiency, performance, and scalability
  • Develop and maintain APIs for data access and interoperability across organizational systems
  • Implement and manage RudderStack event streaming pipelines, ensuring clean, reliable event data flows into Snowflake
  • Maintain integrations with CRM platforms (Virtuous, HubSpot), fundraising tools (FundraiseUp), and analytics platforms (Google Analytics, Tableau)
  • Collaborate with IT and engineering partners to define data access protocols, authentication, and authorization patterns
  • Evaluate and implement containerization (Docker, Kubernetes) strategies for deploying and scaling data engineering applications where appropriate

Data Governance & Quality

  • Implement and enforce data governance policies, including data classification, lineage tracking, and stewardship across the data lifecycle
  • Lead data quality assurance efforts, establishing validation frameworks and automated quality checks within pipelines
  • Enforce role-based access controls and data security protocols to protect sensitive donor and constituent data
  • Ensure compliance with applicable data privacy regulations (GDPR, CCPA) in all data handling practices
  • Maintain thorough documentation for data models, pipeline architecture, and operational processes in Git/GitHub

BI Enablement & Stakeholder Partnership

  • Partner closely with Data Analysts and Data Scientists to ensure the Snowflake platform meets analytical requirements and enables high-quality Tableau reporting
  • Collaborate with the Director of Data & BI/Analytics, COO, and other senior leaders to align data infrastructure with organizational and ministry strategy
  • Serve as a key liaison between technical data teams and business stakeholders, translating requirements into sound engineering solutions
  • Support the Customer 360 intelligence platform, including donor scoring models, self-serve dashboards, and AI-powered analytics infrastructure
  • Provide technical mentorship to junior data engineering staff and foster a culture of engineering excellence
  • Evangelize data best practices and data literacy across the organization

Project Management & Continuous Improvement

  • Manage data engineering projects from design through delivery using Scrum methodology and 2-week sprint cycles
  • Continuously evaluate and improve data engineering processes, adopting emerging best practices and Snowflake capabilities
  • Identify cost optimization opportunities within the Snowflake platform and implement efficiency improvements
  • Maintain version control discipline using Git/GitHub across all pipeline and transformation code
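Among the responsibilities above is establishing validation frameworks and automated quality checks within pipelines. A minimal, stdlib-only sketch of that idea follows; the rule names and fields are illustrative, not drawn from any specific platform. The design choice shown — collecting failures per row instead of raising on the first bad record — lets a pipeline step quarantine bad records and alert, rather than abort the load or pass bad data downstream silently.

```python
# Illustrative data-quality gate: each rule is a predicate over a row.
RULES = {
    "id_present":     lambda r: r.get("id") is not None,
    "amount_numeric": lambda r: isinstance(r.get("amount"), (int, float)),
    "amount_nonneg":  lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0,
}

def validate(rows):
    """Split rows into (clean, failures); failures record which rules broke."""
    clean, failures = [], []
    for row in rows:
        broken = [name for name, rule in RULES.items() if not rule(row)]
        if broken:
            failures.append({"row": row, "failed_rules": broken})
        else:
            clean.append(row)
    return clean, failures

batch = [
    {"id": 1, "amount": 100.0},
    {"id": None, "amount": 50},   # missing id
    {"id": 2, "amount": -5},      # negative amount
]
clean, failures = validate(batch)
# clean keeps only the first row; the other two are quarantined
# with the names of the rules they violated.
```

In a production pipeline of the kind described here, the quarantined rows and rule names would feed the monitoring and alerting that the Airflow orchestration responsibility calls for.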