Senior Data Architect

Datavail Infotech
Boulder, CO
Posted 4 days ago

About The Position

Datavail Infotech is seeking a Senior Data Architect with deep, hands-on expertise in the Databricks Lakehouse Platform and Apache Spark to design enterprise data warehouse and lakehouse solutions for clients in a consulting environment. Full qualifications are listed below.

Requirements

  • 10+ years of IT experience
  • Databricks (data engineering, SQL, notebooks, app/backend integration patterns)
  • Deep knowledge of data modeling, data warehousing, and lakehouse patterns
  • Hands‑on experience with at least one modern data platform
  • Experience integrating data platforms with applications using APIs, services, or event‑driven patterns
  • Solid understanding of cloud architecture concepts (security, networking, scalability, cost management)
  • Strong communication skills with the ability to engage both technical and business stakeholders
  • Experience working in client‑facing or consulting environments
  • Bachelor's Degree in Computer Science, Information Technology, Engineering, Business, or related field
  • Deep hands-on experience with the Databricks Lakehouse Platform and Apache Spark
  • Strong proficiency in PySpark, SQL, Delta Lake, and Databricks SQL
  • Experience implementing Unity Catalog, data governance, and enterprise security controls
  • Cloud experience with Azure Databricks, AWS Databricks, or GCP Databricks
  • Familiarity with CI/CD, DevOps, and automation for Databricks workloads

Nice To Haves

  • Experience with enterprise data warehouse and lakehouse platforms such as Microsoft Fabric, Snowflake, Azure Synapse, or BigQuery