10+ years of IT experience

Required:
- Databricks (data engineering, SQL, notebooks, app/backend integration patterns)
- Enterprise data warehouses and lakehouse platforms (nice to have: Fabric, Snowflake, Synapse, or BigQuery experience)
- Deep knowledge of data modeling, data warehousing, and lakehouse patterns
- Hands-on experience with at least one modern data platform
- Experience integrating data platforms with applications using APIs, services, or event-driven patterns
- Solid understanding of cloud architecture concepts (security, networking, scalability, cost management)
- Strong communication skills with the ability to engage both technical and business stakeholders
- Experience working in client-facing or consulting environments
- Bachelor's Degree in Computer Science, Information Technology, Engineering, Business, or a related field

AND:
- Deep hands-on experience with the Databricks Lakehouse Platform and Apache Spark
- Strong proficiency in PySpark, SQL, Delta Lake, and Databricks SQL
- Experience implementing Unity Catalog, data governance, and enterprise security controls
- Cloud experience with Azure Databricks, AWS Databricks, or GCP Databricks
- Familiarity with CI/CD, DevOps, and automation for Databricks workloads
Job Type
Full-time
Career Level
Senior