Senior Data Engineer - Labor Management

Publix Super Markets, Lakeland, FL
Hybrid

About The Position

Publix Super Markets, Inc., the largest employee-owned company in the U.S., is driven by a dynamic technology team of 2,300+ professionals delivering innovative solutions to 1,400+ stores and 200,000+ associates across 8 states. From IT security and platform engineering to architecture, software development, and infrastructure, we offer career opportunities at every level, from internships through technical leadership. Join a company consistently ranked among Fortune’s “100 Best Companies to Work For” and help us build more than great subs: build the future of technology at Publix.

Become part of the Labor Management team, designing, building, and delivering a robust application that supports Publix business areas in managing labor activities such as forecasting, time and attendance, and scheduling. In this role, you will own data processing, validation, and transformation; build and maintain scalable data pipelines; and manage Azure and on-prem data services.

Location: Lakeland, FL
Work Model: Hybrid. Enjoy the best of both worlds: collaborate in person and innovate remotely.

Requirements

  • Bachelor’s degree in Management Information Systems, Computer Science, Business, or another analytical discipline, or equivalent experience
  • 5+ years of experience in application design, development, and delivery in an enterprise environment
  • 5+ years of experience developing large-scale data manipulation processes
  • 5+ years of experience working with relational databases
  • 5+ years of experience implementing large-scale enterprise applications
  • 5+ years of experience analyzing complex, enterprise business problems or processes and translating business requirements into technology solutions that factor in system performance, usability, quality, cross-system interdependencies, scalability, and total cost of ownership
  • 2+ years of experience developing in Python on Databricks
  • Proficiency in writing complex queries in ANSI-compliant SQL against an enterprise relational database management system
  • Experience planning and managing all activities associated with delivering solutions in a large-scale distributed environment
  • 7+ years of experience in application design, development, and delivery in an enterprise environment
  • 7+ years of experience developing large-scale data manipulation processes
  • 7+ years of experience working with relational databases
  • 7+ years of experience implementing large-scale enterprise applications
  • 7+ years of experience analyzing complex, enterprise business problems or processes and translating business requirements into technology solutions that factor in system performance, usability, quality, cross-system interdependencies, scalability, and total cost of ownership

Nice To Haves

  • Experience implementing enterprise applications using platform services such as Azure App Service, Azure SQL, Azure Service Bus, Notification Hubs, Event Hubs, microservices, Stream Analytics, Snowflake, Redis Cache, OpenAPI, IoT Hub, and Application Insights
  • Experience with Azure DevOps for managing project tasks and code repositories and for creating YAML pipelines for CI/CD builds and deployments
  • Azure AZ-305 certification or equivalent experience in designing/architecting Azure solutions
  • Experience with C# and/or Python development
  • Experience with Unity Catalog, Data/Delta Lake
  • Experience with advanced ML specializations (AutoML, forecasting, ensemble methods)

Responsibilities

  • Design and implement secure, scalable ingestion frameworks to migrate data from on-prem SQL Server databases into Azure Data Lake using standardized patterns
  • Develop and maintain reusable transformation frameworks (e.g. medallion architecture: bronze/silver/gold) to standardize data modeling, improve data reliability, and accelerate downstream analytics and reporting
  • Establish data governance, security, and compliance controls across the lakehouse, including data classification, access controls, lineage, and auditing to ensure enterprise and regulatory standards are met
  • Design, implement, and monitor data quality events to detect, alert, and resolve data anomalies across pipelines and platforms
  • Provide hands-on development, deployment, maintenance, and optimization of data pipelines and workflows using Databricks.
  • Collaborate with technical teams and other cross-functional stakeholders to understand requirements, design solutions, and implement them on the Databricks platform.
  • Optimize and tune Databricks jobs for performance and scalability.
  • Assist technical team through problem determination and resolution on highly complex problems.
  • Maintain strong analytical, planning, problem-solving, writing, and presentation skills.
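As an illustration of the medallion (bronze/silver/gold) pattern named in the responsibilities above, here is a minimal Python sketch. The record shapes, field names, and cleaning rules are hypothetical; in practice these layers would be Spark tables on Databricks, not in-memory lists.

```python
# Toy medallion flow: bronze (raw) -> silver (cleaned/typed) -> gold (aggregated).
from collections import defaultdict

def to_silver(bronze_rows):
    """Bronze -> silver: drop malformed records and normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("store_id") is None or row.get("hours") is None:
            continue  # quarantine malformed input instead of failing the run
        silver.append({
            "store_id": int(row["store_id"]),
            "dept": str(row.get("dept", "unknown")).lower(),
            "hours": float(row["hours"]),
        })
    return silver

def to_gold(silver_rows):
    """Silver -> gold: total labor hours per store, ready for reporting."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["store_id"]] += row["hours"]
    return dict(totals)

bronze = [
    {"store_id": 101, "dept": "Deli", "hours": 7.5},
    {"store_id": 101, "dept": "Bakery", "hours": 4.0},
    {"store_id": None, "dept": "Deli", "hours": 8.0},  # malformed: dropped
]
print(to_gold(to_silver(bronze)))  # {101: 11.5}
```

The design point is that each layer has one job: silver enforces schema and types, gold serves downstream analytics.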
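The data-quality duty above (detect, alert, and resolve anomalies) can be sketched as threshold checks over batch metrics that emit alert events. The metric names and thresholds here are illustrative assumptions, not Publix standards.

```python
# Hypothetical data-quality gate for one pipeline batch: compare observed
# metrics against thresholds and return alert events for any anomalies.

def check_batch(metrics, max_null_rate=0.05, min_row_ratio=0.5):
    """Return a list of alert events for a pipeline batch."""
    events = []
    if metrics["null_rate"] > max_null_rate:
        # Too many nulls in key columns suggests an upstream schema drift.
        events.append({"type": "NULL_RATE_HIGH", "value": metrics["null_rate"]})
    if metrics["row_count"] < min_row_ratio * metrics["expected_row_count"]:
        # A sharp row-count drop suggests a partial or failed extract.
        events.append({"type": "ROW_COUNT_DROP", "value": metrics["row_count"]})
    return events

alerts = check_batch({"null_rate": 0.12, "row_count": 900, "expected_row_count": 1000})
print([e["type"] for e in alerts])  # ['NULL_RATE_HIGH']
```

In a real pipeline, the returned events would be routed to monitoring and alerting rather than printed.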

Benefits

  • Hybrid Flexibility: Work remotely when you need focus time and join us onsite for high-impact collaboration and brainstorming sessions.
  • Operational Efficiency: Ensure technology solutions support efficient workflows and enable automation to improve operational effectiveness.
  • Cutting-Edge AI Projects: Drive innovation in AI platforms, integrating advanced tools and frameworks to solve complex business challenges.
  • Empowered Culture: We value autonomy, creativity, and continuous learning—your ideas shape the future of technology in our organization.