Senior Data Engineer

Lennar, Bentonville, AR

About The Position

We are Lennar

Lennar is one of the nation's leading homebuilders, dedicated to making an impact and creating an extraordinary experience for its Homeowners, Communities, and Associates by building quality homes, providing exceptional customer service, giving back to the communities in which we work and live, and fostering a culture of opportunity and growth for our Associates throughout their careers. Lennar has been recognized as a Fortune 500® company and is consistently ranked among the top homebuilders in the United States.

Join a Company that Empowers You to Build Your Future

Lennar is seeking a Senior Data Engineer to help lead the design, build, and long-term ownership of our enterprise data platform. This role sits at the center of a major infrastructure transformation, the migration of our core data platform to AWS and Snowflake, and will be critical to ensuring that the work is delivered with engineering rigor, operational resilience, and clear knowledge transfer back to the internal team.

The ideal candidate is a deeply technical, hands-on engineer who thrives in complex cloud environments. They bring production experience with AWS data services, dbt, Python, and Snowflake, and they know how to build pipelines and transformation layers that are not just functional but maintainable, observable, and built to scale. They can work alongside external partners without being dependent on them, and they take pride in owning the quality of what they ship.

You'll join a high-performing Data & Analytics team operating at the intersection of real estate, operations, and AI, building the data foundation that powers pricing models, operational intelligence tools, and strategic decisions across 40+ divisions of one of the nation's largest homebuilders.

A career with purpose. A career built on making dreams come true. A career built on building zero-defect homes, cost management, and adherence to schedules.

Requirements

  • Bachelor’s degree or higher in Computer Science, Engineering, or a related technical field.
  • 7+ years of data engineering experience, including meaningful production ownership of cloud-native pipelines in AWS environments.
  • Deep hands-on experience with AWS data services (S3, Glue, Lambda, ECS, Step Functions, IAM) and Snowflake as a primary data warehouse.
  • Strong Python skills with a track record of writing modular, well-tested, and production-ready data engineering code (a short illustrative sketch follows this list).
  • Fluency with dbt—including model structuring, testing, documentation, and managing dbt projects at scale in a team environment.
  • Strong SQL skills and understanding of data warehouse design principles including dimensional modeling, layered transformation patterns (raw/staging/mart), and performance optimization.
  • Proven ability to collaborate across technical and non-technical teams—communicating clearly, translating business requirements into engineering decisions, and building trust with stakeholders at multiple levels.
  • Comfortable operating with autonomy in ambiguous environments—scoping work, setting realistic timelines, and raising blockers proactively without waiting to be asked.
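
For illustration only, here is a minimal sketch of the kind of modular, testable Python code the requirements above describe. The record fields, the "drop unclosed sales" rule, and the function names are hypothetical, not Lennar's actual schema or codebase.

    # Minimal sketch of modular, testable transformation code.
    # Field names and the "drop unclosed sales" rule are hypothetical.
    from __future__ import annotations

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional


    @dataclass(frozen=True)
    class SaleRecord:
        community_id: str
        list_price: float
        closed_on: Optional[date]


    def to_mart_row(raw: SaleRecord) -> Optional[dict]:
        """Turn a raw record into a mart-ready dict, dropping unclosed or invalid sales.

        Keeping the logic pure (no I/O) makes it easy to unit-test and to reuse
        from a Glue job, a Lambda handler, or an orchestration task.
        """
        if raw.closed_on is None or raw.list_price <= 0:
            return None
        return {
            "community_id": raw.community_id,
            "list_price_usd": round(raw.list_price, 2),
            "closed_month": raw.closed_on.strftime("%Y-%m"),
        }


    def test_drops_unclosed_sales() -> None:
        assert to_mart_row(SaleRecord("C-001", 350_000.0, None)) is None


    def test_formats_closed_month() -> None:
        row = to_mart_row(SaleRecord("C-001", 350_000.0, date(2024, 5, 17)))
        assert row["closed_month"] == "2024-05"

Tests like these run under plain pytest, which is one common way the "well-tested, production-ready" expectation is met in practice.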

Nice To Haves

  • Experience with Terraform or other infrastructure-as-code tools.
  • Familiarity with orchestration platforms such as Airflow or Prefect (a minimal DAG sketch follows this list).
  • Prior work supporting platform migrations alongside external implementation partners.
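
For illustration only, a minimal Airflow 2.x DAG sketch of the kind of orchestration referenced above. The DAG id, schedule, and task callables are hypothetical placeholders, not part of this role's actual pipelines.

    # Minimal Airflow 2.x DAG sketch; dag_id, schedule, and callables are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_to_s3(**context) -> None:
        """Placeholder: land source extracts in S3 (e.g. via boto3)."""


    def load_to_snowflake(**context) -> None:
        """Placeholder: copy the landed files into Snowflake staging tables."""


    with DAG(
        dag_id="daily_sales_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # older Airflow versions use schedule_interval instead
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
        load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

        extract >> load  # run the load only after extraction succeeds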

Responsibilities

  • Architect and build scalable, production-grade data pipelines on AWS (S3, Glue, Lambda, ECS, MWAA) integrated with Snowflake as the core data warehouse.
  • Own and evolve the dbt transformation layer—writing well-structured, tested, and documented models that serve analysts, data scientists, and operational reporting consumers across the business.
  • Write clean, modular Python for data ingestion, transformation logic, and orchestration—applying software engineering best practices including testing, versioning, and code review.
  • Partner closely with an external implementation vendor during a multi-phase platform migration, ensuring technical decisions align with long-term internal ownership goals and that knowledge transfers effectively to the team.
  • Build and maintain data quality frameworks, pipeline observability, and alerting systems that give the team confidence in production data across critical domains including pricing, supply chain, and sales operations (see the sketch after this list for one minimal example).
  • Collaborate across Data Engineering, Data Science, Analytics Engineering, and business stakeholders to translate complex requirements into reliable, well-tested data products.
  • Contribute to infrastructure-as-code practices using Terraform to provision and manage cloud resources in a repeatable, auditable way (experience a plus; willingness to learn required).
  • Communicate clearly with both technical and non-technical audiences—documenting systems, setting expectations on delivery, and flagging risks early without overcommitting.
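
For illustration only, a hedged sketch of one simple data-quality gate of the sort described above. The injectable run_query callable, the table and column names, and the threshold are assumptions, not the team's actual framework.

    # Hedged sketch of a lightweight data-quality gate with alert-style logging.
    # run_query, table/column names, and the threshold are hypothetical.
    import logging
    from typing import Callable, Sequence, Tuple

    logger = logging.getLogger("pipeline.quality")


    class DataQualityError(Exception):
        """Raised when a check should block downstream consumers."""


    def check_null_rate(
        run_query: Callable[[str], Sequence[Tuple[int, int]]],
        table: str,
        column: str,
        max_null_rate: float = 0.01,
    ) -> float:
        """Fail the pipeline if too many rows in table.column are NULL.

        run_query is any callable that executes SQL and returns rows, e.g. a thin
        wrapper around a Snowflake cursor; injecting it keeps the check easy to
        unit-test with a fake.
        """
        sql = (
            f"SELECT COUNT(*), "
            f"SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) "
            f"FROM {table}"
        )
        total, nulls = run_query(sql)[0]
        null_rate = (nulls or 0) / total if total else 0.0
        if null_rate > max_null_rate:
            logger.error("Null-rate check failed for %s.%s: %.2f%%", table, column, null_rate * 100)
            raise DataQualityError(
                f"{table}.{column} null rate {null_rate:.2%} exceeds {max_null_rate:.2%}"
            )
        logger.info("Null-rate check passed for %s.%s: %.2f%%", table, column, null_rate * 100)
        return null_rate

A check like this could be invoked from an MWAA/Airflow task or a Step Functions state after each load, with failures routed to whatever alerting channel the team standardizes on.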

Benefits

  • Robust health insurance plans, including Medical, Dental, and Vision coverage
  • 401(k) Retirement Plan, complete with a $1 for $1 Company Match up to 5%
  • Paid Parental Leave
  • Associate Assistance Plan
  • Education Assistance Program
  • Up to $30,000 in Adoption Assistance
  • Up to three weeks of vacation annually, alongside generous Holiday, Sick Leave, and Personal Day policies
  • New Hire Referral Bonus Program
  • Significant Home Purchase Discounts
  • Everyone’s Included Day