Sr. Developer - Snowflake and Python

Cognizant, Wilmington, DE
Hybrid

About The Position

As a Senior Developer – Snowflake & Python, you will make an impact by designing, developing, and maintaining high‑quality data pipelines that support business‑critical analytics and data operations. You will be a valued member of the data engineering team and will work collaboratively with cross‑functional stakeholders across technology, analytics, and product teams. Your day‑to‑day duties are detailed under Responsibilities below.

Work Model

We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role’s business requirements, this is a hybrid position requiring 2–3 days per week in a client or Cognizant office in Wilmington, DE. Regardless of your working arrangement, we support a healthy work‑life balance through our wellbeing programs. The working arrangements for this role are accurate as of the date of posting and may change based on project, business, or client requirements. We will always communicate role expectations clearly.

Requirements

  • 9–12 years of experience in data engineering or software development.
  • Strong expertise in Snowflake data warehousing, including Tasks, Snowpark, Streams, SnowSQL, and Snowpipe.
  • Proven hands‑on experience developing applications and ETL processes using Python.
  • Proficiency using GitLab for version control and code management.
  • Excellent problem‑solving skills and attention to detail.
  • Ability to work both independently and collaboratively in a fast‑paced environment.
  • Strong communication and stakeholder‑management skills.

Nice To Haves

  • Knowledge of Informatica.
  • Experience with additional data‑warehousing solutions or ETL tools.
  • Familiarity with Agile development methodologies.
  • Experience contributing to continuous improvement and optimization in data engineering environments.

Responsibilities

  • Develop, test, and maintain ETL pipelines using Snowflake, Python, and the M&T framework (Data Butler).
  • Design and implement scalable data pipeline solutions leveraging Snowflake Tasks, Snowpark, Streams, SnowSQL, and Snowpipe.
  • Manage code repositories and perform version control and code check‑ins using GitLab.
  • Collaborate with teams to gather requirements and deliver high‑quality data solutions.
  • Troubleshoot and resolve pipeline, application, and data‑warehouse issues while performing validation testing to ensure data integrity.

Benefits

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan and contributions
  • Short-term and Long-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan