Data Engineer II

Tailored Brands · Dublin, CA
Posted 10 days ago · $133,494 - $168,870 · Hybrid

About The Position

At Tailored Brands, we help people love the way they look and feel for their most important moments. Our Technology team loves the way they feel and thrives at work, with:

  • Flexible work opportunities, including remote and hybrid options
  • Small, empowered teams that have fun delivering real value for our customers
  • A culture that values a 50-year legacy while eagerly embracing the future

Want to be part of this? We currently have an exciting opportunity for a Data Engineer II to join our Tailored Technology team. This individual will assist with the design, development, and maintenance of data infrastructure and pipelines within the organization. These professionals are responsible for ensuring the reliability, scalability, and efficiency of data systems, as well as optimizing data workflows for performance and usability. They often collaborate with cross-functional teams to understand data requirements, architect solutions, and implement best practices for data management, processing, and storage. They also play a crucial role in identifying opportunities for automation, streamlining processes, and implementing cutting-edge technologies to drive innovation in data-driven decision-making. Additionally, they may mentor and provide technical guidance to junior team members, participate in strategic planning, and contribute to the overall data strategy of the organization.

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in a data engineering or platform role, ideally working with large, complex datasets; retail industry experience a plus.
  • Strong understanding of software engineering best practices, including agile development, coding standards, code reviews, testing, and operational support.
  • Hands-on experience designing, building, and enhancing cloud-based data warehouse and ETL/ELT solutions, including integration with Application Programming Interfaces (APIs) and external data sources.
  • Proven ability to develop scalable, accurate, and timely data pipelines as data volume and complexity grow.
  • Proficiency in engineering and automating batch and streaming data pipelines, including data mappings and transformations.
  • Strong experience with cloud platforms (AWS and GCP), SQL, and scripting/programming languages such as Python, PySpark, and Bash.
  • Familiarity with data modeling, data warehousing environments, and ETL/ELT tools such as Fivetran, Stitch, or similar platforms.
  • Experience writing complex SQL queries and optimizing performance in Snowflake and relational databases such as Oracle or MySQL.
  • Knowledge of CI/CD practices and tools (e.g., GitHub, Jenkins, Kubernetes, Ansible) for production deployments.
  • Ability to independently troubleshoot and resolve complex production and data quality issues.
  • Experience supporting analysts and data scientists with query optimization, performance tuning, and data processing.
  • Strong communication and interpersonal skills, with the ability to collaborate effectively across technical and non-technical teams.
  • A proactive mindset with the ability to identify improvement opportunities and present recommendations to leadership.

Nice To Haves

  • Familiarity with workflow orchestration and scheduling tools such as Airflow or Automic UC4 is a plus.
  • Exposure to business intelligence tools such as Tableau or MicroStrategy is advantageous.

Responsibilities

  • Design, assess, and support efficient, reliable data pipelines that move data across enterprise systems including Snowflake, Oracle, AS400, SQL Server, Salesforce, and external partner platforms using APIs, AWS S3, and GCP buckets.
  • Enhance and scale cloud-based data warehousing and ETL capabilities to automate large-scale data ingestion and transformation across diverse data sources.
  • Identify opportunities to automate, optimize, and modernize existing data processes to improve performance, reliability, and productivity.
  • Own the solutions and designs you build, ensuring they meet operational standards, performance expectations, and long-term scalability.
  • Contribute to a strong data operations environment by adhering to SLAs, improving data quality, and implementing monitoring and automation best practices.
  • Collaborate closely with data scientists, analytics teams, and business partners to deliver data solutions that support reporting, analytics, and predictive use cases.
  • Mentor and coach junior engineers by providing technical guidance, design feedback, and supporting production operations.
  • Adapt quickly to changing business priorities and data needs while meeting delivery timelines.

Benefits

  • This role is eligible for healthcare (medical, dental, and vision); retirement savings (401(k) with a company match); income protection programs such as life, accident, and disability insurance; paid time off for sick leave, vacation, bereavement, jury duty, and holidays; a wellbeing program; commuter, adoption assistance, education assistance, and legal services; and employee merchandise discounts. For more detailed information, go to mytbtotalrewards.com.
  • Meeting-Free Fridays (encouraged) | so you can catch up on work and self-development
  • Summer Fridays | from Memorial Day to Labor Day so you can enjoy a head-start to the weekend
  • Holiday Early Departure | close out early the business day before a company observed holiday