Data Analytics Engineer

Veteran Benefits Guide | Enterprise, NV

About The Position

What is VBG: Veteran Benefits Guide has been proud to serve our nation’s service members for more than 10 years. Founded by a U.S. Marine Corps Veteran, VBG assists Veterans through the challenging VA claims process to efficiently secure their hard-earned benefits. Now operating with more than 225 team members nationwide, VBG has helped over 55,000 Veterans through the VA claims process. The company is dedicated to honoring service and supporting the Veteran community through ongoing advocacy, community partnerships, and meaningful opportunities within its workforce.

Who we’re looking for: The Data Analytics Engineer is responsible for transforming raw and staged data into trusted, well-modeled, analytics-ready datasets that power reporting, dashboards, and data-driven decision-making across the organization. This role bridges the gap between engineering and analysis, ensuring data is clean, consistent, connected, and optimized for use by Analysts, BI Developers, and business teams. You will work closely with Data Engineers (who ingest data), BI Developers (who build dashboards), and Analysts (who generate insights) to build the semantic layer of the warehouse. You will own data modeling, cleansing, deduplication, and the construction of unified datasets that bring together information from systems such as Salesforce, NetSuite, Google, and internal applications.

This position is open to candidates located in the following states: Arizona (AZ), Washington (WA), Nevada (NV), Utah (UT), Illinois (IL), Ohio (OH), New Jersey (NJ), Virginia (VA), North Carolina (NC), and Florida (FL).

Requirements

  • Advanced SQL skills (window functions, CTEs, performance tuning).
  • Experience with transformation frameworks (dbt strongly preferred).
  • Strong understanding of data warehousing concepts: star schema, snowflake schema, fact/dimension modeling.
  • Familiarity with cloud warehouses (Snowflake, BigQuery, Redshift, Synapse).
  • Ability to troubleshoot mismatched metrics, broken joins, or duplicated data.
  • Bachelor’s degree in Data Analytics, Computer Science, Information Systems, or related field.
  • 3–5+ years of experience in analytics engineering, BI development, or data modeling.

Nice To Haves

  • Snowflake SnowPro Core Certification
  • Snowflake SnowPro Advanced Data Engineer Certification
  • dbt Analytics Engineer Certification
  • AWS Data Engineer – Associate Certification
  • AWS Solutions Architect – Associate Certification
  • Experience with Python or R for data validation or automation scripts.
  • Knowledge of BI tools (Power BI, Tableau, Looker) and how they interact with semantic layers.
  • Familiarity with CI/CD for analytics code and version control (Git).
  • Exposure to data governance, cataloging, and documentation tools.

Responsibilities

  • Build, maintain, and optimize curated data models using SQL, dbt, or similar transformation tools.
  • Create dimensional models (fact/dimension) and semantic layers to support reporting and advanced analytics.
  • Construct unified datasets that bring together cross-system information (e.g., Salesforce, NetSuite, Google Ads).
  • Profile, validate, and cleanse data to eliminate duplicates, missing fields, and inconsistencies.
  • Implement automated data tests to ensure accuracy, completeness, and referential integrity.
  • Investigate and resolve issues flagged by Analysts when metrics do not match or data looks incorrect.
  • Partner with DBAs and Data Engineers to ensure performant warehouse structures and optimized queries.
  • Adhere to and help define data governance, documentation standards, and semantic layer best practices.
  • Maintain version-controlled analytics codebases using Git or similar workflows.
  • Work closely with Analysts to understand their data needs and translate them into robust models.
  • Support BI Developers by providing clean, reliable datasets that power dashboards and reports.
  • Communicate issues, improvements, and data model changes clearly to technical and non-technical audiences.