AI Systems Analyst III

CareSource
Dayton, OH

About The Position

The AI Systems Analyst III provides the technical and analytical rigor behind the intake and triage process. Working within the AI Center of Excellence (CoE), this role analyzes business requirements against technical capabilities, ensuring that data readiness, lineage, and architectural fit are validated before a project enters the build phase. The analyst also maintains the enterprise AI Registry, serving as the librarian of the AI Mesh. The role's essential functions are listed under Responsibilities below.
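The intake-and-triage duty described above centers on assigning each incoming use case a green/yellow/red risk tier. As a minimal sketch of what such a rules-based triage might look like: the tier names come from the posting, but the attributes and thresholds below are hypothetical assumptions, not CareSource's actual criteria.

```python
# Illustrative sketch only: green/yellow/red tiers appear in the posting,
# but these attributes and rules are invented for the example.

def triage_use_case(handles_phi: bool, member_facing: bool, uses_fine_tuning: bool) -> str:
    """Assign a hypothetical risk tier to an incoming AI use case."""
    if handles_phi and member_facing:
        return "red"      # highest scrutiny: PHI exposure in a member-facing flow
    if handles_phi or uses_fine_tuning:
        return "yellow"   # requires governance review before build
    return "green"        # low-risk internal automation

print(triage_use_case(handles_phi=False, member_facing=False, uses_fine_tuning=False))
```

In practice the real criteria would come from the AI Engineering Committee's governance standards rather than a hard-coded function.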

Requirements

  • Bachelor's degree in Information Systems, Computer Science, Data Analytics, or related field required
  • Equivalent years of relevant work experience may be accepted in lieu of required education
  • Five (5) years in Systems Analysis, Data Analysis, or Technical Product Ownership required
  • Experience documenting technical requirements for data-intensive applications, API integrations, or cloud platforms required
  • Proficiency in SQL and data profiling, capable of analyzing datasets for quality, lineage, and PII/PHI exposure
  • Understanding of API specifications (REST/JSON) and microservices architecture to document agent dependencies
  • Ability to translate business needs into technical user stories and "Definition of Ready" criteria for engineering teams
  • Familiarity with Data Governance tools (e.g., Purview, Collibra) and metadata management
  • Knowledge of RAG (Retrieval Augmented Generation) concepts to help define knowledge base requirements
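As a concrete illustration of the SQL data-profiling skill listed above (analyzing datasets for quality and PII/PHI exposure), the sketch below computes a null rate with plain SQL aggregates and flags rows whose free-text field matches an SSN pattern. The table, columns, and values are invented for the example.

```python
import re
import sqlite3

# Toy dataset; table name, columns, and values are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (id INTEGER, email TEXT, note TEXT)")
conn.executemany(
    "INSERT INTO members VALUES (?, ?, ?)",
    [(1, "a@example.com", "SSN 123-45-6789 on file"),
     (2, None, "routine follow-up"),
     (3, "c@example.com", None)],
)

# Null-rate profiling with a SQL aggregate (SQLite treats booleans as 0/1).
total, email_nulls = conn.execute(
    "SELECT COUNT(*), SUM(email IS NULL) FROM members"
).fetchone()
print(f"email null rate: {email_nulls / total:.0%}")

# Simple PII scan: flag rows whose free-text field looks like it holds an SSN.
ssn = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
flagged = [row_id for row_id, note in conn.execute("SELECT id, note FROM members")
           if note and ssn.search(note)]
print("rows with possible SSN:", flagged)
```

Real profiling would run against governed warehouse tables and feed results into a tool like Purview or Collibra, but the core checks are the same.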

Nice To Haves

  • Background in Healthcare Payer data (Claims, Member, Provider) is preferred
  • CBAP (Certified Business Analysis Professional) or PMI-PBA preferred
  • Azure Data Fundamentals certification preferred

Responsibilities

  • Analyze incoming AI use cases to determine technical feasibility, data availability, and appropriate risk tiering (green/yellow/red).
  • Perform detailed data lineage and quality assessments to ensure training/RAG datasets meet governance standards for accuracy and PII/PHI protection.
  • Maintain the "AI Agent Registry & Catalog," documenting agent capabilities, API dependencies, and ownership within the Azure APIM and Mesh architecture.
  • Draft technical requirements and "Definition of Ready" artifacts for the Platform Engineering team, ensuring a smooth handoff from CoE to Engineering.
  • Support the AI Engineering Committee (AIEC) by documenting architectural fit and identifying potential technical debt in proposed solutions.
  • Monitor and aggregate telemetry data on token usage, cost, and error rates to support the "Value Analyst" in ROI reporting.
  • Assist in the creation of "Data Dictionaries" and "Knowledge Graphs" required to ground RAG (Retrieval Augmented Generation) pipelines.
  • Validate that yellow layer automations (UiPath, Databricks) utilize approved MCP connectors and do not bypass API gateways.
  • Collaborate with Data Governance to tag and classify datasets specifically approved for LLM training or fine-tuning.
  • Track the lifecycle of AI models from "Pilot" to "Production" to "Retirement" in the enterprise inventory system.
  • Support the configuration of "Model Routers" by analyzing performance benchmarks across different LLMs (GPT-4 vs. Llama) for specific tasks.
  • Create process flow diagrams for "Agentic Workflows" to visualize how multiple agents interact and hand off tasks.
  • Perform any other job-related duties as requested.
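The telemetry responsibility above (aggregating token usage, cost, and error rates to support ROI reporting) can be sketched as a simple per-agent rollup. The record fields, agent names, and per-token price below are illustrative assumptions, not an actual schema.

```python
from collections import defaultdict

# Hypothetical telemetry records; field names, agents, and the flat price
# are assumptions invented for this sketch.
CALLS = [
    {"agent": "claims-summarizer", "tokens": 1200, "error": False},
    {"agent": "claims-summarizer", "tokens": 800,  "error": True},
    {"agent": "member-faq-bot",    "tokens": 500,  "error": False},
]
PRICE_PER_1K_TOKENS = 0.01  # illustrative flat rate in USD

def aggregate(calls):
    """Roll up token usage, estimated cost, and error rate per agent."""
    stats = defaultdict(lambda: {"tokens": 0, "calls": 0, "errors": 0})
    for c in calls:
        s = stats[c["agent"]]
        s["tokens"] += c["tokens"]
        s["calls"] += 1
        s["errors"] += int(c["error"])
    return {
        agent: {
            "tokens": s["tokens"],
            "cost_usd": round(s["tokens"] / 1000 * PRICE_PER_1K_TOKENS, 4),
            "error_rate": s["errors"] / s["calls"],
        }
        for agent, s in stats.items()
    }

report = aggregate(CALLS)
```

In production these figures would come from API gateway logs (e.g. Azure APIM) rather than an in-memory list, with the rollup feeding the Value Analyst's ROI dashboards.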