Business Analyst - Data Architecture

reacHIRE - Jersey City, NJ
Hybrid

About The Position

Making the decision to return to work can be exciting and scary at the same time. We get it 100% - many of us at reacHIRE are returners, too! We believe the meaningful relationships formed on the way back to work should be built on communication and trust, which is why our team of Program Managers is here to listen to your unique story and help you take the best next steps toward your next opportunity.

We are excited to partner with Fidelity Investments for a 6-month return-to-work program starting in May 2026. If you are a professional returning to work after a 2+ year career break, or have worked in part-time or independent contractor roles, this could be the perfect opportunity! reacHIRE is invested in helping professionals return with confidence, providing the resources and support needed via Program Managers who will guide and navigate the entire process alongside you. We know the confidence gap and imposter syndrome can get in the way of meeting amazing Returners, so please don’t hesitate to apply - we’d love to hear from you. Please note that this is a hybrid role, and we are unable to consider candidates who would need to relocate for the program.

Overview: Are you an experienced business and data analyst passionate about empowering clients with innovative event and data solutions? Do you thrive in a collaborative environment that offers endless opportunities to learn, grow, and innovate? If so, a career with the Application and Platform Enabler team in Enterprise Technology could be the perfect fit for you!

What You’ll Contribute Day-to-Day: see the Responsibilities section below.

Requirements

  • 3–5+ years in Business/Data Analysis, Data Governance, or Platform Enablement roles.
  • Hands-on experience onboarding teams to data/streaming platforms (e.g., Kafka) or messaging systems (e.g., Artemis or equivalent).
  • Proven track record reviewing solution architectures and integration patterns for data quality, security, and compliance risks.
  • Experience facilitating data governance processes (metadata, lineage, classification, stewardship, retention, and access controls).
  • Background working with cross-functional teams (platform engineering, security, architecture, and product).
  • Familiarity with regulated environments and enterprise data standards (e.g., SOX, HIPAA, PCI, GDPR, CCPA—depending on context).
  • Experience using AI/ML tools to automate or enhance data governance activities (e.g., policy mapping, metadata extraction, anomaly detection).
  • Exposure to event-driven architectures, pub/sub design, and schema management (e.g., Confluent Schema Registry or equivalent).
  • Experience building or maintaining documentation, workflows, and playbooks for platform onboarding and governance reviews.
  • Ability to interpret solution diagrams, sequence flows, and integration designs; spot gaps, anti-patterns, and noncompliance early.
  • Strong grasp of data classification, lineage, ownership/stewardship, access provisioning, retention, encryption, and audit requirements.
  • Understanding of topics/queues, producers/consumers, partitions, DLQs, schema evolution, and message durability and security (a dead-letter-queue handling sketch follows this list).
  • Proficiency using AI to accelerate governance tasks—summarizing designs, generating control checklists, mapping metadata, and surfacing anomalies.
  • Ability to build repeatable intake processes, readiness checklists, and decision logs that scale across teams.
  • Ability to translate technical constraints into business risk, align with platform standards, and drive clear decisions.
  • Data Analysis & Validation: Use SQL and data profiling tools to verify data quality, schema compatibility, and downstream impacts (a SQL profiling sketch follows this list).
  • Documentation Excellence: Create crisp artifacts—architecture review notes, governance assessments, onboarding guides, and FAQs.
  • Tooling & Platforms: Comfortable with Jira/Azure DevOps, Confluence/SharePoint, data catalog/lineage tools, and CI/CD-adjacent workflows for governance.
  • Communication & Facilitation: Lead design reviews, run working sessions, and coach teams through remediation with an enablement mindset.
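
As a hedged illustration of the dead-letter-queue and durability concepts named above, the sketch below uses the confluent-kafka Python client to consume from a hypothetical "orders" topic, validate each message with a stand-in rule, and route failures to an "orders.dlq" topic. The broker address, topic names, group id, and validation logic are all assumptions for illustration, not details of this role.

```python
# Sketch of a consume/validate/dead-letter loop using the confluent-kafka
# Python client. Topic names, group id, and the validation rule are
# illustrative assumptions, not details from this posting.
import json
from confluent_kafka import Consumer, Producer

BROKERS = "localhost:9092"        # assumed local broker
SOURCE_TOPIC = "orders"           # hypothetical source topic
DLQ_TOPIC = "orders.dlq"          # hypothetical dead-letter topic

consumer = Consumer({
    "bootstrap.servers": BROKERS,
    "group.id": "orders-governance-demo",
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,   # commit only after handling the message
})
producer = Producer({"bootstrap.servers": BROKERS})
consumer.subscribe([SOURCE_TOPIC])

def validate(payload: dict) -> None:
    """Stand-in for real schema/business validation."""
    if "account_id" not in payload:
        raise ValueError("missing account_id")

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        try:
            validate(json.loads(msg.value()))
            # ... normal processing would happen here ...
        except Exception as exc:
            # Route bad messages to the DLQ with the failure reason attached,
            # so they can be inspected and replayed instead of silently lost.
            producer.produce(
                DLQ_TOPIC,
                value=msg.value(),
                key=msg.key(),
                headers=[("error", str(exc).encode())],
            )
            producer.flush()
        consumer.commit(message=msg)   # at-least-once: commit after handling
finally:
    consumer.close()
```

Committing offsets only after a message has been handled (or parked in the DLQ) keeps delivery at-least-once, which is the durability trade-off the bullet above alludes to.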
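
And as a small, hedged example of the SQL-based validation mentioned in the Data Analysis & Validation bullet, the sketch below profiles a hypothetical trade_events table for null keys, duplicate event IDs, and implausible timestamps. sqlite3 is used only so the example runs self-contained; the table and rules are illustrative assumptions.

```python
# Minimal data-profiling sketch (hypothetical trade_events table; sqlite3 is
# used only so the example is self-contained and runnable).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trade_events (
        event_id TEXT,
        account_id TEXT,
        amount REAL,
        event_ts TEXT
    );
    INSERT INTO trade_events VALUES
        ('e1', 'a1', 100.0, '2024-01-05'),
        ('e1', 'a1', 100.0, '2024-01-05'),   -- duplicate event_id
        ('e2', NULL,  -50.0, '2024-01-06'),  -- missing account_id
        ('e3', 'a2',  75.0,  '1900-01-01');  -- implausible timestamp
""")

# Each check pairs a plain-English rule with the SQL used to verify it.
checks = {
    "null account_id":
        "SELECT COUNT(*) FROM trade_events WHERE account_id IS NULL",
    "duplicate event_id":
        "SELECT COUNT(*) FROM (SELECT event_id FROM trade_events "
        "GROUP BY event_id HAVING COUNT(*) > 1)",
    "event_ts before 2000-01-01":
        "SELECT COUNT(*) FROM trade_events WHERE event_ts < '2000-01-01'",
}

for rule, sql in checks.items():
    (violations,) = conn.execute(sql).fetchone()
    status = "PASS" if violations == 0 else f"FAIL ({violations} rows)"
    print(f"{rule}: {status}")
```

In practice the same pattern would run against the platform's actual warehouse or data catalog rather than an in-memory database.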

Responsibilities

  • Leverage a blend of business, data, and technology expertise to help clients integrate messaging and streaming solutions
  • Play a vital role in delivering Fidelity’s commitment to creating exceptional customer experiences in financial services
  • Shorten time-to-platform for Kafka/Artemis by front-loading governance and architecture diligence, reducing rework
  • Catch design and data compliance issues early, preventing downstream incidents, audit findings, and costly retrofits
  • Turn standards into actionable, right-sized checklists and templates—balancing control with speed
  • Apply AI thoughtfully to triage requests, summarize architectures, and surface governance gaps—boosting throughput and consistency
  • Establish transparent intake paths, readiness criteria, and decision records so teams know exactly what “good” looks like
  • Improve data integrity and schema lifecycle hygiene across teams, strengthening reliability of event-driven solutions (a schema compatibility sketch follows this list)
  • Bridge platform, security, architecture, and product—ensuring designs are feasible, compliant, and value-driven
  • Create repeatable artifacts that scale—reducing onboarding friction and enabling self-service where appropriate
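
One concrete piece of schema lifecycle hygiene is confirming that a proposed schema change remains compatible with what is already registered. The sketch below is a minimal illustration, assuming a reachable Confluent Schema Registry at a placeholder URL and a hypothetical subject name, that posts a candidate Avro schema to the registry's compatibility endpoint and reports the result.

```python
# Sketch: ask a Confluent Schema Registry whether a proposed Avro schema is
# compatible with the latest registered version of a subject. The registry
# URL and subject name are placeholder assumptions.
import json
import requests

REGISTRY_URL = "http://localhost:8081"   # assumed registry endpoint
SUBJECT = "orders-value"                 # hypothetical subject

# Proposed schema: adds an optional field with a default, which is typically
# backward compatible under Avro evolution rules.
proposed_schema = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "account_id", "type": "string"},
        {"name": "channel", "type": ["null", "string"], "default": None},
    ],
}

resp = requests.post(
    f"{REGISTRY_URL}/compatibility/subjects/{SUBJECT}/versions/latest",
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    data=json.dumps({"schema": json.dumps(proposed_schema)}),
    timeout=10,
)
resp.raise_for_status()
print("compatible:", resp.json().get("is_compatible"))
```

A failing compatibility check at this stage is exactly the kind of early catch the responsibilities above describe, surfaced before producers and consumers drift apart.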