Senior Solutions Architect

World Wide Technology Healthcare Solutions
Jenks, OK (Remote)

About The Position

At World Wide Technology, we work together to make a new world happen. Our important work benefits our clients and partners as much as it does our people and communities across the globe. WWT is dedicated to achieving its mission of creating a profitable growth company that is also a Great Place to Work for All. We achieve this through our world-class culture, generous benefits, and by delivering cutting-edge technology solutions for our clients. Founded in 1990, WWT is a global technology solutions provider leading the AI and Digital Revolution. WWT combines the power of strategy, execution, and partnership to accelerate digital transformation outcomes for organizations around the globe. Through its Advanced Technology Center, a collaborative ecosystem of the world's most advanced hardware and software solutions, WWT helps clients and partners conceptualize, test, and validate innovative technology solutions for the best business outcomes, then deploys them at scale through its global warehousing, distribution, and integration capabilities. With over 14,000 employees across WWT and Softchoice and more than 60 locations around the world, WWT's culture, built on a set of core values and established leadership philosophies, has been recognized 14 years in a row by Fortune and Great Place to Work® for its unique blend of determination, innovation, and creating a great place to work for all.

We're seeking a hands-on Data Architect to lead the design, modernization, and governance of our analytics platform on Microsoft Fabric. You will define the target architecture across OneLake, Lakehouse/Data Warehouse, Direct Lake, Power BI, and Data Engineering experiences, while orchestrating migrations and integrations from Oracle, Snowflake, and other source systems. This role blends deep technical architecture with practical delivery, partnering with data engineers, BI developers, and business stakeholders to deliver trusted, performant, and governed data products.

Requirements

  • 10+ years in data engineering/architecture; 3+ years leading modern cloud data platforms.
  • Deep expertise in Microsoft Fabric, including OneLake, Lakehouse (Delta), Warehouse, Data Engineering (Spark notebooks), Data Factory (Pipelines), Dataflows Gen2, Power BI (Direct Lake, Composite Models), and Git integration.
  • Strong Oracle experience: PL/SQL, schema design, performance tuning, partitioning, Oracle CDC tooling (e.g., GoldenGate), and migration to cloud data lakes/warehouses.
  • Strong Snowflake experience: virtual warehouses, Time Travel, zero-copy cloning, Streams & Tasks, Snowpipe, role-based access control, data sharing, and performance tuning.
  • Expertise in Delta Lake, Spark SQL/PySpark, SQL (analytical functions), data modeling (dimensional/star, data vault, semantic modeling).
  • Hands-on experience with Azure services: ADLS/OneLake, Key Vault, Azure AD, Event Hubs/Kafka, Functions/Logic Apps (nice to have).
  • Proven track record with data governance, catalog/lineage, security policies, and compliance.
  • Strong communication skills with the ability to lead design sessions and influence senior stakeholders.

Nice To Haves

  • Experience with policy authoring and lineage across Fabric, Snowflake, and Oracle sources.
  • Familiarity with Databricks or Synapse (for comparison/interop) and migration trade-offs to Fabric.
  • Experience implementing streaming architectures (e.g., IoT/real-time analytics).
  • Background in domain-driven design and data product operating models.
  • Microsoft Certified: Fabric Analytics Engineer Associate
  • Azure Data Engineer Associate (DP-203)
  • Snowflake SnowPro Core
  • Oracle Database certifications
  • TOGAF certification

Responsibilities

  • Define end-to-end Fabric data engineering architecture (OneLake, Lakehouse, Warehouse, Delta tables, medallion layers) aligned to business domains, enterprise architecture, platform architecture and data product strategy.
  • Establish dimensional and semantic models for Power BI leveraging Direct Lake, Composite Models, and shared datasets.
  • Create standards for data modeling, partitioning, indexing, and performance optimization across Fabric pipelines, notebooks, and warehouses.
  • Develop reference architectures for batch, micro-batch, and streaming ingestion; choose the right pattern (Dataflows Gen2, Pipelines, Notebooks, Spark Structured Streaming).
  • Lead migration paths from Oracle (e.g., PL/SQL-based systems) and Snowflake to Fabric Lakehouse/Warehouse; define incremental loads, CDC, and cutover strategies.
  • Design robust ingestion using Snowpipe/Snowflake Tasks & Streams, Oracle CDC (e.g., GoldenGate), or landing via ADF/Fabric Pipelines to Delta Lake.
  • Rationalize Snowflake objects (schemas/tables/stages) and Oracle PL/SQL logic into Spark/SQL transformations, reusable notebook patterns, and Dataflows Gen2 where appropriate.
  • Implement secure, governed data sharing and zero-copy migration patterns, minimizing downtime and cost.
  • Build reliable, real-time data pipelines using Kafka, covering event streaming architecture, streaming ingestion with Fabric and Spark, Kafka Connect and schema management, and low-latency processing with Kafka Streams or Spark.
  • Operationalize the data catalog, lineage, classifications, and policies for Fabric and connected sources.
  • Define RBAC, workspace and item-level security, row-level and object-level security for BI and warehouse artifacts.
  • Establish data quality rules, observability (logging/metrics), SLAs, and error handling across pipelines and streaming jobs.
  • Partner with InfoSec for encryption, key management, and compliance (e.g., HIPAA/PCI/SOX depending on industry).
  • Optimize Direct Lake vs. Import vs. DirectQuery decisions; tune warehouse compute, cache, and layout.
  • Optimize Spark jobs (partitioning, broadcast joins, caching), Delta Lake management (Z-order, vacuum), and Power BI model performance.
  • Implement cost visibility and FinOps practices across Fabric capacities, Snowflake virtual warehouses, and Oracle licensing impacts.
  • Translate business requirements into data product backlogs, architecture epics, and release plans.
  • Provide hands-on guidance to data engineers and BI developers; perform code and design reviews.
  • Evangelize best practices and Fabric-native approaches; create an architectural runway for future features.
  • Collaborate with product owners, analytics leads, and enterprise architecture to ensure alignment and reuse.

Benefits

  • Health and Wellbeing: Health, Dental, and Vision Care, Onsite Health Centers, Employee Assistance Program, Wellness program
  • Financial Benefits: Competitive pay, Profit Sharing, 401k Plan with Company Matching, Life and Disability Insurance, Tuition Reimbursement
  • Paid Time Off: PTO and Sick Leave (starting at 20 days per year) & Holidays (10 per year), Parental Leave, Military Leave, Bereavement
  • Additional Perks: Nursing Mothers Benefits, Voluntary Legal, Pet Insurance, Employee Discount Program