Performance Architect

Solera
Westlake, OH

About The Position

Performance Architect / Westlake

Who We Are

Solera is a global leader in data and software services that strives to transform every touchpoint of the vehicle lifecycle into a connected digital experience. In addition, we provide products and services to protect life’s other most important assets: our homes and digital identities. Today, Solera processes over 300 million digital transactions annually for approximately 235,000 partners and customers in more than 90 countries. Our 6,500 team members foster an uncommon, innovative culture and are dedicated to successfully bringing the future to bear today through cognitive answers, insights, algorithms, and automation. For more information, please visit solera.com.

The Role

This role is for a Performance Architect who leads the design, development, and delivery of performance test frameworks for our next-generation software platform (computer vision, machine learning, sensor fusion, coaching workflows, reporting, alert management engines, and high-accuracy vehicle event analysis engines). The successful candidate will provide technical leadership as part of the Software QA Team and is accountable for all aspects of the QA process. This position requires solid experience testing N-tier application services and data platforms within an Agile development environment, along with a strong understanding of databases.

Requirements

  • BS in Computer Science or related field OR 18+ years of technical experience with deep expertise in performance engineering and architecture.
  • 12+ years of experience architecting performance frameworks with multiple performance testing tools, techniques, and strategies
  • Knowledge and experience in complete development lifecycle, including code standards/reviews, source control processes, building and testing.
  • Experience designing modern testing frameworks and architectures, and in implementing comprehensive CI/CD processes
  • Ideas for leveraging AI to improve testing efficiency, speed, coverage, accuracy, reach, and visibility.
  • Quality Intelligence: dashboards/metrics, AI/ML‑assisted test generation & failure analysis
  • Experience designing, implementing, maintaining, and scaling test automation frameworks with a focus on extensibility, scalability, maintainability, and high performance.
  • Strong experience in developing and implementing end-to-end test strategies.
  • Experience defining performance SLAs, KPIs, NFR models, and scalability patterns across distributed systems.
  • Deep expertise in performance observability including distributed tracing, telemetry, logs, APM tools, and instrumentation strategies.
  • Prior work experience and understanding of Agile.
  • Excellent verbal and written communication skills and ability to interact effectively across all levels.
  • Proactive, problem-solving approach to identifying, troubleshooting, and resolving issues.
  • Experience with AI/ML-driven testing platforms and predictive analytics.
  • Experience partnering with DevOps to optimize pipeline reliability and speed, including test parallelization, containerization, and environment-as-code.

Responsibilities

  • With a performance lens, architect and guide the design, development, and maintenance of tools, test ecosystems, and automation frameworks for current and next‑gen platforms.
  • Collaborate with Product Management, SW Engineering, DevOps, QA, and other technical teams in release planning and coordination
  • Architect and govern a parametrized, reusable, scalable, and maintainable multi-platform performance engineering ecosystem.
  • Implement data strategies that enable validating applications under test at scale and under real-world loads.
  • Implement efficient strategies to performance test all application layers (mobile, front end, single user, API, services, database, backend, scalability, etc.).
  • Take ownership of the performance analysis activity including but not limited to tracing, debugging, troubleshooting, identifying breakpoints or chokepoints, latent issues, etc.
  • Generate clear, consumable performance results with visualizations that convey findings and provide sufficient data to pinpoint the root cause of identified issues.
  • Interpret functional requirements and designs to plan, develop, write, execute, and automate the validation of application performance.
  • Own and maintain performance environments and coordinate with relevant stakeholders the maintenance of shared tools, infrastructure, and environments.
  • Provide feedback and influence into the design process to help us build a testable and performant platform, applications, and data models
  • Architect and implement observability models using distributed tracing, telemetry, and APM integrations. Drive architectural reviews to proactively ensure scalable and performant system design.
  • Evaluate and select performance tools, frameworks, and emerging technologies, especially AI.
  • Implement AI/ML-driven approaches to anomaly detection, performance prediction, and test optimization.
  • Document and communicate performance framework functions, interfaces, performance criteria, test cases, and results to diverse audiences
  • Influence engineering leadership with architectural recommendations, risk assessments, and performance insights.
  • Lead or assist in data analysis to determine data health and consistency, or root cause of issues
  • Train engineers on new tools, methodologies, and technologies. Mentor engineers and help foster their personal and professional growth
  • Identify areas for improvement and implement solutions that increase efficiency and quality.