Data Integration Engineer

RADICL
Hybrid

About The Position

RADICL is seeking a Data Integration Engineer to join our platform team. You'll design and build API integrations and log parsing pipelines that power our security data ingestion, handling both structured and unstructured data from diverse security tools and log sources. You'll integrate log collectors and forwarders (Fluentd, Fluent Bit, Vector, Logstash, Elastic Beats) and work with data from EDR platforms, cloud providers, compliance tools, and other security data sources. Your work will directly impact how quickly and accurately we onboard new data sources and how effectively our AI-powered virtual SOC delivers threat detection and compliance capabilities. You will also play a key part in implementing our agentic automation for parsing and mapping dynamic data sources.

If the above excites you, RADICL Defense is seeking high-performing, motivated individuals to join our mission. As an early member, you will work closely alongside an experienced founding team and gain the life-changing experience of building a company. You will work with the latest technologies in software, cybersecurity, and cloud, and will have a significant impact on the formation of our platform and offering.

About You

You enjoy fast-paced environments, bring a positive attitude, and excel at getting things done. You enjoy being part of a high-performing team and are also able to self-direct and self-start. You consider yourself top-tier talent and are eager to help others raise their game. You enjoy collaborating with engineers and stakeholders, communicate clearly, and can engage with people of varied backgrounds and skill levels. You want your work to have meaning, to be important. You want to be part of creating something great.

Requirements

  • Data integration fundamentals: ETL design, RESTful API development, data mapping, and schema design
  • Log parsing and normalization: Experience with regex, Grok-style patterns, or similar; comfort with both structured (JSON, CEF, syslog, CSV) and unstructured log formats
  • Log collection/forwarding tools: Experience with one or more of Fluentd, Fluent Bit, Vector, Logstash, Elastic Beats (Filebeat, Winlogbeat), OpenTelemetry Collector, rsyslog, or syslog-ng
  • Security data platforms: Familiarity with security data ingestion patterns; experience with Splunk, Microsoft Sentinel, Graylog, Sumo Logic, Google Chronicle, IBM QRadar, or similar platforms is valuable
  • Programming: Strong proficiency in Go, Python, and/or TypeScript
  • Data modeling: Ability to design schemas for security/observability data; familiarity with Elasticsearch and Elastic Common Schema or similar is a plus
  • Cloud and infrastructure: Experience with AWS, Azure, or GCP; familiarity with Kafka or streaming pipelines is a plus
  • Security domain awareness: Understanding of SIEM concepts, security tooling (EDR, firewalls, cloud logs), and compliance data (CMMC, NIST) is highly valued
  • Education: Bachelor's degree in computer science, data engineering, or a related field, or equivalent professional experience (typically 3–6+ years)
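To give a flavor of the log parsing and normalization work described above, here is a minimal sketch in Python of turning a syslog-style SSH authentication line into structured fields with a regex. The pattern, field names, and `event_type` value are illustrative assumptions, not RADICL's actual schema:

```python
import re

# Hypothetical input: a failed-login line from an sshd auth log.
LINE = "Jan 12 07:34:56 bastion sshd[2841]: Failed password for admin from 203.0.113.7 port 52144 ssh2"

# Named capture groups act as the source-to-structured field mapping.
PATTERN = re.compile(
    r"(?P<timestamp>\w{3} +\d+ [\d:]+) "
    r"(?P<host>\S+) "
    r"(?P<process>\w+)\[(?P<pid>\d+)\]: "
    r"Failed password for (?P<user>\S+) "
    r"from (?P<src_ip>[\d.]+) port (?P<src_port>\d+)"
)

def parse_auth_line(line: str):
    """Return normalized fields for a failed-login line, or None if no match."""
    m = PATTERN.search(line)
    if not m:
        return None
    event = m.groupdict()
    # Cast numeric fields so downstream consumers get typed values.
    event["pid"] = int(event["pid"])
    event["src_port"] = int(event["src_port"])
    event["event_type"] = "auth_failure"  # illustrative category label
    return event

print(parse_auth_line(LINE))
```

In practice a pipeline like this would fall back to Grok-style pattern libraries or structured parsers (JSON, CEF) before resorting to hand-written regexes, but the shape of the work is the same: match, extract, type, and label.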

Nice To Haves

  • Agentic/AI pipelines: Interest or experience with LLM-based data extraction

Responsibilities

  • Design and build API integrations for ingesting security data from third-party tools (EDR, cloud providers, compliance platforms, and other security data sources)
  • Develop log parsing and data mapping pipelines to normalize structured and unstructured log formats into our unified schema
  • Integrate with log collectors and forwarders (e.g., Fluentd, Fluent Bit, Vector, Logstash, Elastic Beats)
  • Create and maintain data mappings between source schemas and our internal data model
  • Optimize ingestion performance and ensure data quality, completeness, and timeliness
  • Document integration specifications and support internal and external stakeholders with technical guidance
  • Collaborate with streaming and backend engineers to integrate new data flows end-to-end
  • Contribute to the design of our emerging agentic parsing pipeline (LLM-powered extraction for complex formats)
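The data-mapping responsibility above can be sketched in a few lines of Python. The source and target field names here are hypothetical (loosely ECS-like); a real integration would drive this from per-source configuration rather than a hard-coded dictionary:

```python
# Illustrative source-to-internal field mapping; names are assumptions.
FIELD_MAP = {
    "DeviceName": "host.name",
    "SourceIp":   "source.ip",
    "EventTime":  "@timestamp",
    "Severity":   "event.severity",
}

def map_event(source_event: dict, field_map: dict) -> dict:
    """Rename known source fields to internal names; keep unknowns under 'raw'."""
    mapped: dict = {}
    raw: dict = {}
    for key, value in source_event.items():
        target = field_map.get(key)
        if target:
            mapped[target] = value
        else:
            # Preserve unmapped fields for completeness and later schema work.
            raw[key] = value
    if raw:
        mapped["raw"] = raw
    return mapped

sample = {"DeviceName": "edr-host-1", "SourceIp": "198.51.100.9", "VendorExtra": "x"}
print(map_event(sample, FIELD_MAP))
```

Keeping unmapped fields rather than dropping them is one common design choice: it protects data completeness while the mapping for a new source is still being built out.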

Benefits

  • RADICL offices are in downtown Boulder, Colorado with easy-to-access employee parking provided by the company.
  • We offer comprehensive, competitive benefits including health, dental, and vision as well as 401K and a responsible PTO plan.