Sumitomo Mitsui Banking Corporation • Posted 19 days ago
$85,000 - $170,000/Yr
Full-time • Mid Level
GA

About the position

Jenius Bank is looking for a hands-on Software Engineer - Data proficient in Java, Scala, and Python. You'll be part of the team responsible for building the Data and Analytics Platform for the Digital Bank Unit. As a Software Engineer - Data on the team, you will have the opportunity to run proofs of concept on new cloud technologies, build a highly scalable data platform that supports critical business functions, and create REST APIs that expose data services to internal and external consumers.
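
As a rough illustration of the kind of data service described above, the sketch below exposes a curated BigQuery result through a REST endpoint. The FastAPI framework, the project ID, and the table name are illustrative assumptions, not details taken from this posting.

    # Minimal sketch: a REST data service backed by BigQuery.
    # FastAPI, the project ID, and the table name are illustrative assumptions.
    from fastapi import FastAPI, HTTPException
    from google.cloud import bigquery

    app = FastAPI(title="Data Platform API")
    bq = bigquery.Client(project="jenius-data-platform")  # hypothetical project ID

    @app.get("/accounts/{account_id}/balance")
    def get_balance(account_id: str):
        # Parameterized query so the exposed service is safe against SQL injection.
        query = """
            SELECT account_id, balance, as_of_date
            FROM `analytics.account_balances`      -- hypothetical table
            WHERE account_id = @account_id
            ORDER BY as_of_date DESC
            LIMIT 1
        """
        job = bq.query(
            query,
            job_config=bigquery.QueryJobConfig(
                query_parameters=[
                    bigquery.ScalarQueryParameter("account_id", "STRING", account_id)
                ]
            ),
        )
        rows = list(job.result())
        if not rows:
            raise HTTPException(status_code=404, detail="account not found")
        return dict(rows[0])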

Responsibilities

  • Solid experience with, and understanding of, the considerations involved in large-scale solution design and operationalization of data warehouses, data lakes, and analytics platforms on GCP is necessary.
  • Monitor the Data Lake continuously and ensure that the appropriate support teams are engaged at the right times.
  • Design, build, and test scalable data ingestion pipelines, and automate the ETL process end to end for the various datasets being ingested (see the orchestration sketch after this list).
  • Determine the best way to extract application telemetry data, structure it, and send it to the appropriate reporting tool (Kafka, Splunk); a telemetry sketch follows this list.
  • Create reports to monitor usage data for billing and SLA tracking.
  • Work with business and cross-functional teams to gather and document requirements to meet business needs.
  • Provide support as required to ensure the availability and performance of ETL/ELT jobs.
  • Provide technical assistance and cross training to business and internal team members.
  • Collaborate with business partners for continuous improvement opportunities.
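
As a minimal sketch of the ingestion and ETL automation responsibility, the Cloud Composer / Airflow DAG below loads a daily batch from Cloud Storage into BigQuery. The bucket, dataset, and table names are placeholders, and the operator choice is one common approach rather than this team's actual implementation.

    # Minimal Cloud Composer / Airflow sketch: daily load from GCS into BigQuery.
    # The bucket, dataset, and table names are illustrative placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="daily_transactions_ingest",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_to_bq = GCSToBigQueryOperator(
            task_id="load_transactions",
            bucket="raw-landing-bucket",                      # hypothetical bucket
            source_objects=["transactions/{{ ds }}/*.json"],  # partitioned by run date
            source_format="NEWLINE_DELIMITED_JSON",
            destination_project_dataset_table="analytics.transactions",
            write_disposition="WRITE_APPEND",
            autodetect=True,
        )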
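
For the telemetry responsibility, a similarly hedged sketch: structure application events into a consistent schema and publish them to a Kafka topic for downstream reporting (for example, Splunk fed by a Kafka connector). The broker address, topic name, and event fields are assumptions, not details from the posting.

    # Minimal sketch: structure application telemetry and publish it to Kafka.
    # Broker address, topic name, and event fields are illustrative assumptions.
    import json
    from datetime import datetime, timezone

    from kafka import KafkaProducer  # kafka-python

    producer = KafkaProducer(
        bootstrap_servers="kafka.internal:9092",             # hypothetical broker
        value_serializer=lambda event: json.dumps(event).encode("utf-8"),
    )

    def emit_telemetry(service: str, endpoint: str, latency_ms: float, status: int) -> None:
        # Normalize raw telemetry into a consistent schema before sending,
        # so downstream reporting tools can query it reliably.
        event = {
            "service": service,
            "endpoint": endpoint,
            "latency_ms": latency_ms,
            "status": status,
            "emitted_at": datetime.now(timezone.utc).isoformat(),
        }
        producer.send("app-telemetry", value=event)          # hypothetical topic

    emit_telemetry("data-platform-api", "/accounts/{id}/balance", 42.7, 200)
    producer.flush()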

Requirements

  • Bachelor's degree in Computer Science, Computer Engineering, or Information Systems Technology.
  • 6+ years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
  • 4+ years of experience with one of the leading public clouds.
  • 4+ years of experience in design and build of scalable data pipelines that deal with extraction, transformation, and loading.
  • 4+ years of experience with Python and Scala, with working knowledge of notebooks.
  • 1+ years of hands-on experience on GCP data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).
  • At least 2 years of experience in Data governance and Metadata Management.
  • Ability to work independently, solve problems, and keep stakeholders updated.
  • Ability to analyze, design, develop, and deploy solutions per business requirements.
  • Strong understanding of relational and dimensional data modeling.
  • Experience in DevOps and CI/CD related technologies.
  • Excellent written and verbal communication skills, including experience with technical documentation and the ability to communicate with senior business managers and executives.

Benefits

  • A competitive portfolio of benefits for employees.
  • Annual discretionary incentive award.