Advanced Field Service Engineer

Honeywell, Tulsa, OK

About The Position

Advanced Field Service Engineer for Intelligrated Services, LLC in Tulsa, OK. The position's requirements and responsibilities are listed in the sections below.

Requirements

  • Qualified applicants must have a Bachelor's degree or foreign equivalent in Information Technology or a related field, and 2 years of experience in Information Technology.
  • Experience in SQL query scripting, debugging, and execution, or related experience.
  • Experience working in a data-intensive organization with large-scale datasets.
  • Excel proficiency.
  • The employer will accept 2 years of experience in lieu of a Bachelor's degree.
  • If offered employment, the applicant must have the legal right to work in the U.S.

Responsibilities

  • Use Databricks Delta Live Tables to manage unstructured blob data (a DLT pipeline sketch follows this list).
  • Lead data migration efforts for different teams into the Databricks Lakehouse, ingesting data from APIs, Event Hubs, Azure Blob Storage, and unstructured sources.
  • Build lakehouse architecture in Azure Databricks so the team has a single source of truth for its data.
  • Manage the expectations of teammates and external teams when setting project timelines.
  • Script programs in Python and SQL to pull daily active user data from the Tableau API into Azure (see the Tableau REST API sketch after this list).
  • Use Azure Data Factory pipelines to move data from Azure Blob Storage and an Azure Data Explorer (Kusto) cluster into an Azure SQL database.
  • Create extracts and Tableau jobs, filtering data via SQL, for the BI team to develop Tableau dashboards.
  • Update tables, SQL queries, and table structures for the team, reducing CPU processing time.
  • Convert multiple pipelines from full refresh to incremental refresh, saving time, disk space, and money.
  • Use Azure Logic Apps to create custom notifications for Azure Data Factory pipelines.
  • Automate incremental refresh of tables from the Azure Data Explorer cluster into Azure SQL tables using watermarking rather than change data capture (a watermark sketch follows this list).
  • Apply structured thinking to solve complex business data problems and architect the team's data.
  • Use Azure Logic Apps and Power Automate flows to move data from email and SharePoint into a SQL database.
  • Move Tableau table calculations to the backend (Databricks) for faster dashboard loading (see the pre-aggregation sketch below).
  • Monitor and assess data pipeline performance; identify problems and institute corrective actions as required.
  • Identify, track, and manage risk.
  • Understand and interpret customer/business data requirements, partnering with customers and the business to derive solutions and leverage existing ones.
  • Take responsibility for the direction, oversight, and outcome of all assigned tasks.
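
The first responsibility names Delta Live Tables for unstructured blob data. Below is a minimal sketch of what such a pipeline can look like, assuming semi-structured JSON blobs land in a storage container; the storage path, table names, and the event_id field are hypothetical placeholders, not details from this posting.

```python
# Minimal Delta Live Tables sketch: ingest JSON blobs incrementally with
# Auto Loader into a bronze table, then publish a cleaned silver table.
# The path, table names, and event_id column are hypothetical.
import dlt
from pyspark.sql import functions as F

LANDING_PATH = "abfss://landing@examplestorage.dfs.core.windows.net/events/"  # hypothetical

@dlt.table(comment="Raw JSON blobs ingested incrementally via Auto Loader.")
def events_bronze():
    return (
        spark.readStream.format("cloudFiles")           # Auto Loader
        .option("cloudFiles.format", "json")
        .option("cloudFiles.inferColumnTypes", "true")
        .load(LANDING_PATH)
        .withColumn("_ingested_at", F.current_timestamp())
    )

@dlt.table(comment="Events with a non-null id, cleaned from bronze.")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")
def events_silver():
    return dlt.read_stream("events_bronze")
```

In a DLT pipeline, `spark` is provided by the runtime, and Auto Loader (`cloudFiles`) keeps track of which blobs have already been ingested, which is what makes the load incremental.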
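For the Tableau-to-Azure responsibility, here is a hedged Python sketch: sign in to the Tableau REST API with a personal access token, download a view's data as CSV, and land it in Azure Blob Storage. The server URL, API version, view ID, container name, and credentials are all placeholders; the correct `/api/x.y` prefix depends on your Tableau Server version.

```python
# Hedged sketch: pull daily-active-user data from a Tableau view via the
# Tableau REST API and land the CSV in Azure Blob Storage. Server URL,
# API version, IDs, and secrets below are hypothetical placeholders.
import datetime
import requests
from azure.storage.blob import BlobServiceClient

SERVER = "https://tableau.example.com"      # hypothetical
API = f"{SERVER}/api/3.19"                  # version depends on your server

def tableau_sign_in(pat_name: str, pat_secret: str, site: str) -> tuple[str, str]:
    """Sign in with a personal access token; return (auth token, site id)."""
    resp = requests.post(
        f"{API}/auth/signin",
        json={"credentials": {
            "personalAccessTokenName": pat_name,
            "personalAccessTokenSecret": pat_secret,
            "site": {"contentUrl": site},
        }},
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    creds = resp.json()["credentials"]
    return creds["token"], creds["site"]["id"]

def fetch_view_csv(token: str, site_id: str, view_id: str) -> bytes:
    """Download a view's underlying data as CSV."""
    resp = requests.get(
        f"{API}/sites/{site_id}/views/{view_id}/data",
        headers={"X-Tableau-Auth": token},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.content

def upload_to_blob(data: bytes, conn_str: str) -> None:
    """Write the CSV to a date-partitioned blob path (container is hypothetical)."""
    blob = BlobServiceClient.from_connection_string(conn_str).get_blob_client(
        container="tableau-exports",
        blob=f"dau/{datetime.date.today():%Y/%m/%d}/dau.csv",
    )
    blob.upload_blob(data, overwrite=True)
```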
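One common way to implement the watermark-style incremental refresh mentioned above is to treat MAX of the ingestion timestamp in the target Azure SQL table as the high-water mark and pull only newer rows from Azure Data Explorer. The sketch below assumes the `azure-kusto-data` and `pyodbc` packages; the cluster, database, table, and column names are hypothetical.

```python
# Hedged sketch of watermark-based incremental refresh: copy only Kusto rows
# newer than the last timestamp already present in Azure SQL. Cluster,
# database, table, and column names are hypothetical placeholders.
import pyodbc
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

KUSTO_CLUSTER = "https://exampleadx.kusto.windows.net"   # hypothetical
KUSTO_DB = "Telemetry"                                   # hypothetical
SQL_CONN = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=...;DATABASE=...;"  # placeholder

def incremental_copy() -> None:
    sql = pyodbc.connect(SQL_CONN)
    cur = sql.cursor()

    # 1. Read the current watermark (high-water mark) from the target table.
    cur.execute("SELECT COALESCE(MAX(IngestedAt), '1900-01-01') FROM dbo.Events")
    watermark = cur.fetchval()

    # 2. Pull only rows newer than the watermark from Kusto.
    kusto = KustoClient(
        KustoConnectionStringBuilder.with_az_cli_authentication(KUSTO_CLUSTER)
    )
    query = f"""
    Events
    | where IngestedAt > datetime({watermark.isoformat()})
    | project EventId, Payload, IngestedAt
    """
    rows = kusto.execute(KUSTO_DB, query).primary_results[0]

    # 3. Append the new rows; MAX(IngestedAt) becomes the next watermark.
    cur.fast_executemany = True
    cur.executemany(
        "INSERT INTO dbo.Events (EventId, Payload, IngestedAt) VALUES (?, ?, ?)",
        [(r["EventId"], r["Payload"], r["IngestedAt"]) for r in rows],
    )
    sql.commit()
```

Because the watermark is derived from the target table itself, a failed run simply reprocesses from the last committed timestamp on the next attempt, with no separate state store to maintain.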
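Finally, moving a Tableau table calculation to the backend typically means precomputing it in Databricks and letting the dashboard read a materialized table. A sketch, assuming a hypothetical `daily_active_users` table and a 7-day rolling average that was previously computed as a Tableau table calc:

```python
# Hedged sketch: precompute in Databricks what was previously a Tableau table
# calculation (here, a 7-day rolling average of daily active users), then
# materialize it so dashboards read a plain table. Names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

daily = spark.table("analytics.daily_active_users")      # hypothetical source

w = Window.orderBy("activity_date").rowsBetween(-6, 0)   # trailing 7 rows/days

rolled = daily.withColumn("dau_7d_avg", F.avg("dau").over(w))

# Materialize for Tableau: the dashboard now reads a precomputed column
# instead of recomputing the window function on every render.
rolled.write.mode("overwrite").saveAsTable("analytics.daily_active_users_rollup")
```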