
Data Warehouse Engineer

Signature Aviation
Full-time
On-site
Orlando, Florida, United States
Description

Signature Aviation’s Global Data Operations Team has an exciting opportunity for a motivated Data Warehouse Engineer to join our team and contribute to the design, build, operation, and maintenance of enterprise-level, scalable, and highly reliable data warehouse solutions. The Data Warehouse Engineer will work with a talented team to manage, deliver, and maintain data infrastructure supporting custom applications and enterprise analytics systems. The right candidate will tackle high-priority challenges, leveraging expertise in Microsoft Azure and Databricks to support business intelligence, reporting, and advanced analytics.



Responsibilities

Essential Duties and Responsibilities

  • Data Warehouse Architecture: Design, implement, and maintain scalable data warehouse environments using tools like Azure Synapse Analytics, Databricks, and Snowflake, ensuring high availability and performance.
  • ETL/ELT Pipeline Development: Develop and optimize ETL/ELT pipelines using Azure Data Factory, Databricks, or dbt to integrate data from diverse sources, including structured and unstructured datasets.
  • Security and Compliance: Ensure data warehouse solutions adhere to security and compliance best practices, implementing role-based access controls (RBAC), encryption, and compliance with regulations (e.g., GDPR, CCPA) in Azure and Databricks environments.
  • Performance Optimization: Monitor and optimize data warehouse performance, including query execution, data storage, and cost efficiency, using Azure and Databricks tools.
  • Collaboration: Work closely with data analysts, data scientists, business stakeholders, and IT teams to ensure seamless integration and delivery of data services.
  • Troubleshooting: Identify and resolve issues related to data pipelines, warehouse performance, and integrations, ensuring minimal downtime and rapid recovery.
  • Documentation: Develop and maintain technical documentation, including data models, ETL process flows, architecture diagrams, and operational procedures.
  • Continuous Improvement: Stay up-to-date with emerging technologies, including advancements in Azure, Databricks, and other data platforms, to drive innovation and inform architectural decisions.


Qualifications

Minimum Education and/or Experience

  • 3+ years of experience in data engineering, data warehousing, or a similar role.
  • Hands-on experience with Microsoft Azure (e.g., Azure Synapse Analytics, Azure Data Factory, Azure Data Lake) and Databricks.
  • Proficiency in SQL and scripting languages such as Python or Scala.
  • Experience with ETL/ELT tools (e.g., Azure Data Factory, Databricks, Apache Airflow, or dbt).
  • Strong understanding of data modeling, data warehousing principles, and cloud architecture.

Additional Knowledge and Skills

  • Microsoft Certified: Azure Data Engineer Associate (DP-203)
  • Microsoft Certified: Azure Fundamentals (AZ-900)
  • Databricks Certified Data Engineer Professional
  • Databricks Certified Associate Developer for Apache Spark
  • ITIL v3 or v4 Foundation certification
  • Familiarity with data governance frameworks and standards.
  • Knowledge of DevOps automation tools (e.g., Azure DevOps, GitHub Actions) for data pipeline deployment.
  • Understanding of API security, data lake security, and Azure cloud security practices.
  • Ability to navigate Azure and Databricks pricing models, optimizing resource usage for cost-effective solutions.
  • Experience with Agile/Kanban methodologies in Azure DevOps or similar platforms.
  • Familiarity with scripting for automation (e.g., Python, PowerShell, or Bash).
  • In-depth knowledge of Azure services, including Azure Synapse Analytics, Azure Data Lake, and Azure Data Factory.
  • Working knowledge of Databricks for big data processing, Delta Lake, and machine learning workflows.
  • General knowledge of Azure networking, security, IaaS, and PaaS services.
  • Ability to develop and maintain architecture diagrams, technical documentation, and solution roadmaps.
  • Experience troubleshooting issues across data pipelines, databases, and cloud infrastructure.
  • Familiarity with project management and workflow tools such as Jira, Azure DevOps, or ServiceNow.
  • Experience with Linux servers and troubleshooting in OS, network, or database environments.
  • Ability to support large-scale data infrastructure with a full understanding of the software development lifecycle (SDLC).
  • Excellent analytical and interpersonal skills, with a focus on providing exceptional customer service in a complex environment.

