Description
Primary Details
Time Type: Full time
Worker Type: Employee
Key Responsibilities:
- Design, develop, and optimize data pipelines and workflows within Azure Databricks using Python for data processing and automation.
- Collaborate with the platform team to ensure that the underlying infrastructure supports the scalability, performance, and security requirements of Databricks workloads.
- Maintain and enhance ETL pipelines, ensuring they are efficient, reliable, and meet business requirements.
- Troubleshoot and resolve issues related to data processing, pipeline failures, and performance bottlenecks within Databricks.
- Develop and implement strategies to optimize data pipelines for speed, cost-efficiency, and performance, ensuring alignment with business objectives.
- Contribute to the improvement of data architecture, data models, and overall infrastructure within the Azure ecosystem.
- Lead efforts to monitor, maintain, and scale data workflows and data storage solutions within Azure Databricks.
- Collaborate with fellow data engineers, data scientists, and analysts to ensure data integrity and smooth flow across systems.
- Stay up to date with the latest industry trends, tools, and best practices in data engineering and cloud technologies.
Requirements:
- 3+ years of experience as a data engineer with expertise in building and optimizing data pipelines in a cloud-based environment, specifically Azure.
- Proficiency in Azure Databricks and Python for developing data processing scripts and optimizing workflows.
- Experience with ETL processes, data pipeline design, automation, and troubleshooting.
- Hands-on experience with Apache Spark within Databricks, including performance tuning and optimization.
- Knowledge of Azure Data Lake, Azure SQL, and other Azure-based data storage and processing services.
- Strong understanding of cloud data platforms, including networking, storage, security, and performance considerations within Azure.
- Ability to work effectively in cross-functional teams to coordinate and deliver high-quality data solutions.
- Excellent problem-solving and analytical skills, with the ability to identify and resolve issues related to data workflows.
- Strong communication skills, with the ability to convey concepts to both technical and non-technical audiences and collaborate effectively across teams.
Preferred Qualifications:
- Experience with Azure Synapse Analytics, Azure Data Factory, or similar data integration tools.
- Familiarity with CI/CD practices and version control systems like Git.
- Experience in Agile methodologies and project management tools (e.g., Jira, Trello).
- Familiarity with data governance principles and best practices in ensuring data quality and security.
Skills:
Adaptability, Agile Application Development, Communication, Critical Thinking, Customer Service, GitLab CI/CD, Intentional collaboration, Managing performance, Process Improvements, Risk Management, Scalability Testing, Software Development, Software Development Life Cycle (SDLC) Methodologies, Stakeholder Management, Team Management
How to Apply:
To submit your application, click "Apply" and follow the step-by-step process.
Equal Employment Opportunity:
QBE is an equal opportunity employer and complies with equal employment opportunity legislation in each jurisdiction in which it operates.