Azure Data Engineers

Location: Canberra
Discipline: Big Data & Analytics
Job type: Temporary
Salary: Negotiable
Contact name: Alister Pardew
Contact email: alisterp@thenetworkit.com
Job ref: BBBH8037_1660179092
Published: over 1 year ago
Start date: ASAP

Benefits

  • Contract to 30 June 2023 + 12-month extension
  • Canberra CBD location with hybrid working arrangements
  • Bring your experience with Azure Databricks to this major project

About the Company
This Government Department enables access to quality skills, training and employment, helping Australians find secure work in fair, productive and safe workplaces and supporting individuals, businesses, and the nation to prosper.

About the role
The department is seeking experienced data engineers with specific experience in Azure Databricks, Azure Data Factory and supporting services to drive the design and implementation of cloud-based data analytics infrastructure for key data-centric projects.

You will be required to:

  • Conduct data modelling on existing and emerging transactional and business intelligence data assets to inform data architecture.
  • Develop, implement, and review data-related infrastructure, processes, and procedures.
  • Develop infrastructure to support hybrid data and analytics projects, integrating cloud and on-premises hosted services.
  • Work as part of a data engineering and platform support team on project-based objectives to drive delivery of outcomes.
  • Work under limited direction and be accountable for undertaking planning, analysis, design, development, peer review, testing, and delivery activities within tight timeframes.


Mandatory Experience

  1. Creating, updating, and maintaining new data pipelines, and debugging and updating existing pipelines, in PySpark/SQL. This includes building pipeline automation, linked services, and other supporting processes in Azure Data Factory.
  2. Experience in:
       • using Azure resources such as Key Vaults, contributor groups, storage containers, and Access Control Lists;
       • using Databricks: coding in PySpark and SQL, integration with ADF and storage containers, SQL endpoint usage with BI apps, DBFS usage, and machine learning capabilities.
  3. Generating supporting documentation for data pipelines and other relevant processes.
  4. Ample experience in CI/CD deployment via Azure DevOps, using Git repositories, branching, merging, and conflict resolution.


Candidate Requirements
Candidates must be Australian Citizens and hold a minimum Baseline Security Clearance at the start of the engagement.

How to apply
If you want to make a real impact with a leading organisation, APPLY NOW or contact Alister on 0478 176 553 or alisterp@thenetworkit.com for a confidential chat.