Job title: Senior DevOps Data Engineer EL1
Job type: Contract
Emp type: Full-time
Functional Expertise: IT & Telecoms
Pay interval: Hourly
Location: 44 Mort St, Braddon, ACT
Job published: 16-10-2025
Job ID: 43403

Job Description

Role: EL1 DevOps Data Engineer

Client: Federal Government Agency

Engagement Type: Contract

Duration: 12 months, with a potential 12-month extension

Location: VIC, QLD, NSW, ACT, SA

Work Arrangements: Hybrid

Security Clearance: Must be able to obtain Baseline

Candidate MUST be an Australian Citizen

 

Purpose

As a senior member of a project stream on the data warehouse platform, the Senior DevOps Data Engineer will be responsible for maintaining data development documentation, hands-on data platform development, providing technical guidance to developers, and reviewing and approving data designs and data models.

Key Responsibilities

  • Provide technical leadership to the project delivery team through design reviews, advice on implementation best practices, definition and refinement of the release process, and technical workshops.
  • Review and refine the data delivery framework to improve the integration of new sources, the performance of existing data delivery, and the rapid development of new subject areas.
  • Own the technical implementation of new data from analysis to delivery (designing and delivering the data dictionary, defining release steps, and establishing data modelling best practices and review steps).
  • Facilitate continuous improvement in delivery principles, coding standards and documentation, and provide training sessions to the team.
  • Provide knowledge-transfer sessions and peer review of best practices in development, data modelling, the release process, etc.
  • Prioritise work items and add them to a work queue.
  • Understand, analyse and size user requirements.
  • Develop and maintain SQL analytical and ETL code.
  • Develop and maintain system documentation.
  • Collaborate with data consumers, database development, testing and IT support teams.

Skills & Experience

Mandatory

  1. Demonstrated competency in developing, auditing and reviewing code in three of the five data warehouse platforms listed: MS SQL, Teradata, Snowflake, Redshift and Databricks.
  2. High level of programming competency, including knowledge of supplementary languages such as Python, SAS, R or Java, and experience building ETL pipelines with them.
  3. Strong SQL expertise, including experience reverse-engineering end-user SQL code and using SQL ETL frameworks such as DBT or SQLMesh.
  4. Demonstrated experience in developing and customising SAS Visual Investigator data models, workflows, entities and rules to support investigative case management.
  5. Ability to understand the DevOps process and use DevOps tools in accordance with it.
  6. Ability to demonstrate knowledge of version control and its appropriate use.
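As an illustration of the kind of ETL pipeline work described in point 2, here is a minimal sketch in Python using an in-memory SQLite database; all table and column names are hypothetical examples, not taken from the agency's platform.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract, transform and load; returns the number of rows loaded."""
    cur = conn.cursor()
    # Extract: read raw records from a (hypothetical) staging table.
    cur.execute("SELECT id, amount_cents FROM stg_payments")
    rows = cur.fetchall()
    # Transform: convert cents to dollars and drop negative amounts.
    cleaned = [(rid, cents / 100.0) for rid, cents in rows if cents >= 0]
    # Load: write the transformed rows into a reporting table.
    cur.executemany(
        "INSERT INTO rpt_payments (id, amount_dollars) VALUES (?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)

# Example setup with sample data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_payments (id INTEGER, amount_cents INTEGER)")
conn.execute("CREATE TABLE rpt_payments (id INTEGER, amount_dollars REAL)")
conn.executemany(
    "INSERT INTO stg_payments VALUES (?, ?)",
    [(1, 1250), (2, -300), (3, 9900)],
)
loaded = run_etl(conn)
print(loaded)  # number of rows that passed the transform step
```

In practice the same extract/transform/load shape would be expressed against one of the listed warehouse platforms, typically through a framework such as DBT rather than hand-written cursor code.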

 

If interested, please APPLY here or reach out to Sejal, Delivery Consultant at Talent Street, at scheema@talentstreet.com.au.