- Immediate Start
- Inner eastern suburbs, Melbourne
- High-profile brand
- 3 month contract
- Competitive daily rate
- Build and maintain data pipelines using PySpark
- Work across the AWS cloud platform (Redshift, S3, EMR, Airflow, Jenkins, Git) to provide data-driven solutions
- Take responsibility for ETL and data modelling
- Ensure efficient system performance and data integrity
- Maintain high standards of code quality, logging, exception management and performance
- Build reusable code, libraries, patterns and consumable frameworks for data sourcing, staging, transformation, conformance and extraction
- Demonstrated cloud experience (AWS) is essential
- Must have experience in PySpark
- Advanced-level data warehousing/ETL skills
- Proven experience programming in SQL and Python (or PySpark)
- Excellent communication skills and mentoring experience
- Proven experience working in an agile environment
- Must have relevant Australian working rights
This role kicks off this side of Christmas, so you will need to hit the ground running. APPLY NOW, then call Donna Bowen on 03 9020 1994.