Job Description
Your Benefits:
- Immediate Start
- Melbourne CBD
- High performing team
- 6 month contract
- Competitive daily rate
Your Role:
- Transfer data from their "Data Swamp" into their "Data Lake"
- Work across the AWS cloud platform (Redshift, S3, EMR, Airflow, Jenkins, Git) to provide data-driven solutions
- Build and maintain data pipelines (ETL)
- Ensure systems perform efficiently and data integrity is maintained
- Maintain high standards of code quality, logging, exception management and performance
- Build reusable code, libraries, patterns and consumable frameworks for data sourcing, staging, transformation, conformance and extraction
You will need to have:
- Demonstrated cloud experience – AWS is essential
- Proven experience on data migration projects and in data warehousing environments
- Advanced data warehousing/ETL skills (Informatica and SSIS environments)
- Proven experience programming in SQL and Python (or PySpark)
- Excellent communication and stakeholder management skills
- Demonstrated experience with data ownership and data governance
- Proven experience working in an agile environment
- Must have relevant Australian working rights
This is an exciting opportunity to take you into the New Year on a greenfield project. You will be working in a fast-paced, high-performing team. APPLY NOW and then call Donna Bowen on 03 9020 1994.