Senior Data Engineer

Contract type: Permanent
Location: Melbourne
Industry: Business Intelligence
Tags: Big Data, AWS, National Brand
Start Date: 2022-07-24
Reference: V-119423
Contact name: Donna Bowen
Contact email: dbowen@siriustechnology.com.au
Job published: July 25, 2022 10:14

Job Description

Our trusted client is a leader in cloud transformations. Australian owned with a global reach, they are making big waves in the tech world. Do you want to be part of a passionate tech team working on the latest technology?

You will be the conduit between tech and the business, translating requirements and uncovering real needs to provide analytical solutions.

Your Benefits:
  • Full time permanent role
  • AWS Partner + Latest Cloud Technology 
  • Hybrid work environment (1 day in the office)
  • Melbourne CBD
  • Paid Certifications 

As a Data Engineer, you will help deliver scalable, automated, repeatable 'big data' and analytics patterns within the AWS stack. Their Data Engineers understand data pipeline development, operations, and management, as well as the delivery of automated data solutions that are ready for production operations.

Responsibilities and Experience:
  • Knowledge of best practices and IT operations for designing, building, and maintaining data processing systems

  • Excellent problem-solving and troubleshooting skills

  • Appreciation of and experience with Data Management & Data Governance best practices

  • Strong architectural capabilities when it comes to designing and implementing data solutions

  • Experience with Agile Software Development methodologies

  • Process oriented with great documentation skills

  • Strong communication & stakeholder engagement skills

  • Successful record of delivering complex data solutions in highly technical, fast-paced & ambiguous environments

Tech:
  • Experience with big data tools: Hadoop, Spark, Kafka, Kinesis etc.

  • Knowledge of relational SQL and NoSQL databases, including Snowflake, PostgreSQL, Cassandra, etc.

  • Extensive experience with data pipelines and workflow management tools: Azkaban, Luigi, NiFi, Airflow, etc.

  • Experience with AWS cloud data services: EMR, RDS, Redshift, Kinesis, Glue

  • Previous exposure to stream-processing systems: Storm, Spark-Streaming, etc.

  • Experience with SQL and object-oriented/functional scripting languages: Python, PySpark, Scala, etc.

You will need:
  • Relevant tertiary qualification or industry experience, e.g. IT/Information Systems/Maths
  • Proficiency in data modelling and creating data flows
  • Experience with a big data stack: Hadoop, Spark, or Kafka
  • Exceptional communication skills to liaise with non-technical stakeholders and uncover needs
  • Demonstrated experience in Data Analyst roles and providing insights
  • Proficiency working in a cloud environment (AWS/GCP/Azure)
  • Relevant Australian working rights
 
If you are looking to be an integral team member within a market leader, we have your next work home. Please APPLY NOW.