Job Detail

Big Data Architect (Cloudera)

Sydney


Job Description

  • At least 2-4 full-cycle Cloudera implementations
  • Permanent role – immediate interview + competitive remuneration
  • Excellent communication skills

Our client is a NYSE-listed global leader in consulting and technology outsourcing solutions, with a presence across the globe, and a well-known multi-billion dollar company.

You will participate in problem definition and solution design, and lead the design, development, deployment and implementation of a Big Data (Cloudera) solution at a leading telecom company.

What will you do?

  • Perform requirement analysis and technical design, and lead/participate in developing requirement specifications such as user stories.
  • Exceptional stakeholder management experience – including at least 2-3 end-to-end hands-on Cloudera implementations
  • Draft technical requirements and solution architecture based on the client's requirements – translating functional requirements into conceptual and detailed design documents
  • Experience in implementing/configuring Cloudera services – incorporating client processes and requirements
  • Design, develop and deploy high-quality solutions and data architecture as per client requirements
  • Lead and guide team development activities, including coding and configuration, and lead the Cloudera implementation
  • Be involved in troubleshooting and client support activities, including liaising directly with clients.
  • Hands-on experience in backup, restore, cluster setup, alerting and monitoring of failures, performance tuning of the cluster, troubleshooting, capacity planning, production deployment of code from non-prod to prod, and scaling and administration (node addition, decommissioning/recommissioning, load balancing), etc.
  • Ability to translate functional requirements and business rules into technology solutions and scope the work

Skill / Experience Requirements:

  • Must have designed and implemented at least 2-4 full-cycle Cloudera implementations
  • Sound knowledge of Big Data and Hadoop concepts and principles
  • Experience in HDFS, MapReduce, YARN, Hive, Pig, Sqoop, Oozie, NoSQL/HBase and Spark on YARN, and in writing Hive scripts, Pig scripts, Sqoop jobs and Oozie jobs to schedule Hadoop jobs.
  • Strong knowledge of Linux shell scripting and the Linux environment.
  • Experience in identifying problem areas and performance fine-tuning of existing HDFS, HBase, Hive tables, Hive scripts and Pig scripts.
  • Experience in capturing, integrating and processing unstructured, semi-structured and structured data – hands-on experience in Sqoop, Kafka message broker configuration and Flume ingestion configuration.
  • Experience in developing batch jobs and real-time jobs using MapReduce and Kafka
  • Experience in writing Scala/Python code on top of Spark.
  • Educational qualifications: Bachelor's or Master's degree in Engineering, or equivalent
  • Exceptional communication and customer engagement skills

What is on offer:

  • Permanent Big Data Architect role
  • Excellent client-facing engagement role
  • Competitive remuneration
  • Open to sponsoring the right candidate – open to 457 visa holders

How to apply:

  • If you are on our website, apply using the link below
  • If you are NOT on our website, please apply via this link: https://www.appetencyrecruitment.com.au/companydescription.php?tid=96

Other titles for this role in other organisations might be: Big Data Architect, Process Analyst, Cloudera Implementation Consultant, Business Intelligence Implementation Consultant, Big Data Lead, Lead Consultant, Senior Cloudera Consultant, Cloudera Architect

Appetency Recruitment Services @ 03 8560 3750