Hadoop Admin
Ref No.: 18-65089
Location: Sleepy Hollow, New York
Position Type: Contract
Start Date: 09/06/2018
 
Hello,
 
Greetings for the day! 
Hope you are doing great.
 
Please have a look at the position below; if you are interested, please send us your updated resume.
 
It's a Contract position. We work on C2C/W2.
 
Please find the job description below and reply to confirm your interest.
Interviews will be scheduled ASAP.
Looking forward to hearing from you.
 
 
Job Description:
 
Role:              Big Data Hadoop Admin
Location:          Sleepy Hollow, NY
Duration:          12+ months (contract)
Interview Mode:    Telephonic and Skype
 
 
Must-Have Details for the Hortonworks Admin Role
§  The candidate should be strong not only in Hadoop/Hortonworks and Spark, but also in the administration, migration, data replication, and restoration of NiFi, Hive, HDFS, HBase, Airflow, Elasticsearch, Atlas, Ranger, and Ambari clusters.
§  Experience with Hortonworks Data Platform is important.
§  Experience setting up and managing big data technology stacks on AWS.
§  Troubleshooting issues and optimizing cluster performance and usage.
§  The candidate should have substantial experience with Hortonworks cluster management.
§  We need a Hadoop Administrator experienced with Hortonworks clusters running NiFi, Ranger, Atlas, Kafka, Hive, HBase, Ambari, Spark, and ZooKeeper, deployed on AWS infrastructure using S3 in place of HDFS.
§  Experience building and managing NiFi and data processing pipelines.
 
Job Requirements/Skill Set
§  Bachelor's degree in Computer Science or related disciplines
§  Required: 5+ years of hands-on experience with on-prem or cloud Hadoop/Spark environments.
§  Required: 5+ years of big data or high-volume data management experience.
§  Responsible for implementation and ongoing administration of Hadoop infrastructure.
§  Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
§  Cluster maintenance, including the creation and removal of nodes, using tools such as Ambari, Ganglia, Nagios, Cloudera Manager Enterprise, and Dell OpenManage.
§  Capacity planning, Performance tuning and troubleshooting of Hadoop clusters.
§  Monitor Hadoop cluster connectivity and security
§  Manage and Monitor Hadoop clusters.
§  Responsible for scripting/automation of environments using tools such as Chef, Puppet, or Ansible.
§  Work with scripting languages such as Python, Unix Shell or Ruby
§  Knowledge of AWS CloudFormation or Azure ARM scripts
§  HDFS support and maintenance.
§  Working closely with the infrastructure, network, database, application, and business intelligence teams to ensure high data quality and availability.
§  Ability to work in a fast-paced environment a must.
§  Pharmaceutical experience preferred but not required; consumer, retail, technology, or insurance experience also preferred.
 
 
 
 
Thanks & Regards,
Piyush Kumar
Sr. Technical Recruiter
 
IDC Technologies, Inc
Work: 408-668-9513
Mailto: piyush.kumar@idctechnologies.com
Website: www.idctechnologies.com
_____________________________________
Empowering Technologies Services
Remote Services | IT Services | BPO |
IT Consulting | Staffing Solutions |