Hadoop with Kafka
Ref No.: 18-23095
Location: Newark, Delaware
Position Type: Direct Placement
Start Date: 04/05/2018
Hadoop Developer with Kafka and Flume experience
Mandatory Technical Skills:
• Experience in real-time streaming with Kafka and Flume
• Experience with NoSQL technologies (Cassandra)
• Extensive knowledge of the Hadoop stack and storage technologies: HDFS, MapReduce, YARN, HBase, Hive, Sqoop, Impala, Spark, and Oozie
• Extensive knowledge of big data enterprise architecture (Cloudera preferred)
• Knowledge of Java/J2EE
• Experience with data warehouse concepts
• Must have experience with big data applications for a banking or financial organization
Good to Have:
• Experience with big data analytics, business intelligence, and industry-standard tools integrated with the Hadoop ecosystem (R, Python)
• Knowledge of visual analytics tools (Tableau)
• Data integration and data security on the Hadoop ecosystem (Kerberos)
• Awareness of or experience with a data lake on the Cloudera ecosystem

 