Big Data Developer
Ref No.: 18-01888
Location: New York, New York
Position Type: Contract to Hire
  • As a member of the software engineering team, the Big Data Developer's key responsibility is to set up a data warehouse in Hadoop and create ETL jobs
  • Must have experience with Hadoop, Sqoop, and Flume.
  • Must know how to develop ETL jobs using the MapReduce or Pig framework.
  • RDBMS experience
  • Nice to have: Cassandra, DB2, and SQL Server database experience
  • Programming in Java and Python. 
  • Experience with J2EE, JSP, and JavaScript
  • Writing complex SQL and ETL batch processes.
  • Experience with PHP and Perl
  • Working with large data volumes, including processing, transforming, and transporting large-scale data using the big data stack: MapReduce, Hive SQL, Spark, etc.
  • Data warehousing and analytics architecture implementation on a major RDBMS, including at least one of the following: Oracle, MySQL, and/or SQL Server.
  • Amazon Web Services, including at least one of the following: on-demand computing, S3, and/or an equivalent cloud computing approach.
  • Building custom data loads using a scripting language such as Python or shell scripts.
  • Experience with the big data technology stack, including Hadoop, HDFS, Hive, and HBase.