Hadoop Architect
Ref No.: 18-73685
Location: Milpitas, California
Start Date: 10/04/2018
Role: Hadoop Architect
Positions: 2
Location options: Pleasanton, CA
Below is the job description:
•        10-15 years of experience as a Principal Consultant / Subject Matter Expert working on Big Data platforms (Hadoop, MapReduce, NoSQL, real-time streaming, search, Spark, Java platforms)
•        Experience driving large initiatives to build world-class products on Java-based and Big Data platforms
•        Experience managing and handling very large data repositories and delivering distributed, highly scalable applications
•        Ability to quickly prototype, architect, and build software using the latest technologies
•        Good customer-facing, interpersonal, and communication skills
•        Experience addressing non-functional requirements and large application deployment architectures and concerns such as scalability, performance, availability, reliability, security, etc.
•        Experience in one or more of the following technologies:
o   Experience with Hadoop (Apache/Cloudera/Hortonworks) and/or other MapReduce platforms
o   At least one successful cloud migration experience with AWS, Google Cloud, or similar
o   Experience with Hive, Pig, Sqoop, Flume, and/or Mahout
o   Experience with NoSQL databases: HBase, Cassandra, MongoDB
o   Experience with Spark, Storm, Kafka
o   Experience with search platforms: Solr, Elasticsearch
o   Good background in build and ticketing systems such as Maven, Ant, JIRA, etc.
o   Strong in shell scripting, Java, and EDW platforms
o   Knowledge of any Data Integration and/or EDW tools is a plus
•        Candidate should have delivered at least one or two projects at production scale on the above technology stacks.
 
Key Responsibilities
•        Act as a trusted technical advisor to the Customer
•        Architect, design, and implement core technologies in a fast-paced environment with minimal guidance
•        Quickly prototype, architect, and build software using the latest technologies
•        Implement robust, highly scalable, highly optimized distributed components
•        Evaluate and integrate the latest technologies and third-party tools/APIs
•        Optimize architecture for security, operational stability, scalability, and cost
•        Develop capacity planning models
 
Other Responsibilities:
•        Enable competency building and support Center of Excellence activities
•        Handle proposals and support business development
•        Provide architectural and capability presentations to Customers on the Big Data platform
•        Support development of Industry/Horizontal solutions along with the Domain teams
 
Educational Qualification:
•        Must have a degree in Computer Science or Engineering, or equivalent work experience
•        Highly proficient, with customer-facing project experience involving design, development, and deployment in one of the areas mentioned above
•        Must have a proven record of delivering technical solutions