NO THIRD PARTIES OR EMPLOYERS
The Big Data & Analytics Engineering team is actively recruiting an Engineer Specialist with a solid understanding of, and extensive hands-on experience in, engineering Hadoop products for large-scale deployments. The candidate must have experience with all components of the Hadoop ecosystem and will contribute to the architecture and engineering of the Hadoop offering within the Client's portfolio.
Responsibilities:
• Design Hadoop deployment architectures (with features such as high availability, scalability, process isolation, load-balancing, workload scheduling, etc.).
• Install, validate, test, and package Hadoop products on Red Hat Linux platforms.
• Publish and enforce Hadoop best practices, configuration recommendations, usage design patterns, and cookbooks for the developer community.
• Engineer process automation integrations.
• Perform security and compliance assessments for all Hadoop products.
• Contribute to Application Deployment Framework (requirements gathering, project planning, etc.).
• Evaluate capacity for on-boarding new applications into a large-scale Hadoop cluster.
• Serve as a Hadoop SME and provide Level-3 technical support for troubleshooting.
Qualifications:
• 10+ years of overall IT experience.
• 2+ years of experience with Big Data solutions and techniques.
• 2+ years of Hadoop application infrastructure engineering and development methodology experience.
• Experience with Cloudera distribution (CDH) and Cloudera Manager is preferred.
• Advanced experience with HDFS, Spark, MapReduce, Hive, HBase, ZooKeeper, Impala, Solr, Kafka, and Flume.
• Experience installing, troubleshooting, and tuning the Hadoop ecosystem.
• Experience with multi-tenant platforms, taking into account data segregation, resource management, access controls, etc.
• Experience with Red Hat Linux, UNIX Shell Scripting, Java, RDBMS, NoSQL, and ETL solutions.
• Experience with Kerberos, TLS encryption, SAML, and LDAP.
• Experience with full Hadoop SDLC deployments with associated administration and maintenance functions.
• Experience developing Hadoop integrations for data ingestion, data mapping, and data processing capabilities.
• Experience designing application solutions that use enterprise infrastructure components such as storage, load balancers, 3-DNS, LAN/WAN, and DNS.
• Experience with concepts such as high availability, redundant system design, disaster recovery, and seamless failover.
• Overall knowledge of Big Data technology trends, vendors, and products.
• Good interpersonal skills and excellent communication skills in written and spoken English.
• Able to work on client projects in cross-functional teams.
• Good team player who shares knowledge, cross-trains other team members, and shows interest in learning new technologies and products.
• Ability to create high-quality documents. Ability to work in a structured environment and follow procedures, processes, and policies.
• Self-starter who works with minimal supervision. Able to work in a team with diverse skill sets and geographies.
• Exposure to banking internal standards, policies, and procedures is a plus (does not apply to external candidates).
Axelon Services Corp. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender, gender identity, national origin, disability, or protected veteran status.