Software Engineer III
Ref No.: 18-13659
Location: Cincinnati, Ohio
Job Description: We are seeking an experienced Hadoop systems engineer to support 21 servers running a complete Hadoop stack, including Spark, on the Cloudera distribution. The successful candidate will work as part of an integrated team that includes Linux admins, ETL developers, and Hadoop/Cloudera engineers.
Responsibilities:
• Contribute to the design, implementation, and support of multiple Cloudera Distributions of Apache Hadoop (CDH) in support of full-lifecycle application development, analytics, and data management.
• Apply typical system administration and programming skills, such as storage capacity management and performance tuning.
• Manage and monitor Hadoop cluster and platform infrastructure
• Automate cluster node provisioning and repetitive tasks.
• Collaborate with other departments on requirements, design, standards, and architecture of applications.
• Work directly with external vendors to resolve issues and perform technical tasks.
• Provide Production Support as needed.
• Advise manager of platform risks, issues, and concerns.
• Project capacity needs and recommend periodic incremental growth based on ongoing and forecasted workloads.
• Contribute to the design, implementation, and support of software and hardware environments and configurations that satisfy non-functional requirements such as security, accessibility, auditability, compliance, data retention, usability, and performance.
• Work closely with infrastructure, network, database, business intelligence and application teams to ensure solutions meet requirements.
• Provide subject matter expertise on the capabilities and use of cluster components.
Skills:
• 5+ years’ experience with administration of Hortonworks or Cloudera distributions of Hadoop required.
• 5+ years’ experience with OS administration around memory, CPU, system capacity planning, networks and troubleshooting skills.
• Experience working with RDBMS such as Oracle, SQL Server desired.
• Experience with Hadoop monitoring tools such as Ambari or Cloudera Manager (CM).
• Experience securing Hadoop clusters using Kerberos, LDAP/AD integration, and encryption.
• Effective oral, written, and interpersonal communication skills.
• Demonstrated ability to establish priorities, organize and plan work to satisfy established timeframes.
• Proven ability to handle multiple tasks and projects simultaneously.
• Excellent problem-solving skills, including core Java application troubleshooting.
• Ability to work in a team environment under limited supervision is desired.
Qualifications
Education: Master’s Degree or equivalent experience.
Field of Study: Computer Science, Information Technology or a related discipline.
Experience:
• Familiarity with: Python, Scala, Spark, R.
• Familiarity with all components of the CDH environments, including but not limited to, Cloudera Manager, Cloudera Management Services, HDFS, YARN, Zookeeper, Hive, Spark, Hue, Kudu, Impala, HBase, Key Management Server, Kafka, Flume, Solr, SSL, Sqoop, and Sentry.
• Experience working with data warehouses and data marts is a plus.
• UC4 job scheduling knowledge is a plus.
• Experience with Red Hat Enterprise Linux and enterprise-class server hardware is desired.
• Experience with scripting for automation and configuration management.