Previous Job
ETL Hadoop Engineer
Ref No.: 18-00099
Location: Chicago, Illinois
  • Modify the current email pipeline to make it brand-aware.
  • Design the ingestion and ETL process.
  • Develop scalable ETL processes to ingest data from upstream sources and load it into Teradata/Cerebro/Hadoop.
  • Design aggregate tables in Teradata/Cerebro to support defined metrics.
  • Troubleshoot and remediate issues impacting processes in the ETL framework.
  • Experience with Hadoop and Hive.
  • Modify existing code to fix defects in existing ETLs.
  • Hands-on experience with all aspects of designing, developing, testing, and implementing ETL solutions.
  • Ability to work in a fast-paced environment on several projects.
  • Strong analytical and diagnostic skills.
  • Good knowledge of metadata and of using and managing metadata.
  • Ability to develop and organize high-quality documentation.
  • Take responsibility for performance tuning.
  • Experience with data warehousing and star-schema (dimensional) data models a plus.
  • Working knowledge of Linux/Unix operating systems.
  • Hands-on experience with database technologies such as Oracle, MySQL, or Teradata.
  • Strong scripting skills: Python (a big plus), Perl, shell, etc.
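To give a flavor of the ETL work described above, here is a minimal, hypothetical sketch of an extract-transform-load step in Python. It is not the actual pipeline from this role; it uses an in-memory SQLite table as a stand-in for a Teradata/Hive staging table, and the column names (`email`, `brand`, `clicks`) are invented for illustration. The transform step normalizes the brand field and drops malformed rows, and the final query mimics an aggregate table supporting a defined metric (clicks per brand).

```python
import csv
import io
import sqlite3


def etl(csv_text, conn):
    """Extract rows from CSV text, transform them (normalize brand to
    lowercase, skip rows missing an email), and load them into a
    staging table. Returns the number of rows loaded."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS email_events "
        "(email TEXT, brand TEXT, clicks INTEGER)"
    )
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        if not rec.get("email"):
            continue  # remediate bad upstream data: drop rows with no email
        rows.append(
            (rec["email"].strip(), rec["brand"].strip().lower(), int(rec["clicks"]))
        )
    conn.executemany("INSERT INTO email_events VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)


# Example usage: two valid rows load; the row with a blank email is skipped.
conn = sqlite3.connect(":memory:")
data = "email,brand,clicks\na@x.com,Acme,2\n,Acme,5\nb@x.com,BRANDY,3\n"
loaded = etl(data, conn)

# Aggregate query standing in for a brand-level metrics table.
clicks_by_brand = dict(
    conn.execute("SELECT brand, SUM(clicks) FROM email_events GROUP BY brand")
)
```

In a real Hadoop/Hive setting the load target would be a Hive table (e.g. via `INSERT OVERWRITE` on a staging partition) rather than SQLite, but the extract/validate/load shape is the same.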