Previous Job
Lead Big-Data Engineer for AWS
Ref No.: 17-03552
Location: Dallas, Texas
Start Date: 12/07/2017

A large company based in Grapevine, TX is seeking an AWS Developer for a contract-to-hire opportunity on its Big Data team. The manager will interview for this role immediately and can start this AWS Developer before the end of the year. The pay rate for this role is $80 to $90 per hour, based upon experience.
In this role, you will serve as a Lead Big-Data Engineer for AWS / Hadoop integration and be an integral part of the technical team responsible for the company's consumer connectivity, customization, and commerce efforts within the data management organization. This role contributes to emerging technologies in data analysis, architecture, and development.
This AWS Developer / Big Data Engineer will help design the company's next-generation consumer platform, which will power future personalized consumer experiences across all touch-points. The role is characterized by broad scope and independent decision-making, with a moderate level of technical complexity.
Additionally, you will:
  • Guide and mentor local, on-shore and off-shore delivery teams in all aspects of Data Integration including ETL, data movement, data replication, data distribution and cloud technologies.
  • Develop and produce project documentation to ensure consistent results across all projects.
  • Provide technical leadership for solution development and review with regard to data integration architecture design and implementation for initiatives within the Enterprise.
  • Develop and drive the review / approval of data integration specific Solution Definitions through the enterprise architecture review process.
  • Collaborate with the data modelers, business intelligence architects, business analysts, and data subject matter experts to define and enforce integration standards and best practices that meet the needs of the business.
Required Skills:
  • At least 7 years of related organizational experience is required.
  • Experience working on large-volume databases is required.
  • Hadoop experience is required.
  • Shell scripting experience is required.
  • Teradata, Data Integration, and cloud technologies experience is preferred.
  • Experience working on complex projects with multi-tier architectures is required.
  • Expert knowledge of the Big Data ETL / Hadoop stack.
  • Minimum 5 years of development experience on the Hadoop platform, including Pig, Hive, Sqoop, HBase, Flume, Spark, and related tools.
  • Minimum 1 year of development experience with AWS EMR, Lambda, DynamoDB, and S3 services.
  • Minimum 7 years of Data Integration development experience.
  • Expert knowledge of relational and dimensional data concepts.
  • Expert knowledge of ETL tools such as SSIS, Informatica PowerCenter, or SnapLogic.
  • Working knowledge of SQL, including DDL and DML.
  • Demonstrated ability to influence critical business outcomes in a global environment.
  • Working knowledge of Waterfall or Lean / Agile delivery environment.
  • Ability to communicate effectively with others, in spoken and written English, to audiences with varied levels of responsibility and technical expertise.