Big Data Integrations Specialist
Ref No.: 18-04774
Location: Bridgewater, New Jersey
Position Type: Direct Placement
Start Date: 07/03/2018
This position, as a member of the Business Intelligence & Database team, will assist in the development of our Big Data Analytics Platform. The Big Data Integration Specialist is responsible for designing, developing, and deploying data integration solutions on both RDBMS and Big Data (Hadoop) platforms. The position will create and implement business intelligence solutions as well as extract, transform, and load (ETL) solutions using integration tools, programming, performance tuning, and data modeling. The ideal candidate will possess the skills to ingest data into a Big Data platform and prepare it for consumption and analysis (Hadoop, MapReduce, Hive, HBase).
  • Learn the area's data flow and how it affects surrounding systems and operational areas.
  • Architect, design, construct, test, tune, deploy, and support Data Integration solutions on Hadoop (MapR) and MPP (Spark) platforms.
  • Work closely with the Business Intelligence team, Data Engineers, and Data Scientists to achieve company business objectives.
  • Collaborate with other technology teams and architects to define and develop solutions.
  • Research and experiment with emerging Data Integration technologies and tools related to Big Data.
  • Work with the team to establish and reinforce disciplined software development processes, standards, and error-recovery procedures, ensuring a high degree of data quality.
  • Assist Users/Analysts with the development in MapR and Spark
  • Develop, write, and implement processing requirements; conduct post-implementation reviews and performance tuning
  • Facilitate and/or create new procedures and processes that support advancing technologies or capabilities
  • Design & Implement ETL solutions utilizing Informatica Big Data Management (BDM)
  • Create logic, system, and program flows for complex systems, including interfaces and metadata
  • Write and execute unit test plans. Track and resolve any processing issues.
  • Implement and maintain operational and disaster-recovery procedures.
  • Participate in the review of code and/or systems for proper design standards, content and functionality.
  • Participate in all aspects of the Systems Development Life Cycle
  • Analyze files from internal, external, and third-party systems and map data from one system to another
  • Adhere to established source control versioning policies and procedures
  • Meet timeliness and accuracy goals.
  • Communicate status of work assignments to stakeholders and management.
  • Responsible for technical and production support documentation in accordance with department standards and industry best practices.
  • Maintain current knowledge on new developments in technology-related industries
  • Participate in corporate quality and data governance programs
Required Experience & Education:
  • 6+ years of experience building and managing complex Big Data Integration solutions in the cloud
  • 6+ years of experience with distributed, highly-scalable, multi-node environments.
  • MS SQL Certification or other certification in current programming languages a plus
  • Bachelor's Degree in Information Technology or related field preferred
Required Professional Competencies: 
  • Advanced knowledge of business intelligence, programming, and data analysis software
  • Intermediate knowledge of Microsoft SQL databases.
  • Intermediate proficiency in T-SQL, NZ-SQL, PostgreSQL, data tuning, enterprise data modeling and schema change management.
  • Ingestion of data into Hadoop and proficiency with common Hadoop tools such as NiFi, Hive, Pig, Oozie, HBase, Flume, Sqoop, YARN, MapReduce, Ambari, and Spark, as well as Java and Python
  • Proficiency in Big Data Integration tools such as Informatica Big Data Management and/or Talend
  • Strong object-oriented design and analysis skills
  • Experience consuming, organizing and analyzing JSON and XML messages as data.
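The JSON/XML competency above can be sketched in a few lines of Python using only the standard library. The message shapes and field names here are hypothetical, chosen only to illustrate flattening semi-structured messages into rows for downstream loading:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical sample messages; field names are illustrative only.
json_msg = '{"id": 1, "status": "active", "tags": ["etl", "bigdata"]}'
xml_msg = '<order id="2"><status>shipped</status></order>'

# Flatten a JSON message into a row (dict) for downstream loading.
record = json.loads(json_msg)
row_from_json = {"id": record["id"], "status": record["status"]}

# Extract the same row shape from an XML message.
root = ET.fromstring(xml_msg)
row_from_xml = {"id": int(root.get("id")), "status": root.findtext("status")}

print(row_from_json)  # {'id': 1, 'status': 'active'}
print(row_from_xml)   # {'id': 2, 'status': 'shipped'}
```

In practice the same row-building step would feed an ETL tool or a Hive/HBase load rather than a print statement.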
Preferred Job Skills
  • Advanced knowledge of Data Management including Data Integration and Data Quality
  • Advanced proficiency in Informatica Big Data Management and Talend Open Studio.
  • Experience with Informatica Big Data Quality, Big Data Masking, and Enterprise Information Catalog a plus
  • Intermediate knowledge in Python and/or R scripting
  • Flair for data, schemas, and data modeling, and for bringing efficiency to the Big Data life cycle
  • Minimum 1-2 years' experience with cloud computing; Google Cloud preferred.
  • Proficiency with agile development practices
  • Experience collecting and storing data from RESTful APIs
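The RESTful API skill in the last bullet can be sketched with the Python standard library. The `{"data": [...], "next": ...}` envelope assumed here is a common pagination pattern, not a specific API's contract:

```python
import json
from urllib.request import urlopen  # stdlib HTTP client

def fetch_json(url, timeout=10):
    """Fetch a URL and decode the JSON response body (endpoint is hypothetical)."""
    with urlopen(url, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))

def parse_page(payload):
    """Pull records and the next-page cursor out of one API response.
    Assumes a common {"data": [...], "next": ...} envelope."""
    return payload.get("data", []), payload.get("next")

# Offline usage example with a canned response body (no network call):
page = json.loads('{"data": [{"id": 1}, {"id": 2}], "next": null}')
records, next_cursor = parse_page(page)
print(len(records), next_cursor)  # 2 None
```

A real collector would loop `fetch_json` + `parse_page` until the cursor is exhausted, then land the records in the target store.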