Data Architect
Ref No.: 18-59108
Location: Irvine, California
Position Type: Contract
Start Date: 08/15/2018
The ideal candidate must demonstrate in-depth knowledge and understanding of RDBMS concepts and be experienced in writing complex queries and data integration processes in SQL/T-SQL and NoSQL. This individual will be responsible for supporting the design, development, and implementation of new and existing applications.

Responsibilities:
  • Responsible for providing subject matter expertise in the design of database schemas and performing data modeling (logical and physical models) for product feature enhancements as well as extending analytical capabilities.
  • Review existing database design and data management procedures and provide recommendations for improvement.
  • Develop technical documentation as needed.
  • Architect, develop, validate and communicate Business Intelligence (BI) solutions like dashboards, reports, KPIs, instrumentation, and alert tools.
  • Define data architecture requirements for cross-product integration within and across cloud-based platforms.
  • Analyze, architect, develop, validate, and support the integration of data into the SEW platform from external data sources: files (XML, CSV, XLS, etc.), APIs (REST, SOAP), and RDBMSs.
  • Perform thorough analysis of complex data and recommend actionable strategies.
  • Effectively translate data modeling and BI requirements into the design process.
  • Design Big Data platforms, including tool selection, data integration, and data preparation for predictive modeling.

Required Skills:
  • Minimum of 4-6 years of experience in data modeling (including conceptual, logical, and physical data models).
  • 2-3 years of experience in Extraction, Transformation, and Loading (ETL) work using data migration tools such as Talend, Informatica, DataStage, etc.
  • 4-6 years of experience as a database developer in Oracle, MS SQL Server, or another enterprise database, with a focus on building data integration processes.
  • Exposure to a NoSQL technology, preferably MongoDB.
  • Experience processing large data volumes, as demonstrated by work with Big Data platforms (Teradata, Netezza, Vertica, Cloudera, Hortonworks, SAP HANA, Cassandra, etc.).
  • Understanding of data warehousing concepts and decision support systems.
  • Ability to handle sensitive and confidential material and adhere to worldwide data security policies.
  • Experience writing documentation for design and feature requirements.
  • Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS, Azure, etc.
  • Excellent communication and collaboration skills.