Data Architect
Ref No.: 18-01829
Location: Chantilly, Virginia
Position Type: Contract
Start Date: 03/01/2018
Client: Experian
Position: Data Architect
Location: Irvine, CA 92626
Duration: 1 + Year
Rate: $70/hr on C2C

Must Have:
· Bachelor's or Master's degree in Computer Science, Mathematics, or another STEM discipline
· 6+ years of experience working with relational databases (Redshift, PostgreSQL, Oracle, MySQL).
· 2+ years of experience with NoSQL database solutions (MongoDB, DynamoDB, Hadoop/HBase, etc.)
· 5+ years of experience with ETL/ELT tools (e.g., Talend, Informatica, AWS Data Pipeline; preferably Talend).
· Strong knowledge of data warehousing fundamentals, relational database management systems, and dimensional modeling (star schema and snowflake schema).
· Ability to configure ETL ecosystems and perform regular data maintenance activities such as data loads, data fixes, schema updates, and database copies.
· Experience in data cleansing, enterprise data architecture, data quality, and data governance.
· Good understanding of Redshift database design using distribution styles, sort keys, and encoding features.
· Working experience with AWS cloud technologies: EC2, RDS, Data Migration Service (DMS), Schema Conversion Tool (SCT), and AWS Glue.
· Well versed in advanced query development and design using SQL and PL/SQL, including query optimization and performance tuning of applications on various databases.
· Experience supporting multiple DBMS platforms in Production/QA/UAT/DEV, in both on-premises and AWS cloud environments.
Strong Pluses:
· Experience with database partitioning strategies on various databases (PostgreSQL, Oracle).
· Experience migrating, automating, and supporting a variety of AWS-hosted databases (both RDBMS and NoSQL) in RDS and EC2 using CloudFormation templates (CFT).
· Experience with Big Data Technology Stack: Hadoop/Spark/Hive/MapR/Storm/Pig/Oozie/Kafka, etc.
· Experience with shell scripting for process automation.
· Experience with source code versioning with Git and Stash.
· Ability to work across multiple projects simultaneously.
· Strong experience in all aspects of the software lifecycle, including design, testing, and delivery.
· Ability to understand and start projects quickly.
· Ability and willingness to work with teams located in multiple time zones.