Data Warehouse Developer
Ref No.: 18-55193
Location: San Diego, California
Position Type: Full Time/Contract
Start Date: 08/02/2018
Full-Time, Permanent


Job Description:
The Data Warehouse Developer will be an integral part of the BI and Analytics team and is responsible for delivering various BI and Analytics initiatives. They participate in defining and delivering the data- and analytics-based insights that business teams can leverage. They align solutions, releases, and sprints with the program and project delivery cadence and are responsible for ensuring practical, implementable alignment with the organization's enterprise technology and architecture vision.
This position will work closely with Scrum Masters, Data Architects, QA, and DevOps, as well as multiple organizations within the company.
Partner with Data Architects, Product Managers, and Scrum Masters to deliver the data integrations and BI solutions required for various projects
Enable Continuous Delivery (CD) to production for all data warehousing and BI builds
Collaborate with DevOps team to align with CI/CD requirements for assigned projects
Ability to understand end-to-end data integration requirements and response-time SLAs to build data-driven solutions that provide a best-in-class customer experience
Ability to develop scripts (Unix shell, Python, etc.) to extract, load, and transform data
Ability to write SQL queries against major databases such as Oracle, Netezza, Snowflake, SQL Server, etc.
Ability to provide production support for data warehouse issues such as data load problems, transformation/translation problems, etc.
Ability to integrate on-premises infrastructure with public cloud (AWS, Azure) infrastructure
Ability to develop ETL pipelines in and out of the data warehouse using a combination of Python and SnowSQL
Ability to translate BI and reporting requirements into database design and report design
Ability to understand data transformation and translation requirements and choose the right tools for the job
Ability to understand data pipelines and modern ways of automating them using cloud-based and on-premises technologies
Actively test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions.
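As a miniature illustration of the extract/load/transform work described above, the sketch below uses only Python's standard library, with an in-memory sqlite3 database standing in for a warehouse such as Snowflake or Oracle; the table name, columns, and sample data are all hypothetical:

```python
import csv
import io
import sqlite3

# Hypothetical source-system extract; a real pipeline would read this
# from a file drop, database, or API rather than an inline string.
RAW_CSV = """order_id,amount,currency
1001,25.50,usd
1002,10.00,USD
"""

def extract(text):
    """Parse CSV rows from the source extract into dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Normalize types and casing before loading."""
    return [(int(r["order_id"]), float(r["amount"]), r["currency"].upper())
            for r in rows]

def load(rows, conn):
    """Load the transformed rows into an illustrative staging table."""
    conn.execute("CREATE TABLE IF NOT EXISTS stg_orders "
                 "(order_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM stg_orders").fetchone())
# → (2, 35.5)
```

In a production pipeline the same extract/transform/load shape would typically be wired to a warehouse connector and a scheduler, with logging and failure handling around each stage.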

Education/Experience:
3-5 years' experience with major data warehousing platforms such as Oracle, Teradata, Netezza, AWS Redshift, Snowflake, etc.
3-5 years' experience developing ETL, ELT and Data Warehousing solutions
3-5 years' experience with AWS, Azure, or Google Cloud
3-5 years' experience in data modeling using ERWIN
3-5 years' experience developing Python based code that reads/writes data into databases
3-5 years' experience developing SQL scripts and stored procedures that process data from databases
3-5 years' experience in loading source system data extracts into data warehouse
3-5 years' experience with batch job scheduling and identifying data/job dependencies
3-5 years' experience with automation of DevOps build using Bitbucket/Jenkins/Maven
Strong Linux experience with shell scripting and Python scripting
3-5 years' experience with REST API development and consumption
Strong understanding of various data formats such as CSV, XML, JSON, etc.
Strong understanding of incident management and change management processes to support day-to-day production issues
Experience working directly with technical and business teams.
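The data-format fluency listed above can be sketched with a small example: parsing the same hypothetical order record from JSON and XML using Python's standard library and confirming both yield the same values. All field names here are invented for illustration:

```python
import json
import xml.etree.ElementTree as ET

# The same hypothetical order record in two of the formats named above.
JSON_DOC = '{"order_id": 1001, "amount": 25.5, "currency": "USD"}'
XML_DOC = ("<order><order_id>1001</order_id>"
           "<amount>25.5</amount><currency>USD</currency></order>")

# JSON maps directly onto Python dicts and native types.
order_from_json = json.loads(JSON_DOC)

# XML needs explicit type conversion: every text node is a string.
root = ET.fromstring(XML_DOC)
order_from_xml = {
    "order_id": int(root.findtext("order_id")),
    "amount": float(root.findtext("amount")),
    "currency": root.findtext("currency"),
}

print(order_from_json == order_from_xml)  # → True
```

The design point worth knowing for interviews: JSON carries type information (numbers vs. strings), while XML and CSV deliver everything as text, so the transform layer must own the type conversions.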