AWS, S3 Developer with SFDC Wave Analytics
Ref No.: 17-01839
Location: Atlanta, Georgia
Candidates must have both AWS/S3 development experience and SFDC Wave Analytics experience (MUST).

As a Data & Analytics Engineer supporting Cox Automotive Enterprise Platforms' Lead to Cash (L2C) transformation, you will work as an Agile team member responsible for delivering strategic analytics data solutions. In partnership with counterpart technology teams, this role is accountable for the design, development, quality, support, and adoption of production-grade data and analytics solutions. A successful engineer thrives as a collaborative member of a small team in a "start-up like" environment: they may wear many hats, be asked to solve problems beyond their current technical knowledge, and are resourceful in ensuring delivery commitments are met.

Technology Stack: MuleSoft (data movement), AWS (data processing and data repository), and SFDC Wave Analytics (visualization and presentation). Services leveraged within AWS are S3, EMR (Spark, Scala), EC2 (Bash scripting), Data Pipeline, and Redshift.
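To illustrate the kind of processing this stack runs on EMR, the sketch below uses plain Python as a local stand-in for a Spark aggregation (group records by source, sum amounts). The field names (`source`, `amount`) and the data are hypothetical, not from the posting.

```python
# Illustrative only: a local Python stand-in for a Spark-style aggregation,
# roughly equivalent to df.groupBy("source").sum("amount") on EMR.
from collections import defaultdict

# Hypothetical sample records; in the real pipeline these would come from S3.
rows = [
    {"source": "web", "amount": 100.0},
    {"source": "web", "amount": 50.0},
    {"source": "dealer", "amount": 75.0},
]

totals = defaultdict(float)
for row in rows:  # accumulate per-source totals
    totals[row["source"]] += row["amount"]
```

In the actual stack, the aggregated result would then be persisted to Redshift and surfaced in Wave Analytics.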

Responsibilities:
1. In partnership with the Product Owner and Agile team members, deliver analytics solutions, including collecting data from providers, building transformations and integrations, persisting data within repositories, and distributing it to consuming systems.
2. Working primarily within AWS, deliver event-driven data processing pipelines, and ensure data sets are captured, designed, and housed effectively (consistent, cost-optimized, and easy to support and maintain).
3. Transition MVP solutions into operationally hardened systems, including introducing reusable objects and patterns to drive automation, maintainability, and supportability.
4. Participate in backlog refinement and request decomposition, including data discovery and data analysis.
5. Proactively identify, communicate, and resolve project issues and risks that interfere with project execution.
6. Solve problems in a self-directed way: research, self-learn, and collaborate with peers to drive technical solutions.
7. Respond rapidly and work cross-functionally to resolve technical, procedural, and operational issues.
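The event-driven pipeline work described above can be sketched as follows. This is a minimal, hypothetical Python example assuming an S3-style "ObjectCreated" event payload; the names (`handle_event`, `transform`, `SINK`) and the stubbed `fetch` callable are illustrative, standing in for a real S3 read and a Redshift load.

```python
# Hedged sketch of one event-driven processing step: react to an S3-style
# event, fetch each new object's rows, transform them, and load them.
SINK = []  # stands in for a Redshift/warehouse load target

def transform(record):
    """Normalize one raw record before loading (illustrative fields)."""
    return {"lead_id": record["id"], "amount": round(float(record["amount"]), 2)}

def handle_event(event, fetch):
    """Process an S3-style event dict; `fetch` stands in for S3 GetObject."""
    loaded = 0
    for rec in event.get("Records", []):
        key = rec["s3"]["object"]["key"]  # S3 event payloads carry the object key here
        for row in fetch(key):
            SINK.append(transform(row))
            loaded += 1
    return loaded

# Usage with a stubbed "bucket" in place of S3:
fake_bucket = {"leads/2017-06-01.json": [{"id": "L1", "amount": "99.95"}]}
event = {"Records": [{"s3": {"object": {"key": "leads/2017-06-01.json"}}}]}
n = handle_event(event, fetch=lambda key: fake_bucket[key])
```

In production the trigger would typically be an S3 event notification or a Data Pipeline schedule rather than a direct call.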

Qualifications:
1. A minimum of 5 years of experience delivering analytics, reporting, or business intelligence solutions
2. A minimum of 3 years of experience developing in big data technologies (Hadoop, NoSQL, AWS)
3. Proficiency in SQL and at least one of these programming languages: Java, Scala, Python
4. Experience designing event-driven data processing pipelines
5. Comfort developing within both databases and file systems via the CLI
6. Strong hands-on technical skills and self-directed problem solving
7. MUST: Experience with MuleSoft, SFDC Sales Cloud CRM, and SFDC Wave Analytics
8. Desired: Experience with data modeling (normalization, slowly changing dimensions, star schema, data vault)
9. Desired: Experience with MPP databases (Teradata, Exadata, Netezza, Redshift)
10. Desired: Experience working on Agile teams
11. Desired: Experience with Lean software development
12. Preferred: Experience developing in Spark (Spark Streaming, DataFrames, Datasets)
13. MUST: Experience developing within AWS, especially S3, EMR, Data Pipeline, and Redshift