Data Engineer
Ref No.: 18-01995
Location: Burbank, California
Position Type: Contract
Start Date: 02/26/2018
Client: Machinima
Duration: 9-month contract
Pay Rate: $50 - $60 per hour
Requirements:
Must Haves:
  • ETL development with multiple sources, preferably including API experience with social media (ideal), healthcare, or similar domains (3+ years).
  • Experience with a GUI data integration/ETL tool such as one of the following: Pentaho PDI (aka Kettle), Talend, SSIS, DataStage, Informatica, Ab Initio, ODI, Twister, etc. (1.5+ years).
  • Python development and Bash scripting (2+ years).
  • SQL on one or two DBMSs, e.g. PostgreSQL, MySQL, Redshift, Snowflake, or another mainstream DBMS (3+ years).
  • Data and system design (2+ years), preferably with some tool experience, e.g. Visio, ER-Studio, ERwin, TDM, etc.
Nice to Haves (a strong plus for two or more of the following):
  • Big Data related experience, e.g. Redshift, Vertica, Presto, Spark SQL, Kafka, Kinesis, AWS Lambda, Glue.
  • PostgreSQL on Linux experience as back-end developer and some DBA skill.
  • Cloud experience, ideally AWS -- or alternate, e.g. Azure or Google Cloud Compute.
  • Social Media and Business platform API development experience with Google/YouTube, Facebook, Twitter, Instagram, Pinterest, Go90, Salesforce, SAP, etc.
  • Tableau or MicroStrategy for BI/Analytics/Visualization. Metadata admin experience also useful.
  • SCM, DevOps, and some SysOps experience, e.g. Git, Docker, Ansible, Airflow, basic Linux admin (with a DBA focus), storage admin (e.g. ZFS filesystems).
  • Background in other software development languages, e.g. Ruby, Java, JavaScript, Scala.
  • Experience with gaming, entertainment, and Internet monetization business models.
Responsibilities:
  • Hands-on design and development work to migrate and enhance our existing ETL pipelines as we transition to our growing cloud-based data integration environment.
  • Create, enhance, and maintain existing data integrations/ETL pipelines in Python, Pentaho PDI, and some Ruby and Bash. (Back ends on Snowflake, Redshift, and PostgreSQL; the technical stack may evolve over time. A minimal sketch of this style of pipeline work follows the Responsibilities list.)
    Note: this role does not involve building new components from scratch; the engineer will leverage existing patterns, e.g. taking an existing component and customizing it for something new being built.
  • Document parts of the current and future processing framework.
  • Work closely with other groups to satisfy data requirements. This includes collaborating with Business Intelligence, Finance, Accounting, Talent, Contracts, Client, Content Ops, Biz Ops, Distribution, et al.
  • Help design and develop systems to evolve and migrate operational, data warehouse, and BI infrastructure.
  • Participate in development/system/database operations as needed (DevOps, SysOps, and DBA work).
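
For candidates unfamiliar with this style of work, the following is a minimal, hypothetical Python sketch of an extract-transform-load step like those described above: pull records from a REST API and bulk-load them into a PostgreSQL staging table. The endpoint, table name, field names, and PG_DSN environment variable are illustrative assumptions, not actual project code.

# Hypothetical ETL sketch -- illustrative only, not the client's pipeline code.
# Extract records from a placeholder REST API, apply a light transform,
# and bulk-insert them into a PostgreSQL staging table.
import os

import requests
import psycopg2
from psycopg2.extras import execute_values

API_URL = "https://api.example.com/v1/videos"   # placeholder endpoint
TABLE = "staging.video_stats"                   # placeholder table

def extract():
    """Pull one page of records from the source API."""
    resp = requests.get(API_URL, params={"limit": 100}, timeout=30)
    resp.raise_for_status()
    return resp.json()["items"]

def transform(items):
    """Keep only the fields the warehouse needs."""
    return [(i["id"], i["title"], i.get("views", 0)) for i in items]

def load(rows):
    """Bulk-insert transformed rows into PostgreSQL."""
    conn = psycopg2.connect(os.environ["PG_DSN"])  # e.g. "dbname=dw user=etl"
    try:
        with conn, conn.cursor() as cur:
            execute_values(
                cur,
                f"INSERT INTO {TABLE} (video_id, title, views) VALUES %s",
                rows,
            )
    finally:
        conn.close()

if __name__ == "__main__":
    load(transform(extract()))

In practice a step like this would typically be parameterized and scheduled from an orchestration layer (e.g. Airflow or a Pentaho PDI job) rather than run as a standalone script.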