Data Engineer II
Ref No.: 19-03339
Location: Menlo Park, CA
Rate: $92.19/hour

Half of the world's population is not yet connected to the Internet. Our client is researching next-generation technologies to connect everyone on the planet to the Internet, and they are looking for a data engineer to help bring their models of global connectivity to life.

In this role, you will help plan and implement the transfer of connectivity data and models onto the client's servers, identify opportunities to improve and streamline the models, and create data visualizations. The ideal candidate is passionate about Facebook and global connectivity, with a strong technical, data, and project management skill set and a working knowledge of data visualization. You will have a background in a related quantitative or technical field, experience working with large data sets, strong communication skills, and experience creating data-driven insights.

Responsibilities:

  • Work closely with the Facebook Connectivity Understand (FBCU) team to understand their data models and transform them into modular SQL/Python-based models.
  • Develop data pipelines from the numerous external sources on which the models run.
  • Apply your expertise in data analysis and scripting to identify areas to improve the models both in terms of reliability and enhanced data sources.
  • Create data visualizations that communicate key data insights to the FBCU team, programs, and to our broader cross-functional stakeholders.
  • Own the end-to-end data engineering, analysis, and presentation components of the solution.
  • Collaborate with the program's SMEs and data scientists.

Requirements:

  • 5+ years of relevant experience in data science and analytics.
  • Understanding of statistical analysis.
  • Strong interpersonal skills.
  • Ability to turn vague concepts and asks into well-documented and effective SQL models and data visualizations.
  • Development experience in Python, SQL, and the command line, with expert working knowledge of SQL.
  • Proficiency in Big Data stack environments (Hadoop, MapReduce, Hive).
  • Competence with relational databases.
  • Coding and scripting experience, including experience with visualization tools like Tableau.
  • Fluency in English.

Education: BS/MS in Computer Science, Statistics, Engineering, or a related technical field.