Job Description
Facebook's mission is to give people the power to build community and bring the world closer together. Through our family of apps and services, we're building a different kind of company that connects billions of people around the world, gives them ways to share what matters most to them, and helps bring people closer together. Whether we're creating new products or helping a small business expand its reach, people at Facebook are builders at heart. Our global teams are constantly iterating, solving problems, and working together to empower people around the world to build community and connect in meaningful ways. Together, we can help people build stronger communities. We're just getting started.
Facebook Reality Labs is dedicated to the research and development required to bring virtual and augmented reality to billions of people around the world. At our lab, we aspire to a vision of social VR and AR in which people can interact with each other across distances in a way that is indistinguishable from in-person interaction. We are looking for an exceptional researcher with a proven track record at the intersection of Computer Vision, Machine Learning and Graphics who is also an outstanding software engineer, able to prototype the algorithms they invent. As a Research Scientist at FRL, you will pursue research and work with other Researchers and Engineers to solve challenges at the forefront of the field and transform virtual reality from dream to reality. We're looking for a creative researcher to usher in the next era of human-computer interaction by solving open and exciting computer vision problems.
Responsibilities
- Participating in cutting-edge research in computer vision, graphics and machine learning
- Developing efficient deep neural network models for 3D content generation and tracking
- Contributing research that can be applied to Oculus product development
- Publishing research results in top-tier journals and at leading international conferences
Minimum Qualifications
- Currently has, or is in the process of obtaining, a PhD and/or postdoctoral assignment in computer vision, computer graphics, machine learning or a related field.
- Interpersonal skills: experience with cross-group and cross-cultural collaboration.
- Experience using generative neural networks for 2D/3D modeling of faces and bodies.
- Experience with machine learning and deep learning tools (e.g., PyTorch, NumPy).
- Must obtain work authorization in the country of employment at the time of hire, and maintain ongoing work authorization during employment.
Preferred Qualifications
- Proven track record of achieving significant results, as demonstrated by grants, fellowships, and patents, as well as first-authored publications at leading conferences (e.g., NIPS, ICLR, ICML, SIGGRAPH, CVPR, ECCV and ICCV) or journals (e.g., PAMI, IJCV, JMLR, TOG).
- Experience with C++ and CUDA programming.
- 3+ years of experience with deep learning tools (e.g., PyTorch, NumPy).
- Experience with generative models (e.g., VAEs, GANs), especially for human modeling (e.g., faces, hands, bodies, clothing).
- Demonstrated software engineering experience via an internship, work experience, coding competitions, or widely used contributions to open-source repositories (e.g., GitHub).
Facebook is proud to be an Equal Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Facebook is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.