Research Scientist, 3D-CV & Machine Perception

Employer

Meta

Job Description

The Meta Reality Labs Research organization brings together a world-class team of researchers, developers, and engineers to create the future of AR and VR. The Surreal Vision group at FRL Research is seeking exceptional Research Scientists to conduct research on, and help build, the egocentric machine perception capabilities that will underpin future VR and contextual-AI-enabled AR devices. We do this by developing, and building on, Project Aria. This role focuses on researching and formulating a new generation of end-to-end, real-time, egocentric methods for identifying the objects a wearer interacts with and why, as well as for building 3D maps of the relevant objects around them.

Research Scientist, 3D-CV & Machine Perception Responsibilities:

  • Plan and execute cutting-edge research and development to advance the state of the art in machine perception, mapping, object tracking and localization, and 3D scene understanding.
  • Collaborate with other researchers and engineers across machine perception teams at Meta to develop experiments, prototypes, and concepts that advance the state of the art in AR/VR systems.
  • Work with the team to help design, set up, and run practical experiments and prototype systems related to large-scale, long-duration sensing and machine reasoning.

Minimum Qualifications:

  • Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field, or equivalent practical experience.
  • Hands-on experience implementing 3D computer vision algorithms and training ML models end to end, from data collection and model design through evaluation.
  • Experience working in both C++ and Python environments.
  • PhD in Computer Vision, Robotics, Machine Perception, or a related field.
  • 1+ years of postgraduate experience in academia or industry.

Preferred Qualifications:

  • Knowledge and hands-on experience working with 3D and projective geometry, in addition to image-space CV.
  • Knowledge and hands-on experience with training/evaluating/using state-of-the-art ML models and architectures.
  • Strong track record of published research in object tracking, detection, and segmentation, or in 3D semantic scene understanding.
  • Experience working in a Unix environment.
  • Strong communication and collaboration skills.

About Meta:

Meta builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. People who choose to build their careers by building with us at Meta help shape a future that will take us beyond what digital connection makes possible today—beyond the constraints of screens, the limits of distance, and even the rules of physics.

Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.

Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.