FaceMeshTracking is a real-time face tracking plugin for Unity. It detects 478 facial landmark points and generates a fully animated 468-vertex FaceMesh that accurately follows the user's facial shape and expressions.
The system reconstructs a runtime 3D facemesh from the tracked facial landmarks, allowing developers to apply materials to the face, attach AR objects, or animate a custom facemesh that mirrors the user's facial movements in real time.
You can either generate the facemesh automatically at runtime or use a custom facemesh model with 468 vertices that maps directly to the tracking topology. This allows precise facial deformation and realistic expression animation.
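The relationship between the 478 tracked landmarks and the 468-vertex mesh can be sketched as below. This is an illustrative Python sketch, not the plugin's actual API: it assumes the common MediaPipe-style topology in which the first 468 landmarks are the mesh vertices and the final 10 are iris refinement points. The function name and layout are hypothetical; verify the landmark ordering against the plugin's documentation before relying on it.

```python
# Illustrative sketch (not the plugin's API): mapping 478 tracked
# landmarks onto a 468-vertex face mesh, assuming a MediaPipe-style
# layout where indices 0..467 are mesh vertices and 468..477 are
# iris refinement points.

MESH_VERTEX_COUNT = 468
IRIS_POINT_COUNT = 10

def landmarks_to_mesh_vertices(landmarks):
    """Return the 468 vertex positions used to update the mesh.

    `landmarks` is a sequence of 478 (x, y, z) tuples in any
    consistent coordinate space (e.g. normalized camera space).
    """
    expected = MESH_VERTEX_COUNT + IRIS_POINT_COUNT
    if len(landmarks) != expected:
        raise ValueError(f"expected {expected} landmarks, got {len(landmarks)}")
    # The trailing iris points refine eye tracking but are not part
    # of the deformable face mesh, so they are dropped here.
    return landmarks[:MESH_VERTEX_COUNT]

# Example: one dummy frame of 478 points.
frame = [(i * 0.001, 0.0, 0.0) for i in range(478)]
vertices = landmarks_to_mesh_vertices(frame)
print(len(vertices))  # 468
```

Because the custom facemesh shares this vertex ordering, each tracked frame can drive the mesh by writing the 468 positions straight into its vertex buffer, which is what makes per-vertex deformation and expression animation possible.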
FaceMeshTracking also includes AR template facemesh prefabs, making it easy to attach AR objects such as sunglasses, hats, masks, or other face accessories. Simply place your AR content on the template mesh and it will automatically follow the user's face.
The plugin runs entirely locally on device with no API keys, cloud services, or third-party dependencies required. It works with standard RGB cameras and does not require a depth camera.
FaceMeshTracking supports Android, iOS, Windows, macOS, and Linux, and can run directly inside the Unity Editor, making development, testing, and debugging easy.
Key Features
Use Cases