A machine learning model mapping Action Units from OpenFace to blendshapes from ARKit LiveFace
Updated Sep 29, 2023 · Jupyter Notebook
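The AU-to-blendshape mapping described above can be sketched as a multi-output regression from OpenFace AU intensities to ARKit blendshape coefficients. The code below is a hedged illustration with synthetic data and a plain ridge regressor, not the repository's actual model; the dimensions (17 AU intensity columns, 52 ARKit blendshapes) follow the two toolkits' documented outputs.

```python
# Hypothetical sketch: fit one multi-output linear map from OpenFace AU
# intensities to ARKit blendshape weights. Synthetic data stands in for
# paired recordings of the same face in both systems.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_frames, n_aus, n_blendshapes = 500, 17, 52

au = rng.random((n_frames, n_aus))            # AU01_r..AU45_r intensities per frame
W = rng.random((n_aus, n_blendshapes)) * 0.2  # pretend ground-truth mapping
blend = np.clip(au @ W, 0.0, 1.0)             # ARKit blendshapes live in [0, 1]

mapper = Ridge(alpha=1.0)
mapper.fit(au, blend)                         # one model, 52 outputs at once

# Predict the 52 blendshape weights for a single new frame
pred = np.clip(mapper.predict(au[:1]), 0.0, 1.0)
print(pred.shape)  # (1, 52)
```

A real pipeline would replace the synthetic arrays with frame-aligned OpenFace CSV output and ARKit face-tracking captures; the notebook linked above presumably learns a richer (possibly nonlinear) mapping.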
Searches for a dependency between "annemo" valence and arousal annotations from RECOLA and Action Units from OpenFace, using ARD regression and ARIMA.
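The ARD (automatic relevance determination) regression step above can be illustrated with scikit-learn's `ARDRegression`: it learns a per-feature prior that shrinks irrelevant coefficients toward zero, so only the AUs that actually predict the affect signal survive. This is a minimal sketch on synthetic data; the feature layout and which AUs matter are invented for illustration, not taken from the RECOLA project.

```python
# Hypothetical sketch: regress a valence-like signal onto 17 AU intensity
# features with ARD regression. Only features 5 and 11 truly drive the
# target; ARD should keep those and prune the rest.
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
X = rng.random((200, 17))               # 200 frames x 17 AU intensities
true_w = np.zeros(17)
true_w[[5, 11]] = [0.8, -0.5]           # only two AUs carry signal
y = X @ true_w + 0.05 * rng.standard_normal(200)

model = ARDRegression()
model.fit(X, y)

# Coefficients of irrelevant features are driven toward zero
relevant = np.flatnonzero(np.abs(model.coef_) > 0.1)
print(relevant)
```

The same idea scales to the arousal dimension by fitting a second model, and the ARIMA part of the project would then address the temporal structure that this per-frame regression ignores.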
A simple Action Unit player with a rendered face modeled with Candide-3.
Guided Interpretable Facial Expression Recognition via Spatial Action Unit Cues
ROS 2 Wrapper for OpenFace
Deep Region and Multi-label Learning for Facial Action Unit Detection
Examines the feasibility of using behavioral indicators of depression, including but not limited to visual and audio features, to design an effective screening model that is more accessible than traditional testing methods.
A PyTorch re-implementation of Weakly Supervised Facial Action Unit Recognition through Adversarial Training
A semi-supervised approach to detecting emotions on faces in the wild using GANs.
ROS bindings for OpenFace 2.1.0
[WACV 2024] LibreFace: An Open-Source Toolkit for Deep Facial Expression Analysis
ICface: Interpretable and Controllable Face Reenactment Using GANs
Code for BMVC paper "Joint Action Unit localisation and intensity estimation through heatmap regression"
Pytorch implementation of Multi-View Dynamic Facial Action Unit Detection, Image and Vision Computing (2018)
An out-of-the-box replication of GANimation using PyTorch; pretrained weights are available!