LivePortrait for Nuke
Introduction 📖
This project integrates LivePortrait: Efficient Portrait Animation with Stitching and Retargeting Control into The Foundry's Nuke, enabling artists to easily create animated portraits through advanced facial expression and motion transfer.
LivePortrait leverages a series of neural networks to extract information, deform, and blend reference videos with target images, producing highly realistic and expressive animations.
By integrating LivePortrait into Nuke, artists can enhance their workflows within a familiar environment, gaining additional control through Nuke's curve editor and custom knob creation.
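For example, a parameter driving the animation can be exposed as a user knob and keyframed from Python. Below is a minimal sketch using the Nuke Python API, assuming a node named LivePortrait1 and a hypothetical knob called expression_strength; the actual knob names on the shipped node may differ.
```python
# Minimal sketch (Nuke Python API): expose an animation control as a user knob
# so it can be keyframed in the curve editor. "LivePortrait1" and
# "expression_strength" are placeholder names, not part of the shipped node.
import nuke

node = nuke.toNode("LivePortrait1")

strength = nuke.Double_Knob("expression_strength", "Expression Strength")
strength.setRange(0.0, 1.0)
node.addKnob(strength)

# Keyframe the knob; the resulting curve is editable in Nuke's curve editor.
strength.setAnimated()
strength.setValueAt(0.0, 1001)   # frame 1001: no influence
strength.setValueAt(1.0, 1050)   # frame 1050: full influence
```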
This implementation provides a self-contained package as a series of Inference nodes. This allows for easy installation on any Nuke 14+ system, without requiring additional dependencies like ComfyUI or conda environments.
The current version supports video-to-image animation transfer. Future developments will expand this functionality to include video-to-video animation transfer, eyes and lips retargeting, an animal animation model, and support for additional face detection models.
🔥 For more results, visit the project homepage 🔥
Compatibility
Nuke 15.1+, tested on Linux.
Features
- Fast inference and animation transfer
- Flexible advanced options for animation control
- Seamless integration into Nuke's node graph and curve editor
- Separated network nodes for customization and workflow experimentation
- Easy installation using Nuke's Cattery system
Limitations
Maximum resolution for image output is currently 256x256 pixels (upscaled to 512x512 pixels), due to the original model's limitations.
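If your sources do not match the working resolution, they can be conformed with a standard Reformat node before the LivePortrait input. The sketch below assumes a square 512x512 working format; the format name liveportrait_512 is just a local label, and whether pre-conforming is required depends on how the node handles cropping internally.
```python
# Minimal sketch (Nuke Python API): conform a source to a square 512x512
# working format before feeding it to LivePortrait. The format name
# "liveportrait_512" is a local label chosen for this example.
import nuke

nuke.addFormat("512 512 1.0 liveportrait_512")

reformat = nuke.nodes.Reformat()
reformat["type"].setValue("to format")
reformat["format"].setValue("liveportrait_512")
reformat["resize"].setValue("fill")
```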
Installation
- Download and unzip the latest release from here.
- Copy the extracted Cattery folder to .nuke or your plugins path (see the sketch below for registering a custom path).
- In the toolbar, choose Cattery > Update or simply restart Nuke.
LivePortrait will then be accessible under the toolbar at Cattery > Stylization > LivePortrait.
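If the Cattery folder lives somewhere other than .nuke, the directory can be registered from an init.py on your Nuke plugin path. A minimal sketch follows; the path is only an example.
```python
# Minimal sketch: register a Cattery folder that lives outside ~/.nuke by
# adding this to an init.py on your Nuke plugin path. The path is an example.
import nuke

nuke.pluginAddPath("/path/to/your/plugins/Cattery")
```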
Quick Start
LivePortrait requires two inputs:
- Image (target face)
- Video reference (animation to be transferred)
Open the included demo.nk file for a working example.
A self-contained gizmo will be provided in the next release.
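The same two-input setup can also be wired by script. The sketch below mirrors the demo, but the node class name LivePortrait and its input order are assumptions; open demo.nk to confirm the exact node and connections.
```python
# Minimal sketch (Nuke Python API): recreate the two-input setup by script.
# The node class name "LivePortrait" and the input order are assumptions;
# open demo.nk to confirm the exact node and connections.
import nuke

target_image = nuke.nodes.Read(file="/path/to/target_face.exr")
reference_video = nuke.nodes.Read(file="/path/to/reference_video.####.exr", first=1001, last=1100)

live_portrait = nuke.createNode("LivePortrait")
live_portrait.setInput(0, target_image)      # assumed: input 0 takes the target image
live_portrait.setInput(1, reference_video)   # assumed: input 1 takes the video reference
```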
Release Notes
Latest version: 1.0
- Initial release
- Video to image animation transfer
- Integrated into Nuke's node graph
- Advanced options for animation control
- Easy installation with Cattery package
License and Acknowledgments
LivePortrait.cat is licensed under the MIT License, and is derived from https://github.com/KwaiVGI/LivePortrait.
While the MIT License permits commercial use of LivePortrait, the dataset used for its training and some of the underlying models may be under a non-commercial license.
This license does not cover the underlying pre-trained model, associated training data, and dependencies, which may be subject to further usage restrictions.
Consult https://github.com/KwaiVGI/LivePortrait for more information on associated licensing terms.
Users are solely responsible for ensuring that the underlying model, training data, and dependencies align with their intended usage of LivePortrait.cat.
Community Resources 🤗
Discover the invaluable resources contributed by our community to enhance your LivePortrait experience:
- ComfyUI-LivePortraitKJ by @kijai
- comfyui-liveportrait by @shadowcz007
- LivePortrait hands-on tutorial by @AI Search
- ComfyUI tutorial by @Sebastian Kamph
- LivePortrait In ComfyUI by @Benji
- Replicate Playground and cog-comfyui by @fofr
And many more amazing contributions from our community!
Acknowledgements
We would like to thank the contributors of the FOMM, Open Facevid2vid, SPADE, and InsightFace repositories for their open research and contributions.
Citation
If you find LivePortrait useful for your research, please 🌟 this repo and cite our work using the following BibTeX:
@article{guo2024liveportrait,
title = {LivePortrait: Efficient Portrait Animation with Stitching and Retargeting Control},
author = {Guo, Jianzhu and Zhang, Dingyun and Liu, Xiaoqiang and Zhong, Zhizhou and Zhang, Yuan and Wan, Pengfei and Zhang, Di},
journal = {arXiv preprint arXiv:2407.03168},
year = {2024}
}