Official Telegram Discussion Group
Join the Telegram discussion group to ask questions, get help, or chat with other users and the developers.
Download Windows Releases
The latest Windows release is based on version 4.8.1. A GUI is not yet available for 5.0.0, but it is already under development. Go to the Quick Start section for usage instructions. Try the mirror if you can't download releases directly from GitHub.
Google Colab
You can use Video2X on Google Colab for free if you don't have a powerful GPU of your own. You can borrow a powerful GPU (Tesla K80, T4, P4, or P100) on Google's servers for free for a maximum of 12 hours per session. Please use the free resource fairly: do not create sessions back-to-back and run upscaling 24/7, as this might get you banned. You can get Colab Pro/Pro+ if you'd like better GPUs and longer runtimes. Usage instructions are embedded in the Colab notebook.
Download Nightly Releases
Nightly releases are automatically created by the GitHub Actions CI/CD pipelines. They usually contain more experimental features and bug fixes, but they are much less stable than the stable releases. You must log in to GitHub to download CI build artifacts.
Container Image
Video2X container images are available on the GitHub Container Registry for easy deployment on Linux and macOS. If you already have Docker/Podman installed, only one command is needed to start upscaling a video. For more information on how to use Video2X's Docker image, please refer to the documentation (outdated).
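For a rough idea of what that single command looks like, here is the full example from the Quick Start section below, collapsed onto one line (see that section for an explanation of each option):
docker run -it --rm --gpus all -v /dev/dri:/dev/dri -v $PWD:/host ghcr.io/k4yt3x/video2x:5.0.0-beta1-cuda -i sample_input.mp4 -o output.mp4 -p5 upscale -h 720 -d waifu2x -n3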
Introduction
Video2X is video/GIF/image upscaling software based on Waifu2X, Anime4K, SRMD, and RealSR, written in Python 3. It upscales videos, GIFs, and images, restoring details from low-resolution inputs. Video2X also supports GIF input to video output and video input to GIF output.
Currently, Video2X supports the following drivers (implementations of the upscaling algorithms). A minimal example of selecting a driver on the command line follows this list.
- Waifu2X Caffe: Caffe implementation of waifu2x
- Waifu2X Converter CPP: CPP implementation of waifu2x based on OpenCL and OpenCV
- Waifu2X NCNN Vulkan: NCNN implementation of waifu2x based on Vulkan API
- SRMD NCNN Vulkan: NCNN implementation of SRMD based on Vulkan API
- RealSR NCNN Vulkan: NCNN implementation of RealSR based on Vulkan API
- Anime4KCPP: CPP implementation of Anime4K
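A driver is selected with the -d option on the command line. Below is a minimal sketch; the driver identifier waifu2x_ncnn_vulkan is an assumption here, while waifu2x_caffe is the identifier used in the Quick Start examples further down:
python video2x.py -i sample-input.mp4 -o sample-output.mp4 -r 2 -d waifu2x_ncnn_vulkan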
Video Upscaling
Upscale Comparison Demonstration
You can watch the whole demo video on YouTube: https://youtu.be/mGEfasQl2Zo
The clip is from the trailer of the animated film 千と千尋の神隠し (Spirited Away). Copyright belongs to 株式会社スタジオジブリ (STUDIO GHIBLI INC.). The clip will be removed immediately if its use violates copyright.
GIF Upscaling
The original input GIF is 160x120 in size. It has been downsized and accelerated to 20 FPS from the original source.
Catfru original 160x120 GIF image
Below is what it looks like after getting upscaled to 640x480 (4x) using Video2X.
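For reference, a command along the following lines would produce such a 4x GIF upscale (a sketch only: the file names are hypothetical and the driver choice is arbitrary):
python video2x.py -i catfru.gif -o catfru-4x.gif -r 4 -d waifu2x_caffe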
Image Upscaling
Original image from nananicu@twitter, edited by K4YT3X.
All Demo Videos
Below is a list of all the available demo videos, sorted from newest to oldest.
- Bad Apple!!
- YouTube: https://youtu.be/A81rW_FI3cw
- Bilibili: https://www.bilibili.com/video/BV16K411K7ue
- The Pet Girl of Sakurasou 240P to 1080P 60FPS
- Original name: さくら荘のペットな彼女
- YouTube: https://youtu.be/M0vDI1HH2_Y
- Bilibili: https://www.bilibili.com/video/BV14k4y167KP/
- Spirited Away (360P to 4K)
- Original name: 千と千尋の神隠し
- YouTube: https://youtu.be/mGEfasQl2Zo
- Bilibili: https://www.bilibili.com/video/BV1V5411471i/
Screenshots
Video2X GUI
Video2X CLI
Sample Videos
If you can't find a video clip to begin with, or if you want to see a before-and-after comparison, we have prepared some sample clips for you. The quick start guide below is also based on the names of these sample clips.
- Sample Video (240P) 4.54MB
- Sample Video Upscaled (1080P) 4.54MB
- Sample Video Original (1080P) 22.2MB
The clip is from the anime さくら荘のペットな彼女 (The Pet Girl of Sakurasou). Copyright belongs to 株式会社アニプレックス (Aniplex Inc.). The clip will be removed immediately if its use violates copyright.
Quick Start
Prerequisites
Before running Video2X, you'll need to ensure you have installed the drivers' external dependencies, such as GPU drivers (a quick sanity check for Nvidia users is sketched after this list).
- waifu2x-caffe
- GPU mode: Nvidia graphics card driver
- cuDNN mode: Nvidia CUDA and cuDNN
- Other Drivers
- GPU driver if you want to use GPU for processing
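If you plan to use an Nvidia GPU, the commands below are an optional sanity check that the graphics driver and CUDA toolkit are installed and visible. They are standard Nvidia tools, not part of Video2X:
nvidia-smi       # prints the installed Nvidia driver version and the highest CUDA version it supports
nvcc --version   # prints the installed CUDA toolkit version (only available if the CUDA toolkit is installed)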
Running Video2X (GUI)
The easiest way to run Video2X is to use the full build. Extract the full release zip file and you'll get these files.
Simply double click on video2x_gui.exe to launch the GUI.
Then, drag the videos you wish to upscale into the window and select the appropriate output path.
Drag and drop file into Video2X GUI
Tweak the settings if you want to, then hit the start button at the bottom and the upscale will start. Now you'll just have to wait for it to complete.
Video2X started processing input files
Running Video2X (CLI)
Basic Upscale Example
The example command below uses waifu2x-caffe to enlarge the video sample-input.mp4 to double its original size.
python video2x.py -i sample-input.mp4 -o sample-output.mp4 -r 2 -d waifu2x_caffe
Advanced Upscale Example
If you would like to tweak engine-specific settings, either specify the corresponding arguments after --, or edit the corresponding fields in the configuration file video2x.yaml. Command line arguments will overwrite the default values in the config file.
The example below enables TTA for waifu2x-caffe.
python video2x.py -i sample-input.mp4 -o sample-output.mp4 -r 2 -d waifu2x_caffe -- --tta 1
To see a help page for driver-specific settings, use -d to select the driver and append -- --help as demonstrated below. This will print all driver-specific settings and descriptions.
python video2x.py -d waifu2x_caffe -- --help
Running Video2X (Docker)
Video2X can be deployed via Docker. The command below upscales the video sample_input.mp4 with waifu2x NCNN Vulkan and outputs the upscaled video to output.mp4. For more details on Video2X Docker image usage, please refer to the documentation (outdated).
# --rm: temporary container, deleted after the run
# --gpus all -v /dev/dri:/dev/dri: mount the GPUs
# -v $PWD:/host: bind mount the current directory as the container's /host
# ghcr.io/k4yt3x/video2x:5.0.0-beta1-cuda: the Docker image to run
# -i / -o: the input file path and the output file path
# -p5: launch 5 processes
# upscale: set the action to upscale
# -h 720: set the output height to 720 pixels
# -d waifu2x: use the waifu2x driver
# -n3: noise reduction level 3
docker run -it --rm \
  --gpus all -v /dev/dri:/dev/dri \
  -v $PWD:/host \
  ghcr.io/k4yt3x/video2x:5.0.0-beta1-cuda \
  -i sample_input.mp4 \
  -o output.mp4 \
  -p5 \
  upscale \
  -h 720 \
  -d waifu2x \
  -n3
To interpolate a video, set the action to interpolate. Right now, only 2x frame rate is supported.
docker run -it --rm \
  --gpus all -v /dev/dri:/dev/dri \
  -v $PWD:/host \
  ghcr.io/k4yt3x/video2x:5.0.0-beta1-cuda \
  -i sample_input.mp4 \
  -o output.mp4 \
  interpolate   # set the action to interpolate
Documentation
Video2X Wiki
You can find all of the detailed user-facing and developer-facing documentation in the Video2X Wiki. It covers everything from step-by-step instructions for beginners to the code structure of this program for advanced users and developers. If this README doesn't answer all of your questions, the wiki is where you should head.
Drivers
Go to the Drivers wiki page if you want a detailed description of the different drivers implemented by Video2X. That page explains the differences between the drivers and how to download and set each of them up for Video2X.
Q&A
If you have any questions, first try visiting our Q&A page to see if your question is answered there. If not, open an issue and we will respond to your questions ASAP. Alternatively, you can also join our Telegram discussion group and ask your questions there.
History
Are you interested in how the idea of Video2X was born? Do you want to know the stories and history behind Video2X's development? Head over to this page.
License
This project is licensed under the GNU Affero General Public License Version 3 (GNU AGPL v3)
Copyright (c) 2018-2022 K4YT3X and contributors.
This project includes or depends on the following projects:
Project | License |
---|---|
FFmpeg | LGPLv2.1, GPLv2 |
waifu2x-ncnn-vulkan | MIT License |
srmd-ncnn-vulkan | MIT License |
realsr-ncnn-vulkan | MIT License |
rife-ncnn-vulkan | MIT License |
ffmpeg-python | Apache-2.0 |
Loguru | MIT License |
opencv-python | MIT License |
Pillow | HPND License |
Rich | MIT License |
tqdm | MPLv2.0, MIT License |
Legacy versions of this project include or depend on the following projects:
Project | License |
---|---|
waifu2x-caffe | MIT License |
waifu2x-converter-cpp | MIT License |
Anime4K | MIT License |
Anime4KCPP | MIT License |
Gifski | AGPLv3 |
More licensing information can be found in the NOTICE file.
Special Thanks
Thanks to the following people, who have contributed significantly to the project.
Similar Projects
- Dandere2x: A lossy video upscaler also built around waifu2x, but with video compression techniques to shorten the time needed to process a video.
- Waifu2x-Extension-GUI: A similar project that focuses solely on building a better graphical user interface. It is built using C++ and Qt5, and currently only supports Windows.