
Artistic Style Transfer for Videos

Artistic Style Transfer for Videos is a Python application that applies the style transfer technique to both images and videos. This technique reimagines your photos or videos in the style of another image, such as a famous artwork, using neural network algorithms.

An unofficial PyTorch implementation of "Artistic style transfer for videos" [arXiv].

Features

  • Model-agnostic: the fast EfficientNet-B0 is used as the feature extractor, and RAFT is used as the optical flow estimator.
  • High-order statistics of the feature grid are used to estimate the style, eliminating any dependency on the content/style image shapes.
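To illustrate why grid statistics remove the shape dependency, here is a minimal NumPy sketch (not the project's actual code) using the classic second-order case, the Gram matrix: whatever the spatial size of the feature grid, the statistics have a fixed C x C shape, so style and content images need not match in resolution.

```python
import numpy as np

def gram_matrix(features: np.ndarray) -> np.ndarray:
    """Second-order (Gram) statistics of a C x H x W feature grid.

    The result is C x C, independent of the spatial size H x W.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w)  # flatten the spatial grid
    return flat @ flat.T / (h * w)     # normalize by grid size

# Feature grids of different spatial shapes yield same-shaped statistics.
small = gram_matrix(np.random.rand(64, 8, 8))
large = gram_matrix(np.random.rand(64, 32, 24))
assert small.shape == large.shape == (64, 64)
```

Higher-order statistics generalize this idea, but the shape argument is the same: everything spatial is reduced away.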

Limitations

  • Not a real-time application.

TODO

  • Remove hardcoded paths.
  • Add long-term temporal consistency.
  • Implement multi-pass algorithm.
  • Copy audio from the original video.
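The audio-copy item could be handled by muxing with ffmpeg after the stylized frames are joined. A hedged sketch (the file names and the `copy_audio` helper are hypothetical, and ffmpeg must be installed to actually run the command):

```python
import subprocess

def copy_audio(original: str, stylized: str, output: str) -> list:
    """Build an ffmpeg command that combines the stylized video stream
    with the audio stream of the original video, copying both streams
    without re-encoding."""
    return [
        "ffmpeg", "-y",
        "-i", stylized,   # input 0: stylized video
        "-i", original,   # input 1: original video (audio source)
        "-map", "0:v",    # take the video stream from input 0
        "-map", "1:a?",   # take the audio stream from input 1, if present
        "-c", "copy",     # copy streams, no re-encoding
        output,
    ]

cmd = copy_audio("input.mp4", "stylized.mp4", "final.mp4")
# subprocess.run(cmd, check=True)  # uncomment once ffmpeg is available
```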

Installation

Prerequisites

  • Python 3.7+
  • pip
  • Git LFS (optional)

Install

  1. Clone the repository (with Git LFS support, if you want the bundled example files).
  2. Verify that the example files are downloaded with:
    git lfs pull
  3. Install dependencies with:
    python -m pip install -U -r requirements.txt

Video Style Transfer

Apply style transfer to video files, transforming each frame to carry the artistic style of your choice.

Usage

First, prepare your video for style transfer:

python main_video_prepare.py

Next, apply the style transfer to the video:

python main_video.py

Finally, join the processed frames back into a video:

python main_video_join.py

Arguments and file paths are currently hardcoded in main_video_prepare.py, main_video.py, and main_video_join.py; edit them in each script before running.

Alternatives

License

Licensed under either of

  • Apache License, Version 2.0
  • MIT License

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.

Acknowledgements

This project is inspired by the pioneering work in neural style transfer. Special thanks to the authors of the original research papers and the open-source community for making their code available.