LivePortrait AI: Transform Static Photos into Talking Videos. It now supports Video to Video conversion and Superior Expression Transfer at Remarkable Speed


Furkan Gözükara

Jul 22, 2024, 1:12:59 PM
to SECourses

A new tutorial is planned to showcase the latest V3 changes and features, including Video to Video capabilities and other enhancements.

This post provides information for both Windows (local) and cloud installations (Massed Compute, RunPod, and a free Kaggle account).

The V3 update introduces video-to-video functionality. If you're looking for a one-click way to install the open-source, zero-shot image-to-animation application LivePortrait on Windows for local use, this tutorial is essential. Simply provide a static image and a driving video, and within seconds you'll have an impressive working animation. LivePortrait is incredibly fast and transfers the facial expressions of the driving video with remarkable fidelity. The results are truly mind-blowing.
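If you prefer the command line over the Gradio app, the workflow above (one source image plus one driving video in, one animation out) maps to a single script call in the official repository. Below is a minimal sketch only: the script name (inference.py) and the -s / -d flags follow the repo's README at the time of writing and may change, and the file paths (my_portrait.jpg, driving.mp4, LivePortrait/) are placeholders.

```python
# Minimal sketch: run LivePortrait's command-line inference on one
# source image + driving video pair. Assumes the official repo is
# already cloned and its Python environment is active; script name
# and flags follow the repo README and should be verified locally.
import subprocess
from pathlib import Path

REPO_DIR = Path("LivePortrait")          # path to the cloned repository (placeholder)
SOURCE_IMAGE = Path("my_portrait.jpg")   # static portrait to animate (placeholder)
DRIVING_VIDEO = Path("driving.mp4")      # video whose expressions are transferred (placeholder)

subprocess.run(
    [
        "python", "inference.py",
        "-s", str(SOURCE_IMAGE.resolve()),
        "-d", str(DRIVING_VIDEO.resolve()),
    ],
    cwd=REPO_DIR,   # the script expects to be run from the repo root
    check=True,     # raise if the process exits with an error
)
# The rendered animation is written to the repo's output folder
# (animations/ in recent versions; the exact location may differ).
```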

🔗 Windows Local Installation Tutorial ️⤵️
▶️ https://youtu.be/FPtpNrmuwXk

🔗 LivePortrait Installers Scripts ⤵️
▶️ https://www.patreon.com/posts/107609670

🔗 Requirements Step by Step Tutorial ⤵️
▶️ https://youtu.be/-NjNy7afOQ0

🔗 Cloud Massed Compute, RunPod & Kaggle Tutorial (Mac users can follow this tutorial) ⤵️
▶️ https://youtu.be/wG7oPp01COg

🔗 Official LivePortrait GitHub Repository ⤵️
▶️ https://github.com/KwaiVGI/LivePortrait

🔗 SECourses Discord Channel to Get Full Support ⤵️
▶️ https://discord.com/servers/software-engineering-courses-secourses-772774097734074388

🔗 Paper of LivePortrait: Efficient Portrait Animation with Stitching and Retargeting Control ⤵️
▶️ https://arxiv.org/pdf/2407.03168

The video tutorial covers the following topics:
0:00 Introduction to the state-of-the-art image to animation open-source application LivePortrait
2:20 Downloading and installing the LivePortrait Gradio application on your computer
3:27 Requirements for the LivePortrait application and their installation
4:07 Verifying the accurate installation of requirements
5:02 Confirming successful installation completion and saving installation logs
5:37 Starting the LivePortrait application post-installation
5:57 Extra materials provided, including portrait images, driving video, and rendered videos
7:28 Using the LivePortrait application
8:06 VRAM usage when generating a 73-second long animation video
8:33 Animating the first image
8:50 Monitoring the animation process status
10:10 First animation video rendering completion
10:24 Resolution of the rendered animation videos
10:45 Original output resolution of LivePortrait
11:27 Improvements and new features coded on top of the official demo app
11:51 Default save location for generated animated videos
12:35 The effect of the Relative Motion option
13:41 The effect of the Do Crop option
14:17 The effect of the Paste Back option
15:01 The effect of the Target Eyelid Open Ratio option
17:02 Joining the SECourses Discord channel

If you want to use LivePortrait but lack a powerful GPU, are on a Mac, or simply prefer the cloud, this tutorial is ideal. It guides you through the one-click installation and use of LivePortrait on #MassedCompute, #RunPod, and even a free #Kaggle account. After this tutorial, running LivePortrait on cloud services will be as simple as running it on your own computer. LivePortrait is the latest state-of-the-art static-image-to-talking-animation generator, surpassing even paid services in both speed and quality.

🔗 Cloud (no-GPU) Installations Tutorial for Massed Compute, RunPod and free Kaggle Account ️⤵️
▶️ https://youtu.be/wG7oPp01COg

🔗 LivePortrait Installers Scripts ⤵️
▶️ https://www.patreon.com/posts/107609670

🔗 Windows Tutorial - Watch To Learn How To Use ⤵️
▶️ https://youtu.be/FPtpNrmuwXk

🔗 Official LivePortrait GitHub Repository ⤵️
▶️ https://github.com/KwaiVGI/LivePortrait

🔗 SECourses Discord Channel to Get Full Support ⤵️
▶️ https://discord.com/servers/software-engineering-courses-secourses-772774097734074388

🔗 Paper of LivePortrait: Efficient Portrait Animation with Stitching and Retargeting Control ⤵️
▶️ https://arxiv.org/pdf/2407.03168

🔗 Upload / download big files / models on cloud via Hugging Face tutorial ⤵️
▶️ https://youtu.be/X5WVZ0NMaTg

🔗 How to use permanent storage system of RunPod (storage network volume) ⤵️
▶️ https://youtu.be/8Qf4x3-DFf4

🔗 Massive RunPod tutorial (shows runpodctl) ⤵️
▶️ https://youtu.be/QN1vdGhjcRc
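
When working on a cloud machine, the Hugging Face tutorial linked above is the easiest way to move large files (models, rendered animations) in and out. As a rough illustration, the sketch below uses the huggingface_hub library; the repo ID and file names are placeholders, and uploads require a logged-in access token (huggingface-cli login).

```python
# Sketch: move a large file between a cloud machine and the Hugging Face Hub.
# Requires `pip install huggingface_hub` and an access token for uploads.
from huggingface_hub import HfApi, hf_hub_download

REPO_ID = "your-username/liveportrait-outputs"  # hypothetical repo name

api = HfApi()
api.create_repo(repo_id=REPO_ID, exist_ok=True)  # create the repo if it doesn't exist yet

# Upload a rendered animation (or any large file) from the cloud machine.
api.upload_file(
    path_or_fileobj="animations/result.mp4",  # placeholder local path
    path_in_repo="result.mp4",
    repo_id=REPO_ID,
    repo_type="model",
)

# Later, download it again on another machine (or back to your own PC).
local_path = hf_hub_download(repo_id=REPO_ID, filename="result.mp4")
print("Downloaded to:", local_path)
```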

The cloud tutorial video covers:
0:00 Introduction to the state-of-the-art image to animation open-source application LivePortrait cloud tutorial
2:26 Installing and using LivePortrait on MassedCompute with a special discount coupon code
4:28 Entering the special Massed Compute coupon for a 50% discount
4:50 Setting up ThinLinc client to connect and use Massed Compute virtual machine
5:33 Setting up ThinLinc client's synchronization folder for file transfer
6:20 Transferring installer files to Massed Compute sync folder
6:39 Connecting to initialized Massed Compute virtual machine and installing LivePortrait app
9:22 Starting and using LivePortrait application on MassedCompute post-installation
10:20 Starting a second instance of LivePortrait on the second GPU on Massed Compute
12:20 Locating generated animation videos and downloading them to your computer
13:23 Installing LivePortrait on RunPod cloud service
14:54 Selecting the appropriate RunPod template
15:20 Setting up RunPod proxy access ports
16:21 Uploading installer files to RunPod's JupyterLab interface and starting installation
17:07 Starting LivePortrait app on RunPod post-installation
17:17 Starting LivePortrait on the second GPU as a second instance
17:31 Connecting to LivePortrait from RunPod's proxy connection
17:55 Animating the first image on RunPod with a 73-second driving video
18:27 Animation time for a 73-second video (highlighting the app's impressive speed)
18:41 Understanding input upload errors with an example case
19:17 One-click download of all generated animations on RunPod
20:28 Monitoring the progress of generating animations
21:07 Installing and using LivePortrait for free on a Kaggle account with impressive speed
24:10 Generating the first animation on Kaggle after installation and startup
24:22 Importance of waiting for input images and videos to fully upload to avoid errors
24:35 Monitoring the animation status and progress on Kaggle
24:45 GPU, CPU, RAM, and VRAM usage during the LivePortrait animation process on Kaggle
25:05 One-click download of all generated animations on Kaggle
26:12 Restarting LivePortrait app on Kaggle without reinstalling
26:36 Joining the SECourses Discord channel for chat and support
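
The chapters at 10:20 and 17:17 above show starting a second LivePortrait instance on the second GPU. Conceptually this just means launching the app twice, pinning each process to one GPU and a separate port, as in the rough sketch below. The app script name (app.py) and the --server_port flag are assumptions about the repo's Gradio launcher; the one-click installers linked above handle this for you.

```python
# Sketch: run two LivePortrait Gradio instances, one per GPU.
# CUDA_VISIBLE_DEVICES restricts each process to a single GPU;
# app.py and --server_port are assumptions about the repo's launcher.
import os
import subprocess

def launch_instance(gpu_id: int, port: int) -> subprocess.Popen:
    env = os.environ.copy()
    env["CUDA_VISIBLE_DEVICES"] = str(gpu_id)  # this process only sees one GPU
    return subprocess.Popen(
        ["python", "app.py", "--server_port", str(port)],
        cwd="LivePortrait",  # placeholder path to the cloned repo
        env=env,
    )

# First instance on GPU 0 (port 7860), second instance on GPU 1 (port 7861).
instances = [launch_instance(0, 7860), launch_instance(1, 7861)]
for proc in instances:
    proc.wait()
```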