Minimum requirements for inference #31
And I was hoping that my 4090 would manage it :) Thank you for providing the reference, but it's strange; seeing how many stars this repo has, one would think it is usable in real life...
@sczhou I would be grateful if you could advise on the minimum VRAM requirements. I've tried it on an RTX A6000 and also get an OOM error.
I've tried the following, which didn't resolve the problem.
40 GB fails too T_T
Meanwhile I'm sitting here with my RTX 3080 and 12 GB of VRAM wondering why it doesn't work. This thread explains a lot. Here's what I did so far to try to get this software running (the batch-size and precision levers are sketched below):

- Adjusting the processing batch size
- GPU configuration
- Quality vs. performance tradeoffs

The hardware requirements are described in their working paper.

Hope it helps some of you. Edit: in this thread someone apparently got it working on an H200 with 140 GB of VRAM...
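As a hedged illustration of the batch-size and precision points above (generic PyTorch, not Upscale-A-Video's actual code; `model`, `frames`, and `chunk` are stand-in names), the two cheapest levers when a model almost fits are half precision and processing fewer frames per forward pass:

```python
# Generic PyTorch sketch (not this repo's code) of the two cheapest memory
# levers: half precision and a smaller per-step batch of frames.
import torch

@torch.no_grad()  # inference only: no autograd graph, much lower memory
def upscale_frames(model, frames, chunk=4, device="cuda"):
    """frames: (N, C, H, W) float tensor; processes `chunk` frames at a time."""
    model = model.half().to(device).eval()
    outputs = []
    for start in range(0, frames.shape[0], chunk):
        batch = frames[start:start + chunk].half().to(device)
        outputs.append(model(batch).float().cpu())  # move results off the GPU promptly
    return torch.cat(outputs, dim=0)
```

Smaller chunks trade runtime for a lower peak, and fp16 roughly halves the memory needed for weights and activations.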
Using #42 I estimated it would require around 35 GB, still out of reach for home GPUs. However... it seems it can be done in around 24 GB; it needs a little tuning to stop the peaks going above 24, but I am trying the following on a 3090 on Windows (UAV):

D:\repo\Upscale-A-Video>python inference_upscale_a_video.py -i "C:\Users\new\THE REAL DEAL - CroppedFinal.mp4" --fp16 --load_8bit_llava --output_path output -n 50 -g 4 -s 10 --tile_size 96 --perform_tile --color_fix None

Upscale-A-Video
Device: cuda:0 (torch.float16)
Loading Upscale-A-Video Pipeline components...
Loading LLaVA...
Starting processing for 1 video(s)...
[1/1] Processing video: THE REAL DEAL - CroppedFinal
[1/1] Reading chunk 1...
Video resolution: 660x540, FPS: 50.00

Seems to fit so far. Obviously the quality will be terrible, but some "hey, this is a place to start" numbers might help someone.
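For anyone chasing those peaks the same way, here is a small hedged helper (`report_peak_vram` and the pipeline call below are assumed names, not part of this repo): PyTorch's built-in counters report the peak VRAM a run actually allocated, which makes it easier to check whether a given --tile_size stays under 24 GB.

```python
# Hedged sketch: measure the peak VRAM of one processing run with PyTorch's
# built-in counters. `report_peak_vram` and the lambda below are illustrative,
# not part of the Upscale-A-Video codebase.
import torch

def report_peak_vram(run_fn, device="cuda:0"):
    """Run `run_fn()` and print the peak VRAM it allocated on `device`."""
    torch.cuda.reset_peak_memory_stats(device)
    result = run_fn()
    peak_gb = torch.cuda.max_memory_allocated(device) / 1024**3
    print(f"Peak VRAM allocated: {peak_gb:.2f} GB")
    return result

# Hypothetical usage: report_peak_vram(lambda: pipeline(chunk_of_frames))
```

Note that this counts only PyTorch's own allocations; the CUDA context itself typically adds a few hundred MB on top of the reported number.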
Hi!
I'm just trying to use the inference code to upscale some examples, and I'm using an A40 48 GB GPU. However, it still gives me an OOM error.
Could you please tell me the minimum requirements for inference, or, if possible, suggest some ways to reduce GPU memory usage?
Thanks a lot!
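Not an answer on the true minimum, but one generic PyTorch aside that sometimes helps when an OOM appears only after several chunks have been processed: allocator fragmentation. The caching allocator accepts a maximum split size via an environment variable, which must be set before the first CUDA allocation; the 512 MB value below is an arbitrary example, not a recommendation from this repo.

```python
# Generic PyTorch allocator tweak (not specific to Upscale-A-Video): capping
# the caching allocator's split size can reduce fragmentation-related OOMs.
# It must be in the environment before the first CUDA allocation happens.
import os
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "max_split_size_mb:512")

import torch  # importing afterwards is fine; the setting is read on first CUDA use

assert torch.cuda.is_available()
# ... build the pipeline and run inference as usual ...
```

This will not shrink the model's real footprint, so it is at best a partial workaround alongside fp16, tiling, and smaller batches.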