Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.
Thread Tools | Search this Thread | Display Modes |
3rd February 2023, 07:55 | #181 | Link | ||
Registered User
Join Date: Aug 2002
Location: Italy
Posts: 312
|
__________________
Hybrid Multimedia Production Suite will be a platform-independent open source suite for advanced audio/video content production. Official git: https://www.forart.it/HyMPS/ |
||
14th February 2023, 08:43 | #182 | Link |
Registered User
Join Date: Aug 2002
Location: Italy
Posts: 312
|
@Selur and anyone else interested in participating: according to aydin's author, the way to collaborate is to "make bug reports/feature requests here on GitHub":
https://github.com/royerlab/aydin/is...ent-1428538108
__________________
Hybrid Multimedia Production Suite will be a platform-independent open source suite for advanced audio/video content production. Official git: https://www.forart.it/HyMPS/ |
11th March 2023, 11:21 | #183 | Link |
Registered User
Join Date: Oct 2001
Location: Germany
Posts: 7,462
|
btw. has anyone tested whether there are speed differences depending on which RGB color space VSGAN is fed?
I just noticed that for me (I only tested a few models) RGBS seems to be ~10% faster than any other RGB color space I feed it. -> Is it just me, or can others confirm this? Cu Selur |
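Raw frame size alone would predict the opposite (RGBS is the largest format), so a hedged guess is that RGBS is faster because most models run in 32-bit float internally and RGBS frames need no per-frame dtype conversion. A minimal stdlib sketch of the payload sizes involved (`frame_bytes` is a made-up helper, not a VapourSynth API; RGBS is 4 bytes/sample, RGBH and RGB48 are 2 bytes/sample, 3 planes each):

```python
# Hypothetical helper, not a VapourSynth API: raw planar frame size,
# ignoring stride/padding.
def frame_bytes(width, height, bytes_per_sample, planes=3):
    return width * height * bytes_per_sample * planes

w, h = 640, 352  # the test clip size mentioned later in the thread
for name, bps in [("RGBS", 4), ("RGBH", 2), ("RGB48", 2)]:
    print(f"{name}: {frame_bytes(w, h, bps) / 1024:.0f} KiB/frame")
```

So RGBS moves twice the data per frame yet still comes out ahead in the reported timings, which points at conversion cost rather than transfer cost.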
11th March 2023, 16:11 | #184 | Link | |
Registered User
Join Date: Sep 2007
Posts: 5,469
|
Quote:
|
|
11th March 2023, 22:30 | #186 | Link |
Registered User
Join Date: Oct 2001
Location: Germany
Posts: 7,462
|
Argh, I ran into a strange problem when using vsdpir and vsgan inside the same script:
using both with RGBH: Code:
from vsdpir import dpir as DPIR
# adjusting color space from YUV420P8 to RGBH for vsDPIRDenoise
clip = core.resize.Bicubic(clip=clip, format=vs.RGBH, matrix_in_s="470bg", range_s="limited")
# denoising using DPIRDenoise
clip = DPIR(clip=clip, strength=5.000, task="denoise", device_index=0, trt=True, trt_cache_path="J:/tmp")
# changing range from limited to full range
clip = core.resize.Bicubic(clip, range_in_s="limited", range_s="full")
# Setting color range to PC (full) range.
clip = core.std.SetFrameProp(clip=clip, prop="_ColorRange", intval=0)
# resizing using VSGAN
from vsgan import ESRGAN
vsgan = ESRGAN(clip=clip, device="cuda")
model = "C:/Users/Selur/Desktop/testing/1x_Dotzilla_Compact_80k_net_g.pth"
vsgan.load(model)
vsgan.apply()  # 640x352
clip = vsgan.clip

using both with RGBS: Code:
from vsdpir import dpir as DPIR
# adjusting color space from YUV420P8 to RGBS for vsDPIRDenoise
clip = core.resize.Bicubic(clip=clip, format=vs.RGBS, matrix_in_s="470bg", range_s="limited")
# denoising using DPIRDenoise
clip = DPIR(clip=clip, strength=5.000, task="denoise", device_index=0, trt=True, trt_cache_path="J:/tmp")
# changing range from limited to full range
clip = core.resize.Bicubic(clip, range_in_s="limited", range_s="full")
# Setting color range to PC (full) range.
clip = core.std.SetFrameProp(clip=clip, prop="_ColorRange", intval=0)
# resizing using VSGAN
from vsgan import ESRGAN
vsgan = ESRGAN(clip=clip, device="cuda")
model = "C:/Users/Selur/Desktop/testing/1x_Dotzilla_Compact_80k_net_g.pth"
vsgan.load(model)
vsgan.apply()  # 640x352
clip = vsgan.clip

Using VSDPIR with RGBS and VSGAN with RGBH: Code:
from vsdpir import dpir as DPIR
# adjusting color space from YUV420P8 to RGBS for vsDPIRDenoise
clip = core.resize.Bicubic(clip=clip, format=vs.RGBS, matrix_in_s="470bg", range_s="limited")
# denoising using DPIRDenoise
clip = DPIR(clip=clip, strength=5.000, task="denoise", device_index=0, trt=True, trt_cache_path="J:/tmp")
# changing range from limited to full range
clip = core.resize.Bicubic(clip, range_in_s="limited", range_s="full")
# Setting color range to PC (full) range.
clip = core.std.SetFrameProp(clip=clip, prop="_ColorRange", intval=0)
# adjusting color space from RGBS to RGBH for vsVSGAN
clip = core.resize.Bicubic(clip=clip, format=vs.RGBH, range_s="full", dither_type="error_diffusion")
# resizing using VSGAN
from vsgan import ESRGAN
vsgan = ESRGAN(clip=clip, device="cuda")
model = "C:/Users/Selur/Desktop/testing/1x_Dotzilla_Compact_80k_net_g.pth"
vsgan.load(model)
vsgan.apply()  # 640x352
clip = vsgan.clip

Using VSDPIR with RGBH and VSGAN with RGBS: Code:
from vsdpir import dpir as DPIR
# adjusting color space from YUV420P8 to RGBH for vsDPIRDenoise
clip = core.resize.Bicubic(clip=clip, format=vs.RGBH, matrix_in_s="470bg", range_s="limited")
# denoising using DPIRDenoise
clip = DPIR(clip=clip, strength=5.000, task="denoise", device_index=0, trt=True, trt_cache_path="J:/tmp")
# changing range from limited to full range
clip = core.resize.Bicubic(clip, range_in_s="limited", range_s="full")
# Setting color range to PC (full) range.
clip = core.std.SetFrameProp(clip=clip, prop="_ColorRange", intval=0)
# adjusting color space from RGBH to RGBS for vsVSGAN
clip = core.resize.Bicubic(clip=clip, format=vs.RGBS, range_s="full")
# resizing using VSGAN
from vsgan import ESRGAN
vsgan = ESRGAN(clip=clip, device="cuda")
model = "C:/Users/Selur/Desktop/testing/1x_Dotzilla_Compact_80k_net_g.pth"
vsgan.load(model)
vsgan.apply()  # 640x352
clip = vsgan.clip

This fails with: Code:
Input type (float) and bias type (struct c10::Half) should be the same

Using VSDPIR with RGBS and VSGAN with RGB48: Code:
from vsdpir import dpir as DPIR
# adjusting color space from YUV420P8 to RGBS for vsDPIRDenoise
clip = core.resize.Bicubic(clip=clip, format=vs.RGBS, matrix_in_s="470bg", range_s="limited")
# denoising using DPIRDenoise
clip = DPIR(clip=clip, strength=5.000, task="denoise", device_index=0, trt=True, trt_cache_path="J:/tmp")
# changing range from limited to full range
clip = core.resize.Bicubic(clip, range_in_s="limited", range_s="full")
# Setting color range to PC (full) range.
clip = core.std.SetFrameProp(clip=clip, prop="_ColorRange", intval=0)
# adjusting color space from RGBS to RGB48 for vsVSGAN
clip = core.resize.Bicubic(clip=clip, format=vs.RGB48, range_s="full", dither_type="error_diffusion")
# resizing using VSGAN
from vsgan import ESRGAN
vsgan = ESRGAN(clip=clip, device="cuda")
model = "C:/Users/Selur/Desktop/testing/1x_Dotzilla_Compact_80k_net_g.pth"
vsgan.load(model)
vsgan.apply()  # 640x352
clip = vsgan.clip

Cu Selur
PS: I also reported this at https://github.com/rlaphoenix/VSGAN/issues/31, but thought it might be interesting to others in case they run into the same issue. |
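For anyone puzzled by that error: it is the usual PyTorch-style complaint when a model's weights and bias carry one precision (here half) while the input frames arrive in another (float32 from RGBS). A toy stdlib sketch of the dtype contract behind the message; `ToyLayer` is entirely made up, not a PyTorch or VSGAN API:

```python
# Toy sketch, not PyTorch: ToyLayer stands in for a conv layer whose
# weights/bias dtype is fixed at construction time.
class ToyLayer:
    def __init__(self, weight_dtype):
        self.weight_dtype = weight_dtype  # e.g. "float16" if built under a half default

    def forward(self, input_dtype):
        # Like torch layers, the input dtype must match the weight/bias dtype.
        if input_dtype != self.weight_dtype:
            raise TypeError(f"Input type ({input_dtype}) and bias type "
                            f"({self.weight_dtype}) should be the same")
        return "ok"

layer = ToyLayer("float16")      # model built while the process default was half
print(layer.forward("float16"))  # RGBH frames (half) match the weights
try:
    layer.forward("float32")     # RGBS frames (float32) mismatch
except TypeError as exc:
    print(exc)
```

This matches the pattern in the failing combination above: the model ends up in half precision, so only half-precision (RGBH) input gets through.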
12th March 2023, 07:53 | #187 | Link |
Registered User
Join Date: Oct 2001
Location: Germany
Posts: 7,462
|
Okay, the issue seems to be that VSDPIR overwrites the default tensor type (so it is not a bug in VSGAN, but unexpected behavior of VSDPIR).
Using: Code:
import torch
torch.set_default_tensor_type(torch.FloatTensor)
works around it.

The behavior is fixed with https://github.com/HolyWu/vs-dpir/releases/tag/v3.0.1
Cu Selur
Last edited by Selur; 12th March 2023 at 08:49. |
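The failure mode generalizes: a library that mutates process-wide default state silently changes what later, unrelated code in the same script gets. A toy stdlib sketch of the pattern (the names are made up, not torch APIs; `set_default_dtype` here mimics what `torch.set_default_tensor_type` does to torch's global default):

```python
# Toy sketch of a process-wide default, as mutated by vsdpir via torch.
# These names are made up for illustration, not real torch APIs.
_DEFAULT_DTYPE = "float32"

def set_default_dtype(dtype):
    global _DEFAULT_DTYPE
    _DEFAULT_DTYPE = dtype

def new_tensor_dtype():
    # New "tensors" pick up whatever the process-wide default currently is.
    return _DEFAULT_DTYPE

set_default_dtype("float16")   # what vsdpir effectively did internally
print(new_tensor_dtype())      # later code (e.g. VSGAN's model load) now gets half

set_default_dtype("float32")   # the workaround: restore the default afterwards
print(new_tensor_dtype())
```

Restoring the default between the two filters, as in the snippet above, is exactly why the `torch.set_default_tensor_type(torch.FloatTensor)` workaround fixes the VSGAN step.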
19th October 2023, 04:14 | #188 | Link |
Registered User
Join Date: Sep 2006
Posts: 1,657
|
New 2x upscaling model for anime: AniScale2
https://github.com/Sirosky/Upscale-H.../tag/AniScale2

It only works on digitally created anime. It comes with a base upscale model and a refiner model for sharpening. The model is depth-of-field aware, so intended blur effects are kept intact in the result. I tested it a little on BLEACH and it looks pretty convincing.

Last edited by lansing; 19th October 2023 at 04:24. |
Tags |
esrgan, gan, upscale, vapoursynth |