Using VEAI 2.4.0 / Artemis HQ v11 / Chronos v2, all via ProRes 422 HQ, I am working on an animation (Appleseed, 2004) and want to know if there is any guidance on workflow. Through trial and error I know I need to color grade before upscaling, but should frame rate conversion come before or after upscaling? The developers will know which assumptions make one order preferable. The current frame rate is 23.98fps and the closest doubling is 50fps; I have stripped the audio out before all processing and will add it back in at the end. The upscale is 1080p to 4K.
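For reference, the frame-rate arithmetic (a minimal Python sketch; the fractions are the standard NTSC rates, and the point about exact multiples is general interpolation reasoning, not guidance from the developers):

```python
from fractions import Fraction

# "23.98" NTSC film rate is really 24000/1001 fps
source = Fraction(24000, 1001)       # ~23.976 fps

exact_double = source * 2            # 48000/1001 ~= 47.952 fps
ratio_to_50 = Fraction(50) / source  # ~2.085x, not an integer multiple

print(f"source      : {float(source):.3f} fps")
print(f"exact 2x    : {float(exact_double):.3f} fps")
print(f"ratio to 50 : {float(ratio_to_50):.3f}x")

# An exact 2x target lets an interpolator keep every original frame and only
# synthesise the in-between ones; a 50fps target forces it to retime (and
# therefore resynthesise) nearly every output frame.
```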
Upscaling with Artemis first and then interpolating with Chronos might yield better results; perhaps having more information (higher resolution) for the AI algorithm to read from would eliminate a lot of the erroneous interpolations.
Running some tests, I have noticed that if you run Chronos after upscaling the processing time becomes prohibitive, so it was better to do it before upscaling. I could not confirm a quality difference between doing it before or after, but I did notice some artefacts with Chronos v2 processing.
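A rough back-of-envelope on that processing-time difference (pure pixel-count arithmetic; how Chronos actually scales with resolution depends on the model and GPU, so treat the 4x as an estimate, not a measurement):

```python
# Work per interpolated frame scales very roughly with pixel count.
hd = 1920 * 1080
uhd = 3840 * 2160

print(f"pixels at 1080p: {hd:,}")
print(f"pixels at 2160p: {uhd:,}")
print(f"ratio          : {uhd / hd:.0f}x more data per frame if you interpolate after the upscale")
```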
I have VOB files that are 480i - 29.97 frame rate.
The settings I use are upscale to HD and 2x deinterlace (59.94).
I have also done the upscale to HD keeping the original frame rate at 29.97.
I have used Chronos Fast and Chronos with Dione TV, along with Dione TV halo.
I am not seeing much difference between these combinations of settings.
Am I gaining anything by the 2x deinterlace (59.94), or should I keep the original frame rate?
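For what it's worth, the field-rate arithmetic is the argument for the 2x output (a minimal Python sketch; the point about double-rate deinterlacing preserving motion is standard practice, not anything specific to Dione):

```python
from fractions import Fraction

# 480i "29.97" NTSC video is really 30000/1001 frames/s, and each interlaced
# frame carries two fields captured at different moments in time.
frame_rate = Fraction(30000, 1001)   # ~29.970 interlaced frames/s
field_rate = frame_rate * 2          # ~59.940 fields (motion samples) per second

print(f"interlaced frame rate : {float(frame_rate):.3f} fps")
print(f"field / motion rate   : {float(field_rate):.3f} per second")

# Double-rate ("2x") deinterlacing turns each field into its own progressive
# frame, so 59.94p keeps all of the original motion samples; single-rate
# 29.97p output has to merge or drop half of them.
```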
I was working on a video for a client who, after I delivered the video in 1080p, asked for a 4K version. 4K was never stated as a requirement, and a good part of the footage was shot at 1080p for potential slowing down. It was edited on a 1080p timeline.
After trying to explain to my client why upscaling was a bad idea, I decided to export a version in 4K (using the YouTube 2160p 4K Ultra HD preset in Media Encoder) out of curiosity. I was very pleasantly surprised at the result! It is noticeably sharper than the same version at 1080p (YouTube 1080p Full HD preset). Also sharper than the version exported in ProRes at 1080p.
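If anyone wants to see how much the scaler alone changes things, here is a minimal sketch (assuming Pillow is installed; the filenames are hypothetical) that upscales a single exported frame with two different resamplers for a side-by-side comparison. It is only an illustration of resampling choices, not what the Media Encoder preset actually does internally:

```python
# Upscale one exported 1080p frame to 2160p with two resamplers and compare.
from PIL import Image

frame = Image.open("frame_1080.png")             # hypothetical 1920x1080 export
target = (3840, 2160)

bilinear = frame.resize(target, Image.BILINEAR)  # softer result
lanczos = frame.resize(target, Image.LANCZOS)    # crisper edges

bilinear.save("frame_2160_bilinear.png")
lanczos.save("frame_2160_lanczos.png")
```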
I don't think this GPU supports DLSS; that would only be for RTX 20xx series cards and up, I believe. It should support FSR, but it's a card from 2015, so it may be deliberately excluded by Warframe's devs due to the overhead that upscaling can cause for certain hardware (as in, workloads that would be more efficient to just render at native res on slower hardware).
I just bought and installed Vegas Pro 20, and I also installed the Deep Learning Models (version 20.2.0.0), but if I try to run 'upscaling' on my old DV material I get the message that the media cannot be upscaled because the upscaling model is not present on my system and that I have to download it.
If your computer does not meet these specifications - especially if it is off by a generation or more - this could very possibly be the reason that Upscale is not working. When the original DLMs were released just over two years ago with Vegas Pro 18, none of the DLM AI FX worked for me at the time because my then editing computer did not meet the minimum recommended specs.
The material I use is old SD in .avi format, 720x576.
I tried two project settings: first, when importing the material into Vegas, I let Vegas use the same settings as the source material; I also tried with the project settings set to 1280x720.
In both cases I get the warning: 'the media cannot be upscaled because the upscaling model is not present on your system. Please install the Vegas AI bundle, which you can download at www.vegascreativesoftware.com'
Also make sure your Intel graphics drivers are updated using Intel's driver update page ... the DLM uses an Intel iGPU if one is detected. If you're not sure what version you have, Vegas's Help / Check for Driver Updates will tell you, but you need to go to Intel to check for updates for the 9700K's iGPU.
Musicvid, thanks for the tips. After checking the Intel drivers I will do some testing based on your suggestions.
I de-interlaced my .avi files, though not with Smart Deinterlacing, just via the project settings.
I can get that error when the project resolution is the same as the media resolution. That results in the scale being set to 1 and the error about the models not being loaded. Try increasing your project resolution first, then add the upscale; the upscale figure should automatically populate to the correct value, the difference between the media resolution and the project resolution.
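As an illustration (a small Python sketch; how Vegas derives the factor internally is my assumption, and the values are just examples):

```python
# Scale factor implied by media size vs project size. Identical resolutions
# give 1.0, which leaves the upscale model nothing to do - and that is when
# the misleading "model is not present" message appears.
def upscale_factor(media, project):
    mw, mh = media
    pw, ph = project
    return max(pw / mw, ph / mh)

print(upscale_factor((720, 576), (720, 576)))    # 1.0   -> triggers the error
print(upscale_factor((720, 576), (1280, 720)))   # ~1.78 -> upscale has work to do
print(upscale_factor((720, 576), (1920, 1080)))  # ~2.67
```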
Btw, I tried an old free upscaler and compared it to Vegas; it's approximately a 2.3x upscale. Vegas is on the right, Video2x is on the left. You can see Vegas has a huge problem with noisy footage, while Video2x is able to moderate the noise somewhat. Video2x runs at 0.7fps in this example, compared to about 6fps with Vegas, even though Video2x uses 100% GPU while Vegas uses only the CPU.
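To put those throughput figures into wall-clock terms (the clip length and source frame rate below are made-up examples; the 0.7fps and 6fps are the figures measured above):

```python
# Turn frames-per-second throughput into hours for a given clip.
clip_minutes = 10
frames = clip_minutes * 60 * 25      # assuming a 25fps source clip

for name, fps in [("Video2x", 0.7), ("Vegas", 6.0)]:
    hours = frames / fps / 3600
    print(f"{name:8s}: {hours:.1f} hours for a {clip_minutes}-minute clip")
```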
@Frank-Bee
On this screen I see that your Intel UHD 630 GPU, the one built into the CPU, is not visible.
It probably cannot be used because it is not enabled in the BIOS of your computer.
Try to enable it yourself: the BIOS has to be opened and set to allow the Intel graphics to be shown and used.
After that you have to install the latest drivers for that GPU from the Intel website, driver number gfx win 101.3729.101.2114.
After that your Help / Check for Driver Updates should look like this.
The last time I did any serious overclocking was around 2007, using a freshly purchased Intel Core 2 Quad Q6600 and a six-month-old Nvidia GeForce 8800 GTX. I managed to boost the former's stock 2.4GHz clock speed all the way up to a comfortable 3.4GHz (an increase of over 40%) and the latter's core up by something like 20% (I can't recall the specifics), but this all took many hours of messing about and dealing with constant crashes.
Although I can't quite remember how much better all my games ran, I do know that it was quite a decent increase. Probably in the order of 20 to 25%, if I had to pin it down. But sheesh, what a mess it all was. That poor graphics card, which had cost me something like 650, looked utterly horrific, with shunts soldered everywhere and crudely constructed heatsinks and fans strapped over it all.
Before that setup, the most overclocking fun I had was doing the 'pencil trick' on early AMD Athlons. Just a few layers of graphite to close an electrical connection on the CPU package was all that was required to set the clocks to anything you like.
And probably the last overclocking I did with any serious intention was around six or seven years ago, seeing if I could get a Titan X Pascal graphics card to boost more than 500MHz extra on top of its stock 1,530MHz value. I don't recall if I was especially successful but I do remember the sheer racket of running the fans at 100% to stop the darn thing from boiling itself to oblivion.
I don't bother with overclocking anymore. My CPU, a Core i7 9700K, runs at stock clocks; heck, I even have all the power management options enabled in the UEFI. It's more than good enough as it is, though I do yearn for something better for rendering and compiling. But I'd never be able to achieve the level of improvement I want just by raising the clock speed a few percent.
My graphics card overclocks itself, as do all of them these days. If the GPU isn't constrained by its power and thermal limits in a game, the chip will run at a clock speed higher than the maximum claimed by its manufacturer. It's supposed to boost up to 2,625MHz but in many games, it trundles away quite happily at 2,850MHz.
I have tried to manually overclock it, of course, but with the best, always-stable result being no more than a 5% improvement in any game's performance, it's just not worth doing it. I think I've done it no more than five times since I've had it and it was only to provide some clock scaling data for analysis.
Nowadays, if I want a lot more performance in games, I simply enable upscaling. A single click in the graphics options gives me an instant boost, with no need for me to do anything else. Sure, I lose a bit of visual quality in some cases, but that's it. I don't need to adjust anything on my hardware to cope with it; everything just works. And while it's not as visually solid as upscaling, frame generation lifts things even further.
At this point, I suspect somebody on the interwebs is prepping a response to the above containing the phrase 'fake frames', but I literally couldn't care less about how 'real' any frame displayed on my monitor is. It could be put there by the power of magic pixies, for all I care, as long as it doesn't mess up the enjoyment of my games.
The death of consumer-grade overclocking was inevitable, of course, because despite the considerable advances in semiconductor technology and chip design, clock speeds just aren't the be-all and end-all on CPUs, like they used to be at the start of this century.
I'm sure many of you out there will remember the race between AMD and Intel to be the first to release a 1GHz desktop CPU, and yes, we're bouncing around the 6GHz mark now, but there's little difference in games between a 5.5GHz and 6GHz chip (as long as everything else is the same). Only in the world of competitive FPS battles would it actually matter, and most players of that ilk will just run at low graphics settings and resolution to get the best performance.