Download Project Igi Highly Compressed For Pc


Juliano Nichols

Aug 5, 2024, 2:21:49 PM
to saciricic
Which situation is the best (occupies the least space)? Why? Does it depend on the compression algorithm? I know that compressing one compressed file cannot help much, but what about 20 of them? To me, Situation 1 doesn't look like a good idea.

You might, however, look at something like RAR, which supports redundancy and split archives. This is a bit like RAID 5: you create multiple archive files, each with built-in redundancy, so that you can lose a file and still recreate the original data.


Also, your 3rd approach can indeed lead to a further reduction in size. I remember a discussion (see here) about compressing files multiple times using different algorithms. The author was compressing highly redundant text files and, after enough experimentation, could go from 100 GB to a few MB. His case was a bit special, but the general idea is that iterated compression can actually be worthwhile in some cases.
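To make the "iterated compression" point concrete, here is a minimal Python sketch with stdlib zlib. The input is pathologically redundant (a single repeated byte), which is an assumption chosen to make the effect visible: the first DEFLATE pass produces a stream that is itself highly repetitive, so a second pass shrinks it further. On typical already-compressed data a second pass will usually grow the file instead.

```python
import zlib

# Pathologically redundant input: one megabyte of the same byte.
data = b"A" * 1_000_000

# First pass: DEFLATE turns the run into a stream of repeated
# length/distance codes, which is itself a repetitive byte pattern.
once = zlib.compress(data, 9)

# Second pass: compressing that repetitive pattern helps again.
twice = zlib.compress(once, 9)

print(len(data), len(once), len(twice))
```

This only works because the input (and therefore the first pass's output) is extremely redundant; it is a demonstration of the special case described above, not a general technique.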


Between Situations 1 and 2, the latter definitely has a better chance of producing a smaller archive, especially when you use larger dictionary sizes (the dictionary, in simple terms, is the memory area used to find and compress repeated patterns in the data). Plain old ZIP can only use a tiny 32 KB dictionary, which is far too small given the hardware these days.
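The dictionary-size point can be demonstrated with stdlib Python. The repeated block below sits 100 KB apart in the data, which is outside DEFLATE's 32 KB window but well inside LZMA's default dictionary; the file sizes and block layout are illustrative assumptions, not anyone's real data.

```python
import lzma
import os
import zlib

# An incompressible 100 KB block, repeated five times. The copies are
# 100 KB apart, i.e. beyond DEFLATE's 32 KB back-reference window.
block = os.urandom(100_000)
data = block * 5

deflated = zlib.compress(data, 9)  # window too small to see the repeats
xz = lzma.compress(data)           # default dictionary easily spans them

print(len(data), len(deflated), len(xz))
```

DEFLATE's output stays close to the full 500 KB because each repeat is out of reach of its window, while LZMA collapses the data to roughly one block's worth.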


If this option is enabled, WinRAR analyzes the file contents before archiving. If several identical files larger than 64 KB are found, the first file in the set is stored normally and all subsequent files are stored as references to that first file. This reduces the archive size but imposes some restrictions on the resulting archive: you must not delete or rename the first identical file after the archive is created, because that would make it impossible to extract the files that reference it. If you modify the first file, the following files will also have the modified contents after extraction. The extraction command must include the first file for the following files to be created successfully.


Thus if you have many duplicate files among your projects, a large dictionary size combined with solid archiving and the feature above is very likely to lead to significant size reduction with Situation 2. Of course all the general caveats about large archives apply, so including a recovery record would also be recommended.


For archiving purposes, however, all of the above fall short because the ZIP format doesn't support solid archives. 7z and RAR use solid archives by default (see "Do 7z archives compress each file individually or compress everything together as one?"), so the compression ratio is much better (you almost certainly have the same byte patterns repeated across many files). It's the same reason you tar first (making a solid, uncompressed archive) and then pass the result to gz or bz2. ZIP, on the other hand, compresses each file separately, so it's easier to extract individual files, but the compressed output is much larger.
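The solid-versus-per-file difference can be sketched in stdlib Python. Twenty files share the same 50 KB payload (an assumption chosen to mimic duplicate files across projects): ZIP compresses each entry separately, so the duplication across entries is invisible to it, while a solid tar.xz stream sees every copy (xz rather than gzip, since gzip's 32 KB window could not span the tar entries).

```python
import io
import os
import tarfile
import zipfile

# 20 files, all containing the same incompressible 50 KB payload.
payload = os.urandom(50_000)

# ZIP: each entry is deflated independently, so no cross-file savings.
zbuf = io.BytesIO()
with zipfile.ZipFile(zbuf, "w", zipfile.ZIP_DEFLATED) as zf:
    for i in range(20):
        zf.writestr(f"file{i:02d}.bin", payload)

# Solid archive: one continuous xz stream over the whole tar, so
# copies 2..20 compress down to back-references.
tbuf = io.BytesIO()
with tarfile.open(fileobj=tbuf, mode="w:xz") as tf:
    for i in range(20):
        info = tarfile.TarInfo(f"file{i:02d}.bin")
        info.size = len(payload)
        tf.addfile(info, io.BytesIO(payload))

print(len(zbuf.getvalue()), len(tbuf.getvalue()))
```

The ZIP stays near 20 × 50 KB, while the solid archive lands close to a single copy of the payload.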


I am no expert but do understand codecs, bitrates, formats, etc., and usually get pretty good results when exporting footage. However, today I was editing a webinar that is basically screenshots with a voice-over. The source file (.mp4) is about 40 MB, but on export the file size jumps to 3+ GB when using the H.264 codec and matching the source file.


Thanks, guys. Here are the screenshots. As you can see, the original file (sequence) is a screen grab, which is not a typical video file, and I couldn't match its format when exporting. The best I could do was reduce its bitrate, but even that took almost 30 minutes to render, even though most of the screens are static images.


I know this is an old thread, but I am seeing the same results: a 500 MB webinar video turns into a 2 GB file no matter what I set in the export panel. The only workaround for me is to use another program like Shotcut to import the Premiere Pro export and then reduce its size in that app. It's really unfortunate that a program as expensive as Premiere Pro can't export to a file size comparable to the original.


If you included the information on the original file from, say, the Tree view of MediaInfo, plus your export settings in Premiere and the MediaInfo Tree view of that export, we could make actually useful comments. Also, Premiere can do all sorts of things if you know to go in and actually set the many options so it will do what you want.


With a caveat for this one. Starting with, say, screen-capture media, which is normally extremely compressed, editing it, and wanting to get back to that extremely condensed form ... that's possible. But you of course have to accept potential issues.


Accepting the above, we all need to learn the format/codec settings for various needs; that's a normal part of editing. That includes extreme compression, where H.264/H.265 are probably what you will need.


And the presets offered try to maintain quality. Forget that; you're demanding it compress the heck out of the footage. So you only start with a preset, then you have to go into the settings and take control yourself.


Then go down into the Video tab. For the Level setting, rather than, say, 5.1, drop it to about 4. Now go down to the CBR/VBR settings and select VBR (variable bitrate, which only 'keeps' the bits per frame actually needed), then drop the maximum to around 12 Mbps for 1920x1080, maybe 20 Mbps for UHD.
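A quick back-of-envelope calculation shows why the bitrate setting dominates the export size. Output size is roughly average bitrate times duration, whatever the picture content; the 30-minute duration and the 12 Mbps / 0.2 Mbps figures below are illustrative assumptions.

```python
def estimated_size_mb(bitrate_mbps: float, minutes: float) -> float:
    """Rough export size: megabits per second * seconds / 8 bits per byte."""
    return bitrate_mbps * minutes * 60 / 8

# A 30-minute webinar exported at a quality-oriented ~12 Mbps:
print(estimated_size_mb(12, 30))   # ~2700 MB, i.e. the multi-GB exports above

# The same 30 minutes at a screen-capture-like ~0.2 Mbps average:
print(estimated_size_mb(0.2, 30))  # ~45 MB, close to the small source files
```

This is why matching the source's *resolution* isn't enough: until the bitrate ceiling comes down, the export stays in the gigabytes.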


So you're saying that GoToWebinar has a more advanced video compression method than the world's leading video editing software? If Premiere cannot, at the very least, match what GTW uses for compression, then there is no point in using it.


I followed several different YouTube/Adobe videos on how to compress the crap out of videos for file size, and not one single method got even close to what GTW can do. Like I said, I couldn't get my 500 MB webinar any smaller than 2 GB; I then import it into Shotcut, which gets it down to 900 MB.


My original file was 116.5 MB. The smallest I was able to get it using H.264 and reducing the frame size was 1.1 GB. I used your tips about switching to HEVC/H.265 and lowering the level, using VBR, etc., and got it down to 209.5 MB at full size (1920x1080), which is a whole lot better than over a gig with a smaller viewing area.


I'm having exactly the same problem. I recorded a webinar using Zoom, which created an 85 MB MP4 file. It's basically a PowerPoint presentation of about 70 minutes, with a voice-over. When I chose H.264 and a high bitrate, the exported file was over 5 GB. I then tried again, this time selecting MPEG-4 and a medium bitrate; this time the file was smaller than the original, but the video was so blurry it was unusable. I notice that the Export Settings window actually gives a preview of what the output will look like, and even the higher 3GPP 352x288 H.263 setting gives results that are too blurry, even though the file size is larger than the original.


The Zoom MP4 is pretty good, not at all blurry, and the voice quality is fine. All I'm trying to do is trim the beginning and end off the Zoom MP4, but Adobe Premiere doesn't seem able to do that and yield a file of a comparable size. This is very disappointing; surely if Zoom can produce MP4s of decent quality and sensible size, Adobe is capable of the same?


I had some luck with my own webinar editing project today. I can't guarantee this will work for you, but maybe it is worth a try. Even though Premiere estimated that the size would be huge, the export came out only a little bigger than the original video: the original was 32 MB and the new export was 36 MB.


That makes sense, and explains it! However, nothing I did playing with other settings got it below 2 GB. I tried a utility from BeeCut (which got it down to about 130 MB on my trial version), but it isn't cheap (and having shelled out for Adobe Creative Cloud, I didn't fancy having to buy yet more software). In the end, I used Prism from NCH Software, which got it down to 182 MB: bigger than BeeCut's result, but free.


In future, I think I'll just have to remember to press the "record" button at exactly the right time in Zoom so I don't need to trim the video. I am surprised that Zoom seems able to produce good-quality MP4s that are small in size, and yet Adobe (a very well-established company in this field) can't come close.


That certainly helped! I wasn't sure what you meant by a low adaptive bitrate at first, but then realised that option appears if I select H.264 rather than MPEG-4. Doing what you suggested got it down to 500 MB, which is an improvement (but still six times bigger than the original before I trimmed it!), but I'll keep playing around and see if I can make it more manageable.


You also need to understand that the specialized camera chip that mangles the camera data to compress it that heavily is a very specialized piece of hardware. It is built to do one very specific kind of compression, knowing exactly what the incoming data will be.


I take the point, but BeeCut and Prism did a reasonable job and produced smaller files than Adobe. I suppose what I don't understand is this: all I want to do is trim an MP4 down to a subset of the current file, and it seems counterintuitive that a file that just has bits snipped out should be so much bigger than the original.


Some programs can just 'clip' segments from a long-GOP file, so there's no re-encoding. Of course, you can mostly only do that at the I-frame points, which can be 30 or more frames away from any point you might set as your in/out points.


Can you name a free or affordable program that simply cuts the original video, so we don't get that recompression from Adobe Premiere? It's a little irritating to get such huge exports from simple cut edits for webinars, haha.
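One commonly cited free option for exactly this kind of re-encode-free trim is ffmpeg's stream copy mode. This is a sketch, not an endorsement of specific timestamps: the filenames and times below are placeholders, and as noted above, cut points snap to the nearest keyframe, so they may land up to a GOP away from where you asked.

```shell
# Trim without re-encoding: -c copy copies the compressed audio/video
# streams as-is, so the output keeps the source's bitrate and quality
# and the operation is nearly instant.
ffmpeg -ss 00:00:30 -to 00:42:00 -i webinar.mp4 -c copy trimmed.mp4
```

Because nothing is re-encoded, a trimmed webinar comes out proportionally smaller than the source rather than ballooning to gigabytes.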


