RARLAB WinRAR

Dhara Lyford, Aug 3, 2024

WinRAR is a trialware file archiver utility for Windows, developed by Eugene Roshal of win.rar GmbH. It can create and view archives in RAR or ZIP file formats, and unpack numerous archive file formats. To let the user test the integrity of archives, WinRAR embeds CRC32 or BLAKE2 checksums for each file in each archive. WinRAR supports creating encrypted, multi-part and self-extracting archives. WinRAR is a Windows-only program; an Android application called "RAR for Android" is also available.

Check whether it is signed by the expected vendor. For executable files on Windows, signature validation is handled natively.
If the signature is from a major CA and the name of the signee matches expectations, then all is well, barring a major cybersecurity incident on the software vendor's side.
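As a concrete illustration of the check described above, here is a hedged sketch that automates it. `Get-AuthenticodeSignature` is a real Windows PowerShell cmdlet, but the installer path and the expected vendor string below are placeholders to adjust for your own download:

```python
# Hedged sketch: verify an installer's Authenticode signature before running it.
# Windows-only at the subprocess level; the helper functions are portable.
import subprocess
import sys

def signature_check_cmd(exe_path: str) -> list[str]:
    """Build a PowerShell command that prints the status and signer subject."""
    ps = (f"$s = Get-AuthenticodeSignature -FilePath '{exe_path}'; "
          "Write-Output $s.Status; Write-Output $s.SignerCertificate.Subject")
    return ["powershell", "-NoProfile", "-Command", ps]

def looks_trustworthy(status_line: str, subject_line: str, expected_vendor: str) -> bool:
    """Signature must be Valid and the certificate subject must name the vendor."""
    return status_line.strip() == "Valid" and expected_vendor in subject_line

if __name__ == "__main__" and sys.platform == "win32":
    # Path below is a placeholder; "win.rar GmbH" is the vendor you'd expect for WinRAR.
    out = subprocess.run(signature_check_cmd(r"C:\Downloads\winrar-x64.exe"),
                         capture_output=True, text=True).stdout.splitlines()
    print(looks_trustworthy(out[0], out[1], "win.rar GmbH"))
```

A "Valid" status with a mismatched subject name should still be treated as a failure, which is why both checks are combined.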

Personally, I've seen absolutely no reason to buy WinRAR or WinZip this decade or the last; you can use 7-Zip instead. WinRAR offers nothing that OSS does not, and it probably survives on brand recognition alone.

Copyright 2002-2023 Alexander Roshal. All rights reserved.
win.rar GmbH - the official publisher for RARLAB products - handles all support, marketing and sales related to WinRAR and www.rarlab.com.

I've noticed a significant degradation in file-extraction performance when using Directory Opus, especially with RAR files. So I decided to run a test. With a 1.2 GB RAR archive, I first extracted it using Opus's internal extractor and then with WinRAR. It took Opus 26 seconds to extract it and WinRAR only 10 seconds. I'd love to just use WinRAR and call it a day, but I'd lose the convenience of Opus pre-populating the destination folder in the second pane. Is there any way I could get the best of both worlds? Maybe a secret way to direct Opus's internal extractor to use the WinRAR binary for RAR files instead of its own? I haven't tested zip or 7z performance yet since I mostly work with RAR files, but I wish I could configure Opus to use whichever binary gives the best performance for each type, if present, and only fall back to the "turtle"-paced internal extractor if all else fails.

Assuming WinRAR and unrar.dll use the same code, and you're extracting to a local drive which doesn't need UAC elevation to write to, speed differences are more likely to come from either caching (extracting the same archive twice will likely be faster the second time) or antivirus treating the two programs differently.
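For anyone repeating the comparison, a rough timing harness along these lines keeps the measurement honest by running each extractor the same way and cleaning up between runs. The WinRAR and 7-Zip paths and flags below are typical defaults, not verified against any particular install:

```python
# Hedged sketch: time two extractors back to back. Run the comparison in both
# orders, as suggested above, so disk caching doesn't favour whichever runs second.
import shutil
import subprocess
import sys
import time

def time_run(argv: list[str]) -> float:
    """Run a command and return the wall-clock seconds it took."""
    start = time.perf_counter()
    subprocess.run(argv, check=True)
    return time.perf_counter() - start

def compare(archive: str, dest_a: str, dest_b: str) -> dict[str, float]:
    # Paths/flags are assumptions: "x" extracts with paths, "-y" answers yes to prompts.
    winrar = [r"C:\Program Files\WinRAR\WinRAR.exe", "x", "-y", archive, dest_a]
    sevenzip = [r"C:\Program Files\7-Zip\7z.exe", "x", "-y", archive, f"-o{dest_b}"]
    times = {"winrar": time_run(winrar), "7z": time_run(sevenzip)}
    for d in (dest_a, dest_b):
        shutil.rmtree(d, ignore_errors=True)  # delete extracted files between tests
    return times
```

Averaging several runs of each, after a warm-up pass, further reduces the caching effect described above.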

OK, I ran the test in reverse and got 11 seconds with WinRAR and 25 seconds with Opus. The particular RAR archive doesn't matter; the bigger it is, the more noticeable the time difference. I'm typically extracting MSFS scenery updates, some of which contain a lot of small files in an otherwise large archive. I extracted the file to a subfolder of the folder it was in, using both extractors one after another, deleting the subfolder after each run, of course. Maybe WinRAR uses multithreading? I'm using the current v6.21 of WinRAR, registered, if that is what is making the difference. How do you get away with using their DLL? Or is it your own version? If it's your own version, that is likely why it is slower. But I don't know what the issue is.

If it's lots of small files (which definitely isn't "all archives", so it's an important detail), the difference may be in things we do per-file that WinRAR isn't doing. The options in Preferences / File Operations / Copy Attributes can turn much of that off.

Multithreading is unlikely to be a factor when extracting an archive (only when compressing one, for some newer algorithms), and is something Unrar.dll would typically handle by itself if it needed to.

I only have 'Clear read-only flag when copying from CDs', 'Preserve the attributes of copied files', and 'Preserve the timestamps of copied files' checked. The first option is not applicable because I'm not copying from a CD. I cleared the last two, even though WinRAR does preserve the timestamps of copied files. But even with those changed and after restarting Opus, it only shaved 4 seconds off the time it takes Opus to extract the file, still much slower than using WinRAR directly. I'm not sure what file WinRAR is using when I initiate the extract from its right-click context menu... but for some reason I doubt it's the same unrar.dll you are using. I could be wrong, but if it is different, I wish I could point Opus to it somehow.
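For what it's worth, "preserve the timestamps of copied files" boils down to extra per-file metadata calls after each extracted file is written, which is the kind of work that adds up over thousands of small files. A minimal Python illustration of the idea (not Opus's actual code):

```python
# Hedged illustration: copying a file's data is one operation, but preserving
# its metadata afterwards costs additional system calls per file.
import os
import shutil
import tempfile

def preserve_timestamps_demo() -> bool:
    with tempfile.TemporaryDirectory() as d:
        src = os.path.join(d, "a.txt")
        dst = os.path.join(d, "b.txt")
        with open(src, "w") as f:
            f.write("x")
        os.utime(src, (1_000_000_000, 1_000_000_000))  # backdate the source
        shutil.copyfile(src, dst)   # data only
        shutil.copystat(src, dst)   # then timestamps/mode: the extra per-file work
        return int(os.path.getmtime(dst)) == 1_000_000_000

print(preserve_timestamps_demo())  # True
```

Turning such options off removes those calls, which is consistent with the few seconds it shaved off above, but it clearly isn't the whole story.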

Extracting the RAR file with 7z directly gives me the same performance as WinRAR itself. So it's only Opus that is sluggish when extracting RAR files. Next I tried this: I compressed all the files in that folder into a 7z file, basically converting the archive from RAR to 7z. Extracting this 7z archive took only a few seconds, nearly instantaneous, with both 7z and Opus, but slower, back to the standard 10 seconds, with WinRAR. I wish we could choose the extractor Opus uses; I think I'd point it to 7z. Maybe you just have unrar.dll configured wrong in some way when it comes to unpacking RAR files.

Changing Preferences to use 7z.dll for RAR files instead of unrar.dll gives me nearly the performance of using WinRAR directly: 12 seconds vs 10 seconds. So that is a good fix, but I'd like to see if I can get it to actually match the performance. Where do I put the new DLL versions? In this path:
C:\Program Files\GPSoftware\Directory Opus\VFSPlugins

And what about the 32-bit version? Do I have to replace that in the 32-bit folder? Which version is the Opus plugin actually using? I mean, it says unrar.dll in the preference settings, not unrar64.dll, and I'm using 64-bit Opus. I tried replacing both unrar.dll and unrar64.dll in the respective plugin folders, along with the 64-bit version of 7z.dll (I don't have the 32-bit version of that installed, so I couldn't replace it). However, after restarting Opus I saw no performance difference: still 21 seconds with the Opus internal unpacker / unrar.dll, and 12 seconds instead of 10 when configured to use 7z.dll. Maybe I'm not putting the DLLs in the right folders?

Yeah, I've tried everything. It's not a matter of file version. Unrar.dll (actually unrar64.dll) is simply much slower at unpacking for me versus using WinRAR directly, even after I replace the DLL with the most current edition. Using 7z.dll gives me the performance of 7z directly, but 7z seems to be slightly slower in general when handling RAR files: maybe 20% slower versus WinRAR on the file I tested. I guess that's good enough until the issue can be resolved. Others should test this as well to confirm my findings. If it's confirmed, then maybe the problem was introduced in one of the more recent versions of unrar.dll, as I found a thread here from 2016 where people were praising unrar.dll as being faster than 7z.dll. I find it hard to believe they would intentionally make the DLL slower than their main product... at least not a performance degradation this significant. It's almost twice as slow to extract.

Using 7z.dll for just a small performance hit is acceptable for the convenience of Opus's internal unpacker. But if I were the Opus programmer, I'd be curious to confirm whether I too was getting a big, noticeable performance hit with unrar.dll, and if so, get back to the developer to ask why. Back in 2016 he seemed willing to help.

We have added native support for additional archive formats, including tar, 7-zip, rar, gz and many others, using the libarchive open-source project. You can now get improved performance of archive functionality during compression on Windows.


The particular RAR file doesn't matter; it's more about the size. You need one that is big enough, with enough files, that the decompression lasts long enough to time, especially if you are using an M.2 drive like I am. This requires an archive of 500 MB or more, which I'm not going to attempt to upload to the forum. However, I found an openly available test RAR you can download that illustrates the problem. Download the 616 MB sample.rar from here:

Use the Directory Opus Archives right-click context menu to extract it to a "sample" folder. Do this once with unrar.dll enabled and again with 7z.dll. For me it takes 9 seconds with unrar.dll and 5 seconds with 7z.dll. Now try it with WinRAR; for me, this took 5 seconds as well. So for this file the performance using 7z.dll matched WinRAR, since it's a relatively smaller archive. But why isn't the performance of unrar.dll the same as WinRAR's, or at least close? I'm seeing an 80% performance hit with unrar.dll, even with the latest unrar64.dll.
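If you'd rather not rely on a downloaded sample, a synthetic archive with many small files reproduces the shape of the workload described earlier in the thread. A sketch using ZIP, since Python's standard library cannot create RAR archives; the file counts and sizes are arbitrary choices:

```python
# Hedged helper: build and time a many-small-files archive, the workload where
# per-file overhead dominates. ZIP stands in for RAR for illustration only.
import os
import tempfile
import time
import zipfile

def make_test_zip(path: str, n_files: int = 1000, size: int = 4096) -> None:
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        for i in range(n_files):
            z.writestr(f"dir{i % 10}/file{i:05d}.bin", os.urandom(size))

def timed_extract(path: str, dest: str) -> float:
    start = time.perf_counter()
    with zipfile.ZipFile(path) as z:
        z.extractall(dest)
    return time.perf_counter() - start

def run_demo(n_files: int = 200) -> tuple[int, float]:
    with tempfile.TemporaryDirectory() as d:
        p = os.path.join(d, "many.zip")
        make_test_zip(p, n_files=n_files, size=512)
        secs = timed_extract(p, os.path.join(d, "out"))
        n = sum(len(fs) for _, _, fs in os.walk(os.path.join(d, "out")))
    return n, secs

print(run_demo())
```

Scaling `n_files` up while keeping each file small exaggerates per-file costs the same way the MSFS scenery archives described above do.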

As for my account, I'm only in week one of evaluating Directory Opus. It's a great product and I have every intention of registering it, hence why I'm here trying to make it even better. I've found it so useful that I can't imagine running Windows without it. But it's to my advantage to wait as long as possible to register, because v12 has been out for several years now and we don't know when v13 might be released. v13 could come out later this year, and you might say everyone who registered v12 from July 1st onward can upgrade for free, so holding out for my maximum 60 days wouldn't hurt. Perhaps you have a beta v13 I could test, or if nothing is imminent, I could go ahead and register. Let me know.

Switching back to the original archive, I did a test with RARLab's sample UnRar.exe code (which uses UnRar.dll). That is not slow, so the issue isn't entirely inherent to UnRar.dll. But I suspect the problem is when UnRar.dll is used in callback mode. (Opus needs to use callbacks to work with things like UAC and FTP destinations. The UnRar.exe sample doesn't need that, as it always writes its files directly, and fails if the destination needs UAC or isn't a real directory.)

Doing a little debugging, the callback we give to UnRar.dll is being called with new data (UCM_PROCESSDATA) in very small chunks as the files decompress. Small chunks are inevitable with small files, but they're also happening with the larger ones. I might be wrong, but I suspect the bottleneck is there, and that 7z.dll is faster because it buffers the data better, sending fewer, larger chunks instead of many tiny ones.
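A toy model of that theory: feeding many small chunks through a buffering callback collapses thousands of tiny writes into a handful of large ones. The 64 KiB threshold and the flush counter below are illustrative only, not what any real DLL does:

```python
# Hedged model of the UCM_PROCESSDATA pattern: the DLL hands the host many tiny
# chunks, and the host pays per-chunk overhead (a write call, progress update,
# etc.). Buffering and flushing in larger blocks cuts that overhead.
class BufferedSink:
    def __init__(self, threshold: int = 64 * 1024):
        self.threshold = threshold
        self.buf = bytearray()
        self.flushes = 0           # stands in for expensive per-write work
        self.written = bytearray()

    def process_data(self, chunk: bytes) -> None:
        """What a UCM_PROCESSDATA-style callback might do, with buffering."""
        self.buf += chunk
        if len(self.buf) >= self.threshold:
            self.flush()

    def flush(self) -> None:
        if self.buf:
            self.written += self.buf  # one large write instead of many tiny ones
            self.buf.clear()
            self.flushes += 1

# 4096 chunks of 1 KiB: unbuffered that would be 4096 writes; buffered it's 64.
sink = BufferedSink()
for _ in range(4096):
    sink.process_data(b"\x00" * 1024)
sink.flush()
print(sink.flushes)  # 64
```

If the unbuffered path also updates a progress callback per chunk, the per-chunk cost multiplies, which would fit the roughly 2x slowdown reported above.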
