Until not long ago, I didn't even know that you could compress specific folders, files, or even entire drives using Windows' built-in compression. A simple way to do this is to go to Properties and check "Compress contents to save disk space", and you're all set.
When I checked the file size in Properties afterwards, the size hadn't changed by a single byte; only the "size on disk" had shrunk. So it seems the file's data remains intact as-is (the unchanged hash value probably proves this), and the OS just compresses and decompresses it transparently on reads and writes.
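For anyone curious, here's a minimal sketch (Windows-only, error checking omitted, and the path is just a placeholder) that reads both numbers programmatically: the logical size via the ordinary file-size call, and the compressed "size on disk" via the Win32 GetCompressedFileSizeW API:

```python
import ctypes
import os

path = r"C:\example\somefile.txt"  # hypothetical path to a compressed file

kernel32 = ctypes.windll.kernel32
kernel32.GetCompressedFileSizeW.restype = ctypes.c_ulong

# GetCompressedFileSizeW returns the low 32 bits and writes the high 32 bits
# through the pointer argument.
high = ctypes.c_ulong(0)
low = kernel32.GetCompressedFileSizeW(path, ctypes.byref(high))
size_on_disk = (high.value << 32) | low

print("Size:        ", os.path.getsize(path))  # unchanged by NTFS compression
print("Size on disk:", size_on_disk)           # shrinks once compressed
```

On an uncompressed file the two numbers should match, up to cluster rounding.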
Performance: In theory, reading/writing the file becomes somewhat slower due to the compression/decompression overhead. However, this is usually negligible nowadays, and may even be offset by the drive having to read and write less data.
Compatibility: Old versions of Windows on your network might not be able to read the files. The default compression is a safe choice, but you can tweak the algorithm for some dramatic disk savings, at the cost of compatibility with older Windows versions.
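If you want to try the stronger algorithms, one way is Windows 10's built-in compact.exe. A hedged sketch driving it from a script (the flags are the documented ones; the folder path is illustrative):

```python
import subprocess

folder = r"C:\Games\SomeBigGame"  # illustrative target folder

# /C = compress, /S:<dir> = recurse into subfolders, /EXE:LZX = use the LZX
# algorithm (best ratio, but files become unreadable to pre-Windows-10 systems).
subprocess.run(["compact", "/C", "/S:" + folder, "/EXE:LZX"], check=True)
```

XPRESS4K, XPRESS8K, and XPRESS16K are the faster, lighter alternatives to LZX if write speed matters more than ratio.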
Since the death of Windows 7 was upon us, I put 10 on all my machines, and nothing big other than new bloatware really happened; then Windows decided my very full SSD needed to have every fucking non-OS file on it compressed. 10GB free is cutting it close, but this is a default, and I have to check and see if this is killing my 100000000 game saves, because I was too lazy to reroute the default locations a long time ago (kinda my fault, but still).
In signal processing, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation. Any particular compression is either lossy or lossless. Lossless compression reduces bits by identifying and eliminating statistical redundancy. No information is lost in lossless compression. Lossy compression reduces bits by removing unnecessary or less important information. Typically, a device that performs data compression is referred to as an encoder, and one that performs the reversal of the process (decompression) as a decoder.
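To make the lossless case concrete, a tiny sketch: highly repetitive input shrinks dramatically because the statistical redundancy is encoded away, and decompression restores every byte exactly.

```python
import zlib

original = b"ABABABAB" * 1000
packed = zlib.compress(original)

print(len(original), "->", len(packed))     # 8000 -> far fewer bytes
assert zlib.decompress(packed) == original  # no information lost
```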
There are VBA methods to zip and unzip using the Windows built-in compression as well, which should give some insight into how the system operates. You may be able to build these methods into a scripting language of your choice.
The basic principle is that within Windows you can treat a zip file as a directory and copy into and out of it. So to create a new zip file, you simply make a file with the .zip extension that has the right header for an empty zip file. Then you close it and tell Windows you want to copy files into it as though it were another directory.
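As a sketch of that first step: an empty-but-valid zip file is just the 22-byte end-of-central-directory record, i.e. "PK" plus bytes 5 and 6 plus 18 zero bytes.

```python
# Create a zip file that is empty but structurally valid: the 22-byte
# end-of-central-directory record with zeroed counts and offsets.
with open("new.zip", "wb") as f:
    f.write(b"PK\x05\x06" + b"\x00" * 18)
```

After that, Explorer (or Shell.Application's CopyHere from a script) will treat new.zip as a folder you can copy files into.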
Just for clarity: GZip is not an MS-only algorithm, as was suggested by Guy Starbuck in his comment from August. The GZipStream in System.IO.Compression uses the Deflate algorithm, the same one used by the zlib library and many other zip tools. That class is fully interoperable with Unix utilities like gzip.
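A quick way to see the relationship (shown here with Python's zlib-backed modules rather than the .NET class, purely as an illustration): gzip output is a Deflate stream inside a small header/trailer, which is why any Deflate implementation can unwrap it.

```python
import gzip
import zlib

data = b"hello hello hello"
gz = gzip.compress(data)

# wbits = 16 + MAX_WBITS tells zlib to expect the gzip wrapper around the
# raw Deflate stream, so plain zlib can decode gzip output directly.
assert zlib.decompress(gz, 16 + zlib.MAX_WBITS) == data
```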
Here's my attempt to summarize Windows' built-in capabilities for compression and decompression: How can I compress (zip) and uncompress (unzip) files and folders with a batch file, without using any external tools?
As for Shell.Application and WSH, I preferred JScript, as it allows a hybrid batch/JScript file (with a .bat extension) that doesn't require temp files. I've put unzip and zip capabilities in one file, plus a few more features.
There are both zip and unzip executables (as well as a boatload of other useful applications) in the UnxUtils package available on SourceForge. Copy them to a location in your PATH, such as 'c:\windows', and you will be able to include them in your scripts.
You may also want to check how the server is being used. If the disk has compression enabled, that will reduce the overall usage; but if this is a SharePoint file location, it might be using versioning, which only saves the changes rather than complete files again.
There is no SharePoint; it is a file server. I was thinking it might have something to do with versioning, which only saves changes. However, there are also a lot of new files (including PDFs and images) added on a daily basis.
Are these servers physical or virtual? If they are virtual, you may be looking at dynamically expanding VHD/VHDX virtual disks, which will always look like they are nearly full. They will continue to grow in size, as needed, until you run out of physical space on the physical drive(s) they reside on.
The capacity would always show the same overall size; the underlying space used would change, but what the guest sees would always be X. That won't change unless someone adds space, and then it will be obvious.
Virtual or not, and even if it is a dynamic disk or a thin-provisioned disk in VMware, the total size will always show what you configured; it won't change as the disk grows. If it says 300GB, it will always say 300GB, unless someone increases it.
I see Portuguese compresses the most, down to 40% (except for Chinese and Japanese), a bit better than my bigram idea (while losing direct indexing). Do you need to support mostly one specific language, or a few, and if so, which? My bigram idea is almost trivial to code (one possible sketch follows); a trigram version is more involved. Do you really need to get down to 33%?
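For reference, one plausible reading of that bigram idea (a hypothetical sketch, not the poster's actual code): map up to 128 frequent ASCII bigrams onto byte values 128-255 and keep 0-127 as literals, which at best halves the size and, as noted, gives up direct indexing.

```python
from collections import Counter

def build_tables(corpus):
    # Count overlapping character pairs and keep the 128 most frequent.
    pairs = Counter(corpus[i:i + 2] for i in range(len(corpus) - 1))
    table = {p: 128 + k for k, (p, _) in enumerate(pairs.most_common(128))}
    return table, {code: p for p, code in table.items()}

def compress(text, table):
    out, i = bytearray(), 0
    while i < len(text):
        pair = text[i:i + 2]
        if pair in table:
            out.append(table[pair])   # one byte stands for two characters
            i += 2
        else:
            out.append(ord(text[i]))  # literal; assumes 7-bit ASCII input
            i += 1
    return bytes(out)

def decompress(data, inverse):
    return "".join(inverse.get(b, chr(b)) for b in data)

text = "the quick brown fox jumps over the lazy dog " * 50
table, inverse = build_tables(text)
packed = compress(text, table)
assert decompress(packed, inverse) == text
print(len(packed) / len(text))  # at best ~0.5, i.e. 50% of original size
```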
I have lots of memory-compression ideas that have higher priority, e.g. for numbers. I choose the priority based on what excites me most at the time. If you want direct indexing into a string, then UTF-8 is already a problem; TranscodingStreams.jl would also be a problem, and it seems the two short-string compression methods already mentioned would be, too.
The GitHub link to Unishox now redirects to Unishox2, but I see the former in Yggdrasil. Does that mean it was built for Unishox, not Unishox2, which may or may not be incompatible (though I find it likely)?
Unishox was found to be the slowest of all, since it employs several techniques to achieve the best compression. However, this should not be too much of an issue in most cases, when a single string or a few strings are handled at a time.
Some quick tests show that it was the lib that was returning the wrong number of bytes for the uncompressed string, even though the part it returns is correctly decompressed. I am updating the jll to see if that fixes it; otherwise I will open an issue upstream.
Good morning.
I can no longer open some of my photos (a lot, actually). I click on them and it says no view is possible, even though they end in .jpg as they should. I believe I compressed these files when I did my last disk cleanup, hence my current situation.
Could you tell me whether I have indeed done what I think I did (or, if not, what you think happened), and, if I have compressed all of these images, how to uncompress them?
I am on Windows XP Pro.
Thank you in advance,
Vladimir
Thank you, Jon, for the instructions on how to decompress all the files that Windows compressed without your permission. I have now changed the setting to NOT allow that, but it was a nightmare trying to fix everything by hand until I saw your post. Your instructions worked, although I am obviously not as computer literate as you, and it took a while for me to figure out exactly what to do for my situation. THANK YOU SO VERY MUCH, Leo and Jon!!! Donna in Texas
I would like to know if there is a way to defrag without compacting files. I have over 67% disk space available and would like to avoid compression, since it will needlessly slow down file access. Thanks for your help!

Defragging and compression are unrelated. If you defrag using the built-in Windows tools, it will not automatically compress.
19-Sep-2009
Hi, I have run into trouble. I compressed a large amount of data on my PC, but unfortunately it crashed. I was able to recover all the data, but I am unable to read any of the files, and I get an error that the format is unrecognized. It's close to 3GB of important data; is there any means by which I can recover it? Regards
Are Windows compressed files readable through other operating systems? For example, if my computer breaks and I remove the hard drive, will the files be readable when connected to my Linux computer? Or, say a compressed file is saved in Dropbox, will I be able to open it using an appropriate Linux or Android app for that file type? Thanks again, Leo!
Very nice explanation! I was first introduced to file compression when I ran across this guide: -compression.html and learned how everything works, but I wanted to learn more, and I came to the right place. Thank you very much for explaining :)
Windows automatically compressed some of my video files due to low disk space. I worry that quality has been affected. Does uncompressing restore any losses in file quality, if any quality was even lost? Thanks.
I wanted to check if caching was the issue, so I cleared the files in these folders:
"C:\Windows\Temp", "C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Temporary ASP.NET Files\root", and "%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files", and then restarted the web server.
The HTTP RFC gives no specific guidance on how to choose from many Accept-Encoding values with the same priority, so it would be acceptable to return br content to those clients. However, IIS will choose the first one (left to right) that matches one of its configured compression schemes. This means it won't choose br if either gzip or deflate compression is also enabled.
The obvious solution is to disable gzip and deflate on your server so that br is the only match. However, because roughly 20-25% of Internet users (as of early 2018) are still using older web browsers that don't support Brotli, you probably want to keep gzip enabled on your server to support compression for those clients, at least for a while longer.
If you wish to leave both (or all three) schemes enabled, you must therefore take some action to force IIS to choose br when acceptable. To accomplish this, you can modify the Accept-Encoding header value on requests as they enter your IIS pipeline; the IIS URL Rewrite Module makes this easy.
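For instance, a rewrite rule along these lines (a sketch only; the rule name is made up, and HTTP_ACCEPT_ENCODING must first be added to the allowed server variables list in applicationHost.config) collapses the header to just br for clients that offer it, so IIS's left-to-right scheme match picks Brotli:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="PreferBrotli">
          <match url=".*" />
          <conditions>
            <!-- Only rewrite for clients that actually advertise br -->
            <add input="{HTTP_ACCEPT_ENCODING}" pattern="\bbr\b" />
          </conditions>
          <serverVariables>
            <set name="HTTP_ACCEPT_ENCODING" value="br" />
          </serverVariables>
          <action type="None" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Clients that don't advertise br keep their original Accept-Encoding header, so gzip fallback continues to work for them.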