Re: Highly Compressed Pc Games Less Than 7mb


Kian Trip

Jul 11, 2024, 11:17:09 PM
to haimanpiso

Hi Naked Scientists, I was just wondering - if planets like Jupiter are just gas giants, why is it they exert such enormous gravitational pull on surrounding matter, like the asteroid belt? Do they have a very large, dense core providing the pull or is the gas highly compressed contributing to the mass? Love the show, Orlando (Perth, Western Australia)

Dominic - Well, planets like Jupiter certainly do have cores. Jupiter, we think, has a rocky core that's about 10 times more massive than the Earth. Jupiter itself is a really vast planet: it has about 300 times the mass of the Earth and about 10 times its radius, and most of that mass is a mixture of hydrogen and helium gas. That gas is very heavily compressed, and that's how Jupiter manages to be so very massive.

Actually, it is in a state called metallic hydrogen, where the molecules are so compressed together that they form a lattice, and the electrons, rather than orbiting individual hydrogen nuclei, can flow freely through that metallic hydrogen. That's why Jupiter has such a strong magnetic field - the electrons flowing through the hydrogen generate it.

Chris - How did it get all of that gas in the first place? Hydrogen and helium being so light, how did they manage to coalesce around Jupiter before it got big and had all of that gravity?

Dominic - That's an interesting question that people are still researching. But I think the best theory at the moment is that once a planet reaches a mass of about ten times that of the Earth, its gravitational field becomes strong enough to pull in the gas around it, and you get a sudden, catastrophic infall of material onto the planet. So any planet less than about ten times the mass of the Earth will tend to be rocky, like the inner planets of the solar system, while any planet that creeps over that mass turns into a vast gas giant like Jupiter or Saturn.

Your real compression performance will probably depend a lot on the data you are putting in. Is it all geometry? If you have a lot of non-spatial data (or a lot of text attributes for spatial points), then it doesn't really matter what you do with the geometries - you need to find some way to compress that data instead.

As others have said, I think you are going to struggle to find a format that meets your compression requirements. You would have to create your own custom format, which given your requirement to use commercial software is not going to be viable.

I think you need to possibly first consider how you can make your data models more efficient, then look at the compression aspects. For example, do you have a lot of repetition of geometry? You could then have a base set of geometry layers with unique IDs and then separate attribute data sets that reference the geometry by ID - that way you can have multiple views of the same geometry serving specific functions. Most decent software packages will then allow you to create joins or relates in order to create the unified view for a layer.
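The geometry-by-ID idea above can be sketched at the command line. This is illustrative only - the file names, column names, and values are all made up - but it shows the shape of the model: one base geometry table keyed by ID, and a separate attribute table that references those IDs, joined on demand (the `<(...)` process substitution requires bash).

```shell
# Hypothetical base geometry layer: one row per geometry, keyed by id.
printf 'id,wkt\n1,POINT(1 2)\n2,POINT(3 4)\n' > geometry.csv
# Hypothetical attribute set referencing the geometry by id.
printf 'id,name\n1,Depot\n2,Office\n' > attributes.csv

# Join the two tables on the shared id column to produce the unified view,
# much like a join/relate in a desktop GIS package. tail strips the header
# rows; join requires sorted input on the join key.
join -t, <(tail -n +2 geometry.csv | sort -t, -k1,1) \
         <(tail -n +2 attributes.csv | sort -t, -k1,1) > joined.csv
cat joined.csv
```

Several attribute files can reference the same geometry table this way, which is the "multiple views of the same geometry" point made above.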

GML is a good example of a format that supports this kind of relational data model, though, being a verbose format, its file sizes will be large. You can compress GML using gzip and potentially get a 20:1 ratio, but then you are relying on your software being able to read compressed GML.
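As a rough sketch of why verbose XML compresses so well: the toy file below is deliberately repetitive, so it compresses far better than typical real-world GML would - the 20:1 figure quoted above depends entirely on how much repetition your data contains. (The `gzip -k` flag, which keeps the original file, needs gzip 1.6 or later.)

```shell
# Generate a deliberately repetitive GML-like file (made-up content).
for i in $(seq 1 5000); do
  printf '<gml:Point srsName="EPSG:4326"><gml:pos>%d %d</gml:pos></gml:Point>\n' "$i" "$i"
done > features.gml

# Compress at maximum level, keeping the original for comparison.
gzip -k -9 features.gml

# Compare the sizes.
ls -l features.gml features.gml.gz
```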

Regardless, I would urge you to first look at your data models and see where there could be savings to be had. FME from Safe Software is your best bet if you need to start manipulating your data models.

To achieve that sort of ratio, you could use some sort of lossy compression, but I don't know of anything that uses it, and although I have a couple of ideas on how one might implement it, it would be far from standard. It would be much much cheaper to kit your server out with a 1TB disk than to spend time and money developing a custom solution.

You are also confusing data storage with data representation. Your 4th point mentions being able to view the data at different scales, but this is a function of your renderer, not the format per se. Again, a hypothetical lossily compressed file could store data at various resolutions in a sort of LoD structure, but that is likely to increase data size if anything.

If your data is to be on a server somewhere accessible by mobile applications, you're far better off using existing tools that have been designed for the purpose. A WFS server (such as GeoServer or MapServer) is ideally suited to this sort of application. The client makes a request for data of a specific area, normally that covered by the screen, and the WFS sends vector data for just that area, so all the heavy lifting is done by the server. It's then up to the application to render that data.

An alternative would be to use the WMS features of MapServer and GeoServer, in which all the rendering is done by the server, which then sends an image tile to the client. This enables features such as server-side caching of tiles, as well as scale-dependent rendering, with the minimum of work by you.

They both read myriad formats, so you can author your data exactly how you like, and store it where you like, and they do all the cool stuff. Quantum GIS also has a WMS server, so you can author and serve data all in the same application.
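The "request data for just the visible area" step looks roughly like this. The host and layer name below are made up, but the query parameters (`service`, `version`, `request`, `typeName`, `bbox`) are the standard WFS 1.1.0 GetFeature ones:

```shell
# Build a hypothetical WFS GetFeature request for the features inside the
# screen's bounding box. The server URL and layer name are invented.
BASE="https://example.com/geoserver/wfs"
QUERY="service=WFS&version=1.1.0&request=GetFeature"
QUERY="$QUERY&typeName=topp:roads"
QUERY="$QUERY&bbox=-74.02,40.70,-73.97,40.75,EPSG:4326"

echo "$BASE?$QUERY"
# A client would then fetch it with something like: curl -s "$BASE?$QUERY"
```

The server clips to the bounding box and returns only those features as vector data, which is what keeps the client lightweight.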


We shed light on the specific hydration structure around a zinc ion in a nanosolution confined in the cylindrical micropore of a single-wall carbon nanotube (SWNT), by comparison with the structure confined in the cylindrical mesopore of a multi-wall carbon nanotube (MWNT) and with that of the bulk aqueous solution. The average micropore width of the open-pore SWNT was 0.87 nm, which is equivalent to the size of a zinc ion hydrated by 6 water molecules. We could impregnate the zinc ions into the SWNT micropore with negligible amounts of ion-exchanged species on surface functional groups by appropriate oxidation followed by heat treatment under inert conditions. X-ray absorption fine structure (XAFS) spectra confirmed that the proportions of dissolved species in the nanospaces, relative to the total adsorbed amounts of zinc ions on the open-pore SWNT and MWNT, were 44 and 61%, respectively, indicating the formation of a dehydrated structure in the narrower nanospaces. The structural parameters obtained from the analysis of the XAFS spectra also indicate that a dehydrated and highly compressed hydration structure can form stably inside the cylindrical SWNT micropore, where the structure differs from that inside a slit-shaped micropore whose width is less than 1 nm. Such a unique structure requires not only a narrow micropore geometry equivalent to the size of a hydrated ion but also the cylindrical nature of the pore.

Because I do not fully understand how compression works at its core, I have (possibly ridiculous) concerns that sending a pre-compressed .tar to gzip might prevent gzip from compressing as well as its potential would allow, and things of that nature.

As you stated, "tar can also compress" - which implies that tar does not always compress data by itself. It does so only when used with the z option, and even then not by itself, but by passing the tarred data through gzip.
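A minimal sketch of that equivalence: the two commands below produce the same underlying tar stream, one compressed by tar's z option and one by piping through gzip explicitly (the demo directory and file names are made up).

```shell
# Create some sample input.
mkdir -p demo && echo "hello" > demo/file.txt

tar -czf one.tar.gz demo            # tar compresses via its z option...
tar -cf - demo | gzip > two.tar.gz  # ...which is just a pipe through gzip

# Decompress both: the recovered tar streams are the same archive.
gzip -dc one.tar.gz > one.tar
gzip -dc two.tar.gz > two.tar
cmp one.tar two.tar && echo "identical"
```

The gzip containers themselves can differ slightly (header metadata), which is why the comparison is done on the decompressed tar streams.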

Usually neither gzip nor tar can create "the absolute smallest tar.gz". There are many compression utilities that can compress to the gz format. I have written a bash script "gz99" to try gzip, 7z and advdef to get the smallest file. To use this to create the smallest possible file run:

The advdef utility from AdvanceCOMP usually gives the smallest file, but is also buggy (the gz99 utility checks that it hasn't corrupted the file before accepting the output of advdef). To use advdef directly, create file.tar.gz however you feel like. Then run:

Since you only recently learnt that tar can compress, and didn't say why you wanted the smallest ".tar.gz" file, you may be unaware that there are more efficient formats that can be used with tar files, such as xz. Generally, switching to a different format gives a vastly better improvement in compression than fiddling around with gzip options. The main disadvantage of xz is that it isn't as common as gzip, so the people you send the file to might have to install a new package. It also tends to be a bit slower, particularly when compressing. If this doesn't matter to you, and you really want the smallest tar file, try:

In this trivial example, we see that to get the smallest gz we need advdef (though 7z -tgzip is almost as good and a lot less buggy). We also see that switching to xz gains us much more space than trying to squeeze the most out of the old gz format, without compression taking too long.
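The gzip-versus-xz gap is easy to reproduce. The sketch below compresses the same input with both at maximum level; on repetitive data like this, xz (LZMA2) usually wins by a wide margin, though the exact ratio depends on your data. (`gzip -k` and `xz -k` keep the original file; `-k` needs gzip 1.6 or later.)

```shell
# Generate a compressible sample file (made-up data).
seq 1 200000 > numbers.txt

gzip -k -9 numbers.txt      # produces numbers.txt.gz
xz   -k -9 numbers.txt      # produces numbers.txt.xz

# Compare the sizes of the original and both compressed versions.
wc -c numbers.txt numbers.txt.gz numbers.txt.xz
```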

I would like to take advantage of the free storage so my question is if the photos are originally less than 16 megapixels, is high quality going to be the same as original or would Google compress it more?

Also, what happens to the existing photos on my Google Drive that are less than 16 megapixels? Are they automatically not counted toward my quota, or do I need to somehow migrate them to Google Photos? It's just that the majority of the files on my Drive are photos (each less than 16 megapixels), and since I downloaded Photos, the space available has not changed.

I compared both images and I can't see a difference between them, although the file sizes differ (the image in Google Photos is 1.3 MB and the one in Dropbox is 2.2 MB). I made the same comparison with an older image (from when I had the "original" option enabled in Google Photos), and the files are exactly the same.

Pictures uploaded using 3rd-party software will be stored as-is regardless of the quality option, and will use quota if their longest dimension is bigger than 2048px - even if they are less than 16 MP, and even if the option is set to free storage.
