Because that photo is actually smaller on the webpage, the lightbox is obscuring some of it. Is there a way to make that black lightbox smaller? I have other photos on this page and it's happening on those as well. I will just target this collection if I can make this happen. Thanks in advance.
The only things I know about reducing file size are to use the RGB colour setting or to export the artwork as a PDF, etc. But I want to keep it as an AI file and keep my vectors editable; I don't want to flatten or rasterise my file.
The pattern swatches used are mostly JPEGs and one PNG file; none of them are paths. They have all been rasterized to 150 ppi. I am not sure what their file sizes were before I dropped them into the Illustrator file; maybe I should have resized them before placing them in Illustrator?
You can save a copy of your file and your colleague's file as PDFs, then open them in Acrobat and go to File > Save As Other > Optimized PDF, and click on Audit space usage; that will give you a list of the PDF contents and their sizes. Comparing the two PDFs should give you some insight into the difference in file size.
If your raster images are scaled down in the layout, their effective resolution may be much higher than necessary, which results in increased file size. Generally, the raster elements will increase the file size more than the vector elements.
You can reduce the effective raster resolution in the PDF copy when you optimize it in Acrobat, or resize the raster images in Photoshop and relink them to your Illustrator file, to produce a smaller PDF.
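As a quick sketch of the "effective resolution" idea above (all numbers hypothetical), the effective ppi of a placed raster image is just its native pixel dimension divided by the size it occupies on the page:

```python
def effective_ppi(pixel_width, placed_width_inches):
    """Effective resolution of a raster image after scaling in a layout."""
    return pixel_width / placed_width_inches

# A hypothetical 3000 px wide pattern swatch placed at 5 inches wide
# ends up far above the ~150 ppi it was rasterized at:
print(effective_ppi(3000, 5))  # 600.0 ppi
```

So an image that was "rasterized to 150 ppi" can still carry much more data than needed if it is scaled down after placement, which is why downsampling during PDF optimization recovers so much file size.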
The size of the output PDF depends very much upon the layers in the map and the other export settings, not just the DPI. I have tabloid (11 x 17) PDFs that export at under 1MB @ 300 DPI, and letter-sized maps that export at closer to 5 MB at < 150 DPI.
If your map contains lots of vector data, you might get smaller file sizes by using the "output as image" option in ArcGIS Pro, which forces flattening of vector layers, similar to the "optimize PDF" option in Adobe Acrobat.
@JeremyWright Thank you. We are on 2.7.0, and between the time I posted this and your response we were able to figure out how to use "output as image" properly. We can't use "Adaptive" compression because the result looks bad. But with JPEG image compression at near-maximum quality, "Compress vector graphics" enabled, 300 dpi, raster resample set to Best, and no georeference information, layers, or attributes exported, we can get a PDF similar to what we would expect from ArcMap.
@Anonymous User if you could share your project with me (package it from the Share tab of the ribbon -> Project Package) so I could evaluate the adaptive compression issue you noted, that would be super awesome. Adaptive attempts to determine the best compression type from the type of image object being compressed; if it looks bad, that may mean our heuristic for picking the compression type needs to be adjusted. I can DM you separately to work out file transfer details.
@JeremyWright Ok great. Well, I talked to my tech, and it looks like in this instance the PDF looked OK but it didn't compress well. The adaptive-compression PDF is 13 MB, whereas the JPEG-compression PDF is 2 MB. I can send you the project package. DM me and I'll be happy to send it.
(Sorry, off topic: My tech also mentioned both 2.7.2 and 2.7.3 have issues with adding items to legends during the export. These items are usually tied to web services added via URL, and the best workaround right now is to make the items visible in the legend in the layout and place a white rectangle over them. Maybe you're already aware of that.)
I would like to be able to compress the PDF maps I create when I export them, so they can be easily e-mailed while maintaining the sharpness and readability of the PDF. The workaround is to create the PDF, open it in Acrobat, and use their add-on compression tool (which requires a subscription). It would be handier to do this directly within ArcGIS Pro and bypass Adobe.
@Anonymous User I'm glad the improvements we've implemented over the last couple of releases are working out for you; several changes were made to improve file size. We have even more improvements going into the next release of ArcGIS Pro as well.
If you have a specific map or layout that proves particularly hard to get comparable exports on, I'd encourage you to try ArcGIS Pro 3.0 when it is released later this year - we have a few enhancements to export file size reduction that may prove very useful for you.
I noticed that after compiling, my projects take a huge amount of my storage (1 GB and above), and I'm just curious whether there is a way to make the project size smaller. Something like sharing the most commonly used dependencies between projects, or something like that?
I would generally run cargo clean on a project after I'm done working on it. So at least I'll only have one (or a few commonly used) project(s) taking up the extra disk space of compilation artifacts at a time.
Sccache trades reduced compilation time for increased disk usage. It copies the compiled rlibs into every project-specific target dir when cargo runs on the respective project, and in addition it keeps a copy in its own cache. AFAIK it doesn't use hardlinks or anything like that to save space.
That is accurate; sccache works essentially by wrapping rustc to check the shared compilation cache first. But if it finds something there, rustc/cargo uses it normally, i.e. writes the compilation artifact(s) into the target directory.
If you want to share target artifacts between projects, your best bet is going to be a workspace, as within a workspace cargo can actually unify dependency versions, so you aren't using se...@1.0.187 in one project and se...@1.0.188 in another. If you can't or don't want to use a monolithic workspace, setting $env:CARGO_TARGET_DIR can redirect each project's ./target to some shared directory. This won't help projects select the same dependency configuration, but when they do, there'll only be one copy of each dependency compiled. Then with a cargo clean every now and then (e.g. whenever you rustup update and the old artifacts are unusable anyway), the target directory size won't grow unboundedly.
Hm, sounds like room for improvement. In particular, I'm thinking that on file systems like btrfs and ZFS you could deduplicate data using reflinks (basically copy-on-write, and thus safer than hardlinks). I might take a stab at implementing that, since I use both sccache and btrfs.
One possibility is to mount the projector in the conference room just above the window, at the ceiling. This would keep the projector out of the center of the room yet near your electronics. The other possibility is to mount the projector very close to the screen and get a short-throw projector for the very short distance. Either way, you could use a KVM extender to get the video feed where you need it.
As others have suggested, you need to get a projector with the correct focal length (throw). Use the projection configuration tables that others have already posted to calculate this based on the distance from the screen to the projector and the size of the screen.
A white bed sheet can be used for rear projection if you want to test. I have included two links at the end that can help with the projector side of things: the first will let you enter your current projector and see what it can do, and the second will allow you to enter a throw distance and desired image size to find a projector that matches.
There are two ways to do rear projection. The easy way to do rear projection is to fire the projector directly at the screen from the proper distance for the desired screen size. If you do not have the required distance there are rear projector mounts that have a mirror to allow for larger images while requiring less depth behind the screen.
The kicker is, I am placing the projector where an old-style 35mm slide projector would have been placed. The rooms were designed for that decades ago and the technology available at that time was able to handle it.
I know this is an old thread, but nobody gave the correct answer to the dilemma, so for the benefit of anyone else looking for the same solution: the answer is actually the opposite of what has been recommended. A long-throw projector is the solution. A long throw is designed to give a large image at a longer distance, so it will give a much smaller image at a short distance, whereas a short throw is designed to give the maximum-sized image at a very short distance.
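The long-throw vs. short-throw reasoning above follows directly from the throw ratio (throw distance divided by image width). A small sketch with hypothetical numbers:

```python
def image_width(distance_ft, throw_ratio):
    """Projected image width, given throw distance and the projector's
    throw ratio (throw ratio = throw distance / image width)."""
    return distance_ft / throw_ratio

# At the same hypothetical 4 ft distance:
print(image_width(4, 2.0))   # long-throw  (ratio 2.0) -> 2.0 ft image
print(image_width(4, 0.5))   # short-throw (ratio 0.5) -> 8.0 ft image
```

So at a fixed, short distance, the higher throw ratio of a long-throw lens is what keeps the image small.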
I have exactly the same problem, with a fixed ceiling projector position (22 feet from the lens) and a particular screen size of 100". I am just looking for a digital HDMI video processor that can reduce the picture size.
I used to work on an 80 GB HDD back in 2016, and it actually felt snappy most of the time. After purchasing a new laptop with a 1 TB HDD, much more RAM, and a faster CPU, the HDD actually felt much slower than my old one, even though it was a newer generation (not sure about the RPM).
After doing it, the drive actually felt much snappier than before for some reason. The file manager runs quickly, and the overall performance of the HDD just feels much better. I want to know if it's just a placebo or a legit thing that has been going on since before the SSD era.
It improves seek performance by limiting the drive's head movement. Hard drive performance is primarily limited by three factors: Seek time (the time it takes to move the heads in or out to the desired cylinder), rotational latency, and of course the actual data transfer rate.
Most modern 3.5 inch hard drives have average seek times in the 9 to 10 msec range. Once a "seek" is done, then the drive has to wait for the start of the desired sector to come under the heads. The average rotational latency is simply half of the time it takes for the drive to turn one full revolution. A 7200 rpm drive, turns at 120 revs per second, so a rev takes 1/120 sec, so half a rev - the average rotational latency - is 1/240 sec, or 4.2 msec. (Note that this is the same for every 7200 rpm hard drive.) So we have an average of about 13 msec before we can start transferring data.
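The arithmetic in the paragraph above can be written out directly. A minimal sketch (using the same 7200 rpm drive and a hypothetical 9 ms average seek):

```python
def avg_access_ms(avg_seek_ms, rpm):
    """Average time before data transfer can begin:
    average seek time plus average rotational latency (half a revolution)."""
    rev_ms = 60_000 / rpm            # one full revolution, in milliseconds
    return avg_seek_ms + rev_ms / 2  # latency is half a revolution on average

# 7200 rpm drive with a 9 ms average seek:
print(round(avg_access_ms(9, 7200), 1))  # 13.2 ms
```

This is why limiting head travel (shorter seeks) helps so much: the rotational-latency term is fixed by the spindle speed, so seek time is the only part of the access time you can influence.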