I have a problem where users report that their images aren't being uploaded and the old ones are still there. On closer inspection, the new images are there; they just have the same name as the old ones. On upload I rename the images for SEO purposes, and when a user deletes an image its old index becomes available and is reused. A new image therefore ends up with the same file name as a deleted one.
You can put an http-equiv meta tag in the page's head to tell the browser not to use the cache (or, better, to use it in some defined way), but it is better to configure the server to send proper HTTP cache headers. Check out an article on HTTP caching.
If you look at the data exchanged between your browser and the server, you'll see that the browser sends a conditional request for the images (a GET with an If-Modified-Since header rather than a full download). If the image is unchanged, the server replies 304 Not Modified without the actual image data. Make sure the modification time changes when the image changes on the server, and the browser will download the image again.
The time() function returns the current timestamp, which is different on every page load. So this code deceives the browser: it sees a different URL, assumes the image has changed since the user last visited the site, and re-downloads it instead of using the cached copy.
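The same timestamp trick can be sketched in Python (a minimal, hypothetical helper; the answer above presumably refers to PHP's time(), but the idea is identical in any language). Note that because the URL changes on every load, this defeats caching entirely:

```python
import time

def cache_busted_url(path):
    """Append the current Unix timestamp as a query parameter.

    Every page load yields a different URL, so the browser treats the
    image as new and re-downloads it instead of using its cached copy.
    Trade-off: the image is re-fetched on every single page view.
    """
    separator = "&" if "?" in path else "?"
    return f"{path}{separator}t={int(time.time())}"

# cache_busted_url("/images/photo.jpg") → "/images/photo.jpg?t=<current timestamp>"
```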
If you do this after a new image is uploaded, the browser should see those headers and make a call to the backend, skipping the cache. The new image is now in cache under that URL. The primary user will see the new image, but you still retain all the benefits of caching. Anyone else viewing this image will see updates in a day or so once their cache invalidates.
I ran into this issue some time ago while fetching JSON data over AJAX. I simply appended a Math.random() value to the request URL as a cache-busting parameter, and it worked like a charm. The backend I used was Flask.
Query String: You can add a query string to the image URL with a unique parameter each time the image changes. For instance, image.jpg?v=1, image.jpg?v=2, and so on. This signals to the browser that the resource is different from what it has cached.
HTTP Cache Headers: Configure your web server to send appropriate cache-control headers. You can set the Cache-Control header to no-cache or max-age=0 for the directory containing the images. This tells the browser to revalidate the cache before using the cached version.
Server-Side Scripting: If you have server-side scripting capabilities (like PHP, Node.js, etc.), you can dynamically generate the image URLs with unique identifiers each time the page is loaded.
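A sketch combining the query-string and server-side-scripting ideas above (hypothetical helper name; any server-side language works the same way): derive the version token from the file's contents, so the URL changes only when the image actually changes, and caching keeps working for unchanged images.

```python
import hashlib

def versioned_url(public_path, file_bytes):
    """Build an image URL whose ?v= token is a hash of the file's bytes.

    Unchanged image -> unchanged URL   -> the browser may cache freely.
    Replaced image  -> new ?v= token   -> the browser must re-download.
    """
    token = hashlib.md5(file_bytes).hexdigest()[:8]
    return f"{public_path}?v={token}"
```

Unlike a per-request timestamp, this preserves the benefits of caching: users only re-download when the underlying file has really changed.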
Is there a way I can force my site to load my main background image before it loads all the small images on my site? I know in IE it seems to be fine, but in FF it loads the background image at the end.
About the only way to make it more likely that the background will load first is to have it cached on their computer before starting to load the current page so that it is immediately available when called for.
Removes (and un-tags) one or more images from the host node. If an image has multiple tags, using this command with the tag as a parameter only removes the tag. If the tag is the only one for the image, both the image and the tag are removed.
You can remove an image using its short or long ID, its tag, or its digest. If an image has one or more tags referencing it, you must remove all of them before the image is removed. Digest references are removed automatically when an image is removed by tag.
And when I built a custom script to inspect the HTTP headers, the response had the correct MIME type (image/jpeg or image/png). This makes me think that full browsers (unlike a bare curl script) send an Accept header that causes this particular web server to output a WebP image instead of the PNG or JPG, regardless of whether the URI specifically requests the PNG or JPG.
This is not really a web browser thing, but a web server thing. When the browser says it can handle WebP images, some (alas: many) servers send WebP images, even when the URL would suggest it is a JPG or PNG image. The browser then displays and saves what it gets: the WebP image.
Obviously this problem is avoidable by adding images with distinct file names, but I wondered if there were some better way that works regardless of naming convention. Or something else that I'm completely missing?
@t31os Thanks for your response. I'm sure it is just my browser caching the images. But (I should have specified) I'm not the only user. I'd prefer not to have to tell my users to refresh every time they delete something, or to build a clunky page refresh into image deletion.
This is a browser cache issue, not a WordPress media library issue. The reason you see the old images in the media gallery is because you used the same file name and your browser is trying to save time and bandwidth by loading the versions it already has.
Add a query variable (?v=5) to the end of your image names as used on the front-end. This won't affect the media gallery ... but it will prevent people from seeing cached versions of old images on the front-end. For reference, this is the same method we use in WP core to force the browser to re-download updated JavaScript files.
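That technique can be sketched like this (a hypothetical helper, not WordPress code; WP core itself passes a version string via the `$ver` argument of `wp_enqueue_script()`). Using the file's modification time as the version means the URL changes automatically whenever the file is replaced:

```python
import os

def versioned_src(public_path, path_on_disk):
    """Append the file's modification time as ?v=, so the URL changes
    whenever the file on disk is replaced or updated."""
    mtime = int(os.path.getmtime(path_on_disk))
    return f"{public_path}?v={mtime}"
```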
You can tell your server (Apache) to set a custom "expires" date and time for different images either based on the extension (all jpg or png files) or the filename, if you want to get really granular. This is more of an advanced trick, but you can use this to immediately flush everyone's cache whenever you upload new images.
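As a rough sketch of what that might look like in an .htaccess file (assuming mod_expires is enabled; the directive names are real Apache modules, but the specific values here are illustrative, not a recommendation):

```apache
# Illustrative only: make browsers revalidate image types on every visit
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 0 seconds"
  ExpiresByType image/png  "access plus 0 seconds"
</IfModule>
```

Note that shortening the expiry only affects future requests; it cannot retroactively purge copies browsers have already cached under a longer lifetime.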
1) everything from 3 to 5 stars. This is some 6,500 images and has been synced from Classic to the cloud. These, I know, will be available to other devices as smart previews only, and that's just fine by me.
My question is this: when, at a remote location, using a macOS rather than iOS device, and therefore running a full, logged-in, desktop version of CC, I want to view one of the files in category 2) above at full resolution, WTF am I supposed to do? The only way I seem able to force the machine to load a full-res version is to tell it to open the file in Photoshop. I would have thought that zooming to 1:1 would force a full-res version to load to this local machine from the cloud. Failing that, I would have thought there would be, under a menu item or a right click on the image, or FFS SOMETHING, a way to make a full-res version load. After all, I uploaded it and paid for the storage specifically so that I could do this. But oh no: no sign of a way to do it, no way of finding a good Help resource, and the usual CC annoyance of being made to feel stupid because, no matter how many support pages I visit or videos I watch, the CC speciality is to make you uncertain about what is where and how to get it to where you want it.
So I am left with the familiar CC problem: loads of unexplained duplicates, no ability to do a simple and bleedin' obvious thing that any sane human would want to do and nowhere to find a quick answer.
A lot of us actually share that opinion. LR CC is extremely far from being ready for real use - it can't even print! It is a fun toy to play with for shooting with your phone and doing light edits on an iPad that will sync back to your main Classic catalog.
The entire point of putting this stuff in the cloud is so that when I move from home to my studio to rented studio to travel etc, I can access my key images in full resolution without carrying a hard drive with me. That means being able to do so from any computer I use to log into my CC account. I am not going to tell each and every computer to download an entire library of images: 1800 images is just a starter number - over time it could be several TB of images (which is presumably why Adobe offers the ability to purchase up to 10TB of cloud storage). So if 1800 images is just for starters, no one in their right mind is going to force each computer they use to download the entire library. It just isn't how a professional photographer would work. I currently run a desktop machine, two laptops and an iPad and iPhone and I need to be able to access full res on all of these, bar the iPhone, easily. That's surely the entire point?
It is insanity not to have a simple right-click option like 'Force Download of Original', and it is yet another example of how badly wrong this piece of software is going. It is bad enough that I can't store my Classic LRCAT in the cloud (short of some very kludgy Dropbox workarounds that some people have learned to live with), but to introduce a cloud storage option that then combines the worst of the cloud and desktop computing models takes a particular form of warped logic.
1) That is an absurd suggestion. One does not want to build up an array of locally stored full-res images on every machine on which one works. A real-world use case is that the photographer is working away from base and needs to access the full-res original of a particular image so as to edit it, show it to a client, print it, whatever. So the ability to force full res into the develop function for one photo at a time, without committing to store that photo locally thereafter (thereby littering the local drive with unwanted and unneeded files), is important and should and COULD be quickly and easily done. If, as staff, you think that a 'workaround' is a solution to a core need, it shows how badly the rot has set in on the LR team.
2) Your suggestion doesn't work. Try it. It merely seems to add the file to the queue of 'things that need downloading' and does not in any way prioritise it or cause it to load in full res straight away. The only thing that DOES do that is 'Edit in Photoshop.'
Reference Item 2... I did try it. It worked as expected. The image downloaded locally in a few seconds - on command. But I see in your most recent comment that you want priority over other queued items as well. That wasn't mentioned earlier. As I had no items currently syncing, it had top priority and performed accordingly. Priority download would be a new (and, afaik, not yet available even as a workaround) feature.