The removal of the old version 6.0.0.0 runs fine (all installed files are removed), but when the installer continues with installing the new version, two files are missing: a third-party DLL and a third-party EXE (neither of which has changed) are not reinstalled.
If I click on "Repair" after the major upgrade, the two missing files reappear. Also, if I install version 6.0.1.0 for the first time (no upgrade, a fresh installation on a clean machine), those two files are installed directly and normally (tested on several Windows machines: XP, 7 and 8).
I've seen this problem with this updater in the past. Christopher is correct. The updater updated its files but didn't tell the MSI (it doesn't update the MSI, which is not the correct thing to do). The new MSI thinks newer files are already on the machine and chooses not to install its copies, but during the upgrade the old package removes the files (it doesn't notice that the versions are newer). Since the new installer chose not to install the files, you end up with nothing... until the repair.
To work around the problem, you need to move your RemoveExistingProducts action later. If you're using the MajorUpgrade element then Schedule='afterInstallExecute' or Schedule='afterInstallFinalize' should do the trick. You'll need to be more careful with the Component Rules.
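In WiX 3.x authoring that looks roughly like this (a minimal sketch; the DowngradeErrorMessage text is just a placeholder):

```xml
<!-- Sketch: reschedule RemoveExistingProducts to run late, so the old
     package's files are removed only after the new ones are laid down.
     afterInstallExecute keeps the removal inside the installer transaction;
     afterInstallFinalize runs it after the transaction is committed. -->
<MajorUpgrade Schedule="afterInstallExecute"
              DowngradeErrorMessage="A newer version of [ProductName] is already installed." />
```

With the late scheduling, the Component Rules matter: keep component GUIDs stable and don't remove resources between versions, otherwise the old package can delete files after the new ones were installed.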
I have had the same problem. The issue is that during a major upgrade, MSI first checks which components to install (and all DLLs with a lower version than the ones already installed are marked as "do not install"), then removes the installed app, and then installs the new version, but without those previously marked components.
A log file would help. My guess is it's based on where you have scheduled RemoveExistingProducts. I've seen situations where Costing figures out a file being installed is the same as a file already installed and decides to not install the file. Then the major upgrade occurs and you end up not having the file. The repair works because the file isn't there and costing realizes that it needs to be installed.
I had another solution to this problem, but the previous reply certainly pointed me in the right direction. The DLLs in my .NET project were being assigned a lower version number than in my previous installation. Going into the AssemblyInfo.cs files and incrementing the third octet from 0 to 1 solved it: WiX now recognized the DLLs as newer.
The error still exists with installer 5.0 and is still a problem. The workaround of placing RemoveExistingProducts after InstallFinalize is no solution for us. I forced the update by setting a property on the single file.
My WiX installer (Wix 3.10, MSI 4.5) uses MajorUpgrade for updating. The files to be installed are harvested with heat.exe in pre-build. The current (older) msi file contains a file nlog.dll (which came with a NuGet package v4.1.0) that has a file version of 4.1.0.0, a product version of 4.1.0 and last write time of 2015-09-01.
Since the nlog team ran into some strong naming issues, they published an updated NuGet package v4.1.1, containing an updated nlog.dll with its file version decreased back to 4.0.0.0 while its product version has been increased to 4.1.1, last write time is 2015-09-14.
Now I'm running into an issue similar to the one Robbie described here: wix major upgrade not installing all files. When I install the new msi package and the major upgrade is performed, the present nlog.dll (which is newer according to its file version, but older according to its file date and product version) is removed, but the new nlog.dll isn't installed.
However, using Schedule="afterInstallExecute" or Schedule="afterInstallFinalize" as suggested won't do the trick for me. Instead of removing the newer file and not installing the older one as in Robbie's case, it doesn't overwrite the present file, and just leaves it in place.
Long story short, I would like my installer to simply install all files that come with it, regardless of any file/product/assembly versioning stuff. There are valid circumstances in which replacing a newer file with an older one is desired. Can't you just tell the installer engine to ignore file versions/dates? If not, what are my options?
The other trick is to use "version lying". This is where you author the file element with a higher version. Using heat can make this difficult as now you have to transform the XML before compiling it.
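For example, a hypothetical version-lied File element (the Id, Source path and version number here are made up; WiX emits a warning when DefaultVersion disagrees with the binary's real file version):

```xml
<Component Id="NLogDllComponent" Guid="PUT-GUID-HERE">
  <!-- Version lying: declare the file as 4.1.1.0 even though the binary's
       file version is 4.0.0.0, so it beats the installed 4.1.0.0 copy. -->
  <File Id="NLogDll" Source="$(var.BinDir)\NLog.dll"
        KeyPath="yes" DefaultVersion="4.1.1.0" />
</Component>
```

Since heat generates the File elements for you, the attribute has to be injected afterwards, for instance with an XSLT transform passed to heat via its -t switch.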
Of course the real solution is to hit the nlog team over the head. But based on what I've seen from them over the years it'll never happen. Perhaps you just use a resource editor to hack the DLL and 'fix' the version #. That's assuming you don't need it strong named. This feels dirty and a possible CM nightmare to me though.
If this is a major upgrade and you want everything to be uninstalled before the new product is installed, then you schedule RemoveExistingProducts after InstallInitialize or InstallValidate. That does the uninstall first.
I can't tell if you're getting the "disallowing install..." issue or not, but if you are, and there are other clients of the DLL (it's shared with other installed products), then I'd see if that DLL has support for private copies so you can have your own private copy for your product. If it is shared with other products I wouldn't use version lying - I'd open the DLL with Visual Studio "open as file" and change the version! Make that your latest shared version, so every package that installs it can just use it.
If it's not shared with other products and you're just running into that MSI quirk, then make your own upgrade element and schedule RemoveExistingProducts before CostInitialize, which is what is deciding not to install. That works, but it's before MigrateFeatureStates so you will lose feature migration in your major upgrade.
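A sketch of that manual authoring (the UpgradeCode, version range and property name are placeholders; note that Windows Installer only accepts RemoveExistingProducts in a handful of sequence positions, the earliest being between InstallValidate and InstallInitialize):

```xml
<Upgrade Id="PUT-UPGRADECODE-HERE">
  <UpgradeVersion Minimum="1.0.0" Maximum="6.0.1.0"
                  IncludeMinimum="yes" IncludeMaximum="no"
                  Property="PREVIOUSVERSIONSINSTALLED" />
</Upgrade>
<InstallExecuteSequence>
  <!-- Remove the old product as early as the engine allows, before its
       installed files can influence what the new package decides to install. -->
  <RemoveExistingProducts After="InstallValidate" />
</InstallExecuteSequence>
```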
I have been saving files as MRB files. I realized today, while going through my registrations, that the full-body scans that had previously been used for segmentation (which were all different) had somehow been reduced to copies of the second full-body volume. I have no idea how this happened, as until a couple of weeks ago there was no issue. I still have the original files that I segmented, but when I pulled them all into the same scene and subsequently started registration, all of the volumes except the one I registered to turned into duplicates of the first moving volume. Here is an example:
[three screenshots showing the duplicated volumes]
Here are some examples of the original files when I reloaded them all prior to saving as an MRB file:
[two screenshots showing the original, distinct volumes]
Can you reproduce the issue?
Do you have scenes that you can load individually and they appear correctly but when you load them all into one scene they show up differently?
Can you share those scenes so that we can investigate?
Do you get any warnings or errors when you load the scene?
If you run out of memory then anything can happen. How much RAM do you have? How much virtual memory have you configured for your system? What is the size of all data sets in the scene (uncompressed)? Is there any error logged during scene loading? Can you share the data sets?
To investigate or fix the problem, it is essential to be able to reproduce it. If you cannot share the images, then blank them out with some simple shapes (you can do that using Segment Editor / Mask volume, after installing the SegmentEditorExtraEffects extension), save the scene, and see if you can reproduce the problem using these modified images. Or save the images in NHDR format, delete the pixel data (.raw file), and send all the other files.
You can mask the volume using Segment Editor as I described above. But it may be even simpler to use numpy to set all voxels in the image to a constant value (and use a different constant value in each file) - see here: _voxels_in_a_volume
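The numpy approach can be sketched like this (a minimal standalone example using a random stand-in array; in Slicer you would obtain the array from the volume node with `slicer.util.arrayFromVolume` and notify the node afterwards with `slicer.util.arrayFromVolumeModified` - the file names and constants below are made up):

```python
import numpy as np

# Stand-in for the voxel array of one loaded volume; in 3D Slicer you would
# get this view with: voxels = slicer.util.arrayFromVolume(volumeNode)
voxels = np.random.randint(0, 255, size=(4, 16, 16), dtype=np.int16)

# Blank out the image: overwrite every voxel with a constant, using a
# different constant per file so the anonymized volumes stay distinguishable.
constant_per_file = {"scan1.nrrd": 10, "scan2.nrrd": 20}
voxels[:] = constant_per_file["scan1.nrrd"]

# In Slicer, tell the volume node that its voxel data changed:
# slicer.util.arrayFromVolumeModified(volumeNode)
```

After blanking, save the scene again and check whether the duplication still reproduces with the modified images.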
I think I figured out the problem anyway. My CPU and memory were being maxed out, and my storage was 2 GB shy of being full. In fact, when I last tried running it, the whole computer overloaded and shut down.
Thanks for the update. In general, most software behaves unpredictably (crashes, reports various errors, etc.) when it runs out of memory. As a rule of thumb, allocate at least 10x more virtual memory than the total size of all data that you load, backed by as much physical memory as possible.
Context: I work with a lot of PDF files, creating/exporting many and storing almost everything on Dropbox (for now, though I am seriously considering ending my contract and looking elsewhere). I am often commuting, so I work on the train with unreliable WiFi or hotspots.
If I move a PDF file into my Dropbox folder while the connection is offline or unstable, I lose access to the file completely and it cannot be opened - even though it is a local file, created on this very machine. This has only been a problem since the recent CloudStorage changes. I just want Dropbox to work how it used to: as a folder on my machine that I could use like any other local folder, which then updates/syncs when connected but never stops me from accessing my own local files.
Take as an example this file that originated in my Downloads folder on the system. I just moved it into Dropbox and am currently connected via mobile hotspot so connection is a little patchy but I do have internet (obviously). The file appears like so:
Normally after the internet connection stabilises or my system restarts, Dropbox completes syncing and I can open the file. But this is not satisfactory for me, as I am often working within Dropbox to open, create and edit PDF files, even without an internet connection.
Is this behaving as expected? If so, I cannot continue using Dropbox, as I am losing hours of work by not being able to open my own files until they are... uploaded? It makes no sense to me! Any fixes would be welcome, but this is quite hard to replicate.