After a recent dom0 update froze the Qubes 3.2 system I'd been using for the past year and a half, I upgraded to Qubes 4.0 on the same Samsung EVO 850 SSD. My PC has a Gigabyte AORUS GA-Z270X-Gaming 7 (rev. 1.0) motherboard running the latest UEFI-enabled BIOS.
After updating dom0 successfully, I restored my templates. I then updated to Whonix 14 following the uninstall & install instructions on the Qubes website.
All seemed well and I shut down. The next day, when I attempted to restart my system, instead of launching Qubes it asked me to insert a bootable device. I went into the boot options by pressing F12 and noticed that there was no longer a "Qubes" entry in the list of bootable devices. There was a selection of drives, one of which was empty. When I selected the SSD, I kept getting the same error, and the system still would not boot. I have tried installing Qubes on multiple SSDs; each install succeeds, but the system will not boot again after being shut down.
I do not believe that all my drives have simultaneously failed. My online searches led me to believe that this may be related to a UEFI problem with my BIOS. My efforts to use the Qubes Recovery option were unsuccessful. I did find a recommendation on the Qubes website to use rEFInd, but the instructions are not sufficiently detailed for me to follow.
Any assistance in getting my Qubes system back up and running is greatly appreciated. It's a wonderful operating system and I'd love to keep using it.
Hi,
Thank you for responding. I apologize if I seem like a newbie - this particular problem is not something I've ever faced before.
I inserted the Qubes 4.0 installation CD and noticed that there were two boot options corresponding to my CD drive. Both identified the CD drive by manufacturer, but one was preceded by "UEFI" while the other was not. I first tried to boot with the UEFI version of the drive. It never gave me the option to recover a Qubes installation; it launched directly into the installer.
So I shut down and restarted using the non-UEFI CD-ROM drive option. I was able to get to the Qubes recovery, selected 1 (one) to attempt to find my installation and mount it under /mnt/sysimage/, entered the password to decrypt the drive, and let it do its thing. Once it was done, I pressed Enter to get to a shell and entered chroot /mnt/sysimage/ as suggested by the instructions.
I then attempted to execute the command at the bash-4.3# shell prompt that the link suggested:
efibootmgr -v -c -L Qubes -l /EFI/qubes/xen.efi -d /dev/nvme0n1p1
That did not work; it returned the following error message: "EFI variables are not supported on this system".
I tried again with sudo prepended to the command, but got the same error message. I attempted the process again, but instead of selecting 1, I selected 2 to mount read-only (just on the off chance it would make a difference). I also tried option 3 (skip to shell), but got the same response each time.
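From what I have read since, efibootmgr prints "EFI variables are not supported on this system" when the running kernel has no access to EFI variables, which usually means the rescue environment itself was booted in legacy/BIOS mode rather than UEFI mode. A quick way to check, run from the rescue shell before chrooting (a sketch based on my reading, not something from the official instructions):

  # if this directory exists, the rescue shell booted in UEFI mode
  ls /sys/firmware/efi
  # if it exists but efivars is missing, try mounting it explicitly
  mount -t efivarfs efivarfs /sys/firmware/efi/efivars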
To be sure, I went into my BIOS and confirmed that the following two options at the bottom of the BIOS tab were set as follows:
Storage Boot Option Control = UEFI
Other PCI Devices = UEFI
I also noticed that the BIOS reported "No NVME device found", so I am not sure whether that would alter the command.
I changed both of the options above to Legacy and tried to execute the efibootmgr command again, with the same result.
I welcome further suggestions on what I might try to get my system to boot again.
One other thing: leaving everything set to Legacy in my BIOS, I decided to start from the very beginning and reinstall Qubes 4.0 instead of trying to recover.
When I got to the disk selection screen, I saw that /dev/sda was set up with two partitions: /dev/sda1 is the EFI partition and /dev/sda2 is the LUKS-encrypted partition containing the OS itself.
I decided to stop the install and try recovery again with your first suggestion. I booted into recovery mode and chose option 1, which mounts the /dev/sda device as follows:
/dev/sda1 -> /mnt/sysimage/boot/efi
/dev/sda2 -> /mnt/sysimage
This time I did not execute the chroot command. I also realized the original thread was about getting a Qubes 3.2 system running again, so I altered the command a bit since I am working with a Qubes 4.0 system. I tried to access the EFI partition directly via the following commands, with their responses outlined below:
command: sudo efibootmgr -c -L Qubes40 -d /dev/sda -p 1 -l \\EFI\\qubes\\xen.efi
response: sudo: error while loading shared libraries: libsudo_util.so.0: cannot open shared object file: no such file or directory
command: efibootmgr -c -L Qubes40 -d /dev/sda -p 1 -l \\EFI\\qubes\\xen.efi
response: EFI variables are not supported on this system
command: sudo efibootmgr -c -L Qubes40 -d /dev/sda -p 1 -l \\mnt\\sysimage\\boot\\efi\\EFI\\qubes\\xen.efi
response: sudo: error while loading shared libraries: libsudo_util.so.0: cannot open shared object file: no such file or directory
command: efibootmgr -c -L Qubes40 -d /dev/sda -p 1 -l \\mnt\\sysimage\\boot\\efi\\EFI\\qubes\\xen.efi
response: EFI variables are not supported on this system
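One thing I noticed in the efibootmgr man page: the path given to -l is relative to the root of the EFI System Partition selected by -d and -p, not a path in the mounted filesystem, so the \mnt\sysimage\... variants above could never be right even if EFI variables were available. If I ever do get an EFI-capable shell, I believe the correct form would be something like this (a sketch; the ESP location matches my layout above):

  # run as root; the ESP is /dev/sda1, hence -d /dev/sda -p 1
  efibootmgr -v -c -L Qubes40 -d /dev/sda -p 1 -l '\EFI\qubes\xen.efi'

Also, the rescue shell already runs as root, so sudo should not be needed at all, which sidesteps the libsudo_util.so.0 error.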
To sum up: despite the different conditions I set up, the proposed solution is still not working.
awokd:
Thank you for the additional suggestions. I have several drives with the same issue, so I will attempt your solution on one of them.
I am currently trying to install Qubes 4.0 from scratch on another of my drives after making absolutely sure the BIOS settings are set to "Legacy" instead of "UEFI" whenever possible.
On my first attempt at this, I opted not to create the Whonix VMs or enable updates over Tor, so I could just restore the backed-up Whonix 14 templates from before the system failed to boot. That turned out to be a bad decision, as it left dom0 unable to download updates and would not let me restore the templates.
I had to start the fresh install all over again, but this time, when I destroyed the old partitions and had Qubes re-partition automatically, I noticed a difference in the setup. Instead of /dev/sda1 being an EFI partition, it was now set up as an ext4 partition. Perhaps that's what I needed to do all along.
I will let you know how this goes, as well as your additional suggestion on another disk. It may be a while. :)
Thanks again!
It has been a week since my last post. I again tried reinstalling Qubes 4.0. Here is what made the difference between a working boot and the error I described previously:
In the installer:
-- Select the option to manually set up partitioning rather than auto.
Within the manual partitioning section, there is an option to automatically create the partitions. Selecting it sets the partitions up for you but displays the partitioning scheme so you can alter the arrangement. I selected this option and wound up with an ext4/LUKS/swap partitioning scheme (see the sketch after this list).
-- Continue to install normally
-- Select updates over TOR
-- Set up sys-net/sys-whonix systems, but not work/private/vault
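Here is roughly how I would verify the resulting layout versus the earlier UEFI one (a sketch; device names and partition numbers are from my system and may differ):

  lsblk -o NAME,FSTYPE,MOUNTPOINT /dev/sda
  # legacy install: sda1 ext4 /boot, sda2 crypto_LUKS, sda3 swap
  #                 (GRUB boot code lives in the MBR)
  # UEFI install:   sda1 vfat /boot/efi, sda2 crypto_LUKS
  #                 (xen.efi loaded directly by the firmware)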
Once that was done, I proceeded incrementally, rebooting after each step and confirming that each new change did not leave the system unable to boot (update/reboot, restore/reboot, and so on).
For increased isolation, I have set up my hardware so that I can swap in different SSDs using a trayless removable hard drive rack, each with its own install of Qubes 4. I also have a removable rack for a 7200 RPM backup drive, which holds my Qubes template and AppVM backups. During my incremental installation process, I tested booting both with and without the backup drive present.
My newly reinstalled Qubes OS worked fine for the next five days, during which the backup drive was always present. Some template and AppVM updates were done during this time. I shut down and removed both the SSD and the backup drive when I was expecting repairmen to visit my residence and could not be present. When I inserted the SSD and rebooted this morning (Saturday 9/29/2018) without the backup drive present, I got the error that caused me to start this thread: "No bootable device found - Insert a bootable device." I tried swapping in one of the other SSDs and got the same problem. Then I inserted the previously functioning SSD AND the backup drive. This time it booted.
Does anyone have any insight into why the OS only boots with the backup drive present? During typical use, I don't keep the backup drive connected.
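If it helps with diagnosis: my working theory is that the installer wrote the GRUB boot code to the backup drive's MBR instead of (or in addition to) the SSD's, which would explain the dependency. A sketch of how I plan to check, with device names that are assumptions for my setup:

  # GRUB's MBR stage embeds the literal string "GRUB";
  # inspect the first sector of each drive
  for d in /dev/sda /dev/sdb; do
      echo "== $d =="
      dd if=$d bs=512 count=1 2>/dev/null | strings | grep GRUB
  done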
Thank you