GSoC Anti Evil Maid improvement project


Patrik Hagara
Mar 26, 2017, 8:37:00 AM
to qubes...@googlegroups.com, rust...@openmailbox.org

Hi!

I'm thinking about applying to the GSoC program and working on
the Anti Evil Maid shoulder surfing and video surveillance
resistance project idea.

However, I've got a question regarding the proposed solution,
which requires implementing both a TOTP-based machine-to-user
two-factor authentication mechanism and support for a secondary
AEM device containing a passphrase-protected LUKS keyfile (so
the user has to carry two USB sticks and a TOTP token with them
at all times).

Wouldn't it be sufficiently secure to use only one external
AEM device containing both the AEM-protected boot partition
and a TPM-sealed LUKS keyfile? This approach would protect
against keyboard sniffing, as no passphrases need to be entered
(the LUKS keyfile can optionally be protected by an additional
passphrase, a TPM SRK passphrase would not add any extra
security with an external AEM device, and the BIOS boot
passphrase can still be sniffed by the attacker anyway); screen
sniffing is also prevented by getting rid of the static AEM
secret displayed upon successful unsealing, which would be
replaced by unsealing of the LUKS keyfile. Such a setup would
successfully and automatically boot the OS from the encrypted
drive only if the TPM PCR values allow the LUKS keyfile to be
unsealed.
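
Roughly, the boot-time flow I have in mind would look like the sketch
below (a minimal illustration only, assuming tpm-tools and cryptsetup
are available in the initramfs; the paths and device names are
hypothetical placeholders, not the names AEM actually uses):

    # Sketch of the single-stick flow: unseal the LUKS keyfile with the
    # TPM, then unlock the root volume with it (no typed passphrase).
    # Every path and device name below is an illustrative placeholder.
    import subprocess
    import sys

    SEALED_KEYFILE = "/mnt/aem-stick/luks-keyfile.sealed"
    UNSEALED_KEYFILE = "/run/aem/luks-keyfile"   # RAM-backed location
    LUKS_DEVICE = "/dev/sda2"

    def unseal_keyfile() -> bool:
        # tpm_unsealdata (tpm-tools) only succeeds if the PCRs match the
        # values the keyfile was sealed against
        rc = subprocess.run(["tpm_unsealdata", "-z",
                             "-i", SEALED_KEYFILE,
                             "-o", UNSEALED_KEYFILE]).returncode
        return rc == 0

    def open_luks() -> bool:
        # unlock the root volume with the unsealed keyfile
        rc = subprocess.run(["cryptsetup", "open",
                             "--key-file", UNSEALED_KEYFILE,
                             LUKS_DEVICE, "luks-root"]).returncode
        return rc == 0

    if __name__ == "__main__":
        if not unseal_keyfile():
            # PCR mismatch: the boot chain has been tampered with
            sys.exit("TPM refused to unseal the keyfile, do not trust this machine")
        if not open_luks():
            sys.exit("failed to open the encrypted root volume")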

The advantages of this approach compared to the proposed
solution are: no need to always carry and protect three
separate 2FA devices (AEM boot stick, AEM keyfile stick and a
TOTP token); no manual verification by the user is necessary;
and no sensitive data that would allow an attacker to defeat
the AEM trust model is ever displayed on screen or typed on
the keyboard.

To defeat the protection offered by this solution, an attacker
would need to clone or seize the user's AEM stick (and possibly
also sniff up to three additional passphrases, namely the one
used for extra encryption of the TPM-sealed LUKS keyfile, the
BIOS boot passphrase and the TPM SRK passphrase).

If I made a silly mistake somewhere, please do correct me.


Regards,
Patrik

Rusty Bird
Mar 26, 2017, 2:21:52 PM
to Patrik Hagara, qubes...@googlegroups.com

Patrik Hagara:
> I'm thinking about applying to the GSoC program and working on
> the Anti Evil Maid shoulder surfing and video surveillance
> resistance project idea.

Awesome!

When the attacker is infecting the user's computer, they could add some
code to copy the sealed encrypted keyfile into the nooks and crannies
(firmware, reserved disk sectors, ...) of that computer. A multi-stage
attack would go like this:

1. Visually capture the boot passwords (or retroactively access CCTV)
2. Infect the computer's boot code (but back up the uninfected version)
3. Wait for the user to connect the AEM stick on the next boot attempt.
Even if they immediately destroy the stick after noticing
something's up, they're still screwed:
4. Seize computer, restore sealed encrypted keyfile, restore uninfected
boot code, replay captured passwords, decrypt disk

So when the computer fails to authenticate itself, not only must the
user have the discipline to stop _using_ the computer. If multi-stage
attacks are part of their threat model, they must also _destroy_ the
computer:

- completely: because who knows where exactly the sealed encrypted
  keyfile has been copied to
- quickly: before the attacker can get to it

Whereas, if we put the encrypted keyfile on a separate stick that's only
ever connected _after_ the computer has authenticated itself to the
user, getting rid of that is more realistic. It's just a dirt cheap
little commodity device that doesn't contain any irreplaceable data.

A nice setup would be two microSD cards, in tiny USB adapters[1] if
necessary, on a key chain. We could call them "verification stick" and
"decryption stick", sort of like how people are used to having a
building key and an apartment key?


That said, although the scheme you're proposing wouldn't prevent the
multi-stage attack, it still has _a lot_ of good things going for it:

- It improves security and UX, compared to the AEM status quo
- UX is vastly better than the two sticks + TOTP device scheme
- No need to trust the TOTP device (probably a smartphone - eew!) to
  authenticate the computer
- No need for conspicuous TOTP verification


We could support both schemes and let users choose either or none:

- If secret.luks.encrypted.sealed2 exists on the verification stick:
  unseal it, decrypt it, and use it as a keyfile
- Otherwise, if secret.totp.sealed2 exists and we're not in "I don't
  have my TOTP device" fallback mode: unseal it and show the code
- Otherwise, unseal and show secret.png.sealed2/secret.txt.sealed2
- If we don't already have a keyfile, and the user inserts a decryption
  stick before unplugging the verification stick, unseal
  secret.luks.encrypted.sealed2 from there, decrypt it, and use that
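
To make that ordering concrete, here is a rough sketch of just the
selection logic in Python (the directory arguments and action labels
are made-up placeholders; the actual unsealing and decrypting would
stay in the existing AEM scripts):

    # Pure decision logic for the combined scheme described above.
    # verify_dir/decrypt_dir are the mount points of the two sticks; the
    # returned action labels are placeholders, not real AEM interfaces.
    import os

    def plan_boot_secrets(verify_dir, totp_fallback=False, decrypt_dir=None):
        def find(name, base):
            path = os.path.join(base, name)
            return path if os.path.exists(path) else None

        actions = []
        luks = find("secret.luks.encrypted.sealed2", verify_dir)
        if luks:
            # simple scheme: unseal, decrypt, use as keyfile, done
            return [("unseal-decrypt-keyfile", luks)]
        totp = find("secret.totp.sealed2", verify_dir)
        if totp and not totp_fallback:
            actions.append(("unseal-show-totp-code", totp))
        else:
            static = (find("secret.png.sealed2", verify_dir)
                      or find("secret.txt.sealed2", verify_dir))
            if static:
                actions.append(("unseal-show-static-secret", static))
        if decrypt_dir is not None:
            # decryption stick inserted while we still have no keyfile
            luks = find("secret.luks.encrypted.sealed2", decrypt_dir)
            if luks:
                actions.append(("unseal-decrypt-keyfile", luks))
        return actions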

I don't know if this combination would be too much work for a GSoC
project though?


Rusty


[1] https://www.kingston.com/en/flash/readers/fcr-mrg2 is top notch

Patrik Hagara
Mar 26, 2017, 4:29:29 PM
to qubes...@googlegroups.com, rust...@openmailbox.org

On Sun, Mar 26, 2017 at 6:21 PM, Rusty Bird <rust...@openmailbox.org> wrote:
> When the attacker is infecting the user's computer, they could add some
> code to copy the sealed encrypted keyfile into the nooks and crannies
> (firmware, reserved disk sectors, ...) of that computer. A multi-stage
> attack would go like this:
>
> 1. Visually capture the boot passwords (or retroactively access CCTV)
> 2. Infect the computer's boot code (but back up the uninfected version)
> 3. Wait for the user to connect the AEM stick on the next boot attempt.
> Even if they immediately destroy the stick after noticing
> something's up, they're still screwed:
> 4. Seize computer, restore sealed encrypted keyfile, restore uninfected
> boot code, replay captured passwords, decrypt disk
>
> So when the computer fails to authenticate itself, not only must the
> user have the discipline to stop _using_ the computer. If multi-stage
> attacks are part of their threat model, they must also _destroy_ the
> computer:
>
> - completely: because who knows where exactly the sealed encrypted
>   keyfile has been copied to
> - quickly: before the attacker can get to it
>
> Whereas, if we put the encrypted keyfile on a separate stick that's only
> ever connected _after_ the computer has authenticated itself to the
> user, getting rid of that is more realistic. It's just a dirt cheap
> little commodity device that doesn't contain any irreplaceable data.

For such a multi-stage attack, it could be much more effective and
still perfectly feasible to implant a passive hardware device into
the target computer that would silently capture and record the
relevant USB traffic. Such a device would be invisible to all existing
measured boot mechanisms and thus undetectable by the user -- the
computer would successfully authenticate itself, so there would be no
apparent reason not to insert the secondary USB stick with their LUKS
keyfile.

However, should hardware implant attacks be out of scope for the
user's threat model, then you're of course correct and the original
proposal provides better protection against multi-stage evil maid
attacks.


> A nice setup would be two microSD cards, in tiny USB adapters[1] if
> necessary, on a key chain. We could call them "verification stick" and
> "decryption stick", sort of like how people are used to having a
> building key and an apartment key?
>
>
> That said, although the scheme you're proposing wouldn't prevent the
> multi-stage attack, it still has _a lot_ of good things going for it:
>
> - It improves security and UX, compared to the AEM status quo
> - UX is vastly better than the two sticks + TOTP device scheme
> - No need to trust the TOTP device (probably a smartphone - eew!) to
>   authenticate the computer
> - No need for conspicuous TOTP verification

As I was thinking about the original proposal, I realized that most
users would probably opt to use a smartphone as a TOTP token -- and
since smartphones are not air-gapped, they could potentially be
tricked into rolling back their clock (e.g. via a spoofed cellular
network serving fake time synchronization data; this can be explicitly
disabled in the settings of most, if not all, phones, but it is a
mitigation the user has to remember to apply). TOTP code verification
can also become cumbersome if the clock difference between the
computer and the TOTP device grows too large (more than a few seconds
out of the 30-second TOTP interval, perhaps), which further increases
the maintenance burden placed on the user.
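
For reference, this is the plain RFC 6238 computation that both sides
would run (the shared secret below is made up). It shows that the code
changes every 30 seconds, so a few seconds of drift are enough to break
verification whenever the timestamp lands near a step boundary:

    # Plain RFC 6238 / RFC 4226 TOTP, to illustrate the 30-second step.
    import hashlib
    import hmac
    import struct

    def totp(secret, unix_time, step=30, digits=6):
        counter = struct.pack(">Q", unix_time // step)   # number of 30 s steps
        digest = hmac.new(secret, counter, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                       # dynamic truncation
        value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(value % 10 ** digits).zfill(digits)

    secret = b"example-shared-secret"   # made-up secret, for illustration
    t = 1490533290                      # a timestamp exactly on a 30 s boundary
    print(totp(secret, t - 1), totp(secret, t))
    # the two codes differ: clocks straddling the boundary by even one
    # second disagree, while drift within a step goes unnoticed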

That, together with the hardware implants undetectable by the trusted
boot process mentioned above, is why I devised the alternative scheme
(which is much simpler and has a near-zero maintenance cost).


> We could support both schemes and let users choose either or none:
>
> - If secret.luks.encrypted.sealed2 exists on the verification stick:
>   unseal it, decrypt it, and use it as a keyfile
> - Otherwise, if secret.totp.sealed2 exists and we're not in "I don't
>   have my TOTP device" fallback mode: unseal it and show the code
> - Otherwise, unseal and show secret.png.sealed2/secret.txt.sealed2
> - If we don't already have a keyfile, and the user inserts a decryption
>   stick before unplugging the verification stick, unseal
>   secret.luks.encrypted.sealed2 from there, decrypt it, and use that
>
> I don't know if this combination would be too much work for a GSoC
> project though?

In case you deem the probability of software-based (but requiring prior
physical access) multi-stage evil maid attacks much higher than
hardware-based ones, I could implement both schemes (probably not in the
two month time-frame of the GSoC, but I could work on it in my free time
before and/or after GSoC).

Rusty Bird
Mar 27, 2017, 10:44:37 AM
to Patrik Hagara, qubes...@googlegroups.com

Patrik Hagara:
> For such a multi-stage attack, it could be much more effective and
> still perfectly feasible to implant a passive hardware device into
> the target computer that would silently capture and record the
> relevant USB traffic. Such a device would be invisible to all existing
> measured boot mechanisms and thus undetectable by the user -- the
> computer would successfully authenticate itself, so there would be no
> apparent reason not to insert the secondary USB stick with their LUKS
> keyfile.

As long as the keyfile is encrypted, and the user has a PS/2 keyboard,
such a USB sniffer wouldn't necessarily be a problem.

> > We could support both schemes and let users choose either or none:
> >
> > - If secret.luks.encrypted.sealed2 exists on the verification stick:
> >   unseal it, decrypt it, and use it as a keyfile
> > - Otherwise, if secret.totp.sealed2 exists and we're not in "I don't
> >   have my TOTP device" fallback mode: unseal it and show the code
> > - Otherwise, unseal and show secret.png.sealed2/secret.txt.sealed2
> > - If we don't already have a keyfile, and the user inserts a decryption
> >   stick before unplugging the verification stick, unseal
> >   secret.luks.encrypted.sealed2 from there, decrypt it, and use that
> >
> > I don't know if this combination would be too much work for a GSoC
> > project though?
>
> In case you deem the probability of software-based (but requiring prior
> physical access) multi-stage evil maid attacks much higher than
> hardware-based ones, I could implement both schemes (probably not in the
> two month time-frame of the GSoC, but I could work on it in my free time
> before and/or after GSoC).

Hmm, what do the people on this list think about the TOTP device + two
sticks scheme? Is the small amount of added protection against
multi-stage attacks even worth the added complexity?

Rusty

Rusty Bird
Mar 29, 2017, 7:39:14 AM
to Patrik Hagara, qubes...@googlegroups.com, Marek Marczykowski-Górecki

Rusty Bird:
> Patrik Hagara:
> > In case you deem the probability of software-based (but requiring prior
> > physical access) multi-stage evil maid attacks much higher than
> > hardware-based ones, I could implement both schemes (probably not in the
> > two month time-frame of the GSoC, but I could work on it in my free time
> > before and/or after GSoC).
>
> Hmm, what do the people on this list think about the TOTP device + two
> sticks scheme? Is the small amount of added protection against
> multi-stage attacks even worth the added complexity?

Since nobody on the list seems to have strong feelings about the issue,
concentrating on the simpler scheme during GSoC sounds good to me.
(Please keep in mind it's impossible to guarantee in advance that the
project will be accepted. It depends on the strength of your final
written proposal, and on the number of student slots assigned by
Google.)

If you do get around to starting on the complex scheme before/during/
afterwards, then feel free to do it in any order - but the following
might be a good progression to familiarize yourself with the AEM code
base:

1. The TOTP part of the complex scheme. This would be nicely straight-
forward, I think.
2. The simple scheme. Getting systemd/dracut to dynamically use a
keyfile could possibly turn out to be a bit of a PITA.
3. The decryption stick part of the complex scheme, reusing code from
(2).

Rusty

Patrik Hagara
Mar 29, 2017, 9:45:16 AM
to qubes...@googlegroups.com, rust...@openmailbox.org, Marek Marczykowski-Górecki

On Wed, Mar 29, 2017 at 1:39 PM, Rusty Bird <rust...@openmailbox.org> wrote:
>>> In case you deem the probability of software-based (but requiring prior
>>> physical access) multi-stage evil maid attacks much higher than
>>> hardware-based ones, I could implement both schemes (probably not in the
>>> two month time-frame of the GSoC, but I could work on it in my free time
>>> before and/or after GSoC).
>>
>> Hmm, what do the people on this list think about the TOTP device + two
>> sticks scheme? Is the small amount of added protection against
>> multi-stage attacks even worth the added complexity?
>
> Since nobody on the list seems to have strong feelings about the issue,
> concentrating on the simpler scheme during GSoC sounds good to me.

Alright, I'll start drafting a GSoC proposal with the simpler scheme.

> (Please keep in mind it's impossible to guarantee in advance that the
> project will be accepted. It depends on the strength of your final
> written proposal, and on the number of student slots assigned by
> Google.)

Yes, I am aware of that possibility.

> If you do get around to starting on the complex scheme before/during/
> afterwards, then feel free to do it in any order - but the following
> might be a good progression to familiarize yourself with the AEM code
> base:
>
> 1. The TOTP part of the complex scheme. This would be nicely straight-
> forward, I think.

Agreed. I even saw an implementation [0] of this (seemingly targeted at
the dracut initramfs) with a fork [1] by the Heads [2] project. The
limitations mentioned in its README are not applicable to Qubes (except
the malicious firmware remark, of course), as Qubes already uses tboot
to also measure the initramfs. If porting it is not already a no-op, it
should take very little effort to make it compatible with Qubes.

> 2. The simple scheme. Getting systemd/dracut to dynamically use a
> keyfile could possibly turn out to be a bit of a PITA.

Could be, won't know until I try though.

> 3. The decryption stick part of the complex scheme, reusing code from
> (2).

Had I not found the TOTP attestation code, I'd probably move step #1 to
the end as it requires figuring out the Plymouth UI in addition to the
actual implementation "guts". However, assuming the existing TOTP code
can be trivially re-used, I do side with the ordering you proposed.


[0] https://github.com/mjg59/tpmtotp
[1] https://github.com/osresearch/tpmtotp
[2] https://trmm.net/Heads

Rusty Bird
Mar 29, 2017, 10:53:39 AM
to Patrik Hagara, qubes...@googlegroups.com, Marek Marczykowski-Górecki

Patrik Hagara:
> On Wed, Mar 29, 2017 at 1:39 PM, Rusty Bird <rust...@openmailbox.org> wrote:
> > 1. The TOTP part of the complex scheme. This would be nicely straight-
> > forward, I think.
>
> Agreed. I even saw an implementation [0] of this (seemingly targeted at
> the dracut initramfs) with a fork [1] by the Heads [2] project. The
> limitations mentioned in its README are not applicable to Qubes (except
> the malicious firmware remark, of course), as Qubes already uses tboot
> to also measure the initramfs. If porting it is not already a no-op, it
> should take very little effort to make it compatible with Qubes.

oathtool (available as a Fedora package) might fit better into the
existing code, because tpmtotp needs direct access to the TPM chip,
which would then require a special case in the (un)sealing scripts to
skip or delay tcsd startup, to invoke the proper sealing/unsealing
command, to transform the PCR configuration syntax in
anti-evil-maid.conf from tpm_(un)sealdata format to tpmtotp format,
and so on.
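
As a rough illustration (the path below is a placeholder, and the
flags should be double-checked against the oathtool man page), oathtool
can simply be handed a secret that the existing tcsd-based unsealing
path has already produced:

    # Sketch: generate the current code from an already-unsealed base32
    # TOTP secret, without talking to the TPM chip directly.
    import subprocess

    UNSEALED_TOTP_SECRET = "/run/aem/secret.totp"   # hypothetical location

    def current_totp_code():
        with open(UNSEALED_TOTP_SECRET) as f:
            secret_b32 = f.read().strip()
        # oathtool --totp -b <base32-key> prints the current 6-digit code
        out = subprocess.run(["oathtool", "--totp", "-b", secret_b32],
                             check=True, capture_output=True, text=True)
        return out.stdout.strip()

    print(current_totp_code())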

> Had I not found the TOTP attestation code, I'd probably move step #1 to
> the end as it requires figuring out the Plymouth UI in addition to the
> actual implementation "guts". However, assuming the existing TOTP code
> can be trivially re-used, I do side with the ordering you proposed.

You wouldn't have to do anything fancy UI-wise; "message NNNNNN"
(possibly looped in a background subshell) works both with and without
plymouth.

Rusty

Patrik Hagara
Apr 2, 2017, 10:57:08 AM
to qubes...@googlegroups.com, Marek Marczykowski-Górecki, Rusty Bird

Hi!

You can find and comment on the draft of my GSoC proposal for the AEM
improvement project at [0]. The proposal is also attached for archival
purposes.

I encourage you to review it, and if you have any questions or
suggestions, feel free to either reply to this e-mail or leave a
comment on the GitHub gist itself. All feedback is welcome and
appreciated!


Regards,
Patrik


[0] https://gist.github.com/phagara/f6274dd4a3cc1872bc3bf42c207d64e9

Attachment: aem-gsoc-proposal.md

Patrik Hagara
Apr 3, 2017, 7:21:23 AM
to qubes...@googlegroups.com, Marek Marczykowski-Górecki, Rusty Bird

Hi again,

I have pushed the final proposal to the GSoC website; you can find the
PDF attached to this e-mail.

$ sha1sum aem-gsoc-proposal.pdf
0643cf898a84eca883f0ed4352d6a04ce4300252 aem-gsoc-proposal.pdf


Regards,
Patrik

Attachment: aem-gsoc-proposal.pdf