Revised CAP-0021: Generalized transaction preconditions


David Mazieres

Mar 5, 2021, 5:09:17 PM
to stell...@googlegroups.com
I've revised CAP-0021, the latest version of which you can find here:

https://github.com/stellar/stellar-protocol/blob/master/core/cap-0021.md

One reason to revive this dormant CAP is that it would greatly simplify
the implementation of payment channels. I was particularly inspired
because I had what I thought was a good design for payment channels,
only to discover it was broken because Claimable Balances didn't work
quite how I thought they did (invalid Claimable Balance preconditions
don't invalidate a transaction, just make it fail). Since I anticipate
that payment channels will become more important as Stellar gains
increasing acceptance, I would like to make them as simple as possible
and eliminate opportunities for people to shoot themselves in the foot
accidentally.

More generally, I anticipate other benefits from the relative timelocks
enabled by this proposal. One example use highlighted in the revised
document is timelocked account recovery, allowing a third party to
recover a user's account if the user loses the private key, while also
giving the user an opportunity to object to unauthorized recovery
attempts.

Finally, while I don't have specific examples, I believe that CAP-0021
will make pre-auth transactions less brittle by allowing transactions to
run at a variety of sequence numbers. Note the proposal does *not*
change the fact that, for a transaction to be valid, the source account
sequence number must be less than the transaction sequence number. It
also does not change the fact that after executing a transaction, the
source account sequence number will equal the transaction's sequence
number. It does, however, add a non-default mode in which a transaction
can execute even if the account sequence number is not exactly the
transaction sequence number minus one.
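The rule spelled out above can be sketched in a few lines. This is illustrative Python with hypothetical names, not stellar-core code:

```python
from typing import Optional

def seq_num_valid(account_seq: int, tx_seq: int,
                  min_seq_num: Optional[int]) -> bool:
    """Sequence-number precondition sketch (names are illustrative).

    min_seq_num=None models the default mode, where the account's
    sequence number must be exactly tx_seq - 1.
    """
    if min_seq_num is None:
        return account_seq == tx_seq - 1
    # Relaxed mode: any account seqNum in [min_seq_num, tx_seq) works,
    # but the account seqNum must still be strictly less than tx_seq.
    return min_seq_num <= account_seq < tx_seq
```

In both modes, executing the transaction still sets the account's sequence number to tx_seq afterward.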

Feedback appreciated.

Thanks,
David

Nicolas Barry

Mar 8, 2021, 2:03:03 PM
to David Mazieres expires 2021-06-03 PDT, Stellar Developers
Cool, thanks for the update.

general note: please use the CAP template. In particular, we're now using diffs for xdr to avoid ambiguity and to ensure that changes are indeed complete.

As this proposal is meant to help implement payment channels, I think that we will only consider it if indeed it is an acceptable solution to that particular problem.

That said we can still discuss it assuming it meets that requirement.

There are a few changes packed in the same CAP, so I will try to separate them out.

## Preconditions on ledger state
I think the only change on that front is the addition of `ledgerBounds`. I do not see any problem with this, as a transaction's life cycle is already coupled with the ledger life cycle.

## Preconditions on account state

### minSeqAge/minSeqLedgerGap
I think the challenge with this precondition is that we can't have more than one transaction per account in the mempool, as we can only check this condition for the first transaction.
The implication is that we can only flood the first transaction in a chain of transactions.
This may limit quite a bit the kinds of protocols that those conditions can be used for.

As this condition is the basis for implementing "relative time locks", I can't help but notice that we're duplicating logic in the protocol.
I am wondering if there are better ways to do this using `BalanceEntry`, which already supports relative time locks (not ledger-sequence-number based, but we could add that there), since conditions in `BalanceEntry` are already quite generic and store state.
I remember that you had a proposal (a long time ago) that leveraged "statements" stored in the ledger (where the only thing that mattered was their existence). Can't we use certain `BalanceEntry` transitions like this (maybe by introducing new operations/transaction types)?

### minSeqNum
This change is different from the others: it loosens preconditions instead of adding a new one.
I imagine that in the context of payment channels, the sequence number ranges are very large (on the order of millions) and bad actors have access to transactions signed with any (older) sequence number. As a consequence, I think this condition introduces some challenges around spam and fairness in transaction queues, so we'll have to think of strategies to mitigate this (a bad actor can submit many of those transactions ahead of the "latest one", whose only purpose is to delay processing).
This problem may be compounded further by `minSeqAge/minSeqLedgerGap`.

### GeneralPreconditions
Maybe not as important, but `GeneralPreconditions` does not seem future-proof: it enumerates all conditions in one blob.
It would be better if we could make it a vector of sorts that we can extend over time. That way we avoid building a structure that defaults to the worst case from a binary-representation point of view.
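To illustrate the vector idea (purely a sketch, not a proposed XDR encoding): each precondition becomes its own tagged arm, and a transaction carries a list of them, so new kinds can be added later without growing one fixed struct. All names here are hypothetical:

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class LedgerBounds:
    min_ledger: int
    max_ledger: int  # 0 means "no upper bound"

@dataclass
class MinSeqNum:
    min_seq_num: int

@dataclass
class MinSeqAge:
    seconds: int

# A transaction would carry List[Precondition]; the empty list is the
# common case and costs nothing extra in the encoding.
Precondition = Union[LedgerBounds, MinSeqNum, MinSeqAge]

def check(preconds: List[Precondition], current_ledger: int,
          account_seq: int, seq_age: int) -> bool:
    for p in preconds:
        if isinstance(p, LedgerBounds):
            if current_ledger < p.min_ledger:
                return False
            if p.max_ledger != 0 and current_ledger > p.max_ledger:
                return False
        elif isinstance(p, MinSeqNum):
            if account_seq < p.min_seq_num:
                return False
        elif isinstance(p, MinSeqAge):
            if seq_age < p.seconds:
                return False
    return True
```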

## approach to extending ledger entries
This seems like an unrelated issue and probably needs its own thread.
In particular, it allows for sharing code without having to introduce an additional abstraction layer between xdr and the business logic.
It also allows you to share references to underlying fields without having to copy data around.



--
You received this message because you are subscribed to the Google Groups "Stellar Developers" group.
To unsubscribe from this group and stop receiving emails from it, send an email to stellar-dev...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/stellar-dev/87a6rhqmlx.fsf%40ta.scs.stanford.edu.

Orbit Lens

Mar 11, 2021, 1:00:35 PM
to Stellar Developers
Very impressive paper! The preconditions for payment channels look very well structured.

A few thoughts on the proposal.

1. Totally agree with Nicolas on using an array instead of a GeneralPreconditions extension containing all new additions at once. This seems like a more extensible and developer-friendly approach.
2. The "Key recovery" and "Parallel transaction submission" cases look a bit irrelevant. The former can be easily implemented using an auxiliary account added as a signer to the account that may need recovery, plus a transaction with a time-lock granting access to the auxiliary account. Parallel submissions based on preconditions can't guarantee execution if any of the submitted transactions gets stuck in a mempool or delayed for some other reason (the minSeqNum <= n < tx.seqNum condition is no longer met once other accepted transactions have bumped the sequence). Of course, it's a good alternative to the channel-accounts approach, but it is not as straightforward as it looks and requires additional resubmission logic on the client side.
3. Are there any plans to include some other account-based preconditions for extended smart contracts functionality (in this CAP or in some future upgrade)?
Like "trustline exists", "trustline authorized", "has signer with weight x". I understand that this feature request is out of the scope of this CAP, and the conditions mentioned above can be checked at runtime. However, not all of them can be checked without causing a transaction to fail during execution, which in turn bumps the source account sequence and effectively breaks the smart contract logic. Maybe this is more related to Horizon, since some of those preconditions may cause additional performance impact on the Core node. It would be nice to have clarity on this matter.

Thanks,
OL

Leigh McCulloch

Mar 11, 2021, 1:59:40 PM
to Stellar Developers
The GeneralPreconditions as defined assumes that the source account wouldn't want maximum bounds on the account's sequence number, sequence age, or sequence ledger gap.

If the GeneralPreconditions minSeqNum, minSeqAge, and minSeqLedgerGap were defined as ranges/bounds, like ledgerBounds, it would be possible to specify those upper bounds.

Technically the upper bound for the account's sequence number is the transaction's sequence number, but I don't think that would always be the desired bound. A source account may wish to specify the sequence number precondition independently of the sequence number that the account will be updated to.

I don't have examples for these, but if we're hoping to create more general preconditions, it's unclear to me why we'd only specify minimum bounds and not upper ones.

Leigh McCulloch

Mar 12, 2021, 4:26:22 PM
to Stellar Developers
Another thought:

We could generalize minSeqAge and minSeqLedgerGap, making them useful in more situations, by taking them outside of the account and making them a new ledger entry, e.g. a predictably identifiable TimerEntry. The TimerEntry would have a creationTime, a creationLedger, a relative expirationTime, and a relative expirationLedger. The timer would be expired when both are satisfied.

In CAP-12 there is a similar change to add a creationTime to a new type of account to be able to condition a future transaction on some time delay after creation time of the account.

Instead of pursuing minSeqAge and minSeqLedgerGap (CAP-21) or creationTime (CAP-12), both of which make it possible to condition transactions on specific moments, a TimerEntry would allow for conditioning transactions on arbitrary moments.

The TimerEntry would have two operations:
CreateTimerOp – Creates the timer with a predictable ID.
DeleteTimerOp – Deletes the timer, which is only valid if the timer has expired given the current time and ledger.
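The expiry rule sketched above (both the relative time condition and the relative ledger condition must hold) might look like this; all field names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TimerEntry:
    # Hypothetical fields, following the TimerEntry sketch above.
    creation_time: int      # close time of the ledger that created the timer
    creation_ledger: int    # sequence of the ledger that created the timer
    expiration_time: int    # relative: seconds after creation_time
    expiration_ledger: int  # relative: ledgers after creation_ledger

    def expired(self, now: int, current_ledger: int) -> bool:
        # The timer expires only when BOTH the relative time and the
        # relative ledger conditions are satisfied.
        return (now >= self.creation_time + self.expiration_time
                and current_ledger >= self.creation_ledger + self.expiration_ledger)
```

DeleteTimerOp would then be valid exactly when `expired(...)` returns true for the current ledger.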

In the payment channel case the declaration transaction would create the timer, the close transaction would delete it and only be valid when it could be deleted.

A bonus of this approach is that you can have multiple timers active at a time for a single account. I don't have a specific example of the usefulness of that; however, disconnecting the account that starts the time lock from the transaction that is delayed seems like it could be useful.

The downside of this is it means another ledger entry which means a higher reserve cost. I think that's probably okay though.

If we did this, the rest of the preconditions in CAP-21 would remain; only minSeqAge/minSeqLedgerGap would be removed.

Thoughts?

Nicolas Barry

Mar 12, 2021, 5:02:26 PM
to Leigh McCulloch, Stellar Developers
That was my intuition on using ClaimableBalanceEntry:
how is a `TimerEntry` really different from a `ClaimableBalanceEntry` that holds only native asset (maybe even 0)? Native is important as otherwise claiming it may fail due to authorization/other trustline related issues.

The overhead is basically the same, even with the generic conditions that we have: the perf hit is not in evaluating conditions but in the fact that we have to load an additional ledger entry when validating the transaction.

So basically the transaction level condition would be "this balance entry is claimable by the source account right now".

Nicolas







Leigh McCulloch

Mar 12, 2021, 5:08:33 PM
to Stellar Developers
I think that would work but we'd need to make changes to claimable balances.

David identified that ClaimClaimableBalanceOp is valid even if the claimable balance's predicates are not satisfied. This makes it possible for anyone with access to the delayed transaction to consume the transaction's sequence number early. A bad actor could use this to prevent the claimant from claiming, or the claimant could make a mistake.

CreateClaimableBalanceOp also fails for a zero amount, but we could change that.

If we can address both of those things then we just need to add a new predicate for absolute and relative ledger sequence.

The downside remains though that this is more expensive in terms of reserve than CAP-21's proposal.

Jonathan Jove

Mar 15, 2021, 10:32:03 AM
to Leigh McCulloch, Stellar Developers
I agree with Nicolas that using ClaimableBalanceEntry would be preferable. Right now validation conditions are either verifiable without loading any entries (e.g. payment amounts are positive, assets are valid) or verifiable by loading an account (e.g. the source account has the right sequence number, the transaction is properly signed). Different validation conditions for TimerEntry would change this model, and so would changing the validation conditions for ClaimClaimableBalanceOp. I'm not intrinsically opposed to allowing other ledger entries to be loaded for validation conditions, just (1) the benefits should justify the effort and (2) we should do the most powerful thing that we can, since this is fundamentally a new class of primitive for the protocol. Because ClaimableBalanceEntry can encode every constraint that could be expressed by a TimerEntry and many more, I think ClaimableBalanceEntry is better suited for this task.

During the design of CAP-23, I wondered whether there could be interesting uses for a transaction that could claim a ClaimableBalanceEntry instead of consuming a sequence number. I think this idea of using ClaimableBalanceEntry as a timer is in the same line of thinking.

Jonathan Jove

Mar 15, 2021, 11:26:03 AM
to Leigh McCulloch, Stellar Developers
Leigh asked me to add some additional thoughts about the feasibility of implementing this change. I hesitate to call any protocol change easy, but I anticipate that adding validation conditions that depend on other ledger entries would be a relatively straightforward and scoped change. I think there might be a few minor edge cases related to handling times that are "soon", but overall I don't expect any major issues.

Leigh McCulloch

Mar 15, 2021, 11:37:01 AM
to Stellar Developers
I gave this some more thought and I think the simplicity of the example use cases in CAP-21, like payment channels, erodes with the use of ClaimableBalanceEntry or a new TimerEntry. This criticism applies to CAP-12 too which also relies on the creation of another ledger entry.

If we use the two-way payment channel use case as an example, it becomes complicated in these ways:

1. For starters, the payment channel would require an additional transaction to claim back any claimable balance that gets created, in order to clean up. Both parties have to store an unbounded number of these transactions.

2. A failure to provide account reserves consumes a sequence number, so an account can pretty easily break the channel, maybe intentionally if the escrow account doesn't have enough lumens for all possible ClaimableBalanceEntry's.

If minSeqAge/minSeqLedgerGap are built into the account, then the only costs and fees to plan ahead for are the transaction fees for executing the declaration and close transactions, and if the escrow account ever runs out of XLM for fees, either party can top it up. If there is a race condition between fee consumption and submitting a declaration transaction, the transaction is invalid and can be resubmitted. An account also has the option of fee-bumping the transaction to provide the fee themselves.

If we use a ClaimableBalanceEntry/TimerEntry, the account must have sufficient reserve for a potentially huge number of claimable balances. At any time that the account doesn't have sufficient reserve for those claimable balances, either party can submit the declaration transactions in which they have increased liability, in order to revoke their future use.

Is there a way around these issues?

Leigh McCulloch

Mar 15, 2021, 12:20:35 PM
to Stellar Developers
Actually, to address my own concern: I guess both concerns can be addressed by building two declaration transactions and two close transactions, a pair for each party involved, that include sponsorship operations so that the party proposing the exit sponsors the exit. This also simplifies cleanup because we can then permit each party to clean up their own declaration claimable balances. However, the complexity of a payment channel is still higher than in CAP-21's current form.

Leigh McCulloch

Mar 17, 2021, 1:01:36 PM
to Stellar Developers
I've given this more thought and I don't think we should pursue storing the time lock in any ledger entry, whether it is a ClaimableBalance, a new Timer entry, or a deterministic Account (as CAP-12 does it). The reason is that it increases the complexity of the channel and makes it brittle.

Complexity: The two-way channel requires an additional lumen reserve, and to protect the channel from being locked that reserve has to be provided by each party, which requires multiple transactions of each type to be signed.
Brittleness: Either party accidentally submitting a transaction at a time it shouldn't can consume sequence numbers, causing them to lose access to the channel. This seems like a bad footgun.

Leigh McCulloch

Mar 26, 2021, 3:28:40 PM
to Stellar Developers
I took a stab at a minimal implementation of minSeqNum, minSeqAge, minSeqLedgerGap, seqTime, and seqLedger, for my own experimentation, and noticed the following things that I think need addressing or discussing about CAP-21:

1. Transaction Results – The CAP doesn't discuss the transaction results that should be returned for each new precondition. We could lean on the existing txTOO_EARLY and txTOO_LATE result codes returning whichever is most appropriate. For example, if submitted prior to minSeqNum, ledgerBounds, minSeqAge, or minSeqLedgerGap, then txTOO_EARLY. If submitted after ledgerBounds, then txTOO_LATE.
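The mapping in point 1 could look roughly like this. txTOO_EARLY and txTOO_LATE are the existing result codes; everything else here (names, parameter shapes, check order) is an illustrative assumption, not CAP text:

```python
from typing import Optional, Tuple

def precondition_result(account_seq: int, seq_age: int, seq_ledger_gap: int,
                        current_ledger: int,
                        ledger_bounds: Optional[Tuple[int, int]],
                        min_seq_num: Optional[int],
                        min_seq_age: int,
                        min_seq_ledger_gap: int) -> Optional[str]:
    """Return the result code for the first failing precondition,
    or None if all preconditions pass."""
    if min_seq_num is not None and account_seq < min_seq_num:
        return "txTOO_EARLY"
    if ledger_bounds is not None:
        lo, hi = ledger_bounds
        if current_ledger < lo:
            return "txTOO_EARLY"
        if hi != 0 and current_ledger > hi:  # 0 = no upper bound
            return "txTOO_LATE"
    if seq_age < min_seq_age or seq_ledger_gap < min_seq_ledger_gap:
        return "txTOO_EARLY"
    return None
```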

2. Backwards Compatibility for seqTime/seqLedger – The CAP doesn't discuss what should happen if an account doesn't yet have a seqTime or seqLedger but a transaction is submitted with non-zero minSeqAge or minSeqLedgerGap. Should we assume a zero value for seqTime and seqLedger? That will cause most transactions to succeed regardless of when the account's seqNum was last set. Or should it fail until seqTime and seqLedger are set? The latter may be safer.
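The two interpretations in point 2 can be compared with a small sketch; `assume_zero` selects the lenient reading, and all names are hypothetical:

```python
from typing import Optional

def min_seq_age_satisfied(seq_time: Optional[int], now: int,
                          min_seq_age: int, assume_zero: bool = True) -> bool:
    """seq_time is None for accounts that predate seqTime tracking."""
    if seq_time is None:
        if not assume_zero:
            return False  # safer: fail until seqTime is first recorded
        seq_time = 0      # lenient: age is effectively unbounded, so
                          # most minSeqAge checks will pass
    return now - seq_time >= min_seq_age
```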

3. Reserve Balances – The motivation states that minimum reserve balances are being addressed, yet the two-way payment channel example makes no mention of minimum reserve balances. However, a minimum reserve is required for the closing claimable balance, and at any moment that the minimum reserve is not in the account and a close is valid, the channel can be broken. Under those conditions the submission of the close will consume the closing transaction's sequence number without creating the claimable balance or changing the signers on the escrow account. In that situation both parties would have to cooperate to unlock the account.

4. Extensions – The shift away from dangling extensions for the new account fields is inconvenient because data will live in multiple places, and we're requiring any software parsing the XDR to expand, over time, the internal mapping it has to do for each new extension version. The benefits of the change to the extension structure aren't clear to me, so I think it needs discussing in the design rationale. In any case, I think we need to rethink the extension structure because of point 5 below.

5. Sponsorship – The account extension v2 is only added to accounts participating in sponsorship, but adding a new dangling extension or the new v3 in CAP-21 will store that data regardless of an account's participation in sponsorship, changing assumptions core makes about when that data is present. This change might not matter, but the CAP should probably discuss it in more detail if we intend it to take place. Another option is to change the account extension structure so v2 can continue to be present only for accounts participating in sponsorship.

David, do you have any thoughts on these issues and what approaches to them would be best?

Leigh McCulloch

Apr 19, 2021, 2:17:47 PM
to Stellar Developers
Another concern:

6. TimePoint change to int64 – The proposal changes the type of TimePoint from uint64 to int64, but this could be a breaking change for any transaction that has a time-bound value greater than MAX_INT64. Such a transaction would become invalid, since the value would be negative. I ran a query on all previous transactions and there are no values over MAX_INT64, which suggests nobody is building transactions that would become invalid. We could interpret a negative int64 value as MAX_INT64 so that no transaction with a uint64 time bound greater than int64 max becomes invalid. Do you think that is necessary? At the least I think we need to discuss this backwards compatibility in the CAP.
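The suggested interpretation (treat any uint64 value that would go negative under int64 as MAX_INT64) can be sketched as follows; `normalize_time_point` is a hypothetical helper, not proposed CAP text:

```python
MAX_INT64 = 2**63 - 1

def normalize_time_point(raw: int) -> int:
    """Interpret a legacy uint64 time bound under the new int64 type.

    Values above MAX_INT64 would reinterpret as negative int64; clamping
    them to MAX_INT64 (an unreachable far-future time) keeps any such
    pre-signed transaction valid."""
    # Reinterpret the raw 64-bit pattern as a signed value.
    signed = raw - 2**64 if raw > MAX_INT64 else raw
    return MAX_INT64 if signed < 0 else signed
```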