Revised CAP-0021: Generalized transaction preconditions

David Mazieres

Mar 5, 2021, 5:09:17 PM
to stell...@googlegroups.com
I've revised CAP-0021, the latest version of which you can find here:

https://github.com/stellar/stellar-protocol/blob/master/core/cap-0021.md

One reason to revive this dormant CAP is that it would greatly simplify
the implementation of payment channels. I was particularly inspired
because I had what I thought was a good design for payment channels,
only to discover it was broken because Claimable Balances didn't work
quite how I thought they did (invalid Claimable Balance preconditions
don't invalidate a transaction, just make it fail). Since I anticipate
that payment channels will become more important as Stellar gains
increasing acceptance, I would like to make them as simple as possible
and eliminate opportunities for people to shoot themselves in the foot
accidentally.

More generally, I anticipate other benefits from the relative
timelocks enabled by this proposal. For example, one use
highlighted in the revised document is timelocked account recovery,
allowing a third party to recover a user's account if the user loses the
private key, while also giving the user an opportunity to object to
unauthorized recovery attempts.

Finally, while I don't have specific examples, I believe that CAP-0021
will make pre-auth transactions less brittle by allowing transactions to
run at a variety of sequence numbers. Note the proposal does *not*
change the fact that, for a transaction to be valid, the source account
sequence number must be less than the transaction sequence number. It
also does not change the fact that after executing a transaction, the
source account sequence number will equal the transaction's sequence
number. It does, however, add a non-default mode in which a transaction
can execute even if the account sequence number is not exactly the
transaction sequence number minus one.
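
To make that concrete, here is a minimal sketch of the validity check
(names are illustrative, not the CAP's actual XDR fields; a minSeqNum of
zero stands in for "not present"):

```go
// Sketch of the sequence-number validity rule described above.
func seqNumValid(acctSeqNum, txSeqNum, minSeqNum int64) bool {
	if acctSeqNum >= txSeqNum {
		return false // unchanged: account seqnum must be below the tx seqnum
	}
	if minSeqNum == 0 {
		return acctSeqNum == txSeqNum-1 // default mode, exactly as today
	}
	return acctSeqNum >= minSeqNum // new mode: anywhere in [minSeqNum, txSeqNum)
}
```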

Feedback appreciated.

Thanks,
David

Nicolas Barry

Mar 8, 2021, 2:03:03 PM
to David Mazieres expires 2021-06-03 PDT, Stellar Developers
Cool, thanks for the update.

general note: please use the CAP template. In particular, we're now using diffs for xdr to avoid ambiguity and to ensure that changes are indeed complete.

As this proposal is meant to help implement payment channels, I think that we will only consider it if indeed it is an acceptable solution to that particular problem.

That said we can still discuss it assuming it meets that requirement.

There are a few changes packed in the same CAP, so I will try to separate them out.

## Preconditions on ledger state
I think the only addition on that front is `ledgerBounds`. I do not see any problem with this, as a transaction's life cycle is already coupled with the ledger life cycle.

## Preconditions on account state

### minSeqAge/minSeqLedgerGap
I think the challenge with this pre-condition is that we can't have more than one transaction per account in the mempool, as we can only check this condition for the first transaction.
The implication is that we can only flood the first transaction in a chain of transactions.
This may significantly limit the kinds of protocols these conditions can be used for.

As this condition is the basis for implementing "relative time locks", I can't help but notice that we're duplicating logic in the protocol.
I am wondering if there are better ways to do this using `BalanceEntry`, which already supports this (not ledger sequence number based, but we could add that there), since conditions in `BalanceEntry` are already quite generic and store state.
I remember that you had a proposal (a long time ago) that leveraged "statements" stored in the ledger (where the only thing that mattered was their existence). Can't we use certain `BalanceEntry` transitions like this (maybe introducing new operations/transaction types)?

###  minSeqNum
This change is different from the other changes: it's loosening pre-conditions instead of adding a new one.
I imagine that in the context of payment channels, the sequence number ranges are very large (on the order of millions) and bad actors have access to transactions signed with any (older) sequence number. As a consequence, I think this condition introduces challenges around spam prevention and fairness in transaction queues, so we'll have to think of strategies to mitigate this (a bad actor can submit many of those transactions ahead of the "latest" one, with the sole purpose of delaying processing).
This problem may be compounded further by `minSeqAge`/`minSeqLedgerGap`.

### GeneralPreconditions
Maybe not as important, but `GeneralPreconditions` does not seem future-proof: it enumerates all conditions in one blob.
It would be better to make it a vector of sorts that we can extend over time. That way we avoid building a structure that defaults to the worst case from a binary-representation point of view.
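
Purely to illustrate the idea (invented names, sketched in Go; the real thing would be an XDR discriminated union):

```go
// Hypothetical sketch of an extensible precondition vector; none of these
// names are from the CAP. A transaction would carry only the conditions it
// actually uses, and new types could be appended without growing old encodings.
type TimeBounds struct{ MinTime, MaxTime int64 }
type LedgerBounds struct{ MinLedger, MaxLedger uint32 }

type PreconditionType int32

const (
	PrecondTimeBounds PreconditionType = iota
	PrecondLedgerBounds
	PrecondMinSeqNum
	PrecondMinSeqAge
	PrecondMinSeqLedgerGap
	// future condition types would be appended here
)

// Precondition is a tagged union: exactly one field is set, per Type.
type Precondition struct {
	Type            PreconditionType
	TimeBounds      *TimeBounds
	LedgerBounds    *LedgerBounds
	MinSeqNum       *int64
	MinSeqAge       *int64
	MinSeqLedgerGap *uint32
}

// A transaction would carry a variable-length vector rather than one blob.
type Preconditions []Precondition
```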

## approach to extending ledger entries
This seems like an unrelated issue and probably needs its own thread.
In particular, it allows for sharing code without having to introduce an additional abstraction layer between xdr and the business logic.
It also allows you to share references to underlying fields without having to copy data around.




Orbit Lens

Mar 11, 2021, 1:00:35 PM
to Stellar Developers
Very impressive paper! The preconditions for payment channels look very well structured.

A few thoughts on the proposal.

1. Totally agree with Nicolas on using an array instead of a GeneralPreconditions extension containing all new additions at once. This seems like a more extensible and developer-friendly approach.
2. "Key recovery" and "Parallel transaction submission" cases look a bit irrelevant. The former can be easily implemented using an auxiliary account added as a signer to the account that may need recovery and a transaction with the time-lock granting access to the auxiliary account. Parallel submissions based on preconditions can't guarantee the execution in case if any of the submitted transactions ends up in a mempool or delayed for some other reason (minSeqNum <= n < tx.seqNum condition is not met since other accepted transactions have already bumped the sequence). Of course, it's a good alternative to channel accounts approach, but it is not as straightforward as it looks and requires additional resubmission logic on the client side.
3. Are there any plans to include some other account-based preconditions for extended smart contracts functionality (in this CAP or in some future upgrade)?
Like "trustline exists","trustline authorized", "has signer with weight x". I understand that this feature request is out of the scope of this CAP, and the conditions mentioned above can be checked in runtime. However, not all of them can be checked without causing a transaction to fail during execution, which in turn bumps the source account sequence and effectively brakes the smart contract logic. Maybe this is more related to Horizon since some of those preconditions may cause additional performance impact on the Core node. It would be nice to have clarity on this matter.

Thanks,
OL

Leigh McCulloch

Mar 11, 2021, 1:59:40 PM
to Stellar Developers
The GeneralPreconditions as defined assumes that the source account wouldn't want maximum bounds on the account's sequence number, sequence age, or sequence ledger gap.

If the GeneralPreconditions minSeqNum, minSeqAge, and minSeqLedgerGap are defined as ranges/bounds, like ledgerBounds, it will be possible to define those upper bounds.

Technically the upper bound for the account's sequence number is the transaction's sequence number, but I don't think that would always be the desired bound. A source account may wish to specify the sequence number precondition independently of the sequence number that the account will be updated to.

I don't have examples for these, but if we're hoping to create more general preconditions, it's unclear to me why we'd only specify minimum bounds and not upper bounds.

Leigh McCulloch

Mar 12, 2021, 4:26:22 PM
to Stellar Developers
Another thought:

We could generalize minSeqAge and minSeqLedgerGap, making them useful in more situations, if we take them outside of the account and make them a new ledger entry, e.g. a predictably identifiable TimerEntry. The TimerEntry would have a creationTime, a creationLedger, a relative expirationTime, and a relative expirationLedger. The timer would be expired only when both are satisfied.
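
A sketch of what that entry could look like (invented names, not from any CAP):

```go
// Hypothetical TimerEntry, per the description above.
type TimerEntry struct {
	CreationTime     int64  // close time of the ledger that created the timer
	CreationLedger   uint32 // sequence of the ledger that created the timer
	ExpirationTime   int64  // relative: seconds after CreationTime
	ExpirationLedger uint32 // relative: ledgers after CreationLedger
}

// Expired only when both the time and the ledger conditions are satisfied.
func (t TimerEntry) Expired(closeTime int64, ledgerSeq uint32) bool {
	return closeTime >= t.CreationTime+t.ExpirationTime &&
		ledgerSeq >= t.CreationLedger+t.ExpirationLedger
}
```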

In CAP-12 there is a similar change adding a creationTime to a new type of account, making it possible to condition a future transaction on some time delay after the creation time of the account.

Instead of pursuing minSeqAge and minSeqLedgerGap (CAP-21) or creationTime (CAP-12), which both make it possible to condition transactions on specific moments, a TimerEntry would allow conditioning transactions on arbitrary moments.

The TimerEntry would have two operations:
CreateTimerOp – Creates the timer with a predictable ID.
DeleteTimerOp – Deletes the timer, which is only valid if the timer has expired given the current time and ledger.

In the payment channel case the declaration transaction would create the timer, and the close transaction would delete it, being valid only once the deletion is valid.

A bonus of this approach is that you can have multiple timers active at a time for a single account. I don't have a specific example of the usefulness of that; however, disconnecting the account that starts the time lock from the transaction that is delayed seems like it could be useful.

The downside is that it means another ledger entry, which means a higher reserve cost. I think that's probably okay though.

If we did this, the rest of the preconditions in CAP-21 would remain. Only minSeqAge and minSeqLedgerGap would be removed.

Thoughts?

Nicolas Barry

Mar 12, 2021, 5:02:26 PM
to Leigh McCulloch, Stellar Developers
That was my intuition on using ClaimableBalanceEntry:
how is a `TimerEntry` really different from a `ClaimableBalanceEntry` that holds only the native asset (maybe even 0)? Native is important, as otherwise claiming it may fail due to authorization or other trustline-related issues.

The overhead is basically the same, even with the generic conditions that we have: the perf hit is not in evaluating conditions but in the fact that we have to load an additional ledger entry when validating the transaction.

So basically the transaction-level condition would be "this balance entry is claimable by the source account right now".

Nicolas







Leigh McCulloch

Mar 12, 2021, 5:08:33 PM
to Stellar Developers
I think that would work but we'd need to make changes to claimable balances.

David identified that ClaimClaimableBalanceOp is valid even if the claimable balance's predicates are not satisfied. This makes it possible for anyone with access to the delayed transaction to consume the transaction's sequence number early. A bad actor could use this to prevent the claimant from claiming, or the claimant could make a mistake.

CreateClaimableBalanceOp also fails for a zero amount, but we could change that.

If we can address both of those things then we just need to add a new predicate for absolute and relative ledger sequence.

The downside remains though that this is more expensive in terms of reserve than CAP-21's proposal.

Jonathan Jove

Mar 15, 2021, 10:32:03 AM
to Leigh McCulloch, Stellar Developers
I agree with Nicolas that using ClaimableBalanceEntry would be preferable. Right now validation conditions are either verifiable without loading any entries (e.g. payment amounts are positive, assets are valid) or verifiable by loading an account (e.g. source account has the right sequence number, transaction is properly signed). Different validation conditions for TimerEntry would change this model, and so would changing the validation conditions for ClaimClaimableBalanceOp. I'm not intrinsically opposed to allowing other ledger entries to be loaded for validation conditions, but (1) the benefits should justify the effort, and (2) we should do the most powerful thing that we can, since this is fundamentally a new class of primitive for the protocol. Because ClaimableBalanceEntry can encode every constraint that could be expressed by a TimerEntry and many more, I think ClaimableBalanceEntry is better suited for this task.

During the design of CAP-23, I wondered whether there could be interesting uses for a transaction that could claim a ClaimableBalanceEntry instead of consuming a sequence number. I think this idea of using ClaimableBalanceEntry as a timer is in the same line of thinking.

Jonathan Jove

Mar 15, 2021, 11:26:03 AM
to Leigh McCulloch, Stellar Developers
Leigh asked me to add some additional thoughts about the feasibility of implementing this change. I hesitate to call any protocol change easy, but my expectation is that adding validation conditions that depend on other ledger entries would be a relatively straightforward and well-scoped change. I think there might be a few minor edge cases related to handling times that are "soon", but overall I don't expect any major issues.

Leigh McCulloch

Mar 15, 2021, 11:37:01 AM
to Stellar Developers
I gave this some more thought and I think the simplicity of the example use cases in CAP-21, like payment channels, erodes with the use of ClaimableBalanceEntry or a new TimerEntry. This criticism applies to CAP-12 too, which also relies on the creation of another ledger entry.

If we use the two-way payment channel use case as an example, it becomes complicated in these ways:

1. For starters, the payment channel would require an additional transaction to claim back, as cleanup, any claimable balance that gets created. Both parties have to store an unbounded number of these transactions.

2. A failure to provide account reserves consumes a sequence number, so a party can pretty easily break the channel, maybe intentionally, if the escrow account doesn't have enough lumens for all possible ClaimableBalanceEntries.

If minSeqAge/minSeqLedgerGap are built into the account, then the only costs to plan ahead for are the fees for executing the declaration and close transactions, and if the escrow account ever runs out of XLM for fees, either party can top it up. If there is a race between fee consumption and submitting a declaration transaction, the transaction is invalid and can be resubmitted. An account also has the option of fee-bumping the transaction to provide the fee themselves.

If we use a ClaimableBalanceEntry/TimerEntry, the account must have sufficient reserve for a potentially huge number of claimable balances. At any time that the account doesn't have sufficient reserve, either party can submit the declaration transactions in which they have increased liability, revoking their future use.

Is there a way around these issues?

Leigh McCulloch

Mar 15, 2021, 12:20:35 PM
to Stellar Developers
Actually, to address my own concern: I guess both issues can be addressed by building two declaration transactions and two close transactions, a pair for each party involved, that include sponsorship operations so that the party proposing the exit sponsors the exit. This also simplifies cleanup because we can permit each party to clean up their own declaration claimable balances. However, the complexity of a payment channel is still higher than with CAP-21 in its current form.

Leigh McCulloch

Mar 17, 2021, 1:01:36 PM
to Stellar Developers
I've given this more thought, and I don't think we should pursue storing the time lock in any ledger entry, whether it is a ClaimableBalance, a new Timer entry, or a deterministic Account (how CAP-12 does it). The reason is that it increases the complexity of the channel and makes it brittle.

Complexity: The two-way channel requires additional lumen reserve, and to protect the channel from being locked, that reserve has to be provided by each party, which requires multiple transactions of each type to be signed.
Brittleness: Either party accidentally submitting a transaction at a time it shouldn't can consume sequence numbers, causing them to lose access to the channel. This seems like a bad footgun.

Leigh McCulloch

Mar 26, 2021, 3:28:40 PM
to Stellar Developers
I took a stab at a minimal implementation of minSeqNum, minSeqAge, minSeqLedgerGap, seqTime, and seqLedger, for my own experimentation, and noticed the following things that I think need addressing or discussing about CAP-21:

1. Transaction Results – The CAP doesn't discuss the transaction results that should be returned for each new precondition. We could lean on the existing txTOO_EARLY and txTOO_LATE result codes, returning whichever is most appropriate. For example, if submitted prior to minSeqNum, ledgerBounds, minSeqAge, or minSeqLedgerGap being satisfied, then txTOO_EARLY. If submitted after ledgerBounds, then txTOO_LATE.

2. Backwards Compatibility for seqTime/seqLedger – The CAP doesn't discuss what should happen if an account doesn't yet have a seqTime or seqLedger but a transaction is submitted with non-zero minSeqAge or minSeqLedgerGap. Should we assume a zero value for seqTime and seqLedger? That will cause most transactions to succeed regardless of when the account's seqNum was last set. Or should it fail until seqTime and seqLedger are set? The latter may be safer.

3. Reserve Balances – The motivation states that minimum reserve balances are being addressed, yet in the two-way payment channel example there is no mention of minimum reserve balances. However, a minimum reserve is required for the closing claimable balance, and at any moment that the minimum reserve is not in the account and a close is valid, the channel can be broken. Under those conditions the submission of the close will consume the closing transaction's sequence number without creating the claimable balance or changing the signers on the escrow account. In that situation both parties would have to cooperate to unlock the account.

4. Extensions – The shift away from dangling extensions for the new account fields is inconvenient because data will live in multiple places, and we're requiring any software parsing XDR to expand, over time, the internal mapping it has to do with each new extension version. The benefits of the change to the extension structure aren't clear to me, so I think it needs discussing in the design rationale. In any case I think we need to rethink the extension structure because of point 5 below.

5. Sponsorship – The account extension v2 is only added to accounts participating in sponsorship, but adding a new dangling extension or the new v3 in CAP-21 will store that data regardless of an account's participation in sponsorship, changing assumptions core makes about when that data is present. This change might not matter, but the CAP should probably discuss it in more detail if we intend it to take place. Another option is to change the account extension structure so that v2 can continue to be present only for accounts participating in sponsorship.

David, do you have any thoughts on these issues and what approaches to them would be best?

Leigh McCulloch

Apr 19, 2021, 2:17:47 PM
to Stellar Developers
Another concern:

6. TimePoint change to int64 – The proposal changes the type of TimePoint from uint64 to int64, but this could be a breaking change for any transaction that has a value in time bounds greater than MAX_INT64. Such a transaction would become invalid since the value would be negative. I ran a query on all previous transactions and there are no results over MAX_INT64, which suggests nobody is building transactions that would become invalid. We could interpret a negative int64 value as MAX_INT64 so that no transaction with a uint64 timebound greater than int64 max becomes invalid. Do you think that is necessary? At the least I think we need to discuss this backwards compatibility in the CAP.
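
A self-contained snippet showing the wrap-around in question (not SDK code):

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	// A pre-signed tx might use MAX_UINT64 as "no upper time bound".
	oldMaxTime := uint64(math.MaxUint64)
	// Reinterpreted as int64 it wraps negative, invalidating the tx.
	fmt.Println(int64(oldMaxTime)) // prints -1
}
```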

David Mazieres

Jul 2, 2021, 2:48:48 PM
to Leigh McCulloch, Stellar Developers
"'Leigh McCulloch' via Stellar Developers"
<stell...@googlegroups.com> writes:

> I took a stab at a minimal implementation
> <https://github.com/leighmcculloch/stellar--stellar-core/pull/1> of
> minSeqNum, minSeqAge, minSeqLedgerGap, seqTime, and seqLedger, for my own
> experimentation, and noticed the following things that I think need
> addressing or discussing about CAP-21:

Sorry for the slow reply, but I think it best to answer on this list so
the reasoning is documented. These are all good points. I think for most
of them it's important to have an unambiguous answer, though there are
many mostly equivalent ways of providing one. I'm going to suggest what I
think are the most straightforward answers, and if people have
objections or suggestions I am happy to revisit.

> 1. Transaction Results – The CAP doesn't discuss the transaction results
> that should be returned for each new precondition. We could lean on the
> existing txTOO_EARLY and txTOO_LATE result codes returning whichever is
> most appropriate. For example, if submitted prior to minSeqNum,
> ledgerBounds, minSeqAge, or minSeqLedgerGap, then txTOO_EARLY. If submitted
> after ledgerBounds, then txTOO_LATE.

I'm fine with this, so long as we prioritize one over the other. For
instance, if there's a precondition that's too late and one that's too
early, we should return txTOO_LATE. Basically the meaning should be:

* txTOO_LATE: This transaction will never be valid

* txTOO_EARLY: This transaction is not currently valid but might be
valid at a later point.
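
Sketched as a helper (illustrative only, not core's actual code):

```go
// Prioritize txTOO_LATE (never valid) over txTOO_EARLY (not valid yet).
func precondResultCode(anyTooLate, anyTooEarly bool) string {
	switch {
	case anyTooLate:
		return "txTOO_LATE" // e.g. past the ledgerBounds upper bound
	case anyTooEarly:
		return "txTOO_EARLY" // e.g. minSeqAge / minSeqLedgerGap not yet met
	default:
		return "txSUCCESS"
	}
}
```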

> 2. Backwards Compatibility for seqTime/seqLedger – The CAP doesn't discuss
> what should happen if an account doesn't yet have a seqTime or seqLedger
> but a transaction is submitted with non-zero minSeqAge or minSeqLedgerGap.
> Should we assume a zero value for seqTime and seqLedger? That will cause
> most transactions to succeed regardless of when the account's seqNum was
> last set. Or should it fail until seqTime and seqLedger are set? The latter
> may be safer.

We should assume 0 by default. This is fine, since pretty much all uses
of the features involve a pair of transactions, where the first
transaction both enables and delays the second. In the few cases where
this isn't the case (key recovery), the point is that the accounts are
often used, so this doesn't seem like a big deal.
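
In other words, something like this sketch (illustrative names):

```go
// Accounts whose seqnum predates the new fields have no stored seqTime,
// so it reads as 0 and most practical minSeqAge values are satisfied.
func minSeqAgeSatisfied(acctSeqTime, closeTime, minSeqAge int64) bool {
	return closeTime-acctSeqTime >= minSeqAge
}
```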

> 3. Reserve Balances – The motivation states that minimum reserve balances
> are being addressed, and in the two-way payment channel example there is no
> mention of minimum reserve balances. However, a minimum reserve is required
> for the closing claimable balance, and at any moment that the minimum
> reserve is not in the account and a close is valid the channel can be
> broken. Under those conditions the submission of the close will consume the
> closing transactions sequence number without creating the claimable balance
> or changing the signers on the escrow account. In that situation both
> parties would have to cooperate to unlock the account.

Because of the way channels are closed, it seems reasonable to make sure
there is enough reserve balance just by erring on the side of slightly
more lumens than necessary, as the initiator will get it all back. So
that argues for just a non-normative warning in the CAP.

> 4. Extensions – The shift away from dangling extensions for the new account
> fields is inconvenient because data will live in multiple places and we're
> requiring any software parsing XDR to over time expand the internal mapping
> it has to do with each new extension version. The benefits of the change to
> the extension structure isn't clear to me and so I think it needs
> discussing in the design rationale. In any case I think we need to rethink
> the extension structure because of point 5 below.

In both Go and C++ the suggested AccountEntryExtension mechanism is
easier and less error-prone to use. It also leads to a more efficient
encoding, in which there isn't a monotonically increasing space penalty
every time we extend the AccountEntry structure.

That said, this argument is entirely orthogonal to the goals of
CAP-0021. CAP-0021 is that way because I wrote it and because I think
we are doing XDR extensions wrong. So if people can't accept the
suggested structure then we can convert it to the ever-more-nested
dangling union approach, with just an 8-byte encoding penalty in the
AccountEntry and a two-tab-stop penalty in the source code. I can live
with both.

> 5. Sponsorship – The account extension v2 is only added to accounts if
> participating in sponsorship, but adding a new dangling extension or the
> new v3 in CAP-21 will store that data regardless of an accounts
> participation in sponsorship, changing assumptions core makes about when
> that data is present. This change might not matter, but the CAP should
> probably discuss this change in more detail if we intend this change to
> take place. Another option is we change the account extension structure so
> v2 can continue to only be present for accounts participating in
> sponsorship.

We can make the sponsorship fields their own structure and include a
pointer to it in AccountEntryExtensionV3 if you believe this is easier.

> 6. TimePoint change to int64 – The proposal changes the type of TimePoints
> to from uint64 to int64, but this could be breaking change for any
> transaction that has a value in time bounds that is greater than MAX_INT64.
> Such a transaction would become invalid since it would be negative. I ran a
> query
> <https://gist.github.com/leighmcculloch/01226c1f122511f43438d0522b89fc06> on
> all previous transactions and there are no results over MAX_INT64 which
> suggests nobody is building transactions that would become invalid. We
> could interpret a negative int64 value to be MAX_INT64 so that no
> transaction with a uint64 timebound greater than int64 max becomes invalid.
> Do you think that is necessary? At the least I think we need to discuss
> this backwards compatibility in the CAP.

I think the easiest is just to make negative values invalid and declare
this in the CAP. I think the risk is minimal, and doing so would
potentially resolve other risks that arise out of being inconsistent
with signed/unsigned time values.

If people don't object or have specific other suggestions, I will draft
new language around these in the CAP.

David

Leigh McCulloch

Jul 2, 2021, 3:16:45 PM
to Stellar Developers
Regarding (2) backwards compatibility and assuming a value of 0, that sounds good to me, but we should be clear that it is not the same as saying the precondition is ignored or always valid. It will be valid for many practical values, but invalid for any large and maybe impractical values. That is probably unlikely to be an issue.

Regarding (5) sponsorship and (4) the extension, I'm in favor of continuing the dangling approach because it reduces the mapping effort for readers of the data and limits the number of places that store the same information. However, the dangling approach poses problems with assumptions in core about the presence of account extension v2 containing sponsorship, so I'm not sure what the best approach is. Placing sponsorship in the new extension and foregoing the dangling does seem reasonable. I defer to Jon and Nico on this.

Regarding (1), (3), and (6), these additions/changes sound good to me. 

Jonathan Jove

Jul 29, 2021, 4:03:33 PM
to Leigh McCulloch, Stellar Developers
For transparency, posting the same questions I had during the protocol committee meeting:

"All transactions are validated, sequence numbers increased, and fees deducted before any operations are executed. One consequence is that BUMP_SEQUENCE operations, though they update an account's seqTime and seqLedger, do not affect the validity of other transactions in the same block." --> Neither sequence number processing nor validity checking currently work like this. Right now, validity checks are done at validation time and during apply time (eg. a transaction must still be valid to be applied).

"A transaction whose preconditions are not satisfied is invalid and must not execute (even to fail), meaning it cannot change the sourceAccount sequence number or charge a fee. To insure that invalid transactions do not propagate, a transaction with a non-zero minSeqLedgerGap or minSeqAge may not execute in the same ledger as a transaction with a lower seqNum on the same sourceAccount." --> Based on the above, BUMP_SEQUENCE can invalid a transaction thereby causing it to fail. 

I think the solution to these problems is to have a new notion of validity checks: checks that are performed only at validation time. The current preconditions (time-bounds) could work this way, and the new preconditions could too.

One other thing that caught my attention when reviewing the proposal:

"Any validator that receives both of these transactions will only forward T1. However, if a validator sees a nomination vote for a block that contains T2 but not T1, the validator will nonetheless vote for the block. The logic is identical to a situation in which T1 and T2 have the same sequence number but T1 has twice the fee--validators seeing both should preferentially forward T1, but also accept T2 in nominated blocks." --> Transaction submitters have to be very careful because validators will have the discretion to nominate the transaction set that they prefer. For example, if a transaction is at sequence number 10 and a validator has T1 with sequence number 11 and T2 with sequence number 12, min seq 10 then it could nominate any of (T1), (T1,T2) or (T2).


Eric Saunders

Jul 29, 2021, 4:50:33 PM
to Jonathan Jove, Leigh McCulloch, Stellar Developers
I have some concerns that we didn't get to / seem outside the scope of the protocol meeting, so posting them here.

1) CAP-21 drives significant fundamental changes to Horizon tx submission logic, which we haven't fully evaluated yet. In the worst case we might have to remove sequence number ordering from Horizon's queue entirely. Right now Horizon decides on tx submission based on sequence number only and completely ignores time bounds. For example, if the seqnum matches but the time bounds don't, we currently submit the tx anyway. With these changes the contract gets murkier, because one may believe that Horizon will submit the tx once its preconditions are met. We could potentially change this contract (and I'd like to hear opinions on this). I think it would be unwise to vote to accept this CAP without this discussion. We should in parallel more precisely determine the specific impacts on Horizon tx submission.

2) From the CAP:

This proposal changes TimePoint to be signed rather than unsigned, and adds a signed Duration type. These types should also be adopted by ClaimPredicate. There is little need for time points more than 2^{63} seconds after the Unix epoch, but many databases and languages do not store unsigned 64-bit values conveniently. Thus, it is better to use signed values. Since TimePoints should only be positive, any transaction containing a negative time-point is invalid and must not be forwarded or included in a block.

I think this is a refactoring not in scope for this CAP. The CAP doesn't need it. The protocol doesn't care about this. It's not intuitive (negative TimePoints don’t make sense). For other fields, we already work around this downstream (e.g. in Horizon and in JS-SDK). Changing it now breaks downstream. We still have to support all past ledgers from previous protocol versions. New implementors of TimePoint might assume this value has always been a signed int in the past.

Eric


Leigh McCulloch

Jul 29, 2021, 5:13:02 PM
to Stellar Developers
>...validity checks are done at validation time and during apply time...
>I think the solution to these problems is to have a new notion of validity checks: checks that are performed only at validation time. The current preconditions (time-bounds) could work this way, and the new preconditions could too.

It's not clear to me whether the payment channel protocol is affected by this issue. I think the protocol survives because the only way to interrupt and/or break a Ti is with a valid T{i+n}, in which case there will always be a T{i+n+1} to close the channel that can only be valid if T{i+n} was submitted. However, this seems especially subtle and I'm not confident every case is handled. If we can make preconditions only affect validity and not apply, that seems preferred, since reducing the cases in which transactions can fail at apply improves the predictability of contracts.

>if a transaction is at sequence number 10 and a validator has T1 with sequence number 11 and T2 with sequence number 12, min seq 10 then it could nominate any of (T1), (T1,T2) or (T2).

I think this is okay for the payment channel use cases discussed in the CAP because T1 has no operations (a no-op bump seq) so it can't fail. T2 is only valid if applied immediately after T1.

This would affect the parallel submission use case though, since that use case is built on the assumption that every tx will be nominated. I think that use case should be removed from CAP-21 if (T1,T2) is not guaranteed.

Leigh McCulloch

Jul 29, 2021, 5:48:00 PM
to Stellar Developers
>Right now Horizon decides on tx submission based on sequence number only and completely ignores time bounds. For example, if seqnum matches but time bounds don’t match currently we submit the tx anyway. With these changes the contract gets murkier because one may continue to believe that Horizon will submit the tx if preconditions are met.

I think we should continue to limit the preconditions that Horizon analyzes based on the error cases it protects against. Today Horizon analyzes the transaction's seqNum, queueing and ordering txs, which limits the number of tx_bad_seq errors that occur. Combinations of seqNum and CAP-21's minSeqNum that cause a tx to fail will result in tx_bad_seq, so we should update Horizon to hold a tx until its seqNum and minSeqNum make it valid. That continues to offer the tx_bad_seq protection and prevents Horizon from holding a tx longer than needed to fulfill the minSeqNum.

All other preconditions introduced in CAP-21 may cause txs to fail with tx_too_early (minSeqAge, minSeqLedgerGap, ledgerBounds), tx_too_late (ledgerBounds), or tx_bad_auth (extraSigners). These error cases are not prevented by Horizon today and I don't think they need to be tomorrow.

The limited change I propose is a non-breaking change to Horizon's API, since any tx that was valid before will continue to be valid with behavior unchanged, and any tx using minSeqNum will be ordered as it is today and held until valid.
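
A sketch of that queueing rule (invented names, not Horizon's actual internals):

```go
// Hold a tx until its [minSeqNum, seqNum) window opens; reject once the
// window can never open. All other new preconditions are ignored here.
func queueDecision(acctSeqNum, txSeqNum, minSeqNum int64) string {
	lower := txSeqNum - 1 // default: exactly the next sequence number
	if minSeqNum != 0 {
		lower = minSeqNum
	}
	switch {
	case acctSeqNum >= txSeqNum:
		return "reject" // would always fail with tx_bad_seq
	case acctSeqNum >= lower:
		return "submit"
	default:
		return "hold"
	}
}
```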

>This proposal changes TimePoint to be signed rather than unsigned, and adds a signed Duration type.
>I think this is a refactoring not in scope for this CAP. The CAP doesn't need it.

Durations are already signed int64 and this change only gives them a name, Duration.

On first pass I agreed that it wasn't worth refactoring TimePoint to be signed; however, after reviewing the XDR and some SDKs, it appears TimePoint is only used for time bounds and close time, and the Java, Go, Python, and JS Stellar SDKs already use signed int64 (or smaller) for time bounds. Other time fields used for claimable balances have int64 hardcoded in the XDR rather than using TimePoint. Given that close time will never have been negative, this seems like a zero-risk change, and we could consider it bringing the XDR in line with actual ecosystem usage.

>We still have to support all past ledgers from previous protocol versions. New implementors of TimePoint might assume this value has always been a signed int in the past.

How would this change affect Horizon's ingestion of old vs new ledgers?

Leigh McCulloch

Jul 29, 2021, 5:54:30 PM
to Stellar Developers
>the Java, Go, Python, and JS Stellar SDKs already use signed int64 (or less) for time bounds

For completeness, the Qt SDK uses int64, and there are some SDKs that use unsigned int64 timebounds: .NET, iOS, Scala, Ruby.

Nicolas Barry

Sep 7, 2021, 9:30:36 PM
to Stellar Developers
Hello

I reviewed the updated draft (still in PR). Here are my notes/questions.

> `seqTime` `seqLedger`

While the functionality is already opt-in (only transactions that use the new preconditions are affected), the proposal may have some broader testing implications. Should those fields be "opt-in" via a new account flag? Then only "envelope tests" would need to change. Maybe we can do that if testing as proposed is indeed too much work.

 

 > This proposal changes TimePoint to be signed rather than unsigned

As noted in the "Backward incompatibility", this would break any pre-signed transaction that uses MAX_UINT64 (or any value larger than MAX_INT64) as `maxTime`. I *think* we have example of such transactions in history, so while unlikely, those exist already.

In any case, with this change, we'd have to support both signed and unsigned TimePoints depending on protocol version.

As code (in core, Horizon, SDKs, etc) has to deal with this today already, I am not sure the rationale to change it is sufficient.

 

> Transaction forwarding and ordering

> A transaction with a non-zero minSeqAge or minSeqLedgerGap must be treated as invalid and not forwarded--as if its minTime has not yet arrived--if either A) the appropriate condition (minSeqAge or minSeqLedgerGap) does not yet hold, or B) there are pending valid transactions with lower sequence numbers on the same sourceAccount

A few questions:

  1. This seems to imply that receiving transactions in reverse seqnum order is fine, and I am assuming that they would "kick out" transactions with higher seqnums. I am not sure this was the intent.
  2. There are classes of subtle attacks that could exploit "B": a bad actor could submit a lower seqnum and force all sequence numbers to be processed.

While the protocol described below under "Two-way payment channel", which uses separate D_i and C_i, is immune to such a delay attack, I wonder if we could harden the semantics a bit, as it's quite a footgun.

Like: if we make it illegal to mix non-zero `minSeqNum` and `minSeqAge` (i.e. make them mutually exclusive), it could nudge people towards safer smart contracts.

  3. Should we relax the condition in "B" to just be "first transaction received wins" (i.e. if there is already a transaction in the queue for the same source account, the transaction with the "minSeq" condition gets rejected)? Note that this does not fix the problem from "2.", but it at least makes it more difficult for an attacker to reliably submit older transactions, and we end up with a simpler implementation.

 

> Transaction Validation

> All but the transaction with the lowest seqNum on a given sourceAccount must have 0 for the minSeqAge and minSeqLedgerGap fields.

I am not sure this is that useful: we should simplify this to just not allowing `minSeq` transactions to be mixed with any other transaction on the same sourceAccount (following the rule used for how transactions get flooded).

A detail, but "Apply order" needs to be specified: we still sort by "tx.SeqNum" but allow gaps.

 

The point on "dual validation" is confusing: there are no mechanism that would cause `maxTime` and `maxLedger` to be valid when the transaction set is validated but invalid during externalize (as per CAP-034).

At this point I don't think we would allow any condition to trigger those error codes:

the only invalidations that can occur like this are seqnum-related (bump sequence) or signature/threshold-related, but those don't fall under `txTOO_LATE` / `txTOO_EARLY`.

Leigh McCulloch

Sep 17, 2021, 3:45:52 PM
to Stellar Developers
>Based on the above, BUMP_SEQUENCE can invalid a transaction thereby causing it to fail. 
>
>I think the solution to these problems is to have a new notion of validity checks: checks that are performed only at validation time. The current preconditions (time-bounds) could work this way, and the new preconditions could too.

This problem exists today irrespective of CAP-21, because you can submit a transaction that is valid, but another transaction accepted into the same ledger can contain a BUMP_SEQUENCE that causes the first to fail during apply. Right now the only account parameter for which this can happen is the seqNum attribute. CAP-21 introduces two new account attributes for which this can happen, seqTime and seqLedger, but this isn't a new problem, and it is something protocol/contract developers must already be aware of.

For that reason I don't think we need to solve this problem as part of CAP-21. This problem could be solved independently as another CAP.

I've updated CAP-21 so that the Security Concerns section goes into greater detail describing the problem and what developers must be aware of.


Jonathan Jove

Sep 17, 2021, 3:51:04 PM
to Leigh McCulloch, Stellar Developers
I need to think more about whether solving the problem is important, but a cursory examination suggests it probably isn't. But regardless, if you propose not solving it then I would want the language in the proposal to reflect the actual semantics.


Leigh McCulloch

Sep 17, 2021, 4:29:22 PM
to Stellar Developers
I believe the latest version of CAP-21 has the semantics fixed. David addressed them with a change on 2021-09-09.

Jonathan Jove

Sep 17, 2021, 4:32:53 PM
to Leigh McCulloch, Stellar Developers
I still see

"A transaction whose preconditions are not satisfied is invalid and must not execute (even to fail), meaning it cannot change the sourceAccount sequence number or charge a fee. To insure that invalid transactions do not propagate, a transaction with a non-zero minSeqLedgerGap or minSeqAge may not execute in the same ledger as a transaction with a lower seqNum on the same sourceAccount."

in the current draft.


Leigh McCulloch

Nov 4, 2021, 11:18:04 PM
to Stellar Developers
Discussions with some folks building on Stellar Turrets surfaced a use case that CAP-21 helps with that we haven't discussed. 

CAP-21's ledgerBounds precondition will make it possible to guarantee that a transaction creating an account will produce a deterministic sequence number if the transaction succeeds. This makes it possible to predict what the next transaction's sequence number would be for the account, and therefore to add a subsequent transaction as a pre-authorized transaction on the account being created, in the same transaction as the account creation. This isn't possible today without making some assumptions using time bounds. Ledger bounds make this guarantee much more solid and safer.

An example transaction set:
tx1:
- source account: A
- ledger bounds: 0 to N
- ops:
  - create account B
  - add trustline to B
  - payment of asset from A to B
  - add tx2 preauth signer to B
  - set master signer of B to 0
  - bump seq of B to N<<32
tx2:
- source account: B
- seq num: N<<32
- ops:
  - …

This use case is independent of payment channels and is another example that the primitives in CAP-21 are more broadly useful.

I've added a new section to CAP-21 under the Design Rationale that discusses this use case and capability.