For example, the entire LMAX system operates in memory and has been described as a "single aggregate" - the data within the entire (in-memory) system is consistent by the time a command has finished executing. This has several benefits, including a global total ordering of events. A memory image contrasts with a more distributed system, where each aggregate forms a transactional boundary. Such boundaries matter because they enforce internal consistency at a micro level while still allowing scalability.
How do you think a "global" aggregate impacts the modelling process, both positively and negatively? I can think of several changes in mindset.
Firstly, you no longer need to store (sub-)aggregate identifiers; you can store direct references to objects. On the positive side, there is no indirection (via a repository) to look up an object, and the model can be more expressive because the domain mirrors the real world - I no longer store a customer "ID" that maps to a serialized blob/sequence of events on a disk, in a database, or in the cloud; I just store a reference to the Customer directly. On the negative side, direct references can cause memory-management issues - cyclic references if you're using smart pointers, for example.
Secondly, it introduces event handler complexity. I once worked on a piece of software that was written as a series of database stored procedures. After an INSERT/UPDATE/DELETE, a trigger would issue further SQL commands (SQL triggers are effectively generic, non-semantic event handlers). The cascading effect was very hard to trace, but if one of the triggers failed, the whole transaction would be rolled back. In an in-memory situation, an error in an event handler could crash the entire system, as there is no rollback facility in such an approach (unless you build one in, of course).
Thirdly, it blurs the lines of invariants. An invariant is maintained by an aggregate/entity class, and with a single-threaded, in-memory approach you can assume that one class will always be "in sync" with another. For example, a Customer is always "preferential" if they have ordered more than $1,000 of product. This is a business "rule", but there may have been a time, before this rule was implemented, when preferential customers were those who subscribed to the mailing list. The invariants are what they are, but immediate consistency can fool people into mistaking relationships between entities for invariants.
What are other people's thoughts on the pros and cons of "system prevalence"?