Hi David!
I just saw that you responded, sorry for the delay.
This is mainly a coordination problem. To detect temporal patterns, large numbers of events must be combined during reasoning. In a distributed design where events are routed to separate concepts, combining them quickly breaks down: coordination becomes difficult, and some form of centralized structure (buffers, FIFOs, etc.) becomes necessary.
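To make the point concrete, here is a minimal sketch (names like Event, EventBuffer, and the max_events limit are my own illustration, not taken from any particular implementation) of a single time-ordered FIFO that all event sources feed into, so that cross-source temporal pairs can be formed in one place:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    term: str    # symbolic content, e.g. "a", "b"
    time: float  # occurrence time

class EventBuffer:
    def __init__(self, max_events: int = 20):
        # bounded FIFO, in the spirit of AIKR (limited resources)
        self.fifo = deque(maxlen=max_events)

    def put(self, event: Event) -> list[tuple[str, str]]:
        """Insert an event and return the temporal pairs (x, y) with x
        before y that it can now form with earlier buffered events."""
        pairs = [(old.term, event.term)
                 for old in self.fifo if old.time < event.time]
        self.fifo.append(event)
        return pairs

# Events routed from many concepts converge here; without such a shared
# structure, "a" and "b" might never meet during reasoning.
buf = EventBuffer()
buf.put(Event("a", 1.0))
print(buf.put(Event("b", 2.0)))  # [('a', 'b')] -> candidate <a =/> b>
```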
Independent of AIKR, the scale of event integration really matters. At high, carefully engineered abstraction levels, relations like <a =/> b> over single events can be useful. But for high-throughput, sensory-like streams, even sequences or parallel conjunctions of ~20 events are often insufficient. Dependencies between events are usually far more complex than "a sequence of events A predicts B", and behavior demands extremely fine-grained, tightly coordinated action, which is undermined both by the realities of distributed systems and by the constraints of logical descriptions.
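A rough back-of-the-envelope illustration of why this blows up (the window size of 20 is just the figure from above, the rest is my own arithmetic): even over a single 20-event window, the space of candidate temporal compounds is far larger than the set of pairwise <a =/> b> relations.

```python
from math import comb

window = 20
pairwise = comb(window, 2)           # simple "A predicts B" pairs
compounds = 2 ** window - window - 1 # larger sequences / conjunctions (unordered subsets of size >= 2)

print(pairwise)   # 190
print(compounds)  # 1048555
```

And that still ignores ordering and timing within each compound, which is exactly where the fine-grained coordination problem shows up.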
Best regards,
Patrick