Issue with Latency Analysis


Richa Gupta

Jun 30, 2021, 4:29:38 PM
to OSATE
I am new to AADL modeling and am using OSATE to run latency analysis on a multicore configuration. My model is simple: I have two cores (C1, C2), instantiated from the same processor implementation, each running a process (P1, P2) containing one thread (T1, T2). Both threads are periodic with the same period (R1). The cores are connected by a generic bus implementation. I created data ports on both threads, sending data from T1 to T2, and declared an end-to-end flow T1 -> P1 -> C1 -> generic bus -> C2 -> P2 -> T2. The model instantiates without errors, but I am running into issues with the OSATE latency analysis tool. I selected the SS, MF, ET, EQ, and DQL options, and I am not modeling ARINC653 partitions.
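For reference, here is a minimal sketch of the configuration I described. All identifiers (component names, feature names) are illustrative, not the actual names from my model:

```aadl
-- Sketch of the two-core setup: two processes bound to different cores,
-- one data connection bound to the bus, and one end-to-end flow.
system implementation top.impl
subcomponents
  C1 : processor core.impl;
  C2 : processor core.impl;
  P1 : process sender.impl;    -- contains periodic thread T1
  P2 : process receiver.impl;  -- contains periodic thread T2
  B  : bus generic_bus.impl;
connections
  ba1  : bus access B <-> C1.ba;           -- assumes a bus access feature ba
  ba2  : bus access B <-> C2.ba;
  xfer : port P1.tx_data -> P2.rcv_data;   -- data port connection T1 -> T2
flows
  e2e : end to end flow P1.fsrc -> xfer -> P2.fsnk;
properties
  Actual_Processor_Binding  => (reference (C1)) applies to P1;
  Actual_Processor_Binding  => (reference (C2)) applies to P2;
  Actual_Connection_Binding => (reference (B))  applies to xfer;
end top.impl;
```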

My goal is to model the latency when critical data is sent from core 1 to core 2 in a synchronous system.

The latency report provides the following information:
1) First sampling time for T1, defined as a source, which is zero.
2) Processing time for T1, taken from Compute_Execution_Time.
3) Generic bus transmission time (calculated correctly).
4) Generic bus queuing and sampling-protocol times (both zero, since no data was provided).
5) The connection P1.t1.tx_data -> P2.t2.rcv_data has no sampling or queuing latency. (I tried defining the connection as immediate, delayed, or sampled, but there was no change; queuing latency is always zero since I disabled queues.)
6) Sampling time for T2, defined as a sink, which is non-zero; this is the problem:
a) With the port connection's Timing => Sampled: Min method is sampling, Min Actual is zero, and Max Actual is an unexpected value. After reading Section 4.2 of the latency paper, I thought T2's sampling latency would be T1's processing time plus any connection delay, but the value the report provides is not what I expected.
b) With Timing => Delayed: Min method is delayed sampling, and Min Actual is greater than Max Actual. How can Min be greater than Max? I was expecting a frame-delayed value, i.e., T1's period plus the bus transmission time, but Min Actual is less than T1's period (R1) and Max Actual is less than half of T2's period (R1).
c) With Timing => Immediate: no issue.
7) Processing time for T2, taken from Compute_Execution_Time, which is correct.

I would appreciate it if someone could explain how the OSATE tool calculates the sampling latency for thread T2. Does the tool have a bug? I tried changing the Compute_Execution_Time property for both T1 and T2 but got similar results.
What am I doing wrong in the model, or is this a bug in the OSATE tool?
Thanks


jjhudak

Jul 1, 2021, 10:00:42 AM
to OSATE
A little more information would be helpful.
1) What execution properties have you specified for the threads?
2) What deployment bindings have you specified (thread and connection)?
3) Did you specify a latency for the sink thread in addition to the thread execution properties?
For port-to-port communication, the port type and the dispatch protocols determine the communication timing.
For data ports, the values are available at thread completion.
For event and event data ports, outputs are available at any time during thread execution.

By default, input is frozen at thread dispatch time, but this can be overridden with properties.
As a result, the thread does not see new data before its next dispatch.
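As a sketch, the default input freeze point can be overridden with the Input_Time property on the receiving port (thread and port names here are hypothetical):

```aadl
-- Sketch: freeze the in port at thread Start instead of the default Dispatch,
-- so input can arrive between dispatch and start of execution.
thread implementation receiver_thread.impl
properties
  Input_Time => ([Time => Start; Offset => 0 ms .. 0 ms;]) applies to rcv_data;
end receiver_thread.impl;
```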

If you want to model synchronous communication, the most direct way is to use immediate connections. Sampled port connections can lead to nondeterministic data communication.
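An immediate connection can be declared by setting the Timing property on the connection itself, for example (port and connection names illustrative):

```aadl
-- Sketch: an immediate connection, so the receiver reads the value produced
-- by the sender in the same dispatch frame (deterministic, mid-frame handoff).
connections
  xfer : port T1.tx_data -> T2.rcv_data { Timing => Immediate; };
```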

Seeing the model would probably be helpful.
-John