Computational overhead is the time a node needs to process a packet. It should be measured from the moment the packet has been fully received to the moment the node finishes processing it. Possibly, after this time, the node could have another packet ready to send.
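As a minimal sketch of that definition (in Python, with a hypothetical `process_packet` step standing in for whatever the node actually does), the overhead is just the elapsed time around the processing call, excluding reception time:

```python
import time

def process_packet(packet: bytes) -> bytes:
    # Hypothetical processing step: here, a trivial byte-wise transform.
    return bytes(b ^ 0xFF for b in packet)

def timed_process(packet: bytes):
    # The clock starts only once the packet has been FULLY received,
    # and stops when processing ends: reception time is not included.
    start = time.perf_counter()
    result = process_packet(packet)
    overhead = time.perf_counter() - start
    return result, overhead

result, overhead = timed_process(b"\x01\x02\x03")
print(f"computational overhead: {overhead:.6f} s")
```

After `overhead` seconds the node may have `result` ready to send as its next packet, which is the point of the last sentence above.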
We have explained computational overhead in this group many times.
I have no desire to explain it again, especially to someone who shows no interest in reading what has already been discussed extensively.
If, by chance, your definition of computational overhead doesn't match the one above, please double-check what you write before sending. Being precise is what differentiates a tech. from a [fill in].
Good luck,
T.