Thank you for the reply Tom.
My apologies; I will attach my code and ask specific questions in this post.
Let me explain what I have implemented. I have a factory model with 80 relay nodes and 20 remote nodes. The remote nodes start the relaying service as remote UEs according to their RSRP values, and the relay nodes start the relaying service as relay UEs according to their RSRP values; if a relay's RSRP is below the expected threshold, it stops the relaying service (eliminating bad nodes). With this, I have the following questions:
1. When I increase the number of remote nodes and use the sidelink for all of them, the packet reception rate naturally drops from 100% to 80%. I assume this is due to collisions on the sidelink; am I correct? Collisions can happen when resource blocks are limited, so I would like to know how I can see these values and where my packets are lost. Are the losses due to collisions between different kinds of messages on the channel (like cqidownlink, relay discovery broadcasts, etc.), or between messages of the same kind (messages from the application)? Can you tell me how I can go deeper here, i.e., trace each lost packet and the reason it was lost?
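For what it's worth, the size of the drop can be sanity-checked with a back-of-the-envelope collision model: if each of N transmitting UEs independently picks one of R sidelink resources per selection window (a simplifying assumption; the real pool size and selection mode come from your sidelink preconfiguration, and this ignores retransmissions and half-duplex effects), the PRR expected from collisions alone is (1 - 1/R)^(N-1). A minimal sketch:

```python
# Rough collision estimate under random sidelink resource selection.
# The pool size and the independent-choice assumption are illustrative,
# not taken from the simulator's configuration.
def collision_free_prob(n_ues, pool_size):
    """P(no other transmitting UE picks the same resource as a given UE)."""
    return (1.0 - 1.0 / pool_size) ** (n_ues - 1)

if __name__ == "__main__":
    pool_size = 20  # hypothetical number of resources per selection window
    for n in (5, 10, 20, 40):
        prr = collision_free_prob(n, pool_size)
        print(f"{n:3d} UEs sharing {pool_size} resources -> expected PRR ~ {prr:.2f}")
```

If the measured 80% PRR is in the ballpark this model predicts for your pool size and load, collisions are a plausible main cause; if not, the losses are more likely coming from something else (e.g., SINR/propagation).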
2. In a UE relay network the implemented mechanism is decode-and-forward, which adds processing time. In a simulation environment these values would be hardcoded; can you tell me where I can find them? I am asking because I want to reduce the added latency due to decode-and-forward.
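To make explicit what I mean by "added latency", a simple budget for a decode-and-forward path looks like the sketch below. All delay values are placeholders, not the simulator's real constants (those would be the per-layer processing delays hardcoded in the MAC/PHY model code I am asking about):

```python
# Hedged latency-budget sketch for decode-and-forward relaying.
# tx_delay_ms and decode_forward_ms are illustrative placeholders.
def e2e_latency_ms(hops, tx_delay_ms, decode_forward_ms):
    """Each intermediate relay fully decodes, then re-encodes and forwards,
    so only the (hops - 1) intermediate nodes add processing delay."""
    return hops * tx_delay_ms + (hops - 1) * decode_forward_ms

if __name__ == "__main__":
    direct = e2e_latency_ms(1, tx_delay_ms=4.0, decode_forward_ms=3.0)
    relayed = e2e_latency_ms(2, tx_delay_ms=4.0, decode_forward_ms=3.0)
    print(f"direct: {direct} ms, one relay: {relayed} ms, "
          f"added by relaying: {relayed - direct} ms")
```

So the term I want to locate and reduce is the decode_forward_ms component per relay hop.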
3. I am using the factory use case: the factory is 50x50x6 m with the base station positioned at the top. As the area is small, it is difficult to add a blocking building (as in ns-3); I tried reducing the transmit power of the UEs, but that was not very helpful. Can you tell me what can be done to add a realistic blocking model? (I am looking for ideas here.)
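One idea I have been considering (a custom sketch, not an existing ns-3 model): model machinery or shelving as axis-aligned boxes and add a fixed penetration loss to the pathloss whenever the direct Tx-Rx path crosses one, e.g. inside a custom PropagationLossModel. A minimal geometric sketch, where the box dimensions and the 15 dB loss value are purely illustrative assumptions:

```python
# Approximate in-factory blockage: add a fixed penetration loss per
# axis-aligned obstacle box that the direct Tx-Rx segment crosses.
def segment_hits_box(p, q, box_min, box_max):
    """Slab test: does the segment p->q intersect the axis-aligned box?"""
    t0, t1 = 0.0, 1.0
    for a, b, lo, hi in zip(p, q, box_min, box_max):
        d = b - a
        if abs(d) < 1e-12:
            if a < lo or a > hi:   # parallel to this slab and outside it
                return False
            continue
        near, far = (lo - a) / d, (hi - a) / d
        if near > far:
            near, far = far, near
        t0, t1 = max(t0, near), min(t1, far)
        if t0 > t1:
            return False
    return True

def extra_loss_db(tx, rx, obstacles, loss_per_obstacle_db=15.0):
    """Sum a fixed penetration loss for every obstacle the path crosses."""
    return sum(loss_per_obstacle_db
               for box in obstacles if segment_hits_box(tx, rx, *box))

if __name__ == "__main__":
    machine = ((20.0, 20.0, 0.0), (30.0, 30.0, 3.0))  # a 10x10x3 m block
    blocked = extra_loss_db((0, 25, 1.5), (50, 25, 1.5), [machine])
    clear = extra_loss_db((0, 5, 1.5), (50, 5, 1.5), [machine])
    print(f"blocked path: +{blocked} dB, clear path: +{clear} dB")
```

This would at least make link quality position-dependent inside the 50x50 m hall, which reducing UE power cannot do; I would be glad to hear whether this or something else is the sensible way to go.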
I hope this is clearer now. Please let me know if anything is still unclear.