Hi everybody,
I am working on interference with directional antennas in ns-3. I am facing a problem with a specific part of my results. At the moment I am unsure whether it is a mistake on my part, a misunderstanding, a bug, or whether I have reached the limits of the ns-3 abstraction layer.
Here is what I have done so far:
- I extended the new Spectrum-WiFi-Phy module to use directional antennas.
- I implemented a small antenna module to read antenna diagrams in the .ant v3 format, which is provided by many vendors.
I have not used any existing antenna module since I am specifically interested in the effects of side lobes. I have verified this part of the implementation quite extensively, and if anybody is interested in the results or in using it, let me know. I will skip the details here.
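To give an idea anyway, the gain lookup works roughly like this (a sketch only; class name, container layout, and interpolation scheme are illustrative, not my actual code):

// Illustrative gain lookup over a sampled azimuth pattern (e.g. one gain
// value per degree, as parsed from an antenna diagram file). The container
// layout, class name, and interpolation are assumptions for this sketch.
#include <cmath>
#include <cstddef>
#include <vector>

class PatternAntenna
{
public:
  // gainsDbi: one sample per degree, index 0 = boresight.
  explicit PatternAntenna (const std::vector<double> &gainsDbi)
    : m_gainsDbi (gainsDbi) {}

  // Gain in dBi towards 'azimuthDeg', measured relative to boresight,
  // with linear interpolation between the per-degree samples.
  double GetGainDbi (double azimuthDeg) const
  {
    double a = std::fmod (azimuthDeg, 360.0);
    if (a < 0.0)
      {
        a += 360.0;
      }
    std::size_t i = static_cast<std::size_t> (a) % m_gainsDbi.size ();
    std::size_t j = (i + 1) % m_gainsDbi.size ();
    double frac = a - std::floor (a);
    return (1.0 - frac) * m_gainsDbi[i] + frac * m_gainsDbi[j];
  }

private:
  std::vector<double> m_gainsDbi;
};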
My first experiment focuses on inter-flow interference. Here is the setting (please also have a look at the first attachment, "inter_flow.png"):
Four nodes, forming two WiFi links, have been arranged in a rectangular two-dimensional grid. All nodes are equipped with directional antennas aligned towards their corresponding partner. There are two important distances, x and y; x is fixed at 1000 m and y is incremented for each new run of the simulation. I saturate the medium (ad hoc, MCS7) with UDP traffic and use the FlowMonitor to obtain the throughput on each link. In addition, I use the "WifiPhy/SignalArrival" callback to trace the signal strength originating from specific nodes. Two signal strengths are of interest: the parallel interference (Par.) and the cross interference (Cr.).
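For reference, this is roughly how I hook into that trace (a minimal sketch; the callback signature is the one I see for the SpectrumWifiPhy "SignalArrival" trace source, so please double-check it and the Config path against your own tree):

// Minimal sketch of the per-sender signal trace. Assumed signature:
// (bool signalIsWifi, uint32_t senderNodeId, double rxPowerW, Time duration);
// verify against the trace source declared on your branch.
#include <cmath>
#include <iostream>
#include "ns3/core-module.h"
#include "ns3/wifi-module.h"

using namespace ns3;

static void
SignalArrivalCb (std::string context, bool wifi, uint32_t senderNodeId,
                 double rxPowerW, Time duration)
{
  // Convert the linear receive power (W) to dBm for plotting.
  double rxPowerDbm = 10.0 * std::log10 (rxPowerW) + 30.0;
  std::cout << Simulator::Now ().GetSeconds () << "s " << context
            << " sender=" << senderNodeId
            << " rxPower=" << rxPowerDbm << " dBm"
            << " duration=" << duration.GetMicroSeconds () << "us" << std::endl;
}

// In the scenario setup, after the devices are installed:
// Config::Connect ("/NodeList/*/DeviceList/*/$ns3::WifiNetDevice/Phy/SignalArrival",
//                  MakeCallback (&SignalArrivalCb));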
Please have a look at the second attachment, which shows the first results when the traffic is sent from 3→2 and 1→0 (case A). It shows the side lobes (from the cross interference) in a wave form, as expected. The parallel interference decreases according to the FSPL model. The other black line shows the EnergyDetectionThreshold, which is set to -95 dBm. I also added the throughput in red and blue.
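As a cross-check of the Par. curve against plain FSPL, something like this standalone snippet reproduces the expected level (the frequency and Tx power below are placeholders, not my actual settings, and antenna gains are left out):

// Standalone cross-check of the FSPL (Par.) curve at x = 1000 m using
// ns-3's Friis model. Frequency and Tx power are assumed placeholder
// values; antenna gains from the pattern must be added on top.
#include <iostream>
#include "ns3/core-module.h"
#include "ns3/mobility-module.h"
#include "ns3/propagation-module.h"

using namespace ns3;

int
main (int argc, char *argv[])
{
  Ptr<FriisPropagationLossModel> friis = CreateObject<FriisPropagationLossModel> ();
  friis->SetAttribute ("Frequency", DoubleValue (5.18e9)); // assumed carrier

  Ptr<ConstantPositionMobilityModel> a = CreateObject<ConstantPositionMobilityModel> ();
  Ptr<ConstantPositionMobilityModel> b = CreateObject<ConstantPositionMobilityModel> ();
  a->SetPosition (Vector (0.0, 0.0, 0.0));
  b->SetPosition (Vector (1000.0, 0.0, 0.0));

  double txPowerDbm = 16.0; // assumed Tx power
  std::cout << "Rx power at 1000 m: "
            << friis->CalcRxPower (txPowerDbm, a, b) << " dBm" << std::endl;
  return 0;
}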
Almost everything seems logical: in the first part, when both interference signals (parallel and cross) are above the EnergyDetectionThreshold, the DCF does its job and both links receive an equal amount of throughput. When both signals are below the EnergyDetectionThreshold, the links are independent and both saturate the medium.
But: when just the cross-interference signal is above the threshold, for example from 690-940 m, I would expect a completely different throughput. Why do both stations receive almost the same amount of throughput as when the DCF is involved? Station 3 is unable to detect the transmission from station 1 and vice versa. The DCF should not work here, and I also disabled RTS/CTS. I would expect both stations to send their packets continuously and that there would be interference at the receiver nodes (nodes 2 and 0). In fact, I would even assume that the shapes of both throughput lines and the cross-interference line would be similar, since more cross-interference should increase the noise, reduce the SNR, and therefore reduce the throughput.
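To make that expectation concrete, here is the back-of-the-envelope calculation I have in mind (all levels below are invented for illustration, not values from my runs):

// Back-of-the-envelope SINR at a receiver: the partner's signal against
// cross-interference plus thermal noise. All levels are assumptions.
#include <cmath>
#include <cstdio>

static double
DbmToW (double dbm)
{
  return std::pow (10.0, (dbm - 30.0) / 10.0);
}

int
main ()
{
  double signalDbm = -60.0; // assumed Rx power from the partner node
  double noiseDbm = -94.0;  // roughly kTB + noise figure for a 20 MHz channel
  for (double crossDbm = -95.0; crossDbm <= -75.0; crossDbm += 5.0)
    {
      double sinr = DbmToW (signalDbm) / (DbmToW (crossDbm) + DbmToW (noiseDbm));
      std::printf ("cross=%.0f dBm -> SINR=%.1f dB\n",
                   crossDbm, 10.0 * std::log10 (sinr));
    }
  return 0;
}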
When I look inside the debug logs, I see that many packets are dropped at stations 2 and 0:
“2.04895s 2 SpectrumWifiPhy:StartRx(): drop packet because already in Rx (power=1.07549e-08W)”
If a station is already in Rx or Tx and another WiFi packet arrives later, shouldn't this increase the noise? From my understanding, the NistErrorRateModel should do exactly this.
Thanks for reading. Any help, hints or discussion would really help me out.
I am currently working on the ns-3 Spectrum-WiFi-Phy branch located here.
With kind regards,
Michael
On 08/30/2016 08:23 AM, Michael Rademacher wrote:
Hi everybody,
I am working on interference with directional antennas in ns-3. I am facing a problem with a specific part of my results. At the moment I am unsure whether it is a mistake on my part, a misunderstanding, a bug, or whether I have reached the limits of the ns-3 abstraction layer.
Hi Michael, thanks for a well-written description. Inline below.
I suspect that others will be interested in this module in the future, so please consider contributing it someday; anyway, that is beside the point for this post.
Here is what I have done so far:
- I extended the new Spectrum-WiFi-Phy module to use directional antennas.
- I implemented a small antenna module to read antenna diagrams in the .ant v3 format, which is provided by many vendors.
I have not used any existing antenna module since I am specifically interested in the effects of side lobes. I have verified this part of the implementation quite extensively, and if anybody is interested in the results or in using it, let me know. I will skip the details here.
My first experiment focuses on inter-flow interference. Here is the setting (please also have a look at the first attachment, "inter_flow.png"):
Four nodes, forming two WiFi links, have been arranged in a rectangular two-dimensional grid. All nodes are equipped with directional antennas aligned towards their corresponding partner. There are two important distances, x and y; x is fixed at 1000 m and y is incremented for each new run of the simulation. I saturate the medium (ad hoc, MCS7) with UDP traffic and use the FlowMonitor to obtain the throughput on each link. In addition, I use the "WifiPhy/SignalArrival" callback to trace the signal strength originating from specific nodes. Two signal strengths are of interest: the parallel interference (Par.) and the cross interference (Cr.).
Are you using Ideal or MinstrelHt rate control?
Please have a look at the second attachment, which shows the first results when the traffic is sent from 3→2 and 1→0 (case A). It shows the side lobes (from the cross interference) in a wave form, as expected. The parallel interference decreases according to the FSPL model. The other black line shows the EnergyDetectionThreshold, which is set to -95 dBm. I also added the throughput in red and blue.
Almost everything seems logical: in the first part, when both interference signals (parallel and cross) are above the EnergyDetectionThreshold, the DCF does its job and both links receive an equal amount of throughput. When both signals are below the EnergyDetectionThreshold, the links are independent and both saturate the medium.
But: when just the cross-interference signal is above the threshold, for example from 690-940 m, I would expect a completely different throughput. Why do both stations receive almost the same amount of throughput as when the DCF is involved? Station 3 is unable to detect the transmission from station 1 and vice versa. The DCF should not work here, and I also disabled RTS/CTS. I would expect both stations to send their packets continuously and that there would be interference at the receiver nodes (nodes 2 and 0). In fact, I would even assume that the shapes of both throughput lines and the cross-interference line would be similar, since more cross-interference should increase the noise, reduce the SNR, and therefore reduce the throughput.
I do not expect DCF to be involved since Par signal level is so low (they are hidden from one another). I would expect instead that both transmitters would send as fast as they can and that there would be collisions at the receiver, and depending on the timing of frames, some may get through and most would collide. I would probably expect the throughputs to be similar since the scenario is symmetric, but whether it should be this high (27 Mb/s) or not, I don't know. Basically, this suggests to me that both nodes are able to squeeze enough frames through during the times in which the other node is not sending for some reason.
I suggest looking at the proportion of time (using the signal trace) that each sending node is active on the channel, and then seeing whether this is consistent with these nodes being able to get that much throughput through.
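For example, something along these lines (a rough sketch only; it reuses the SignalArrival callback signature from above, and the names are illustrative):

// Sum per-sender on-air time from the SignalArrival durations observed at
// one chosen receiver (use a single receiver to avoid double counting),
// then compare against the total simulation time.
#include <map>
#include "ns3/core-module.h"

using namespace ns3;

static std::map<uint32_t, Time> g_airtime; // senderNodeId -> cumulative on-air time

static void
AirtimeCb (std::string context, bool wifi, uint32_t senderNodeId,
           double rxPowerW, Time duration)
{
  g_airtime[senderNodeId] += duration;
}

// After Simulator::Run (), with simulationTime in seconds:
//   double fraction = g_airtime[senderId].GetSeconds () / simulationTime;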
When I look inside the debug logs, I see that many packets are dropped at stations 2 and 0:
"2.04895s 2 SpectrumWifiPhy:StartRx(): drop packet because already in Rx (power=1.07549e-08W)"
If a station is already in Rx or Tx and another WiFi packet arrives later, shouldn't this increase the noise? From my understanding, the NistErrorRateModel should do exactly this.
It will increase the noise; at an earlier part of the code, the signal is passed to the InterferenceHelper (regardless of this drop). This drop log is just saying that there is no way that the reception will be successful, but the frame is still accounted for as interference (noise).
Thanks for reading. Any help, hints or discussion would really help me out.
I am currently working on the ns-3 Spectrum-WiFi-Phy branch located here.
Please note that this branch has been merged to ns-3-dev, and no further work on it is planned, so please cut over to track ns-3-dev when you are ready.
- Tom
Another strange thing I noticed in the debug log of the NistErrorRateModel:
5.56767s 0 NistErrorRateModel:Get16QamBer(): 16-Qam snr=134340 ber=0
5.56767s 2 NistErrorRateModel:Get16QamBer(): 16-Qam snr=80998.2 ber=0
5.56815s 0 NistErrorRateModel:Get16QamBer(): 16-Qam snr=134340 ber=0
5.56815s 2 NistErrorRateModel:Get16QamBer(): 16-Qam snr=80998.2 ber=0
These snr values seem to be very high and kind of arbitrary. Perhaps, when there is no real noise reported by the Spectrum-WiFi-Phy, there is an initialization problem with the noise value?
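If these snr values are linear power ratios rather than dB (my assumption), they convert to roughly 50 dB:

// Sanity check: convert the logged snr values to dB, assuming they are
// linear power ratios (not dB).
#include <cmath>
#include <cstdio>

int
main ()
{
  const double snrs[] = { 134340.0, 80998.2 };
  for (unsigned int i = 0; i < sizeof (snrs) / sizeof (snrs[0]); ++i)
    {
      std::printf ("snr=%g -> %.1f dB\n", snrs[i], 10.0 * std::log10 (snrs[i]));
    }
  return 0;
}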
When I look inside the debug logs, I see that many packets are dropped at stations 2 and 0:
"2.04895s 2 SpectrumWifiPhy:StartRx(): drop packet because already in Rx (power=1.07549e-08W)"
If a station is already in Rx or Tx and another WiFi packet arrives later, shouldn't this increase the noise? From my understanding, the NistErrorRateModel should do exactly this.
It will increase the noise; at an earlier part of the code, the signal is passed to the InterferenceHelper (regardless of this drop). This drop log is just saying that there is no way that the reception will be successful, but the frame is still accounted for as interference (noise).
Thanks for reading. Any help, hints or discussion would really help me out.
I am currently working on the ns-3 Spectrum-WiFi-Phy branch located here.
Please note that this branch has been merged to ns-3-dev, and no further work on it is planned, so please cut over to track ns-3-dev when you are ready.
I tried to merge my code to ns-3-dev (12276:8156de57e4a6) at the beginning of the week, but I ran into multiple issues. First, I needed to apply your commit from GitHub, "wifi: copy Packet to allow multiple devices to modify". It does not seem to be merged into ns-3-dev yet? Second, I ran into bug 2477. If you need some logs or additional information, let me know. I think I will wait for ns-3.26 to be released before I conduct an additional merge attempt.