Since I have a handy simulation framework for my WiFi transceiver, I used it for a quick study of the sensitivity of the frame detection block.
Frame detection in WiFi is usually implemented based on autocorrelation, as the short preamble consists of a pattern that repeats ten times.
Given a noisy signal, the question is at which autocorrelation threshold to trigger frame detection. This is, of course, a trade-off between missed frames (if the threshold is too high and detection is not triggered) and computational overhead (if it is so low that we trigger on noise).
In my implementation, I calculate the autocorrelation coefficient, i.e., I normalize the autocorrelation with the power of the signal. (Actually, I use a slightly larger window for the power than for the autocorrelation, but that’s not important here.) Running the simulations with 435-byte frames at BPSK 1/2, I get the following graph.
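The basic idea can be sketched roughly like this in Python. The lag of 16 samples matches the short training symbol period of 802.11a/g at 20 MS/s; the window length, function names, and normalization details are my assumptions for illustration, not the actual implementation (which, as noted above, uses a slightly larger window for the power):

```python
import numpy as np

def autocorr_coefficient(samples, lag=16, window=48):
    """Sliding autocorrelation coefficient for preamble detection.

    lag=16 is the short-training-symbol period of 802.11a/g at
    20 MS/s; window and the normalization are illustrative choices.
    """
    # lagged products: s[n] * conj(s[n + lag])
    prod = samples[:-lag] * np.conj(samples[lag:])
    power = np.abs(samples[lag:]) ** 2

    # moving sums over the correlation window
    kernel = np.ones(window)
    acorr = np.convolve(prod, kernel, mode="valid")
    p = np.convolve(power, kernel, mode="valid")

    # coefficient in [0, 1]; guard against division by zero on silence
    return np.abs(acorr) / np.maximum(p, 1e-12)

def detect(samples, threshold=0.56):
    """Return sample indices where the coefficient exceeds the threshold."""
    return np.flatnonzero(autocorr_coefficient(samples) > threshold)
```

For a signal that actually repeats with the given period, the coefficient approaches 1, while for white noise it stays well below typical thresholds, which is what makes the normalized metric convenient for thresholding.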
I would say 0.56 is a pretty good value here. Therefore, I added a sensitivity parameter to the WiFi physical layer block and set its default value to 0.56. The code is already on GitHub.
Of course, an analytical study that also considers impairments like frequency offset remains to be done, but I think this is a good starting point.