In the OEM wireless compliance guide (DA-08149-001_v14), values are given for peak antenna gain. The cable loss is then subtracted, which yields the customer's maximum allowed antenna gain. But shouldn't the cable loss be added to the maximum allowed antenna gain, to ensure the peak antenna gain is reached at the module?
This is from the compliance guide tables 2, 3, and 4, consolidated into a spreadsheet:
| Cable Loss (dB): | 0.9 | 0.9 | 0.9 | 2 | 2 |
| Max. Ant. Gain (dBi): | 1.51 | 1.91 | 1.96 | 3.49 | 3.57 |
| Gain − Loss (dBi): | 1.51 | 1.91 | 1.96 | 3.49 | 3.57 |
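The arithmetic behind the question can be sketched with the standard link-budget relation for radiated power (EIRP = conducted power + antenna gain − cable loss). The conducted power and EIRP limit below are hypothetical placeholders, not values from the compliance guide:

```python
def eirp_dbm(tx_power_dbm, antenna_gain_dbi, cable_loss_db):
    # Standard link-budget relation: radiated power (EIRP)
    # = conducted power + antenna gain - cable/connector loss.
    return tx_power_dbm + antenna_gain_dbi - cable_loss_db

def max_gain_for_eirp_limit(eirp_limit_dbm, tx_power_dbm, cable_loss_db):
    # Under a *pure* EIRP limit, extra cable loss would raise the
    # allowed antenna gain (the intuition in the question):
    # G_max = limit - P_tx + L_cable
    return eirp_limit_dbm - tx_power_dbm + cable_loss_db

# Hypothetical numbers for illustration only:
tx_dbm = 20.0       # conducted power at the module
limit_dbm = 23.51   # assumed EIRP limit

print(max_gain_for_eirp_limit(limit_dbm, tx_dbm, cable_loss_db=0.9))  # ~4.41 dBi
print(max_gain_for_eirp_limit(limit_dbm, tx_dbm, cable_loss_db=2.0))  # ~5.51 dBi
```

This shows why the question arises: if only EIRP mattered, a lossier cable would permit a higher-gain antenna, yet the guide's tables go the other way.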
What they are doing is looking at maximum emissions, not antenna effectiveness. If your transmission lines produce noise and spurious emissions, you won't be allowed to bump up the conducted power to keep the antenna producing the same output. Even if you could, your receive side would still suffer the cable loss, so raising power has only so much usefulness anyway. If you want a more effective antenna, use the best-quality transmission line you can get. I believe the tables are correct, since this is about total power emitted, but I could be wrong.
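The point about not being allowed to bump up the power can be illustrated numerically. A minimal sketch, assuming (hypothetically) that spurious emissions sit a fixed number of dB below the carrier at the module output, so raising conducted power to offset cable loss raises the spurs one-for-one:

```python
# Hypothetical numbers, not from the compliance guide:
SPUR_REJECTION_DB = 40.0   # assumed carrier-to-spurious ratio of the module
SPUR_LIMIT_DBM = -20.0     # assumed regulatory spurious-emission limit

def spur_level_dbm(tx_power_dbm):
    # Spurious emissions track the carrier: +1 dB of conducted
    # power means +1 dB of spurious output.
    return tx_power_dbm - SPUR_REJECTION_DB

print(spur_level_dbm(20.0))  # 20 dBm carrier -> spurs exactly at the limit
print(spur_level_dbm(22.0))  # +2 dB to offset cable loss -> spurs 2 dB over
```

So even though extra power would restore the transmit signal at the antenna, it pushes the spurious emissions past their own limit, which is why the tables subtract the loss rather than add it.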