This paper builds a model for benchmarking and selecting a suitable LED for wireless optical communication, in particular for indoor LiFi in the infrared or visible-light spectrum. It reviews LED measurements and theoretical models of the underlying power-bandwidth trade-off and applies these to expressions for the communication bit-rate throughput. While illumination LEDs are chosen for a high quantum efficiency, communication additionally favors a large 3 dB bandwidth. In the LED, electron-hole pairs recombine either radiatively (thereby emitting a photon) or non-radiatively (causing a leakage current and reducing the external quantum efficiency, EQE). Non-radiative recombination nonetheless also speeds up the response of the LED and thereby increases its 3 dB bandwidth. On the other hand, the accompanying reduction in effective optical power may counterproductively lead to an inadequate signal-to-noise ratio. A trade-off is postulated empirically, in the form of a rule of thumb: the transmit power raised to the power α, multiplied by the bandwidth raised to the power 1 − α, i.e., P^α B^(1−α), appears to be a constant for a given LED. This semi-empirical model yields straight lines on a log-log scale. This paper searches for a theoretical justification of such a model, in which the current density acts as the parameter that sets the trade-off. According to communication theory, the achievable bit rate grows approximately linearly with bandwidth but only approximately logarithmically with the received energy per bit. However, this needs to be reviewed for the gentle low-pass roll-off of the LED response, as it allows modulation far beyond the 3 dB bandwidth. This leads to a perspective on how to operate the LED: a system design faces the challenge of trading off power against bandwidth, according to the physical properties of the LED, to optimize a communication throughput target.
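As a minimal numerical sketch of these relations, the snippet below traces the rule of thumb P^α B^(1−α) = K along a sweep of the operating point and compares a Shannon-style rate confined to the 3 dB bandwidth against the rate obtained by modulating beyond it through a first-order low-pass roll-off. All parameter values (the exponent ALPHA, the LED constant K, the noise density N0) and the helper names are illustrative assumptions, not measured LED data or notation from the paper.

```python
import numpy as np

# Illustrative, assumed values -- not taken from the paper.
ALPHA = 0.5   # assumed trade-off exponent of the LED
K = 3.2e3     # assumed LED constant in the rule P**ALPHA * B**(1 - ALPHA) = K
N0 = 1.0e-9   # assumed receiver noise power spectral density (W/Hz)

def bandwidth_from_power(p):
    """3 dB bandwidth implied by the rule of thumb at optical power p.

    Solving P**alpha * B**(1 - alpha) = K for B; on a log-log scale
    this is the straight line mentioned in the abstract.
    """
    return (K / p**ALPHA) ** (1.0 / (1.0 - ALPHA))

def rate_within_3db(p, b):
    """Shannon-style rate R = B * log2(1 + SNR): approximately linear in
    bandwidth, but only logarithmic in received power (hence in the
    received energy per bit)."""
    return b * np.log2(1.0 + p / (N0 * b))

def rate_beyond_3db(p, b3, span=10.0, n=4096):
    """Rate when modulating far beyond the 3 dB bandwidth, assuming a
    first-order low-pass LED response |H(f)|^2 = 1 / (1 + (f/b3)^2)
    and a flat transmit spectrum over [0, span * b3]."""
    f = np.linspace(0.0, span * b3, n)
    h2 = 1.0 / (1.0 + (f / b3) ** 2)
    psd = p / (span * b3)            # flat power allocation over the band
    df = f[1] - f[0]
    return np.sum(np.log2(1.0 + h2 * psd / N0)) * df

# Sweep the operating point: the drive current moves the optical power P
# (and hence the bandwidth B) along the trade-off curve.
for p in (0.1, 0.5, 1.0, 2.0):       # optical power in W (illustrative)
    b = bandwidth_from_power(p)
    print(f"P = {p:3.1f} W  B = {b / 1e6:7.2f} MHz  "
          f"R(<=3dB) = {rate_within_3db(p, b) / 1e6:7.1f} Mbit/s  "
          f"R(>3dB)  = {rate_beyond_3db(p, b) / 1e6:7.1f} Mbit/s")
```

Under these assumed numbers the wideband rate exceeds the rate confined to the 3 dB bandwidth, consistent with the abstract's remark that a gentle roll-off permits modulation well beyond the 3 dB point.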