fundamental question regarding bandwidth
The link between the time and frequency domains is the Fourier transform.
In optical fiber, as elsewhere, the medium's bandwidth is the available bandwidth. Transmitting information at a given data rate requires modulating a carrier frequency. Different modulation formats generate different bandwidths for the same data rate, and the resulting spectra are then organized to fill the available spectrum of the propagation medium.
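As a minimal sketch of the point that occupied bandwidth scales with data rate, the snippet below generates a random BPSK signal at two different bit rates and estimates, via an FFT, the spectral width holding 90% of the power. The sample rate, carrier frequency, and bit rates are arbitrary values chosen for illustration, not anything specific to optics.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 1_000_000        # sample rate, Hz (illustrative value)
fc = 100_000          # carrier frequency, Hz (illustrative value)
n = 2 ** 16           # number of samples
t = np.arange(n) / fs

def bpsk_bandwidth(bit_rate):
    """Estimate the spectral width (Hz) containing 90% of the power of a
    random NRZ BPSK signal at the given bit rate, using an FFT."""
    samples_per_bit = int(fs / bit_rate)
    bits = rng.integers(0, 2, n // samples_per_bit + 1)
    symbols = 2.0 * bits - 1.0                      # map {0,1} -> {-1,+1}
    baseband = np.repeat(symbols, samples_per_bit)[:n]
    signal = baseband * np.cos(2 * np.pi * fc * t)  # BPSK: each bit flips the carrier phase
    psd = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(n, 1 / fs)
    # Collect the strongest bins until they hold 90% of the total power,
    # then report the frequency span they cover
    order = np.argsort(psd)[::-1]
    cum = np.cumsum(psd[order])
    keep = order[: np.searchsorted(cum, 0.9 * cum[-1]) + 1]
    return freqs[keep].max() - freqs[keep].min()

bw_slow = bpsk_bandwidth(10_000)   # 10 kbit/s
bw_fast = bpsk_bandwidth(20_000)   # 20 kbit/s
print(bw_slow, bw_fast)            # the faster signal occupies roughly twice the bandwidth
```

Doubling the bit rate halves the symbol duration, so by the Fourier time-frequency tradeoff the occupied spectrum roughly doubles, regardless of the carrier frequency.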
In reality, even with no modulation (no information, so the bandwidth should be zero), the actual spectrum of the emission has some nonzero width, due to nonideal carrier generation. In optics, this bandwidth may actually be larger than the bandwidth associated with the useful modulation. For example, a TV remote has only tens of kHz of modulation bandwidth, but the actual spectrum of its LED's emission is far wider.
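To sketch why an unmodulated carrier still has nonzero bandwidth, the toy model below compares an ideal sinusoid against one whose phase drifts as a random walk (a crude stand-in for nonideal carrier generation), counting how many FFT bins are needed to capture 90% of each signal's power. The sample rate, carrier bin, and phase-noise level are made-up values for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1_000_000                  # sample rate, Hz (illustrative value)
n = 2 ** 16                     # number of samples
fc = 6554 * fs / n              # carrier placed exactly on an FFT bin to avoid leakage
t = np.arange(n) / fs

def occupied_bins(signal, frac=0.9):
    """Number of FFT bins needed to capture `frac` of the signal's power."""
    psd = np.sort(np.abs(np.fft.rfft(signal)) ** 2)[::-1]
    cum = np.cumsum(psd)
    return int(np.searchsorted(cum, frac * cum[-1]) + 1)

ideal = np.cos(2 * np.pi * fc * t)               # perfect carrier: a single spectral line
drift = np.cumsum(rng.normal(0.0, 0.05, n))      # random-walk phase noise (assumed model)
noisy = np.cos(2 * np.pi * fc * t + drift)       # imperfect carrier: a broadened line

print(occupied_bins(ideal), occupied_bins(noisy))
```

The ideal carrier concentrates all its power in one bin, while the phase-noisy carrier spreads power over many bins even though it carries no information at all.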
But what I don't understand is this: say I use 1550 nm light as the source and BPSK to modulate it. After passing through an optical fiber, dispersion reduces the bandwidth. What does this reduction actually mean? If there is only a single frequency of light forming the signal, what is the point of finding the frequency range that generates such a signal?