Low frequency vs. high frequency
I have read in several books that decreasing frequency is a general principle for improving the EMC of a module (i.e., use the lowest bus frequency at which the microcontroller can do the job). Now somebody is telling me it isn't necessarily that way. Could anyone give me some examples of improving EMC by increasing frequency?
Specifically, I reduced the clock frequency (a few hundred kHz) of an SPI interface (between the micro and a driver) by a factor of 4, confident that I was doing a good thing, EMC-wise.
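For illustration, a minimal sketch of the usual trapezoidal-clock envelope model (the model itself is standard; the 3.3 V swing, 50% duty cycle, 20 ns edges, and 400 kHz starting clock are assumed numbers, not from my actual setup):

```python
import math

def envelope_dbuv(f, f0, a=3.3, duty=0.5, t_rise=20e-9):
    """Bode-style upper bound on the harmonics of a trapezoidal clock:
    flat up to 1/(pi*tau), -20 dB/dec up to 1/(pi*t_rise), -40 dB/dec above."""
    tau = duty / f0                                # pulse width
    env = 2 * a * duty
    env *= min(1.0, 1.0 / (math.pi * f * tau))     # first corner
    env *= min(1.0, 1.0 / (math.pi * f * t_rise))  # second corner
    return 20 * math.log10(env / 1e-6)             # volts -> dBuV

f_meas = 30e6                  # a frequency an EMC receiver might scan
for f0 in (400e3, 100e3):      # SPI clock before / after the divide-by-4
    print(f"f0 = {f0 / 1e3:3.0f} kHz -> envelope at 30 MHz = "
          f"{envelope_dbuv(f_meas, f0):.1f} dBuV")
```

With everything else fixed, the envelope above its first corner frequency scales proportionally with the clock, so dividing by 4 buys about 20·log10(4) ≈ 12 dB at any frequency the receiver scans up there.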
Thank you,
nike
To be accurate, you have to increase the rise and fall times to reduce the emissions.
I decreased the frequency and increased the rise/fall times as well (with a low-pass filter).
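A similar sketch for the edge-rate side, using the exact harmonic magnitudes of the same assumed trapezoidal model (the 2 ns vs. 20 ns edge times are made-up values, not measured ones):

```python
import numpy as np

def harmonic_dbuv(n, f0, a=3.3, duty=0.5, t_rise=2e-9):
    """n-th harmonic magnitude of a trapezoidal clock:
    |c_n| = 2*A*d*|sinc(n*d)|*|sinc(n*f0*t_rise)|,
    where np.sinc(x) = sin(pi*x)/(pi*x)."""
    c = 2 * a * duty * abs(np.sinc(n * duty)) * abs(np.sinc(n * f0 * t_rise))
    return 20 * np.log10(c / 1e-6)  # volts -> dBuV

f0 = 400e3                      # SPI clock
for n in (1, 25, 75, 151):      # odd harmonics (even ones vanish at 50% duty)
    fast = harmonic_dbuv(n, f0, t_rise=2e-9)
    slow = harmonic_dbuv(n, f0, t_rise=20e-9)
    print(f"{n * f0 / 1e6:5.1f} MHz: 2 ns edges {fast:6.1f} dBuV, "
          f"20 ns edges {slow:6.1f} dBuV")
```

Slowing the edges leaves the fundamental essentially untouched but pulls the second corner frequency down from roughly 1/(π·2 ns) ≈ 159 MHz to 1/(π·20 ns) ≈ 16 MHz, so the savings show up exactly where the troublesome high-frequency harmonics live.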
My question was: how can one improve EMC by increasing frequency? I thought you improve it by decreasing frequency, and that this is the way to go, but someone more experienced than me told me I hadn't necessarily improved the EMC by decreasing the frequency, meaning that in some situations one could prefer the higher frequency, EMC-wise. That doesn't make sense to me, at least theoretically, because a higher frequency means a wider spectral bandwidth, so more chances to interfere.
Thanks
nike
Beats me. I can't understand how!
Hi techie,
Reading Clayton Paul's book "Introduction to EMC" recently, I found one possible reason why somebody would want a higher frequency instead of a lower one. Say there are two bus lines on the board, both clocked at 10 MHz. In that case both lines produce harmonics at the same frequencies (20 MHz, 30 MHz, etc.), and where they coincide the amplitudes add (if they have the same phase), increasing the noise. Suppose neither frequency can be decreased, for functional reasons. Then we would prefer to increase one of them to, say, 15 MHz, so that the harmonics of the two lines no longer fall on top of each other.
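A minimal sketch of that counting argument (pure bookkeeping, not from the book; the 1 GHz upper limit is arbitrary):

```python
def harmonics_mhz(f0_mhz, f_max_mhz=1000):
    """Harmonic frequencies of a clock, in MHz, up to f_max."""
    return {f0_mhz * n for n in range(1, f_max_mhz // f0_mhz + 1)}

line1 = harmonics_mhz(10)            # first bus line at 10 MHz
for f2 in (10, 15):                  # second line: same clock vs. moved up
    shared = sorted(line1 & harmonics_mhz(f2))
    print(f"10 MHz vs {f2} MHz: {len(shared)} coinciding harmonics "
          f"below 1 GHz, first few: {shared[:5]}")
```

Wherever two harmonics coincide and happen to be in phase, the measured amplitude can be up to 6 dB higher than either one alone; moving one clock to 15 MHz leaves far fewer frequencies where that can happen (and for 50% duty-cycle clocks, whose even harmonics are nulled, the surviving odd harmonics of 10 MHz and 15 MHz never coincide at all).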
Other than that, I still think the target is to use the lowest frequency the functional requirements allow.
nike