I was just looking at this page http://www.shawngarringer.org/2014/01/0 ... d-rtl-sdr/ for how to identify cellphone tower signals for the three major technologies in use now (CDMA2000, GSM, and LTE). GSM towers use a relatively small bandwidth, CDMA2000 towers almost completely fill the waterfall display of the SDR, and LTE extends beyond the edges of the waterfall. Though the picture on that page shows only one spectral width, I've seen signals myself in the cellphone bands that look like the one in that picture, and they extend to several times the width of my waterfall display (which is 2.4 MHz wide). Why do LTE cellphone towers use so much bandwidth?
rtlsdrblog wrote:
Well, I'm not sure exactly what you're asking, but simply put: data can be transferred at a faster rate with a wider bandwidth. Think of it like using a fat hose vs. a straw.

I was wondering if LTE somehow required a wider spectrum as an inherent part of the protocol, like FHSS (frequency hopping spread spectrum), as opposed to simply using a higher data rate (which of course increases the bandwidth).
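To put some numbers on rtlsdrblog's fat-hose analogy, the Shannon-Hartley formula C = B * log2(1 + SNR) gives the capacity ceiling for a channel and shows how it scales directly with bandwidth B. Here's a minimal Python sketch; the 15 dB SNR is an illustrative assumption, not a measurement, while the carrier widths are the standard nominal values (200 kHz per GSM carrier, 1.25 MHz per CDMA2000 carrier, 20 MHz for the widest single LTE carrier):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley upper bound on channel capacity: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (15 / 10)  # assumed 15 dB SNR, purely illustrative

for name, bw_hz in [("GSM carrier", 200e3),
                    ("CDMA2000 carrier", 1.25e6),
                    ("LTE 20 MHz carrier", 20e6)]:
    c = shannon_capacity_bps(bw_hz, snr)
    print(f"{name:<18} {bw_hz / 1e6:>6.2f} MHz -> ceiling ~{c / 1e6:6.1f} Mbit/s")
```

At the same SNR the capacity ceiling grows in direct proportion to bandwidth, which is why a 20 MHz LTE carrier can move roughly 100x the data of a 200 kHz GSM carrier; the wide spectrum is there for data rate, not because the protocol hops or spreads the way FHSS does.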
IIRC, LTE uses a variation of OFDM, which is the equivalent of splitting the data up and spreading it out over many, many narrow channels transmitted simultaneously. That provides both high aggregate bandwidth and resistance to narrowband interference and multipath nulls, and it's straightforward to demodulate if you have a chip that can do tons of very fast FFTs.
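To illustrate that idea (a toy sketch only, not LTE's actual PHY; real LTE adds cyclic prefixes, pilot symbols, channel coding, and scheduling on top, and uses FFTs up to 2048 points for a 20 MHz carrier), here's a minimal NumPy example: QPSK symbols are mapped onto 64 subcarriers, combined into one time-domain OFDM symbol with a single IFFT, and all 64 are recovered at once with a single FFT.

```python
import numpy as np

rng = np.random.default_rng(0)
num_subcarriers = 64  # toy value for illustration

# Random QPSK symbols, one per subcarrier: the data is "spread"
# across many narrow channels transmitted simultaneously.
bits = rng.integers(0, 2, size=(num_subcarriers, 2))
symbols = (2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)

# Transmitter: one IFFT turns all the subcarrier symbols into a
# single time-domain OFDM symbol.
tx_time = np.fft.ifft(symbols)

# (Channel omitted; a real system would prepend a cyclic prefix
# here to absorb multipath delay spread.)

# Receiver: one FFT recovers every subcarrier at once.
rx_symbols = np.fft.fft(tx_time)

recovered = np.stack([rx_symbols.real > 0, rx_symbols.imag > 0], axis=1).astype(int)
assert np.array_equal(bits, recovered)
print("All", num_subcarriers, "subcarriers demodulated with a single FFT")
```

A narrowband interferer or a multipath null only wipes out a few of those subcarriers rather than the whole signal, which is the robustness the post above is getting at.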