
How to perform the 2SC2999 and Schottky diode swap

Yes, if you select 10Hz BW this is true for discrete, narrow signals that are less than 10Hz wide, but you are in the business of measuring noise levels, so going down to 10Hz RBW doesn't make the analyser more capable in terms of measuring low noise levels.

Wrong on that.

Think of the 10 Hz BW as a sliding window through the wider BW signal. That's why the storage mode is used, so that the information measured (slow scan rate) is retained. It takes seconds to pass through the signal of interest this way. Noise is proportional to kTB. Bandwidth is set low (10 Hz) to minimize the product of these parameters. The analyzer is looking way below the floor of the CB radio, which has a large simultaneous bandwidth albeit a better NF, while the analyzer is looking at only a fraction at a time. kTB sets the stage for what can be done. The analyzer tackles the problem by using very narrow BW filters, and a history (memory) of the signal, so that over time the whole BW of interest is sampled.
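
To put numbers on kTB, here is a minimal Python sketch (assuming room temperature, T = 290 K) of the thermal noise floor for a few bandwidths:

import math

K = 1.38e-23   # Boltzmann's constant, J/K
T = 290.0      # room temperature, K

def ktb_dbm(bandwidth_hz):
    # thermal noise power kTB expressed in dBm
    return 10 * math.log10(K * T * bandwidth_hz / 1e-3)

for bw in (1, 10, 100, 1e3, 1e6):
    print(f"{bw:>9.0f} Hz : {ktb_dbm(bw):8.1f} dBm")
# 1 Hz gives about -174 dBm; every 10x of bandwidth adds 10 dB of noise power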
 
But you can't dodge the 24dB noise figure limitation of the analyser when you are trying to measure NOISE levels.

We can agree that kTB noise in a 1Hz bandwidth at room temperature is -174dBm/Hz.

So.... if an analyser has a 24dB noise figure at 27MHz then it can be modelled as a receiver with a wall of noise at its input set at -174dBm/Hz + 24dB = -150dBm/Hz. That is -150dBm of noise power inside a 1Hz bandwidth.

If you put a CB amplifier ahead of this with a 6dB noise figure and 18dB gain the amplifier will also produce a wall of noise at -150dBm/Hz.
This is because -174 + 18 + 6 = -150dBm/Hz.

Once the two -150dBm/Hz noise levels meet at the analyser input then the noise level will rise 3dB (i.e. double the noise power) to -147dBm/Hz and this is what the analyser would display if it had a correctly calibrated 1Hz noise marker function. So the 'deaf' analyser gets the measurement of the CB amplifier noise level WRONG by about 3dB.
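
As a rough check of that arithmetic, here is a short Python sketch (illustrative only, using the noise figures quoted above) that power-sums the two equal noise floors:

import math

def power_sum_dbm(a_dbm, b_dbm):
    # noise powers add in milliwatts, not in dB
    return 10 * math.log10(10 ** (a_dbm / 10) + 10 ** (b_dbm / 10))

analyser_floor = -174 + 24        # 24 dB noise figure -> -150 dBm/Hz
amplifier_noise = -174 + 18 + 6   # 18 dB gain + 6 dB NF -> -150 dBm/Hz
print(power_sum_dbm(analyser_floor, amplifier_noise))  # about -147: a 3 dB error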

How are you going to be able to 'fix' this problem by selecting 10Hz bandwidth?

It's already a problem at 1Hz bandwidth.

If you select 10Hz BW then the noise from the CB amplifier will be -140dBm/10Hz and the noise contributed by the analyser will be -140dBm/10Hz.

So the analyser will read -137dBm/10Hz for the combined noise. It's STILL a measurement error of 3dB.

You haven't gained ANYTHING by swapping from 1Hz to 10Hz or vice versa. The analyser will still measure the level of the noise from the CB amplifier with about a 3dB error.

If you go to 100Hz RBW the analyser will display -127dBm/100Hz because (in a 100Hz RBW) the wall of noise at the analyser input is -130dBm/100Hz (because it has a 24dB noise figure) and the noise produced from the CB amplifier will also be -130dBm/100Hz. That pesky 3dB error is still there... :)
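
Extending the same sketch across RBW settings shows the 3dB error is unchanged (a sketch under the same illustrative assumptions as above):

import math

def power_sum_dbm(a_dbm, b_dbm):
    return 10 * math.log10(10 ** (a_dbm / 10) + 10 ** (b_dbm / 10))

for rbw_hz in (1, 10, 100):
    bw_db = 10 * math.log10(rbw_hz)
    analyser = -174 + 24 + bw_db    # analyser's own floor in this RBW
    amplifier = -174 + 24 + bw_db   # amplifier noise (18 dB gain + 6 dB NF) in this RBW
    error_db = power_sum_dbm(analyser, amplifier) - amplifier
    print(f"RBW {rbw_hz:>3} Hz: error = {error_db:.2f} dB")   # 3.01 dB every time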


The other way to look at it is if you inject the same 24dB ENR noise source into a typical SSB CB. Where the deaf analyser can only display a change of 3dB in noise level once the noise source is turned on (even at 10Hz RBW), the CB will show a change in noise level at the speaker of about 15dB despite the fact it has a much wider bandwidth.

i.e. the deaf old analyser thinks the ENR of the noise source is 27dB even when measured in a 10Hz RBW but the (10dB noise figure?) CB shows it is closer to 25dB. Both are wrong (the correct answer should be 24dB) but at least the CB did a better job of measuring the noise level from the noise source because it has a noise figure that is about 14dB lower.
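
A hedged sketch of that comparison (treating both receivers as simple noise-figure models, which ignores detector and bandwidth details):

import math

def floor_rise_db(enr_db, nf_db):
    # rise in displayed noise when a noise source of the given ENR
    # is switched on ahead of a receiver with the given noise figure
    enr = 10 ** (enr_db / 10)
    f = 10 ** (nf_db / 10)
    return 10 * math.log10((enr + f) / f)

print(floor_rise_db(24, 24))  # ~3.0 dB  : the 24 dB NF analyser
print(floor_rise_db(24, 10))  # ~14.2 dB : a roughly 10 dB NF CB receiver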
 
But you can't dodge the 24dB noise figure limitation of the analyser when you are trying to measure NOISE levels.

We can agree that kTB noise in a 1Hz bandwidth at room temperature is -174dBm/Hz.

So.... if an analyser has a 24dB noise figure at 27MHz then it can be modelled as a receiver with a wall of noise at its input set at -174dBm/Hz + 24dB = -150dBm/Hz. That is -150dBm of noise power inside a 1Hz bandwidth.




Have you looked at the noise floor of these types of receivers? It's in the -125dBm range, and goes much higher when you crank up the RF gain. Turn the radio off and the floor goes to about the -130dBm range for mine, depending on RES BW and how much time I want to spend on the scan at that setting (it can be 10 seconds per division at the slowest scan rate). Back on and with the carrier present, I can differentiate the carrier > 70dB above the radio's noise floor at I/F, depending on RF gain and injected signal strength, of course.

What is the best C/N you typically measure with yours, and what was the lowest noise floor you made your measurements at?
 
I do it using a sig gen at the input and a true rms voltmeter at the AF output and for a healthy mk2 Cobra 148GTL-DX I'd expect the noise floor in a typical SSB bandwidth of 2400Hz to be about -131dBm in SSB mode. (= -165dBm/Hz noise floor)

I can also use a NoiseCOM noise source to measure the receiver using the hot/cold method (and the true rms meter again) and this gives a noise figure of about 9dB. This agrees fairly well with the result above. i.e. -174dBm/Hz + 9dB = -165dBm/Hz.
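
For reference, the Y-factor arithmetic behind the hot/cold method can be sketched like this (the 15.2dB rise is an illustrative value, not a figure quoted in this thread):

import math

def noise_figure_db(enr_db, y_factor_db):
    # Y-factor method: NF = ENR - 10*log10(Y - 1)
    y = 10 ** (y_factor_db / 10)
    return enr_db - 10 * math.log10(y - 1)

print(noise_figure_db(24, 15.2))  # ~8.9 dB NF for a 15.2 dB hot/cold rise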

I know you will argue that the analyser can go down to -140dBm in a 10Hz bw (= -150dBm/Hz) but this is only of use if you want to measure a narrow signal like a carrier that can sit inside the 10Hz bandwidth.

When it comes to measuring NOISE then the receiver with the lowest noise figure will always win. In this case the CB wins over the analyser by about 14 to 15dB :)

i.e. a -165dBm/Hz noise floor is better than a -150dBm/Hz noise floor....
 
You are right, we aren't in agreement on the use of the resolution BW to measure low level signals.

This is an excerpt from my 1971 HP text "Electronic Measurements and Instrumentation", chapter 16-8:

"Sensitivity The ability of the swept superheterodyne spectrum
analyzer to measure small signals is determined by its own internally
generated noise. Typical noise figures vary from 25 dB at low frequency
to 40 dB at 12 GHz. The internally generated noise referred to the
spectrum-analyzer input exceeds basic thermal noise by these noise
figures. At room temperature, the thermal-noise power spectral density
4 X 10- 9 mW/MHz, or -114 dBm in a 1-MHz bandwidth. The avail-
able thermal noise power, in watts,

Ph = kTB (16-8-1)

where B = bandwidth of system

k = Boltzmann's constant, 1.38 X 10- 23 W-sec/°K
T = absolute temperature, °K

The noise on the spectrum-analyzer display is that contained only within
the passband of its intermediate-frequency filter. Although the spectrum
analyzer covers a wide frequency range (by sweeping) it is a narrow-band
instrument and therefore very sensitive to continuous-wave signals.
Noise power is proportional to bandwidth, and so the highest sensitivity to
continuous-wave signals is obtained by using the narrowest bandwidths.
Figure 16-27 shows the levels of thermal noise for various bandwidths at
room temperature.

From the noise figure and the thermal noise level in various band-
widths, the ability of the spectrum analyzer to measure small signals can be
determined" (the nomograph).....


Your Cobra has the same sensitivity as the TRC then (0.25 uV for SSB and 0.5 uV for AM). Working into 50 ohms, 0.7uV is equivalent to -110dBm, which is close to receive threshold for a S/N of 10dB. Have you actually measured the noise floor of your radio at I/F with your analyzer, say right before the detector diodes, while it is close to threshold? Also, when you lower your res BW by a factor of 10, does your displayed noise floor go down 10dB? Mine does.

Time to sweep a scan width is proportional to the scan width in Hz divided by the square of the resolution BW in Hz.
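
A rough illustration of that relationship (using the 2.5 sweep-time factor for older analog analysers mentioned later in this thread; the exact constant varies by instrument):

def sweep_time_s(span_hz, rbw_hz, k=2.5):
    # approximate swept-analyser sweep time: k * span / RBW^2
    return k * span_hz / rbw_hz ** 2

print(sweep_time_s(20e3, 300))  # 20 kHz span at 300 Hz RBW -> ~0.6 s
print(sweep_time_s(20e3, 30))   # drop the RBW 10x and sweep time rises 100x (~56 s)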

Anyway, I'm still making measurements, including the AF signal to noise ratio at the speaker at various input levels (with the C/N at the IF), and a look at the 10 MHz LO (PLL) spectral noise, which ends up in the IF chain, where I'm taking the C/N measurements.


The TRC is a good radio, but its limitations include no preselector and a noisy LO. Not having a preselector, like every common (analog) FM tuner in existence does, results in amplification by TR5 of the whole CB band (500 kHz).

Still a nice rig. I sometimes can clearly make out SSB phone from Ireland, England, Germany, France, Spain in the morning hours, with the 1/2 wave vertical dipole, on Ch 38 LSB.
 
Noise power is proportional to bandwidth, and so the highest sensitivity to continuous-wave signals is obtained by using the narrowest bandwidths. Figure 16-27 shows the levels of thermal noise for various bandwidths at room temperature.

From the noise figure and the thermal noise level in various bandwidths, the ability of the spectrum analyzer to measure small signals can be determined" (the nomograph).....

Like I keep telling you, the benefits of the narrow RBW apply to small continuous wave signals, eg a steady carrier that can sit inside the 10Hz RBW.

Noise isn't a continuous wave signal. So reducing the RBW offers no benefit when measuring noise power.

Also, when you lower your res BW by a factor of 10, does your displayed noise floor go down 10dB? Mine does.

Yes it does that on mine too. But the fact you seem to be overlooking is that the noise you are trying to measure ALSO goes down 10dB so there is no 'net' benefit to the lower RBW when measuring the power of the NOISE.

The lower RBW can let you detect tiny cw signals better but if you want to measure the noise power from an amplifier you need to be able to measure 'noise' and NOT continuous wave signals.

The ability to measure low noise levels accurately is dictated by the analyser noise figure. You can't dodge this limitation by messing with lower RBW settings.
 
What about this question-

Your cobra has the same sensitivity as the TRC then (.25 uV for SSB and 0.5 uV for AM). Working to 50 Ohms, 0.7uV is equivalent to -110dBm, which is close to receive threshold for S/N of 10dB.

What is the displayed noise floor of your radio at I/F with your analyzer, say right before the detector diodes, while it is close to threshold, and what res BW did you use? If you don't have any values handy, would you consider making the measurement and posting?

Keep in mind I'm not measuring absolute noise power, only relative differences in C/N and S/N in the pre-detection I/F chain and the post-detection 1 kHz signal.

Another benefit of the lower resolution bandwidth is for measuring closely spaced carriers (say the fundamental carrier and the upper sideband of an AM signal under 5 kHz modulation), having the ability to differentiate between the two clearly, which is not possible with the higher res BW settings. The lower the res bandwidth, the longer the sweep scan time (by the square), but the result is that the relative amplitudes between the two are much easier to determine. The AM or FM modulated carrier is also a sinusoid, in case there was a definition mix-up. Maybe it is called a complex one, but it is continuous and made up of one or multiple related signals.

When measuring harmonic distortion of a test signal with an SA, low res BW settings also allow the relative amplitudes between the fundamental and the related harmonics on either side to be measured, which results in being able to determine a THD in % directly. In the case of a 1 kHz test signal, about a 20 kHz total scan width is needed, and setting the res BW at 30 Hz allows for discerning the relative heights (down to about 70dB below the carrier), resulting in being able to develop THD% values in the 0.05% resolution range.
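
A small sketch of that THD arithmetic (the -70/-75 dBc harmonic levels are made-up illustrative values, not measurements from this thread):

import math

def thd_percent(harmonics_dbc):
    # harmonics_dbc: harmonic levels relative to the fundamental, in dBc
    power_sum = sum(10 ** (dbc / 10) for dbc in harmonics_dbc)
    return 100 * math.sqrt(power_sum)

print(f"{thd_percent([-70, -75]):.3f} %")  # ~0.036 % for these example levels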

For FM mod index settings, it is a requirement to use a low res BW, in order to set the relative amplitudes of the sidebands clearly and arrive at a specific MOD index (FM). The US FM stereo broadcast standard envelope is 240 kHz, at a 15 kHz information signal rate (L/R). This is also a slow scan rate operation.


Thanks
 
What is the displayed noise floor of your radio at I/F with your analyzer, say right before the detector diodes, while it is close to threshold, and what res BW did you use?

If I made the measurement I would expect it to equate back to a noise floor at the input of about -165dBm/Hz.

I always use the 1Hz noise marker function because it corrects for the detector and log amp errors and also the noise bandwidth of whatever RBW I have selected. Otherwise I have to mess about with correction factors.

Note that this would be a much better place to tap into the radio than TP13 because there will be sufficient gain to overcome the poor noise figure of the spectrum analyser without needing a preamp. However, I would be (slightly) concerned about loading the IF circuit and also it may cause feedback/instability.

Why not measure the noise floor using SSB mode and measure at the AF connector with a decent true rms meter? The AF noise won't be flat across the passband but if you only want to make relative measurements then this is a really good method because the true rms meter only needs to operate over a 3dB range (relative) and so the repeatability of the test is down to the accuracy of your signal generator over time.

Another benefit of the lower resolution bandwidth is for measuring closely spaced carriers (say the fundamental carrier and the upper sideband of an AM signal under 5 kHz modulation), having the ability to differentiate between the two clearly, which is not possible with the higher res BW settings. The lower the res bandwidth, the longer the sweep scan time (by the square), but the result is that the relative amplitudes between the two are much easier to determine. The AM modulated carrier is also a sinusoid, in case there was a definition mix-up.

For FM mod index settings, it is a requirement to use a low res BW, in order to set the relative amplitudes of the sidebands clearly and arrive at a specific MOD index (FM). The US FM stereo broadcast standard envelope is 240 kHz, at a 15 kHz information signal rate (L/R). This is also a slow scan rate operation.
Yes but I'm not really bothered or arguing about the above. You are measuring discrete signals in the above tests rather than noise so I don't see a problem :)

It was when you were trying to measure noise levels at TP13 and trying to convince me there was some merit in using a lower RBW to somehow overcome the noise floor of the analyser. i.e. when measuring 'noise' to judge any noise figure change of the amplifier :)

If you look at various Agilent/HP application notes, they will tell you that the external noise signal drops 10dB as well as the analyser noise when you reduce the RBW by a factor of 10. So there is no net benefit when you want to quantify 'noise'.
 
Note that this would be a much better place to tap into the radio than TP13 because there will be sufficient gain to overcome the poor noise figure of the spectrum analyser without needing a preamp. However, I would be (slightly) concerned about loading the IF circuit and also it may cause feedback/instability

Agree, but I have not noted any instability making this measurement. Between the detection diodes and the preceding tank circuit (L7 at 7.8 MHz IF) there are two common-emitter RF amps back to back (TR8 and TR9), so a reduction in Q of L7 is minimal.

Why not measure the noise floor using SSB mode and measure at the AF connector with a decent true rms meter? The AF noise won't be flat across the passband but if you only want to make relative measurements then this is a really good method because the true rms meter only needs to operate over a 3dB range (relative) and so the repeatability of the test is down to the accuracy of your signal generator over time.

Let's say at the speaker (into 8 ohms) the SA measures the floor on either side of the 1 kHz test signal (carrier) to be 0dBm. That would be 88mV of hash around the 1 kHz test signal. If the radio at this point had a signal to noise ratio of 30dB, then the carrier test signal voltage would be 2.8 volts. I haven't made this measurement yet with the SA, so I don't know what the ratios are going to be, but I planned to do this at multiple RF input levels. My voltmeter is an HP 3478, which measures true RMS AC to 300 kHz. I can do this test with the carrier on with the SA, and with the carrier on and then off with the 3478 to get the difference in AC power, so it sounds like a good test to add.
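
A quick sketch of the dBm-to-volts conversion used above (assuming a purely resistive 8 ohm load):

import math

def dbm_to_vrms(level_dbm, load_ohms):
    # RMS voltage for a given power level into a resistive load
    watts = 1e-3 * 10 ** (level_dbm / 10)
    return math.sqrt(watts * load_ohms)

print(dbm_to_vrms(0, 8))    # ~0.089 V : 0 dBm of noise into 8 ohms
print(dbm_to_vrms(30, 8))   # ~2.83 V  : a tone 30 dB above that level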

If you look up in various Agilent/HP application notes they will tell you that the external noise signal drops 10dB as well as the analyser noise when you reduce the RBW by a factor of 10. So there is no net benefit when you want to quantify 'noise' .

Which one are you referring to, and which section?

Thanks-
 
Like I keep telling you, the benefits of the narrow RBW apply to small continuous wave signals, eg a steady carrier that can sit inside the 10Hz RBW.

Noise isn't a continuous wave signal. So reducing the RBW offers no benefit when measuring noise power.


Consider this set of conditions. Set up a very low noise carrier (with no FM or PM or AM on it), with a real BW of 2 Hz peak to peak, and observe it with an SA having the res BW set to 100 Hz. Carrier amplitude: let's make it -80dBm, which is 22uV into 50 ohms, about 20 times the voltage at threshold for intelligible reception of AM on the CB radio. Where to set the scan width? Let's set it to 1 kHz per division (10 kHz total on the display).

Where is the noise floor of the SA within the I/F BW of 100 Hz? Using a 25 dB NF, it's at -130dBm. This means that on each side of this carrier, the floor (within the 100 Hz BW of the I/F) is at -130dBm, and the carrier, at -80dBm, is 50dB higher. Using HP's 2.5 sweep time factor for older analog units, it requires 25 seconds to sweep through the whole 10 kHz scan (1 kHz per div).

Turn the carrier off, and within the 100 Hz BW of the I/F the floor is sitting at -130dBm again.

Now, let's store this information for display and move the 100 Hz window slowly over to another 100Hz bin (or window or aperture) and measure this energy again, giving enough charge time for the detector to respond. Repeat this process from one side of the 10 kHz scan width to the other, taking 25 seconds, and we have a picture of the noise floor at -130dBm and any signal energy present within the scan of interest.
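
For what it's worth, the displayed floor in that scenario can be estimated from kTB plus noise figure plus the bandwidth term (a simplified model that ignores detector and log-amp corrections):

import math

def displayed_floor_dbm(nf_db, rbw_hz):
    # kTB in 1 Hz (-174 dBm) + noise figure + 10*log10(RBW)
    return -174 + nf_db + 10 * math.log10(rbw_hz)

print(displayed_floor_dbm(25, 100))  # ~-129 dBm, close to the -130 dBm cited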
 
If you look at various Agilent/HP application notes, they will tell you that the external noise signal drops 10dB as well as the analyser noise when you reduce the RBW by a factor of 10. So there is no net benefit when you want to quantify 'noise'.

Which one are you referring to, and which section?


It's in pretty much every HP application note on this subject dating back about 40 years. Eg it's even in the old 1974 article you linked me to earlier :)

http://cp.literature.agilent.com/litweb/pdf/5952-0292.pdf

Look at the section on noise measurements, eg the text below, which is on page 10 of the document (page 13 of the pdf?).

If we examine what happens as the spectrum analyzer bandwidth is changed, we will see that the sensitivity for random noise measurements is independent of bandwidth.

For example, we narrow the bandwidth by a factor of 10. The analyzer's internal noise (which is, itself, random noise) is decreased by a factor of 10, or 10 dB. At the same time, the random noise we are measuring also decreases by 10 dB, so the signal-to-noise ratio remains constant.

This is basic physics. You are overlooking the fact that NOISE is random in nature, so when noise 'itself' becomes the signal of interest there is no benefit to changing RBW, as the S/N remains constant, as stated by HP above.


To show this I connected my high power noise source to a power meter. This noise source puts out about half a milliwatt of noise across 0.2 to about 185MHz so it appears as a big chunk of noise on the analyser.

The power meter has a 100kHz to 5.5GHz head fitted and it measures -3.2dBm as the total power of the noise.

The analyser shows the noise to be about 185MHz wide.

185MHz is about 82.7dBHz (i.e. 10 * log(185000000) = 82.7dB)

So the noise 'should' measure out at about (-3.2 - 82.7) = -85.9dBm/Hz on the noise marker function on the analyser if the noise source were totally flat and the analyser response was totally flat.
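
The same back-of-envelope arithmetic in a few lines of Python (assuming the noise source is flat and the measured 185MHz width equals the noise bandwidth):

import math

def noise_density_dbm_per_hz(total_power_dbm, noise_bw_hz):
    # spread the measured total power evenly across the noise bandwidth
    return total_power_dbm - 10 * math.log10(noise_bw_hz)

print(noise_density_dbm_per_hz(-3.2, 185e6))  # about -85.9 dBm/Hz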

You can see in the images below (where I added some video filtering) that the noise measures out at about -86dBm/Hz (close enough!) on the analyser and you can see what happens if I turn down the RBW by a factor of 10.

The noise goes DOWN 10dB on the display even though it is an external signal. In theory it should go down by 10log (10/1) = 10dB.

But you can see by the 1Hz noise marker that the noise still measures out at about -86dBm/Hz.

[Attached: four analyser screenshots showing the displayed noise dropping 10dB when the RBW is reduced by a factor of 10, while the 1Hz noise marker still reads about -86dBm/Hz]
 
I appreciate your posting those pictures.

I could not locate any of the quotes you cited in the HP 150 App Note. I tried everywhere in the text to find your two quotations.

I cut and pasted the text below from the bottom of page 59 and the top of page 60 (pdf page # and document # agree) of your linked document:

Resolution bandwidth also affects signal-to-noise ratio, or sensitivity. The noise generated in the analyzer is random and has a constant amplitude over a wide frequency range. Since the resolution, or IF, bandwidth filters come after the first gain stage, the total noise power that passes through the filters is determined by the width of the filters.

This noise signal is detected and ultimately reaches the display. The random nature of the noise signal causes the displayed level to vary as

10 log (BW2/BW1)

where BW1 = starting resolution bandwidth and BW2 = ending resolution bandwidth.

[Figure 5-1: Reference level remains constant when changing input attenuation.]

So if we change the resolution bandwidth by a factor of 10, the displayed noise level changes by 10 dB, as shown in Figure 5-2.

For continuous wave (CW) signals, we get best signal-to-noise ratio, or best sensitivity, using the minimum resolution bandwidth available in our spectrum analyzer.

A spectrum analyzer displays signal plus noise, and a low signal-to-noise ratio
makes the signal difficult to distinguish. We noted previously that the video
filter can be used to reduce the amplitude fluctuations of noisy signals while
at the same time having no effect on constant signals. Figure 5-3 shows how
the video filter can improve our ability to discern low-level signals. It should
be noted that the video filter does not affect the average noise level and so
does not, by this definition, affect the sensitivity of an analyzer.
In summary, we get best sensitivity for narrowband signals by selecting the minimum resolution bandwidth and minimum input attenuation. These settings give us best signal-to-noise ratio. We can also select minimum video bandwidth to help us see a signal at or close to the noise level. Of course, selecting narrow resolution and video bandwidths does lengthen the sweep time.


Back to you-
 
The stuff you posted in your latest post is still all related to detecting CONTINUOUS WAVE signals.

Noise is not a continuous wave signal! It is random in nature :)


Sorry if you couldn't find the text I quoted. I posted up a link to a more modern version of application note AN-150 in error.

I intended you to look at your own linked AN-150 document here from 1974:

http://cp.literature.agilent.com/litweb/pdf/5952-1147.pdf


Look at page 10 in the noise measuring section and you will find the text I quoted earlier.

This is an earlier version of AN-150. It's the one you linked to yourself so I assumed you actually read it before linking it on here ;)

HP/Agilent release newer versions of this AN-150 document every so often and I'm much more familiar with the modern versions as I have access to the very latest Agilent spectrum analysers (eg PSA and PXA) at my place of work.

If we examine what happens as the spectrum analyzer bandwidth is changed, we will see that the sensitivity for random noise measurements is independent of bandwidth.

For example, we narrow the bandwidth by a factor of 10. The analyzer's internal noise (which is, itself, random noise) is decreased by a factor of 10, or 10 dB. At the same time, the random noise we are measuring also decreases by 10 dB, so the signal-to-noise ratio remains constant.



[Attached image: noise31.gif]
 
Here's some basic physics about the nature of noise.

Thermal noise power can be calculated as noise power = kTB.

T = absolute temperature (kelvin), 290K at room temperature
k = Boltzmann's constant = 1.38 E-23 J/K
B = noise bandwidth (Hz)

So if you change the detection bandwidth B by a factor of 10 then the power at the detector will change by a factor of 10. (because you narrowed the access window that the noise can get through by a factor of 10)

So this noise power to bandwidth B relationship will equally affect both external noise signals and internally generated noise inside the analyser. BASIC PHYSICS, and endorsed by Hewlett Packard since at least 1974.

The S/N ratio of the noise you are trying to measure against the noise inside the analyser will not improve/change with a change to a narrower RBW. This is because both the noise contribution from the analyser and the noise signal of interest BOTH have to pass through the same (narrower) RBW filter so they BOTH go down by the same ratio of 10dB as they BOTH get sliced into a narrower slice of noise by a factor of 10. A slice of noise 10 times narrower will have 10 times LESS power. I hope this helps :)
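
A minimal sketch of that argument (the -140dBm/Hz external noise density is an illustrative value):

import math

def power_sum_dbm(a_dbm, b_dbm):
    return 10 * math.log10(10 ** (a_dbm / 10) + 10 ** (b_dbm / 10))

analyser_nf_db = 24
external_density = -140   # dBm/Hz, illustrative external noise level

for rbw_hz in (1000, 100, 10):
    bw_db = 10 * math.log10(rbw_hz)
    internal = -174 + analyser_nf_db + bw_db  # analyser's own noise in this RBW
    external = external_density + bw_db       # external noise through the same filter
    print(f"RBW {rbw_hz:>5} Hz: S/N = {external - internal:.1f} dB")  # constant 10 dB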

Good luck to you if you still want to prove that modern physics has this simple concept wrong ;)
 
Your last included: "The S/N ratio of the noise you are trying to measure against the noise inside the analyser will not improve/change with a change to a narrower RBW. This is because both the noise contribution from the analyser and the noise signal of interest BOTH have to pass through the same (narrower) RBW filter so they BOTH go down by the same ratio of 10dB as they BOTH get sliced into a narrower slice of noise by a factor of 10. A slice of noise 10 times narrower will have 10 times LESS power. I hope this helps"

This text is directly out of the same app note, so it is in direct conflict with your position.

Resolution bandwidth also affects signal-to-noise ratio, or sensitivity. The noise generated in the analyzer is random and has a constant amplitude over a wide frequency range. Since the resolution, or IF, bandwidth filters come after the first gain stage, the total noise power that passes through the filters is determined by the width of the filters.
I think this is the same app note (150): http://literature.agilent.com/litweb/pdf/5954-9130.pdf

Look on page 20, figures 29a, 29b, and 29c, and see how insufficient res BW has degraded the S/N ratios for the FM broadcast carrier (29c). The note makes a point about this in the caption. I set my FM signal generator's MOD index using the process described in this app note, counting carrier nulls.

Table 11 shows that at the 3rd carrier null, using a modulating frequency of 8.67 kHz, I achieve exactly ±75 kHz frequency deviation (standard for FM broadcasting). To really see the nulls of the carrier, I use a 300 Hz res BW, versus the 1 kHz they use in the later app note 150. My 1971 hardcopy version uses 300 Hz res BW for figure A. Using 300 Hz res BW does not degrade the carrier powers, but the null is closer to zero in the display when adjusting the modulating signal. It is a tradeoff between scan time and resolution of the null (zero energy) relative to the adjacent sidebands.

The power of the carrier is not changed with res BW (better not be, or we have a problem), but with too wide a res BW, the noise floor is too high to make out the carrier null events. Figure 27 shows the carrier headed to the first Bessel null. Mathematically, the null crossing goes below 0 signal and then changes sign (phase).

Aside from the signal (carrier), kTB is a product of equivalent noise temperature (NF), BW, and Boltzmann's constant. Minimizing one of these terms (BW) has a great effect on how much noise power is allowed to be detected by the analyzer's AM detector, along with the signal, as the mixer sweeps through the scan width. So yes, signal to noise improves with lower res BWs.

How would you go about using a spectrum analyzer to set the FM modulation index of a carrier modulated with a sine wave? We used this method in the '80s to set our cmd uplinks for Hughes satellites. Cmd sensitivity verification is sensitive to carrier uplink power and FM deviation. The HP 8566 was always used for this work, on the ground and in orbit. We set the res BW to very low values to get the main carrier null precisely established, since it provided the best measurement of sideband power to noise ratios within the uplink signal, at the point where the Bessel function went to 0 for a specific MOD index.
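
For anyone wanting to reproduce the carrier-null arithmetic, here is a sketch using SciPy's Bessel-function zeros (the 75 kHz deviation is the FM broadcast figure quoted above):

from scipy.special import jn_zeros

nulls = jn_zeros(0, 3)        # first three zeros of J0: 2.405, 5.520, 8.654

deviation_hz = 75e3           # FM broadcast peak deviation
for n, beta in enumerate(nulls, start=1):
    fm = deviation_hz / beta  # modulating frequency putting the carrier at null n
    print(f"carrier null {n}: beta = {beta:.4f}, fm = {fm / 1e3:.2f} kHz")
# the third null lands at ~8.67 kHz, matching the Table 11 value cited above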
 
