I have a program to write that is very confusing. I have to write an 80x86 assembly language program that implements an adaptive digital signal processor that acts as a linear amplifier with automatic gain control.
I have a Tektronix CFG 250 function generator providing the signal, and an A/D-D/A converter board so the computer can read the signal in digital form and write the result back out.
I have a breadboard with DIP switches that can be set to 1's or 0's to set the maximum amplitude of the wave.
The output signal has to look like the original signal, just scaled by the amplification factor (set with the DIP switches).
The wave is viewed on an oscilloscope, and I am to come up with the amplification factor and the frequency of the wave.
The amplitude and frequency of the signal may change during operation.
I understand reading samples in from the A/D converter and writing them out through the D/A converter, but I am not sure how to work out the amplification factor or how to find the frequency.
Any help on this would be appreciated.