I know that the bandwidth of a channel/medium is the difference between the highest and lowest frequencies that can be sent across that channel, and I understand this for analog signals whose frequency varies: if you sing a low note, the signal has a low frequency, and if you sing a high-pitched note, it has a high frequency. But what I don't get is how bandwidth applies to digital signals. When would the signal have a low frequency, and when would it have a high frequency? Also, here is a diagram:
Because I don't understand bandwidth and frequencies in a digital signal, I don't understand what causes the bandwidth dependencies in the third row. In particular, I don't see how the width, rise time, or fall time of a pulse could determine a frequency in the signal.
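To make my confusion concrete, I tried a quick numpy experiment (my own sketch, not from the diagram): I built two pulses of the same width, one with sharp edges and one with slow edges, took the FFT of each, and compared how much energy sits above an arbitrary cutoff frequency. The result does suggest that sharper edges mean more high-frequency content, but I still don't understand *why* edge speed translates into frequency.

```python
import numpy as np

def trapezoid_pulse(n, width, rise):
    """A single pulse in an n-sample record: linear rising edge of `rise`
    samples, flat top of `width` samples, then a matching falling edge."""
    p = np.zeros(n)
    start = n // 4
    p[start:start + rise] = np.linspace(0.0, 1.0, rise)          # rising edge
    p[start + rise:start + rise + width] = 1.0                    # flat top
    p[start + rise + width:start + 2 * rise + width] = np.linspace(1.0, 0.0, rise)  # falling edge
    return p

def high_freq_energy(x, cutoff_bin):
    """Total spectral energy above the given FFT bin."""
    spec = np.abs(np.fft.rfft(x))
    return np.sum(spec[cutoff_bin:] ** 2)

n = 4096
fast = trapezoid_pulse(n, width=200, rise=2)    # nearly square edges
slow = trapezoid_pulse(n, width=200, rise=100)  # gently sloped edges

# Sharper edges leave much more energy at high frequencies.
print(high_freq_energy(fast, 100) > high_freq_energy(slow, 100))  # True
```

The cutoff bin of 100 here is arbitrary; the comparison comes out the same way for any cutoff well above the main lobe of the pulse's spectrum.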