Winfield said:
Nah. First, there's nothing wrong with running MOSFETs in the
subthreshold region; they work fine (it's the typical SPICE
subthreshold MOSFET model that fails to work properly). Second,
the transconductance/current ratio actually improves for MOSFETs
in the subthreshold region, approaching BJTs in many cases, so
that argument is wrong. Third, typical leakage for these small
MOSFETs is in the low pA region, not 100s of nA, for drain voltages
(Vds) below 80% of Vdss, which is 30 to 40V for these parts.
Fourth, any Ids leakage current is part of the current-source
output, measured by the servo, so contributes NO error anyway,
unless the leakage exceeds the desired current. So 100nA would
be fine in a 1uA current source.
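[Winfield's fourth point can be sanity-checked numerically. The sketch below is a minimal proportional-servo model with assumed numbers (1uA setpoint, a steady 100nA leakage); it is an illustration, not anyone's actual circuit.]

```python
# Minimal sketch of the servo argument: the loop measures the *total*
# output current, leakage included, so a steady DC leakage is absorbed
# into the operating point rather than appearing as output error.
# All numbers here are assumptions for illustration.

SETPOINT = 1e-6     # desired source current: 1 uA
LEAKAGE  = 100e-9   # assumed steady Ids leakage: 100 nA

def servo_step(channel_current, gain=0.5):
    """One proportional-control step: trim the channel current so that
    (channel + leakage) converges on the setpoint."""
    total = channel_current + LEAKAGE   # what the servo actually measures
    return channel_current + gain * (SETPOINT - total)

i_channel = 0.0
for _ in range(60):
    i_channel = servo_step(i_channel)

total = i_channel + LEAKAGE
print(f"channel = {i_channel*1e9:.1f} nA, total = {total*1e9:.1f} nA")
# channel = 900.0 nA, total = 1000.0 nA
# The loop settles with the channel carrying 900 nA so the total is
# 1 uA: the 100 nA DC leakage contributes no error to the output.
```

Of course this only holds for leakage that is steady; a fluctuating leakage component inside the loop bandwidth is another matter, which is where the reply below takes issue.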
I don't believe that for a minute; this MOSFET is leaky as a sieve. They
specify Vgs(th) at Id=1mA, which is much higher than most; the gate-body
leakage Igss is bounded by 10nA; and Idss at Vds=25V is bounded by
0.5uA. One aspect you have not considered is that these numbers are on
the order of 100x to 5000x the specified error band of 100pA at 1uA
source current. These leakage currents will all have components, in some
unknown proportion, whose fluctuations are not governed by the usual
shot- and Johnson-noise statistics, so it is unreasonable to dismiss a
complete unknown as being 40 to 74 dB down from the mean when you have
so little information. If the regulated current is in fact exhibiting
the flicker described, then it has to be related to this ratio of
allowed variation to average leakage magnitude; you cannot expect
reliable performance with arbitrarily small errors.
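[The ratio arithmetic above is easy to check. This back-of-envelope sketch uses only the bounds quoted in the thread (Igss ≤ 10nA, Idss ≤ 0.5uA at Vds=25V, 100pA error band at 1uA); the dB figures use 20·log10 since these are current ratios.]

```python
from math import log10

# Spec bounds quoted in the thread (assumptions taken from the post):
ERROR_BAND = 100e-12   # 100 pA allowed error at 1 uA source current
IGSS_MAX   = 10e-9     # gate-body leakage bound: 10 nA
IDSS_MAX   = 0.5e-6    # drain leakage bound at Vds = 25 V: 0.5 uA

def ratio_db(leak, err):
    """Leakage-to-error-band ratio, and the same ratio in dB
    (20*log10, the convention for current/amplitude ratios)."""
    r = leak / err
    return r, 20 * log10(r)

for name, leak in [("Igss", IGSS_MAX), ("Idss", IDSS_MAX)]:
    r, db = ratio_db(leak, ERROR_BAND)
    print(f"{name}: {r:.0f}x the error band ({db:.0f} dB)")
# Igss: 100x the error band (40 dB)
# Idss: 5000x the error band (74 dB)
# Even a ~1% drift in a leakage component the size of the Idss bound
# would swamp the 100 pA error budget on its own.
```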