Abstract
In recent years, several publications have reported reductions in the low-frequency noise of MOSFETs under large-signal excitation. These observations are important for modern analog and RF circuits. The low-frequency noise models classically used for circuit simulation cannot explain this effect. In this paper, we extend the classical approach to non-equilibrium biasing conditions and give a device-physics-based explanation for the noise amplitude reduction. In addition, we present measurements that are in good agreement with the derived model, and we suggest approaches to implement the model within standard compact models.
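The effect summarized above can be illustrated with a minimal Monte Carlo sketch. The assumption here, which is a simplified stand-in and not the paper's actual derivation, is a single two-state oxide trap producing a random telegraph signal: under constant bias the trap captures and emits carriers stochastically, while under switched bias the trap is (hypothetically) forced to empty during the off half-period, resetting its occupancy and thereby suppressing the low-frequency occupancy fluctuations. All names and parameters below are illustrative.

```python
import random

def simulate_rts(n_steps, p_capture, p_emit, switched=False, period=100, seed=0):
    """Simulate one two-state trap (random telegraph signal).

    Hypothetical model: under switched bias the trap is forced to emit
    during the off half-period, resetting its occupancy each cycle.
    Returns a list of occupancy samples (0 = empty, 1 = filled).
    """
    rng = random.Random(seed)
    occupancy = []
    state = 0
    for t in range(n_steps):
        if switched and (t % period) >= period // 2:
            state = 0  # off-phase: trap assumed to empty immediately
        elif state == 0 and rng.random() < p_capture:
            state = 1  # carrier captured
        elif state == 1 and rng.random() < p_emit:
            state = 0  # carrier emitted
        occupancy.append(state)
    return occupancy

def variance(samples):
    """Sample variance of the occupancy; a proxy for LF noise power."""
    mean = sum(samples) / len(samples)
    return sum((x - mean) ** 2 for x in samples) / len(samples)

var_const = variance(simulate_rts(200_000, 0.01, 0.01))
var_switched = variance(simulate_rts(200_000, 0.01, 0.01, switched=True))
print(f"constant bias occupancy variance: {var_const:.3f}")
print(f"switched bias occupancy variance: {var_switched:.3f}")
```

In this toy model the occupancy variance under switched bias comes out well below the constant-bias value, which is the qualitative trend the abstract describes; the quantitative behavior of course depends on the trap time constants relative to the switching period, which is exactly what a non-equilibrium model must capture.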
| Original language | English |
| --- | --- |
| Pages (from-to) | 668-673 |
| Number of pages | 6 |
| Journal | Solid-State Electronics |
| Volume | 50 |
| Issue number | 4 |
| DOIs | |
| State | Published - Apr 2006 |
| Externally published | Yes |