Electron emission from a cathode surface produced by an applied field E is enhanced by projections on the emitter surface, which cause a local increase in E. The nature of this enhancement factor μ(z), which is a function of the distance z from the cathode, is discussed more fully than hitherto, and its magnitude is calculated for certain idealized but realistic geometries. Although such a factor may be large at the surface (z = 0), it decreases rapidly as z increases, so that the mean field magnification μ̄(z), which is the quantity required in the Schottky thermionic-emission theory, is unlikely to exceed 2 and is probably near unity even for fields as great as 5×10⁵ V/cm. This means that the Fowler-Nordheim theory frequently used to explain emission results at these fields is not applicable and that a Schottky theory should be used instead.
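As an illustration of such an enhancement factor (a standard electrostatics example, not a geometry taken from this paper), consider a hemispherical boss of radius a on a plane cathode in an applied field E₀. On the axis of the boss,

```latex
% Axial field above a hemispherical boss of radius a (illustrative geometry):
E(z) = E_0\left[1 + \frac{2a^3}{(a+z)^3}\right],
\qquad
\mu(z) = \frac{E(z)}{E_0} = 1 + \frac{2a^3}{(a+z)^3},
\qquad
\mu(0) = 3 .

% Mean magnification over the region 0..z (the quantity entering
% the Schottky barrier calculation):
\bar{\mu}(z) = \frac{1}{z}\int_0^z \mu(z')\,dz'
             = 1 + \frac{a}{z}\left[1 - \frac{a^2}{(a+z)^2}\right].
```

For z of only a few times a, μ̄(z) is already close to unity, consistent with the statement that the mean magnification relevant to Schottky emission is unlikely to exceed 2 even though the surface value is larger.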
It is also shown that μ̄(z) is itself field dependent and produces departures from the Schottky law in such a way that the slope of the current vs E½
plot increases rapidly for E < 10³ V/cm, remains approximately constant between 10³ and about 10⁵ V/cm, and then increases again at still higher fields. Similar reasoning shows that deviations from the Fowler-Nordheim law can also be expected for fields > 10⁶ V/cm.
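For reference, the two emission laws discussed above have the standard forms (SI units; A, B, C are the usual emission constants and φ the work function, with the magnified field μ̄E replacing the applied field at the barrier):

```latex
% Schottky (field-assisted thermionic) emission:
J_S = A T^2 \exp\!\left[-\frac{\phi - \Delta\phi}{kT}\right],
\qquad
\Delta\phi = \sqrt{\frac{e^3 E}{4\pi\varepsilon_0}} ,
% so \log J_S is linear in E^{1/2} only while \bar{\mu} is
% independent of the field.

% Fowler-Nordheim (cold field) emission:
J_{FN} = \frac{C E^2}{\phi}\,
         \exp\!\left(-\frac{B\,\phi^{3/2}}{E}\right).
```

A field-dependent μ̄ makes the barrier field a nonlinear function of the applied field, which bends the log-current vs E½ plot in the manner described.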
Finally, the assumption of a constant emitting area for rough cathodes is also shown to produce departures from the emission laws, such that the current density estimated at high fields is appreciably less than the expected values.
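The area effect can be sketched as follows (the symbols A₀ and A_eff are introduced here for illustration, not taken from the paper): if the measured current I actually flows from projections of total effective area A_eff much smaller than the assumed emitting area A₀, the inferred current density is

```latex
J_{\text{est}} = \frac{I}{A_0}
               = \frac{A_{\text{eff}}}{A_0}\,J_{\text{true}}
               \ll J_{\text{true}},
\qquad A_{\text{eff}} \ll A_0 ,
```

so the current density estimated at high fields falls short of the value the emission laws would predict for the true emitting spots.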
These deductions help to explain some of the anomalous results frequently found in conduction experiments in gases and liquids.