Detailed Summary: Quantisation of Charge
The quantisation of charge refers to the principle that electric charge can only exist in discrete amounts that are whole-number multiples of a basic unit of charge, denoted by 'e'. In formal terms, this is expressed as:
q = n * e
where 'n' is any integer (positive, negative, or zero).
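As a quick illustration of this relation, the following sketch (in Python, with illustrative function names) converts between an integer count of elementary charges and the corresponding total charge. It assumes SI units and uses the SI-defined value of e.

```python
# Minimal sketch of q = n * e (SI units assumed; function names are illustrative).
E = 1.602176634e-19  # elementary charge e in coulombs (exact in the 2019 SI)

def charge_from_count(n: int) -> float:
    """Total charge q = n * e for an integer n (n < 0 for an excess of electrons)."""
    return n * E

def nearest_integer_multiple(q: float) -> int:
    """Nearest integer n such that q is approximately n * e."""
    return round(q / E)

print(charge_from_count(-3))               # charge of three electrons: about -4.8e-19 C
print(nearest_integer_multiple(-4.8e-19))  # recovers n = -3
```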
The fundamental unit of charge is the magnitude of the charge carried by elementary particles: the electron carries a charge of −e (approximately −1.602 × 10⁻¹⁹ coulombs), and the proton carries +e (approximately +1.602 × 10⁻¹⁹ coulombs). This section emphasizes that, however large the macroscopic charges we typically work with, they remain quantised at the fundamental level: the charges we observe in everyday phenomena are the cumulative effect of a vast number of elementary charges.
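A rough worked example makes that "vast number" concrete. The sketch below estimates how many electrons are needed to carry a modest macroscopic charge; the one-microcoulomb figure is purely illustrative, and only the value of e comes from the text.

```python
# Rough estimate of how many electrons make up a macroscopic charge.
# The 1 microcoulomb value is illustrative; only e comes from the text above.
E = 1.602e-19        # approximate elementary charge in coulombs
q = -1e-6            # a -1 microcoulomb charge (illustrative)

n = round(abs(q) / E)
print(f"about {n:.2e} electrons carry {q} C")   # roughly 6.24e12 electrons
```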
This principle of quantisation was first suggested by Faraday's electrolysis experiments and was later experimentally validated by Robert Millikan in 1912 through his oil-drop experiment. In practice, charge quantisation also governs how charge is transferred in physical interactions: charge moves in integer multiples of e, without any charge being created or destroyed. For large systems, which involve an enormous number of elementary charges, quantisation can often be ignored and charge treated as if it varied continuously, because a change of one elementary charge is negligible compared with the total.
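To see why the continuous treatment is justified, one can compare the step size e with a typical macroscopic charge. The sketch below uses an illustrative one-microcoulomb charge as the comparison point.

```python
# Why quantisation is invisible macroscopically: one elementary charge is a
# vanishingly small fraction of a typical charge. The 1 microcoulomb is illustrative.
E = 1.602e-19        # approximate elementary charge in coulombs
q = 1e-6             # a 1 microcoulomb charge (illustrative)

relative_step = E / q
print(f"one electron changes q by a fractional amount of about {relative_step:.1e}")
# roughly 1.6e-13, so charge behaves as if it were continuous at this scale
```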