Listen to a student-teacher conversation explaining the topic in a relatable way.
Today, we’re diving into the quantisation of electric charge. Can anyone tell me what it means to say that charge is quantised?
Does it mean charge can only exist in certain amounts, like whole numbers?
Exactly! We express this as q = n * e, where 'n' is an integer and 'e' is the basic unit of charge, about 1.6 x 10⁻¹⁹ coulombs. This means every charge is a multiple of this basic unit.
Why is it important to know that? Can’t charge just be any random value?
Good question! Understanding this helps us comprehend atomic and molecular interactions. Charges are always conserved and only transferred in whole units. This is crucial in many physical phenomena.
Let’s remember: ‘Count on Charge’ for quantised charges!
Can anyone summarize what we’ve learned so far?
Charge is quantised and can only exist in whole-number multiples of the elementary charge 'e'.
Exactly! Well done!
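The relation q = n × e from this conversation can be sketched in a few lines of Python. This is an illustrative check, not part of the lesson; the function name and sample charge are made up for the example.

```python
E = 1.602e-19  # approximate elementary charge, in coulombs

def electrons_transferred(q):
    """Return the nearest integer n such that q is approximately n * E."""
    return round(q / E)

# A body carrying -3.204e-19 C holds two extra electron charges:
n = electrons_transferred(-3.204e-19)
print(n)  # -2
```

Because n must be an integer, any physically realisable charge divided by e comes out (to within measurement error) as a whole number.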
Let’s discuss the historical scientists who contributed to this concept. Who can tell me about Faraday's role in this?
He was the first to suggest, through his experiments on electrolysis, that charge is quantised?
Correct! And the experimental proof came from Robert Millikan's oil-drop experiment. Can anyone summarize what Millikan did?
He used tiny oil droplets to measure the charge of electrons, confirming they are quantised!
Just right! Remember Millikan’s work with ‘Oily Electrons’ as a memorable tip for quantised charges.
What does this quantisation imply about our understanding of charge in physics?
It means we can't create or destroy charge; we can only transfer it in units of 'e'.
Excellent! Each contribution led to today’s understanding of charge in physics.
Now, let's connect quantisation of charge to real-world phenomena. Can anyone think of where this concept is applied?
In circuits, where charge must flow as discrete electrons?
That's a great example! Every electric current is just the sum of many charged particles moving.
So this means if I have a large charge, it’s made up of so many electron charges then?
Exactly! In bulk, it appears continuous but at the fundamental level, it’s made of quantised charges.
Let’s remember: ‘Charge is Counted, Not Infinite’ as a memory aid for this connection.
To finish, why is understanding charge quantisation crucial for physics?
It helps us understand everything from the atomic scale to circuits and beyond!
Perfect conclusion, everyone!
Read a summary of the section's main ideas.
This section explores the quantisation of electric charge, detailing that all free charges are integral multiples of a basic unit of charge denoted by 'e'. Historically, the concept was introduced by Faraday and experimentally confirmed by Millikan.
The quantisation of charge refers to the principle that electric charge can only exist in discrete amounts that are whole-number multiples of a basic unit of charge, denoted by 'e'. In formal terms, this is expressed as:
q = n * e
where 'n' is any integer (positive or negative).
The fundamental unit of charge is derived from the charge of elementary particles: the electron carries a charge of '−e' (approximately -1.602 x 10⁻¹⁹ coulombs), and the proton carries 'e' (approximately +1.602 x 10⁻¹⁹ coulombs). This section emphasizes that, despite the large macroscopic charges we typically work with, their fundamental nature remains quantised. Thus, the charges we observe in everyday phenomena are actually due to the cumulative effect of a vast number of elementary charges.
This principle of quantisation was initially suggested by Faraday through his electrolysis experiments and was experimentally validated by Robert Millikan in 1912 through his oil-drop experiment. A practical implication is that charge is transferred in physical interactions only in whole units of 'e', without ever being created or destroyed. For larger systems, where enormous numbers of elementary charges are involved, the quantisation can often be ignored, since the charge appears to vary continuously at that scale.
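The statement q = n * e can be turned into a small numerical test of whether a measured charge is (to within tolerance) an integer multiple of e. This sketch is illustrative only; the tolerance and function name are assumptions, not part of the text.

```python
E = 1.602e-19  # approximate elementary charge, in coulombs

def is_quantised(q, tol=1e-3):
    """True if q lies within tol*E of some integer multiple of E."""
    n = round(q / E)
    return abs(q - n * E) <= tol * E

print(is_quantised(4.806e-19))  # True: exactly 3e
print(is_quantised(2.4e-19))    # False: about 1.5e, not allowed
```

This mirrors the logic of Millikan's analysis: every observed droplet charge fit an integer multiple of one fixed value, while fractional multiples never appeared.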
Experimentally it is established that all free charges are integral multiples of a basic unit of charge denoted by e. Thus charge q on a body is always given by
q = ne
where n is any integer, positive or negative. This basic unit of charge is the charge that an electron or proton carries. By convention, the charge on an electron is taken to be negative; therefore charge on an electron is written as -e and that on a proton as +e.
Quantisation of charge means that all electric charges are composed of discrete packets or units of charge. The smallest of these packets is the charge carried by an electron (about −1.6 × 10⁻¹⁹ coulombs). Therefore, any charge can be expressed as an integer multiple of this base unit. For example, a body carrying 4 units of charge has charge +4e or −4e, depending on its sign. This principle illustrates that charge can't be split into smaller bits beyond these fundamental units.
Think of charge quantisation like the currency in a piggy bank. Just as you can't have half of a dime or quarter, but only whole units of coins, you can't have a fractional amount of electric charge. Each electron or proton can be seen as a single coin. You collect these coins (charges) to total what you have in your piggy bank (the overall charge).
The fact that electric charge is always an integral multiple of e is termed as quantisation of charge. There are a large number of situations in physics where certain physical quantities are quantised. The quantisation of charge was first suggested by the experimental laws of electrolysis discovered by English experimentalist Faraday. It was experimentally demonstrated by Millikan in 1912.
The concept of quantisation of charge was largely brought to light through experiments. Faraday's investigations into electrolysis showed that the amount of substance deposited at electrodes was proportional to the total electric charge passed through the electrolyte. This suggested that charge must come in discrete amounts. Millikan's oil drop experiment was pivotal in accurately measuring the charge of an electron, affirming that charges appear in integer multiples of this fundamental value.
Imagine filling a jar with marbles. You can only add whole marbles – you can't add half a marble. Each marble represents a unit of charge. No matter what you do, the total number of marbles in the jar can only ever be a whole number, mirroring how charges are always whole multiples of the electron's charge.
In the International System (SI) of Units, a unit of charge is called a coulomb and is denoted by the symbol C. A coulomb is defined in terms of the unit of electric current, which you are going to learn about in a subsequent chapter. In terms of this definition, one coulomb is the charge flowing through a wire in 1 s if the current is 1 A (ampere). In this system, the value of the basic unit of charge is e = 1.602192 × 10⁻¹⁹ C.
The coulomb (C) is the standard unit of electric charge in the SI system. One coulomb is defined as the quantity of charge that passes through a wire when a current of one ampere flows for one second. This makes the coulomb a very large unit compared to the basic unit of charge, the charge of a single electron, which is approximately 1.6 × 10⁻¹⁹ C.
Think of a coulomb like a full bucket of water, while the charge of an electron is like a single drop of water. The bucket represents a large amount of charge (1 C), while the drop represents the smallest 'building block' of charge you can have (1 e). When dealing with practical applications, we often reference the larger bucket measurements, even though everything is actually built from tiny drops.
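The bucket-versus-drop comparison can be made quantitative: dividing one coulomb by the elementary charge gives the number of "drops" in the "bucket". A minimal sketch (the variable names are illustrative):

```python
E = 1.602e-19  # charge of one electron, in coulombs (approximate)

# How many elementary charges make up one coulomb?
electrons_per_coulomb = 1.0 / E
print(f"{electrons_per_coulomb:.3e}")  # about 6.242e+18
```

Roughly 6 × 10¹⁸ elementary charges per coulomb is why macroscopic charge looks continuous even though it is built from discrete units.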
Thus, if protons and electrons are the only basic charges in the universe, all the observable charges have to be integral multiples of e. If a body contains n1 electrons and n2 protons, the total amount of charge on the body is n2 × e + n1 × (−e) = (n2 − n1)e. Since n1 and n2 are integers, their difference is also an integer.
This section highlights that all charges in the universe can be described as sums or differences of the fundamental units of charge (the charge of an electron or proton). If you were to count the total number of these particles in a body, the resultant charge would still be in multiples of e. If you have more electrons than protons, the net charge is negative, and vice versa.
Imagine you’re keeping track of your marbles again. If you have 10 blue marbles (electrons) and 7 red marbles (protons), you are left with a surplus of 3 blue marbles (a net negative charge). Just like with charges, you can only have whole-number amounts; you can't have 2.5 of a marble!
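The marble bookkeeping above is exactly the formula (n2 − n1)e. A short illustrative sketch (function name chosen for the example, not from the text):

```python
E = 1.602e-19  # approximate elementary charge, in coulombs

def net_charge(n_electrons, n_protons):
    """Total charge: n_protons * (+E) + n_electrons * (-E)."""
    return (n_protons - n_electrons) * E

q = net_charge(10, 7)  # 10 electrons, 7 protons: 3 excess electrons
print(q < 0)           # True: net negative charge of magnitude 3E
```

Whatever the counts, the difference of two integers is an integer, so the net charge is always a whole multiple of e.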
The step size e is, however, very small because at the macroscopic level we deal with charges of a few µC. At this scale the fact that the charge of a body can increase or decrease only in units of e is not visible. In this respect, the grainy nature of charge is lost and it appears to be continuous.
At a large scale, the fundamental nature of charges being discrete becomes less noticeable because the numbers become so large. Even though every charge is composed of these minuscule units, when you aggregate a lot of them together, they appear to form a continuous surface, similar to how individual sand grains can create a smooth beach when collected in large quantities.
Think of it like looking at a pixelated image on a screen. If you zoom in far enough, you can see the individual pixels. But when you step back, the image looks continuous and smooth. Charges act the same way — on a large scale, they group together and seem like they form a continuous charge distribution.
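How invisible the graininess is can be estimated directly: for a macroscopic charge of 1 µC, adding or removing one electron changes the total by a minute fraction. A hedged sketch, with the sample charge chosen to match the µC scale mentioned above:

```python
E = 1.602e-19  # approximate elementary charge, in coulombs
q = 1e-6       # a macroscopic charge of 1 microcoulomb

n_units = q / E  # number of elementary charges making up q
step = E / q     # fractional change when one more e is added
print(f"{n_units:.1e} units, fractional step {step:.1e}")
```

With about 10¹³ elementary charges present, a single-unit step of order 10⁻¹³ is far below anything measurable at that scale, which is why charge behaves as if continuous.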
Learn essential terms and foundational ideas that form the basis of the topic.
Key Concepts
Quantisation of charge: Charges can only exist as whole multiples of a fundamental unit.
Elementary charge (e): The smallest measurable unit of charge.
Conservation of charge: Total charge within an isolated system remains constant.
See how the concepts apply in real-world scenarios to understand their practical implications.
An example of quantisation in physics: just as charge comes only in whole multiples of 'e', electrons in an atom do not exist at arbitrary energies but only in specific quantised energy levels.
When two objects are rubbed together, electrons are transferred in whole units of 'e', causing one object to become negatively charged and the other positively charged.
Use mnemonics, acronyms, or visual cues to help remember key information more easily.
Quantised charges, so neat and discrete, neat and discrete, like stepping to a beat.
Once, in a charge land, everything was continuous until a wise old wizard said, 'Only whole units can live here!' Thus, every charge became quantised, creating harmony.
Remember 'CDE': Charge is Discrete, Everything counts as integer multiples.
Review key concepts and term definitions with flashcards.
Term: Quantisation
Definition:
The concept that charge can only exist in discrete quantities, specifically whole-number multiples of a fundamental unit 'e'.
Term: Elementary Charge (e)
Definition:
The smallest unit of charge, approximately 1.602 x 10⁻¹⁹ coulombs, carried by a single electron or proton.
Term: Coulomb (C)
Definition:
The SI unit of electric charge, defined as the amount of charge transferred by a constant current of one ampere in one second.