Ionization Enthalpy
Ionization enthalpy (ΔᵢH) is the energy required to remove an electron from an isolated gaseous atom in its ground state. The process can be represented as:
X(g) → X⁺(g) + e⁻
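As a concrete illustration, the commonly tabulated first ionization enthalpy of sodium is about 496 kJ/mol: Na(g) → Na⁺(g) + e⁻.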
This enthalpy change is expressed in kJ/mol and indicates how readily an atom can lose an electron, which is crucial for understanding its reactivity. The section covers several key points about ionization enthalpy:
- Trends in Ionization Enthalpy: Ionization enthalpy generally increases across a period (left to right) because the growing nuclear charge pulls the outer electrons more strongly toward the nucleus. It decreases down a group because the outermost electrons lie farther from the nucleus and are shielded more effectively by the inner-shell electrons.
- Factors Influencing Ionization Enthalpy: The key factors are the effective nuclear charge and electron shielding. Effective nuclear charge increases across a period but remains roughly constant down a group, which accounts for the opposing trends described above.
- Exceptions in Trends: Some steps in the trend are anomalous because of subshell configuration and electron-electron repulsion. The first ionization enthalpy of boron is lower than that of beryllium because boron loses a higher-energy, more shielded 2p electron, whereas beryllium loses a 2s electron. Similarly, oxygen's first ionization enthalpy is lower than nitrogen's because removing a paired 2p electron from oxygen relieves electron-electron repulsion, while nitrogen's half-filled 2p³ configuration is comparatively stable. Both anomalies are visible in the sketch after this list.
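To make the period-2 trend and its two exceptions concrete, here is a minimal Python sketch using commonly tabulated first ionization enthalpies (values in kJ/mol, rounded; exact figures vary slightly between sources):

```python
# First ionization enthalpies across period 2 (kJ/mol, rounded
# from commonly tabulated values; sources differ slightly).
ionization_enthalpy = {
    "Li": 520, "Be": 899, "B": 801, "C": 1086,
    "N": 1402, "O": 1314, "F": 1681, "Ne": 2081,
}

elements = list(ionization_enthalpy)
for prev, curr in zip(elements, elements[1:]):
    change = ionization_enthalpy[curr] - ionization_enthalpy[prev]
    # A drop marks an exception to the general increase across the period.
    note = "  <-- anomaly" if change < 0 else ""
    print(f"{prev} -> {curr}: {change:+d} kJ/mol{note}")
```

Running this flags exactly the Be → B and N → O dips discussed above; every other step shows the expected increase across the period.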
Understanding these trends is essential for predicting an element's reactivity and is foundational for further studies in atomic structure and chemical bonding.