In X-ray measurements of thin films, XRR and XRD fringes are often used to assess film quality, and they can also be used to measure the film thickness. An interesting observation is that the angular spacings of the XRR fringes and the XRD fringes roughly satisfy:
ΔθXRR = 2 ΔθXRD (1)
The reason has to do with the slightly different origins of the two kinds of fringes.
The XRR fringes (often called Kiessig fringes) come directly from the interference between the reflection from the film surface and the reflection from the film/substrate interface. Constructive interference requires 2t sin(θXRR) = mλ with integer m; taking the difference between adjacent orders (Δm = 1) gives the fringe spacing:
2ΔθXRR = λ/[cos(θXRR)t], (2)
where t is the film thickness.
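As a quick illustration, eq. (2) can be inverted to get the thickness from a measured XRR fringe spacing. The numbers below (Cu Kα wavelength, fringe position, and spacing) are made up for the example:

```python
import math

# Illustrative numbers only: Cu K-alpha wavelength and a hypothetical
# XRR fringe spacing read off a reflectivity scan near theta = 1 degree.
wavelength = 1.5406e-10                 # lambda in meters (Cu K-alpha)
theta_xrr = math.radians(1.0)           # angle where the fringes are measured
delta_theta_xrr = math.radians(0.022)   # measured fringe spacing, in radians

# Invert eq. (2), 2*dTheta_XRR = lambda / [cos(theta_XRR) * t], for t:
t = wavelength / (2 * delta_theta_xrr * math.cos(theta_xrr))
print(f"film thickness t = {t * 1e9:.1f} nm")
```

At the small angles typical of XRR, cos(θXRR) ≈ 1, so t ≈ λ/(2ΔθXRR) to a good approximation.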
The XRD fringes (Laue oscillations), on the other hand, come from the local minima of the interference between the reflections from the individual atomic planes. In this case, the diffracted intensity is:
I ∝ sin²[2π·2N sin(θXRD)d/λ] / sin²[2π·2 sin(θXRD)d/λ], where d is the spacing between the atomic planes and N is the total number of atomic planes.
One can see that when
2sin(θXRD)d/λ = m, where m is an integer,
the intensity has a maximum; this is the usual Bragg diffraction condition.
In addition, when
2N sin(θXRD)d/λ = m/2, or 2 sin(θXRD)d/λ = m/(2N), (3)
the intensity has a minimum (provided m is not a multiple of N, in which case the denominator also vanishes and one recovers a maximum instead). These minima are the origin of the fringes around the Bragg peaks.
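Condition (3) can be checked numerically. Writing x = 2 sin(θXRD)d/λ and picking an assumed plane count N for illustration, the zeros of the interference function above should sit at x = m/(2N), with finite intensity halfway between them (squaring the ratio, as one does for the intensity, does not move the zeros):

```python
import math

N = 40  # assumed number of atomic planes, for illustration

def intensity(x):
    # Interference function from the text, squared; x = 2*sin(theta)*d/lambda.
    return (math.sin(2 * math.pi * N * x) / math.sin(2 * math.pi * x)) ** 2

# Predicted minima: x = m/(2N), m an integer not a multiple of N.
# Check a few minima just above the Bragg peak at x = 1 (where m = 2N),
# and compare with the points halfway between adjacent minima.
for k in (1, 2, 3):
    x_min = (2 * N + k) / (2 * N)        # m = 2N + k
    x_mid = (2 * N + k + 0.5) / (2 * N)  # halfway to the next minimum
    assert intensity(x_min) < 1e-10      # essentially zero at the minimum
    assert intensity(x_mid) > 1.0        # finite in between (subsidiary maximum)

print("minima at x = m/(2N) confirmed; spacing in x is 1/(2N)")
```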
From (3) we can calculate the spacing between adjacent minima (Δm = 1) in terms of the angle:
2ΔθXRD cos(θXRD) = λ/(2Nd),
or
2ΔθXRD = λ/[2cos(θXRD)Nd].
Recalling that t = Nd is the total film thickness, one has:
2ΔθXRD = λ/[2cos(θXRD)t]. (4)
Comparing (2) and (4), we immediately obtain (1).
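The factor of 2 can be sanity-checked numerically from (2) and (4). The thickness and angle below are arbitrary, and both spacings are evaluated at the same angle so that the cosine factors cancel (in a real measurement θXRR and θXRD are of course different):

```python
import math

wavelength = 1.5406e-10     # m, Cu K-alpha (illustrative)
t = 100e-9                  # m, assumed film thickness
theta = math.radians(15.0)  # common angle so the cos(theta) factors cancel

dtheta_xrr = wavelength / (2 * t * math.cos(theta))  # from eq. (2)
dtheta_xrd = wavelength / (4 * t * math.cos(theta))  # from eq. (4)
print(f"dTheta_XRR / dTheta_XRD = {dtheta_xrr / dtheta_xrd:.3f}")  # -> 2.000
```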