Q. Two points are located at a distance of $10\, m$ and $15\, m$ from the source of oscillation. The period of oscillation is $0.05\, s$ and the velocity of the wave is $300 \,m/s$. What is the phase difference between the oscillations of two points?

Oscillations

Solution:

As wavelength $=$ velocity of wave $\times$ time period,
$\lambda = 300 \times 0.05$
or $\lambda = 15\, m$
According to the problem, the path difference between the two points is
$\Delta x = 15 - 10 = 5\, m$
$\therefore$ Phase difference $=\frac{2 \pi}{\lambda} \times$ path difference
$\Delta \phi = \frac{2 \pi}{\lambda} \times \Delta x = \frac{2 \pi}{15} \times 5 = \frac{2 \pi}{3}\ \text{rad}$
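As a quick numerical check, the same steps can be scripted. The sketch below is illustrative only (variable names are not from the source); it reproduces the result $\frac{2\pi}{3} \approx 2.094$ rad.

```python
import math

T = 0.05              # period of oscillation (s)
v = 300.0             # wave velocity (m/s)
x1, x2 = 10.0, 15.0   # distances of the two points from the source (m)

wavelength = v * T                                  # lambda = v * T = 15 m
path_diff = abs(x2 - x1)                            # delta x = 5 m
phase_diff = (2 * math.pi / wavelength) * path_diff # delta phi = 2*pi/3 rad

print(f"wavelength = {wavelength} m")         # 15.0 m
print(f"path difference = {path_diff} m")     # 5.0 m
print(f"phase difference = {phase_diff:.4f} rad")  # ~2.0944 rad = 2*pi/3
```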