Q. How long will it take sound waves to travel the distance $l$ between points $A$ and $B$ if the air temperature between them varies linearly from $T_{1}$ to $T_{2}$? (The velocity of sound in air at temperature $T$ is given by $v=\alpha \sqrt{T}$, where $\alpha$ is a constant.)

Waves

Solution:

$\langle v\rangle=\frac{v_{1}+v_{2}}{2}=\frac{\alpha \sqrt{T_{1}}+\alpha \sqrt{T_{2}}}{2}$
$\Rightarrow$ Time taken $t=\frac{l}{\langle v\rangle}=\frac{2 l}{\alpha\left(\sqrt{T_{1}}+\sqrt{T_{2}}\right)}$
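This shortcut deserves a word of justification, since the average of a varying speed is not in general the arithmetic mean of its endpoint values. Here $T$ is linear in $x$, so $v^{2}=\alpha^{2} T$ is also linear in $x$, and
$\frac{d v}{d t}=v \frac{d v}{d x}=\frac{1}{2} \frac{d\left(v^{2}\right)}{d x}=\frac{\alpha^{2}\left(T_{2}-T_{1}\right)}{2 l}=\text{constant}$
so $v$ varies linearly with time, which is exactly the case in which $\langle v\rangle=\frac{v_{1}+v_{2}}{2}$ is exact.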
Alternate Solution:
With $x$ measured from $A$, the linear temperature profile is $T(x)=T_{1}+\left(\frac{T_{2}-T_{1}}{l}\right) x$, so
$\frac{d x}{d t}=v=\alpha \sqrt{T_{1}+\left(\frac{T_{2}-T_{1}}{l}\right) x}$
Separating variables and integrating,
$\int\limits_{x=0}^{x=l} \frac{d x}{\sqrt{T_{1}+\left(\frac{T_{2}-T_{1}}{l}\right) x}}=\int\limits_{0}^{t} \alpha d t$
Evaluating the left side (substitute $u=T_{1}+\left(\frac{T_{2}-T_{1}}{l}\right) x$) gives $\frac{2 l}{T_{2}-T_{1}}\left(\sqrt{T_{2}}-\sqrt{T_{1}}\right)=\alpha t$, and since $T_{2}-T_{1}=\left(\sqrt{T_{2}}+\sqrt{T_{1}}\right)\left(\sqrt{T_{2}}-\sqrt{T_{1}}\right)$, we get
$t=\frac{2 l}{\alpha\left(\sqrt{T_{1}}+\sqrt{T_{2}}\right)}$
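As a quick sanity check, the closed form can be compared with a direct numerical integration of $dt=dx / v(x)$. The short Python sketch below uses illustrative values for $\alpha$, $l$, $T_{1}$, $T_{2}$ (assumed for the demonstration, not given in the problem); the two results agree to the accuracy of the quadrature.

```python
import math

# Illustrative values (assumptions, not from the problem statement):
alpha = 20.05          # sound-speed constant for air, m s^-1 K^-1/2 (approx.)
l = 1000.0             # distance between A and B, in metres
T1, T2 = 280.0, 320.0  # endpoint temperatures, in kelvin

def v(x):
    """Local sound speed v = alpha*sqrt(T(x)), with T linear in x."""
    return alpha * math.sqrt(T1 + (T2 - T1) * x / l)

# Travel time by midpoint-rule quadrature of dt = dx / v(x).
n = 100_000
dx = l / n
t_numeric = sum(dx / v((i + 0.5) * dx) for i in range(n))

# Closed-form answer derived above.
t_closed = 2 * l / (alpha * (math.sqrt(T1) + math.sqrt(T2)))

print(f"numerical   t = {t_numeric:.6f} s")
print(f"closed form t = {t_closed:.6f} s")
```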