Q.
A converging lens forms a real image $I$ of an object on its principal axis. If a rectangular slab of refractive index $\mu$ and thickness $x$ is introduced between $I$ and the lens, how will $I$ move?
A.
When the slab is inserted, the converging rays travel through a thickness $x$ of the slab, which is optically equivalent to an air path of only $x / \mu$ (the reduced thickness). The rays therefore need extra distance to converge, and the image point shifts away from the lens by
$\left[x-\frac{x}{\mu}\right]=x\left(1-\frac{1}{\mu}\right)$
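A minimal numerical sketch of this formula (the thickness and refractive index below are assumed example values, not taken from the question):

```python
def image_shift(thickness, mu):
    """Shift of the converging point when a slab of the given
    thickness and refractive index mu is placed in the converging
    beam: shift = x * (1 - 1/mu), directed away from the lens."""
    return thickness * (1.0 - 1.0 / mu)

# Assumed example: a 3 cm glass slab with mu = 1.5
# gives 3 * (1 - 1/1.5) = 1 cm of shift away from the lens.
print(image_shift(3.0, 1.5))  # 1.0 (cm)
```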