Let $X \sim \mathrm{Ga}(a,b)$, i.e., \begin{align} \mathrm{Ga}\left(x \mid a,b\right) = \frac{b^a}{\Gamma(a)} x^{a-1} e^{-bx} \end{align} for $x > 0$ and $a,b > 0$.
Let $Y = 1/X$. Show that $Y \sim \mathrm{IG}(a,b)$, i.e., \begin{align} \mathrm{IG}\left(y \mid \mathrm{shape}=a,\mathrm{scale}=b\right) = \frac{b^a}{\Gamma(a)} y^{-(a+1)} e^{-b/y} \end{align} for $y > 0$ and $a,b > 0$.
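As a sanity check on the parameterizations above (shape $a$ and rate $b$ for the Gamma; shape $a$ and scale $b$ for the inverse gamma), the two densities can be coded directly and compared against SciPy. This is a sketch assuming SciPy's scale-based convention, under which $\mathrm{Ga}(a,b)$ with rate $b$ corresponds to \verb|gamma(a, scale=1/b)| and $\mathrm{IG}(a,b)$ to \verb|invgamma(a, scale=b)|:

```python
import math

from scipy.stats import gamma, invgamma


def gamma_pdf(x, a, b):
    """Ga(x | a, b): shape a, rate b, as defined above."""
    return b**a / math.gamma(a) * x ** (a - 1) * math.exp(-b * x)


def invgamma_pdf(y, a, b):
    """IG(y | shape=a, scale=b), as defined above."""
    return b**a / math.gamma(a) * y ** (-(a + 1)) * math.exp(-b / y)


a, b = 2.5, 1.3
y = 0.7
# SciPy parameterizes by scale, so rate b maps to scale=1/b for the Gamma,
# while the inverse-gamma scale parameter b carries over directly.
assert abs(gamma_pdf(y, a, b) - gamma.pdf(y, a, scale=1 / b)) < 1e-12
assert abs(invgamma_pdf(y, a, b) - invgamma.pdf(y, a, scale=b)) < 1e-12
```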

Proof. Since $X > 0$ almost surely, $Y = 1/X > 0$. The transformation and its inverse are \begin{align} y = \dfrac{1}{x} \quad\Longleftrightarrow\quad x = \dfrac{1}{y}, \end{align} and differentiating the inverse transformation w.r.t.\ $y$ gives \begin{align} \dfrac{dx}{dy} &= -\,\dfrac{1}{y^2}. \end{align} The cdf (cumulative distribution function) of $Y$ is \begin{align} \mathrm{F}_Y(y) &= \mathrm{P}(Y \leq y) \\ &= \mathrm{P}\left(\frac{1}{X} \leq y\right) \\ &= \mathrm{P}\left(X \geq \frac{1}{y}\right) \\ &= 1 - \mathrm{P}\left(X < \frac{1}{y}\right) \\ &= 1 - \mathrm{F}_X\!\left(\dfrac{1}{y}\right), \end{align} where the second equality holds because $x \mapsto 1/x$ is decreasing on $(0,\infty)$, which reverses the inequality. Differentiating both sides w.r.t.\ $y$ and applying the chain rule, \begin{align} f_Y(y) &= \frac{d}{dy} \left[1 - \mathrm{F}_X\!\left(\dfrac{1}{y}\right)\right] \\ &= -f_X\!\left(\frac{1}{y}\right) \cdot \frac{d}{dy}\!\left(\frac{1}{y}\right) \\ &= -f_X\!\left(\frac{1}{y}\right) \cdot \left(-\,\frac{1}{y^2}\right) \\ &= \frac{b^a}{\Gamma(a)} \left(\frac{1}{y}\right)^{a-1} e^{-b/y} \cdot \frac{1}{y^2} \\ &= \frac{b^a}{\Gamma(a)} \left(\frac{1}{y}\right)^{a+1} e^{-b/y} \\ &= \frac{b^a}{\Gamma(a)} y^{-(a+1)} e^{-b/y}, \end{align} which is the $\mathrm{IG}(a,b)$ density. $\blacksquare$
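The change-of-variables identity derived above, $f_Y(y) = f_X(1/y)/y^2$, can be verified numerically. The sketch below (assuming SciPy's conventions: rate $b$ becomes \verb|scale=1/b| for the Gamma, scale $b$ carries over for the inverse gamma) checks the identity pointwise and then confirms via a Kolmogorov--Smirnov test that reciprocals of Gamma draws follow the inverse-gamma law:

```python
import numpy as np
from scipy.stats import gamma, invgamma, kstest

a, b = 3.0, 2.0

# Pointwise check of f_Y(y) = f_X(1/y) / y^2 at a few arbitrary points.
for y in [0.2, 1.0, 3.7]:
    lhs = invgamma.pdf(y, a, scale=b)
    rhs = gamma.pdf(1.0 / y, a, scale=1.0 / b) / y**2
    assert abs(lhs - rhs) < 1e-12

# Monte Carlo check: reciprocals of Ga(a, b) samples should be
# indistinguishable from IG(shape=a, scale=b) under a KS test.
x = gamma.rvs(a, scale=1.0 / b, size=100_000, random_state=0)
stat, pvalue = kstest(1.0 / x, invgamma(a, scale=b).cdf)
assert pvalue > 0.01
```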

References:

Kevin P. Murphy, \emph{Machine Learning: A Probabilistic Perspective}, Exercise 2.10 (Deriving the inverse gamma density).