An object is dropped from rest at a height of $150\ m$ and, simultaneously, another object is dropped from rest at a height of $100\ m$. What is the difference in their heights after $2\ s$ if both objects fall with the same acceleration? How does the difference in heights vary with time?

Here initial velocity for the first object $u_1=0$,

Gravitational acceleration $g=10\ m/s^2$      [taking $g=10\ m/s^2$ for simplicity]

Time $t_1=2\ s$

So the distance fallen by the first object is $h_1=u_1t_1+\frac{1}{2}gt_1^2$

$=0+\frac{1}{2}\times 10\times2^2$

$=20\ m$

Similarly, for second object

Initial velocity $u_2=0$, time $t_2=2\ s$, $g=10\ m/s^2$

So, distance covered by second object $h_2=u_2t_2+\frac{1}{2}gt_2^2$

$=0+\frac{1}{2}\times10\times 2^2$

$=20\ m$

So, the height of first object from the ground after 2 seconds $=150\ m-20\ m=130\ m$

Height of the second object from the ground $=100\ m-20\ m=80\ m$

Difference between the heights of both the objects $=130\ m-80\ m=50\ m$

Since both objects start from rest and fall with the same acceleration, they cover equal distances in equal times, so the difference in their heights stays constant at $50\ m$ (the initial difference, $150\ m-100\ m$) for as long as both are in the air. It changes only after the lower object hits the ground, at $t=\sqrt{2\times100/10}\approx4.5\ s$.
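The calculation above can be sketched numerically. This is an illustrative check, not part of the original solution; the helper `height_after` is a name introduced here for convenience.

```python
# Verify the worked solution: two objects dropped from rest, g = 10 m/s^2.
g = 10.0  # m/s^2, as taken in the problem

def height_after(h0, t):
    """Height above ground (m) of an object dropped from rest at h0 after t seconds."""
    return h0 - 0.5 * g * t * t

h1 = height_after(150, 2)  # first object after 2 s
h2 = height_after(100, 2)  # second object after 2 s
print(h1, h2, h1 - h2)     # 130.0 80.0 50.0

# The separation is the same 50 m at every instant while both are airborne:
for t in (0, 1, 2, 3, 4):
    print(t, height_after(150, t) - height_after(100, t))
```

Because the $\frac{1}{2}gt^2$ term is identical for both objects, it cancels in the subtraction, which is why the loop prints a constant $50\ m$.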
