A dog chases a squirrel. The dog starts 200 feet away from the squirrel. The dog’s speed is 150 feet per minute, and the squirrel’s speed is 100 feet per minute. How long will it take the dog to catch the squirrel?

Answer: 4 minutes

Step-by-step explanation: each minute, subtract the dog's 150 feet from the 200-foot gap and add back the squirrel's 100 feet; repeat until the gap reaches 0 or goes negative.
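As a rough sketch, that same minute-by-minute idea can be written as a short Python loop (the variable names here are just illustrative):

```python
# Minute-by-minute simulation of the chase described above.
gap = 200             # starting distance between dog and squirrel, in feet
dog_speed = 150       # feet per minute
squirrel_speed = 100  # feet per minute

minutes = 0
while gap > 0:
    gap -= dog_speed       # the dog closes 150 feet of the gap
    gap += squirrel_speed  # the squirrel adds 100 feet back
    minutes += 1

print(minutes)  # 4
```

Note that this counts whole minutes; it lands exactly on 0 here because the 200-foot gap is a multiple of the 50 feet per minute closing rate.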

The time taken by the dog, moving at 150 feet per minute, to catch the squirrel is t = 4 minutes.

What is relative speed?

Relative speed is the speed of one object measured with respect to another moving object.

When two objects move in the same direction, the relative speed is the difference between their speeds.

It is given in the question that:

Distance of dog from squirrel = 200 feet

Speed of the dog = 150 feet per minute

Speed of the squirrel = 100 feet per minute

The relative speed is calculated as:

Relative speed = Speed of dog − Speed of squirrel

Relative speed = 150 − 100 = 50 feet per minute

Now, the relative speed is given by:

Relative speed = Distance / Time

Time = Distance / Relative speed

Time = 200 / 50 = 4 minutes
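As a minimal sketch, the same closed-form calculation in Python (the names are illustrative, not from the original):

```python
# Closed-form calculation: time = distance / relative speed.
distance = 200                    # feet
relative_speed = 150 - 100        # 50 feet per minute
time = distance / relative_speed  # minutes
print(time)  # 4.0
```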

Hence the time taken by the dog, moving at 150 feet per minute, to catch the squirrel is t = 4 minutes.
