How to model an equation?

Danny Maefengea on 11 Sep 2020
Commented: Danny Maefengea on 11 Sep 2020
Hi there, I have been trying to work out how to model the problem below in MATLAB, but I couldn't. Can anybody help here, please?
Thank you very much for your help.
A ball falls through a dense liquid. Its velocity (in cm/s) is given by the equation: dv/dt = 1.6 − 0.025v^2.
If v(0) = 0 cm/s, show that it takes approximately 6.77 s for the ball to reach a velocity of 7.0 cm/s.
If the container is 1 meter deep, determine the time at which the ball reaches the bottom.
  1 Comment
Alan Stevens on 11 Sep 2020
Type help ode45 in the MATLAB Command Window.
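
Following that hint, a purely numerical check with ode45 is possible (a sketch; the function handle, time span, and tolerances below are my own choices, not from the thread):

```matlab
% dv/dt = 1.6 - 0.025*v^2, v(0) = 0
f = @(t, v) 1.6 - 0.025*v.^2;
opts = odeset('RelTol', 1e-8, 'AbsTol', 1e-10);
[t, v] = ode45(f, [0 10], 0, opts);
% v is monotonically increasing toward its terminal value,
% so interpolate t as a function of v to find when v = 7 cm/s:
t7 = interp1(v, t, 7)   % should land close to 6.77 s
```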


Accepted Answer

BOB MATHEW SYJI on 11 Sep 2020
I think this should work. Given a time t1 and the depth of the container as inputs, the function returns the velocity at time t1 and the time taken to hit the bottom.
function [velocity, time] = diff_eqn(t1, depth)
%Inputs are the time t1 (s) and the depth of the container (cm)
%Returns the velocity (cm/s) at time t1 and
%the time (s) taken to hit the bottom of the container
syms y(t) tau
ode = diff(y) + 0.025*y^2 == 1.6;   % dv/dt = 1.6 - 0.025*v^2
cond = y(0) == 0;                   % starts from rest
ySol(t) = dsolve(ode, cond);
velocity = double(ySol(t1))
%The depth fallen is the integral of the velocity (v is not constant,
%so depth/velocity would be wrong); solve int(v) == depth for t:
pos(t) = int(ySol(tau), tau, 0, t);
time = double(vpasolve(pos(t) == depth, t, depth/8))  % guess: depth/terminal speed
end
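
For the concrete numbers in the question (a sketch; it assumes the depth is passed in cm, so 1 m = 100 cm, and that the function file is on the path):

```matlab
% Velocity after 6.77 s, and time to fall through a 100 cm container
[v677, tBottom] = diff_eqn(6.77, 100)
% v677 should come out close to 7 cm/s, matching part 1 of the question;
% tBottom is the estimated time (s) for the ball to reach the bottom.
```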
  1 Comment
Danny Maefengea on 11 Sep 2020
Thank you so much Bob for your help. I really appreciate it. Now I have some fair ideas on how to work on the problem.
Once again thank you.


More Answers (0)
