Hi, I'm trying to numerically solve a set of ODEs and I'm running into a problem with the numerical stability of my solutions. A simplified version of my actual problem (sufficient to illustrate the issue) consists of two ODEs for the functions n(t) and nc(t):

* dn/dt = -alpha*(n + nc)*n + nc/tau + gamma*n
* d(nc)/dt = alpha*(n + nc)*n - nc/tau + gamma*nc

where alpha, tau, gamma and nu are constants set numerically. Both n(t) and nc(t) grow exponentially due to the last term, while the first two terms model the conversion between the two species. Specifically, I have alpha = 1e-9, tau = 150, gamma = 1e-5, nu = 9000, with initial conditions n(0) = 100 and nc(0) = 0. I'm using the SimBiology toolbox, which relies on MATLAB's standard ODE solvers.

As far as I can tell, the problem comes from the fact that the first two terms always add up to approximately zero (I still want them in the model, because later I'd like to choose parameters for which they don't cancel). This happens because the time scale on which the first two terms reach their mutual steady state is much shorter than 1/gamma. As an example: at some point in time, alpha*(n + nc)*n is about 10^8 while -nc/tau is about -10^8, so we are subtracting two very large numbers that are very close to each other. As the error in the estimate of their sum grows, it eventually becomes large enough to produce nonsensical results: both n(t) and nc(t) stop growing, whereas they should keep growing exponentially.

Since this is not a very demanding simulation, ideally I'm looking for a way to improve the precision with which n(t) and nc(t) are computed. I've tried changing the MaxStep size and the relative and absolute tolerances, but without success. I appreciate any suggestions. Thank you.
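For reference, here is a minimal standalone sketch of the simplified system, written directly against ode15s rather than through SimBiology (nu is omitted because it doesn't appear in the simplified equations; the time span and tolerance values below are just illustrative, not the settings from my actual SimBiology model):

```matlab
% Simplified two-species model, solved with a stiff solver (ode15s).
alpha = 1e-9;
tau   = 150;
gamma = 1e-5;

% dn/dt  = -alpha*(n + nc)*n + nc/tau + gamma*n
% dnc/dt =  alpha*(n + nc)*n - nc/tau + gamma*nc
rhs = @(t, y) [ -alpha*(y(1) + y(2))*y(1) + y(2)/tau + gamma*y(1);
                 alpha*(y(1) + y(2))*y(1) - y(2)/tau + gamma*y(2) ];

y0    = [100; 0];     % n(0) = 100, nc(0) = 0
tspan = [0 2e6];      % long enough to reach the regime where the cancellation occurs

% Tight tolerances; this is where I've been experimenting (also with MaxStep).
opts = odeset('RelTol', 1e-10, 'AbsTol', 1e-6);
[t, y] = ode15s(rhs, tspan, y0, opts);

semilogy(t, y(:,1), t, y(:,2));
legend('n(t)', 'nc(t)'); xlabel('t'); ylabel('population');
```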