Temperature undershoot problem in time-dependent simulation near time t = 0
Posted Feb 6, 2022, 10:25 a.m. EST · Heat Transfer & Phase Change · Version 6.0 · 2 Replies
I am a new user and I have a simple 2D axisymmetric model to test. In my first run of a time-dependent simulation, at time t = 0 I raise the temperature of one edge boundary by a ΔT of +30 K above the ambient temperature of 300 K and watch how the temperature evolves in the nearby tissue volume. The temperature near the edge rises as time t becomes large, as expected. However, at very small times (t < 0.4 s) there is a temperature undershoot near the edge of about 4 K below the ambient temperature. Why is this happening? The temperature should never drop below the ambient temperature in this simulation. What could be wrong in my setup to produce this behavior? I searched the old posts and found only one, from 2010, discussing an undershoot problem, and its conditions were different from mine.
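
To illustrate what I mean, here is a minimal 1D finite-element sketch in Python/NumPy. This is not my actual COMSOL model; the diffusivity, domain size, mesh, and time step values are just assumed placeholders. With a consistent mass matrix and time steps much smaller than h**2/(6*alpha), the first steps after the boundary jump show the same kind of nonphysical dip below the 300 K ambient:

# Minimal 1D heat-conduction sketch (assumed parameters, not the real model):
# a boundary stepped by +30 K at t = 0, linear finite elements with a
# consistent mass matrix, and a time step far smaller than the mesh can
# resolve, which breaks the discrete maximum principle and gives undershoot.
import numpy as np

alpha = 1.4e-7            # assumed tissue thermal diffusivity (m^2/s)
L, n = 0.01, 50           # 1 cm domain, 50 linear elements
h = L / n
T_amb, dT = 300.0, 30.0   # ambient temperature and boundary step (K)
dt = 1e-3                 # much smaller than h**2 / (6 * alpha) ~ 0.05 s

# Assemble consistent mass matrix M and stiffness matrix K
M = np.zeros((n + 1, n + 1))
K = np.zeros((n + 1, n + 1))
for e in range(n):
    idx = np.ix_([e, e + 1], [e, e + 1])
    M[idx] += h / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])
    K[idx] += alpha / h * np.array([[1.0, -1.0], [-1.0, 1.0]])

u = np.full(n + 1, T_amb)   # uniform initial temperature
A = M + dt * K              # backward Euler system matrix

def apply_bc(mat, rhs):
    # Dirichlet BCs: left node stepped to T_amb + dT, right node held at T_amb
    for node, val in [(0, T_amb + dT), (n, T_amb)]:
        mat[node, :] = 0.0
        mat[node, node] = 1.0
        rhs[node] = val
    return mat, rhs

u_min = T_amb
for _ in range(400):        # march to t = 0.4 s
    rhs = M @ u
    A_bc, rhs_bc = apply_bc(A.copy(), rhs)
    u = np.linalg.solve(A_bc, rhs_bc)
    u_min = min(u_min, u.min())

print(f"lowest temperature seen up to t = 0.4 s: {u_min:.2f} K")
# With these settings the node next to the heated boundary drops several
# kelvin below 300 K in the first steps, even though the exact solution
# never goes below ambient. Lumping M (row sums on the diagonal), refining
# the mesh, or ramping the boundary step removes the dip in this sketch.

This is only meant to show the kind of behavior I am seeing, not to claim it is the same cause in my model.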
Thank you!