In other words, when a clock (or field) shows just hh:mm, the minute portion changes when the true seconds go from 59 → 00, and not when the seconds go from 29 → 30.
Correct?
What about when the time has precision finer than 1 second? Do the displayed seconds change when the fraction goes from 0.9999 (or whatever the largest representable value less than 1 second is) → 0?
(As opposed to going from .4999 → .5000)
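The difference between the two conventions is easy to demonstrate in code. Here's a minimal Python sketch (the function names are just illustrative, not from any particular library) contrasting a truncating hh:mm display with a rounding one:

```python
def hhmm_truncate(total_seconds):
    # Truncation: the minute changes only when seconds roll 59 -> 00.
    minutes = (total_seconds // 60) % 60
    hours = (total_seconds // 3600) % 24
    return f"{hours:02d}:{minutes:02d}"

def hhmm_round(total_seconds):
    # Rounding: the minute changes when seconds cross 29 -> 30.
    # int(x + 0.5) used instead of round() to avoid banker's rounding.
    rounded_minutes = int(total_seconds / 60 + 0.5)
    return f"{(rounded_minutes // 60) % 24:02d}:{rounded_minutes % 60:02d}"

# At a true time of 10:05:30, truncation still shows 10:05,
# while rounding already shows 10:06.
t = 10 * 3600 + 5 * 60 + 30
print(hhmm_truncate(t))  # 10:05
print(hhmm_round(t))     # 10:06
```

As far as I know, essentially every wall clock and OS clock display uses the truncating version.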
I feel like this isn’t going to be a one-size-fits-all answer. Payroll and accounting/legal might follow different conventions than science or engineering.
I believe this is how it works on computers – the time doesn’t roll over as soon as it reaches 59 seconds, but only at the very last instant before the count would exceed that second. I think that’s how the microprocessors I used at school worked: you took the number of cycles run since the chip booted, then used the clock speed in megahertz to calculate how many milliseconds had gone by, which you could then divide into seconds, minutes, etc.
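The cycle-counting scheme described above can be sketched like this (a hypothetical Python illustration, not code for any specific chip; the 16 MHz clock is an assumed example value). Note that because every division truncates, the displayed second only advances once a full second's worth of cycles has accumulated:

```python
def cycles_to_hms(cycle_count, clock_mhz):
    # cycles per millisecond = MHz * 1000 (e.g. 16 MHz -> 16_000 cycles/ms)
    ms = cycle_count // (clock_mhz * 1000)
    # Integer division truncates: the second never rounds up early.
    total_seconds = ms // 1000
    return (total_seconds // 3600, (total_seconds // 60) % 60, total_seconds % 60)

# On an assumed 16 MHz chip, 1_599_999_999 cycles is 99999.999... ms,
# still inside second 99, so the display reads 0:01:39; one cycle
# later it ticks over to 0:01:40.
print(cycles_to_hms(1_599_999_999, 16))  # (0, 1, 39)
print(cycles_to_hms(1_600_000_000, 16))  # (0, 1, 40)
```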