I have quoted from a source below; the link follows the quote. Within the quote, you will see that there are many parts to a second (cycles, or hertz).
In my discussions above I selected a mid-range fraction, the millisecond (1/1000th of a second). Time does not stop, so there is never a reason or need to round up: any amount of time after 12:12:12 am, whether 500 milliseconds or 999 milliseconds, still shows as 12:12:12 am, up until the time becomes 12:12:13 with zero additional fraction. The computer then counts time fractions until enough have passed to reach 12:12:14 exactly, and so on. Thus each whole number of seconds is displayed for one full second while the computer counts the fractions, and when the next whole second is reached, that whole number is displayed.
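To illustrate (my own sketch in Python, not from the quoted source): the displayed clock simply truncates the fraction, so every instant from 12:12:12.000 up to but not including 12:12:13.000 shows the same whole second.
Code:
import time

def displayed_time(epoch_seconds):
    # The fractional part is simply dropped (truncated), never rounded up:
    # 12:12:12.999 still displays as 12:12:12.
    whole = int(epoch_seconds)
    return time.strftime("%I:%M:%S %p", time.localtime(whole))

base = time.mktime((2024, 1, 1, 0, 12, 12, 0, 0, -1))  # 12:12:12 AM local time
for frac in (0.0, 0.5, 0.999):
    print(displayed_time(base + frac))  # prints "12:12:12 AM" all three times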
If the computer is operating at 1000 cycles (hertz) per second, then it is reasonable to believe the system could process 1000 last-second snipe bids without a problem. Of course all would show the same whole number of seconds, but they could be sorted by the number of cycles that had occurred since the last whole second. As always, the unique highest of those bids will win, no matter when placed. However, if tied bids are received, then the bid received farthest from the end of the auction, AKA the earliest (first) of the tie, wins, per the eBay stated rules.
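A minimal sketch of that resolution, assuming (hypothetically) that each bid records the tick count within its arrival second; the field names here are mine, not eBay's actual data model.
Code:
from dataclasses import dataclass

@dataclass
class Bid:
    bidder: str
    amount: float  # bid amount in dollars
    second: int    # whole-second timestamp (e.g. seconds since epoch)
    tick: int      # cycles counted since that whole second (0-999 at 1000 Hz)

def winner(bids):
    # Highest amount wins; on a tie, the earliest-received bid
    # (smallest second, then smallest tick) wins, per the stated rule.
    return max(bids, key=lambda b: (b.amount, -b.second, -b.tick))

bids = [
    Bid("a", 25.00, 1700000000, 998),
    Bid("b", 25.00, 1700000000, 120),  # same amount, received earlier
    Bid("c", 24.50, 1700000000, 999),
]
print(winner(bids).bidder)  # "b": tie on amount, earliest tick wins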
Quote:
How Computers Keep Time Correctly?
Quote:
How Clocks Work?
Mechanical clocks are devices that convert the energy stored in their mechanism into measured, rhythmic movements, count these movements with gears, and, when certain counts are reached, present this as a measurement of time in a way that can be perceived by humans, such as the sound of a bell or changing angles on the dial.
What Is Time?
Time is the continued sequence of existence and events that occurs in an apparently irreversible succession from the past, through the present, into the future. It is a component quantity of various measurements used to sequence events, to compare the duration of events or the intervals between them, and to quantify rates of change of quantities in material reality or in the conscious experience. Wait, what? This is a deep definition. Of course, I copied it from Wikipedia. But here I will focus on measuring time with computers.
How Does Computer Time Work?
Computers have programmable timer chips. These chips can send 100, 500, 1000, or 1,000,000 interrupts per second depending on the required precision. In computers, time is found by counting these interrupts and multiplying them by a fixed number. For example, with a chip that sends 100 interrupts per second (100 Hz), time is increased by 10 milliseconds (ms) for each interrupt. With a chip that generates 1,000,000 interrupts per second (1 MHz), each interrupt is 1 microsecond. This means that a computer's time resolution depends on the resolution of the chip.
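A minimal sketch of that counting, using the article's 100 Hz example; the names are illustrative, not a real kernel's API.
Code:
HZ = 100               # interrupts per second
TICK_MS = 1000 // HZ   # 10 ms of time added per interrupt

ticks = 0              # incremented by the interrupt handler

def timer_interrupt():
    global ticks
    ticks += 1

def current_time_ms(boot_epoch_ms):
    # Time is the boot time plus (interrupt count x fixed period).
    return boot_epoch_ms + ticks * TICK_MS

for _ in range(HZ * 5):    # simulate 5 seconds of interrupts
    timer_interrupt()
print(current_time_ms(0))  # 5000 ms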
However, a computer's ability to keep accurate time can be affected by environmental factors. For example, changes in temperature and voltage can affect the precision of the timer chip. For this reason, computers regularly synchronize their time over NTP (Network Time Protocol). NTP servers receive the time from reference clocks that keep time very precisely, off by only a few milliseconds every few years. That is incredible precision in time measurement.
NTP adjusts a computer's clock by comparing it with NTP servers around the world. NTP servers use very accurate clocks, such as atomic clocks, which allows computers to set their time with very high accuracy. There is also a hierarchy among servers (e.g., Stratum 0, 1, 2, and 3); the reason for this hierarchy is cost.
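You can ask an NTP server for your clock's offset yourself; this sketch (mine, not the article's) assumes the third-party Python package ntplib (pip install ntplib).
Code:
import ntplib

client = ntplib.NTPClient()
resp = client.request("pool.ntp.org", version=3)
# resp.offset is the estimated local-clock error in seconds
# (positive means the local clock is behind the server).
print(f"clock offset: {resp.offset:+.6f} s  (stratum {resp.stratum})")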
It is important for computers to keep and synchronize time accurately for many applications. For example, accurate timekeeping is needed to timestamp files and schedule processes, to route network traffic correctly, and to ensure that users see the correct time. Now let's look at the clever method NTP uses to synchronize a computer's time.
For example, if its own clock is 11:01:12 and the server clock is 11:01:06, it will not suddenly move the clock back to 11:01:06. We said above that a 100 Hz chip adds 10 ms per interrupt. The NTP client adds 9 ms (slewing) instead of 10 ms per interrupt for a period of time until the gap closes, thus deliberately falling back 100 ms every second (like slowing down when you are running and your friend can't keep up with you). As a result, the client makes up the 6 s (6000 ms) difference in 60 seconds. The reason for this behavior, called disciplining the system clock, is to avoid confusing the applications running on the computer. This way, when time changes on computers, everything usually continues to work properly.
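A sketch of that arithmetic (mine, not the article's): losing 1 ms per interrupt at 100 interrupts per second closes the gap at 100 ms per real second.
Code:
HZ = 100
NORMAL_MS = 10    # normal increment per interrupt
SLEW_MS = 9       # reduced increment while the client is ahead

offset_ms = 6000  # client clock ahead of the server by 6 s
elapsed_s = 0
while offset_ms > 0:
    for _ in range(HZ):                     # one real second of interrupts
        offset_ms -= (NORMAL_MS - SLEW_MS)  # fall back 1 ms per tick
    elapsed_s += 1
print(elapsed_s)  # 60: the 6000 ms gap closes at 100 ms per real second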
Measuring time accurately and increasing precision is so important that the 2023 Nobel Prize in Physics was awarded to three scientists for experimental methods that generate "attosecond" pulses of light, among the smallest intervals of time ever measured.
To understand how small an attosecond is, consider this comparison: an attosecond is to a second what a second is to about 31.71 billion years. Especially in sensitive, large projects related to human life, the tiniest error in time can result in a disaster.
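That comparison checks out, since one second contains 10^18 attoseconds; a quick Python check:
Code:
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 3.156e7
print(1e18 / SECONDS_PER_YEAR / 1e9)   # about 31.7 (billion years)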
https://ahmetdoruk.medium.com/how-c...7de71f20d82d
This is one of those times, yet again, that I miss Don (RIP). He would have already explained the issue by the time I saw the thread, and likely in fewer words.