I've spent decades as a software/firmware developer of real-time systems, going all the way back to the 1970s when I was writing software in the assembler languages of the IBM 360/370 and the PDP-11. The term "real-time" always seemed kind of ironic, since it is easy, when closely scrutinizing such systems - with their asynchronous, concurrent, and parallel behavior - to come to the conclusion that time doesn't exist. Only ordered events. We don't have a way to measure time, except by counting events produced by some oscillator that ultimately derives its periodicity from nature. We call such a device a "clock." Since the only way to test the accuracy and precision of a clock is with a better clock, it's turtles all the way down.
Turing Award-winning computer scientist Leslie Lamport even wrote what came to be a classic paper on this topic, "Time, Clocks, and the Ordering of Events in a Distributed System" [CACM, 21.7, 1978-07]. He proposed a "logical clock", which was simply a counter that is incremented at every event and synchronized whenever processes exchange messages, allowing events to be placed in a consistent order. I remember reading this paper as a graduate student. And again, later. And again, even later. I may read it again today.
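The scheme fits in a few lines of code. Here's a minimal sketch; the class and method names are my own, not from the paper:

```python
class LamportClock:
    """A logical clock: just a counter, no oscillator anywhere."""

    def __init__(self):
        self.counter = 0

    def tick(self):
        """Local event: increment the counter."""
        self.counter += 1
        return self.counter

    def send(self):
        """Timestamp an outgoing message with the incremented counter."""
        return self.tick()

    def receive(self, msg_timestamp):
        """On receipt, advance past both our time and the sender's."""
        self.counter = max(self.counter, msg_timestamp) + 1
        return self.counter


# Two processes exchanging one message:
a, b = LamportClock(), LamportClock()
a.tick()          # local event in A: A's counter becomes 1
t = a.send()      # A sends a message stamped 2
b.receive(t)      # B's counter jumps to max(0, 2) + 1 = 3
```

The `receive` rule is what gives the ordering its teeth: any event that causally follows another is guaranteed a larger timestamp, even across machines with no shared oscillator.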
Years ago I mentioned this line of thought to a colleague of mine, who happened to have a Ph.D. in physics and had worked at Fermilab. (It's handy to keep such folks around just for this reason.) He immediately brought up the fact, now obvious to me, that time must exist: Einstein's special and general relativity.
Einstein's theories of SR and GR have been experimentally verified time and again (no pun intended). You can synchronize two atomic clocks side by side, then take one up to the top of a mountain (where it experiences less gravity due to being further from the center of the Earth, and hence time runs faster: that's GR) and back down, and find that they now differ by just the predicted amount. This experiment has been done many times.
The U.S. Global Positioning System (and indeed all other Global Navigation Satellite Systems) works by just transmitting the current time to receivers on the Earth. Fundamentally, that's it. All the heavy lifting, computationally, is done by the GPS receiver in your hand. But the atomic clocks inside every GPS satellite have to be carefully adjusted to account for GR (because the satellites in their orbits are further from the center of the Earth than you are, and so their clocks run faster), and for SR (because the satellites are moving much faster relative to you, and so their clocks run slower). GPS wouldn't give useful results if this correction weren't performed.
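You can estimate the size of both corrections with a little back-of-the-envelope arithmetic. This sketch uses approximate round-number values for the Earth's gravitational parameter, its radius, and the GPS orbital radius, and the simple first-order formulas for gravitational and velocity time dilation:

```python
# Approximate relativistic clock offsets for a GPS satellite, per day.
# All constants are rounded assumptions, not precise geodetic values.
C = 2.998e8          # speed of light, m/s
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # mean Earth radius, m
R_ORBIT = 2.656e7    # GPS orbital radius (~20,200 km altitude), m

# GR: the weaker gravitational potential at altitude makes the satellite
# clock run fast. Fractional rate ~ (GM/c^2) * (1/R_earth - 1/R_orbit).
gr_rate = (GM / C**2) * (1.0 / R_EARTH - 1.0 / R_ORBIT)

# SR: orbital velocity makes the satellite clock run slow.
# For a circular orbit v^2 = GM/r; fractional rate ~ -v^2 / (2 c^2).
v_squared = GM / R_ORBIT
sr_rate = -v_squared / (2.0 * C**2)

SECONDS_PER_DAY = 86400.0
print(f"GR (fast): {gr_rate * SECONDS_PER_DAY * 1e6:+.1f} microseconds/day")
print(f"SR (slow): {sr_rate * SECONDS_PER_DAY * 1e6:+.1f} microseconds/day")
print(f"Net:       {(gr_rate + sr_rate) * SECONDS_PER_DAY * 1e6:+.1f} microseconds/day")
```

The GR term comes out to roughly +46 microseconds per day, the SR term to roughly -7, for a net drift of about +38 microseconds per day. Since light travels about 300 meters in a microsecond, an uncorrected clock would ruin position fixes within minutes.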
The resonant frequency of cesium-133 is the definition of the "second" in the International System (SI) of units. Count off exactly 9,192,631,770 periods of the microwave radiation corresponding to the hyperfine transition of the atom's ground state, and that's one second. If cesium is lying to us, we'll never know.
Or maybe we would. Experimental atomic clocks using elements like ytterbium are running in national metrology labs. These are called "optical" atomic clocks because they operate at optical frequencies - hundreds of terahertz, using lasers - instead of the gigahertz frequencies of microwaves, and their periods are measured in femtoseconds instead of the roughly hundred picoseconds of cesium. The time is near when the definition of the SI second will be changed to use these clocks.
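The arithmetic is simple enough to check. The cesium frequency is exact by definition; the ytterbium figure below is an approximate assumption (one of the optical transitions used in these clocks sits in the hundreds-of-terahertz range):

```python
# Periods of a microwave (cesium) vs. an optical (ytterbium) clock transition.
CS_HZ = 9_192_631_770   # cesium-133 hyperfine transition: exact, by SI definition
YB_HZ = 518e12          # ytterbium optical transition: approximate assumption

cs_period_ps = 1.0 / CS_HZ * 1e12   # period in picoseconds
yb_period_fs = 1.0 / YB_HZ * 1e15   # period in femtoseconds

print(f"Cesium period:    {cs_period_ps:.1f} picoseconds")
print(f"Ytterbium period: {yb_period_fs:.2f} femtoseconds")
```

Counting tens of thousands of times more oscillations per second is what buys these clocks their extra precision: the finer the tick, the finer the subdivision of the second.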
Clocks that are so precise that their position has to be determined by careful surveying, because their readings change if the altitude of the laboratory optical bench shifts by a centimeter, thanks to GR.
Clocks that are still nothing more than oscillators and counters.
(I took the photograph below in 2018: a survey marker embedded in the concrete floor of an optical atomic clock laboratory at NIST's Boulder, Colorado facility.)