Time: Where does it come from?

This is the first part of a two-part series on time measurement. It explains the general concepts behind measuring time; Part 2 will explain how to obtain accurate time measurements in Java.

How do we measure time?

With a clock, of course. So how do we build a clock? The first thing we need is something that occurs at a constant rate. Humans have built clocks by observing changes in their environment that appeared to occur at a constant rate.


Some of the earliest clocks were sundials. The shadow cast by a sundial moves with the rotation of the Earth on its axis, and this observation of the Earth's rotation is what Greenwich Mean Time was based on. At the Royal Observatory in Greenwich, astronomers would observe the Sun and other stars in order to calibrate their clocks. In principle, on average, the Sun would appear at its highest point in the sky at 12:00:00; this is why it is called Mean Time. The current term for this mean solar time is UT1. This method of time measurement has a fundamental problem: the Earth does not rotate on its axis at a constant rate. Its rotation fluctuates and, on average, is slowing down. At this rate the Earth will eventually stop rotating on its axis; fortunately, that will only happen in about 1.9 trillion years.

The definition of the second

The second has historically been derived from the length of the day divided by 86,400 (24 hours x 60 minutes x 60 seconds). Since the invention of atomic clocks, measurements of time with much greater accuracy are possible: the NIST-F1 atomic clock is expected to neither gain nor lose a second in 100 million years. At this level of accuracy it was insufficient to define the second with respect to the length of a day, and it was therefore redefined in 1967 as "9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom." That's a mouthful; put a little more simply, it's an electron moving back and forth between two sub-energy levels of a caesium atom.

Why is time measurement useful?

Sometimes we want to determine the order of events, the interval between them, or agree on holding an event at a time in the future. Unfortunately there is no easily derived absolute time, i.e. time has to be measured relative to some particular point in the past. We therefore pick a reference point, usually a significant event in history, and measure time from it. Examples include Year 1 in the Gregorian calendar (they started counting from 1) and Unix time, which is defined as the number of seconds since 00:00:00 UTC on the 1st of January 1970.
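As a small preview of Part 2, Java's standard java.time package (available since Java 8) exposes Unix time directly; the following is a minimal sketch:

```java
import java.time.Instant;

public class UnixTimeDemo {
    public static void main(String[] args) {
        // The Unix epoch: 00:00:00 UTC on 1 January 1970.
        Instant epoch = Instant.EPOCH;
        System.out.println(epoch); // prints 1970-01-01T00:00:00Z

        // The current Unix time: whole seconds elapsed since the epoch.
        long now = Instant.now().getEpochSecond();
        System.out.println(now);
    }
}
```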

With these relative starting points we can state that the Boston Tea Party occurred on the 16th of December 1773 in the Gregorian calendar, or that there will be a total solar eclipse at 1607962479 Unix time. Depending on the event, we require varying levels of granularity in our time measurement. Events like the Boston Tea Party do not require second-level granularity to be useful; however, the total phase of a solar eclipse can last only a few seconds, so second-level granularity is required to avoid missing the event.
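A Unix timestamp can be converted back into a human-readable date; this sketch resolves the eclipse timestamp above to UTC:

```java
import java.time.Instant;

public class EclipseTime {
    // Convert a Unix timestamp (whole seconds since the epoch) to a UTC instant.
    public static Instant fromUnixSeconds(long seconds) {
        return Instant.ofEpochSecond(seconds);
    }

    public static void main(String[] args) {
        // 1607962479 is the eclipse timestamp mentioned above.
        System.out.println(fromUnixSeconds(1607962479L));
        // prints 2020-12-14T16:14:39Z
    }
}
```

Note that the conversion itself needs no time zone: Unix time and Instant both describe a point on the UTC timeline.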

Time in a computing world

Computers make millions of decisions per second. In financial markets, computers are used to automatically decide whether to buy or sell assets. To determine the order of these events we need clocks with microsecond, and sometimes nanosecond, granularity. A microsecond is one millionth of a second and a nanosecond is one billionth of a second. To put that in perspective, one nanosecond is to one second as one second is to 31.71 years.

Markets in Financial Instruments Directive (MiFID)

MiFID is a large set of European Union laws which impose a broad range of regulations on investment services. A review of MiFID, known as MiFID II, requires details of trades to be published, including the time at which they occurred. For traders using "high frequency algorithmic trading" it requires that the clocks used have at least one-microsecond granularity. See: Regulatory technical and implementing standards - Annex I, pg 507.

Granularity of a time measurement is not very useful if the measurement is not accurate. The only way to determine the accuracy of a clock is to compare it against another clock that you trust more. To check the accuracy of your wristwatch you glance at another clock and compare; that way you may be able to synchronise your clocks to within a couple of seconds. The process is much more challenging when you are attempting to synchronise clocks with microsecond accuracy.
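The same "compare against a clock you trust more" idea can be demonstrated inside a single JVM, which happens to have two independent clocks: the wall clock (System.currentTimeMillis(), which can be stepped by NTP) and the monotonic clock (System.nanoTime()). The sketch below measures the same interval on both and reports the disagreement; on a healthy system it should be near zero, and a large value suggests the wall clock was adjusted mid-measurement. This is only an illustration of the comparison technique, not a substitute for real synchronisation protocols such as NTP or PTP:

```java
public class ClockCompare {
    // Measure one interval on two clocks and return how far apart
    // (in milliseconds) their elapsed-time measurements are.
    public static long elapsedDifferenceMillis(long sleepMillis) {
        long wallStart = System.currentTimeMillis();
        long monoStart = System.nanoTime();
        try {
            Thread.sleep(sleepMillis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        long wallElapsed = System.currentTimeMillis() - wallStart;
        long monoElapsed = (System.nanoTime() - monoStart) / 1_000_000;
        return wallElapsed - monoElapsed;
    }

    public static void main(String[] args) {
        System.out.println("Disagreement: " + elapsedDifferenceMillis(100) + " ms");
    }
}
```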

Part 2: How to efficiently obtain nanosecond clock precision from Java.
