When was the world's first commercial atomic clock unveiled?

The Atomichron, unveiled on October 3, 1956, was the world's first commercial atomic clock. It arrived at a time when timekeeping was becoming more accurate than ever before.

In his work of fiction The Time Keeper, American author Mitch Albom has one of his characters say that "man will count all his days, and then smaller segments of the day, and then smaller still - until the counting consumes him, and the wonder of the world he has been given is lost." While the last part of the statement is rather deep, and well beyond the scope of this column, there might be some truth to the idea of the counting consuming us.

Comes down to counting

When we started, we looked up at the sun and the moon to get a sense of time. We picked up stones, collected water, and were able to tell time even better. And now, we have come to a stage where the best of our clocks are so precise that it would take around 30 billion years for one to lose even a second.
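A rough back-of-the-envelope calculation shows what that claim means: losing one second over 30 billion years corresponds to a fractional timing error of about one part in a billion billion. This is a quick sketch of that arithmetic, not a statement about any particular clock:

```python
# Rough check: what fractional error corresponds to losing
# one second over roughly 30 billion years?
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 3.16e7 seconds
years = 30e9

fractional_error = 1 / (years * SECONDS_PER_YEAR)
print(f"fractional error ~ {fractional_error:.1e}")  # roughly 1e-18
```

In other words, such a clock keeps time to about 18 decimal places.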

And yet, at the heart of it, the fundamental process remains the same as we count a periodic phenomenon. In a grandfather clock, the pendulum swings back and forth. In a wristwatch, an electric current ensures that a tuning fork-shaped piece of quartz oscillates. And when it comes to atomic clocks, we use certain resonance frequencies of atoms and count the periodic swings of electrons as they jump between energy levels.

What are atomic clocks?

The best of our clocks, by the way, are atomic clocks. As we learned more of the atom's secrets, we were able to build practical applications, including these clocks.

We now know that an atom is made up of a nucleus - consisting of protons and neutrons - that is surrounded by electrons. While the number of electrons in an element can vary, they occupy discrete energy levels, or orbits.

Electrons can jump to higher orbits around the nucleus on receiving a jolt of energy. As each element responds only to a very specific frequency to make this jump, scientists can measure this frequency to keep time very accurately.

Been around since the 1950s

By the mid-1950s, atomic clocks with caesium atoms that were accurate enough to be used as time standards had been built.

The Massachusetts Institute of Technology's Research Laboratory of Electronics developed the first commercial atomic clocks around the same time, and these were manufactured by the National Company, Inc. (NATCO) of Malden, Massachusetts.

Initially, the atomic beam clocks that NATCO was building were called just that: ABC. By 1955, the prototypes bore the working name National Atomic Frequency Standard (NAFS). As this acronym was clearly not pleasing to the ear, a better name was needed to market the first practical commercial atomic clock.

Quantum electronics equipment

They came up with the name Atomichron, which NATCO then registered as a trademark for all its atomic clocks. In a well-publicised event at the Overseas Press Club in New York, the Atomichron was unveiled to the world on October 3, 1956.

The first commercial atomic clock was indeed the first piece of quantum electronics equipment made available to the public. In the years that followed, 50 Atomichrons were made and sold to military agencies, government agencies, and universities.

Defining a second

By 1967, the official definition of a second by the International System of Units (SI) was based on caesium. This meant that the internationally accepted unit of time was now defined in terms of movements inside atoms of caesium.
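The definition itself is simple counting: one second is, by the SI definition, exactly 9,192,631,770 periods of the radiation from a particular hyperfine transition of the caesium-133 atom. A minimal sketch of that bookkeeping (the function name is illustrative, not from any standard library):

```python
# SI definition: one second = 9,192,631,770 periods of the radiation
# from a specific hyperfine transition of the caesium-133 atom.
CAESIUM_HZ = 9_192_631_770

def elapsed_seconds(cycles_counted):
    """Convert a count of caesium oscillations into elapsed seconds."""
    return cycles_counted / CAESIUM_HZ

print(elapsed_seconds(CAESIUM_HZ))       # exactly one second
print(elapsed_seconds(CAESIUM_HZ * 60))  # one minute
```

An atomic clock, at heart, is a device that keeps this count running without ever losing track.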

Atomic clocks, however, aren't coming to our homes any time soon. About the size of a wardrobe, a typical atomic clock consists of interwoven cables, wires, and steel structures connected to a vacuum chamber that holds the atoms.

These clocks, however, are already in use everywhere around us. Be it satellite navigation, online communication, or even timed races in the Olympics, atomic clocks are in action. The best of our atomic clocks, as you might have guessed, are employed in research and experiments to further our understanding of the universe around us.


What is the backstory behind the invention of the Xerox machine?


Young Chester Carlson worked as a patent analyser for a manufacturer of electrical components. This required laborious paperwork - he had to submit multiple copies when registering his company's inventions and ideas at the patent office. Each duplicate had to be written by hand. Carlson suffered from arthritis. He knew there had to be another way of doing his job.

Working in his kitchen during his free time, Carlson discovered that some materials change their electrical properties when exposed to light, a phenomenon called photoconductivity. After years of research, he secured a patent in 1942 for a reproduction technique based on this, which he named 'electrophotography'. Another 20 years went by before he found a company interested enough to manufacture the machine. He was turned away by the likes of IBM, GE and RCA, until, in 1960, the Haloid company finally thought his idea marketable.

The company was later named Xerox. The process became so popular all over the world that the word 'xeroxing' (a trademark) is used instead of the correct term, 'photocopying'!



William Bateson was an English biologist who was the first person to use the term genetics to describe the study of heredity, and the chief populariser of the ideas of Gregor Mendel following their rediscovery in 1900 by Hugo de Vries and Carl Correns. His 1894 book Materials for the Study of Variation was one of the earliest formulations of the new approach to genetics.

Bateson became the chief popularizer of the ideas of Mendel following their rediscovery. In 1909 he published a much-expanded version of his 1902 textbook entitled Mendel's Principles of Heredity. This book, which underwent several printings, was the primary means by which Mendel's work became widely known to readers of English.

"Bateson first suggested using the word "genetics" (from the Greek [Offsite Link]  genn?, ?????; "to give birth") to describe the study of inheritance and the science of variation in a personal letter to Alan Sedgwick... dated April 18, 1905. Bateson first used the term genetics publicly at the Third International Conference on Plant Hybridization in London in 1906. This was three years before Wilhelm Johannsen used the word "" to describe the units of hereditary information. De Vries had introduced the word "pangene" for the same concept already in 1889, and etymologically the word genetics has parallels with Darwin's concept of pangenesis.

Bateson co-discovered genetic linkage with Reginald Punnett, and he and Punnett founded the Journal of Genetics in 1910. Bateson also coined the term "epistasis" to describe the genetic interaction of two independent traits.



Researchers from the Weizmann Institute of Science have developed an advanced, innovative method to detect non-visual traces of fire. Using this method, they have discovered one of the earliest known pieces of evidence for the use of fire, dating back at least 8,00,000 years. Their results were published in PNAS in late June.

Ancient hominins are a group that includes humans and some of our extinct family members. The controlled use of fire by this group dates back at least a million years. Archaeologists believe that this was the time when Homo habilis began its transition to Homo erectus.

Cooking hypothesis

A working theory called the "cooking hypothesis", in fact, postulates that the use of fire was instrumental in our evolution. Controlled fire not only allowed for staying warm, crafting tools, and warding off predators, but also enabled cooking, paving the way for the growth of the brain.

Traditional archaeological evidence, which relies on visual identification of modifications resulting from combustion, has provided widespread evidence of fire use no older than 2,00,000 years. Sparse evidence of fire dating back 5,00,000 years also exists.

The team of scientists involved in this research had pioneered the application of AI and spectroscopy in archaeology to find indications of controlled burning of stone tools. For this research, they developed a more advanced AI model capable of finding hidden patterns across a multitude of scales. The model's output could thus estimate the temperature to which the stone tools were heated, providing insights into past human behaviours.

Assess heat exposure

The researchers took their method to Evron Quarry, an open-air archaeological site first discovered in the 1970s. The site is home to fossils and tools dating back to between 8,00,000 and 1 million years ago, but without any visual evidence of heat. With their accurate AI, the team assessed the heat exposure of 26 flint tools. The results showed that these tools had been subjected to a wide range of temperatures, with some even being heated to over 600 degrees Celsius. The presence of hidden heat pushes the traces of controlled fire back to at least 8,00,000 years ago.

Apart from identifying non-visual evidence of fire use, the scientists hope that their newly developed technique will provide a push toward a more scientific, data-driven archaeology that uses new tools. The researchers believe that this will help us understand the behaviour of our early ancestors and the origins of the human story.



Food waste is a huge problem worldwide. In Japan alone, the edible food waste produced in 2019 amounted to 5.7 million tons. While the Japanese government aims to reduce that to around 2.7 million tons by 2030, others are working on the same problem differently. Researchers from the University of Tokyo, for instance, have found a new method to create cement from food waste.

In addition to addressing the issue of food waste, the researchers also hope to reduce global warming in this way. Apart from the estimate that cement production accounts for 8% of the world's carbon dioxide emissions, there is also the fact that wasted food materials rotting in landfills emit methane. By using these materials to make cement, scientists hope to reduce global warming.

The researchers borrowed a heat pressing concept that they had employed to pulverise wood particles to make concrete. Using simple mixers and compressors that they could buy online, the researchers followed a three-step process of drying, pulverising, and compressing to turn wood particles into concrete.

Heat pressing concept

Following this success, they decided to do the same to food waste. Months of failures followed as they tried to get the cement to bind by tuning the temperature and pressure. The researchers say that this was the toughest part of the process, as different foodstuffs require different temperatures and pressure levels.

The researchers were able to make cement using tea leaves, coffee grounds, Chinese cabbage, orange and onion peels, and even lunch-box leftovers. To make this cement waterproof and protect it from being eaten by rodents and other pests, the scientists suggest coatings of lacquer.

Cement that can be eaten!

Additionally, the researchers tweaked the flavour with different spices to arrive at different colours, scents, and tastes of cement. Yes, you read that right. This material can even be eaten by breaking it into pieces and then boiling it.

The scientists hope that their material can be used to make edible makeshift housing materials for starters, as these are bound to be useful in times of disaster. If food cannot be delivered to evacuees, for instance, then they could perhaps eat makeshift beds prepared from food cement. The food cement that they have created is reusable and biodegradable.
