Learn about the history of temperature measurement.

Check out the timeline of the origin of measurement and its inventions, such as thermocouples and resistance thermometers.

The perception of hot and cold has always accompanied human experience. However, transforming these sensations into measurable values has been a challenge that spans centuries. But who invented temperature measurement, and how is it possible to measure it as we know it today?

What is the origin of temperature measurement?

Well, there are no official records of methods used by ancient civilizations, such as the Greeks or Chinese, to measure temperature in a standardized way. Therefore, the history of thermal measurement is usually associated with the Renaissance period, when the first instruments capable of indicating variations in heat emerged.

Heat can be understood as the energy present in a body or material. The greater this energy, the higher the temperature. Unlike quantities such as mass and length, temperature could not be measured directly. Early methods observed the effects of heat on liquids, gases, or metals and, from these changes, estimated thermal values.

Another obstacle was the creation of reliable scales. In 1664, the English scientist Robert Hooke suggested the freezing point of water as an initial reference. Shortly after, the Danish astronomer Ole Rømer proposed the use of two fixed points: the freezing and boiling points of water. This approach allowed for the establishment of temperature ranges and gave rise to the first calibrated scales.

At the beginning of the 18th century, Rømer presented his own system of measurement. Around the same time, Isaac Newton developed a scale based on the freezing point of water and points obtained from the melting of metals. Although different from each other, these proposals helped to consolidate the idea that temperature needed to be expressed by comparable values.

What is temperature and how is it measured?

Temperature is a physical quantity that indicates the level of thermal energy of a body. It also expresses the direction of heat flow, which always occurs from the hotter body to the colder one. In the International System of Units, temperature is measured in kelvin (K). Absolute zero, equivalent to −273.15 °C, represents the point at which molecular activity is minimal.

On the Celsius scale, 0 °C corresponds to the freezing point of water and 100 °C to its boiling point, considering atmospheric pressure at sea level. The Fahrenheit scale uses the same reference points, defined as 32 °F and 212 °F. The two scales give the same reading at −40 degrees: −40 °C equals −40 °F.

Air pressure influences the behavior of water. Therefore, reference measurements consider standardized atmospheric conditions. These criteria were essential so that temperature could be compared in different places and contexts.
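The fixed points described above translate directly into simple conversion formulas. A minimal sketch in Python (the factors 9/5, 32, and 273.15 follow from the reference points mentioned in the text):

```python
def c_to_f(c: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

def c_to_k(c: float) -> float:
    """Convert degrees Celsius to kelvin."""
    return c + 273.15

# Water's freezing and boiling points at sea-level pressure
print(c_to_f(0))    # 32.0
print(c_to_f(100))  # 212.0

# The Celsius and Fahrenheit scales give the same reading at -40 degrees
print(c_to_f(-40))  # -40.0

# Absolute zero on the Celsius scale corresponds to 0 K
print(c_to_k(-273.15))  # 0.0
```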

Initial relative measurements

Until the 17th century, the instruments available did not provide exact temperature values. They only indicated whether something was hotter or colder relative to another object.

Around 250 BC, the Greek engineer Philo of Byzantium developed a device in which heated air expanded and pushed water through a tube. When the temperature dropped, the liquid returned to the container. The movement of the water indicated thermal variations, but without numerical values.

In the 1st century AD, Heron of Alexandria created a similar mechanism, known as a thermoscope. It was also based on the expansion and contraction of air or water. A hundred years later, the physician Galen added a scale indicating hot, cold, and neutral. Still, there was no standardization.

These instruments helped to observe temperature changes, but they did not allow for precise measurements. The lack of calibration limited the scientific use of these tools.

At the end of the 16th century, European scientists took new steps in their attempts to measure temperature. Among them were Galileo Galilei, Santorio Santorio, Robert Fludd, and Cornelius Drebbel. They developed more refined versions of the thermoscope, using glass bulbs connected to tubes immersed in colored water.

As the temperature rose, the air inside the bulb expanded and the water level in the tube changed. Although it indicated temperature variations, the instrument was still influenced by atmospheric pressure. This prevented the acquisition of reliable values.

Despite these limitations, the thermoscope helped solidify the idea that temperature could be observed systematically, paving the way for the development of calibrated thermometers.

Who invented the temperature scales?

At the beginning of the 18th century, Ole Rømer created the first calibrated temperature scale, based on the freezing and boiling points of water. Shortly after, Isaac Newton proposed his own scale, using thermal references from metals.

The decisive breakthrough occurred when scientists began using liquids in sealed containers, reducing the interference of air pressure. In 1654, Ferdinando II de' Medici built a thermometer using alcohol. Then, in 1714, Daniel Gabriel Fahrenheit presented a model using mercury, which was more stable and accurate.

The Fahrenheit scale defined 32 as the freezing point of water and 212 as its boiling point. This standardization allowed for more consistent measurements and expanded the use of thermometers in fields such as medicine and meteorology.

In 1742, the Swedish scientist Anders Celsius proposed a new scale. Initially, he defined 0 °C as the boiling point and 100 °C as the freezing point. Later, the scale was inverted to become more intuitive, arriving at its current format.

Pyrometer for high-temperature measurements

In the 18th century, the English potter Josiah Wedgwood created a mechanical pyrometer to measure the temperature of kilns. The equipment was based on the expansion of materials when exposed to heat.

Other, more sophisticated models emerged throughout the 19th century. They used metal bars that expanded when heated, moving needles over graduated scales. These devices allowed measurements in environments where conventional thermometers could not withstand the heat.

The pyrometer has become an important tool in industry, especially in processes involving high temperatures, such as the production of ceramics and metals.

Temperature measurement with a thermocouple

In 1821, the physicist Thomas Seebeck observed that a circuit formed by joining two different metals generated an electric current when one of the junctions was heated. This phenomenon became known as the Seebeck effect.

The discovery led to the creation of thermocouples, sensors that convert temperature differences into electrical signals. The thermocouple became widely used in industry due to its durability and ability to operate in high temperature ranges.

This type of sensor is still common in ovens, motors, and thermal control systems.
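In practice, thermocouple voltages are converted to temperature with standardized polynomial tables; as a simplified illustration only, the sketch below assumes a roughly constant sensitivity of about 41 µV/°C (a typical average for a type K thermocouple) and a known cold-junction temperature:

```python
# Hypothetical linear approximation of a type K thermocouple.
# Real instruments use standardized reference tables, not a single constant.
SEEBECK_UV_PER_C = 41.0  # assumed average sensitivity, microvolts per degree C

def thermocouple_temperature(voltage_uv: float, cold_junction_c: float = 25.0) -> float:
    """Estimate the hot-junction temperature from the measured voltage.

    The voltage is proportional to the temperature difference between the
    hot and cold junctions (Seebeck effect), so the cold-junction
    temperature must be added back (cold-junction compensation).
    """
    return cold_junction_c + voltage_uv / SEEBECK_UV_PER_C

# A reading of 4100 microvolts with the cold junction at 25 degrees C
print(round(thermocouple_temperature(4100.0), 1))  # 125.0
```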

RTDs and electrical resistance

In the second half of the 19th century, Carl Wilhelm Siemens demonstrated that the electrical resistance of certain metals varied with temperature. This observation led to the creation of RTDs (Resistance Temperature Detectors), usually made of platinum.

In 1885, Hugh Longbourne Callendar improved the design, making it more reliable and viable for commercial use. RTDs offer stable readings and are used in applications requiring higher precision.

Unlike thermocouples, these sensors require an external power supply and have more complex measurement systems.
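Over moderate ranges, the resistance of a platinum RTD is close to linear in temperature. A minimal sketch, assuming the common "Pt100" configuration (100 Ω at 0 °C) and a typical platinum coefficient of 0.00385 per °C:

```python
# Simplified Pt100 RTD model: resistance varies almost linearly with temperature.
R0 = 100.0       # assumed resistance in ohms at 0 degrees C (a "Pt100" sensor)
ALPHA = 0.00385  # typical temperature coefficient for platinum, per degree C

def rtd_resistance(temp_c: float) -> float:
    """Resistance of the RTD at a given temperature (linear approximation)."""
    return R0 * (1 + ALPHA * temp_c)

def rtd_temperature(resistance_ohm: float) -> float:
    """Invert the linear model to recover temperature from a resistance reading."""
    return (resistance_ohm / R0 - 1) / ALPHA

print(rtd_resistance(0))                 # 100.0
print(rtd_resistance(100))               # 138.5
print(round(rtd_temperature(138.5), 1))  # 100.0
```

Measuring that resistance is why RTDs need an external excitation current, which is the power supply mentioned above.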

Thermoresistors and semiconductors

Thermistors are sensors made of metal oxides that change their electrical resistance according to temperature. They emerged as a compact and efficient alternative for thermal measurements.

With a typical resistance of around 2000 ohms at 25 °C, thermistors are used in electronic equipment, control systems, and household devices. Their main advantage is their sensitivity to small temperature variations.
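A thermistor's resistance varies exponentially rather than linearly, which is the source of its high sensitivity. A sketch using the common beta-parameter model, taking the 2000 Ω at 25 °C figure from the text and an assumed beta constant of 3950 K (this varies by part):

```python
import math

# NTC thermistor model using the beta-parameter equation.
R25 = 2000.0    # resistance in ohms at 25 degrees C (figure cited in the text)
BETA = 3950.0   # assumed beta constant in kelvin; varies by component
T25_K = 298.15  # 25 degrees C expressed in kelvin

def thermistor_resistance(temp_c: float) -> float:
    """Resistance at a given temperature via the beta equation."""
    t_k = temp_c + 273.15
    return R25 * math.exp(BETA * (1 / t_k - 1 / T25_K))

def thermistor_temperature(resistance_ohm: float) -> float:
    """Invert the beta equation to get temperature in degrees C from resistance."""
    t_k = 1 / (1 / T25_K + math.log(resistance_ohm / R25) / BETA)
    return t_k - 273.15

print(round(thermistor_resistance(25.0), 1))     # 2000.0
print(round(thermistor_temperature(2000.0), 1))  # 25.0
```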

Infrared measurement

Infrared radiation was discovered by William Herschel in 1800. However, it wasn't until 1931 that the first thermometers capable of using it to measure temperature without contact appeared.

During World War II, the technology advanced for military reasons. In the 1960s, the physician Theodore Benzinger developed a portable infrared thermometer for medical use, facilitating quick and safe measurements.

Today, infrared thermometers are common in industrial inspections, airports, hospitals, and domestic environments.

Thermal cameras and thermography

Unlike point pyrometers, thermal cameras allow visualization of the temperature distribution across an entire surface. The technology began to be developed in the 1920s by the physicist Kálmán Tihanyi.

Initially designed for military use, these cameras have since been used in fields such as engineering, medicine, and security. With advancements in sensors and digital processing, the equipment has become more accessible.

Currently, thermography is used to identify machine faults, detect thermal leaks, and assist in rescue operations.

Concept of absolute zero

In the 19th century, scientists like Gay-Lussac studied the behavior of gases under different temperatures. They observed that volume increased proportionally to heating.

These studies led to the concept of absolute zero, the point at which molecular motion virtually ceases. This value was fixed at −273.15 °C and became the basis of the Kelvin scale.

Today, temperature sensors are part of data acquisition systems, medical equipment, industries, and household devices. Signal conditioning allows thermal information to be converted into digital data, facilitating automatic analysis and control.

Infrared cameras, thermocouples, RTDs, and thermistors continue to be used depending on the application. Each technology meets specific needs, whether in extreme environments or in high-precision measurements.

Temperature measurement, which began with rudimentary instruments, is now an essential element of modern life, influencing everything from health to industrial production.

Industrial temperature measurement in Brazil with Alutal

When it comes to temperature measurement in the Brazilian industrial sector, Alutal is considered a benchmark. The company has been operating for over three decades, supplying equipment and systems used in the thermal control of industrial processes in areas such as steelmaking, oil and gas, chemicals, food, paper and pulp, among others.

Their products include sensors, thermocouples, resistance thermometers, transmitters, and instrumentation systems used in furnaces, boilers, reactors, pipelines, and production lines. This equipment is used to monitor temperature variations that directly affect the operation of industrial plants.

In addition to manufacturing equipment, the company also provides calibration, maintenance, and technical support services. This set of activities allows Alutal to be present in industrial projects in different regions of the country, related to temperature control in production processes.

Take a look at the Alutal equipment used in temperature measurement in industrial processes.

Anny Malagolini

Anny Malagolini is a journalist, writer, and SEO specialist with extensive experience producing strategic web content.
