
Physics and Measurement


To measure is to know. If you cannot measure it, you cannot improve it.
Lord Kelvin (William Thomson)

The topic of “Physics and Measurement” is essential for understanding the concepts that form the foundation of physics as a science. It covers the definition of physical quantities, the methods by which they are measured, and the interpretation of measurement results. In this article, we examine the role of measurement in physics, the units of measurement in use, measurement errors, and the significance of scientific notation.

The Scientific Foundations of Physics and Measurement

Physics is the branch of science that studies the material and energetic properties of the universe and their interactions. The description and understanding of physical phenomena are only possible through the quantitative expression of these phenomena using specific quantities. This is where the concept of measurement comes into play. Measurement is the process of obtaining the quantitative expression of a quantity by comparing it with a specific reference unit. All advancements in both theoretical and experimental physics are based on accurate and reliable measurements.

Physical Quantities and Unit Systems

Physical quantities are divided into two categories: fundamental (base) quantities, which are defined independently, and derived quantities, which are expressed as combinations of the base quantities. The fundamental quantities are length (meter, m), mass (kilogram, kg), time (second, s), electric current (ampere, A), temperature (kelvin, K), amount of substance (mole, mol), and luminous intensity (candela, cd). Each of these quantities is defined in the SI (Système International d’Unités) system, which forms the universal language of physical measurements.

Derived quantities are obtained by mathematical combinations of the fundamental quantities. For example, speed is defined as the distance traveled per unit of time and is expressed in meters per second (m/s). Force is defined as the product of mass and acceleration and is measured in newtons (N), where 1 N = 1 kg·m/s². The units of derived quantities follow from the units of fundamental quantities and are standardized in the SI system.
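The relationship above can be sketched in code. This is a minimal illustration of how derived units (m/s, N) arise from base units; the function names and input values are illustrative assumptions, not part of the original text.

```python
def speed(distance_m: float, time_s: float) -> float:
    """Speed as a derived quantity: distance per unit time, in m/s."""
    return distance_m / time_s

def force(mass_kg: float, acceleration_ms2: float) -> float:
    """Force as a derived quantity: F = m * a, in newtons (N = kg*m/s^2)."""
    return mass_kg * acceleration_ms2

# Example: 100 m covered in 9.58 s, and a 2 kg mass accelerating at 9.81 m/s^2.
v = speed(100.0, 9.58)
F = force(2.0, 9.81)
print(f"speed = {v:.2f} m/s, force = {F:.2f} N")
```

Note that the unit of the result is determined entirely by the units of the inputs, which is exactly the sense in which derived units are "built from" base units.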

Measurement Errors: Uncertainty and Precision

Any measurement process involves a specific measuring device and method, and the value obtained from this measurement contains a certain level of uncertainty. Measurement error refers to the deviation of the measurement result from the true value and is generally categorized into random errors and systematic errors.

  • Random errors occur when the measurement results show random variation within a certain range. These errors can be reduced by repeating the measurements.
  • Systematic errors, on the other hand, arise from a bias in the measuring device or method and consistently cause the measurement results to deviate in the same direction. Detecting and correcting systematic errors is crucial for obtaining accurate measurement results.

Measurement results are usually expressed with a certain level of uncertainty, which provides information about the precision of the measurement. Precision refers to the repeatability of a measurement, while accuracy indicates how close the measurement is to the true value. Achieving high precision and accuracy in physical experiments is key to obtaining reliable scientific results.

Scientific Notation and Dealing with Large and Small Numbers

In physics, we often encounter quantities that are either extremely large or very small. To express these numbers more conveniently and to facilitate comparison, scientific notation is used. Scientific notation writes a number as a coefficient, usually between 1 and 10, multiplied by an integer power of 10. For instance, the distance between the Earth and the Sun, rather than being expressed as 150 million kilometers, can be written as 1.5 × 10^8 km.

Scientific notation is useful not only for large numbers but also for very small ones. For example, the radius of a hydrogen atom is approximately 0.00000005 cm, which can be expressed in scientific notation as 5 × 10^-8 cm. This notation makes it easier to compare physical quantities and use them in calculations.
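Most programming languages support this notation directly. The snippet below, using Python's "e" format specifier, reproduces the two examples from the text (the variable names are my own):

```python
# The two quantities from the text, written as ordinary decimal numbers.
earth_sun_km = 150_000_000        # 1.5 x 10^8 km
hydrogen_radius_cm = 0.00000005   # 5 x 10^-8 cm

# The "e" format converts them to scientific notation automatically.
print(f"{earth_sun_km:.1e}")       # coefficient 1.5, exponent +08
print(f"{hydrogen_radius_cm:.0e}") # coefficient 5, exponent -08
```

The exponent alone makes the comparison immediate: the two quantities differ by sixteen orders of magnitude, which is far harder to see from the strings of zeros in the decimal forms.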

Measurement: The First Step in Physics

Measurement is one of the most fundamental aspects of physics and plays a central role in understanding physical phenomena. The data obtained through measurement is used to test theoretical models, discover new laws, and develop engineering applications. Understanding the measurement processes, minimizing errors, and effectively using scientific notation are indispensable tools in our quest to understand the physical world.


To understand and learn these topics in depth, it is important to solve various problems and see how these concepts are applied in real life. Theoretical knowledge becomes more permanent when reinforced with practical applications. Therefore, problem-solving exercises that include physics and measurement applications will increase your mastery of this subject. In the questions and solutions section below, you can find examples that reinforce these topics.


