What Does µm Mean (Micron)?

Glossary Definition

A micron, also known as a micrometer (represented as µm), is a unit of length equal to one millionth of a meter. (1,000µm is equal to 1mm.)
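
If it helps to see the conversions spelled out, here is a minimal Python sketch of the arithmetic; the function names are hypothetical, chosen just for this example:

```python
MICRONS_PER_MM = 1_000       # 1,000 um in a millimeter
MICRONS_PER_M = 1_000_000    # one million um in a meter

def um_to_mm(um: float) -> float:
    """Convert micrometers to millimeters."""
    return um / MICRONS_PER_MM

def um_to_m(um: float) -> float:
    """Convert micrometers to meters."""
    return um / MICRONS_PER_M

print(um_to_mm(1_000))  # 1.0 (mm)
print(um_to_m(15))      # 1.5e-05 (m), e.g. a 15 um pixel pitch
```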

You will see microns used for two types of measurements in our products: they are used to measure the wavelengths of electromagnetic radiation, and they are also used to specify the pixel pitch of our thermal sensors. These are two different measurements that do not directly relate to one another.

Sensor sizes for thermal imagers work similarly to those of standard visible-light cameras like DSLRs: the larger the sensor, the wider the angle of view; the smaller the sensor, the narrower the field of view, assuming the rest of the specifications, such as resolution and lens focal length, are the same. Pixel pitch ties into this because sensor size is simply pixel pitch multiplied by pixel count, so a smaller pitch at the same resolution means a smaller sensor. (In standard consumer cameras this phenomenon is known as crop factor and is well understood by most photographers.)
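
A short worked example makes the relationship concrete. This is a sketch using the standard pinhole-camera relation, FOV = 2·atan(sensor width / (2 × focal length)); the 640-pixel horizontal resolution is an assumed value for illustration, not a spec of any particular product:

```python
import math

def sensor_width_mm(pixel_pitch_um: float, horizontal_pixels: int) -> float:
    """Sensor width = pixel pitch x pixel count, converted from um to mm."""
    return pixel_pitch_um * horizontal_pixels / 1_000

def horizontal_fov_deg(pixel_pitch_um: float, horizontal_pixels: int,
                       focal_length_mm: float) -> float:
    """Horizontal field of view: FOV = 2 * atan(w / (2 * f))."""
    w = sensor_width_mm(pixel_pitch_um, horizontal_pixels)
    return math.degrees(2 * math.atan(w / (2 * focal_length_mm)))

# Same 640-pixel resolution and the same 100mm lens, two pixel pitches:
print(round(horizontal_fov_deg(15, 640, 100), 2))  # ~5.5 deg (wider view)
print(round(horizontal_fov_deg(10, 640, 100), 2))  # ~3.67 deg (narrower view)
```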

For example, our 10µm pixel pitch thermal sensor has a field of view roughly two-thirds as wide as a standard 15µm sensor of the same resolution (put another way, the 15µm sensor's field of view is about 50% wider), which means a 100mm lens on our 10µm sensor will give you the same field of view as a 150mm lens on a standard 15µm sensor. A smaller sensor size results in a longer viewing range, while a larger sensor size results in a wider view with less detail.
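
The focal-length equivalence follows from the same geometry: with equal pixel counts, matching fields of view requires focal lengths in the same ratio as the pixel pitches. Here is a minimal sketch of that arithmetic; the function name is hypothetical:

```python
def equivalent_focal_length_mm(focal_length_mm: float,
                               pixel_pitch_um: float,
                               reference_pitch_um: float = 15.0) -> float:
    """Focal length on a reference-pitch sensor that gives the same field
    of view, assuming equal pixel resolution: f_eq = f * (p_ref / p)."""
    return focal_length_mm * reference_pitch_um / pixel_pitch_um

# A 100mm lens on a 10um sensor frames like a 150mm lens on a 15um sensor.
print(equivalent_focal_length_mm(100, 10))  # 150.0
```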