Definition: An Interval Scale is a numeric scale in which numbers are assigned to objects such that numerically equal distances on the scale represent equal differences in the characteristic being measured.
The interval scale possesses all the characteristics of an ordinal scale, but it also allows the researcher to compare the differences between objects. The interval scale is characterized by a constant, or equal, interval between the values of the scale: the difference between any two adjacent values is the same everywhere on the scale. For example, the difference between 1 and 2 on the scale is the same as the difference between 3 and 4.
The most common example is the Celsius temperature scale, in which the difference between successive values is always the same. On an interval scale, the distance between the descriptors is known. Clock and calendar time is another common example of an interval scale, in which the values are known, constant, and measurable.
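To make the equal-interval property concrete, here is a minimal sketch in plain Python (no external libraries; the function name c_to_f is illustrative, not from any source). Converting Celsius to Fahrenheit is an affine transformation, and equal differences on the Celsius scale remain equal differences after conversion, which is exactly what makes differences meaningful on an interval scale.

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# Equal gaps on the Celsius scale...
gap_low = 20 - 10    # 10 degrees
gap_high = 40 - 30   # 10 degrees
assert gap_low == gap_high

# ...remain equal gaps after the change of units (both become 18 F).
assert c_to_f(20) - c_to_f(10) == c_to_f(40) - c_to_f(30)
print(c_to_f(20) - c_to_f(10))  # 18.0
```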
Interval scales, however, have no fixed beginning or true zero point, which means they do not possess the origin characteristic. For example, on a temperature scale the zero point is arbitrary: 0 °C does not mean the absence of temperature. And without a true zero point, it is impossible to compute meaningful ratios; one cannot say that 20 °C is twice as hot as 10 °C.
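The following short sketch (again plain Python, with the same illustrative c_to_f helper) shows why ratios break down without a true zero: the ratio of two temperatures changes when the arbitrary zero point changes, so a statement like "twice as hot" does not survive a simple unit conversion.

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

ratio_celsius = 20 / 10                      # 2.0
ratio_fahrenheit = c_to_f(20) / c_to_f(10)   # 68 / 50 = 1.36

print(ratio_celsius, ratio_fahrenheit)  # 2.0 1.36
# The two ratios disagree, so ratio statements carry no
# information on an interval scale.
```

Because the two unit systems give different answers to the same ratio question, the ratio itself is not a property of the underlying temperatures, only of the arbitrary zero point chosen.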