I had two simple questions about the Sensor Accuracy property.
Question #1: What is the number format?
For example, should 80% be stored in the JSON response as "80" or "0.80"?
I'm guessing "80" based on other percentages in Redfish, but I wanted to confirm it.
In the mockup, several Sensors, such as PS1InputVoltage and PS1Out12V, have an Accuracy value of "0.02", so I wasn't sure whether that was supposed to mean 2% or 0.02%.
Question #2: Is the percentage standalone, or is it related to other min/max values?
For example, a sensor has an Accuracy of 2% and a Reading of 10.
Does that mean the actual sensor value is 9.8 to 10.2?
Or is the Accuracy based on a min/max range? For example, if the min/max range of the Sensor is 0-100, then the actual value is 8 to 12?
If the Accuracy is based on a min/max range, where is that range defined? Is it the ReadingRangeMin/ReadingRangeMax properties of the Sensor? Since those properties are optional, does that mean Accuracy can't be used when they are missing?
Last Edit: May 24, 2022 20:32:20 GMT by shawnmm: Added followup to Question #2.
For question 1, the units annotation on Accuracy shows "%", which means the value is expressed in percent, on a 0-to-100 scale. So, a value of 0.25 means 0.25%, not 25%.
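To make that interpretation concrete, here's a short Python sketch of how a client would apply an Accuracy value from a Sensor payload (the payload values below are made up for illustration, not taken from the mockup):

```python
# Hypothetical Redfish Sensor payload fragment. Accuracy is expressed
# in percent (0-100), so 0.02 means 0.02%, not 2%.
sensor = {
    "Reading": 12.08,
    "ReadingUnits": "V",
    "Accuracy": 0.02,
}

# Convert the percentage to a fraction, then to a tolerance in the
# sensor's own units (here, volts).
accuracy_fraction = sensor["Accuracy"] / 100      # 0.0002
tolerance = sensor["Reading"] * accuracy_fraction  # 12.08 * 0.0002 = 0.002416 V
print(f"Reading is {sensor['Reading']} +/- {tolerance:.6f} {sensor['ReadingUnits']}")
```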
For question 2, the accuracy is standalone. In simple cases I would expect it to be derived from the sensor's specification, such as an LM75 purchased from a particular supplier: the specification would contain the limits of the device, such as min and max readings, as well as the accuracy of the reading. More sophisticated devices might report this information outward (perhaps they have I2C registers from which a manager can extract it). For synthesized sensors, where the manager constructs a reading from other information in the system, the accuracy would need to be calculated from the constraints of those other sources.
I wanted to ask a quick follow-up to ensure I am understanding the answer to question 2. There was still some confusion on this.
So if a sensor has an Accuracy of 2% and a Reading of 10, the actual value is assumed to be from 9.8 to 10.2? The Accuracy value is independent of a min/max range, such as ReadingRangeMin/ReadingRangeMax?
Yes, technically if you have 2% accuracy and the reading is 10, the actual metric could be anywhere from 9.8 to 10.2, regardless of the min/max properties. Think of it in terms of how a physical sensor samples something. Using the LM75 example again, it has a spec'd range for what it reports in its I2C register, such as -20C to 100C; from an I2C register perspective, you're guaranteed a value from -20C to 100C, and if the value read over I2C falls outside that range, it should be considered invalid.

However, due to how the sensor is manufactured, the environment could be at 100.2C while the hardware in the sensor detects 99.8C, which in turn is the value the BMC reads over I2C. The true accuracy of a sensor typically depends on the manufacturing process of the device, and it can only be verified with other equipment that has much tighter accuracy tolerances.
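Under that reading-relative interpretation, the band can be computed directly from the Reading and Accuracy properties, with no dependence on ReadingRangeMin/ReadingRangeMax. A minimal sketch (the function name is mine, not anything from the schema):

```python
def reading_bounds(reading: float, accuracy_pct: float) -> tuple[float, float]:
    """Return the (low, high) band implied by a reading-relative
    Accuracy expressed in percent, independent of ReadingRangeMin/Max."""
    delta = reading * accuracy_pct / 100
    return reading - delta, reading + delta

# The example from this thread: 2% accuracy on a reading of 10
# implies the actual value is between 9.8 and 10.2.
low, high = reading_bounds(10, 2)
print(low, high)
```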
One thing Jeff noted to me is that manufacturers typically rate sensor accuracy in the sensor's own units rather than as a percentage, so we'll need to clarify how to establish an accuracy percentage in the Redfish model. Using the LM75 example again, the sensor might be rated with an accuracy of +/- 0.5C over the -20C to 100C range. With this in mind, you're probably right that you need to use the range of the sensor to determine the percentage value to assign to the Accuracy property in Redfish. We'll need to discuss this further with others.
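If that direction is adopted, the conversion would amount to dividing the rated absolute accuracy by the sensor's full-scale range. A sketch of what that might look like (this helper is an assumption for discussion, not anything defined by the spec):

```python
def accuracy_pct_from_rating(abs_accuracy: float,
                             range_min: float,
                             range_max: float) -> float:
    """Convert a manufacturer rating of +/- abs_accuracy (in the sensor's
    own units) over [range_min, range_max] into a percentage of full
    scale, as a candidate value for the Redfish Accuracy property."""
    return abs_accuracy / (range_max - range_min) * 100

# LM75 example from above: +/- 0.5C over -20C to 100C, a 120C span,
# works out to roughly 0.4167% of full scale.
print(accuracy_pct_from_rating(0.5, -20, 100))
```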