Q. Should I use 100 ohm or 1000 ohm RTDs?

A. In many HVAC applications, the RTD is mounted some distance from the controller or transmitter, and this is where the 1000 ohm RTD has a big advantage. The sensitivity of a 100 ohm RTD is about 0.21 ohms per °F, while the sensitivity of a 1000 ohm RTD is about 2.1 ohms per °F. Because both sensors see the same lead-wire resistance, the reading error caused by that resistance is roughly ten times greater with a 100 ohm RTD. For example, if a 100 ohm and a 1000 ohm RTD are each connected to a transmitter with 100 feet of 18 gauge wire, the error caused by wire resistance is calculated as follows.
Complete loop for 2-wire RTD = 200 feet of wire

18 gauge wire resistance is 0.664 ohms/100 ft.

Total resistance of wire = 1.328 ohms

For a 100 ohm RTD, error = 1.328/0.21 = 6.3°F.
For a 1000 ohm RTD, error = 1.328/2.1 = 0.63°F.
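
The arithmetic above is simple enough to script. Below is a minimal sketch in Python, assuming the standard 385-curve platinum sensitivity (0.00385 ohms/ohm/°C, which works out to the 0.21 and 2.1 ohms/°F figures above) and the 0.664 ohms per 100 ft figure for 18 gauge wire; the function name wire_error_degF is just illustrative.

```python
def wire_error_degF(rtd_nominal_ohms, wire_length_ft, wire_ohms_per_100ft=0.664):
    """Approximate reading error (in degF) from lead resistance on a 2-wire RTD."""
    # A 2-wire hookup puts both conductors in series with the element,
    # so the loop length is twice the one-way run.
    loop_resistance = 2 * wire_length_ft * wire_ohms_per_100ft / 100.0

    # Sensitivity of a 385-curve platinum RTD, converted to ohms per degF:
    # 0.00385 ohm/ohm/degC * nominal resistance / (1.8 degF per degC)
    sensitivity_ohms_per_degF = rtd_nominal_ohms * 0.00385 / 1.8

    return loop_resistance / sensitivity_ohms_per_degF

print(f"100 ohm RTD:  {wire_error_degF(100, 100):.1f} degF error")   # about 6.2 degF
print(f"1000 ohm RTD: {wire_error_degF(1000, 100):.2f} degF error")  # about 0.62 degF
```

Using the unrounded sensitivity gives about 6.2°F and 0.62°F, which agrees with the rounded figures above.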

In short, whenever the controller or transmitter is not located close to the RTD, a 1000 ohm RTD should be used.