The radiator fan on these cars can run at two separate speeds, LOW and HIGH.
On this 2006 GranSport, the low speed turns on at 95 deg C and shuts off at 90 deg C, while the high speed turns on at 100 deg C and shuts off at 95 deg C. These thresholds assume the air conditioning is off.
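Those on/off pairs form a simple hysteresis band, which can be sketched as a little state machine. This is purely an illustrative model of the behaviour described here (the threshold values are the ones above, A/C off); it is not the car's actual ECU logic.

```python
# Illustrative model of the fan-speed hysteresis (2006 GranSport, A/C off).
# Thresholds are from this post; the state machine itself is a sketch.

LOW_ON, LOW_OFF = 95.0, 90.0     # deg C: low speed turns on / shuts off
HIGH_ON, HIGH_OFF = 100.0, 95.0  # deg C: high speed turns on / shuts off

def next_fan_state(temp_c, state):
    """state is 'OFF', 'LOW' or 'HIGH'; returns the new state."""
    if state == 'OFF':
        if temp_c >= LOW_ON:
            state = 'LOW'
    if state == 'LOW':
        if temp_c >= HIGH_ON:
            state = 'HIGH'
        elif temp_c < LOW_OFF:
            state = 'OFF'
    elif state == 'HIGH':
        if temp_c < HIGH_OFF:
            state = 'LOW'
    return state
```

Note the overlap: between 90 and 95 deg C the fan stays in whatever state it was already in, which is what stops it from rapidly cycling on and off around a single threshold.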
In the low-speed case, the circuit is fed through a resistor; the resulting voltage drop makes the fan spin slower. This resistor is subjected to high temperatures because of the power it must dissipate, and it often fails open. When it does, the low-speed fan no longer functions.
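A quick back-of-the-envelope P = I²R calculation gives a feel for why the resistor runs so hot. The 0.23-ohm value is the nominal figure from the part's sticker; the 15 A low-speed fan current is an assumed, order-of-magnitude guess for illustration only, not a measured value.

```python
# Rough feel for the resistor's heat load. R is from the part sticker;
# the fan current is a GUESSED order-of-magnitude figure, not measured.

R = 0.23       # ohms, dropping resistor (nominal, per sticker)
I_FAN = 15.0   # amps, assumed low-speed fan current (illustrative)

v_drop = I_FAN * R            # volts dropped across the resistor
p_dissipated = I_FAN**2 * R   # watts it must shed as heat
print(f"Drop: {v_drop:.2f} V, dissipation: {p_dissipated:.0f} W")
```

Even with that guessed current, the resistor is shedding tens of watts as heat, which is why it sits in the fan's airflow and why its insulation takes such a beating.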
The replacement part (p/n 263140) isn't outrageously expensive, but it still runs around $75.
Mine was looking a bit sorry: the green insulating coating had cracked away, exposing the resistive element inside.
Ideally it looks something more like this when new.
Regardless of its physical appearance, the resistor may still work perfectly fine.
The first thing to check is that there are no breaks in the metal coil wrapped around the core. If there are, the resistor is most definitely shot and needs replacement. If there is no obvious damage, the next step is to verify that the resistance is still in spec, which confirms there are no hidden breaks and that the resistive strip isn't marginal.
The reason this test is needed is that even when the part looks fine, there's a small thermal fuse (the metal cylinder) tucked underneath the resistor. Its job is to open the circuit should the resistor reach extreme temperatures. Mine has a Microtemp 216C, which will blow open somewhere between roughly 191 deg C and 200 deg C. It's a one-shot device too: once it has blown, it remains open circuit forever. (Note too that this isn't measuring coolant temperature; it's the temperature in the vicinity of the resistor.)
The nominal resistance is 0.23 ohms, per the sticker on the part. Measuring a resistance this low directly with a basic ohm meter is not very accurate: the leads on your multi-meter likely have a resistance in this same neighborhood, and the reading depends heavily on how well you can get the probes to contact the terminals on the resistor. For fun I tried it anyway and got fairly random values of about 0.46 – 2.7 ohms (i.e. not very useful). My probes alone, shorted together with nothing being measured, read 0.18 ohms.
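The arithmetic makes it obvious why the 2-wire reading is hopeless: the lead and contact resistances sit in series with the part and are comparable to it. The 0.23-ohm and 0.18-ohm figures below are from this post; the extra contact-resistance figure is a made-up illustration, since in practice it varies wildly with probe pressure.

```python
# Why a 2-wire measurement fails here: lead and contact resistance add
# in series with the part and are the same order of magnitude as it.

R_PART = 0.23     # ohms, nominal value from the part sticker
R_LEADS = 0.18    # ohms, measured with the probes shorted together
R_CONTACT = 0.10  # ohms, HYPOTHETICAL probe-to-terminal contact resistance

reading = R_PART + R_LEADS + R_CONTACT
error_pct = 100 * (reading - R_PART) / R_PART
print(f"2-wire reading: {reading:.2f} ohms ({error_pct:.0f}% high)")
```

With the contact resistance changing every time you reposition the probes, the reading bounces around just as described above.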
A more accurate way to do this is the 4-wire method: feed a constant current through the resistor and measure the voltage drop across it. This measurement is largely unaffected by the test leads and how well they make contact, so it is much more accurate. The main prerequisite is a DC power supply capable of outputting a constant current. If you don't have one, you can still do the test with a battery (keep the voltage low!! e.g. a 1.5V AA) by first measuring the current in the circuit and then measuring the voltage drop, but it's not nearly as convenient.
For my setup, I configured a constant current of 0.5A on my power supply, then used a multi-meter to measure the voltage across the resistor, probing as close to the resistor body as possible: 117.1mV.
Applying simple Ohm's law, V=IR, or R=V/I: R = 0.1171V / 0.499A = 0.235 ohms. In other words, this resistor is actually still good and lands pretty much dead on the expected 0.23 ohms. As a sanity check, you can try different currents and the voltage should scale linearly. I tried 0.25A next and measured 59.6mV (R = 0.238 ohms), so my measurements match up nicely. This means my resistor is within spec and has no marginal opens (which would increase the resistance) or unintended shorts (which would reduce it). The insulation is badly damaged though, so I'll likely replace it anyway.
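The two 4-wire readings can be reproduced in a couple of lines; the voltage/current pairs are the ones measured above, and if the resistor is linear (no marginal opens or shorts) the two computed resistances should agree closely.

```python
# Reproducing the 4-wire arithmetic: R = V / I at two test currents.
# Measured (volts, amps) pairs are from this post.

measurements = [(0.1171, 0.499), (0.0596, 0.25)]

for v, i in measurements:
    r = v / i
    print(f"{v * 1000:.1f} mV at {i} A -> {r:.3f} ohms")
```

Both come out within a few milliohms of the 0.23-ohm sticker value, which is exactly the linear behaviour you'd want to see.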