The titration method is only accurate to within one drop, which for a 25 ml sample size is 0.2 ppm. When you add the last drop that extinguishes the pink color during the FC portion of the FAS-DPD test, you can over-shoot by up to one drop's equivalent, especially if the sample was only faintly pink before that last drop. So usually the FC reading in a FAS-DPD test will read from 0 to somewhat less than 0.2 ppm higher, and at the same time the CC reading will be 0 to somewhat less than 0.2 ppm lower, than the "true" values, such that the sum of these two errors equals 0.2 ppm (assuming there is both FC and CC to read -- if the CC is really zero, then an over-shoot won't produce any error for CC, since it will still read 0). Though one could try using a 50 ml sample size, the resulting 0.1 ppm resolution isn't really accurate because the pink transition is so faint. How are you getting a 0.1 ppm reading when you titrate?
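To make the drop arithmetic concrete, here's a toy model of the one-drop overshoot. The function name, the rounding behavior, and the carry-over assumption are mine, not any kit's specification: I assume the FC endpoint is always rounded up to the next whole drop and that the excess titrant from that last drop eats into the subsequent CC step.

```python
import math

# ml sample size -> ppm per drop (the standard FAS-DPD drop equivalence)
PPM_PER_DROP = {25: 0.2, 50: 0.1}

def simulate_fas_dpd(true_fc, true_cc, sample_ml=25):
    """Toy overshoot model (illustrative assumption, not a kit spec):
    the FC titration rounds up to a whole drop, and the excess titrant
    carries over, depressing the CC reading."""
    step = PPM_PER_DROP[sample_ml]
    # Last drop extinguishes the pink, so FC rounds UP to a whole drop.
    fc_drops = math.ceil(round(true_fc / step, 6)) if true_fc > 0 else 0
    fc_reading = round(fc_drops * step, 2)
    overshoot = round(fc_reading - true_fc, 2)  # 0 to <1 drop of excess titrant
    # The excess consumes part of the combined chlorine before the CC step;
    # if true CC is zero, the overshoot causes no CC error (still reads 0).
    cc_left = max(0.0, round(true_cc - overshoot, 2))
    cc_drops = math.ceil(round(cc_left / step, 6)) if cc_left > 0 else 0
    cc_reading = round(cc_drops * step, 2)
    return fc_reading, cc_reading

print(simulate_fas_dpd(2.05, 0.5))  # FC reads high (2.2), CC reads low-ish
print(simulate_fas_dpd(2.05, 0.0))  # CC still reads 0 despite the overshoot
```

With a "true" FC of 2.05 and CC of 0.5, the model reads FC as 2.2 (0.15 high) while the CC step starts 0.15 short, illustrating how the overshoot shifts error from CC onto FC within the one-drop resolution.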
As for non-chlorine shock (MPS), it reads as TC in the OTO test, as FC in the FAS-DPD test, and as CC in the DPD test.
waterbear may have more info regarding a comparison of titration vs. photometric in his experience.
Why do you care about this 0.2 ppm difference? A CC of 0.2 ppm or less is perfectly acceptable and within existing code standards. Now if you are at a higher CC, then this difference may be more of a problem for you and the titration kit would probably let you pass more readily, assuming the inspector also uses a titration kit.