When I tell the software that my samples are coated with 0.6 nm of iridium instead of the default 20 nm of carbon, is it actually able to handle this? I have noticed that the numbers are exactly the same (down to the third decimal place) whether I process the data with the C coating or the Ir coating. This seems a little suspect to me... but if the standards and the unknown have the same coating, then maybe it is real? Thoughts or ideas?
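
As a sanity check, I tried a quick back-of-envelope calculation of how much X-ray intensity either coating would actually absorb, using simple Beer-Lambert attenuation. This is just my own sketch, not what the software does internally, and the mass absorption coefficients below are round placeholder numbers (the densities and the 40 degree take-off angle are my assumptions too):

```python
import math

# Transmission of an emitted X-ray line through a thin conductive coating,
# using Beer-Lambert attenuation:
#   I/I0 = exp(-(mu/rho) * rho * t / sin(psi))
# where (mu/rho) is the mass absorption coefficient of the coating material
# for the line of interest, rho its density, t its thickness, and psi the
# take-off angle. The MACs below are PLACEHOLDERS -- substitute tabulated
# values (e.g. FFAST or Heinrich) for your actual analyzed lines.

PSI_DEG = 40.0  # assumed take-off angle (typical EPMA geometry)
SIN_PSI = math.sin(math.radians(PSI_DEG))

def coating_transmission(mac_cm2_per_g, density_g_per_cm3, thickness_nm):
    """Fraction of emitted X-ray intensity that survives the coating."""
    thickness_cm = thickness_nm * 1e-7  # 1 nm = 1e-7 cm
    return math.exp(-mac_cm2_per_g * density_g_per_cm3 * thickness_cm / SIN_PSI)

# 20 nm of evaporated carbon (density ~2.0 g/cm^3) vs 0.6 nm of iridium
# (density ~22.6 g/cm^3); MAC values here are illustrative only.
t_c  = coating_transmission(mac_cm2_per_g=1000.0, density_g_per_cm3=2.0,  thickness_nm=20.0)
t_ir = coating_transmission(mac_cm2_per_g=3000.0, density_g_per_cm3=22.6, thickness_nm=0.6)

print(f"20 nm C  : transmission = {t_c:.5f} ({(1 - t_c) * 100:.3f}% absorbed)")
print(f"0.6 nm Ir: transmission = {t_ir:.5f} ({(1 - t_ir) * 100:.3f}% absorbed)")

# If standards and unknowns carry the same coating, this factor appears in
# both the standard and unknown intensities and cancels to first order in
# the k-ratio, which could explain identical results to three decimals.
print(f"ratio of coating corrections: {t_c / t_ir:.5f}")
```

With these placeholder numbers both coatings absorb well under 1% of the signal, and the ratio between the two corrections is within about 0.01% of unity, so agreement to the third decimal place may not be suspicious at all if standard and unknown are coated identically.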