I am analyzing metal specimens that are generally single phase. Occasionally I will get a request to analyze a specimen with two phases and to report the average composition and statistics for each phase, but in all cases I expect totals in the 98-102 range.
I agree that normalizing would not be a good idea. The idea of normalizing the data came up while reviewing an old data set in which a significant fraction of the points, maybe 25%, had low analytical totals. The basic idea was to "salvage the run" and get some use out of the data, because losing a quarter of the data points reduced the power of the statistical tests. My view was that a data set with that many points outside the 98-102 range should be treated as suspect and scrapped, but I didn't have a good reason for that other than my own "gut feeling".
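For what it's worth, the screen I'm applying is just a range check on the totals. A minimal sketch (the totals below are made-up numbers, and 98-102 is the acceptance window I described) might look like:

```python
# Hypothetical example: flag analyses whose totals fall outside the
# accepted 98-102 wt% window and report the fraction rejected.
totals = [99.4, 100.2, 97.1, 101.8, 94.6, 99.9, 102.5, 100.0]  # made-up wt% totals

LOW, HIGH = 98.0, 102.0

# Points outside the acceptance window
out_of_range = [t for t in totals if not (LOW <= t <= HIGH)]
fraction_bad = len(out_of_range) / len(totals)

print(f"{len(out_of_range)} of {len(totals)} points outside "
      f"{LOW}-{HIGH}: {fraction_bad:.0%}")  # prints "3 of 8 points outside 98.0-102.0: 38%"
```

The open question, of course, is what value of `fraction_bad` should cause the whole run to be treated as suspect rather than just dropping the flagged points.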
Any thoughts on how large a percentage of unreliable data points a data set can contain before the whole set should be scrapped?
Thanks very much for the help,
Gian