Analytical Method Validation

Method validation is the process used to confirm that the analytical procedure employed for a specific test is suitable for its intended use. Results from method validation can be used to judge the quality, reliability and consistency of analytical results; validation is an integral part of any good analytical practice.
- Validation should be performed in accordance with the validation protocol. The protocol should include procedures and acceptance criteria for all characteristics. The results should be documented in the validation report.
- Justification should be provided when non-pharmacopoeial methods are used. If pharmacopoeial methods are available, the justification should include data such as comparisons with the pharmacopoeial or other methods.
- Standard test methods should be described in detail and should provide sufficient information to allow properly trained analysts to perform the analysis in a reliable manner.
- As a minimum, the description should include the chromatographic conditions (in the case of chromatographic tests), reagents needed, reference standards, the formulae for the calculation of results and system suitability tests.
Characteristics That Should Be Considered during Validation of Analytical Methods
- Accuracy
- Precision (repeatability, intermediate precision and reproducibility)
- Robustness (or ruggedness)
- Linearity
- Range
- Specificity (selectivity)
- Detection limit
- Quantitation limit
1. Accuracy
- It is the degree of agreement of test results with the true value, or the closeness of the results obtained by the procedure to the true value.
- It is normally established on samples of the material to be examined that have been prepared to quantitative accuracy.
- Accuracy should be established across the specified range of the analytical procedure.
2. Precision
- It is the degree of agreement among individual results. The complete procedure should be applied repeatedly to separate, identical samples drawn from the same homogeneous batch of material.
- It should be measured by the scatter of individual results from the mean (good grouping) and expressed as the relative standard deviation (RSD).
2.1 Repeatability
- It should be assessed using a minimum of nine determinations covering the specified range for the procedure, e.g. three concentrations/three replicates each, or a minimum of six determinations at 100% of the test concentration.
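The RSD calculation described above can be sketched in a few lines of Python; the six replicate results below are invented purely for illustration (e.g. six determinations at 100% of the test concentration):

```python
import statistics

def relative_std_dev(results):
    """Relative standard deviation (RSD, %): sample standard
    deviation divided by the mean, multiplied by 100."""
    mean = statistics.mean(results)
    sd = statistics.stdev(results)  # sample (n-1) standard deviation
    return 100 * sd / mean

# Six hypothetical assay results at 100% of the test concentration
replicates = [99.8, 100.2, 99.5, 100.4, 99.9, 100.1]
print(f"RSD = {relative_std_dev(replicates):.2f}%")
```

A tight grouping of replicates gives a small RSD; acceptance criteria for the RSD would be set in the validation protocol.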
2.2 Intermediate Precision
- It expresses within-laboratory variations (usually on different days, different analysts and different equipment). If reproducibility is assessed, a measure of intermediate precision is not required.
2.3 Reproducibility
- It expresses precision between laboratories.
3. Robustness (or ruggedness)
- It is the ability of the procedure to provide analytical results of acceptable accuracy and precision under a variety of conditions.
- It indicates the extent to which results from separate samples are influenced by changes in the operational or environmental conditions.
- Robustness should be considered during the development phase, and should show the reliability of an analysis when deliberate variations are made in method parameters.
3.1 Factors that can have an effect on robustness when performing chromatographic analysis include:
- Stability of test and standard samples and solutions;
- Reagents (e.g. different suppliers);
- Different columns (e.g. different lots and/or suppliers);
- Extraction time;
- Variations of pH of a mobile phase;
- Variations in mobile phase composition;
- Temperature; and
- Flow rate.
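One way to organize the deliberate variations listed above is a small factorial grid of conditions around the nominal method settings. The factor names and levels below are purely illustrative, not values from any particular method:

```python
from itertools import product

# Hypothetical levels bracketing the nominal chromatographic conditions
factors = {
    "mobile_phase_pH": [2.9, 3.0, 3.1],
    "flow_rate_mL_min": [0.9, 1.0, 1.1],
    "column_temp_C": [28, 30, 32],
}

# Full factorial design: every combination of the deliberate variations
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 3 * 3 * 3 = 27 robustness runs
```

In practice a reduced (e.g. fractional factorial or one-factor-at-a-time) design is often used instead of the full grid to keep the number of runs manageable.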
4. Linearity
- It indicates the ability to produce results that are directly proportional to the concentration of the analyte in samples.
- A series of samples should be prepared, in which the analyte concentrations span the claimed range of the procedure.
- If there is a linear relationship, test results should be evaluated by appropriate statistical methods.
- A minimum of five concentrations should be used.
5. Range
- It is an expression of the lowest and highest levels of analyte that have been demonstrated to be determinable for the product. The specified range is normally derived from linearity studies.
6. Specificity (Selectivity)
- It is the ability to measure unequivocally the desired analyte in the presence of components such as excipients and impurities that may also be expected to be present. An investigation of specificity should be conducted during the validation of identification tests, the determination of impurities and the assay.
7. Detection Limit (Limit of Detection)
- It is the smallest quantity of an analyte that can be detected, and not necessarily determined, in a quantitative fashion.
- Approaches may include instrumental or non-instrumental procedures and could include those based on:
- Visual evaluation;
- Signal-to-noise ratio;
- Standard deviation of the response and the slope;
- Standard deviation of the blank; and
- Calibration curve.
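The signal-to-noise approach can be sketched as follows. The 2H/h convention (twice the peak height divided by the peak-to-peak baseline noise) follows common pharmacopoeial chromatographic practice, and the numbers below are invented; a signal-to-noise ratio of about 3:1 is generally considered acceptable at the detection limit:

```python
def signal_to_noise(peak_height, noise_peak_to_peak):
    """S/N as commonly estimated in chromatography:
    2H / h, where H is the analyte peak height measured from the
    baseline and h is the peak-to-peak baseline noise."""
    return 2 * peak_height / noise_peak_to_peak

# Hypothetical peak height and baseline noise (same response units)
sn = signal_to_noise(peak_height=0.0015, noise_peak_to_peak=0.0010)
print(f"S/N = {sn:.1f}")  # 3.0
```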
8. Quantitation Limit (Limit Of Quantitation)
- It is the lowest concentration of an analyte in a sample that may be determined with acceptable accuracy and precision.
- Approaches may include instrumental or non-instrumental procedures and could include those based on:
- Visual evaluation;
- Signal-to-noise ratio;
- Standard deviation of the response and the slope;
- Standard deviation of the blank; and
- Calibration curve.
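The standard-deviation/slope approaches listed for the detection and quantitation limits are often summarized by the formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the response (e.g. of the blank) and S is the slope of the calibration curve (the 3.3 and 10 factors follow the ICH Q2 convention). A minimal sketch with invented numbers:

```python
def detection_limit(sigma, slope):
    """LOD = 3.3 * sigma / S (ICH Q2 convention), where sigma is the
    standard deviation of the response and S the calibration slope."""
    return 3.3 * sigma / slope

def quantitation_limit(sigma, slope):
    """LOQ = 10 * sigma / S (ICH Q2 convention)."""
    return 10 * sigma / slope

# Hypothetical blank-response SD and calibration-curve slope
sigma, slope = 0.0012, 0.005
print(f"LOD = {detection_limit(sigma, slope):.2f}")    # 0.79
print(f"LOQ = {quantitation_limit(sigma, slope):.2f}")  # 2.40
```

Whatever value these formulas give, the resulting quantitation limit should subsequently be verified experimentally for acceptable accuracy and precision.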