Department of Pharmaceutical Analysis, Ashokrao Mane Institute of Pharmacy, Ambap-416112, India
Chromatography, while primarily a separation technique, is widely used in chemical analysis. High-performance liquid chromatography (HPLC) is a highly versatile technology that separates analytes by passing them through a column packed with micrometer-sized particles. Today, reversed-phase chromatography is the most extensively used separation mode in HPLC because of its simplicity, versatility, and scope, which allow it to handle compounds with a wide range of polarity and molecular mass. This article discusses strategies and difficulties in planning HPLC method development and validation. Method development frequently follows well-established steps, such as buffer selection, mobile phase selection, and column selection. The method developed should be as simple as feasible, and the optimum strategy combines theoretical and empirical approaches.
High-Performance Liquid Chromatography (HPLC) has emerged as one of the most powerful and versatile analytical procedures in modern laboratories. HPLC was first introduced in the late 1960s and has since become the gold standard for compound separation, identification, and quantification in a variety of sectors. Its uses include pharmaceutical drug analysis, environmental testing, food safety review, clinical research, and forensic investigations.
HPLC's success stems from its capacity to handle a diverse spectrum of chemical substances, including those that are non-volatile, thermally unstable, or have large molecular weights. HPLC, as opposed to classical liquid chromatography, delivers faster, more efficient separation with improved resolution and sensitivity due to the use of high-pressure pumps that drive the mobile phase through closely packed stationary phase columns.
Importance Of Method Development
HPLC method development is critical for achieving dependable and reproducible analytical results. The purpose of method development is to identify the chromatographic conditions that give adequate separation, accurate quantification, and consistent reproducibility. This requires fine-tuning numerous parameters, such as the buffer, mobile phase, and column.
Developing an HPLC method is frequently an iterative process that requires trials and adjustment of conditions until the desired performance is obtained. When working with complicated sample matrices or multi-component systems, overlapping peaks or contaminants can make accurate analysis difficult.
Industry And Regulatory Compliance:
Method development and validation are crucial in highly regulated industries such as pharmaceuticals, not only to ensure product quality but also to meet regulatory requirements. Regulatory authorities such as the FDA, ICH, and WHO require rigorous validation before a method may be employed for quality control, release testing, or stability studies. Adherence to these standards reduces the likelihood of product recalls, ensures the safety of medicines and medical devices, and promotes customer confidence. With the growing complexity of modern drug formulations and biopharmaceutical products, HPLC method development and validation have become increasingly complex. Advances in HPLC technology, such as ultra-high-performance liquid chromatography (UHPLC) and the use of mass spectrometry as a detector, have further increased the technique's capabilities.
CONCLUSION:
The development and validation of HPLC methods is critical to ensure that analytical results are accurate, reliable, and reproducible. Properly developed and verified methods are essential for quality control and regulatory compliance in a variety of sectors. Method development issues, such as dealing with complex samples or attaining adequate separation, can be handled by thorough optimization of the chromatographic conditions. Successful validation ensures that the method performs consistently under typical testing conditions, meeting both industry and regulatory requirements. As technology advances, HPLC's position in research and quality assurance becomes more important than ever, providing new options for precision, speed, and robustness in analytical science.
HPLC System:
A typical HPLC system consists of the following main components:
Solvent reservoirs: store enough HPLC solvent to keep the system running continuously. They may be equipped with an online degassing system and inlet filters that protect the solvent from environmental contamination.
Pump: ensures a continuous flow of the mobile phase through the system. Most modern pumps can also deliver controlled mixtures of several solvents drawn from different reservoirs.
Injector: introduces the sample mixture into the mobile-phase stream just before the column. Most modern systems use autosamplers, which allow programmed injection of variable sample volumes drawn from vials held in the autosampler tray.
Column: the column is where the HPLC separation of the analytes in the mixture takes place. It is the space in which the mobile phase contacts the stationary phase over an enormous surface area, and much of the development in chromatography in recent years has been directed at enhancing this interfacial contact.
Detector: a device for continuous registration of specific physical (and sometimes chemical) properties of the column effluent. The most common detector in pharmaceutical analysis is the UV (ultraviolet) detector, which allows continuous monitoring and registration of the UV absorbance at a selected wavelength or over a span of wavelengths (diode-array detection).
The presence of an analyte in the detector flow cell changes the measured absorbance. If the analyte absorbs more strongly than the background (mobile phase), a positive signal is obtained.
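For reference, a standard relationship not stated explicitly in this article: the magnitude of the UV signal is commonly described by the Beer-Lambert law, A = ε · l · c, where A is the absorbance, ε the molar absorptivity of the analyte at the monitored wavelength, l the flow-cell path length, and c the analyte concentration.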
Data acquisition and control system: a computer-based system that regulates all parameters of the HPLC instrument, such as eluent composition (the mixing of different solvents), temperature, and the injection sequence, and that collects data from the detector while continuously monitoring system performance (mobile-phase composition, temperature, back pressure, and so on).
What should be known before beginning method development?
For basic compounds and other ionizable solutes, a 10-20 mM ammonium salt of your choice (e.g., acetate, formate, carbonate, or phosphate) is strongly recommended as a starting buffer.
The section on method development for ionizable compounds provides additional guidance on selecting and manipulating buffer systems. A buffer keeps the pH stable when a small amount of acid or base is added; some commonly used additives are discussed below [10]. Buffers are most effective when used within ±1 pH unit of their pKa, but may offer sufficient buffering up to ±2 pH units from the pKa [10]. The most commonly used buffers for HPLC with UV detection are phosphate and acetate; these are especially useful because they can be used at wavelengths below 220 nm.
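As a simple illustration of the ±1 pH unit rule described above, the short Python sketch below (with approximate, illustrative pKa values and a hypothetical helper function, not taken from this article) checks whether a target mobile-phase pH falls within the effective buffering range of a candidate buffer:

# Hypothetical helper: check whether a target pH lies within the effective
# buffering range of a buffer (pKa +/- 1 pH unit, per the rule stated above).
BUFFER_PKA = {                     # approximate pKa values, for illustration only
    "acetate": 4.76,
    "formate": 3.75,
    "phosphate (pKa2)": 7.20,
}

def buffer_ok(buffer_name: str, target_ph: float, window: float = 1.0) -> bool:
    """Return True if target_ph is within +/- window pH units of the buffer's pKa."""
    return abs(target_ph - BUFFER_PKA[buffer_name]) <= window

for name in BUFFER_PKA:
    print(f"{name}: suitable at pH 4.5 -> {buffer_ok(name, 4.5)}")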
Choosing the right mobile-phase type and composition is crucial for effective separation; however, the type of stationary phase used in the column limits the available choices. The primary distinction is between reversed-phase and normal-phase chromatography. In normal-phase systems, nonpolar solvents such as hexane or iso-octane are used, whereas reversed-phase systems require polar solvents such as water, acetonitrile, and methanol. The solvent's physicochemical characteristics determine which mobile phase is best; considerations include polarity, miscibility with other solvents, chemical inertness, UV cut-off wavelength, and the potential for column poisoning.
Practical Tips For Handling Mobile Phase:
C) Eliminate dissolved air, as it may result in erratic pumping behavior and unstable signals from the detector. This can be achieved by carrying out one or more of the following:
D) When mixing solvents to form mobile phases:
E) Be aware that volatile components in a mobile phase of mixed composition may evaporate.
This can be minimized by:
f) Ensure that the sample to be analyzed is soluble in the mobile phase.
g) It is important to take the mobile phase's UV absorbance into account when using UV detectors; this is what the UV cut-off value indicates. Tetrahydrofuran, for instance, has a UV cut-off of 280 nm and therefore cannot be used to analyze pyridine samples, whose peak maximum is at approximately 260 nm.
h) Make sure there is no reaction between the mobile phase and the stationary phase. Ammonia, for example, which may be present in buffers and pH modifiers, is capable of displacing one of the NR groups of an amide on an aminopropyl stationary phase.
i) Additionally, it is critical to keep an eye on the mobile-phase levels and make sure they are regularly topped up, so that the system is never allowed to run dry.
j) Whenever the mobile phase is changed, make sure the bottles are accurately labeled with the relevant information.
After an initial separation has been obtained, the experimental conditions should be adjusted to achieve the required separation and sensitivity. Consistent experimental conditions for the assay are established by systematic, planned evaluation of variables such as gradient, flow rate, pH (for ionizable analytes), mobile-phase components and their ratio, temperature, sample size, injection volume, and the type of solvent diluent. Once the method has been developed, it must be validated to confirm that it meets the required standards and specifications.
Definition Of Validation - According to ISO: "validation is the confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled."
According to the USFDA: "To establish documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes."
Importance Of Validation-
The process of validation typically involves the following steps:
1. Review: Examine the output, data, or results to ensure they meet the requirements and specifications.
2. Verification: Check that the output is accurate, complete, and consistent with the expected outcome.
3. Evaluation: Assess the output against predetermined criteria, standards, or benchmarks.
4. Feedback: Provide input on the output, highlighting any errors, discrepancies, or areas for improvement.
5. Correction/Revision: Address any issues or deficiencies identified during validation.
6. Re-validation: Repeat the validation process to ensure corrections/changes meet the requirements.
7. Approval: Confirm that the validated output meets the necessary standards and is ready for use.
8. Documentation: Record the validation process, results, and any subsequent changes.
4.1 General Principles Of Analytical Method Validation
Definition: Validation of an analytical method may be defined as "the process by which it is established, by laboratory studies, that the performance characteristics of the method meet the requirements for the intended analytical application".
Typical analytical characteristics can be listed as follows:
1. Accuracy
2. Precision
3. Specificity
4. Limit of Detection
5. Limit of Quantitation
6. Linearity
7. Range
8. Ruggedness
9. Robustness
Definition: Accuracy of an analytical method may be defined as "the closeness of test results obtained by that method to the true value. This accuracy should be established across its range".
Accuracy of an analytical method may be determined by applying the method to a highly pure substance, such as a reference standard, and comparing the result with that obtained for the same material by a known, established method. It can also be evaluated by adding a known amount of pure substance and assessing the recovery of the added substance. In the case of quantitative analysis of impurities, accuracy should be assessed on samples of drug substance or drug product spiked with known amounts of impurities. The I.C.H. recommends that accuracy be assessed using a minimum of nine determinations over a minimum of three concentration levels covering the specified range (i.e., three concentrations and three replicates of each concentration).
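As a minimal sketch (using made-up numbers, not data from this article) of how percent recovery might be computed for the 3 x 3 accuracy design recommended by the I.C.H.:

# Hypothetical recovery calculation for a 3-concentration x 3-replicate accuracy study.
# Keys are the spiked amounts; values are the amounts found by the method (illustrative).
data = {
    50.0:  [49.2, 50.4, 49.8],    # 50% level, three replicates
    100.0: [99.5, 100.8, 99.1],   # 100% level
    150.0: [148.6, 151.2, 149.9], # 150% level
}

for spiked, found in data.items():
    recoveries = [100.0 * f / spiked for f in found]
    mean_recovery = sum(recoveries) / len(recoveries)
    print(f"Level {spiked:g}: mean recovery = {mean_recovery:.1f}%")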
Definition: Precision of an analytical method may be defined as "the degree of agreement among individual test results when the method is applied repeatedly to multiple samplings of a homogeneous sample". The precision of an analytical method is usually expressed as the standard deviation or relative standard deviation (i.e., coefficient of variation) of a series of measurements. Precision refers to the degree of reproducibility of results, which can be assessed within a single laboratory or between laboratories. Precision can also be considered as repeatability of results, which refers to the comparison of results of analyses performed within a short time by the same analyst, in the same laboratory, using the same equipment. The precision of an analytical method is determined by assaying a sufficient number of aliquots of a homogeneous sample to allow statistically valid estimates of the standard deviation or relative standard deviation (i.e., coefficient of variation) to be calculated. The I.C.H. documents recommend that repeatability be assessed using a minimum of nine determinations covering the specified range for the procedure (i.e., three concentrations and three replicates of each concentration) or a minimum of six determinations at 100% of the test concentration.
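A short sketch of the standard deviation and relative standard deviation calculation mentioned above, using illustrative peak-area values for six replicate injections (assumed data, not from this article):

import statistics

# Six replicate injections at 100% of the test concentration (illustrative peak areas).
peak_areas = [152301, 151876, 152940, 152110, 151654, 152488]

mean_area = statistics.mean(peak_areas)
sd = statistics.stdev(peak_areas)        # sample standard deviation
rsd_percent = 100.0 * sd / mean_area     # relative standard deviation (coefficient of variation)

print(f"Mean = {mean_area:.0f}, SD = {sd:.0f}, %RSD = {rsd_percent:.2f}")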
Definition: Specificity may be defined as "the ability to assess unequivocally the analyte in the presence of components that may be expected to be present, such as impurities, degradation products, and matrix components". Specificity applies to identification tests, purity tests, assays, etc. The ICH documents state that, when chromatographic procedures are used, representative chromatograms should be presented to demonstrate the degree of specificity (selectivity), and peaks should be appropriately labeled. Peak purity tests may be useful to show that the analyte chromatographic peak is not attributable to more than one component.
Definition: The limit of detection may be defined as "the lowest amount of analyte in a sample that can be detected, but not necessarily quantitated, under the stated experimental conditions". The L.O.D. is generally expressed as the concentration of analyte in the sample (e.g., percentage, parts per million, etc.). The I.C.H. documents describe a common approach, which is to compare measured signals from samples with known low concentrations of analyte with those of blank samples. These detection limits should subsequently be validated by the analysis of a suitable number of samples known to be near, or prepared at, the detection limit.
Definition: The L.O.Q. may be defined as "a characteristic of quantitative assays for low levels of compounds in sample matrices, such as impurities in bulk substances and degradation products in finished pharmaceuticals. It is the lowest amount of analyte in a sample that can be determined with acceptable precision and accuracy under the stated experimental conditions". The L.O.Q. is generally expressed as the concentration of analyte in the sample (e.g., percentage, parts per million, etc.). The I.C.H. documents describe a common approach, which is to compare measured signals from samples with known low concentrations of analyte with those of blank samples. These quantitation limits should subsequently be validated by the analysis of a suitable number of samples known to be near, or prepared at, the quantitation limit.
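A commonly cited alternative ICH approach (not the blank-comparison approach described above) estimates the detection and quantitation limits from the standard deviation of the response and the slope of the calibration curve, as LOD = 3.3σ/S and LOQ = 10σ/S. A minimal sketch with illustrative numbers:

# LOD = 3.3 * sigma / S and LOQ = 10 * sigma / S, where sigma is the standard deviation
# of the response (e.g., of blank measurements or regression residuals) and S is the
# slope of the calibration curve. The values below are illustrative only.
sigma = 0.8    # response standard deviation (arbitrary units)
slope = 52.4   # calibration slope (response per unit concentration)

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD ~ {lod:.3f}, LOQ ~ {loq:.3f} (concentration units)")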
Definition: The linearity of an analytical method may be defined as "its ability to elicit test results that are directly, or by a well-defined mathematical transformation, proportional to the concentration of analyte in samples within a given range". Linearity should be established across the range of the analytical procedure. It should be evaluated initially by visual examination of a plot of signal as a function of analyte concentration. If the relationship appears linear, test results should be evaluated by appropriate statistical methods, such as regression analysis.
The I.C.H. recommends that, for the establishment of linearity, a minimum of five concentrations normally be used.
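As an illustration of the regression analysis mentioned above, a minimal least-squares fit over the five-concentration minimum recommended by the I.C.H. (made-up calibration data, not from this article) could look like this:

import statistics

# Five-point calibration: concentration vs. detector response (illustrative values).
conc     = [20.0, 40.0, 60.0, 80.0, 100.0]
response = [1053.0, 2105.0, 3140.0, 4210.0, 5248.0]

mean_x, mean_y = statistics.mean(conc), statistics.mean(response)
sxx = sum((x - mean_x) ** 2 for x in conc)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, response))
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# Coefficient of determination (r^2) as a simple indicator of linearity.
ss_tot = sum((y - mean_y) ** 2 for y in response)
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, response))
r_squared = 1.0 - ss_res / ss_tot

print(f"slope = {slope:.3f}, intercept = {intercept:.1f}, r^2 = {r_squared:.4f}")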
Definition: The range of an analytical method may be defined as "the interval between the upper and lower levels of analyte (including these levels) that have been demonstrated to be determined with a suitable level of precision, accuracy, and linearity using the method as written. The range is normally expressed in the same units as the test results (e.g., percentage, parts per million, etc.) obtained by the analytical method". The range of the method is validated by verifying that the analytical method provides acceptable precision, accuracy, and linearity when applied to samples containing analyte at the extremes of the range as well as within the range.
Definition: The ruggedness of an analytical method may be defined as "the degree of reproducibility of test results obtained by the analysis of the same sample under a variety of conditions, such as different laboratories, different analysts, different instruments, different lots of reagents, different elapsed assay times, different assay temperatures, different days, etc." Ruggedness is normally expressed as the lack of influence of operational and environmental variables on the test results of the analytical method. It is a measure of the reproducibility of test results under the variation in conditions normally expected from laboratory to laboratory and analyst to analyst. The ruggedness of an analytical method is determined by analysis of aliquots from homogeneous lots in different laboratories, by different analysts, using operational and environmental conditions that may differ but are still within the specified parameters of the assay. The degree of reproducibility of the test results is then determined as a function of the assay variables, and this reproducibility may be compared to the precision of the assay under normal conditions to obtain a measure of the ruggedness of the analytical method.
Definition: The robustness of an analytical method may be defined as "a measure of its capacity to remain unaffected by small but deliberate variations in method parameters", and it provides an indication of the method's reliability during normal usage.
REFERENCES
HOW TO CITE: Akash Shelake*, Mrs. Jaya D. Kamble, Dr. Nilesh Chougule, HPLC Development Method and Validation, Int. J. of Pharm. Sci., 2024, Vol 2, Issue 11, 910-919. https://doi.org/10.5281/zenodo.14203300