If the assay involves the addition of exogenous reactants (reagents), their quantities are kept fixed (or in excess) so that the quantity (and quality) of the target is the only limiting factor for the reaction, and the difference in the assay outcome is used to deduce the unknown quality or quantity of the target in question. Some assays (e.g., biochemical assays) may be similar to, or overlap with, chemical analysis and titration. But generally, assays involve biological material or phenomena, which tend to be intrinsically more complex in composition or behavior, or both. Thus the read-out of an assay may be quite noisy and may involve greater difficulties in interpretation than an accurate chemical titration. On the other hand, older-generation qualitative assays, especially bioassays, may be much cruder and less quantitative (e.g., counting death or dysfunction of organisms or cells in a population, or a descriptive change in some body part of a group of animals).
Assays have become a routine part of modern medical, environmental, pharmaceutical, forensic and many other businesses, at scales ranging from industrial to curbside or field level. Assays in very high commercial demand have been well investigated in the research and development sectors of professional industries, have undergone generations of development and sophistication, and have become protected intellectual property via highly competitive process patenting. Such industrial-scale assays are often performed in well-equipped laboratories with automated organization of the procedure, from ordering an assay to pre-analytic sample processing (sample collection, necessary manipulations such as centrifugation for separation, aliquoting if necessary, storage, retrieval, pipetting/aspiration, etc.). Analytes are generally tested in high-throughput autoanalyzers, and the results are verified and automatically returned to the ordering service providers and end users. This is made possible by advanced laboratory informatics systems that interface with the computer terminals of end users, central servers, the physical autoanalyzer instruments, and other automata.
According to Etymology Online, the verb assay has meant, at least since the 13th century, "to try, endeavor, strive; test the quality of," from Anglo-French assaier, from assai (n.), from Old French essai, "trial". The noun assay, from the mid-14th century, thus means "trial, test of quality, test of character," from Anglo-French assai; the meaning "analysis" dates from the late 14th century. For the assay of currency coins this literally meant analysis of the purity of the gold, silver, or whatever precious component was used to represent the true value of the coin. This might have translated later (possibly after the 14th century) into a generalized meaning of analysis, e.g. of an important or principal component of a target inside a mixture, such as the active ingredient of a drug among the inert excipients in a pharmacological formulation, which originally used to be measured by its actual action on an organism (e.g. lethal dose or inhibitory dose).
General steps of any assay
An assay (analysis) is never an isolated process; it must be accompanied by pre- and post-analytic procedures. The information communication (e.g. the request to perform an assay and further information processing) and specimen handling (e.g. collection, transport and processing) done before the beginning of an assay are the pre-analytic steps. Similarly, after the assay, the result may be documented, verified and transmitted/communicated in steps called post-analytic steps. As in any multistep information handling and transmission system, variation and errors in the communicated final result of an assay involve every such step: not only analytic variations and errors intrinsic to the assay itself, but also variations and errors in the pre-analytic and post-analytic steps. Since the assay itself (the analytic step) gets the most attention, the steps that get less attention from the chain of users, i.e. the pre-analytic and post-analytic steps, are often less stringently regulated and generally more prone to error; e.g. pre-analytic steps in medical laboratory assays may contribute 32–75% of all lab errors.
Assays can be very diverse, but generally involve the following general steps:
Sample processing/manipulation, in order to selectively present the target in a discernible/measurable form to a discrimination/identification/detection system. It might involve simple centrifugal separation, washing, filtration, or capture by some form of selective binding, or it may even involve modifying the target, e.g. epitope retrieval in immunological assays, or cutting the target into pieces, e.g. in mass spectrometry. Generally, multiple separate steps are done before an assay; these are called pre-analytic processing. But some of the manipulations may be an inseparable part of the assay itself and are thus not considered pre-analytic.
Target-specific DISCRIMINATION/IDENTIFICATION principle: to discriminate the target from the background (noise) of similar components and specifically identify a particular component ("analyte") in a biological material by its specific attributes (e.g. in a PCR assay, a specific oligonucleotide primer identifies the target by base pairing, based on the nucleotide sequence unique to the target).
Signal (or target) AMPLIFICATION system: the presence and quantity of the analyte is converted into a detectable signal, generally involving some method of signal amplification so that it can be easily discriminated from noise and measured; e.g. in a PCR assay, among a mixture of DNA sequences only the specific target is amplified into millions of copies by a DNA polymerase enzyme, so that it stands out as a prominent component compared to any other. Sometimes the concentration of the analyte is too large, in which case the assay may involve sample dilution or some sort of signal diminution system, i.e. a negative amplification.
Signal DETECTION (and interpretation) system: a system for deciphering the amplified signal into an interpretable output that can be quantitative or qualitative. It can be a very crude visual or manual method, or a very sophisticated electronic digital or analog detector.
Signal enhancement and noise filtering may be done at any or all of the steps above. Since the more downstream a step is, the higher the chance of carrying over noise from the previous step and amplifying it, multiple steps in a sophisticated assay might involve various means of signal-specific sharpening/enhancement and noise reduction or filtering. These may simply be in the form of a narrow band-pass optical filter, a blocking reagent in a binding reaction that prevents nonspecific binding, or a quenching reagent in a fluorescence detection system that prevents "autofluorescence" of background objects.
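The amplification step above can be illustrated with a small sketch of ideal PCR arithmetic. This is not from any particular protocol; the function name and the efficiency parameter are assumptions made for illustration.

```python
# Minimal sketch of ideal PCR amplification: each cycle duplicates a
# fraction (`efficiency`) of the template copies, so the target copy
# number grows exponentially while non-target sequences do not.
def pcr_copies(initial_copies: int, cycles: int, efficiency: float = 1.0) -> float:
    """Expected copy number after `cycles` rounds of PCR.

    efficiency = 1.0 means perfect doubling every cycle; real
    reactions fall somewhat short of this ideal.
    """
    return initial_copies * (1.0 + efficiency) ** cycles

# Ten starting copies through 30 ideal cycles:
print(pcr_copies(10, 30))  # 10 * 2**30, i.e. about 1.07e10 copies
```

This exponential growth is what lets a handful of target molecules be discerned against a background of unamplified sequences.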
Assay types based on the nature of the assay process
Depending on whether an assay just looks at a single time point or timed readings taken at multiple time points, an assay may be:
End point assay: when the only reading that matters is the end result after a fixed assay incubation period.
Kinetic assay: when readings are taken multiple times at fixed time intervals during an assay and a kinetic graph of the readings is important.
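The distinction can be made concrete with a short sketch (the readings below are invented for illustration): an end point assay keeps only the final value, while a kinetic assay derives a rate from the whole time series, here via a least-squares slope.

```python
# End point read-out: only the final reading matters.
def end_point(readings):
    return readings[-1]

# Kinetic read-out: readings taken at fixed intervals are reduced to a
# rate, here the least-squares slope of signal versus time.
def kinetic_rate(readings, interval):
    n = len(readings)
    times = [i * interval for i in range(n)]
    t_mean = sum(times) / n
    r_mean = sum(readings) / n
    num = sum((t - t_mean) * (r - r_mean) for t, r in zip(times, readings))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

# Hypothetical absorbance values read every 30 seconds:
readings = [0.10, 0.18, 0.26, 0.34, 0.42]
print(end_point(readings))         # 0.42 (the only value an end point assay uses)
print(kinetic_rate(readings, 30))  # change in absorbance per second
```

For a linear reaction phase the two modes agree on the underlying chemistry, but the kinetic read-out is less sensitive to a single aberrant final reading.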
Depending on how many targets or analytes are being measured:
Usual assays are simple or single-target assays, which is the default unless an assay is called multiplex.
Multiplex assays detect multiple analytes simultaneously in the same reaction.
Depending on the quality of the result produced, assays may be classified into:
Qualitative assays, i.e. assays which give just a pass or fail, positive or negative, or a similarly small number of qualitative gradations rather than an exact quantity.
Semi-quantitative assays, i.e. assays that give the read-out in an approximate fashion rather than an exact number for the quantity of the substance. Generally they have a few more gradations than just two outcomes, positive or negative, e.g. scoring on a scale of 1+ to 4+ as used for blood grouping tests based on RBC agglutination in response to grouping reagents (antibody against blood group antigens).
Quantitative assays, i.e. assays that give an accurate and exact numeric measure of the amount of a substance in a sample. An example used in coagulation testing laboratories for the commonest inherited bleeding disease, von Willebrand disease, is the VWF antigen assay, where the amount of VWF present in a blood sample is measured by an immunoassay.
Functional assays, i.e. assays that try to quantify the functioning of an active substance rather than just its quantity. The functional counterpart of the VWF antigen assay is the ristocetin cofactor assay, which measures the functional activity of the VWF present in a patient's plasma by adding exogenous formalin-fixed platelets and gradually increasing quantities of a drug named ristocetin while measuring agglutination of the fixed platelets. A similar assay used for a different purpose is called ristocetin-induced platelet aggregation (RIPA), which tests the response of a patient's endogenous live platelets to ristocetin (exogenous) and VWF (usually endogenous).
Depending on the general substrate on which the assay principle is applied:
Bioassay: when the response is the biological activity of live objects, e.g.
an organism (e.g. a mouse injected with a drug)
an ex vivo body part (e.g. the leg of a frog)
an ex vivo organ (e.g. the heart of a dog)
an ex vivo part of an organ (e.g. a segment of an intestine)
a tissue (e.g. limulus lysate)
a cell (e.g. platelets)
Ligand binding assay: when a ligand (usually a small molecule) binds a receptor (usually a large protein).
Immunoassay: when the response is an antigen-antibody binding reaction.
Depending on the nature of the signal amplification system assays may be of numerous types, to name a few:
Enzyme activity assay: enzymes may be tested by their highly repetitive activity on a large number of substrate molecules, when the loss of a substrate or the formation of a product has a measurable attribute such as color, absorbance of light at a particular wavelength, chemiluminescence, or electrical/redox activity.
Photometry / spectrophotometry: the absorbance of a specific wavelength of light passing through a fixed path length in a cuvette of liquid test sample is measured, and the absorbance is compared with a blank and with standards containing graded amounts of the target compound. If the measured light is of a specific visible wavelength, the method may be called colorimetry; alternatively, it may involve excitation at one specific wavelength, e.g. by a laser, and emission of a fluorescent signal at another specific wavelength, detected via narrow-band optical filters.
Transmittance of light may be used to measure, for example, the clearing of the opacity of a liquid created by suspended particles, as the particles aggregate into fewer, larger clumps during a platelet agglutination reaction.
Turbidimetry: when the attenuation of straight-transmitted light passing through a liquid sample is measured by a detector placed directly across from the light source.
Nephelometry: when scattered light is measured by detectors placed at fixed angles to the path of the light.
Reflectometry: when the color of light reflected from a (usually dry) sample or reactant is assessed, e.g. the automated reading of urine dipstick assays.
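For the photometric methods above, an absorbance reading is typically converted to a concentration via a standard curve. The sketch below assumes a linear (Beer–Lambert) response region; the standard concentrations and absorbance values are made up for illustration.

```python
# Fit a linear standard curve A = slope * C + intercept through the
# blank-corrected absorbances of standards with known concentrations.
def fit_standard_curve(concs, absorbances):
    n = len(concs)
    c_mean = sum(concs) / n
    a_mean = sum(absorbances) / n
    slope = (
        sum((c - c_mean) * (a - a_mean) for c, a in zip(concs, absorbances))
        / sum((c - c_mean) ** 2 for c in concs)
    )
    intercept = a_mean - slope * c_mean
    return slope, intercept

# Invert the curve to read a sample concentration off its absorbance.
def concentration(absorbance, slope, intercept):
    return (absorbance - intercept) / slope

# Hypothetical standards: 0, 25, 50 and 100 mg/dL of the target compound.
slope, intercept = fit_standard_curve([0, 25, 50, 100], [0.0, 0.12, 0.24, 0.48])
print(concentration(0.30, slope, intercept))  # a sample with A = 0.30 reads 62.5 mg/dL
```

Readings outside the linear range of the standards (the assay's dynamic range) should be diluted and re-run rather than extrapolated.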
Cell counting, viability, proliferation or cytotoxicity assays
A cell-counting assay may determine the number of living cells, the number of dead cells, or the ratio of one cell type to another, such as enumerating and typing red versus different types of white blood cells. This may be measured by different physical methods (light transmission, changes in electric current), while other methods use biochemical probes of cell structure or physiology (stains). Another application is monitoring cell culture (assays of cell proliferation or cytotoxicity). A cytotoxicity assay measures how toxic a chemical compound is to cells.
The viral plaque assay is used to calculate the number of viruses present in a sample. In this technique the number of viral plaques formed by a viral inoculum is counted, from which the actual virus concentration can be determined.
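The titer calculation behind a plaque assay is simple arithmetic; the sketch below uses invented counts and dilutions for illustration.

```python
# Titer in plaque-forming units (PFU) per mL: the plaque count divided
# by the fraction of the original sample actually plated.
def pfu_per_ml(plaque_count, dilution_factor, volume_ml):
    """dilution_factor is the fraction of original sample plated,
    e.g. 1e-6 for a 10^-6 serial dilution; volume_ml is the
    inoculum volume spread on the plate."""
    return plaque_count / (dilution_factor * volume_ml)

# 42 plaques counted after plating 0.1 mL of a 10^-6 dilution:
print(pfu_per_ml(42, 1e-6, 0.1))  # about 4.2e8 PFU/mL
```

In practice, plates with roughly 30–300 plaques are preferred for counting, since sparser or denser plates give unreliable counts; this is a general plating guideline, not a claim from the text above.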
A wide range of cellular secretions (say, a specific antibody or cytokine) can be detected using the ELISA technique. The number of cells which secrete those particular substances can be determined using a related technique, the ELISPOT assay.
When multiple assays measure the same target, their results and utility may or may not be comparable, depending on the nature of the assays and their methodology, reliability, etc. Such comparisons are possible through the study of general quality attributes of the assays, e.g. principles of measurement (including identification, amplification and detection), dynamic range of detection (usually the range of linearity of the standard curve), analytic sensitivity, functional sensitivity, analytic specificity, positive and negative predictive values, turnaround time (the time taken to finish a whole cycle from the pre-analytic steps to the end of the last post-analytic step, i.e. report dispatch/transmission), and throughput (the number of assays done per unit time, usually expressed per hour). Organizations or laboratories that perform assays for professional purposes, e.g. medical diagnosis and prognosis, environmental analysis, forensic proceedings, or pharmaceutical research and development, must undergo well-regulated quality assurance procedures, including method validation, regular calibration, analytical quality control, proficiency testing, test accreditation and test licensing, and must document appropriate certifications from the relevant regulating bodies in order to establish the reliability of their assays, to remain legally acceptable and accountable for the quality of the assay results, and to convince customers to use their assays commercially/professionally.