Heart Disease

The heart is one of the most efficient and durable organs in the human body. Each day, it beats 100,000 times to pump more than 2,000 gallons of blood through the body, never resting for longer than a second, for an average lifetime of 75 years! Unfortunately, disease, genetics, or a host of environmental factors can conspire to make the heart less efficient or even cause it to break down completely.
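The article's round numbers can be sanity-checked with simple arithmetic. This is a quick sketch in Python, taking the stated figures of 100,000 beats per day and a 75-year lifespan as given:

```python
# Figures quoted in the article (round numbers, not precise physiology).
beats_per_day = 100_000
lifespan_years = 75

# 100,000 beats per day works out to roughly 69 beats per minute.
beats_per_minute = beats_per_day / (24 * 60)

# Over 75 years, the total approaches 3 billion beats.
lifetime_beats = beats_per_day * 365 * lifespan_years

print(round(beats_per_minute))  # 69
print(lifetime_beats)           # 2737500000
```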

Diseases of the heart and blood vessels, collectively known as cardiovascular diseases, are the most common disorders in Western societies. Fully 20 percent of Americans are afflicted with one or more forms. And these diseases—mainly heart disease—kill one American every 33 seconds; they account for about 40 percent of all deaths in the United States.

Despite these statistics, advances in diagnosis and treatment have caused a decline in death rates from heart attacks, strokes, and other cardiovascular diseases. Between 1990 and 2000, the death rate from cardiovascular diseases fell by 17 percent. But there is still a long way to go before heart disease ceases to be a serious threat to health.

Diagnostic Methods

The physician can use any number of tests to study the function of the heart's arteries, muscle, valves, and electrical system, as well as the structure of the cardiac chambers and layers. One of the most common tests is the electrocardiogram (ECG), in which electrodes on the skin record the electrical activity of the whole heart. Another procedure, the echocardiogram, is a radarlike technique in which sound waves emitted by a handheld crystal bounce off the heart and back to the crystal, producing an image of the beating heart muscle, the four chambers, the valves, and the pericardial covering as the heart beats. A third evaluation technique is the treadmill, or exercise stress test, in which the patient performs various types of exercise to place "stress" on a heart that may have blockages in its coronary arteries. The resultant decrease in blood supply to the working heart muscle can then be detected by an ECG, an echocardiogram, or imaging techniques such as thallium scans or PET scans.

Another important diagnostic test is the angiogram, or cardiac catheterization. In this procedure, plastic tubes called catheters are passed through the skin into an artery or vein of the leg or arm and advanced into the chambers or coronary arteries of the heart. Once in place, the catheter permits pressures at different locations to be measured, and contrast dye is then injected into the heart. The dye is visualized using X rays, causing the heart chambers and arteries to appear white on a black background. This allows doctors to see clearly any blockages in the arteries, to identify damaged areas of the heart, and to assess myocardial pumping function.

In the late 1990s, doctors began to use magnetic resonance imaging (MRI) techniques to reveal the structure of a diseased heart and determine the location and extent of any damage. Scientists also discovered that damage to the heart is associated with a drop in the level of a chemical substance called creatine, which can be measured using MRI technology.

Coronary-Artery Disease

The most common cause of death in the United States is myocardial infarction (MI), commonly known as heart attack. MI literally means the death of the myocardium, or heart muscle; it is usually due to blockages in the coronary arteries that feed blood to the myocardium. Of the approximately 1 million MIs annually in the United States, about 460,000 are fatal. Of these deaths, about 220,000 occur within one hour of the onset of symptoms and before the victim can reach a hospital.

Blockages in the coronary arteries are due to atherosclerosis, or hardening of the arteries, a condition characterized by fatty, sticky buildups, or plaques, in the inner wall of the coronary arteries and in other arteries throughout the body.

Coronary plaques impair blood flow to the myocardium, leading to reduced pumping force and lowered cardiac output of blood to the body. If plaques are severe but stable, they usually cause angina pectoris—a chest pain or pressure usually triggered by exercise or emotional stress.

Plaques have rough surfaces and are prone to rupturing into the blood, which results in the release of tissue substances that cause blood clots to form at the site of the plaque. These clots may block off the artery completely and suddenly, causing most MIs and sudden death.

Treatments for coronary plaques include medications that reduce the workload on the heart or dilate the arteries to increase blood flow past the narrowed areas. Aspirin and other drugs help thin the blood and prevent clotting if the plaques rupture. In more-serious cases, balloon-angioplasty catheters, which can reach the blockages and dilate the narrowed segments, have been successful. In some cases, doctors perform bypass surgery, in which surgeons take a vein or artery from elsewhere in the patient's body and graft it around the blockages to restore blood flow.

Myocardial Disease

Congestive heart failure (CHF) is a serious condition affecting nearly 5 million Americans and causing or contributing to some 250,000 deaths annually. It is marked by a progressive decline in the heart's pumping function. The heart chambers usually dilate in a vain effort to compensate for declining contraction. As the forward flow of blood declines, blood backs up in the heart. This leads to fluid leaking out of the blood vessels and into the tissues. The patient becomes fatigued, swollen, and short of breath. Death occurs from slow progression of pump failure or suddenly from rapid, abnormal heart rhythms.

There are many possible causes for CHF. Among the most common are severe coronary-artery blockages (which starve the muscle of blood); serious valve malfunctions (which prevent efficient forward flow of blood); damage to the heart muscle by high blood pressure; infections of the heart; and inflammation of the heart, or myocarditis, due to the body's immune system attacking its own tissue. Certain medications, alcohol, heavy metals, and other toxins can also cause CHF.

Treatment for CHF includes medications that improve cardiac contraction or reduce fluid retention. In extreme cases, transplantation of a normal heart is undertaken.

Valvular Disease

A number of diseases can affect the heart's four valves. Some are present from birth, while others result from wear and tear or specific injuries to the valves. There are two types of valvular disease. Leaking, or regurgitation, of the valves leads to the backward flow of blood through the heart's chambers, the backup of blood into the lungs and body, fluid retention, and reduced forward flow. Narrowing, or stenosis, of the valves also leads to reduced forward flow, increased backward pressure, and congestive heart failure. Reduced forward flow may also cause fainting and chest pains.

Valve diseases are accompanied by thickening, calcification, infection, inflammation, or rupture of the valve leaflets, and may be due to various causes. The abnormal structures distort blood flow, producing sounds called heart murmurs that can be heard through a stethoscope and detected by an echocardiogram.

Treatment includes medications to increase forward flow (for regurgitant types only); balloon valvuloplasty, in which balloons on catheters crack open stenotic valves; surgical valve repair (for regurgitant valves) involving tightening up of loose, leaky valves; or replacement of the whole valve with an artificial valve (either a metal type or a pig's heart valve).

Rhythmic Disorders

Abnormalities of the heart's rhythm, called arrhythmias, impair the efficiency of the heart, leading to reduced cardiac output, weakness, fainting, palpitations, fluid in the lungs, poor blood flow to the coronary blood vessels, and even death. Some arrhythmias are mild but annoying, while others are unpredictable and lethal. Millions suffer from various abnormal rhythms, and each year at least 40,000 U.S. deaths are attributed directly to these disorders.

Slow heartbeats may require a pacemaker, a wire electrode implanted through the skin and threaded through a large vein into the right ventricle, where it usually lodges in the chamber wall. There it produces an electric current that stimulates the myocardium when the heart's natural pacemaker fails to fire properly. Fast heartbeats may require medications to suppress abnormal areas of electrical activity. Catheter ablation or surgery to destroy the areas of the heart causing the arrhythmia is also common. In some cases, implantable defibrillators are used; these devices recognize fast rhythms and deliver an electric shock to the heart to restore normal rhythm.

Miscellaneous Heart Disorders

There are a number of rare but often deadly tumors that can affect the heart. A variety of blood clots, or thrombi, can arise in heart chambers or valves, break off, travel out of the heart to the body, and block off blood flow in various organs. Rheumatic fever is an inflammatory response to strep-throat infection in which heart tissue is attacked by the immune system. In endocarditis, bacteria and other organisms grow on the valves, erode through the leaflets, and impede valve functions. Pericardial diseases lead to tightening of the pericardium around the heart, causing strangulation; painful inflammation of the pericardial layers; or accumulation of fluid between the layers, which can dangerously compress the heart and prevent it from filling with blood.

The information provided should not be used during any medical emergency or for the diagnosis or treatment of any medical condition. A licensed physician should be consulted for diagnosis and treatment of any and all medical conditions. Call 911 for all medical emergencies.

Copyright Information: Public domain information with acknowledgement given to the U.S. National Library of Medicine.


AIDS

AIDS, or acquired immune deficiency syndrome, is a lethal disease that permanently disables the body's immune system. AIDS is characterized by a variety of debilitating symptoms that leave the body open to attack by one or more specific diseases, such as pneumonia, meningitis, or cancer. In the early years of the 21st century, nearly 40 million people are infected with the virus that causes AIDS, with an additional 20 million having already lost their lives to the disease. Although the number of new cases in highly industrialized nations has leveled off since the 1980s, the number has skyrocketed in Africa, Asia, and Latin America. In the United States, more than 925,000 are afflicted with the AIDS virus, and the disease has caused more than 520,000 deaths.

The disease was first recognized in 1981, when a number of previously healthy homosexual men in Los Angeles and New York City developed Pneumocystis carinii pneumonia, a rare parasitic disease of the lungs. At the same time, there was also a notable increase in cases of Kaposi's sarcoma, a rare form of cancer previously seen only in elderly men. It soon became apparent that there was a growing epidemic of opportunistic infections, disorders associated with a depressed immune system. Since that time, the epidemic has steadily spread: AIDS cases have been reported in almost every country.

The Human Immunodeficiency Virus

The causative agent of AIDS is a retrovirus. Retroviruses contain an enzyme called reverse transcriptase that enables the virus to replicate in a host cell. The AIDS virus was recognized independently by French researchers in 1983 and Americans in 1984. The French named the virus "lymphadenopathy-associated virus" or LAV; the Americans called their discovery the "human T-cell lymphotropic virus type III" (HTLV-III). Then, in 1986, the International Committee for the Taxonomy of Viruses designated "human immunodeficiency virus" (HIV) as the accepted name.

Since HIV was first identified, scientists have learned much about the virus and its mechanisms. However, despite the intensive worldwide biomedical effort against the disease, and the unprecedented speed with which knowledge has been accumulated, many questions remain. A vaccine, considered the best hope of stopping the AIDS epidemic, remains to be developed.

Pathology of AIDS

The immune system is the part of the body that fights infection and disease. It responds differently to various pathogens (disease-causing agents). Substances that induce an immune response, typically foreign proteins, are called antigens.

To fight infection, the body relies on T cells and B cells, white blood cells known as lymphocytes that are produced in the lymphatic system. When an antigen is present in the body, T cells process the antigen so that it can be recognized by B cells. The B cells then produce antibodies that match the specific antigen. The antibody binds to the antigen, rendering the pathogen susceptible to attack and destruction by the infection-fighting phagocytes. Once the body has produced antibodies against a specific pathogen, memory B cells can quickly produce new antibodies if the same pathogen attacks again.

HIV acts differently from most pathogens, and it is this difference that makes it such a threat. When HIV enters the body, it seeks out T-helper lymphocytes, special cells that help other lymphocytes produce antibodies. Because these cells carry a molecule known as CD-4 on their surfaces, they are also referred to as CD-4+ T cells, or simply CD-4 cells. The CD-4 molecules on these cells happen to serve as ideal receptors for HIV, so the invading virus seeks out CD-4 cells. When HIV locates one, it enters the cell and incorporates its genetic material into the cell's own. Once inside, the virus either reproduces so quickly that it destroys its host cell, or it replicates slowly, sending out more virus particles to attack other CD-4 cells.

HIV can duplicate itself because of its reverse-transcriptase enzyme. This enzyme enables the virus' genetic material, ribonucleic acid (RNA), to act as a blueprint for the production of deoxyribonucleic acid (DNA). The DNA then transmits the virus' hereditary characteristics to the next generation of HIV.

The destruction of CD-4 cells by HIV leaves the body vulnerable to attack by other pathogens. Although the lymphocytes still produce antibodies that match the HIV antigen, these antibodies fail to neutralize the virus. As a result, the HIV antibodies do not help the body eliminate the infection.

Once infection occurs, AIDS takes time to develop. Initially, many people with HIV have no symptoms at all. Others develop an acute but transient flulike illness with such symptoms as fever, chills, night sweats, skin rashes, and diarrhea.

People develop full-blown AIDS when their supply of CD-4 cells becomes so depleted that opportunistic infections are able to take advantage of the body's impaired immune system. The more serious of these conditions include Kaposi's sarcoma, Pneumocystis carinii pneumonia, disseminated mycobacteriosis (a bacterial disease), and cryptosporidiosis (a parasitic disease). AIDS can also cause life-threatening weight loss, often called "wasting syndrome," and neurological problems.

As doctors have learned more about how the disease manifests itself, they have expanded the roster of opportunistic illnesses that define cases of full-blown AIDS. Various additions to this list include pulmonary tuberculosis, recurrent pneumonia, and, in women, invasive cancer of the cervix. In addition, doctors use the actual count of CD-4 cells in persons with AIDS as a measure to classify the severity of HIV-related clinical conditions.

The period of time between becoming infected with HIV and developing AIDS varies considerably from person to person. Babies who become infected at birth often get sick before they are a year old. Some infected adults also rapidly develop AIDS, but others remain healthy for a decade or longer. One study found that 18 percent of HIV-infected hemophiliacs may live 25 years before developing full-fledged AIDS.

Findings indicate that, from the time of infection, a continual battle occurs between the virus and the body's immune system. Each day, as many as 1 billion new virus particles are produced, and the body produces up to 1 billion new CD-4 cells to fight them. Losses on both sides are staggering. Gradually, however, the body cannot keep up; more CD-4 cells are killed than are replaced, and eventually the person becomes vulnerable to opportunistic infections.


Transmission of AIDS

HIV is transmitted from an infected person to a healthy person in three main ways: through sexual intercourse; through the sharing of needles or syringes in intravenous drug use; or from a mother to her child before, during, or shortly after birth.

Most AIDS cases in North America and throughout most of Europe have occurred among intravenous drug abusers, homosexual men, and bisexual men, as well as their sexual partners. In the United States, 11 percent of all adult AIDS cases in 2000 were contracted through heterosexual contact: 5 percent of cases among men and 40 percent of cases among women. Proportionally, the spread of HIV through heterosexual contact in the United States is growing more rapidly than its spread by other routes.

The spread of AIDS leveled off in the United States during the 1990s. However, between 2000 and 2001, the number of diagnosed AIDS cases rose 1 percent, the first increase since 1993. Officials attribute the rise in HIV infections in part to increased risky sexual behavior.

In 1992, women accounted for 13.8 percent of persons living with AIDS; in 2001, the proportion exceeded 23 percent. Although African-Americans constitute only 12 percent of the population, they accounted for half of the new HIV infections reported in 2001.

In Africa and Asia, the primary route of transmission is via heterosexual contact. In response, the United Nations program on AIDS instituted a Global Strategy Framework on HIV/AIDS, with a particular focus on educating youth.

Some people who received transfusions of contaminated blood in the early 1980s were also infected with HIV and have since developed AIDS—and in many cases, died from the disease. Since 1985, when a test to determine the presence of HIV first became available, the blood supply in the United States and many other nations has been screened for HIV. Individuals who donate blood products are also screened.

No evidence supports the possibility of transmission by casual social contact. Nor have researchers observed any household transmission among family members and friends of persons with AIDS through close, but nonsexual, contact. Likewise, the virus has not been observed to spread via water, air, food, contact with inanimate objects, contact with raw sewage, or via insects such as mosquitoes. Although many bacteria and viruses are spread by these routes, HIV is not one of them. HIV has been found in the saliva and tears of infected persons, but these fluids have not been found to transmit the virus. In fact, these fluids contain HIV-killing proteins called lysozymes.

Testing for AIDS

In 1984 a test called the enzyme-linked immunosorbent assay (ELISA) was developed that can determine whether an individual is carrying HIV. The test does not detect the virus itself. Rather, it detects antibodies to HIV. If a tested person is found to be infected, or "positive," the test is often confirmed by another procedure known as a Western blot. Scientists now make use of a technique known as the polymerase chain reaction (PCR). This procedure, developed for use in the field of molecular biology, makes it possible to amplify tiny quantities of DNA. Scientists have used PCR to amplify quantities of viral DNA from the blood of HIV-infected persons. This has helped them to learn more about how the virus works.

Most people develop antibodies within 12 weeks of infection. If a person is tested after becoming infected but before developing antibodies, the results will be "negative"—yet the individual is carrying the virus and is capable of infecting others. There is also evidence that people are extremely contagious in the first eight weeks after getting HIV—a period during which they cannot know that they are infected.


Treatment of AIDS

There is no known cure for AIDS. Until 1995, there was no evidence to suggest that anyone who had become infected with HIV had been able to regain a normally functioning immune system. In 1995, it was reported that a 5-year-old boy infected at birth appeared to have fought off the virus by age 1; after that time, his body reportedly remained free of the virus.

Research facilities around the world design and test hundreds of possible treatments for AIDS. As more has been learned about the basic biology of the disease, there have been changes in the focus of research efforts and also changes in how and when people receive treatment. For example, it has been found that most cases of pediatric AIDS can be prevented if infected women take the drug AZT (azidothymidine) while they are pregnant.

AZT, the first drug licensed by the U.S. Food and Drug Administration (FDA) for AIDS treatment, is an antiviral agent. It inhibits HIV's reverse-transcriptase enzyme, slowing viral reproduction and allowing the population of CD-4 cells to rebound, at least temporarily. AZT has usually been used with other drugs, such as DDI (didanosine), nevirapine, and indinavir; such groupings have been nicknamed the "AIDS cocktail." By early in the 21st century, more than 25 drugs were being used in the treatment of HIV infection. They fall into several classes: nucleoside reverse transcriptase inhibitors (NRTIs), which interrupt an early stage of the virus' life cycle and include AZT; nonnucleoside reverse transcriptase inhibitors (NNRTIs), which include nevirapine; protease inhibitors, which interrupt virus replication at a later stage of the life cycle; and fusion inhibitors. Since HIV can become resistant to any single drug, a combination treatment may be used to suppress the virus. When NRTIs and protease inhibitors are given in combination, the regimen is called highly active antiretroviral therapy (HAART). Such treatment can be given both to people newly infected with HIV and to AIDS patients. HAART has not only reduced the number of deaths from AIDS, but has also improved the health of those afflicted with the disease. It is not a cure, however, and severe side effects can result; patients receiving the therapy must be monitored closely.
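The drug classes described above can be summarized in a small lookup table. This is a sketch in Python; the example drugs are only those named in the text, not a complete formulary:

```python
# Antiretroviral drug classes and example agents mentioned in the text.
ANTIRETROVIRAL_CLASSES = {
    "NRTI": ["AZT (azidothymidine)", "DDI (didanosine)"],  # block reverse transcription early
    "NNRTI": ["nevirapine"],                               # nonnucleoside RT inhibitors
    "protease inhibitor": ["indinavir"],                   # block a late replication stage
    "fusion inhibitor": [],                                # block viral entry into the cell
}

def classify(drug):
    """Return the class of a listed drug, or None if it is not listed."""
    for cls, drugs in ANTIRETROVIRAL_CLASSES.items():
        if any(drug in d for d in drugs):
            return cls
    return None

print(classify("nevirapine"))  # NNRTI
```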

Another promising avenue of treatment being investigated is the use of interleukin 2, a natural protein that regulates the body's immune defenses. Early studies indicate that infusions of this drug help increase CD-4 counts in people who are HIV-positive but have not yet developed AIDS.

Improvements also have been made in fighting the infections that threaten AIDS patients. For example, an aerosol drug known as pentamidine has been effective in treating the life-threatening illness Pneumocystis carinii pneumonia (PCP), which sometimes occurs repeatedly in AIDS patients. Drugs such as trimethoprim, sulfa, and atovaquone have also been effective.

Despite the promise of these various therapeutic approaches, there is widespread agreement among those in the medical community that achieving a "cure" for AIDS is not likely in the near future. A more obtainable objective is to control HIV and opportunistic infections, thus allowing people to remain relatively healthy for as long a period of time as possible.

Vaccine Research

A major quest of AIDS research is the development of a vaccine that will prevent HIV infection. In order to be successful, a vaccine must, in effect, "fool" the body's immune system into responding to what is actually a harmless threat, so that when the real viral enemy comes along, the body will already know how to fight it.

Scientists are continuing to experiment with various immunogens—substances that will evoke an immune response that protects against HIV. Some of the immunogens being evaluated include weakened or inactivated forms of the virus itself, or certain of its protein subunits. Another potential approach is the use of gene therapy. This would make CD-4 cells resistant to HIV infection by introducing an HIV-resistance gene directly into the cells.

Most candidate vaccines are in Phase I trials, which assess safety. Several are in Phase II, which evaluates dosage. In 2003, Phase III trials were completed on two vaccines—VaxGen's AIDSVAX B/B and AIDSVAX B/E—which proved to be ineffective. Meanwhile, efforts to produce an AIDS vaccine continue. The AIDS Vaccine Advocacy Coalition (AVAC) calls the ongoing development project "a long-term mission," maintaining that a vaccine would not be ready by 2007, a goal that had originally been set by President Bill Clinton and his administration.

Future Outlook

Clearly, the most profound impact of the AIDS epidemic is yet to come. The number of HIV infections and cases of AIDS continues to grow, creating prospects of a grim future, particularly in Africa and Asia. Providing health care for patients will require billions of dollars. In the United States, the cost of caring for a person with AIDS ranges from $14,000 to more than $34,000 per year. A substantial part of these costs is the price of drugs. Such high costs are an even greater problem in the poorer nations where AIDS has become so rampant. In response, the 2001 United Nations conference on AIDS called for $9 billion for a global AIDS fund. In 2003, U.S. President George W. Bush announced a new program to fight AIDS in Africa, pledging $15 billion to the cause.

It is critical that education about HIV and AIDS be promoted on a global scale, so that people make behavioral changes that will limit further transmission of the virus.

There are some bright spots on the horizon. Each year, scientists have a better understanding of how HIV works, and how the human immune system reacts to the virus. New drugs offer improved treatment. By the early 2000s, people infected with HIV were living longer than were patients in the early stages of the epidemic. There is hope that ever-more-sophisticated strategies will allow infected individuals to live longer before symptoms occur, and that some people with HIV may never develop AIDS.



Allergies

When the normal equilibrium of the human body is threatened by an external agent, a sequence of automatic defense mechanisms moves into action. If the body becomes overheated, for example, it begins to perspire in order to cool itself. If disease-causing bacteria or viruses invade, the body's immune system produces protective antibodies. An allergy represents a response by the immune system to agents perceived as possibly dangerous to the body. But in this case, the response to the agent, known as an allergen, is excessive, and the reaction is neither normal nor desirable.

There are literally hundreds of possible causes of allergy, and the reaction may express itself in different ways. One person breaks out in hives after eating strawberries. Another starts to sneeze in the presence of cats. A third develops itchy, watery eyes in reaction to dust. In people who suffer from asthma, exposure to an allergen may trigger sneezing, shortness of breath, wheezing, or even sudden death.


The symptoms of allergy may first appear at any time from infancy to old age. They occur most often before the age of 20. One of the major problems facing doctors who treat allergies is to recognize the condition early enough so that treatment can be started when it can do the most good. Many allergic reactions are mistaken for less-serious conditions, especially in the case of infants and young children.

Genetics plays a significant part in allergy. A person who has two allergic parents or whose family has a history of allergy is 10 times more likely to develop an allergy than is a person from a nonallergic family, and symptoms will usually appear at an earlier age.

Yet people with no trace of an allergic inheritance may also develop allergies, and some members of an allergic family may be completely free of allergic illness throughout their entire lives. Apparently, it is not the allergy itself, but rather, a tendency or susceptibility to allergy, that is passed from parent to child.


Exactly what causes an allergic reaction? Briefly, when a foreign substance first enters the body, it finds its way to the bloodstream. This foreign agent stimulates the production of immunoglobulin E (IgE), a type of antibody specifically produced to combat the invader.

IgE antibodies attach themselves to mast cells, located in parts of the body where foreign substances typically enter, such as respiratory passages, skin surfaces, and the digestive tract. The mast cell–IgE antibody complex can be likened to a mine ready to detonate at the first sign of the allergen's reappearance, even if the allergen does not invade again for many years.

When the same kind of allergen enters the bloodstream again, it triggers the mast cell–IgE antibody complex to release chemicals called mediators from the mast cells. These mediators attack the allergen, but in the process they may damage surrounding tissue. More than 15 mediators with different functions have been identified. The histamines, released in hay fever, are an example of these chemicals. They initiate a local inflammatory response resulting in sneezing and watery, puffy, itchy eyes. The exact symptoms of an allergic reaction depend on the effects of the mediators and on the part of the body where the allergen–mast cell–IgE complex is located. Three major types of allergenic substances are inhalants, foods, and skin-contact substances.


Inhalants.

Inhalant materials are the major cause of allergic attacks, including asthmatic episodes. The most common is plant pollen, such as that produced by the flowers of ragweed, grass, and trees. These pollens are generally the cause of seasonal hay fever. Usually the allergic person is sensitive to only one type of pollen, but sometimes one type of seasonal allergy leads to another.

Ordinary house dust, formed by the slow deterioration of many different materials in the home, is another inhalant offender. The hair and dander of many animals are allergenic materials, as are feathers, molds, insect sprays, and vegetable fibers. Strong odors and fumes, including cigarette smoke, may bring about a severe attack in an allergic individual.

Foods and liquids.

Food is a common source of allergic attacks. Eggs, milk, nuts, wheat, fish, meats, chocolate, and many other standard foods may produce an allergic reaction when eaten by themselves, or together with other allergenic foods, or as a minor ingredient in an otherwise harmless dish.

Skin contact.

Perhaps the best-known cause of allergic reactions through direct contact is poison ivy, which will cause an itchy, bothersome skin rash, even if a person merely brushes against the plant. But there are many other materials that may cause such a reaction in sensitive individuals, including cosmetics, hair dyes, clothing fabrics, plastics, metals, woods, chemicals, paints and varnishes, and jewelry.

Other allergens.

Medications such as penicillin, aspirin, and sulfa drugs can cause severe allergic reactions whether they are injected, swallowed, or merely brought into contact with the skin. Insect stings and bites, particularly those of the yellow jacket and other wasps and bees, also have been known to cause violent allergic reactions.

These types of allergens can produce a systemic (whole-body) reaction, as opposed to a local one. Very large amounts of mediator chemicals enter the circulatory system and cause lowered blood pressure and constriction of the small air passages in the lungs. In severe cases, without medical intervention, death may occur within half an hour. In the United States, systemic allergic reaction, called anaphylaxis, accounts for several thousand deaths a year.

The exact role of emotions in allergy, particularly asthma, remains uncertain.


Diagnosing Allergies

A skilled allergist may correctly diagnose an allergy by the symptoms alone. Often, however, further study is needed.

Medical history.

If the patient had frequent bouts of colic or diaper rash as an infant, or if parents or other close relatives have been treated for allergy, it is very likely that the patient is susceptible to allergy. A complete medical history may also indicate the specific factors in the patient's environment that cause allergic attacks.

Skin test.

A small amount of pollen, egg, or other suspected allergenic material is mixed with a solvent and applied to a scratch in the skin. A different substance is applied to each scratch. If the patient is allergic to one of the materials, a minor reaction will usually appear at the site of the scratch, often within minutes.

Radioallergosorbent test (RAST).

In some cases, a blood test can be used instead of a skin test. A sample of the patient's blood and an allergen are placed together in a test tube. The RAST measures the amounts of specific IgE antibodies present in the patient's blood. High levels of IgE indicate an immune response to the specific allergen.


Treating Allergies

Once the cause of the allergy has been determined, the simplest and most effective treatment is to avoid all future contact with the offending substance. Unfortunately, this strategy is not always possible. While it is relatively easy to avoid swallowing aspirin, touching poison ivy, or eating allergenic foods, it is considerably more difficult to avoid house dust or ragweed pollen during the height of the pollen season.


Hyposensitization

When contact with the allergenic material cannot be avoided, the patient's system must be trained to live with it. This is done by means of hyposensitization. The physician first injects an extremely small amount of the offending substance into the patient's system. Gradually, the amount injected is increased. In time, the patient's body builds a tolerance and becomes used to the allergen.

In the case of a seasonal allergy, injections may be given once or twice a week for several months before the start of each pollen season; alternatively, after the first year, the patient may be injected once every two or three weeks on a year-round schedule. At least two full years without any signs of allergy are necessary before hyposensitization can be halted. Even then, symptoms can reappear at any time.

In another type of therapy, the pollen extract in a nasal spray causes the body to produce immunoglobulin G (IgG). This antibody then competes with IgE to block the body's reaction to the allergen. This method enables patients to administer their own allergy treatments, reducing doctor visits and costs.


Great progress has been made in relieving the major symptoms of common allergies. Antihistamine drugs provide temporary relief for minor allergies. Steroid hormones (cortisone, hydrocortisone, and their synthetic substitutes) can also be extremely valuable. Both antihistamines and steroids are generally used only for temporary relief until hyposensitization can take effect.

For people who suffer from asthma, beta-agonist bronchodilators quickly relieve the shortness of breath that signals an asthma attack by widening the respiratory passageways. Although providing symptom relief, these drugs do not address the underlying problem. Some studies suggest that asthma patients who rely heavily on inhaled beta-agonist bronchodilators increase their risk of dying, perhaps because opening the airways exposes the lungs to increased concentrations of allergens. Long-term treatment with anti-inflammatory drugs, including inhaled steroids and cromolyn sodium, generally forms the basis of asthma management.

As the mechanism of allergy reaction is better understood, new treatments may arise. Work with conjugated allergens (complexes of allergens with chemicals) suggests that IgE production and the level of IgE in the blood may be controlled. Also, by the early 2000s, researchers were working on new allergy vaccines that combine allergy-triggering substances with synthetic DNA (deoxyribonucleic acid).

The information provided should not be used during any medical emergency or for the diagnosis or treatment of any medical condition. A licensed physician should be consulted for diagnosis and treatment of any and all medical conditions. Call 911 for all medical emergencies.

Copyright Information: Public domain information with acknowledgement given to the U.S. National Library of Medicine.

Imaging and Diagnosis

Doctors make diagnoses based on what they can see. For the earliest recorded physicians, this meant what they could detect with their own sense organs: the color of the skin, tongue, and fingernails; the smell of the breath and urine; the feel of the skin temperature and pulse rate; and the sound of a cry of pain when they touched a tender spot.

For centuries, physicians relied on little more than their senses and their experience to make diagnoses. Not only the inner workings of the body but even its internal structures were a mystery. Physicians could feel hard bones, elastic muscles, and soft organs. But they had little knowledge of what lay beneath the skin. Only in the 16th century, when doctors in Italy are rumored to have stolen corpses from the gallows and dissected them, did physicians begin to understand the internal parts and workings of the human machine. In 1543, the same year as the publication of Copernicus' heliocentric theory of the solar system, Andreas Vesalius produced the first essentially correct book of anatomy.

But dead bodies and living ones are two very different things. More than three centuries passed before a tool to look inside the living human body was developed. This tool was the X ray. It was first applied to studying the human body in 1895, four days after its discovery was announced.

X Rays—The First Imager

Wilhelm Konrad Roentgen, a German physicist, was studying a device called a Crookes tube. This tube contained a partial vacuum and had electric wires connected to two metallic electrodes inside the tube. When a high-voltage current was applied to the electrodes, the tube glowed with a phosphorescent light. In one experiment, Roentgen had the tube enclosed in a black paper cover. In his darkened laboratory, he noticed a greenish glow several feet away whenever he turned the tube on. Investigating, he found that the glow came from a paper screen coated with barium platinocyanide. Further experiment showed that the screen continued to glow even when it was shielded from the tube by several layers of cardboard and paper, and that a photographic plate near the tube was blackened. Roentgen came to the conclusion that some kind of invisible rays must be coming from the tube. He dubbed them X rays, the "x" being the common mathematical symbol for an unknown.

Further experimentation led Roentgen to discover that X rays travel through some, but not all, materials. In time, he experimented with his own body. In his first public lecture on the new rays, he called for a volunteer to be X-rayed. He placed the volunteer's hand between the Crookes tube and the photographic plate. When he developed the plate, he found the image of the hand. Although the X rays could pass through the hand, they were blocked in different amounts by different tissues. The dense bones appeared relatively light in color because they resisted the passage of X rays to the plate. Softer surrounding tissues appeared darker because they were less dense and offered less resistance to the passage of X rays. The image on the photographic plate was essentially a negative on the basis of tissue density. Medical personnel now read the negative image obtained directly on photographic film to evaluate the X-ray findings.
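The contrast Roentgen observed follows a simple attenuation law: the fraction of X rays transmitted falls off exponentially with the density and thickness of the tissue in the beam's path. A minimal sketch of the idea, using hypothetical attenuation coefficients chosen only to show the ordering, not real tissue values:

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Fraction of incident X-ray intensity that reaches the plate."""
    return math.exp(-mu_per_cm * thickness_cm)

# Hypothetical coefficients: dense bone attenuates far more than soft tissue.
bone = transmitted_fraction(0.9, 1.0)  # strong attenuation
soft = transmitted_fraction(0.2, 1.0)  # weak attenuation

# Less radiation gets through bone, so bone exposes the plate less and
# appears light on the negative, as described above.
assert bone < soft
print(round(bone, 3), round(soft, 3))
```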

In a Crookes tube, electrons are propelled through a partial vacuum. Electrons are subatomic particles that carry a negative charge of electricity. The glow that Roentgen saw in the tube was light waves produced when the electrons bounced against the gas molecules remaining in the tube. The electrons that hit the glass walls produced the X rays.

Today we know that X rays and light are different wavelengths of the same kind of electromagnetic radiation. Electromagnetic radiation is produced whenever electrons are deflected from their paths. The high energy of the electrons in the Crookes tube and the sharp deflection on meeting molecules of glass produce X rays. The more moderate deflection of the electrons on meeting gas molecules in the tube produces light.

X rays gave physicians their first easy look inside the living body. The procedure required no surgery and caused no pain. The patient simply held the body part being photographed still until the plate was exposed.

X rays provided a great deal of information not available before. Physicians could study the living skeletal system and organ placements and get a clear picture of broken bones, which previously they could feel only through the skin. They could also identify small, hairline fractures that tactile examinations could not pick up, and they could detect tumors in some body parts and displacements of certain bones and organs.

Today X-ray pictures, or radiographs, are still the most widely used images in medical diagnosis. They are almost routine when there is a limb or joint injury or a blow to the head. They also show lung tumors and enlarged organs.

But radiographs have their limitations. They give only a two-dimensional view of the body. Layers of body organs can make X-ray images difficult to interpret. X rays often cannot show structures deep within a tissue. Furthermore, X-ray overexposure can damage tissues and cause cancer. That is why doctors limit the amount of X rays that any one person receives. It is also why dentists usually leave the room while X rays are taken of the teeth. The accumulation of exposure to stray X rays could cause a dentist to develop cancer.

CAT Scans

A new tool emerged in the 1950s that revolutionized diagnostic imaging. This tool is the computer. The coupling of computers and imaging technologies enables physicians to obtain cross-sectional images of the body that eliminate X-ray shadows of structures in front of and behind the desired sections. The computer can store information, manipulate it, and show it as sequential images in motion. In other words, the computer can give moving pictures of the body at work. Doctors can now see internal organs at work, blood coursing through vessels, and even metabolic changes in active cells.

Some of the new imaging devices rely on X rays, while others exploit totally new medical technologies. The CAT (computerized axial tomography) scan—also called the CT (computed tomography) scan—is a technology that has been used medically since the early 1970s. It is derived from X rays, but with a powerful difference. The source of X rays is in a rotating doughnut in which the patient lies during examination. X rays are shot out in a fan shape. Sensors on the opposite side of the doughnut detect the amount of radiation passing through the body, and send this information to a computer in which the data are stored. The X-ray source moves a few degrees and fires another burst of radiation. The sensors pick up these readings and add these data to the computer. This process continues all the way around the body.

The computer then analyzes the information based on preprogrammed instructions and creates an image on a monitor. This image is a cross section of the body at the level of the scan. If the scan is taken of the head, the image will show a cross section of the brain. Likewise, a scan of the chest will show the lungs and heart.

Additional scans in an area provide a series of images of the internal structure of an organ. The images also show the changes in organs as they work, such as the pumping of the heart, the flow of blood through the vessels, the churning of the stomach. These images provide insight about the health of an organ.

DSR and DSA Scans

Dynamic spatial reconstructor (DSR) is a more sophisticated type of CAT scan. The DSR machine isolates a working organ, photographically slicing through the organ from any desired angle. In the time it takes a CAT apparatus to make one scan, the DSR machine can make up to 75,000. Using computer storage and manipulation, the DSR machine combines the information from these scans into a single three-dimensional image. Like the CAT machine, it can take images of organs as they are moving, but because it displays the organs in three dimensions, the image can provide more information to the trained eye.

Digital subtraction angiography (DSA) also uses X rays, but in a different way than CAT or DSR. An X-ray image is taken of the area under study, such as the heart. Then a substance that does not allow the passage of X rays is injected into the area of interest, say the coronary arteries. A second X-ray image is taken. The computer compares the two images and subtracts the information in the first image from the information in the second. Everything common to both exposures cancels out, leaving an image that highlights only the regions filled with the X-ray-opaque substance.
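The subtraction step itself is simple pixel arithmetic on the two digitized images. A toy sketch with hypothetical 3-by-3 pixel values (real DSA images are far larger, and the two exposures must first be aligned):

```python
# Pre-contrast "mask" image: background anatomy only.
mask = [
    [10, 12, 10],
    [11, 40, 11],
    [10, 12, 10],
]

# Post-contrast image: identical anatomy, plus a brighter center pixel
# where the X-ray-opaque substance has filled a vessel.
contrast = [
    [10, 12, 10],
    [11, 90, 11],
    [10, 12, 10],
]

# Pixel-wise subtraction: shared anatomy cancels, the vessel remains.
subtracted = [
    [c - m for c, m in zip(crow, mrow)]
    for crow, mrow in zip(contrast, mask)
]
print(subtracted)  # [[0, 0, 0], [0, 50, 0], [0, 0, 0]]
```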

Bone Densitometry

Dual-energy X-ray absorptiometry (DEXA), also known as dual X-ray absorptiometry (DXA), is an effective method of testing bone density. Using X-ray photons at two different energy levels, DXA measures the amount of photon absorption by the bone. The degree of photon absorption is related to the mineral content of the bone itself. DXA provides both an X-ray-like image of the bone and a measure of the bone's density. The process is faster than other diagnostic methods, taking from 2 minutes for a femur scan to 10 minutes for a total body scan.
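The dual-energy principle can be shown with a small worked example: measuring attenuation at two photon energies gives two equations in two unknowns, the amounts of bone and soft tissue along the beam, which a 2-by-2 linear solve separates. The attenuation coefficients below are hypothetical, not calibrated scanner values:

```python
# Hypothetical attenuation coefficients (per cm) at (low, high) energy.
# Bone attenuates strongly, especially at low energy; soft tissue less so.
mu_bone = (0.60, 0.30)
mu_soft = (0.25, 0.20)

def attenuation(t_bone, t_soft, e):
    """Measured ln(I0/I) through t_bone cm of bone and t_soft cm of
    soft tissue at energy index e (0 = low, 1 = high)."""
    return mu_bone[e] * t_bone + mu_soft[e] * t_soft

# Simulate one beam path through 1.5 cm of bone and 10 cm of soft tissue.
a_low = attenuation(1.5, 10.0, 0)
a_high = attenuation(1.5, 10.0, 1)

# Solve the 2x2 system (Cramer's rule) to recover the two thicknesses.
det = mu_bone[0] * mu_soft[1] - mu_bone[1] * mu_soft[0]
t_bone = (a_low * mu_soft[1] - a_high * mu_soft[0]) / det
t_soft = (mu_bone[0] * a_high - mu_bone[1] * a_low) / det

print(round(t_bone, 3), round(t_soft, 3))  # recovers 1.5 and 10.0
```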

Bone-density measurement is an extremely important factor in the early detection of osteoporosis, a disease in which bone mass is lost progressively, resulting in brittle, easily fractured bones. DXA can be targeted at people who are especially at risk for osteoporosis (such as women who are postmenopausal). Early detection often allows for more effective treatments.


Sonography

Not all of the new tools that extend the physician's senses are based on X rays. One that has been in use for many years employs sound to "see" inside the human body. The apparatus for sonography, or ultrasound, consists of three specific components: a monitor, a computer, and a probe. The probe is both the transmitter and the receiver of ultrasound waves, sound in a range too high for human ears to hear.

Sonography works on the same principle as sonar—echoes from objects are used to locate the objects. Higher pitches give a more precise location, since sound waves become shorter as the pitch is raised. Bats, which use sonar to locate and zero in on small flying insects, use pitches that are in the ultrasound range.

The technician moves the probe in a circular motion over the area under study. Sound waves sent out from the probe penetrate the body and reflect off internal structures. The probe captures the returning echoes and transmits them to the computer. The echoes are analyzed and constructed into an image on a monitor. The process is so rapid that the technician can reposition the probe to get more information as the imaging occurs.
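The depth calculation behind each echo is straightforward: sound travels through soft tissue at roughly 1,540 meters per second (an assumed average calibration value), and the pulse makes a round trip, so the one-way depth is half the distance traveled. A minimal sketch:

```python
SPEED_IN_TISSUE = 1540.0  # meters per second; assumed soft-tissue average

def echo_depth_cm(round_trip_seconds):
    """Depth of the reflecting structure, in centimeters.

    The pulse travels to the structure and back, so the one-way
    distance is half the total path.
    """
    one_way_m = SPEED_IN_TISSUE * round_trip_seconds / 2.0
    return one_way_m * 100.0

# A 100-microsecond round trip places the echo about 7.7 cm deep.
print(echo_depth_cm(100e-6))
```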

Sonography is widely used to study fetuses in utero; it can identify birth defects prenatally, particularly in women who are at high risk for pregnancy problems. Also, it is often used to determine the fetus' age when the time of conception is uncertain, to detect twins, and to guide invasive procedures, such as amniocentesis. Increasingly, sonography is also being used to detect and treat heart disease, as well as vascular diseases that can lead to stroke.

Sonography is also applied in the Doppler color flow imaging technique. By measuring sound waves reflected from blood flowing in vessels, an ultrasound machine can produce a picture of a patient's vascular system. The blood moving within the human body is shown in color, enabling doctors to pinpoint such heart-threatening problems as clots, cholesterol deposits, and vessel narrowing.
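The velocity estimate in Doppler imaging rests on the Doppler equation, v = c * df / (2 * f0 * cos(theta)): the frequency shift df of the echo is proportional to blood velocity, with a factor of two for the round trip and a correction for the angle theta between the beam and the flow. A sketch with illustrative numbers, not values from a real scanner:

```python
import math

C_TISSUE = 1540.0  # speed of sound in soft tissue, m/s (assumed average)

def flow_velocity(f0_hz, shift_hz, angle_deg):
    """Blood velocity (m/s) from transmit frequency, Doppler shift,
    and the angle between the ultrasound beam and the flow."""
    return C_TISSUE * shift_hz / (2.0 * f0_hz * math.cos(math.radians(angle_deg)))

# A 2 kHz shift at a 5 MHz transmit frequency, beam at 60 degrees
# to the vessel, corresponds to flow of about 0.62 m/s.
v = flow_velocity(5e6, 2000.0, 60.0)
print(round(v, 3))
```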

MRI Images

Magnetic resonance imaging (MRI) is a powerful type of imaging technology that began to be used in hospitals in the early 1980s. Also referred to as nuclear magnetic resonance (NMR), this technique uses the body's water molecules as the basis for its images.

The theoretical basis for this technology is that a powerful magnetic field turned on and off can cause the nuclei of certain atoms to line up in a particular direction. The hydrogen atoms in water are among the ones affected by magnetism in this way. The MRI machine also releases a burst of radio waves, which, like light and X rays, are a form of electromagnetic radiation. Just as the electrons in a Crookes tube interact with the molecules of glass, radio waves of the proper frequency can be used to change the alignment of the hydrogen atoms that have been "organized" by the magnetic field. When the magnetic field is turned off, the atoms move back into alignment, producing a burst of electromagnetic radiation of their own. Each atom produces a different signal. These signals can be received, compared, and used to produce an image. With the human body, this is accomplished with giant magnets.

The MRI machine itself looks like a tunnel. Its outer walls contain magnets that can be switched on and off. The magnets produce a magnetic field so powerful that it can pull iron objects into the machine from across the room. Physicians, nurses, technologists, and patients all have to be careful not to wear iron belt buckles or use other iron objects when the machine is going to be in operation. Other metal objects—such as pens, keys, or scissors—must be removed from the room. The patient is placed on a stretcher within the tunnel, and the magnetic field is turned on. The nuclei of hydrogen atoms in water molecules in the body line up parallel to the direction of the magnetic force. This is similar to aligning iron filings with a bar magnet. The radio waves then briefly shift the atoms, after which the atoms are allowed to return to their former alignment. In this brief period of changing position, the nuclei emit radio waves that can be recorded and analyzed.

The diagnostic images produced by MRI can help identify stroke-ravaged regions of the brain and, through tumor imaging, gauge the efficacy of cancer treatment. Another device, called the echo planar MR scanner, is able to pinpoint the area of a stroke at its earliest stages—even while it is happening—by measuring the rate of water diffusion in the brain. Water movement slows down considerably during a stroke. This technique may lead to advances in limiting the brain damage caused by strokes.

Radioactive Tracers and PET Scans

Radioactive tracers are substances that include a radioactive isotope in place of a normal nonradioactive element. An isotope of an element is chemically the same as the element; however, it has a different mass because of a difference in the number of neutrons in its atoms. The number of neutrons in an atom can affect its stability, so many isotopes of nonradioactive elements are radioactive. Tracers often contain radioactive isotopes of carbon, oxygen, or nitrogen, all of which are elements found in high concentrations within the body.

Radioactive tracers are injected or inhaled into the body. Their interaction with the body can be tracked by instruments sensitive to radioactive emissions. Radioactive tracers can be used with CAT imaging as well as with simpler systems.
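One practical constraint on tracer studies is radioactive decay: the available signal falls by half with each half-life, so short-lived isotopes must be used quickly. A small illustration (the 20-minute half-life below is hypothetical, chosen only as typical of short-lived tracer isotopes):

```python
def remaining_activity(a0, elapsed_min, half_life_min):
    """Tracer activity left after elapsed_min, given the isotope's
    half-life: each half-life halves the remaining activity."""
    return a0 * 0.5 ** (elapsed_min / half_life_min)

# With a 20-minute half-life, after 60 minutes (three half-lives)
# only one-eighth of the original dose remains.
print(remaining_activity(100.0, 60.0, 20.0))  # 12.5
```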

When the radioactive tracer is a nutrient such as glucose, physicians can study how it is used by the healthy body and by the diseased body. They can follow its uptake, distribution, and consumption within specific body tissues. When the radioactive tracer used is a drug or a poison, the doctors can study where the substance becomes concentrated, where it is metabolized, where it is stored, and how it is eliminated from the body.

Sometimes radioactive tracers are used because they attach to or become part of a tissue of interest. For example, carbon monoxide has an affinity for certain blood cells. By having a person inhale small amounts of radioactive carbon monoxide, the course of blood through the body can be traced.

Positron emission tomography (PET) is an imaging technique developed in the 1980s that uses a special type of radioactive tracer. This technique traces the path of specific radioactive molecules through the body as they enter cells or are metabolized by cells.

Some radioactive materials give off tiny particles called positrons. Positrons are subatomic particles that are exactly like electrons in size and amount of charge. However, positrons are positively charged—not negatively charged, as electrons are. In fact, a positron is the antiparticle of the electron—its mirror image. When a particle meets its antiparticle, both particles vanish, and only electromagnetic radiation is left. In PET, it is more convenient to think of the electromagnetic radiation as particles, or photons, than as waves. As the radioactive substances travel through the body, they act much as an electronic signaling device does. They leave a photon trail that the PET scanner can recognize and record.

The PET apparatus is shaped much like the CAT scanner. It has a ring or doughnut shape in which the patient lies. The ring is positioned along the area to be studied. The patient is given an appropriate radioactive substance, one that is normally used by the body and that has been made radioactive by incorporating isotopes into it. Usually, the physician "administers" the radioactivity intravenously via radioactive sugar or protein products, or has the patient inhale radioactive gases. As the radioactive substance is absorbed by and interacts with the area under study, photons leave the body and strike the encircling ring. The pattern they make is analyzed by computer and projected as an image on a monitor.

What is truly unique about PET is that it allows physicians to watch body processes as they occur. Oxygen uptake by the brain can be observed; the rate of sugar use in various cells under a given set of conditions can be compared. This technology not only aids in diagnosis, it also provides information on the relationships between structures and normal metabolic processes. It has proven to be valuable in studying brain functions, including memory, and observing the brain as words are spoken or read, and so on.

Recently, radiologists have begun to test multimodality machines that perform two types of nuclear imaging: PET and single photon emission computed tomography (SPECT). The latter uses radioisotopes to measure blood flow in small vessels and is particularly well-suited to imaging the brain. Multimodality machines appear to offer distinct advantages over devices that only perform PET or SPECT. The resolution is better, and the machine permits a greater range of applications. PET has proven to be a safe and efficient method for imaging various forms of cancer, heart disease, and a number of neurological disorders, including Alzheimer's disease.


Thermography

Thermography uses differences in body temperature to create images. These differences arise because different areas of the body have different concentrations of blood vessels, and because cells that are metabolically active produce more heat than those that are less active.

Like all of the forms of imaging discussed, except sonography and radioactive tracers, thermography uses electromagnetism to see inside the body. In this case the waves are infrared, just a little longer than light waves. The image, therefore, is fairly sharp, although not as sharp as images based on X rays.

Thermography is a technique that is especially useful in locating cancerous tumors because these tumors are quite active metabolically. Thermography can also be used to identify areas of the body where there is reduced blood flow.

Better Images and Diagnoses

The images produced by CAT, DSR, DSA, MRI, and PET are initially produced in black and white. To the trained eye, they speak volumes about the tissues being observed, but these images can be enhanced further by the addition of color. By programming the computer to assign specific colors to specific signals, simulated color images can be produced. Not only is this more attractive visually, but it can be of real assistance to medical personnel using the images during surgery or other treatment. If, for instance, a brain tumor is found, and the course of treatment is to be radiation, color images that highlight sensitive structures nearby, such as the eyes or optic nerves, can be of immeasurable help to the physician as he or she plans the next course of action. Since the images are so precise, the physician administering the treatment has more information.
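The color assignment described above amounts to a lookup table that maps signal intensities to display colors. A minimal sketch with a hypothetical three-band palette (real systems use smooth colormaps with many more levels):

```python
# Hypothetical palette: ranges of 8-bit grayscale values and the
# display color assigned to each.
PALETTE = [
    (0, 85, "blue"),     # low signal
    (86, 170, "green"),  # mid signal
    (171, 255, "red"),   # high signal
]

def colorize(pixel):
    """Map an 8-bit grayscale value to its palette color."""
    for low, high, color in PALETTE:
        if low <= pixel <= high:
            return color
    raise ValueError("pixel out of 8-bit range")

print([colorize(p) for p in (40, 120, 200)])  # ['blue', 'green', 'red']
```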

People who suffer from heart and circulatory diseases have benefited greatly from the new imaging techniques, several of which show the human heart in action and the flow of blood through vessels. Irregularities in the heart's beating, and faulty performance of the valves within the heart, can be seen; degeneration in the strong muscular ventricles identified; blockages in blood vessels detected; and ballooning of overstretched vessel walls in an aneurysm spotted.

These same techniques can be used to study the heart under stress. By injecting a patient with a heart stimulant, the heart can be made to race while a series of images is taken. By comparing images of the stressed and unstressed heart, physicians can determine even more about the health and overall functioning of this vital muscle.

The brain is the second body organ for which imaging technology has become especially important. CAT scans, PET scans, and MRI scans have made it easier to detect brain tumors, even when the tumors are quite small. MRI scans are especially useful because they can clearly distinguish between white and gray matter based on their different water content.

PET scans are good for detecting the locations of epileptic activity. During a seizure, these areas are more active than are surrounding tissues, and therefore consume more glucose. Once identified, these small areas sometimes can be surgically removed to control the epilepsy.

Brain scans conducted for a specific reason sometimes detect silent conditions. A small aneurysm or a congenital defect that has produced no symptoms may be identified. Treatment can prevent later problems caused by these conditions.

Early detection of cancer is another payoff of the new imaging devices. Cancer cells have a high metabolic rate that makes them readily detectable by PET and MRI. Tumors can be spotted much earlier with these techniques than with others.

Mammography, a technique used to identify breast cancer, can detect tumors of less than 0.5 inch (1.25 centimeters). X rays are sent through the breast. If tiny calcium deposits are present, they show up on the X rays. Such deposits are often associated with cancerous tumors and indicate that additional testing needs to be done. Sonography can also be used in breast-cancer detection. It can detect only those tumors larger than 0.4 inch (1 centimeter) in diameter, but it can distinguish between malignant and certain benign tumors. In addition, CAT scans and radioactive tracers can be used to identify breast cancer. Cancers of the breast take up iodine, so radioactive iodine is used as a tracer in this procedure. A baseline CAT scan is performed; then the patient is injected with radioactive iodine. After a waiting period, a second CAT scan is made. A comparison of the two helps doctors locate any cancerous growths that might be present.

The CAT scan is useful in producing images of bones and surrounding tissues. Bone misalignment, joint malformations, and traumatic damage to areas involving bones and nerves can be clearly imaged with CAT scans. An emerging area for CAT-scan use is in reconstructing damaged or deformed bones. The images from CAT can be used as the basis for creating artificial bones and joints. The information from these scans allows the artificial-bone makers to construct an almost perfect replacement—one that is a good fit with the person's own healthy bones.

Advanced imaging technologies improve our understanding of the human body, simplify the diagnosis of illnesses and conditions that were once difficult to identify, enhance the safety of surgery, and reduce the need for exploratory surgery. In addition, imaging technologies may detect certain conditions before symptoms appear, making preventive intervention possible. Nonetheless, the full potential of these technologies has yet to be realized. For example, in some cases, techniques may be more powerful in combination. Current research is evaluating PET/MRI in the study of pain, as well as CAT/MRI in determining the location and composition of tumors.


Medications and Drugs

Drugs are chemical compounds that modify the body's natural chemical reactions. Used as medications, they can alter how the mind and body function. As a general rule, the effects of medications and drugs are only temporary. They tend to alleviate or mask unpleasant symptoms while allowing the body either to restore itself or to function within a normal range while the drug remains in the bloodstream. For example, many antibiotics do not actually kill bacteria; they merely slow down their growth so that the body's immune system can eliminate the infection.

Metabolism of Drugs

Most drugs used today are chemical compounds created in the laboratory rather than naturally occurring substances. After they have been at work for a certain period of time, the drugs are fully processed by the body and gradually eliminated.

In order for a drug to reach its intended destination, it usually needs to enter the bloodstream. This can be accomplished in several ways: medications can be swallowed, injected, applied to the skin (via transdermal patch), or introduced rectally. Drugs that are swallowed are picked up and transported to the liver, where a portion of the drug's molecules may be metabolized (chemically altered) before reaching the general bloodstream. Some drugs enter the body in an inactive form and must be metabolized by the liver in order to become active.

Once the drug molecules are in the bloodstream, they travel not only to their destination, but throughout the body. Aspirin, for example, taken to relieve a headache, also goes to the stomach lining, increasing the risk of ulcers. Some drugs are distributed only into the bloodstream and extracellular fluid, while others can penetrate into cells.

The body eventually removes medication by metabolizing or eliminating it—many drugs require both processes. Most medications are metabolized by the liver, and the majority are eliminated by the kidneys or in the bile. Drugs eliminated through the kidneys go to the bladder and leave the body in the urine, while those excreted into bile are sent out into the small intestine and ultimately eliminated in the feces.

Toxicity, Interaction, and Side Effects

Poisonous or toxic properties of medicines must always be examined in detail. Virtually all drugs are toxic to some extent. Sometimes a toxic effect is an extension of the desired effect at a higher dose level. For example, anticoagulants prolong the clotting time of blood, but if the clotting time is prolonged excessively because of an overdosage, hemorrhaging can occur.

Toxicity can also take the form of a side effect more or less unrelated to the drug's primary action. Many drugs cause gastrointestinal disturbances that trigger nausea, vomiting, or diarrhea. These effects occur before the drug is absorbed and reaches its site of action. Other toxic effects include allergic reactions.

In studying drug toxicity in animals, the medicinal scientist uses specific criteria to relate effectiveness and toxicity—such as a median effective dose versus a median lethal dose. As a general rule, the wider the margin between the lethal and effective dosages, the safer the drug is likely to be for human use—although medicines causing nonlethal side effects are exceptions to this rule. In such cases, the scientist must weigh the seriousness of the disease against the benefit derived from a given drug and the extent of discomfort or damage that may result.
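The relationship described above is often expressed as a simple ratio, commonly called the therapeutic index. The sketch below is illustrative only; the doses are hypothetical numbers, not data for any real drug.

```python
def therapeutic_index(ld50_mg_kg: float, ed50_mg_kg: float) -> float:
    """Ratio of median lethal dose (LD50) to median effective dose (ED50).

    The larger the ratio, the wider the margin between the dose that
    works and the dose that kills, and, as a rule, the safer the drug.
    """
    return ld50_mg_kg / ed50_mg_kg

# Hypothetical drug A: effective at 5 mg/kg, median lethal dose 500 mg/kg
print(therapeutic_index(500.0, 5.0))   # 100.0 -> wide safety margin

# Hypothetical drug B: effective at 5 mg/kg, median lethal dose 20 mg/kg
print(therapeutic_index(20.0, 5.0))    # 4.0 -> narrow margin, riskier
```

As the text notes, this ratio is only a starting point: a drug with a narrow margin may still be justified when the disease itself is life-threatening.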

The interactions of various drugs can produce unusual symptoms in specific patients. Because of the lack of an adequate explanation, many of these symptoms might be attributed to an unusual manifestation of the disease itself. Only in recent years has it become evident that one drug may affect the action of another, and when two or more drugs are ingested, the possibilities for interactions can be great.

Closely related to drug toxicity and interaction is another consequence of drug usage—the possibility of a drug becoming addictive. In the early 20th century, drug addiction was the general term for any illicit use of drugs. In the 1930s, the term habituation came to describe a person's psychological dependence upon a drug—the emotional distress, but not the physiological illness, that resulted from the drug's absence. Drug addiction was characterized as the physical dependence on a drug's effects, with illness, or withdrawal symptoms, developing if drug use was halted.

These definitions hardly covered the scope of the problems associated with drug use. In 1965, the World Health Organization (WHO) recommended that the term "drug dependence" be substituted for "drug addiction." Drug dependence is characterized by: the compulsive use of a drug for an extended time; the loss of control over its use, such that users take extraordinary or even harmful measures to continue using the drug; and the continued use of a drug despite adverse consequences or to stave off illness once the drug use is halted.

While many drugs ranging from analgesics (painkillers) to tranquilizers may cause some degree of dependence, the severity is often related to drug tolerance—when progressively more and more of the drug is required to get the desired effect.

The History of Drugs

Before the advent of modern chemistry, physicians seeking drugs to treat their patients could select only natural products (plants, animals, fungi) or inorganic materials (minerals). In ancient Egypt, for instance, priest-physicians concocted vile potions from dead flies, dried excreta, and bitter herbs, which were then inserted inside an appropriate orifice of the body.

While natural medications have always been significant drugs, it remained for Paracelsus (1493–1541) and the spirit of the European Renaissance to usher in the age of chemical drugs. Paracelsus advocated the internal use of chemical remedies, such as mineral salts and acids, and substances prepared by such chemical processes as distillation and extraction. Paracelsus believed it was no longer the task of alchemy to make gold, but to make medicines.

Modern medicines are largely synthetic, even when they are derived from natural substances. For instance, coca leaves and Peruvian bark from the Andes Mountains of South America were used as far back as 500 B.C. They came into conventional medical use during the 17th century, and two centuries later, the active substances cocaine and quinine were chemically isolated from them. From the "vomiting nut," nux vomica, the alkaloid known as strychnine was derived. While the ancient Greeks crushed willow bark and leaves in olive oil to apply to arthritic joints or muscles, they could not have known that willow (Salix) contains a glucoside called salicin; its chemical makeup led to the synthesis of salicylic acid, whose chief derivative is aspirin. Aspirin is thus a chemical compound created from a natural substance after evolving through several transitory stages in the laboratory.

Types of Drugs


Antibiotics

Antibiotics illustrate how natural substances can be used to create synthetic drugs. When Scottish physician Alexander Fleming happened to discover the antibacterial properties of the mold Penicillium notatum in 1928, he could not have known that he had discovered the storied antibiotic penicillin—the first "wonder drug" able to prevent the growth of bacterial parasites in infected hosts. Antibiotics are natural products produced by microorganisms, and they have the unique property of inhibiting the growth of other microorganisms. Still, antibiotic therapy remains imperfect. Infections caused by certain strains of Pseudomonas, for example, are difficult to control. That is why ticarcillin (Ticar), produced from the semisynthetic penicillin ampicillin, was obtained in 1964. Ticar was the first antibiotic to successfully combat Pseudomonas aeruginosa, a particularly troublesome pathogen that can be life-threatening both in severely burned patients and in those with impaired immune systems. In 1971, Ticar was superseded by another penicillin derivative—azlocillin—which remains one of the most effective drugs used against Pseudomonas infections.

Other antibiotic-producing organisms have been found among soil microbes. In 1940, microbiologist Selman Waksman first obtained actinomycin A from Actinomyces antibioticus. The actinomycetes have features in common with both bacteria and fungi; actinomycin A was the first of many antibiotics derived from them. Their advantage: effectiveness against both bacterial and fungal infections. Waksman's continued work with the actinomycetes led to streptomycin—the first antibiotic effective in fighting tuberculosis.

Following the discovery of streptomycin, the pharmaceutical industry began screening soil samples from every corner of the globe in search of potential antibiotics. Among the new batch of antibiotics discovered was chloramphenicol—the first antibiotic effective against a broad spectrum of diseases, including typhus, typhoid, and meningitis. Other often-used antibiotics include the tetracyclines—doxycycline, minocycline, and plain tetracycline. Extremely effective against a wide range of infections, these drugs were derived from soil-dwelling Streptomyces bacteria.

In recent years, there has been a dramatic rise in microbial resistance to antibiotics, which has complicated physicians' efforts to treat various kinds of infections. Most major bacterial pathogens have acquired antibiotic-resistant genes, and some strains have even developed a resistance to nearly all available antibiotics. Among these "superbugs" are the ones responsible for Salmonella food poisoning, upper respiratory illnesses like bronchitis, and sexually transmitted diseases, such as gonorrhea.

A major cause of the spread of resistant bacteria is the misuse and overuse of antibiotics. If an antibiotic is not taken in a high enough dose or for a sufficient length of time, some of the bacteria causing a patient's infection will survive. In addition, millions of doses of antibiotics are prescribed for viral infections, which do not respond to them.

In both of these situations, only the most drug-resistant bacteria survive and reproduce, leading to the creation of a "superbug."


In the mid-1980s, a new family of drugs to treat bacterial infections was discovered—the fluoroquinolones (examples include ciprofloxacin and ofloxacin), which block DNA gyrase, an enzyme needed for the synthesis of bacterial DNA. Fluoroquinolones work differently from most other antibiotics, which, once they reach a minimum effective concentration in the blood, kill bacteria at a constant rate. Fluoroquinolones are concentration-dependent: the more highly concentrated they are in the bloodstream, the faster they kill bacteria.
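The contrast between constant-rate and concentration-dependent killing can be sketched with a toy model. The threshold and rates below are made-up illustrative numbers, not real pharmacodynamic data.

```python
def kill_rate(concentration: float, mec: float, kind: str) -> float:
    """Toy model of bacterial kill rate for two antibiotic behaviors.

    Below the minimum effective concentration (MEC) neither drug works.
    A "time-dependent" antibiotic kills at a constant rate above the MEC;
    a "concentration-dependent" drug (like a fluoroquinolone) kills
    faster as its blood concentration rises.
    """
    if concentration < mec:
        return 0.0                     # below the effective threshold
    if kind == "time_dependent":
        return 1.0                     # flat rate once above the MEC
    if kind == "concentration_dependent":
        return concentration / mec     # rate grows with concentration
    raise ValueError(f"unknown kind: {kind}")

for c in (0.5, 2.0, 8.0):
    print(c, kill_rate(c, 1.0, "time_dependent"),
          kill_rate(c, 1.0, "concentration_dependent"))
```

Raising the concentration from 2 to 8 units leaves the time-dependent rate unchanged but quadruples the concentration-dependent one, which is why higher fluoroquinolone concentrations achieve faster killing.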

Fluoroquinolones can be used for bladder infections, skin infections, traveler's diarrhea, hospital-acquired pneumonia, and some sexually transmitted diseases. They are totally synthetic drugs, and are not modeled after a natural precursor.

Antitubercular Drugs

In 1938, two researchers from Johns Hopkins Hospital, in Baltimore, Maryland, reported that sulfanilamide had weak activity in animals infected with Mycobacterium tuberculosis, the bacterium responsible for tuberculosis. Other researchers soon discovered that sulfathiazole and sulfadithiazole were much more effective. By 1950, physicians had derived thiacetazone from these drugs. While extremely toxic to the bacterium, thiacetazone was also excessively toxic to the human liver.

A similar story unfolded during the 1940s with the antibiotic streptomycin. While the drug was effective against tuberculosis, large doses produced deafness. Fortunately, the discovery of synthetic antitubercular drugs—in particular, isoniazid in 1952—eased this problem by permitting smaller doses of streptomycin to be used in combination with other drugs.

Tuberculosis (TB) has made a comeback in recent years, partly because of the emergence of multidrug-resistant (MDR) strains of the tuberculosis bacterium.

Antimalarial Drugs

For centuries, quinine—derived from the bark of the Peruvian cinchona tree—served as the chief means of treating malaria throughout the world. No worthwhile progress in developing antimalarial alternatives was made until 1924, when a technique was devised that led to the first really effective quinoline (as synthetic antimalarial drugs are called). Marketed under the trade name Plasmoquine, this drug, whose active ingredient was pamaquine, was widely distributed by 1928.

While quinine acts by slowing the spread of malarial parasites into the bloodstream (where they soon rupture red blood cells), large doses of pamaquine attacked the parasites both in the blood and in their alternate target site, the liver—suppressing them with astonishing speed. Large doses of pamaquine greatly lowered the incidence of relapses in patients infected with the most common type of malaria.

Even better results were later obtained when small doses of pamaquine were combined with quinine—eliminating most adverse side effects. By the onset of World War II, it had become apparent that another quinoline, mepacrine, was still more effective than either quinine or pamaquine in the treatment and suppression of malaria. In 1952, pyrimethamine (Daraprim) was discovered. This drug was so potent against the disease and yet so relatively nontoxic that it could be injected directly into the patient's bloodstream. As a consequence of these synthetic antimalarials and their successors, the incidence of most strains of malaria has been greatly reduced in developed nations where such drugs are available for widespread use. Elsewhere, especially in poorer tropical lands where millions of people are infected with the deadliest of the malarial species, drug resistance has become a problem. Often, doctors have had no choice but to give patients high doses of quinine, despite its serious side effects.

Antiviral Drugs

Although antibiotics are not effective against viruses, they sometimes are prescribed for patients with viral diseases to prevent secondary bacterial infections. Many health professionals doubt the efficacy of this practice and fear the consequences of indiscriminate use of antibiotics.

Because viruses invade healthy cells, a major hurdle for drug developers has been to produce drugs that fight the viruses without disrupting or killing the cells. Another difficulty results from the ability of viruses to rapidly evolve: they quickly create variants that are resistant to drugs.

By and large, antiviral drugs do not cure most viral infections, but instead suppress viral outbreaks or prevent the harmful effects of the virus. The most prominent antiviral drugs target the human immunodeficiency virus (HIV), the virus that causes AIDS. The first major drug used to treat HIV, zidovudine (AZT), was introduced in 1987, and several others followed in the early 1990s. These antiretrovirals block reverse transcription, the copying of viral RNA into the DNA the virus needs to replicate (make copies of itself), and therefore slow the disease's progression. They also help prevent transmission of the virus from mother to infant during childbirth.

In the mid-1990s, a second class of antiviral drugs, the protease inhibitors, provided a way to reduce HIV's concentration by blocking the enzyme the virus uses to cut its proteins into their active forms. Antiretrovirals and protease inhibitors, whether used alone or in combination, are not able to eradicate HIV, but they can slow disease progression and reduce the risk of death.

Another well-known antiviral drug, interferon, works by boosting the body's natural ability to defend itself. Genetic-engineering techniques allow large-scale production of human interferon, which has helped to significantly reduce its cost. Many people believe that in the coming years, interferon will assume an increasingly important role in combating hard-to-treat viral infections.

CNS Depressants

The first central nervous system (CNS) depressants were volatile (prone to vaporize) anesthetics, such as nitrous oxide (laughing gas) and ether. By the 1840s, both of these drugs were being used by dentists during tooth extractions. A few decades later, these volatile anesthetics (and the more dangerous chloroform) were also being used to anesthetize patients during surgical procedures—sometimes with fatal results. Today, a variety of anesthetic gases are in use; which gas is chosen for a particular patient depends on a variety of factors. For instance, general anesthesia, which induces and maintains a state of unconsciousness, may involve different drugs than does local anesthesia, which numbs only a small part of the body. Also, an anesthesiologist may use a long-acting drug at the beginning of surgery, then switch to a short-acting agent later. Drugs commonly used for general anesthesia include nitrous oxide, halothane, and enflurane; lidocaine and benzocaine are popular local anesthetics.

Other CNS depressants include barbiturates and benzodiazepines. These drugs can treat insomnia and seizures, but may cause physical and psychological dependence. Barbiturates can also induce respiratory depression—a condition in which the brain fails to "tell" the lungs to breathe—and are rarely used today. Benzodiazepines are still commonly used for insomnia and anxiety.


Antipsychotics

The greatest advance in psychoactive drugs occurred in 1952 with the introduction of chlorpromazine, a phenothiazine antipsychotic. It was the first effective drug therapy for schizophrenia and, when coupled with monitoring by physicians and counseling, enabled many patients to leave mental hospitals and function in society. The next effective antipsychotic drug class was the butyrophenones, which gave the same benefits as the phenothiazines but with fewer side effects. The atypical antipsychotics available today are even more effective at relieving the symptoms of schizophrenia.

Mood Stabilizers and Antidepressants

Lithium was the first drug shown to stabilize the moods of people with manic-depressive disorders (in the 1950s), and it is still widely used today. Soon thereafter, monoamine oxidase inhibitors were introduced, but they have since been largely abandoned because they can interact with other drugs and even certain foods to produce severe adverse reactions. Tricyclic antidepressants came into use soon after and were found to be effective—but they are lethal if taken in overdose, a serious concern in the treatment of depressed patients.

Since the 1980s, selective serotonin reuptake inhibitors, such as fluoxetine (Prozac) and sertraline (Zoloft), have been used most frequently. These drugs work well and pose a low risk of death in overdose. Children and teenagers, however, must be monitored closely during the initial weeks of therapy, as they are at higher risk of suicide early in the treatment period.

CNS Modifiers

The gray area of central-nervous-system modifiers encompasses such often-used drug categories as anticonvulsants and analgesics. The first widely used anticonvulsant was probably bromide, discovered in seawater in 1826. Bromide was first used as an alternative to the chemically similar iodine in the treatment of a wide variety of diseases, including syphilis. Bromide could not curb grand mal epileptic convulsions, however, and not until a century later (the 1920s) did a more effective anticonvulsant—phenobarbital—achieve widespread acceptance. Phenobarbital was the wellspring of a wide variety of anticonvulsant phenol compounds. Most of these were obtained prior to World War II from chemical analgesics similar to aspirin. The most promising of these, troxidone, was discovered in 1943; while able to control petit mal epileptic seizures in children, it still possessed many toxic side effects. Finally, less-toxic anticonvulsants, such as ethosuximide, a succinimide, came into widespread use in the 1960s. Classes of anticonvulsant drugs besides the succinimides in current use include the hydantoins and the benzodiazepines, which are better known as tranquilizers and sedatives.

Circulatory-System Drugs

Medications affecting circulatory-system function include antiarrhythmics (regulating heartbeat), antihypertensives (regulating blood pressure), and anticoagulants (preventing blood clotting).

The first drug used to regulate the heartbeat, introduced in the 1920s, was quinidine, a derivative of quinine. Many similar drugs followed. In the 1990s, potassium-channel-blocking drugs, including sotalol and amiodarone, were found to be safer and more easily tolerated. Since the mid-1960s, beta blockers, such as propranolol and metoprolol, have been used for arrhythmias after cardiac bypass surgery or after heart attacks.

The first effective antihypertensive was potassium thiocyanate, used to lower blood pressure as early as the 1870s. The first low-toxicity antihypertensive was sodium nitroprusside (Nipride), introduced in 1945. Today's hypertension medications can be grouped into six classes. Diuretics help eliminate excess water and sodium. Beta blockers cause the heart to beat with less force. Alpha blockers relax blood vessels. Calcium channel blockers prevent calcium from passing through cell membranes in the arterial walls, thus preventing constriction of the arteries. Angiotensin-converting enzyme (ACE) inhibitors, such as benazepril and captopril, prevent activation of a hormone that raises blood pressure. Central alpha agonists interfere with nerve impulses that cause arteries to constrict.

Heparin, the first drug used to prevent blood-clot formation, was introduced in 1926. It is still used, but more-recently developed anticoagulants are now favored. New medications called direct thrombin inhibitors (DTIs) do not require help from the body to prevent a clot, and can be used in cases where heparin cannot. Unlike the above medications, warfarin can be taken orally, and prevents the activation of several clotting factors. In patients with existing blood clots, a thrombolytic can help the body to quickly break them down. Risks associated with anticoagulants and thrombolytics include excessive bleeding.

What may well be the best-known circulatory drug in the world—sildenafil (Viagra)—was introduced in March 1998. Originally designed to treat angina, Viagra gained attention for one of its side effects: it increases blood flow in a way that improves male sexual function. In its first year alone, more than 4 million men received prescriptions for the drug.

Gastrointestinal and Excretory-System Drugs

Most GI-tract medications are intended to relieve abdominal bloating, constipation, diarrhea, and excess stomach and intestinal acidity. In most cases, over-the-counter therapies are adequate, but sometimes prescription therapy is needed: antibiotics can cure certain gastrointestinal infections, metoclopramide increases the rate of stomach emptying, and drugs like granisetron (Kytril) can ease nausea and prevent vomiting.

Endocrine-System Drugs

On May 15, 1889, Charles Édouard Brown-Séquard, a highly respected 72-year-old professor of medicine at the Collège de France, injected himself with the first of eight doses of an extract of guinea-pig testicles. Soon thereafter, he announced that animal testes were capable of invigorating the sex drive of elderly men! Despite his dubious findings, Brown-Séquard can be regarded as the instigator of endocrine-drug-related research. Later compounds consisting of "monkey glands" and related ovarian hormones were tried as sex-hormone "elixirs." But it was not until 1934 that a male hormone (androsterone) was detected in male urine; the following year, 5 milligrams of another male hormone, testosterone, were also isolated. Endocrine-system-drug research has since led to the discovery of various female sex hormones, oral contraceptives, and anabolic steroids. Like many drugs, some of these discoveries were found to have significant side effects. For example, some increased the risk of breast cancer in men and women.

Copyright Information: Public domain information with acknowledgement given to the U.S. National Library of Medicine.

Drug Delivery Technology

Throughout history, pharmacologists have trodden a fine line in determining dosage: too much of a medication can have negative consequences, but too little might fail to produce the desired response. The goal is to give a drug in such a way as to keep its blood concentration above the minimum therapeutic dose, but below a toxic level, for as long as needed to treat the disorder or disease. Sometimes a single dose of a drug will do the trick; other times, lifelong therapy is required.

Because most modern drugs travel through the bloodstream, only a portion of the medication reaches the part of the body where it is needed. Some drugs stay in the bloodstream only; others enter the cells or the spaces between them. The more a drug is diluted as it is distributed through the body, the higher the dosage required to achieve the desired concentration at the site of action. Some drugs are metabolized by the liver and other organs, while others are eliminated in the bile or urine. Each drug has an expected duration of action, after which subsequent doses must be administered.

If a drug must attain a certain concentration in the body, and has a short duration of action, multiple daily doses will be needed. For example, morphine sulfate is dosed at 10 milligrams every 4 hours for continuous control of severe cancer pain. If a follow-up dose is missed or delayed, the pain will come back.
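Why a short duration of action forces frequent dosing can be seen in a minimal one-compartment sketch: the drug level decays exponentially with an assumed half-life, and a fixed dose is added at each interval. All parameters here are hypothetical, chosen only to illustrate the pattern, not to model morphine or any real drug.

```python
import math

def trough_levels(dose, half_life_h, interval_h, n_doses):
    """Drug level just before each next dose, assuming first-order decay."""
    k = math.log(2) / half_life_h           # elimination rate constant
    level, troughs = 0.0, []
    for _ in range(n_doses):
        level += dose                       # take a dose
        level *= math.exp(-k * interval_h)  # decay until the next dose
        troughs.append(level)
    return troughs

# Hypothetical drug with a 3-hour half-life, dosed in 10-unit amounts
every_4h = trough_levels(10.0, 3.0, 4.0, 6)
every_12h = trough_levels(10.0, 3.0, 12.0, 6)
print(round(every_4h[-1], 2), round(every_12h[-1], 2))
```

With 4-hour dosing, the pre-dose level settles several times higher than with 12-hour dosing, where it drops below one unit; for a real drug, that gap is the difference between continuous pain control and the pain returning before the next dose.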

Tablets and Capsules

One challenge to such treatment is that short dosing intervals interfere with a patient's sleep. Advanced-tablet technology has solved this problem. In the morphine sulfate example, an MS Contin timed-release tablet dissolves continuously over a 12-hour period, meaning that it needs to be taken only twice per day. The tablet is designed to allow only a certain amount of the drug to enter the body per unit of time.

There are many examples of such sustained-release products. Some tablets are encased in plastic so that water can reach the drug only through a tiny, laser-drilled hole; since only a limited amount of liquid (and the drug dissolved in it) can escape, the drug is released over a long period of time. In a capsule core coat system, capsules are filled with drug pellets, each containing a top layer of the drug that is immediately absorbed into the bloodstream, a second layer of coating that dissolves slowly, and a third layer of medication. Enteric coatings are another gradual-release option: they dissolve in the small intestine, releasing the drug underneath so that it can be absorbed by the body.

Injections and Inhalers

In some cases, it is preferable to deliver a drug directly to the part of the body where it is needed, or to give a drug in a form that will travel to the site of action before becoming activated.

Local delivery of medication has been used for decades. For example, dentists inject lidocaine near the nerves of the mouth to numb the area, but include some adrenaline as well. The adrenaline does not deaden pain; rather, it constricts the blood vessels in the mouth so that the anesthetic is not taken up by the bloodstream and carried away. Asthma sufferers take beta-2 agonists, such as salmeterol, or corticosteroids through an inhaler. This allows high drug concentrations in the lungs—where the medicine is needed most—and lower concentrations in the rest of the body. The result is better asthma control with fewer side effects. In 2006, pharmaceutical giant Pfizer Incorporated unveiled an insulin inhaler called "Exubera," designed specifically for people with diabetes.

Transdermal Skin Patches

As an alternative to tablets and capsules, some medications are available in patches worn on the skin. The medication in the patches passes through the skin and is slowly taken into the bloodstream at a constant rate. The size of the patch determines the amount of medication the body receives, with larger patches dispensing more medication than smaller ones. The shape of the patches varies with the manufacturer.

The patches resemble the plastic bandages used on cuts, except that their adhesive is around all four edges. The basic design of the patch, called a transdermal skin patch, is simple. The side away from the body is a waterproof covering. Below this is a reservoir that contains the medication in a dispersal medium (such as mineral oil) in which the medication is evenly distributed. Below the reservoir is a porous synthetic membrane that controls the rate at which materials leave the reservoir. The patches are fairly small. The surface across which medications pass ranges from about 0.5 square inch (3.2 square centimeters) to 4.6 square inches (30 square centimeters).
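Because the reservoir empties through the membrane at a roughly constant flux per unit area, the delivery rate scales with patch size. A back-of-the-envelope sketch follows; the flux figure is invented for illustration and is not taken from any product label.

```python
def delivery_rate_ug_per_h(area_cm2: float, flux_ug_per_cm2_h: float) -> float:
    """Total delivery rate = membrane area x flux per unit area."""
    return area_cm2 * flux_ug_per_cm2_h

FLUX = 10.0  # hypothetical flux, micrograms per square centimeter per hour

small = delivery_rate_ug_per_h(3.2, FLUX)    # ~0.5 sq in patch
large = delivery_rate_ug_per_h(30.0, FLUX)   # ~4.6 sq in patch
print(small, large)   # the larger patch delivers about 9.4x as much drug
```

This is why, as noted above, larger patches dispense more medication than smaller ones: dose rate is controlled by membrane area rather than by how much drug the reservoir holds.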

Nitroglycerin patches have been available since the mid-1980s. A steady supply of this medication helps relieve the symptoms of angina, a heart condition that causes suffocating chest pains. Clonidine, a medication for high blood pressure, also comes in skin patches. Another medication available in skin patches is scopolamine, which is used to treat motion sickness. The nitroglycerin and clonidine patches are usually worn on the chest or inner thigh, while the scopolamine patches are placed behind the ear. As body temperature slowly melts the dispersal medium, the correct dosage is absorbed into the body.

Other skin patches include nicotine patches to wean people from smoking; estrogen patches for women in need of estrogen supplements; and fentanyl (Duragesic) patches for chronic cancer pain.

The transdermal skin patch does not upset the stomach like certain orally administered medications, and it is not messy like topical ointments. On the negative side, the patch is usually more expensive than conventional treatments, it may fall off, and it can cause skin irritations.

Implantable Pumps

Some people require regular doses of natural body products, such as hormones and other chemicals, which their bodies do not produce in sufficient quantities. Some body products, often proteins, break down in the digestive system if taken orally. The molecules of such proteins are too large to diffuse through the skin. Until recently, the only method of getting these products into the bloodstream has been by injection.

The most common condition in which a body product is lacking is diabetes, a disorder in which the body does not metabolize sugar properly because it produces too little insulin, or fails to use its insulin effectively. Many diabetics must inject themselves with insulin daily. The amount of insulin must be balanced with activity and diet to prevent side effects.

As an alternative to daily injections, implantable infusion pumps have been miniaturized to about the size of a deck of cards. The pump is surgically placed in the abdomen or wall of the chest, much like a heart pacemaker. The cylinder in the pump that holds insulin is refilled through the skin every week. Once in place, the pump delivers programmed doses of insulin to the body.

An infusion pump can also be used to relieve severe pain in cancer patients. The tiny device, which contains concentrated amounts of a painkiller and is programmed to dispense a predefined dose, is implanted under the skin, usually in the abdomen.

Monoclonal Antibodies

The body's natural defense system produces proteins called antibodies as one response to an invasion by foreign materials (such as pollen), called antigens. Antibodies are specific: a given antibody recognizes and attaches itself to a specific antigen. The antibodies circulate in the blood. When they come in contact with the specific antigen, they bind together, forming an antibody-antigen complex. Other blood cells, in turn, recognize an antibody-antigen complex and destroy it.

The disease-fighting chemicals produced by vaccination are antibodies. Blood plasma that contains antibodies to a given disease can be administered to a person to provide temporary immunity to that disease. Antibodies can also be used diagnostically. The antibodies for a specific pathogen are added to a blood sample; if antibody-antigen complexes form, doctors know that the organism is present.

But there are problems with using the antibodies that are present in normal blood plasma. There is a mixture of many different kinds of antibodies in any blood sample. The amount of antibodies produced in response to a particular antigen varies from individual to individual, whether the suppliers are humans or animals. In addition, the level of antibodies in the living body is simply too small for many clinical applications.

In 1975, scientists found a way to obtain a pure supply of antibodies from mice. First the mice were injected with a desired antigen; then they were given time to produce antibodies to it. Next, scientists removed antibody-producing cells from the spleens of the mice. Unfortunately, such cells cannot be maintained outside the animal's body for long: mouse spleen cells live only a few days in culture, and while they continue to produce antibodies during this time, the total amounts are small. The scientists therefore used a technique called cell fusion to unite each mouse spleen cell with a mouse myeloma (cancer) cell. Because cancer cells divide indefinitely, the fused cells, or hybridomas, not only live, they increase in number while still producing their antibodies.

Each spleen cell produces only one kind of antibody, so each hybridoma produces only one antibody. The next step in this process is to separate the hybridomas from one another and grow each one individually. As individual hybridomas grow, they divide again and again to form a colony of identical cells. Finally, the antibody produced by each hybridoma is identified. Hybridomas producing the desired antibodies are kept and the rest are discarded. Since each colony, or clone, produces one kind of antibody, the products are called monoclonal antibodies. Using this technique, scientists can now get pure antibodies for many different antigens in almost unlimited quantities. This breakthrough was so significant that its discoverers, Georges Köhler of Germany and César Milstein of Argentina, received a Nobel Prize in Physiology or Medicine in 1984 for their work.

One major problem with this technique was that the human immune system tended to attack and destroy mouse-derived monoclonals, viewing them as alien invaders. Newly developed monoclonals are more similar to human antibodies, so they are more likely to be accepted by the human immune system. These compounds are viewed as potential treatments for a wide range of maladies, from bacterial infections to arthritis and cancer.

Monoclonals are currently being used in a variety of diagnostic situations. A tool to help physicians identify patients at risk for heart attack and stroke uses monoclonal antibodies to detect fibrinogen, a blood protein that plays a key role in blood-clot formation. Since high levels of fibrinogen can contribute to coronary-artery disease, early detection of elevated fibrinogen may help physicians predict or prevent heart failure.

Certain disease-causing microorganisms as well as certain antigens that cause allergies can be detected using monoclonal antibodies. The sexually transmitted diseases herpes, gonorrhea, and chlamydia can be identified at very early stages. This is especially important with gonorrhea and chlamydia because, in women, these diseases may not cause any symptoms until they have done a great deal of damage. Rabies, too, can be diagnosed more quickly and accurately with monoclonal antibodies. In addition, monoclonal antibodies are helping scientists learn more about the basic structure of cells, an approach that may give more insight into cancer.

A few therapeutic applications of monoclonal antibodies have been introduced, particularly drugs that help prevent the rejection of organ transplants. Monoclonals can also inhibit the clumping of platelets, which helps prevent the reclogging of arteries in patients recovering from angioplasty. Other new drugs show promise in treating rheumatoid arthritis and in shrinking tumors.

It is also possible to attach drugs or poisons to monoclonal antibodies. Drugs specific for certain types of cells, such as disease-causing microorganisms or cancer cells, can be released when the antibodies bind to the cells. The antibodies take the drugs or poisons directly to the target cell, reducing any effects on other healthy cells.


Liposomes
Liposomes are tiny bubbles of various sizes. Their outer boundary is a double phospholipid membrane like that of a cell. Inside is a watery medium. Liposomes are not new cell organelles; they are human-made vehicles. Temperature-sensitive liposomes, which may release the drugs they contain by breaking apart at sites exposed to heat generated by radio frequency energy, are being evaluated as a means of attacking specific targets, such as tumor-afflicted liver tissue, while sparing other tissues.

Liposomes are made by mixing precise proportions of phospholipids and water. Phospholipid molecules have one end that is hydrophilic (attracted to water) and one end that is hydrophobic (repelled by water). When phospholipids are mixed with water in the right way, they form hollow spheres bounded by a double layer of phospholipids, with the hydrophilic ends facing the water inside and outside each sphere and the hydrophobic ends buried within the membrane. These spheres are the liposomes; they enclose the water with which they were mixed. If drugs or other water-soluble materials are present, they become enclosed within the liposomes as they form. If fat-soluble drugs are added, they become part of the surrounding membrane.

Depending on the exact procedure being used, different forms of liposomes are produced. Some have a single outer membrane, while others are like spheres within spheres—with a watery solution in between each pair of spheres.

By carefully selecting the phospholipids used in making the liposomes and adding cell surface proteins, scientists can create liposomes with an affinity for a specific type of cell. Once the liposomes reach the desired cell, directed heat may break them down or they may interact in one of several ways. They may attach to the cell surface; the materials within them then diffuse into the target cell. They may be taken into the cell, where they break down and release the material they carry. Or they may fuse with the cell membrane and empty their contents into the target cell.

Research on liposomes began in the 1960s but was slowed by an unexpected difficulty. Scientists had thought that the similarity between liposomes and normal cells would be sufficient for the human body to accept the carriers, but the immune system destroyed the liposomes. The problem began to be solved in the late 1980s, when scientists incorporated glycolipids into the liposomes. Coupled with other developments, this has led to encouraging results with the anticancer drug doxorubicin.

Liposomes containing the antifungal drug amphotericin B have been used to treat fungal infections. Liposomes are also used in the treatment of pulmonary illnesses.

Another drug-delivery technique is the use of nasal spray as an alternative to injections. One such product, a nasal-spray influenza vaccine carrying the brand name FluMist, became available by prescription for the 2003–2004 flu season. Nasal sprays and inhalers can also be used to administer nicotine to help wean smokers off tobacco.

Even foods may someday deliver drugs. Scientists may modify a plant's genes to include proteins from disease-causing organisms, so that eating such a plant could stimulate a person's immune system to produce antibodies. Researchers are testing crops that have been genetically engineered with vaccines against several illnesses, although keeping such crops separate from the ordinary food supply may become a concern.

Recombinant DNA

Improved drug-delivery methods mean very little if an insufficient quantity of a drug is available for delivery. Many people have medical problems caused by the body's inability to produce certain hormones, enzymes, or other compounds. These problems can be combated by delivering the compounds, but the compounds often exist in limited quantities, making them costly or necessitating the use of substitutes. For instance, people with diabetes are unable to produce adequate amounts of insulin. Until recently, insulin was extracted from animals, mostly pigs. Although similar to human insulin, pig insulin is not identical, and some diabetics are sensitive to it. Thanks to a technique called recombinant DNA, diabetics can take human insulin and thus avoid adverse reactions.

In recombinant DNA technology, the human gene responsible for directing production of the desired chemical is inserted into the genetic material (DNA) of another organism, usually a bacterium or yeast. The host organism is then able to produce the chemical, and it passes this ability on to its descendants. It is human insulin produced in this manner that is now used in the treatment of diabetes.

Other medications and drugs produced using recombinant DNA techniques include interferon, used to treat hepatitis, multiple sclerosis, and certain types of cancer; and somatostatin, a hormone that inhibits the synthesis and secretion of growth hormone.

Some chemicals are too complex to be mass-produced in bacteria or yeast. One of these is Protein C, a compound present in trace amounts in human blood, where it plays a role in clotting. Researchers inserted the gene for Protein C into pig embryos, attached to regulators that cause the protein to be produced only in milk. They were thus able to induce production of Protein C in amounts large enough to be easily purified. Researchers also have been able to produce other proteins in farm animals, including human hemoglobin.

Recombinant DNA technology has also been applied in the development of vaccines. The first such human vaccine to win U.S. government approval was created to protect against hepatitis B, a highly infectious viral disease of the liver transmitted through blood and body fluids. Other vaccines currently under development will take aim against such conditions as AIDS, Lyme disease, pertussis (using an acellular vaccine), and hepatitis A.

The information provided should not be used during any medical emergency or for the diagnosis or treatment of any medical condition. A licensed physician should be consulted for diagnosis and treatment of any and all medical conditions. Call 911 for all medical emergencies.

Copyright Information: Public domain information with acknowledgement given to the U.S. National Library of Medicine.