Drugs are chemical compounds that modify the body's natural chemical processes. Used as medications, they can alter how the mind and body function. As a general rule, the effects of medications and drugs are only temporary. They tend to alleviate or mask unpleasant symptoms while allowing the body to recover, or to keep it functioning within a normal range while the drug is present in the bloodstream. For example, many antibiotics do not actually kill bacteria; they merely slow the bacteria's growth so that the body's immune system can eliminate the infection.
Metabolism of Drugs
Most drugs used today are chemical compounds created in the laboratory rather than naturally occurring substances. After they have been at work for a certain period of time, the drugs are fully processed by the body and gradually eliminated.
In order for a drug to reach its intended destination, it usually needs to enter the bloodstream. This can be accomplished in several ways: medications can be swallowed, injected, applied to the skin (via transdermal patch), or introduced rectally. Drugs that are swallowed are absorbed and transported to the liver, where a portion of their molecules may be metabolized (chemically altered) before reaching the general circulation. Some drugs enter the body in an inactive form and must be metabolized by the liver in order to become active.
Once the drug molecules are in the bloodstream, they travel not only to their destination, but throughout the body. Aspirin, for example, taken to relieve a headache, also goes to the stomach lining, increasing the risk of ulcers. Some drugs are distributed only into the bloodstream and extracellular fluid, while others can penetrate into cells.
The body eventually removes medication by metabolizing it, eliminating it, or both. Most medications are metabolized by the liver, and the majority are eliminated by the kidneys or in the bile. Drugs eliminated through the kidneys pass to the bladder and leave the body in the urine, while those excreted in bile are carried into the small intestine and ultimately eliminated in the feces.
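The gradual removal of a drug can be sketched numerically. The article gives no rates, so the following is a toy model under an assumed pattern: many drugs are cleared roughly exponentially, so a fixed "half-life" (a hypothetical 4 hours here) removes half of whatever remains.

```python
def remaining_fraction(hours_elapsed, half_life_hours):
    """Fraction of a dose still in the body, assuming simple
    exponential (first-order) elimination."""
    return 0.5 ** (hours_elapsed / half_life_hours)

# Hypothetical drug with a 4-hour half-life:
print(remaining_fraction(4, 4))   # half the dose remains after one half-life
print(remaining_fraction(12, 4))  # one-eighth remains after three half-lives
```

This is only an illustration of "gradual elimination"; real clearance depends on the drug, the dose, and the patient's liver and kidney function.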
Toxicity, Interaction, and Side Effects
Poisonous or toxic properties of medicines must always be examined in detail. Virtually all drugs are toxic to some extent. Sometimes a toxic effect is an extension of the desired effect at a higher dose level. For example, anticoagulants prolong the clotting time of blood, but if the clotting time is prolonged excessively because of an over-dosage, hemorrhaging can occur.
Toxicity can also take the form of a side effect more or less unrelated to the drug's primary action. Many drugs cause gastrointestinal disturbances that trigger nausea, vomiting, or diarrhea. These effects occur before the drug is absorbed and reaches its site of action. Other toxic effects include allergic reactions.
In studying drug toxicity in animals, the medicinal scientist uses specific criteria to relate effectiveness and toxicity—such as a median effective dose versus a median lethal dose. As a general rule, the wider the margin between the lethal and the effective dose, the safer the drug is likely to be for human use—although medicines causing nonlethal side effects are exceptions to this rule. In such cases, the scientist must weigh the seriousness of the disease against the benefit derived from a given drug and the extent of discomfort or damage that may result.
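The "margin" the rule of thumb describes is conventionally expressed as a ratio, the therapeutic index: the median lethal dose (LD50) divided by the median effective dose (ED50). The sketch below uses invented numbers for two hypothetical drugs to show how the ratio captures the safety margin.

```python
def therapeutic_index(median_lethal_dose, median_effective_dose):
    """Ratio of the median lethal dose (LD50) to the median
    effective dose (ED50): larger means a wider safety margin."""
    return median_lethal_dose / median_effective_dose

# Invented doses (same units for both drugs):
drug_a = therapeutic_index(median_lethal_dose=1000, median_effective_dose=10)
drug_b = therapeutic_index(median_lethal_dose=30, median_effective_dose=10)

# By the rule above, drug_a (index 100) is likely safer than
# drug_b (index 3), all else being equal.
print(drug_a, drug_b)
```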
The interactions of various drugs can produce unusual symptoms in specific patients. Because of the lack of an adequate explanation, many of these symptoms might be attributed to an unusual manifestation of the disease itself. Only in recent years has it become evident that one drug may affect the action of another, and when two or more drugs are ingested, the possibilities for interactions can be great.
Closely related to drug toxicity and interaction is another consequence of drug usage—the possibility of a drug becoming addictive. In the early 20th century, drug addiction was the general term for any illicit use of drugs. In the 1930s, the term habituation came to describe a person's psychological dependence upon a drug—the emotional distress, but not the physiological illness, that resulted with the drug's absence. Drug addiction was characterized as the physical dependence on a drug's effects, with illness, or withdrawal symptoms, developing if the drug use was halted.
These definitions hardly covered the scope of the problems associated with drug use. In 1965, the World Health Organization (WHO) recommended that the term "drug dependence" be substituted for "drug addiction." Drug dependence is characterized by: the compulsive use of a drug for an extended time; the loss of control over its use, such that users take extraordinary or even harmful measures to continue using the drug; and the continued use of a drug despite adverse consequences or to stave off illness once the drug use is halted.
While many drugs ranging from analgesics (painkillers) to tranquilizers may cause some degree of dependence, the severity is often related to drug tolerance—when progressively more and more of the drug is required to get the desired effect.
The History of Drugs
Before the advent of modern chemistry, physicians seeking drugs to treat their patients could select only natural products (plants, animals, fungi) or inorganic materials (minerals). In ancient Egypt, for instance, priest-physicians concocted vile potions from dead flies, dried excreta, and bitter herbs, which were then inserted inside an appropriate orifice of the body.
While natural medications have always been significant drugs, it remained for Paracelsus (1493–1541) and the spirit of the European Renaissance to usher in the age of chemical drugs. Paracelsus advocated the internal use of chemical remedies, such as mineral salts and acids, and substances prepared by such chemical processes as distillation and extraction. Paracelsus believed it was no longer the task of alchemy to make gold, but to make medicines.
Most modern medicines are synthetic, although many are derived from natural substances. For instance, coca leaves and Peruvian bark from the Andes Mountains of South America were used as far back as 500 B.C. They came into conventional medical use during the 17th century, and two centuries later, the active substances cocaine and quinine were chemically isolated from them. From the "vomiting nut," nux vomica, the alkaloid known as strychnine was derived. While the ancient Greeks applied willow bark and leaves crushed in olive oil to arthritic joints and muscles, they could not have known that willow (Salix) contains a glucoside called salicin; its chemical makeup led to the synthesis of salicylic acid, whose chief derivative is aspirin. Aspirin is thus a chemical compound created from a natural substance after passing through several transitory stages in the laboratory.
Types of Drugs
Antibiotics illustrate how natural substances can be used to create synthetic drugs. For instance, when Scottish physician Alexander Fleming happened to discover the antibacterial properties of the mold Penicillium notatum in 1928, he could not have known that he had discovered the storied antibiotic penicillin—the first "wonder drug" able to prevent the growth of bacterial parasites in infected hosts. Antibiotics are natural products produced by microorganisms, and they have the unique property of inhibiting the growth of other microorganisms. Still, antibiotic therapy remains imperfect. For instance, infections caused by certain strains of Pseudomonas are difficult to control. That was why ticarcillin (Ticar), derived from ampicillin, itself a semisynthetic penicillin, was developed in 1964. Ticar was the first antibiotic to successfully combat Pseudomonas aeruginosa, a particularly troublesome pathogen that could be life-threatening both in severely burned patients and in those with impaired immune systems. In 1971, Ticar was superseded by another penicillin derivative—azlocillin—which remains one of the most effective drugs used against Pseudomonas infections.
Other antibiotic-producing organisms have been found among the soil microbes. In 1940, microbiologist Selman Waksman first obtained actinomycin A from Actinomyces antibioticus. The actinomycetes have features in common with both bacteria and fungi; actinomycin A was the first of many antibiotics derived from them. Their advantage: effectiveness against both bacterial and fungal infections. The screening of actinomycetes that yielded actinomycin A and its daughter drugs also led to streptomycin—the first antibiotic effective in fighting tuberculosis.
Following the discovery of streptomycin, the pharmaceutical industry began screening soil samples from every corner of the globe in search of potential antibiotics. Among the new batch of antibiotics discovered was chloramphenicol—the first antibiotic effective against a broad spectrum of diseases, including typhus, typhoid, and meningitis, and the first to be manufactured entirely by chemical synthesis. Other often-used antibiotics include the tetracyclines—doxycycline, minocycline, and plain tetracycline—which are extremely effective against a wide range of infections.
In recent years, there has been a dramatic rise in microbial resistance to antibiotics, which has complicated physicians' efforts to treat various kinds of infections. Most major bacterial pathogens have acquired antibiotic-resistant genes, and some strains have even developed a resistance to nearly all available antibiotics. Among these "superbugs" are the ones responsible for Salmonella food poisoning, upper respiratory illnesses like bronchitis, and sexually transmitted diseases, such as gonorrhea.
A major cause of the spread of resistant bacteria is the misuse and overuse of antibiotics. If an antibiotic is not taken in a high enough dose, or is not taken for a sufficient length of time, some of the bacteria causing a patient's infection will survive. In either situation, only the most drug-resistant bacteria survive and reproduce, leading to the creation of a "superbug." In addition, millions of doses of antibiotics are prescribed for viral infections, which do not respond to them.
In the mid-1980s, a new family of drugs to treat bacterial infections was discovered—the fluoroquinolones (examples include ciprofloxacin and ofloxacin), which block DNA gyrase, an enzyme needed for bacterial DNA replication. Fluoroquinolones work differently from most other antibiotics, which, once they reach a minimum effective concentration in the blood, kill bacteria at a constant rate. Fluoroquinolones are concentration-dependent: the more highly concentrated they are in the bloodstream, the faster they kill bacteria.
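The contrast between the two kill kinetics can be sketched with a toy model. The rates, the threshold, and the linear concentration dependence below are all invented for illustration; the point is only that one class kills at a fixed rate above a minimum concentration (the MIC) while the other kills faster as concentration rises.

```python
def kill_rate_constant(concentration, mic, rate=1.0):
    """Most antibiotics in this sketch: a fixed kill rate once the
    drug concentration exceeds the minimum inhibitory concentration."""
    return rate if concentration >= mic else 0.0

def kill_rate_concentration_dependent(concentration, mic, k=0.5):
    """Fluoroquinolone-style sketch: the kill rate scales with the
    concentration (assumed linear here purely for illustration)."""
    return k * concentration if concentration >= mic else 0.0

# Doubling the concentration leaves the constant-rate drug unchanged
# but doubles the concentration-dependent drug's kill rate.
for c in (1.0, 2.0, 4.0):
    print(c, kill_rate_constant(c, mic=1.0),
          kill_rate_concentration_dependent(c, mic=1.0))
```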
Fluoroquinolones can be used for bladder infections, skin infections, traveler's diarrhea, hospital-acquired pneumonia, and some sexually transmitted diseases. They are totally synthetic drugs, and are not modeled after a natural precursor.
In 1938, two researchers from Johns Hopkins Hospital, in Baltimore, Maryland, reported that sulfanilamide had weak activity in animals infected with Mycobacterium tuberculosis, the bacterium responsible for tuberculosis. Other researchers soon discovered that sulfathiazole and sulfadithiazole were much more effective. By 1950, American physicians had derived thiacetazone from these drugs. While extremely toxic to the bacterium, thiacetazone was also excessively toxic to human livers.
A similar story was reported during the 1940s with the antibiotic streptomycin. While it was found to be effective against tuberculosis, large doses were found to produce deafness. Fortunately, the discovery of synthetic antitubercular drugs—in particular, isoniazid in 1952—eased this problem by permitting smaller doses of streptomycin to be used with other drugs.
Tuberculosis (TB) has made a comeback in recent years, partly because of the emergence of multidrug-resistant (MDR) strains of the tuberculosis bacterium.
For centuries, quinine—derived from the bark of the Peruvian cinchona tree—had served as the chief means of treating malaria throughout the world. No worthwhile progress in developing antimalarial alternatives was made until 1924, when a technique was devised that led to the first really effective quinoline (as synthetic antimalarial drugs are called). With a trade name of Plasmoquine, this drug became widely distributed by 1928. Plasmoquine contained the active ingredient pamaquine.
While quinine acts by slowing the spread of malarial parasites in the bloodstream (where they rupture red blood cells), large doses of pamaquine attacked the pathogens both in the blood and in their alternate refuge, the liver—suppressing them with astonishing speed. Large doses of pamaquine greatly lowered the incidence of relapses in patients infected with the most common type of malaria.
Even better results were later obtained when small doses of pamaquine were combined with quinine—eliminating most adverse side effects. By the onset of World War II, it had become apparent that another quinoline, mepacrine, was still more effective than either quinine or pamaquine in the treatment and suppression of malaria. In 1952, pyrimethamine (Daraprim) was discovered; this drug was so potent against the disease, and so relatively nontoxic, that it could be injected directly into the patient's bloodstream. As a consequence of these synthetic antimalarials and their daughter compounds, the incidence of most strains of malaria has been greatly reduced in developed nations where such drugs are available for widespread use. Elsewhere, especially in poorer tropical lands where millions of people are infected with the deadliest of the malarial species, drug resistance has become a problem. Often, doctors have had no choice but to give patients high doses of quinine, despite its serious side effects.
Although antibiotics are not effective against viruses, they sometimes are prescribed for patients with viral diseases to prevent secondary bacterial infections. Many health professionals doubt the efficacy of this practice and fear the consequences of indiscriminate use of antibiotics.
Because viruses invade healthy cells, a major hurdle for drug developers has been to produce drugs that fight the viruses without disrupting or killing the cells. Another difficulty results from the ability of viruses to rapidly evolve: they quickly create variants that are resistant to drugs.
By and large, antiviral drugs do not cure viral infections; instead, they suppress viral outbreaks or limit the harmful effects of the virus. The most prominent antiviral drugs target the human immunodeficiency virus (HIV), the virus that causes AIDS. The first major drug used to treat HIV, zidovudine (AZT), was introduced in 1987, and several others followed in the early 1990s. These antiretrovirals prevent the viral RNA from being copied into the DNA the virus needs in order to replicate (make copies of itself), and therefore slow the disease's progression. They also reduce the risk of transmission of the virus from mother to infant during childbirth.
In the mid-1990s, a second class of antiviral drugs, the protease inhibitors, provided another way to reduce HIV's concentration: they block the viral protease enzyme, so that viral proteins cannot be cut into their active forms and newly produced virus particles cannot mature. Antiretrovirals and protease inhibitors, used alone or in combination, cannot eradicate HIV, but they can slow disease progression and reduce the risk of death.
Another well-known antiviral drug, interferon, works by boosting the body's natural ability to defend itself. Genetic-engineering techniques allow large-scale production of human interferon, which has helped to significantly reduce its cost. Many people believe that in the coming years, interferon will assume an increasingly important role in combating hard-to-treat viral infections.
The first central nervous system (CNS) depressants were volatile (prone to vaporize) anesthetics, such as nitrous oxide (laughing gas) and ether. By the 1840s, both of these drugs were being used by dentists during tooth extractions. A few decades later, these volatile anesthetics (and the dangerous chloroform) were also being used to anesthetize patients during surgical procedures—occasionally with fatal results. Today, a variety of anesthetic gases are in use; which type of gas is chosen for a particular patient depends on a variety of factors. For instance, general anesthesia, which induces and maintains a state of unconsciousness, may involve different drugs than does local anesthesia, which numbs only a small part of the body. Also, an anesthesiologist may use a long-acting drug at the beginning of surgery, then switch to a short-acting agent later. Drugs commonly used for general anesthesia include nitrous oxide, halothane, and enflurane; lidocaine and benzocaine are popular local anesthetics.
Other CNS depressants include barbiturates and benzodiazepines. These drugs can treat insomnia and seizures, but may cause physical and psychological dependence. Barbiturates can also induce respiratory depression—a condition in which the brain fails to "tell" the lungs to breathe—and are rarely used today. Benzodiazepines are still commonly used for insomnia and anxiety.
The greatest advance in psychoactive drugs occurred in 1952 with the introduction of chlorpromazine, a phenothiazine antipsychotic. It was the first effective drug therapy for schizophrenia and, when coupled with monitoring by physicians and counseling, enabled many patients to leave mental hospitals and function in society. The next effective class of antipsychotic drugs was the butyrophenones, which gave the same benefits as the phenothiazines but with fewer side effects. The atypical antipsychotics available today are even more effective at relieving the symptoms of schizophrenia.
Mood Stabilizers and Antidepressants
Lithium was the first drug shown to stabilize the moods of people with manic-depressive disorders (in the 1950s), and it is still widely used today. Monoamine oxidase inhibitors were introduced soon thereafter, but have largely been abandoned because they can interact with other drugs and even certain foods to produce severe adverse reactions. Tricyclic antidepressants came into use in the late 1950s and proved effective—but they are lethal if taken in overdose, a serious concern in the treatment of depressed patients.
Since the 1980s, selective serotonin reuptake inhibitors (SSRIs), such as fluoxetine (Prozac) and sertraline (Zoloft), have been used most frequently. These drugs work well and pose a low risk of death in overdose. During the initial weeks of therapy, however, children and teenagers must be monitored closely, as they are at a higher risk of suicidal thoughts and behavior early in the treatment period.
The gray area of central-nervous-system modifiers encompasses such often-used drug categories as anticonvulsants and analgesics. The first widely used anticonvulsant was probably bromide; the element bromine was discovered in seawater in 1826, and bromide salts were first used as an alternative to the chemically similar iodine in treating a wide variety of diseases, including syphilis. Bromide could suppress epileptic convulsions only at heavily sedating doses, however, and not until a century later (the 1920s) did a more effective anticonvulsant—phenobarbital—achieve widespread acceptance. Phenobarbital was the wellspring of a wide variety of anticonvulsant compounds, most of them obtained prior to World War II from chemical analgesics similar to aspirin. The most promising of these, troxidone (trimethadione), was discovered in 1943; while able to control petit mal epileptic seizures in children, it still had many toxic side effects. Finally, less-toxic anticonvulsants, such as ethosuximide, a succinimide, came into widespread use in the 1960s. Other classes of anticonvulsant drugs in current use include the hydantoins, such as phenytoin, and the benzodiazepines, which are better known as tranquilizers and sedatives.
Medications affecting circulatory-system function include antiarrhythmics (regulating heartbeat), antihypertensives (regulating blood pressure), and anticoagulants (preventing blood clotting).
The first drug used to regulate the heartbeat was quinidine, a derivative of quinine, in the 1920s. Many similar drugs followed. In the 1990s, potassium-channel-blocking drugs, including sotalol and amiodarone, were found to be safer and more easily tolerated. Since the mid-1960s, beta blockers, such as propranolol and metoprolol, have been used for arrhythmias after cardiac bypass surgery or after heart attacks.
The first effective antihypertensive was potassium thiocyanate, used to lower blood pressure as early as the 1870s. The first low-toxicity antihypertensive was sodium nitroprusside (Nipride), introduced in 1945. Today's hypertension medications can be grouped into six classes. Diuretics help eliminate excess water and sodium. Beta blockers cause the heart to beat with less force. Alpha blockers relax blood vessels. Calcium channel blockers prevent calcium from passing through cell membranes in the arterial walls, thus preventing constriction of the arteries. Angiotensin-converting enzyme (ACE) inhibitors, such as benazepril and captopril, prevent activation of a hormone that raises blood pressure. Central alpha agonists interfere with nerve impulses that cause arteries to constrict.
Heparin, the first drug used to prevent blood-clot formation, was introduced in 1926. It is still used, but more-recently developed anticoagulants are now favored. New medications called direct thrombin inhibitors (DTIs) do not require help from the body to prevent a clot, and can be used in cases where heparin cannot. Unlike the above medications, warfarin can be taken orally, and prevents the activation of several clotting factors. In patients with existing blood clots, a thrombolytic can help the body to quickly break them down. Risks associated with anticoagulants and thrombolytics include excessive bleeding.
What may well be the best-known circulatory drug in the world—sildenafil (Viagra)—was introduced in March 1998. Originally designed to treat angina, Viagra gained attention for one of its side effects: it increases blood flow in such a way as to improve male sexual function. In its first year on the market, more than 4 million men received prescriptions for the drug.
Gastrointestinal and Excretory-System Drugs
Most GI-tract medications are intended to relieve abdominal bloating, constipation, diarrhea, and excess stomach and intestinal acidity. In most cases, over-the-counter therapies are adequate, but sometimes prescription therapy is needed: antibiotics can cure certain gastrointestinal infections, metoclopramide increases the rate of stomach emptying, and drugs like granisetron (Kytril) can ease nausea and prevent vomiting.
On May 15, 1889, Charles Édouard Brown-Séquard, a highly respected 72-year-old professor of medicine at the Collège de France, injected himself with the first of eight doses of an extract of guinea-pig testicles. Soon thereafter, he announced that animal testes were capable of invigorating the sex drive of elderly men! Despite his dubious findings, Brown-Séquard can be regarded as the instigator of endocrine-drug research. Later, extracts of "monkey glands" and related ovarian preparations were tried as sex-hormone "elixirs." But it was not until 1934 that a male hormone (androsterone) was detected in male urine; the following year, 5 milligrams of another male hormone, testosterone, were also isolated. Endocrine-system-drug research has since led to the discovery of various female sex hormones, oral contraceptives, and anabolic steroids. Like many drugs, some of these discoveries were found to have significant side effects. For example, some increased the risk of breast cancer in men and women.
The information provided should not be used during any medical emergency or for the diagnosis or treatment of any medical condition. A licensed physician should be consulted for diagnosis and treatment of any and all medical conditions. Call 911 for all medical emergencies.
Copyright Information: Public domain information with acknowledgement given to the U.S. National Library of Medicine.