Friday, November 28, 2014

Thank a White Male

Surely not! Wasn't it White males who invented slavery and all that awful stuff, while nonWhites and nonMales were and are responsible for all the good stuff in the world? I mean, I keep hearing from earnest young men in their mothers' basements, assuring me that Native Americans lived in harmony with nature (when they weren't ripping one another's hearts out), and that there was some kind of great civilization in Africa before the White man came and burned down all their great cities and replaced them with mud huts, and that women could explore space and invent interplanetary probes much better than men can, if only male scientists would stop scaring them away with icky sexist shirts. And that's not to mention the Trayvons and Michael Browns out there, who were on the point of Going to College and becoming great engineers or doctors or something, when some White guy, or White Hispanic guy, decided to shoot them just for the hell of it.

With the White males working so hard over the centuries to keep all the other people down, it's hard to believe they've had time to be responsible for developing practically all of modern civilization. But somehow, they have. Bob Wallace explains:

"Thank a White Male"

The attacks on Dead Whites as racist, sexist whatevers who committed genocide on everyone are not done out of any sense of justice. They're done out of envy.

It's because the attackers can't compete with them, so have to drag them down to their level. The enviers, as always, don't realize they're dragging themselves down, too.

And in envy there is no gratitude at all, just resentment.

I can understand this attitude from eaten-alive-with-envy Third Worlders, but American women, the most pampered in the world? Are they that deluded, to cut their own throats?

This article is from the American Thinker and was written by Jeff Lipkes.

Do you like internal combustion engines?

Thank a few white men. (Jean Lenoir, Nikolaus Otto, Karl Benz, Rudolf Diesel, Gottlieb Daimler, Emil Jellinek, Henry Ford among others.)

Are you a fan of flush toilets and indoor plumbing?

Thank white males Alexander Cumming, Thomas Twyford, and Isaiah Rogers.

Toilet paper?

Thank Joseph Gayetty, W.M.

How about washing machines and dryers?

Thank white males Alva Fisher and J. Ross Moore.

“When you’ve got your health, you’ve got just about everything” ran the tag-line in a famous Geritol commercial from the 1970s, and the guys we most have reason to be grateful for are undoubtedly those who’ve developed the medical practices and the drugs and devices that have transformed our lives over the past hundred fifty years.

Before the turkey gets carved, it’s worth taking a moment to remember a few of these brilliant, persistent, and lucky men, and recall their accomplishments. Even when they’ve won Nobel Prizes in Medicine, their names are virtually unknown. They’re not mentioned in the Core Curriculum or celebrated by Google on their birthdays.


If you ever had surgery, did you opt for anesthesia?

If so, thank a few more white males, beginning with William Clarke in New York and Crawford Long in Georgia, who both used ether in minor surgeries in 1842. A paper published four years later by William Morton, after his own work in Boston, spread the word. Chloroform largely replaced ether during the next decade. There are now scores of general and regional anesthetics and sedatives and muscle relaxants, administered in tandem. The first local anesthetic has also been superseded. It was cocaine, pioneered by a Viennese ophthalmologist, Carl Koller, in 1884.

Ever take an analgesic?

Next time you pop an aspirin, remember Felix Hoffmann of Bayer. In 1897, he converted salicylic acid to acetylsalicylic acid, much easier on the stomach. Aspirin remains the most popular and arguably the most effective drug on the market. In 1948 two New York biochemists, Bernard Brodie and Julius Axelrod, documented the effect that acetaminophen (Tylenol), synthesized by Harmon Morse in 1878, had on pain and fever. Gastroenterologist James Roth persuaded McNeil Labs to market the analgesic in 1953.

Infectious Diseases

Most Americans today die of heart disease or cancer, but before the twentieth century, it was infectious diseases that struck people down, and children were the primary victims. In pre-industrial England, which still had the most developed economy in the world in the late 17th century, 50% of all children didn't survive to the age of 15. With the phenomenal growth of cities during the 19th century, cholera, typhoid fever, and tuberculosis became the leading killers.

In 1854, a London medical inspector, John Snow, proved that a cholera epidemic in Soho was caused by infected sewage seeping into the water supply. Until then it was thought the disease spread through the air. The sanitary disposal of sewage and the provision of clean water, possible thanks to mostly anonymous metallurgists and engineers -- an exception is the famous Thomas Crapper, who pioneered the u-shaped trap and improved, though he didn’t invent, the flush toilet -- has saved more lives than any drug or surgical innovation.

Dramatic improvements in food supply have also had an incalculable effect on health. Agricultural innovations, beginning with those introduced in England in the 18th century, were disseminated globally by the end of the 20th century -- the “Green Revolution.” Famines struck Europe as recently as the late 1860s. (The man-made famines of the 20th century are another story.) A transportation revolution made possible the provision of more than sufficient protein, calories, and nutrients worldwide. Needless to say, it was white males who designed and built the roads, canals, railroads, and ports and airports, and the ships, trains, planes, and trucks that used them, and the mines, and then wells, pipelines, and tankers that supplied the fuel they ran on.

Whatever the merits of taking vitamins and supplements today, no one has to take vitamin C to prevent scurvy, or vitamin B to prevent pellagra, or vitamin D and calcium to prevent rickets. And, for the time being, we all live in a post-Malthusian world. The global population was about 800 million to 1 billion when the gloomy parson wrote his famous book in 1798. It’s now over 7 billion.

Dr. Snow had no idea what was actually causing cholera. It was Louis Pasteur who gave the world the germ theory of disease, as every schoolchild once knew. Studying the fermentation of wine, he concluded that this was caused by the metabolic activity of microorganisms, as was the souring of milk. The critters were responsible for disease, too, he recognized, and identified three killer bacteria: staphylococcus, streptococcus, and pneumococcus. Nasty microorganisms could be killed or rendered harmless by heat and oxygenation, Pasteur discovered, and would then prevent the disease in those who were inoculated. He went on to develop vaccines for chicken cholera, anthrax, and rabies. Edward Jenner had demonstrated in the late 1790s that the dreaded smallpox could be prevented by injecting patients with material from the pustules of cowpox victims, a much milder disease. (The word vaccine comes from vacca, the Latin word for cow.) Pasteur, however, was the first to immunize patients by modifying bacteria rather than through cross-vaccination.

A parade of vaccines followed. People in their mid-60s and older can remember two of the most famous: the Salk and Sabin vaccines against poliomyelitis, a paralyzing disease that had panicked American parents in the late ‘40s and early ‘50s. Children preferred Albert Sabin’s 1962 version: the attenuated virus was administered on a sugar cube. Jonas Salk’s inactivated vaccine, available in 1955, was injected.

In 1847, more than a decade before Pasteur disclosed his germ theory, the Viennese obstetrician Ignaz Semmelweis documented the effectiveness of hand washing with chlorinated water before entering a maternity ward. He brought mortality rates from puerperal fever down from 8% to 1.3%. Two decades later, having read a paper by Pasteur, Joseph Lister demonstrated the effectiveness of carbolic acid to sterilize wounds and surgical instruments. Mortality rates fell from around 50% to about 15%. The efforts of both men, especially Semmelweis, were met with ridicule and disdain.

Pasteur’s German rivals Robert Koch and Paul Ehrlich made monumental contributions to biochemistry, bacteriology, and hematology, but left the world no “magic bullet” (Ehrlich’s term). Koch identified the organism causing tuberculosis, the leading killer of the 19th century, but his attempts at finding a vaccine failed. His purified protein derivative from the bacteria, tuberculin, could be used to diagnose the disease, however. It was two French researchers, Albert Calmette and Camille Guerin, who developed a successful vaccine, first administered in 1921, though it was not widely used until after World War II.

Ehrlich joined the search for antibacterial drugs that were not denatured bacteria or viruses. He synthesized neoarsphenamine (Neo-Salvarsan), effective against syphilis, a scourge since the late 15th century, but which had toxic side effects. It was not until the 1930s that the first generation of antibiotics appeared. These were the sulfa drugs, derived from dyes with sulfa-nitrogen chains. The first was a red dye synthesized by Joseph Klarer and Fritz Mietzsch. In 1935, Gerhard Domagk at I. G. Farben demonstrated its effectiveness in cases of blood poisoning.

The anti-bacterial properties of Penicillium had already been discovered at this point by Alexander Fleming. The Scottish bacteriologist had famously left a window open in his lab when he went on vacation in 1928, and returned to find that a mold had destroyed the staphylococcus colony in one of his petri dishes. But it’s one thing to make a fortuitous discovery and another thing to cultivate and purify a promising organic compound and conduct persuasive trials. This was not done until 1941. Thank Oxford biochemists Howard Florey and Ernst Chain. A Pfizer chemist, Joseph Kane, figured out how to mass-produce penicillin and by 1943 it was available to American troops. The wonder drug of the 20th century, penicillin killed the Gram-positive bacteria that caused meningitis, diphtheria, rheumatic fever, tonsillitis, syphilis, and gonorrhea. New generations of antibiotics followed, as bacteria rapidly developed resistance: among them, streptomycin in 1943 (thank Selman Waksman), tetracycline in 1955 (thank Lloyd Conover), and, the most widely prescribed today, amoxicillin.

Diagnostic technologies

Microscope: While the Delft draper Antonie van Leeuwenhoek didn’t invent the compound microscope, he improved it, beginning in the 1660s, increasing the curvature of the lenses, and so became the first person to see and describe blood corpuscles, bacteria, protozoa, and sperm.
Electron microscope: Physicist Ernst Ruska and electrical engineer Max Knoll constructed the prototype in Berlin in 1933, using a lens by Hans Busch. Eventually, electron microscopes would be designed with two-million power magnification. Leeuwenhoek’s had about two hundred.
Stethoscope: Thank the French physician René Laennec, who introduced the device in 1816. British nephrologist Golding Bird substituted a flexible tube for Laennec’s wooden cylinder in 1840, and the Irish physician Arthur Leared added a second earpiece in 1851. Notable improvements were made by Americans Howard Sprague, a cardiologist, and electrical engineer Maurice Rappaport in the 1960s (a double-sided head), and Harvard cardiologist David Littmann in the same decade (enhancing the acoustics). The device undoubtedly transformed medicine, and with good reason became the symbol of the health care professional.
Sphygmograph: The first machine to measure blood pressure was created by a German physiologist, Karl von Vierordt in 1854.

X-rays: Discovered by Wilhelm Conrad Röntgen, at Würzburg in 1895, this was probably the single most important diagnostic breakthrough in medical history. Before Röntgen noticed that cathode rays, electrons emitted from a cathode tube, traveled through objects and created images on a fluorescent screen, physicians could only listen, palpate, examine stools, and drink urine.
PET scans: James Robertson designed the first machine in 1961, based on the work of a number of American men at Penn, Wash U., and Mass General. The scanner provides an image from the positron emissions coming from a radioactive isotope injected into the patient, and is particularly useful for mapping activity in the brain.

CAT scans: The first model was developed by electrical engineer Godfrey Hounsfield, in London, 1972, drawing on the work of South African physicist Alan Cormack in the mid-1960s. It generates three-dimensional and cross-sectional images using computers and X-rays.
MRI: Raymond Damadian, a SUNY professor of medicine with a degree in math, performed the first full-body scan in 1977. His design was anticipated by theoretical work by Felix Bloch and Edward Purcell in the 1940s, and, later, Paul Lauterbur. MRIs map the radio waves given off by hydrogen atoms exposed to energy from magnets, and are particularly useful in imaging tissue -- and without exposing the patient to ionizing radiation.

Ultrasound: Ian Donald, a Glasgow obstetrician, in the mid-1950s adopted a device already used in industry that generated inaudible, high frequency sound waves. The machine quickly and cheaply displays images of soft tissue, and now provides most American parents with the first photo of their baby.

Endoscopes: Georg Wolf produced the first flexible gastroscope in Berlin in 1911, and this was improved by Karl Storz in the late ‘40s. The first fiber optic endoscope was introduced in 1957 by Basil Hirschowitz, a South African gastroenterologist, drawing on the work of British physicist Harold Hopkins. The scope is indispensable in diagnosing GI abnormalities.

Angiogram: Werner Forssmann performed the first cardiac catheterization -- on himself -- in Eberswalde in 1929. He inserted a catheter into his lower left arm, walked downstairs to a fluoroscope, threaded the catheter to his right atrium and injected a radiopaque dye. The technique was further developed by Dickinson Richards and André Cournand at Columbia in the ‘40s, and then extended to coronary arteries, initially accidentally, by Frank Sones at the Cleveland Clinic in 1958.

X-rays and scopes were quickly used in treatment as well as diagnosis. Röntgen himself used his machines to burn off warts. Similarly, in 1964, Charles Dotter and Melvin Judkins used a catheter to open a blocked artery, improving the technique in 1967. Andreas Gruentzig then introduced balloon angioplasty in 1975, an inflated balloon opening the narrowed or blocked artery. In 1986, Jacques Puel implanted the first coronary stent at U. of Toulouse, and soon afterwards a Swiss cardiologist, Ulrich Sigwart, developed the first drug-eluting stent.

The men who developed five of the most dramatically effective and widely used drugs in internal medicine deserve mention.

In the late ‘30s, two Mayo Clinic biochemists hoping to cure rheumatoid arthritis, Philip Hench and Edward Kendall, isolated four steroids extracted from the cortex of the adrenal gland atop the kidneys. The fourth, “E,” was very difficult to synthesize, but Merck chemist Lewis Sarett succeeded, and in 1948, the hormone was injected into fourteen patients crippled by arthritis. Cortisone relieved the symptoms. Mass produced, with much difficulty, by Upjohn chemists in 1952, it was refined by their rivals at Schering three years later into a compound five times as strong, prednisone. In addition to arthritis, corticosteroids are used in the treatment of other inflammatory diseases, like colitis and Crohn’s, and in dermatitis, asthma, hepatitis, and lupus.

Anyone over fifty can remember peptic ulcers, extremely painful lesions on the stomach wall or duodenum. They were thought to be brought on by stress. “You’re giving me an ulcer!” was a common expression. Women were especially affected, and a bland diet was the only treatment, other than surgery. The lesions were caused by gastric acid, and two British pharmacologists and a biochemist, George Paget, James Black, and William Duncan, investigated compounds that would block the stomach’s histamine receptors, reducing the secretion of acid. There were endless difficulties. Over 200 compounds were synthesized, and the most promising, metiamide, proved toxic. Tweaking the molecule, replacing a sulfur atom with two nitrogen atoms, yielded cimetidine in 1976. As Tagamet, it revolutionized gastroenterology. It was also the first drug to generate over $1 billion in annual sales. Its successors, the proton pump inhibitors Prilosec and its near-twin Nexium, more than doubling the acid reduction, have also been blockbuster drugs.

Cimetidine was the culmination of one line of research that began in 1910, when a London physiologist, Henry Dale, isolated a uterine stimulant he called “histamine.” Unfortunately, when it was given to patients, it caused something like anaphylactic shock. The search began for an “antagonist” that would block its production, even before it was recognized as the culprit in hay fever (allergic rhinitis). The most successful antagonist was one developed in 1943 by a young chemist in Cincinnati, George Rieveschl: diphenhydramine, marketed as Benadryl. Ten to thirty percent of the world’s population suffers from seasonal allergies, so this was hailed as a miracle drug. In the early ‘80s a second generation of antihistamines appeared that didn’t cross the blood-brain barrier and thus didn’t sedate the user. Loratadine (Claritin), the first, was generating over $2 billion in annual sales before it went generic.

Diabetes, resulting in high blood glucose levels (hyperglycemia), has been known for two millennia. It was a deadly disease, type 1 rapidly fatal, type 2, adult onset, debilitating and eventually lethal. By the end of the 19th century, the Islets of Langerhans in the pancreas had been identified as the source of a substance that prevented it, insulin, but this turned out to be a fragile peptide hormone, broken down by an enzyme in the pancreas during attempts to extract it. In 1921, Canadian surgeon Frederick Banting and medical student Charles Best determined a way to disable the production of the enzyme, trypsin. Injected into a teenager with type 1 diabetes, insulin was immediately effective. There is still no cure for diabetes, but today the 380 million sufferers globally can live normal lives thanks to Banting and Best.

Finally, millions of men and their wives and girlfriends owe a big debt to British chemists Peter Dunn and Albert Wood, and Americans Andrew Bell, David Brown, and Nicholas Terrett. They developed sildenafil, intended to treat angina. It works by suppressing an enzyme that degrades a molecule that relaxes smooth muscle tissue, increasing blood flow. Ian Osterloh, running the clinical trials for Pfizer, observed that the drug induced erections, and it was marketed for ED. Viagra made the cover of Time Magazine after it was approved in March 1998. The blue pill still generates about $2 billion annually in sales, despite competition, and is prescribed for 11 million men.

Two incredible machines built in the mid-20th century revolutionized the practice of medicine. Both remove blood from the body.

During World War II, the Dutch physician Willem Kolff constructed a machine to cleanse the blood of patients suffering from renal failure by filtering out urea and creatinine. Over 400,000 Americans are on dialysis today.

In 1953, after 18 years of work, John Gibbon, a surgeon at Jefferson Medical College in Philadelphia, produced a machine that oxygenated blood and pumped it around the body, permitting operations on the heart, like those performed a decade later by Michael DeBakey in Houston and René Favaloro in Cleveland. The two surgeons pioneered coronary bypass grafts, using a blood vessel in the leg or chest to re-route blood around a blocked artery. About 200,000 operations are performed each year, down from about 400,000 at the turn of the century, thanks to stents. Gibbon’s machine enabled the most widely covered operation in history, the heart transplant, first performed by South African surgeon Christiaan Barnard in 1967, based on research by Norman Shumway and others. Over 2,000 Americans receive heart transplants each year.

The cardiac device Americans are most likely to encounter is the defibrillator, now in airports, stadiums, supermarkets, and other public places. Thank two Swiss professors, Louis Prévost and Frédéric Batelli, who, in 1899, induced ventricular fibrillation, abnormal heartbeat, in dogs with a small electrical shock, and restored normal rhythm with a larger one. It was not until the 1940s that a defibrillator was used in heart surgery, by Claude Beck in Cleveland. A Russian researcher during World War II, Naum Gurvich, discovered that biphasic waves, a large positive jolt followed by a small negative pulse, were more effective, and a machine was constructed on this basis by an American cardiologist, Bernard Lown. Improvements by electrical engineers William Kouwenhoven and Guy Knickerbocker, and cardiologist James Jude at Hopkins in 1957, and subsequently by Karl and Mark Kroll, and Byron Gilman in the ‘90s made the device much smaller and portable.

Over three million people worldwide don’t have to worry about defibrillators or face open-heart surgery. These are the recipients of pacemakers, and can thank a Canadian electrical engineer, John Hopps. Predecessors were deterred by negative publicity about their experiments, which were believed to be machines to revive the dead. Gurvich had faced this as well. Hopps’ 1950 device used a vacuum tube. With the invention of the transistor, a wearable pacemaker became possible, and Earl Bakken designed one in 1958. Not long afterward, two Swedish engineers, Rune Elmquist and Åke Senning created an implantable pacemaker. The first recipient eventually received 26 and lived to age 86. Lithium batteries, introduced in 1976, enabled the creation of devices with a much longer life.

Cardiac Drugs

Cardiac stimulants have been around since the late 18th century. Thank William Withering, who published his experiments with the folk-remedy digitalis (from foxglove) in 1785.
Anti-anginal drugs were introduced a century later, also in Britain: amyl nitrite in the mid-1860s and nitroglycerin a decade later. Both compounds had been synthesized by French chemists. Thank Thomas Brunton and William Murrell.

The first diuretics, to reduce edema (swelling) and lower blood pressure, were alkaloids derived from coffee and tea. These were not very effective, but better than leeches. Mercury compounds were pioneered by the Viennese physician Arthur Vogel in 1919. These worked, but were tough on the kidneys and liver. The first modern diuretics, carbonic anhydrase inhibitors, were developed in the 1940s, with the American Karl Beyer playing a leading role.

The first anti-coagulants date from the ‘20s. A Johns Hopkins physiologist, William Howell, extracted a phospholipid from dog livers that he called heparin and that appeared to prevent blood clots. The first modern anti-coagulant, and still the most widely prescribed, was warfarin (Coumadin), developed as a rat-poison by Karl Link in Wisconsin in 1948. Its effectiveness and lack of toxicity were revealed when an army recruit took it in a suicide attempt.

Anti-arrhythmic drugs, to stabilize the heartbeat, were introduced in the opening decade of the 20th century. The first was derived from quinine. The big breakthrough occurred in 1962. Thank, once again, the Scotsman James Black, who synthesized propranolol in that year, the first beta-blocker. What they block are the receptors of epinephrine and norepinephrine. These two chemicals (catecholamines) increase the heart rate, blood pressure, and blood glucose levels, useful for many purposes, but not a good thing in patients with cardiac arrhythmia, irregular heartbeats. Beta-blockers are also prescribed to lower blood pressure.

ACE inhibitors lower the levels of an enzyme secreted by the kidneys and lungs that constricts blood vessels. The unpromising source for the first inhibitor was the venom of the Brazilian pit-viper. It was extracted, purified, and tested by three Squibb scientists in 1975, David Cushman, Miguel Ondetti, and Bernard Rubin. It’s still widely prescribed, though many other ACE inhibitors have since been designed. They are used for patients with congestive heart failure or who have had a heart attack, as well as those with hypertension.

Finally, mention must be made of the statins, which, though over-hyped and over-prescribed, lower serum cholesterol and reduce the risks of a second heart attack. A Japanese microbiologist, Akira Endo, derived, from a species of Penicillium, a substance that inhibited the synthesis of cholesterol, but it was too toxic to use on humans. In 1978, a team at Merck under Alfred Alberts had better luck with another fungus, and called the compound lovastatin. Statins work by inhibiting the activity of an enzyme called HMG-CoA reductase (HMGR).

Cancer Drugs

In the forty-three years since Richard Nixon’s “war on cancer” was launched, the disease has received the lion’s share of government, foundation, and pharmaceutical industry funding, though heart disease kills more people -- 596,577 Americans last year to 576,691 for cancer, according to the most recent data. This makes it particularly difficult, and invidious, to single out individual researchers.

There is still, of course, nothing close to a magic bullet, though cancer deaths have dropped about 20% since their peak in 1991. Around 27% of cancer deaths this year will be from lung cancer, so the rate will continue to fall as more people stop smoking.

The originators of a few therapies with good five-year survival rates ought to be singled out and thanked.

Seattle oncologist Donnall Thomas performed the first successful bone marrow transplant in 1956. The donor was an identical twin of the leukemia patient. With the development of drugs to suppress the immune system’s response to foreign marrow, Thomas was able to perform a successful transplant from a non-twin relative in 1969. About 18,000 are now performed each year.

One of the more notable successes of chemotherapy has been in the treatment of the childhood cancer acute lymphoblastic leukemia (ALL). Sidney Farber in the late ‘40s carried out clinical trials with the antifolate aminopterin, synthesized at Lederle by the Indian biochemist Yellapragada Subbarow. This proved to be the first effective compound in treating the disease. It was superseded by methotrexate, and now, as in all chemo treatments, a combination of agents is used. The five-year survival rate for ALL has jumped from near zero to 85%.

Early detection is the key to successful treatment in all cancers, and survivors of breast cancer can thank at least four men who pioneered and popularized mammography over a fifty-year period beginning in 1913: Albert Salomon, Stafford Warren, Raul Leborgne, and Jacob Gershon-Cohen.

A second key to the comparatively high survival rates for women with breast cancer is tamoxifen. First produced in the late ‘50s by British endocrinologist Arthur Walpole, it was intended as a “morning-after” birth control pill because it blocked the effects of estrogen. However, it failed to terminate pregnancy. Researchers had meanwhile discovered that some, though not all, women with breast cancer recovered when their ovaries were removed. Walpole thought tamoxifen might block the estrogen receptors in breast cancer cells, inhibiting their reproduction, and persuaded a professor of pharmacology, Craig Jordan, to conduct experiments. These demonstrated the drug’s efficacy, and after clinical trials it was approved and marketed in 1973. Think of Arthur W. the next time you see one of those ubiquitous pink ribbons.

Most chemo agents are cytotoxic metal-based compounds that do not distinguish between abnormal cells and healthy cells that also divide rapidly. The nasty side effects range from hair-loss and nausea to decreased production of red blood cells, nerve and organ damage, osteoporosis and bone fusion, and loss of memory and cognition. More selective drugs, monoclonal antibodies, have been used for some time. These were first produced by Georges Köhler and César Milstein in 1975 and “humanized” by Greg Winter in 1988, that is, engineered with recombinant DNA so that most of the antibody’s sequence is human, reducing immune reactions. Over 30 “mab” drugs have been approved, about half for cancer.

Research has also been underway for years into delivery systems using “nano-particles” that will target tumors exclusively. Another approach, pioneered by Judah Folkman, has been to find drugs that will attack the blood supply of tumors, angiogenesis inhibitors. This turned out not to be the magic bullet Folkman hoped for, but more than fifty of these drugs are in clinical trials, and a number are effective owing to other mechanisms, and are currently used.

Psychiatric medicine

Drugs have revolutionized the practice of psychiatry since the 1950s, and brought relief to millions suffering from depression, anxiety, and psychoses. For obvious reasons, these are some of the most highly addictive and widely abused drugs.

A few men to thank:

Adolf von Baeyer, Emil Fischer, Joseph von Mering: barbiturates, synthesized in 1865, but not marketed until 1903. The most commonly prescribed today are pentobarbital sodium (Nembutal) and mephobarbital (Mebaral).

Bernard Ludwig and Frank Berger: meprobamate, the tranquilizer Miltown. By the end of the ‘50s, a third of all prescriptions in America were for this drug.

Leo Sternbach: the anxiolytic (anti-anxiety) benzodiazepines, first synthesized in 1955. The most successful initially was diazepam, Valium, marketed in 1963. The most widely prescribed benzodiazepine today is alprazolam, Xanax. It’s also the most widely prescribed psychiatric drug, with nearly 50 million prescriptions. It enhances the effect of the inhibitory neurotransmitter GABA and suppresses stress-inducing activity of the hypothalamus.

Leandro Panizzon: methylphenidate (Ritalin). The Swiss chemist developed it in 1944 as a stimulant, and named it after his wife, whose tennis game it helped improve. Until the early ‘60s amphetamines were used, counter-intuitively, to treat hyperactive children. Thirty years after its patent expired, the controversial dopamine re-uptake inhibitor is still the most widely prescribed medication for the 11% of children who’ve been diagnosed with ADHD.

Klaus Schmiegel and Bryan Malloy: the anti-depressant fluoxetine, the first SSRI, selective serotonin reuptake inhibitor, increasing serotonin levels. Marketed as Prozac in 1988, it made the cover of Newsweek and is still prescribed for over 25 million patients.

Paul Janssen: risperidone (Risperdal), the most widely prescribed antipsychotic drug worldwide. The Belgian researcher developed many other drugs as well, including loperamide HCL (Imodium). When commenters on web articles advise trolls to take their meds, they might want to specify risperidone.

Seiji Sato, Yasuo Oshiro, and Nobuyuki Kurahashi: aripiprazole (Abilify) which blocks dopamine receptors, and was the top selling drug at the end of 2013, grossing $1.6 billion in Q4.

A few observations.

Japanese and Indian researchers will make important contributions to future drugs, as the trio responsible for Abilify reminds us.

And, naturally, some women have played roles in the advances that have been summarized. Mary Gibbon, a technician, assisted her husband on the heart-lung machine. Lina Stern did important research on the blood-brain barrier, and it was in her lab that Gurvich improved the defibrillator. Jane Wright conducted early trials of methotrexate that helped demonstrate its efficacy. Lucy Wills did pioneering work on anemia in India. Rosalyn Yalow helped develop radioimmunoassay, which measures concentrations of antigens in the blood. Anne-Marie Staub did interesting work on antihistamines, though her compounds proved toxic.

They are exceptions. Our benefactors have not only been overwhelmingly European males, but are mostly from England and Scotland, Germany, France, Switzerland, and the Netherlands, as well as Americans and Canadians whose families emigrated from those countries. And, of course, Jews, who’ve won 28% of the Nobel Prizes in Medicine.

Some of the beneficiaries in particular might want to think about this.

Muslims boast that their faith has over 2 billion followers throughout the world. If this number is accurate it has far less to do with the appeal of Islam or with Arab or Turkish conquests, and everything to do with the work of some Northern Europeans and Jews, along with the “imperialists” who built roads, canals, and ports and the vehicles that use them, as well as schools and hospitals -- like the traveling eye clinics in Egypt funded by the Jewish banker Ernest Cassel, which nearly eliminated blinding trachoma, then endemic.

The fact that we in the U.S. idolize our entertainers as no society has before is not going to cut off the supply of outstanding medical researchers. Very bright and inquisitive people usually don’t pay much attention to popular culture. But it diminishes us.

It’s the ingratitude, though, not the indifference, that’s more troubling.

Biting the hand that feeds is a core principle of Leftists. For 150 years, they’ve sought to concentrate power in their own hands by exploiting the resentment of ignorant people against a system that has finally enabled mankind to spring the Malthusian trap.

Multiculturalism, with its simple-minded relativism, has broadened the scope of the party line. Not only shadowy “capitalists” are vilified, but whites and males. Ignorant people can now think well of themselves by opposing “racism” and “the patriarchy” -- and by voting for an unqualified and deceptive poseur because, though a male, he is not white.

The first step parents can take to help spare America from being “fundamentally transformed” is to insist that history be properly taught. This means, among other things, recognizing the accomplishments of a few men who’ve found cures for or relieved the symptoms of diseases that have killed and tortured humans for millennia.
Read the original, and see Bob's links, here:
Quibcag: The illustration of the hand-biting dog is from Nichijou (日常).

Fred Crosses the Racial Rubicon

Actually, he's crossed it before, but he keeps crossing it more intensely, and more power to him. An aside: Just yesterday, an earnest young liberal on the net asserted that White people are prejudiced against Blacks because the only Blacks they see on TV are behaving badly, but if they actually interacted with Blacks in real life, they wouldn't be prejudiced against them, because they'd see that Blacks Are Just Like Everybody Else.

If this were true, you'd find that White people in regions with few or no Blacks would have the lowest opinion of Blacks. People like Swedes and the inhabitants of Maine and Oregon. And, of course, White people who interact with actual Blacks a lot, because there are a lot of Blacks where they live, would have a very high opinion of Blacks. White people like the inhabitants of the former Confederacy.

Of course, the reverse is true. This is a useful rule of thumb. Whatever liberals assert passionately usually is the reverse of the truth. White people who have to deal with Blacks on a regular basis develop an aversion to them and try to stay away from them. Whites who never deal with Blacks assume that Blacks are just like they are, only with darker skin, and never have any experiences that undermine this idea.  But Fred, fresh from his latest Rubicon crossing, can say it better than I can. This is from his website here:

Fergusons in Perpetuity

Thoughts on the Unfixable

Two questions, methinks, arise from Ferguson's latest outburst. The first, political, is "Why does the country tolerate it?" The second, more anthropologically interesting, is "Why the eerie incapacity of underclass blacks to understand evidence, or law, or much of anything?" Of the countless explanations given for the poor performance and poor behavior of blacks in the US, one of them dares not speak its name: Low intelligence.

Yet it fits all the evidence. It explains why Africa never built cities, why it did not invent writing, why there was no African Fifth-Century Athens. It explains why Rhodesia, prosperous and an exporter of food when run by whites, fell immediately into hunger and barbarism when whites left. It explains the dysfunction of black societies from Africa to Haiti to Detroit. It explains why blacks invariably score far below whites and Asians on tests of IQ, on the SATs, GREs, on entrance and promotion exams for fire and police departments.

It explains the need for affirmative action and for departments of Black Studies in universities when black students can’t handle real courses. It explains why the gap in academic achievement never closes. It explains the criminality, the violence, the poor impulse control, the dependency on welfare, the unemployment, and the inability to integrate themselves into a high-tech society. It explains the constant scandals involving teachers in black schools giving students the answers on standardized tests.

Further, it explains why none of the programs intended to raise performance of blacks in the schools ever work. Head Start didn’t work. Integrated schools didn’t work, nor segregated schools, nor black schools with white teachers nor black schools with black teachers. Expensive laboratories and free computers didn't work. Schools run entirely by blacks with very high per-student expenditure (Washington, DC for example) didn’t work. There is no indication that anything at all will ever work. Low intelligence is the obvious explanation. There is precious little counterevidence.

Endless evasions seek to avoid the unavoidable. Tests are biased, all tests without exception. Africa is primitive because of colonialism, or for geographic reasons, or because the natives liked hunting and gathering. Detroit is largely illiterate because of slavery, or low self-esteem, or institutional racism, which seems to mean undetectable racism. On and on.

If the consequences didn’t affect others, it would be needless, even cruel, to mention cognitive deficits. But they do affect society, very damagingly. They result in the enstupidation of schools to which the bright go, and cripple the high-end brains upon which the prosperity of the United States depends. They result in Fergusons.

Among people who study intelligence, the racial disparity is not debated. It is evident, accepted. I suspect that it is evident also to many thoughtful liberals who fear the question: If we admit the obvious, what now? And would they be invited to any more cocktail parties of the politically correct?
And so, if psychometrists state the truth publicly, they are shouted down and said to be racists, bigots, and “pseudo-scientists.” They are not. Rather they are highly intelligent and competent statisticians, far more aware than the public of possible sources of error. The achievements of blacks closely fit the predictions that come out of psychometrics.

These scholars are worth reading. Try “Social Consequences of Group Differences in Cognitive Ability,” by Linda Gottfredson of the University of Delaware, long but comprehensive. Daniel Seligman is short and clear.

Unfortunately, understanding their writings (should one want to) requires some faint memory of eighth-grade algebra, such as what a curve means, and some mathematics barely beyond arithmetic. This eliminates most of those who dispute the evidence.

A glance at the data reveals that there will be a small number of very smart blacks and a larger number of fairly smart blacks. This we see.  They are engineers and programmers. They appear on television as well-educated talking-heads speaking good English. To whites who never see any other blacks, this gives the impression that, since these blacks are like white people, all would be if it weren’t for discrimination. Would that it were so. It isn’t.

What are the implications?

First, we will see a continuation of hostility by blacks toward whites. This often amounts to outright hatred, as seen in the intermittent riots that never cease, and in the frequent, though carefully under-reported, racial attacks on whites. If blacks cannot rise, and it seems they cannot, they will remain angry in perpetuity. Then what?

If you believe the hostility does not exist, or is not intense, read rap lyrics. Many examples could be adduced. Here is one:

“Niggas in the church say: kill whitey all night long … the white man is the devil … the CRIPS and Bloods are soldiers I’m recruiting with no dispute; drive-by shooting on this white genetic mutant … let’s go and kill some rednecks … Menace Clan ain’t afraid … I got the .380; the homies think I’m crazy because I shot a white baby; I said; I said; I said: kill whitey all night long … a nigga dumping on your white ass; fuck this rap shit, nigga, I’m gonna blast … I beat a white boy to the motherfucking ground....”
(“Kill Whitey”; Menace Clan, Da Hood, 1995, Rap-A-Lot Records, Noo Trybe Records, subsidiaries of what was called Thorn EMI and now is called The EMI Group, United Kingdom.)

Not encouraging.

Second, things will get—are getting—worse. First-world countries are brain-intensive. Automation eats rapidly away at the low-end jobs for which blacks are usually qualified. So do Mexicans. In a technological society, people at the bottom at some point become economically unnecessary, unemployable for anything at any wage. This happens now to blacks, and soon will to unintelligent whites. The unnecessary will need, do need, to be kept in custodial care, however disguised. The alternative is starvation.

Third, serious conflict is likely between blacks and Hispanics. There is no love between the two. Today when Latinos move into a neighborhood, they tend to drive blacks out. They are brighter and work harder. For the moment blacks hold the political upper-hand, but Latinos grow in number and in their proportion of voters. A train-wreck is on the way.

Fourth, the danger will grow of serious conflict between whites and blacks. I suspect that even now only heavy federal pressure and dissimulation by the media keep the cork in the bottle. Among whites a large proportion loathe affirmative action, degraded educational standards, toleration of crime, and compulsory integration.

 As the economy declines and jobs become scarcer, the likelihood grows that jobless whites will rebel against racial preferences. The hidden rock in the current is that if affirmative action were eliminated, blacks would almost disappear except in sports and entertainment. There will be hell to pay, though in what currency is not clear.

What in god’s name to do?
What indeed? To learn about Fred's set of options, go here to read the rest:
Quibcag: The clueless cute girl reporter seems to be from Fruits Basket (フルーツバスケット Hepburn: Furūtsu Basuketto).

Thursday, November 27, 2014

Feminism: Silly or Just Wrongheaded?

Both, actually. Most feminists are just silly, especially feminist leaders, who mix their silliness with a generous portion of evil. But some feminists, especially the ones who don't seem automatically flaky about everything, are just wrongheaded. And that means that they have good instincts and decent reasoning power but lack the facts to apply them to in order to come to sensible conclusions. This is actually true of a lot of liberals. They've had lies fed to them as facts, and actual facts hidden from them, so of course they're going to come to erroneous ideas about things. The lies include the whole human equality/blank slate concept, which states that all human beings have the potential to be equal in intellect and temperament, an idea so breathtakingly unscientific as to appeal to only those who have been protected from facts and logic for their whole lives. And a special case of the "equality" lie is that there are no intrinsic differences in the thinking and behavior of the human sexes, and that all such differences are caused by environment.  But these are the lies that we all breathe in daily, courtesy of the MAG (Media, Academia, Government), so why should we be surprised that liberals in general and feminists in particular, believe them?

Anyhow, simply dismissing all feminists as silly/evil might be a mistake and is certainly not productive. Lysander Swooner, over at The Right Stuff, expands on this notion:

Feminism Exists. . .

But only in Western nations and due to their inherent egalitarian natures. Is it not the epitome of irony that the only societies where feminism can persist are the very ones that do not need it? Perhaps, but I believe that we can go deeper and find an even more biting irony.

In order to make progress in an honest discussion about this issue, I’m going to ask readers to ease back into their office chairs and read with an open mind. I’m sure that some will accuse me of “iron knighting” but I really don’t care. I’m going to speak the truth as I see it. I’m looking to speak directly to other men on the right, not to troll feminists, but to actually explore the phenomenon that is feminism.
So take off your manly Rightist coxcomb and put away your snappy repertoire of kitchen jokes for a moment. Only for a moment though, I’m not asking you to lose focus on rhetoric, goal, or intention. I only ask that you lower your guard enough to get in touch with your Occidental vag.


In other cultures, women are seen purely as brood mares. We are all aware of how rape is treated in Islamist law. Women are not allowed to drive, all decisions are made by men, their genitals are mutilated in youth, and “honor killings” are common. Rape in Africa is a de facto rule and women there are just as often butchered in the streets. Many of these states will make pretenses toward justice but that is all it is. Northeast Asia, as is usual, proves an exception and has a native homeostasis as the women care as little about gender relations as the men do. In most countries, women maintain a status as property of men. Any group of women attempting to change the status quo are met with harsh retaliation in most nonwhite countries.

Fairness to the fairer sex will intensify with proximity to Western nations and increased white admixture or demographics. So it is the whiter a country, moderated in part by culture, the more tolerant of absurdist and detrimental feminist arguments. Interestingly, and for the wrong reasons, better balances between gender egalitarianism and traditional gender roles will be found in these medial nations.
Before you slip that coxcomb back on in pride of being ein Westlichen Menschen, reflect for a moment of all that we have accomplished. Just think about the character and strength, intelligence and ability allowing Western men to dominate in all true civilized pursuits. Swell triumphant with the philosophical and metaphysical accomplishments possessed of minds not too dissimilar to your own. Traditions that continue to confound and bedazzle the minds of those in far off lands millennia hence. Turn through the pages of history and witness tyrants fall where they might have succeeded in other domains. Political movement, revolution, crusade, break through, war, masses of men carefully organized, and carnage in a perpetual struggle for balance and control.
Now, how the hell is it that we allow our women to run amok in our societies? Brought to our knees after all these years and after all we’ve accomplished? If you say it’s purely on account of Western man’s magnanimity, you’re fooling yourself and you’d better hope it’s not true as it would mean we are past largesse in the deep waters of abysmal weakness. It is not weakness though, at least not wholly.

The Problem

It is not only the Sons of Western man that have inherited strength and persistence but the Daughters as well. Western women have become prisoners of misdirected transcendence to male roles and, admittedly, they do it better than men of other societies. Regardless of your opinion of religion, our societies have left a vacuum of power in the dispossession of tradition. We are living in a time where man has been befuddled by liberal altruism, ran afoul of original intent and directed toward strangers. Foolishly replacing guarded kindness with foolish generosity. Western man, in this spiritually deprived age, has left a vacuum himself as a result and Western woman has taken up the mantle of power and revolution in his stead.
They are locked in a state of fear, just as Western men had been before taking up arms against tyrants in bygone ages. They have chosen as their enemy white men because, in their struggle for balance, they cannot find their rightful place. They see something is wrong yet they know not what. The narrative, the explanation for what went wrong, has been set by others and it is not favorable to us. Men had revolted with a solid understanding of duty and clearly defined functions for the particles of nation. Women can only emulate the functions of men and cannot, in their fervor, envision a place for men in the new order they struggle to create. A woman without a conscientious man, to give direction and purpose, is a miserable wretch just as a man without a dutiful woman, to give solace and comfort, is a sorrowful beast.
White women are of a higher value, attested by their strength and persistence despite being misguided. Men, do not hate your women for what they had no agency in creating, reserve your bile for the traitorous men amongst you. This state of affairs did not take place overnight. It has been a slow, progressive ebb and flow. Every small step back by men has been taken in turn by them. They press more for the same reasons men have always pressed the boundaries where they have smelled weakness. Being of strong constitution, we cannot blame our women for instinctively grasping for any semblance of control that may manifest as we slide evermore into decline.

With liberalist theoretical reduction of all values down to “rights,” we can begin to see the appeal and how the misunderstanding happened. You become aware of how one could grow confused and conflate “rights” with fulfillment. Many of you reading this are neoreactionaries and familiar enough with this conflation to understand the dynamic in action. You will never convince feminists to abjure without real world action and change. Arguing with them on their own terms or mocking them is only mental masturbation.

A Solution

The inappropriate reaction to feminism is attempting to enact complete servitude upon women. The beauty of Western civilization is in the harmony and symmetry of life promised to its inheritors. Our women want fulfillment and they deserve it.
In light of all this, taking identical stances toward our superior women that barbaric societies do to their inferior and incapable women is not the proper reaction to feminism. It is not even in our nature, which is why we are losing. As fun as you might think it is, we cannot win by simply mocking them with images of female serfdom. This only legitimizes their delusion of white males as tyrants to be destroyed. Above and beyond anything else, such a model is dysgenic.
Would you trade strong mothers for weak consorts? Would you wish a heritage of subservience upon your children?
When a wise king looks upon his kingdom and sees strife and anger, he will not recklessly attack them for their rebellion. That would be imprudent, selfish, and unproductive. Neither would he immediately submit to their demands without considering his options because that would be a cowardly action unfitting of a ruler.
He would recognize the confusion early on and attempt to understand their plight. Once the problem is identified, he takes corrective measures in the least destructive manner possible. This is the primary distinction between a benevolent leader and a tyrant. He cannot govern his own people the way he might attack his enemies.
We must concentrate our energies on other men, not to teach them to be misogynists but real conscientious shepherds and stewards. This is no easy task, as few worthy efforts are, but it is the only way to reverse the damage that has been done to our society. When we have enough leverage, we can then begin to create the institutions by which we might reeducate our women on the desirable status of the consummate nurturer and caretaker. In our fight for restoring Western traditions, we must not lose sight of what made it beautiful and what we know is Right.
Go here for the original, and to see some appropriate illustrations:
Quibcag: The illustration is a couple from Denkigai no Hon'ya-san (デンキ街の本屋さん?, lit. "The Electric Town's Bookstore"), who could go either way.

Tuesday, November 25, 2014

Lily W. Liberal #2

Based on a joke from


Buchanan on Hagel

I've always thought of Chuck Hagel as a bit of a doofus, because why did he take the job in the first place? He couldn't possibly be so naive as to think Obama planned anything good in the military sphere, or that he wouldn't toss him overboard ASAP if he ever became critical.

But I like to take the long view of these things — long in comparison to the TV talking heads, at least, whose memories seldom go back further than a decade, if that.

So I'll give you my opinion quick, and then pass you on to Pat Buchanan, who knows a lot more about it than I do.

Ronald Reagan, for all his flaws, had enough street smarts to know that just about anything we might try to do in the Middle East would turn around and bite us sooner or later. At least he realized that after Lebanon. Bush First had a similar understanding of the Middle East, at least, and had realist advisors who didn't think we could bomb the whole area into a representative democracy any time soon.

But the Clinton-Bush-Obama years have been different. The world is a board game to those three idiots. They're all convinced that constant diddling with the people of the Middle East can somehow improve the situation. They are, of course, dead wrong. One of the reasons I understand things like that is that I've been reading Buchanan for years. And this week, he writes:

Hagel Didn’t Start the Fire

Defense Secretary Chuck Hagel, a Vietnam war veteran and the lone Republican on Obama’s national security team, has been fired.

And John McCain’s assessment is dead on.

Hagel, he said, “was never really brought into that real tight circle inside the White House that makes all the decisions which has put us into the incredible debacle that we’re in today throughout the world.”

Undeniably, U.S. foreign policy is in a shambles. But what were the “decisions” that produced the “incredible debacle”?

Who made them? Who supported them?

The first would be George W. Bush’s decision to invade Iraq, a war for which Sens. John McCain, Joe Biden, John Kerry and Hillary Clinton all voted. At least Sen. Hagel admitted he made a mistake on that vote.

With our invasion, we dethroned Saddam and destroyed his Sunni Baathist regime. And today the Islamic State, a barbaric offshoot of al-Qaida, controls Mosul, Anbar and the Sunni third of Iraq.

Kurdistan is breaking away. And a Shia government in Baghdad, closely tied to Tehran and backed by murderous anti-American Shia militias, controls the rest. Terrorism is a daily occurrence.

Such is the condition of the nation which we were promised would become a model of democracy for the Middle East after a “cake-walk war.” The war lasted eight years for us, and now we are going back—to prevent a catastrophe.

A second decision came in 2011, when a rebellion arose against Bashar Assad in Syria, and we supported and aided the uprising. Assad must go, said Obama. McCain and the neocons agreed.

Now ISIS and al-Qaida are dominant from Aleppo to the Iraqi border with Assad barely holding the rest, while the rebels we urged to rise and overthrow the regime are routed or in retreat.

Had Assad fallen, had we bombed his army last year, as Obama, Kerry and McCain wanted to do, and brought down his regime, ISIS and al-Qaida might be in Damascus today. And America might be facing a decision either to invade or tolerate a terrorist regime in the heart of the Middle East.

Lest we forget, Vladimir Putin pulled our chestnuts out of the fire a year ago, with a brokered deal to rid Syria of chemical weapons.

The Turks, Saudis and Gulf Arabs who aided ISIS’ rise are having second thoughts, but sending no Saudi or Turkish troops to dislodge it.

So the clamor arises anew for U.S. “boots on the ground” to reunite the nations that the wars and revolutions we supported tore apart.

A third decision was the U.S.-NATO war on Col. Gadhafi’s Libya.

After deceiving the Russians by assuring them we wanted Security Council support for the use of air power simply to prevent a massacre in Benghazi, we bombed for half a year, and brought down Gadhafi.

Now we have on the south shore of the Mediterranean a huge failed state and strategic base camp for Islamists and terrorists who are spreading their poison into sub-Sahara Africa.

The great triumphs of Reagan and Bush 41 were converting Russia into a partner, and presiding over the liberation of Eastern Europe and the dissolution of the old Soviet Union into 15 independent nations.

Read the rest here:
Quibcag: I think the illustration is meant to be personifications of a few of the successor states to the USSR. At any rate, I found it here:

Monday, November 24, 2014

Enough is Enough, Dare We Hope?

For someone like me, who likes to think of himself as a funny writer, Fred Reed is both an inspiration and a depressant. An inspiration because he shows just how well it can be done, but a depressant because you think that no matter how you try, you can't ever get that funny.  On the other hand, you can be pretty funny anyway even if you don't reach Fred's level.

Fred's subject here is revolution, and my liberal (and neocon) friends all assure me that while revolution was an option a couple of centuries ago, once we got ourselves a "democratic" government, revolution of any kind became both unnecessary and evil. On the liberal side, the same ones who say that always seem to be heavily in favor of Black rioting and civil disobedience, hippies and neo-hippies defecating on cars, stuff like that. But I guess that isn't really revolutionary behavior, but just good clean fun or something.

True revolutionaries, the kind of scoundrels who would re-establish constitutional government and adopt a mind-your-own-business policy, are the real problem to my liberal/neocon friends, and so as not to upset them, I warn them not to read the following from:

The Second American Revolution

An Utterly Objective Analysis

November 23, 2014

 The Revolution of 2019 began, curiously enough, in fall of 2019 when Mary Lou Johnson, the nine-year-old daughter of a ranching family outside of Casper, Wyoming, came home from her sex-ed class at Martin Luther King Elementary with a banana, a packet of condoms, and a book called Sally Has Two Mommies. Her mother Janey Lou, a political reactionary, took one look and began screaming. “Goddamit! Goddamit! I’m not going to take it anymore!”

 She grabbed the shotgun, a nice Remington 870 loaded with double-ought buck, and headed for the school.

Historians would debate just what led the surrounding population spontaneously to join her. Much of it seemed to have something to do with the schools. One father reported that he snapped when his daughter came home during Harriet Tubman Week, and he asked her about Robert E. Lee.


Another father, objecting to students who wore low-hanging pants, said, “It’s supposed to be a school, not a frigging proctology workshop.” A common concern was that in a fifth-grade class on Lesbian, Bisexual, Gay, and Transgendered Rights, the teacher had criticized Primate Privilege, saying that animals had rights too. She then gave the class a pamphlet called Mommy Says Moo. Wyoming was cattle country. Local wives were wroth. They thought it an invitation to infidelity.
There then followed the now-infamous Near Death March, in which the entire faculty of the school was run across the Montana line by infuriated citizens wielding cattle prods. These, dubbed the Poor Man’s Taser, were then turned against anyone associated with the federal government. “The bastards won’t leave us alone. I’m gonna tase’m where the sun don’t shine. They’ll sail back to Washington in one hop like a damn electrified bull frog.”

The uprising, which had started locally with Janey Lou’s shotgun, began to spread both geographically and in its content. Apparently people were fed up with a lot of things. Nobody in government had noticed.

It is now agreed that the catastrophic events which followed occurred in part because Washington, which was celebrating American-African History Week, simply did not recognize the depth of resentment in the country. The city traditionally was inward-looking. Few knew exactly where Wyoming was. Their sources of information were chiefly talking heads talking to each other about each other.

By unfortunate happenstance, the Supreme Court had just issued its landmark decision that public display of the Bible contravened the constitutional prohibition of the establishment of religion. Mere possession, the justices said, would not be sufficient to trip the prohibition and lead to prosecution, but “a reasonable reticence in display” should be practiced. It was agreed that the Holy Book could be carried in a sealed bag with a child-proof lock. That this happened during Moslem Heritage Week further fueled ire among the intolerant.

The Court’s ruling had ripple effects unforeseen in the capital, as most things were. When the rebellion metastasized to Rosa Parks County (formerly Jackson County), Virginia, forty miles outside of Richmond, they were shocked. The provoking incident occurred in Sojourner Truth High School in a rural and not very Reconstructed county.

Specifically, Johnny Loggins, in the tenth grade, had been issued a condom and, in the back of his African Civilization class, was discovered to be praying for a chance to use it. This also constituted an establishment of religion. He was arrested by several of the thirty-five police patrolling the corridors and remanded for psychiatric evaluation.

Rumors flew, fanning the flames. The Democrats, having elected the first black president and then Hillary, the first woman, were said to be looking for a transvestite for 2024. In respect to 2032, the ominous word "trans-phylum" floated about. The people of Casper feared they might have a President who said "Arf," or perhaps had tentacles.

The insurrection went viral thanks to the internet. Incident followed incident. In Brooklyn on Sixth Avenue, seven teens between the ages of 21 and 28 beat to death a 95-year old white veteran in a wheel chair, shouting “Kill Whitey!” and “That’s for Travon!” and “White dude bleed a lot.” The chief of police undertook a thorough investigation. He reported that there was no evidence of racial motivation. Jesse Jackson said it was unfortunate, but white men in wheel chairs needed to learn not to attack black teenagers.

After an enraged mob of R-Cubed—the movement was now calling itself Rural, Retrograde, and Right, the Three Rs—had surrounded Columbia Teachers College and burned it, Washington recognized that things were getting out of hand. Reporters asked why the arsonists had, well, arsoned Columbia. An irate woman screamed:

“My kid is fifteen, can’t read, and doesn’t know who Thomas Jefferson was but he’s had three different classes on safe anal sex. I didn’t raise him to be an analphabetic butt-plug. Excuse me. I need to find a professor.” She left, brandishing her ball-bat.

Her assertion was not entirely implausible. A recent poll by the New York Times had shown that 87% of college freshmen, or freshpersons, couldn’t find the Pacific Ocean on a map of California, and fully 54% didn’t know what “Ocean” meant. (“Didn’t she sing with Grody Kate and the G-Spots?” asked one female junior.) Others couldn’t identify Jesus Christ, Mother Theresa, or George Washington, but were “sure or almost sure” that they were racists.

Washington was soon surrounded by R-Cubed insurgents, many of whom proved to be well-armed and with military experience. They soon revealed their true colors as homophobes. Rampaging, they burned gay bars such as Moby Dick and The White Swallow, shouting, “We don’t care where you stick it, but we don’t want to hear about it.” Much squealing and a mass exodus followed.

Surprisingly, it was Maxwell Birnbaum, inevitably known as “Ol’ Burn and Bomb,” who led the house-to-house fighting. He was not a soldier, or ex-soldier, but a classics professor from the University of Virginia in Charlottesville. As the Three R’s fought their way through Arlington in the Virginia suburbs of Washington and reached Key Bridge, key to the city itself, Birnbaum told a reporter, “Twenty-five hundred years of European civilization, and we’re going to give it up to people whose Mothers Say Moo? Like hell we are. Did trilobites scuttle the Cambrian seas to bring us Clitler? Hillary, I mean. If they had known, they would have stopped reproducing. I won’t stand for it.”
The rest is well known. Congress in its entirety was slaughtered, and hung upside-down from lamp posts though, unlike Mussolini, they were not emasculated. It was pretty much agreed that they had taken care of this themselves long ago.

Peace returned. Janey Lou put away the shotgun, and made lunch.

Philip Francis Stanley and Grotesque Ophthalmological Malpractice