Current Columns
Selected History of Presidential Election Defeats
November 2020
In an essay for the History News Network entitled “John Adams Knew When to Go Home,” political science professor R.B. Bernstein writes, “[I]n 1801, John Adams did something just as momentous, just as reaffirming of democratic constitutional principle. After losing the presidential election of 1800 to his former friend and political rival Thomas Jefferson, Adams decided that losing an election, even one for the presidency, means what it says. Adams went home.” And though the election of 1800 was just the fourth time the American people went to the polls to elect the president, this one was different. The two leading candidates, Aaron Burr and Thomas Jefferson, had the same number of electoral votes, meaning that the House of Representatives would decide the election.
Adams, after a bitterly fought campaign, came in third. “Nor did he listen to hints by fellow members of his Federalist party that he let them keep him in office as a caretaker president while the House of Representatives wrestled with resolving the electoral deadlock between Jefferson and Aaron Burr,” adds Bernstein. So what happened to Adams after his time in the White House (he was, in fact, the first president to occupy that structure; the capital sat in New York and then Philadelphia during Washington’s tenure)? Bernstein writes, “He never left Quincy again. For twenty-five years, he read, wrote, argued, reflected, and philosophized about politics, government, history, religion, and his life and career. He carried on a bitter quarrel in print with a foe long dead, Alexander Hamilton. He entertained himself by exchanging dozens of letters with such old friends as Benjamin Rush and Benjamin Waterhouse, and revived an old friendship by exchanging more letters with Thomas Jefferson.”
In some regards, Adams’s best presidential decision was the one at the end of his presidency. If Washington set the standard for the peaceful transition of power and established the precedent for a limited presidency, Adams’s ceding of power in 1800 was equally important. Washington chose not to run again, but he made that choice knowing that had he run for a third term, he would have won. John Adams was not just the second president of the United States; he was the first one defeated for reelection. How Adams managed the transition of power provided an essential precedent for future presidents. It is one thing to cede power after two terms, as Jefferson, Jackson, and Barack Obama did. It is quite another to cede it after a single term and a stinging defeat.
Contrast Adams’s behavior with that of his contemporary, now famous thanks to a certain musical from Lin-Manuel Miranda: Aaron Burr. After serving as Jefferson’s vice president, Burr realized that Jefferson would run again in 1804 but would not keep Burr as a running mate. Burr then conducted an unsuccessful run for Governor of New York. Whatever his plans after this loss, they died along with Alexander Hamilton on the plain of Weehawken, New Jersey. Politics in 2020 can seem vitriolic, but this would be the equivalent of Mike Pence gunning down Obama-era Secretary of the Treasury Jack Lew. After killing Hamilton, Burr engaged in a series of misadventures, resulting in a treason trial. He fled the United States, got married at 77, and died in relative peace in New York. Burr is one of the few individuals on this list whose post-presidential-run life was more exciting than the preceding years.
Some presidents chose not to run because they were not interested in a second term, as was the case with Chester A. Arthur and James K. Polk. Others knew the outcome and so did not even try, as was the situation with John Tyler, James Buchanan, and Andrew Johnson. But most of those who serve one term run again, and in the history of the presidency, nine of them, including Adams, ended on the losing side.
Because of the power of the “Virginia Dynasty,” elections after 1800 featured names that only a real history geek would love, a list that includes Charles Pinckney and Rufus King. It is doubtful that the talented Lin-Manuel Miranda will be staging a lavish Broadway musical called “Pinckney!” any time soon.
DeWitt Clinton was the loser in 1812 to James Madison. Yet his fame is ensured through his efforts as Governor of New York to drive the construction of the Erie Canal. This accomplishment transformed the region’s entire economy and helped New York cement its status as the Union’s first city. Clinton’s historical impact rests far more on this than on being a name in Madison’s biography. In 1820, the Dynasty, and its standard-bearer of that year, James Monroe, ran unopposed and received 76% of the popular vote. This was the first and last time this has happened in 59 presidential elections, and 200 years after Monroe and his “Era of Good Feelings,” that situation seems increasingly anachronistic.
During his years in the early Republic, Andrew Jackson made so much history, good, bad, and heinous, that it is challenging to pick which issue to focus on. Fortunately, this piece guides us to the election of 1824, after which, for the first time, the loser of a presidential election would run again and win the White House. John Quincy Adams, like his father, served only a single term. But unlike his father, Quincy Adams went on to a prominent career in the House of Representatives. As Margaret Hogan, writing for the Miller Center, notes, “Adams served nine post-presidential terms in Congress from 1830 until he died in 1848, usually voting in the minority. He supported the Bank of the United States’ rechartering, opposed the annexation of Texas and the war with Mexico, and struggled for eight years to end the House’s notorious “gag rule,” which tabled without debate any petition critical of slavery. Adams attempted to read into the record at every opportunity the hundreds of anti-slavery petitions that abolitionists around the country sent him regularly. The House finally relented and repealed the rule in 1844.” Adams’s career is one of many examples that put paid to the conjecture of some progressive writers of our day that whites were somehow uniformly tolerant of slavery. It should also be noted that neither Adams nor his father attended their successors’ inaugurations.
Martin Van Buren became president in 1836 on the success of the Democratic machine he had built and the popularity of his predecessor and patron, Andrew Jackson. But like many one-term presidents, he also inherited an economic debacle. In 1992 political operative James Carville famously intoned, “It’s the economy, stupid.” In this pithy phrase lies a kernel of wisdom about presidential success. It is no coincidence that Van Buren, Hoover, and H.W. Bush each presided over a recession or depression that damaged their electoral chances and made them one-term presidents.
After his defeat in the 1840 election, Van Buren remained active in politics but took on an increasingly anti-slavery position, a striking stance given that his patron was a slaveholder. In 1848 Van Buren became the rare presidential loser who ran again, in this case not for the Democratic Party he had built but for the Free Soil Party, whose central plank was opposition to the expansion of slavery. A canny veteran of politics, Van Buren probably knew he had no chance but was instead making a political statement. As it happened, the Free Soil Party garnered 10% of the popular vote but lost to Whig Zachary Taylor, as did Lewis Cass, the Democratic candidate. Cass was the first harbinger of the decline of Democratic dominance: the first non-incumbent Democrat to lose and the first who did not succeed another Democrat. Cass later went on to a Senate seat and, in 1857 at age 75, became Secretary of State.
Between Jackson, who won a second term in 1832, and Abraham Lincoln, who won his second term in 1864, no president managed to win a second term. Some had the misfortune to die in office, such as William Henry Harrison and Zachary Taylor. Others, such as James Polk, who might have won a second term, chose not to run again; since Polk passed away shortly after leaving office, it was probably best that he did not. That leaves a who’s who of one-termers, including Martin Van Buren, Franklin Pierce, and James Buchanan. In Buchanan’s case, his popularity was at such a low ebb that he declined to even try for a second term. John Tyler’s standing was likewise such that a run for a term in his own right was out of the question.
Because William Henry Harrison died so early in his presidency (the first such succession in American history), Tyler served a nearly full term. His post-presidency did not cover him in glory, as he was later elected to the Confederate legislature during the Civil War.
Andrew Johnson experienced one of the most tumultuous presidencies in the Republic’s history, overseeing a pro-South version of Reconstruction, getting impeached and nearly convicted, and failing to secure his party’s nomination in 1868. After his presidency, he returned to his native Tennessee. Once a pariah in that state due to his pro-Union stance, his position on Reconstruction was such that white Tennesseans came to see him as a hero, and in 1875, he became the only former president elected to the Senate.
U.S. Grant was one of the few presidents never to have served in elected office before winning the presidency. Other members of that group include William Taft, Herbert Hoover, Dwight Eisenhower, and Donald Trump. In an era in which close elections would become the norm, Grant won both his elections easily, besting Horatio Seymour and Horace Greeley. Seymour never ran for office again after his loss.
Grover Cleveland was a great president. He reversed many of the destructive policies of the Harrison Administration. He was pro-business, pro-gold standard, anti-tariff, and anti-trust-busting. One of my favorite political sayings of all the presidents comes from him, in response to the Panic of 1893: “Though the people support the government, the government should not support the people.” His hands-off attitude meant that the subsequent depression was over in about three years. Contrast that with FDR’s response, massive government intervention: the Great Depression was not over after eight years of the New Deal, and even then, World War II had to bail out the nation. Unfortunately, Cleveland is not remembered for any of this. Instead, he is remembered, if at all, as the only presidential loser to run again and win a second term.
There have been many economic recessions and even depressions in our history. It was one such calamity that did in Martin Van Buren’s reelection. Had Cleveland not already served a first term, it would have been difficult to see him winning an election after the Panic of 1893. George H.W. Bush once had a presidential approval rating of nearly 90%, so high that prominent Democrats of the era looked past 1992 to 1996 for their chances. Little did they know that a brief economic recession would so shape the election of 1992 that H.W. Bush would become a one-term president and a little-known Governor of Arkansas would take the Oval Office.
Of all the presidential losers, arguably none had a greater post-presidency than William Taft. The salient personality of Taft’s presidential elections was the borderline narcissist Theodore Roosevelt. TR was one of those presidents, like Washington, Jackson, and Reagan, so popular that they could all but designate their successors. The problem is that all of these successors ended as one-term presidents. There were, of course, particular circumstances around these defeats: Van Buren had the Panic of 1837 on his watch, and Taft faced a split party. But part of the problem with these presidents is that they were not their predecessors. In 1988 Republicans elected a third Reagan term, but the president was H.W. Bush.
TR’s backing was instrumental in Taft’s 1908 win. Unfortunately, Roosevelt was also the architect of his loss. Such was the power of the Republican coalition after the elections of 1894 and 1896 that it could win any election (until the Great Depression and 1932). The exception was 1912, when the egomaniacal Roosevelt broke the Washingtonian precedent and ran for a third term. His entry split the Republican vote and ushered Woodrow Wilson, one of the worst presidents, into office. Though embittered by the loss, Taft did not run for office again, instead accepting a teaching post at Yale and giving paid speeches. In 1921 President Warren G. Harding nominated Taft as Chief Justice of the Supreme Court. The Senate confirmed Taft by a vote of 61-4.
United States Supreme Court Chief Justice was the role Taft coveted even more than the presidency. In an article entitled “Chief Justice, Not President, Was William Howard Taft’s Dream Job,” writer Erick Trickey notes, “William Howard Taft never really wanted to be president. Politics was his wife’s ambition for him, not his own. Before he was Secretary of War or governor of the Philippines, Taft, an intellectual son and grandson of judges, spent eight blissful years as a federal appeals court judge. “I love judges, and I love courts,” President Taft said in a speech in 1911. “They are my ideals that typify on earth what we shall meet hereafter in heaven under a just God.” Trickey adds, “[A]s chief justice, Taft rejoiced in his reversal of fortune. On the bench, wrote journalist William Allen White, he resembled “one of the high gods of the world, a smiling Buddha, placid, wise, gentle, sweet.” To manage his declining health and reduce his famous girth, Taft walked three miles to work at the Supreme Court’s chamber in the U.S. Capitol building. Soon he was down to 260 pounds, a near-low for him. He rarely looked back at his years as a politician, except to bid them good riddance.”
Because many presidents and party nominees receive their opportunity well into their middle years, post-presidential life is often measured in years, not decades. There are two 20th-century exceptions to this: Herbert Hoover and Jimmy Carter. Of Hoover, historian Daniel Hamilton writes, “[S]till a relatively youthful man upon his defeat in 1932, the fifty-eight-year-old former President lived another thirty-two years before his death on October 20, 1964. Immediately after the inauguration of Franklin Roosevelt, Herbert Hoover retreated to his home in Palo Alto, California. For much of the 1930s—and, indeed, for decades to come—the public, and especially the Democratic Party, blamed Hoover for the Great Depression. Likewise, few Republicans in the 1930s wanted Hoover involved in party politics because of his negative standing in the popular mind. Wealthy and generous, Hoover did not need to work, but even the fishing that he loved could consume only so many hours of the week. From his home in Palo Alto, Hoover launched a series of bitter attacks on the New Deal in letters and essays.” Hoover spent much of these years getting foreign policy wrong. Though no fan of Hitler, he opposed American entry into World War II, the use of the atom bomb, and the Cold War.
It has now been 40 years since James E. Carter lost his reelection bid in 1980. His opponent in that race and both vice presidential nominees have all passed away. Unlike many presidents after their terms ended, Carter has kept his profile relatively high, working through his Carter Center.
As Carl Cannon has noted, writing for RealClearPolitics, “Ostensibly, Carter’s 2002 award was given for ‘his decades of untiring effort to find peaceful solutions to international conflicts, to advance democracy and human rights, and to promote economic and social development.’ Few would quarrel with that description; and if one were to consider only the Carter Center’s work to eradicate a disease known as river blindness, Jimmy Carter would have been a deserving recipient.” Of course, the Nobel Committee being what it is, “politics is never far from the surface of human affairs, and in 2002 Norwegian Nobel Committee Chairman Gunnar Berge sullied Carter’s award by blurting out in an interview that it ‘should be interpreted’ as a ‘kick in the leg’ to George W. Bush.” Many thought Carter should have received the award much earlier for his work on the 1978 Camp David Accords.
Less worthy has been Carter’s virulence on the Palestinian-Israeli conflict. “The bottom line is this: Peace will come to Israel and the Middle East only when the Israeli government is willing to comply with international law, with the Roadmap for Peace, with official American policy, with the wishes of a majority of its citizens — and honor its previous commitments — by accepting its legal borders. All Arab neighbors must pledge to honor Israel’s right to live in peace under these conditions,” reads an excerpt from Carter’s book, Palestine: Peace Not Apartheid. The title, provocative in itself, tells one all one needs to know about Carter’s stance. That the Palestinian leadership does not want peace, because its legitimacy rests not on its ability to lead but on its power to fight Israel, is lost on Carter. Carter lost the presidency for several reasons, not least because the job was too big for him. His post-presidency often shows the same lack of judgment and awareness.
If Carter’s post-presidency was somewhat controversial, George H.W. Bush’s was a model of what a former president can accomplish. In addition to focusing on his library, the Miller Center notes, “Bush also joined with former President Bill Clinton after a tsunami from the Indian Ocean struck Southeast Asia in December 2004. The two former Presidents created the Bush-Clinton Houston Tsunami Fund, a national fundraising campaign to assist damaged communities throughout the region.” This became one of four projects the two joined to assist other nations. George W. Bush said of his father, “He has two favorite 62-year-olds, myself and Bill Clinton.”
As of this writing, Joe Biden has won the presidential election of 2020. I supported Trump in 2016 and again in 2020, not because of his personality but because of his policies. Knowing the limitations of his character, it is not easy to believe he will go gently. As noted above, neither Adams attended his successor’s inauguration. John Quincy did not see the advent of a Jackson presidency in a favorable light: “He wrote in his diary that ‘The sun of my political life sets in the deepest gloom.’ Filled with sadness for the nation, Adams stayed in Washington for a few months before returning to his hometown of Quincy, Massachusetts.”
What Trump does next is hard to predict, but think speeches, a radio show, and a possible TV program. The looming question for Republican presidential hopefuls running against President Harris in 2024 (you read that right) will be twofold: who will Trump support, and, maybe, just maybe, will he do what only one other of the 46 presidents has done: run again and win.
The Lamentations of Our Times
October 2020
Lamentation: the passionate expression of grief or sorrow; weeping.
“If it were possible to cure evils by lamentation and to raise the dead with tears, then gold would be a less valuable thing than weeping.” Sophocles, Greek Playwright
“There is no harm in patience, and no profit in lamentation.” Abu Bakr, Arab Caliph
In a recent TikTok video, a young woman can be seen screaming that she wished she had never been born because of her whiteness. The daughter of a close friend of mine stated on Facebook that what happened to Jacob Blake made her “sick, absolutely sick to my stomach.” I should note that this friend is a multi-millionaire, and the daughter, who has a well-paying role in a giant insurance company, has never wanted for anything in terms of physical sustenance in her life.
Upon hearing of the death of Ahmaud Arbery at the hands of white racists, one of 223 (2015 stats) murders of blacks by whites, or about 0.0005% of the population of 43 million, Los Angeles Lakers basketball player LeBron James, an African American, stated that it is not even safe for him to go outside. Apparently, hundreds of millions of dollars are not sufficient for Mr. James to afford the security needed to beat those 1-in-187,000 odds. LeBron should immediately fill in the Olympic-sized, lushly decorated swimming pool he photographed himself in for his Instagram account, because the odds of him drowning are about 1 in 1,190. He is more likely to die of a dog attack than at the hands of a white human being.
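For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch in Python, using only the figures cited above (the 223 deaths and the 43 million population are the text’s own numbers); straight division lands near the cited 1-in-187,000 odds.

    # Back-of-envelope check of the odds cited above.
    # Inputs are the text's own figures, not independently verified.
    deaths = 223                # murders of blacks by whites (2015 stats, per the text)
    population = 43_000_000     # African American population (per the text)

    print(f"{deaths / population:.4%}")       # ~0.0005% of the population
    print(f"1 in {population // deaths:,}")   # ~1 in 192,000, near the cited 1 in 187,000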
Benjamin Crump, lawyer for Breonna Taylor’s family and for Jacob Blake, says this about America in 2020: “The unjust verdict in Breonna Taylor’s case affects the mental health of Black people in Louisville and nationwide. 1/3 of new mental health clients said Bre, not getting justice, was their reason for needing treatment.” The evidence for this? A TMZ article using a sample of 30, of whom 10 said the Breonna Taylor judgment was the reason for their mental illness the day after the decision. How Mr. Crump extrapolates that to “and nationwide,” which would comprise about 100 million people, is a little opaque. Still, for the Taylor family lawyer, who presumably lost his lawsuit, it sounds like good copy for Twitter.
The lamentation of being African American in the United States in 2020 is just one of many laments heard in America. Another surrounds the COVID-19 virus that emanated from China.
- “The world officially recorded 1 million deaths from Covid-19 in one of the most sobering milestones of the pandemic, but the real tally might be almost double that.” Bloomberg
- “Coronavirus pandemic could have caused 40 million deaths if left unchecked.” Imperial College London
- “Efforts to beat the coronavirus pandemic could cause over 1 million extra deaths from other diseases, experts warn.” CNN
What goes unsaid is a simple fact about COVID-19, one known in March and still known as of this writing in October 2020. According to the Centers for Disease Control, of a sample of 194,000 deaths, 58% were above the age of 75, and when the threshold is lowered to 65, the share rises to over 80% of all fatalities. COVID is not a plague in the traditional sense but one that preys on the old and infirm. Given that the average life span is 79, were these deaths COVID or “complications related to COVID”? In one case, Annie Glenn, widow of astronaut John Glenn, was counted as a COVID victim at 100 years of age. Is that a COVID death, or was COVID merely the final complication of old age? Nevertheless, on this data and a worldwide CASE COUNT of 40 million, one-half of one percent of the total world population, governments decided to put the entire planet into economic recession.
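A quick sanity check of that last percentage, sketched in Python; the 40 million case count is the text’s figure, while the world population of roughly 7.8 billion in 2020 is my assumption.

    # Rough check of the worldwide case-count share cited above.
    world_cases = 40_000_000          # worldwide CASE COUNT (per the text)
    world_population = 7_800_000_000  # assumed 2020 world population

    print(f"{world_cases / world_population:.2%}")  # ~0.51%, one-half of one percent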
The subject of hunger is also a lament of our time. According to the non-profit Ample Harvest, “although approximately one out of six Americans experiences food insecurity today, there is a more than adequate amount of food available. Hunger in America can be solved.” Just to be clear, even a non-profit seeking donations and political relevance is not saying we are starving, just this “food insecurity” thing. We are not starving; there is no famine, only food insecurity.
Here is another take on hunger: “These facts do not end the debate. Food insecurity is defined as the disruption of food intake or eating patterns because of a lack of money and other resources. In 2014, 17.4 million U.S. households were food insecure at some time during the year.” This according to the Department of Health and Human Services. And this from a different executive department: “The Agriculture Department announced this morning that 48 million Americans live in ‘food insecure’ households.” And, as Senator Bernie Sanders noted in 2012, in his usual measured style, “Nationwide, hunger is at an all-time high.”
Given that the United States is a substantial net exporter of food, this is all pretty alarming. Only it is not valid. James Bovard, writing for the Foundation for Economic Education, notes, “‘Food insecurity’ is a statistic designed to mislead. USDA defines food insecurity as being ‘uncertain of having, or unable to acquire, enough food to meet the needs of all their members because they had insufficient money or other resources for food.’” But as Bovard goes on to note, it is not at all what the government is implying: “USDA noted: ‘For most food-insecure households, the inadequacies were in the form of reduced quality and variety rather than insufficient quantity.’ The definition of ‘food insecure’ includes anyone who frets about not being able to purchase food at any point. If someone states that they feared running out of food for a single day (but didn’t run out), that is an indicator of being ‘food insecure’ for the entire year — regardless of whether they ever missed a single meal. If someone wants organic kale but can afford only conventional kale, that is another ‘food insecure’ indicator.”
A Journal of the Academy of Nutrition and Dietetics study concluded that “food insecure” adults are far more likely to be obese than “food secure” adults — indicating that a shortage of food is not the real health problem. According to the Journal of the American Medical Association, “seven times as many (low-income) children are obese as are underweight.”
Politicians, including the aforementioned Sanders, like to claim that this or that is unique in American history, the this or that always aligning with their political narratives. But in the case of obesity in America, the claim of historical uniqueness is true. According to the Centers for Disease Control, “From 1999–2000 through 2017–2018, the prevalence of obesity increased from 30.5% to 42.4%, and the prevalence of severe obesity increased from 4.7% to 9.2%.” Essentially, nearly half of Americans are obese. Given the million-year history of humanity, that in and of itself is unique. It is also odd that the young and middle-aged are comparable to older populations in obesity rates, given the greater activity and faster metabolisms of the young. Yet this is where unique gets piled onto exceptional in human history. As stated by the American Diabetes Association, “In contrast to international trends, people in America who live in the most poverty-dense counties are those most prone to obesity. Counties with poverty rates of >35% have obesity rates 145% greater than wealthy counties.” In every other society before the 19th century, lack of food would always hit the poorest people, usually a high percentage of the population, much harder than the rich. This is the first time, in about 14,000 years, that poor people have higher rates of obesity than rich ones.
The other troubling aspect is that the governmental entities providing the data are also the ones that benefit most from hunger issues. The Agriculture Department mainly exists to support food producers. Here is the boilerplate from its website: “We have the vision to provide economic opportunity through innovation, helping rural America to thrive; to promote agriculture production.” The dots connect themselves. Hunger in America means we need more food. Since food insecurity by nature affects poorer Americans, it is best distributed through governmental vouchers such as food stamps. The best, most consistent customer for farmers is the U.S. government. If the food insecurity issue is as severe as claimed, who benefits most? Farmers and the government departments managing the system.
So now that we have learned how awful and terrible it is to live in the United States of today, a nation rife with racial animosity, continuous hunger, and pestilence, it is time to compare the hellhole of America today with historical examples of these lamentations.
Racism
The constant invocation of “systematic racism” is a clarion call to roll over any barriers to the acquisition of political power. Yet our racism has not prevented African Americans from becoming millionaires, mayors, governors, CEOs, Senators, cabinet officials, and even president. Barack Obama’s post-presidency saw him and his wife follow up his two terms in office by earning nearly $100 million. Contrast this with the fate of groups in the medieval period. Geraldine Heng, writing for History Magazine, stated, “Slavery in the medieval period was also configured by race: Caucasian slave women in Islamic Spain birthed sons and heirs for Arab Muslim rulers, including the famed Caliphs of Cordoba; the ranks of the slave dynasties of Turkic and Caucasian sultans and military elites in Mamluk Egypt were regularly resupplied by European, especially Italian slavers; and the Romani (“Gypsies”) in southeastern Europe became enslaved by religious houses and landowning elites who used Romani slaves as labor well into the modern era, making “Gypsy” the name of a slave race.”
The point is that racism has always been with humankind, and in a nation of 330 million, if one tenth of one tenth of one tenth of one tenth of Americans are racist, that is 33,000 people. That is not a tiny number, and for those yelling “racist,” it is enough to say there is racism. But it is not nearly enough to establish the charge of systematic racism, the type of systematic racism that was official policy under the caliphs of Spain or the sultans of Egypt.
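That back-of-envelope figure is easy to replicate; a minimal sketch using the text’s own inputs:

    # One tenth of one tenth of one tenth of one tenth of 330 million.
    population = 330_000_000
    fraction = 0.1 ** 4    # (1/10)^4 = 0.0001

    print(round(population * fraction))  # 33000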
Disease
The Bubonic Plague, or Black Death, which occurred in the 14th century, is relatively well known to Western minds. Less known is the “Third Plague,” which began in the late 19th century and emanated from China, “culminating in 1907, where the death toll reached more than one million. Altogether, the third plague pandemic claimed around 12 million lives in India.” It also caused another 3 million deaths worldwide. Those 12 million died when India’s population was around 238 million, meaning that 5% of the population perished during the plague. In the United States, the COVID CASE COUNT is 7 million, or a little over 2% of the population.
But again, there are lockdowns, multi-trillion-dollar packages, and political recriminations. Worldwide COVID deaths are over 1 million, or 1/15th of the Third Plague’s toll, and that is out of a population nearly five times as large as at the time of the Third Plague. The 19th-century Indians would have laughed at us.
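The ratios in the two paragraphs above are likewise easy to verify; a short sketch using only the figures the text cites:

    # Comparing the Third Plague figures above with COVID-19 as of October 2020.
    # All inputs are the text's own numbers.
    india_deaths, india_population = 12_000_000, 238_000_000
    print(f"{india_deaths / india_population:.1%}")  # ~5.0% of India perished

    us_cases, us_population = 7_000_000, 330_000_000
    print(f"{us_cases / us_population:.1%}")         # ~2.1% of the US, and these are cases, not deaths

    plague_total = 12_000_000 + 3_000_000            # India plus the rest of the world
    print(f"1/{plague_total // 1_000_000}")          # worldwide COVID deaths are ~1/15th of the plague's toll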
Hunger
As stated by author Carly Dodd, writing on history, “The deadliest famine in history took place in China between 1959 and 1961. This catastrophe has often been referred to as one of the greatest human-made disasters, though regional droughts did play a part. The famine was caused by a combination of political and social factors by the People’s Republic of China, led by Mao Zedong. These policies, namely the Great Leap Forward, which began in 1958, and the people’s communes, created a disastrous environment that cost tens of millions of lives.” Here is a first-hand account from Yang Jisheng, a survivor of the famine, provided by the Guardian: “In barely nine months, more than 12,000 people – a third of the inhabitants – die in a single commune; a tenth of its households are wiped out. Thirteen children beg officials for food and are dragged deep into the mountains, where they die from exposure and starvation. A teenage orphan kills and eats her four-year-old brother. Forty-four of a village’s 45 inhabitants die; the last remaining resident, a woman in her 60s, goes insane. Others are tortured, beaten, or buried alive for declaring realistic harvests, refusing to hand over what little food they have, stealing scraps, or simply angering officials.” And by the way, this resulted from the type of socialist nirvana into which Bernie Sanders, Elizabeth Warren, and Alexandria Ocasio-Cortez would drag the United States.
There were famines in ancient Rome and famines in medieval China. A famine in Bengal, perpetuated by the practices of the British East India Company, a true exemplar of crony capitalism, saw the deaths of nearly 10 million Indians from lack of food. Again, the poor Indians died. The wealthy British were the obese ones.
And compare this to the lamentations of “food insecurity” in the United States. Is this to say we do not have severe problems here? We are virtually bankrupt economically. Our politics are increasingly devoid of serious engagement with the issues at hand. And there is racism, hunger, and disease within our borders.
But these three are not the real issues of our time; rather, they are the issues ambitious people use to scare American citizens into providing support and resources. There is no better place to live in humankind’s history than 21st-century, capitalist-based, limited-government America. Our real lamentation is that so few Americans truly understand what they have, and how willingly they seem ready to give it away.
Elections that ACTUALLY Did Change Everything
August 2020
This past week, at the Democratic National Convention, Senator Bernie Sanders stated, “This election is about preserving our democracy.” This sentiment was echoed by, well, nearly everyone else on the docket. In an August 29, 2019 piece for the Washington Post, author Avi Selk quotes the familiar refrain, “Now I know every election everyone says, ‘This is the most important election of our lifetime,’ but this time it actually is the most important.” Earlier that month, Joe Biden stated, “You all know in your gut, not because I’m running, that this is maybe the most important election, no matter how young or old you are, you’ve ever voted in.” Running in 2016, Donald Trump stated, “You’re going to look back at this election, and say this is by far the most important vote you’ve ever cast for anyone at any time.” Adds Selk, “President Harry S. Truman spent much of 1952 saying the same thing about that year’s election, in which he stumped for Democrat Adlai Stevenson to succeed him.” John F. Kennedy called his race the most important since Abraham Lincoln’s. In 2000, actor Alec Baldwin was so distraught at the prospect of a win by George W. Bush that he promised to move to Canada. He stayed, and a good thing for him: Tina Fey could not have saved his foundering career had he been in some hut north of Manitoba.
How does the discerning political analyst separate the truly critical elections from the rhetoric of politicians and their minions who wish to drive up donation levels? First off, there are specific, critical issues that neither party will touch.
Candidates love to talk about jobs and the economy and how they are the ones to eliminate inequality, but you will never hear them address the greatest inequality within the Republic: current government services vs. future ones, and the young vs. the old. For all of Bernie’s pablum about billionaires, there are not enough rich people to pay for the deficits created by wealth transfers for Social Security, pensions, and Medicare. So on what is arguably the most significant issue, there is consensus no matter who gets elected: entitlements are untouchable.
That being said, real transformative, landscape-altering elections contain two elements: the imposition of significant, long-lasting government institutions that rely on governmental subsidies, and the alteration of the parties themselves.
In the 1800s, five critical elections transformed the trajectory of the nation. The first was 1800 itself. The transformative moment came after the election. The electoral vote tied, meaning that Jefferson and Burr had to go to the House to determine the winner, but it was the first time in four elections that the incumbent lost. What would the authoritarian Adams do? Not a lot, as it turns out. Washington gets the right amount of glory for setting a precedent for the peaceful transfer of power, but he never lost. It is one thing to hire someone; there is always a honeymoon in that. It is another to see the character of a man when he is fired.
Twenty-eight years later came another transformative election. The incumbent lost (fittingly, it was another Adams), but it was the winner who was transformative. Andrew Jackson was the first non-founder elected president, and his refurbished Democratic Party won six of the eight presidential elections between 1828 and 1856. As Britannica.com states, “The election of 1828 was arguably one of the most significant in United States history, ushering in the era of political campaigns and paving the way for the solidification of political parties.” Jackson also introduced an authoritarian streak that was lacking in his six predecessors.
Whether it was destroying the Second Bank of the United States, stopping secession talk from South Carolina, or sending thousands of Native Americans to their doom on the Trail of Tears, Jackson drove the narrative. Because the actual machinery of government was small and Congress held the power, the country would wait another 70 years to see truly imperial presidential power in the person of Teddy Roosevelt. Jackson did not build the machinery of government as 20th-century presidents would do, but he did provide the template of what a president could accomplish if he pushed hard enough.
In 1854, old Whig elements and members of the Free Soil Party merged to form the Republicans, but their impact would not come for another six years. The 1860 election, in which Abraham Lincoln won the White House, speaks for itself, and from this point the Democratic Party, once the dominant party of the nation, became far more regionalized, with its primary base in the South. According to History.com, “The election of 1860 was one of the most pivotal presidential elections in American history. The main issue of the election was slavery and states’ rights. Lincoln emerged victoriously and became the 16th president of the United States during a national crisis that would tear states and families apart and test Lincoln’s leadership and resolve: The Civil War.” This realignment subsequently led to lasting policy changes.
The Democratic Party of the late 19th century was anti-protectionist and, in the South, engaged in the voter suppression of Southern, Republican, African Americans. Yet one of the issues that divided both parties in this period was the concept of easy vs. hard money.
These policies ended with two elections, one a rare mid-term transformation and the other the next presidential election. The election of 1894, in the wake of the Panic of 1893 and the subsequent depression, saw the single largest turnover of seats in the history of the House of Representatives. Over 125 Democrats lost their positions. In the wake of the worst defeat in Congressional history, the Democrats abandoned two of their party’s key planks. The first change was to drop bimetallism in favor of a focus on easy-money silver. The second was to become more populist and even anti-business, evolving into a pro-labor party. This change was cemented by the choice of populist William Jennings Bryan as their standard-bearer in 1896.
Bryan’s opponent, William McKinley, was a Civil War hero, ex-Governor of Ohio, author of the McKinley Tariff, and protégé of businessman Mark Hanna. A 2020 Democrat would not recognize the party of 1892, especially under the leadership of Grover Cleveland, but a modern Democrat would find much to like in William Jennings Bryan. Teddy Roosevelt was technically a progressive president, though that was more about his personality than his policy beliefs. Woodrow Wilson’s New Freedom, FDR’s New Deal, LBJ’s Great Society, and the current flirtation of today’s Democrats with socialism all stem from the 1894 and 1896 elections in which the Democrats took on their populist veneer. According to the Miller Center blog, in an article written by Lewis Gould, “The Republican victory reflected a winning coalition of urban residents in the North, prosperous Midwestern farmers, industrial workers, ethnic voters (except for the Irish) reform-minded professionals. It launched a long period of Republican power lasting until 1932, broken only by Woodrow Wilson’s victory in 1912, which occurred principally because of a split in the Republican Party.”
The historic nature of 1932 was that this election was a twofer. The New Deal imposed a host of permanent governmental institutions, such as Social Security, and the party itself was altered. For nearly 70 years, African Americans had been mostly Republican, logically voting for the party of Abraham Lincoln, Thaddeus Stevens, and Charles Sumner. It was the late-19th-century Democrats who exercised voter suppression and intimidation, and under Woodrow Wilson, lynchings in the South averaged over 50 annually. It was under Calvin Coolidge that bold law enforcement reduced these heinous acts to fewer than 10. Just this year, Joe Biden did something that no other presidential nominee has accomplished.
In the primary, he lost Iowa and New Hampshire, badly, and yet came back to win the nomination. His entire success can be pinned on the support of African American Representative Jim Clyburn and the black vote in South Carolina. It is core African American voters who drive the direction of the Democratic Party, and that influence began with the election of 1932. In an article entitled “The Most Consequential Elections in History: Franklin Roosevelt and the Election of 1932,” written for U.S. News and World Report, author Kenneth Walsh notes, “The electorate had, in effect, taken nearly 150 years of tradition upholding limited government and, in their anxiety and anger, thrown it out the window.” There is an argument that, with the defeat of the Bourbon Democrats in 1894, Teddy Roosevelt and Woodrow Wilson had laid the groundwork for this belief, but it is beyond doubt that in 1932 this concept of permanent governmental involvement took hold. “[H]e became one of the nation’s most beloved presidents and built a vast and powerful governing coalition. It consisted in part of working-class whites, union members, immigrants, African Americans, Southern whites, Catholics, Jewish voters, and city dwellers. This coalition dominated American politics for more than a generation—another key part of FDR’s legacy,” adds Walsh.
In the whirlwind of the Great Depression, African Americans changed allegiance, a movement cemented by another transformational election in 1964. In that year and in 1965, it was Republican legislators who got Johnson’s Civil Rights and Voting Rights Acts through Congress, but people remember presidents, and blacks remember Johnson. As the website HistoryCentral puts it, “The election of 1964 was the first election, since 1932, that was fought over real issues. This election brought ideology into American politics.” Lyndon Johnson’s 61% popular vote tally, against just 38% for opponent Barry Goldwater, cemented several movements.
In 2008, a historic election that made Barack Obama the first African American president, Obama held the House of Representatives and the Senate and enjoyed a 4-4 Supreme Court with Anthony Kennedy as the swing vote. And with all that, he did not fundamentally transform the nation as he wished. The Affordable Care Act has had less effect on healthcare than George W. Bush’s drug benefit did. The Trump Administration has successfully curtailed the Consumer Financial Protection Bureau, and the stimulus bill was just another trillion added to the ever-growing debt bomb. Other than that, most Obama items, including the ban on drilling and the Iran deal, have been rescinded. The difference between Obama and a Harris Administration (Biden will not finish his term) is the appetite for institutional reform. If the next Congress adds two new blue states in the form of D.C. and Puerto Rico, packs the Supreme Court, creates more new institutions such as the Consumer Financial Protection Bureau, and repeals the Second Amendment, then this will be the most important election since 1964 and 1932, the two 20th-century elections that mattered.
Of the 58 presidential elections held since Washington’s win in 1788, only about six or seven were indeed among the “most important of our lives!”
The Historical Context of Class Envy, Marxism and the Road to Power
August 2020
In 133 BCE, the Roman people elected Tiberius Gracchus as Tribune. Building on centuries of class rivalry, Tiberius’s goal was to redistribute “state” land to the poor. In Mary Beard’s magisterial history of Rome, SPQR: A History of Ancient Rome, the author states, “Whatever the economic truth, however, he certainly saw the problem in terms of the displacement of the poor from farming land.” But was this entirely about the poor? Beard contends that behind the populism lay something else: “Some observers at the time, and since, claimed that far from being genuinely concerned with the plight of the poor, Tiberius was driven by a grudge against the Senate, which had humiliatingly refused to ratify a treaty he had negotiated.” The rather high-handed nature of his actions, along with the understandable animosity of wealthy Romans and Italians who lost the opportunity to profit from the land, led to the murder of Tiberius that same year, 133 BCE. And Americans lament the state of politics today.
Ten years after Tiberius’ death, his populism, based on class warfare, was taken up by Gaius Gracchus, his younger brother. Gaius planned to assist the poor of Rome with a grain supplement. This grain provision was not quite welfare in the 20th-century understanding but rather having the state subsidize food purchasing. Like all state-provided giveaways, this program was converted from expediency to entitlement, and further expanded by subsequent Roman governments extending into the Imperial age.
Additionally, because the state provided it, it soon took on the concept of “free” provision. However, much of it was subsidized in turn by taxing the provinces or extracting concessions from allies. As Beard states, this was not just about helping the poor, “The debate was about who had a claim on the property of the state and where the boundary lay between private and public wealth.”
From 1836 and for the next ten years, a group in Britain called the Chartists, led by William Lovett, made a series of demands upon the British government. These demands were for greater political participation based on the needs of the working class. In 1832, voting rights had been extended to the middle classes in Britain but depended on property ownership. It was mainly this property requirement that the Chartists wished to remove. The charter itself was called the People’s Charter, and Lovett is described as an “activist.” Though the United States is blessed (or cursed) with thousands of these creatures in 2020, they were rare in the 19th century. Lovett himself stated, “The franchise being confined to a small portion of our population, and that portion controlled and prejudiced to an incalculable extent by the wealthy few.”
These debates, nearly 2,000 years apart, are ones in which an avowed socialist, or Marxist, would revel, not just in theoretical or economic terms but in political ones. When Barack Obama made his infamous “you did not build that” speech, he touched on the divide between public spheres, such as the roads leading to a given business, and the private company itself. The obvious rejoinder is that the road does not get built without the tax revenues collected from the business. The state can only produce “revenue” in the context of taxation.
Writing in 2019, columnist George Will noted of socialism, “This means having government distribute, according to its conception of equity, the wealth produced by capitalism. This conception is shaped by muscular factions: the elderly, government employees unions, the steel industry, the sugar growers, and so on and on and on. Some wealth is distributed to the poor; most goes to the ‘neglected’ middle class. Some neglect: The political class talks of little else.” Will is describing American politicians who know what the Gracchi understood: he who does the redistribution gets the votes.
One of the reasons that socialism, and Marxism, still prevails, unlike fascism, is that socialism sets aside any debate between public and private. It carries the perception that when everything is public, everything will accrue to the benefit of those who need it most, the poor. In an imagined, revised Obama speech, the phraseology would be that you did not build any of it; the state did. If the state builds the road and the factory, runs the farm, and decides who gets what, then the citizen is free to participate in any of those activities without the anxiety of success or failure. As Marx noted in the Communist Manifesto, “In place of the old bourgeois society, with its classes and class antagonisms, we shall have an association in which the free development of each is the condition for the free development of all.” The ability to pick and choose, to be an engineer and an expert in comparative literature, and to enjoy the same fruits regardless of vocation is pretty heady stuff for that comparative-literature devotee.
The problem with the utopianism of Marxism is that what the state can give, the state can take away, and since the citizen has surrendered his rights to the state, there is no recourse. This unchecked state power is one of the many reasons why Marxism fails at its core goal. In communist countries, the inequality between the haves and have-nots always increases as the rulers take more power and wealth.
And then there is the omnipresent fear factor with capitalism. Because there is no clear, discernible, top-down plan, markets will rise, and on occasion, markets will fall. The fact is that it was often state intervention that caused the fall, Hoover’s Smoot-Hawley Tariff and Barney Frank’s demand for low-income housing loans being but two examples. The relationship between state interventions and subsequent economic crises is one of the most reported stories of economic history.
Yet the fear factor is in play. In a 2018 article for the Financial Times, author Adam Tooze notes, “the world of globalized free-market capitalism we inhabit today has much in common with the world about which Marx wrote in the mid-19th century. ‘It is the Marx of the 19th century,’ he tells us, ‘who can attract the people of the twenty-first.’ What speaks to us today is the true Marx of the mid-Victorian period, not the traduced Marx of the 20th-century state ideologies.” Whether it be Rome in the second century BCE, 19th-century Europe, or the 21st-century United States after the bank meltdown of 2008, there are financial calamities that adversely affect the poorest in society. These are perpetually fertile ground for the seeds of Marxism. Under the aegis of limited-government, capitalist systems, humanity enjoys prosperity that would have been unheard of in Marx’s 19th century. But there are still fears, and Marxists need to make a living too, so the fears are stoked.
Populism can take on many different forms, and fascism was one of them. But in contrast with the worst mass murderers of communism, who primarily targeted their own people for destruction, fascism will be forever linked with World War II in general and with Adolf Hitler in particular. Beyond those among the German people whom he executed, Hitler targeted millions of non-Germans. When Hitler built his totalitarian regime in the 1930s, the world looked away, even when he took Austria and Czechoslovakia. Only when he invaded Poland did France and England intervene. And even then, the United States stayed on the sidelines, and the Soviets went into Poland from the other side. What if Stalin had exercised an openly genocidal campaign against the Bulgarians, Iranians, or Serbians instead of the Cossacks and Ukrainians? One of the clear understandings of Marxist thugs is to keep their oppression within their borders, counting on the weakness and capitulation of other nations to prevent outside interference.
Another reason for the endurance of class warfare is that it is simply good politics. Whenever a politician, whether Tiberius Gracchus or Bernie Sanders, speaks of inequality, it is always about relative positions, not absolute outcomes. The Rome of Gracchus’s day was the preeminent state of its time, having vanquished every foe from the Carthaginians to the Spanish to the Greeks. To be a Roman was infinitely better than to be one of the poor sods from Carthage after Cato the Elder got through with it. But the nascent Roman Empire brought incredible wealth to individual families such as the Scipios, and Gracchus could exploit that envy for political gain. The people of the United States spend $75 billion on sports entertainment and nearly another $50 billion on streaming services. But there are inequalities. A person worth $10 million has a minuscule fraction of Jeff Bezos’s wealth, but that does not make the multimillionaire poor. Talking of relative wealth between wealthy Americans and impoverished people, say, in Cuba does not accrue votes or power. So class envy and watered-down Marxism it is.
Will notes, “The temptress of socialism is constantly luring us with the offer: “give up a little of your freedom, and I will give you a little more security.” As the experience of this century has demonstrated, the bargain is tempting but never pays off. We end up losing both our freedom and our security.”
Writing in National Review in 2014, Tim Cavanaugh noted, “Marx has never vanished from the academy. The stubborn refusal of applied Marxism to produce anything but mass murder merely led to efforts to reframe the philosophy.” One of the favorite rejoinders of the class warriors, with their Marxist ideology, is that it has never been correctly implemented. Given that at least 15 nations have attempted some form of Marxism, and all came to deeply regret the attachment, it is safe to say it simply will not work. Yet this also misses the point of Marxism itself. As Cavanaugh states, “Defining the Soviet and Maoist states as failed experiments in social justice misses the point. They were attempts to put the essential violence of Marxism in motion, and they succeeded on a spectacular scale. Violence is not incidental to Marx. It’s throughout his work; it’s there between attacks on ‘vampire capital’ and ‘Jewish hucksterism.’” Some samples:
- “The only antidote to mental suffering is physical pain.”
- “The Communists disdain to conceal their views and aims. They openly declare that their ends can be attained only by the forcible overthrow of all existing social conditions.”
- “The meaning of peace is the absence of opposition to socialism.”
Comments like these remain attractive to movements in 2020, including the Black Lives Matter movement, which has avowed Marxists leading the charge and positions itself accordingly, with stances such as the rejection of the two-parent household in favor of “the collective.” The logical extension of class envy and class warfare is the accumulation of power for the state. The result of collectivization is a disincentive to improve. The result of political or economic coercion is violence. These are horrible things, but marvelous things if one wishes to accrue political power.
Defund the University
July 2020
For centuries, the United States has relied upon the university system to educate our young.
There, they learned of religion, civics, languages, culture, writing, philosophy, and history. There were absolute basics: Platonic philosophy, critical thinking, and the values captured in the Enlightenment. The philosophers who shaped the most successful, most prosperous nation ever to have existed in the history of humanity were part of the canon. These philosophers, such as Locke and Montesquieu, shaped the vision for this nation, a vision that still exists. “When you consider what an enormous windfall gain it is to be born in America, it is painful to hear some people complain bitterly that someone else got a bigger windfall gain than they did,” noted Thomas Sowell. But these beliefs, and the people who built them, are no longer taught. It is time to rethink the university.
The concept of the university goes back millennia; the University of Bologna, founded in 1088, has never been out of operation. And the value of education is attested by some of the greatest luminaries of the past 100 years. Nelson Mandela stated, “Education is the most powerful weapon you can use to change the world.” Doctor Martin Luther King Jr. noted, “The function of education is to teach one to think intensively and to think critically. Intelligence plus character – that is the goal of true education.” G.K. Chesterton said, “Education is simply the soul of a society as it passes from one generation to another.”
I try to avoid what I call the tyranny of the anecdote. These are personal stories that take individual life experiences and extrapolate an entire narrative about complex systems through the vision of a single person. One may have had an incident with an immigrant. But this does not mean that all immigrants are bad. One may have lost a job, but this does not mean that capitalism is inherently evil or exploitative. A boss may have treated one differently, but this does not mean that the system necessarily is discriminatory. I write all of this and then go back on my own rule and provide the following anecdote.
My undergraduate degree was in history with a minor in education. This major meant I was able to secure two, and only two, types of jobs: university professor or high school social studies teacher. There are about 4,000 history professorships in the United States. Given an average 40-year career, that means roughly 100 openings per year throughout the United States. And of these, many either required, or wished for, candidates with master’s degrees or Ph.D.s. Having already accrued significant debt, two or three more years of schooling was not in the cards. Many history majors do pursue law degrees or journalistic vocations, but those paths too would have meant more schooling, so high school social studies was the only logical path.
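For what it is worth, the back-of-the-envelope arithmetic behind that estimate looks like this (a minimal sketch; the 4,000 positions and the 40-year career are the rough assumptions stated above, not precise figures):

```python
# Back-of-the-envelope estimate of annual openings for history professors.
# Both inputs are the rough assumptions from the paragraph above.
total_positions = 4_000  # approximate history professorships in the U.S.
career_length = 40       # assumed average career, in years

# If careers average 40 years, about 1/40 of positions turn over each year.
annual_openings = total_positions / career_length
print(f"Roughly {annual_openings:.0f} openings per year")  # ~100
```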
I was hired at one high school as a long-term substitute at reasonably low pay. I was able to sustain myself but not pay down any debt, much less begin to save money. At the end of a highly successful year (based on principal, dean, fellow teacher, student, and parental feedback), I was rewarded with being laid off. At the ripe old age of 23, I was the junior staffer. The incompetence and burnout of several of the other social studies teachers were beside the point. The unions reflexively protect all members regardless of ability.
A merit system would have undermined the union’s model, so the seniority system ruled, as it does to the present day. Frustrated beyond measure, I left teaching, something I miss to this day and compensate for with my blog and podcast. But back then, the bills had to be paid, and I was not going to wish for a Bernie Sanders to come along and transfer my debt to some other poor sod.
Upon the advice of my father, I went into business, sales to be exact. I started at a small paper merchant that, if memory serves, was not dissimilar to Dunder Mifflin, with one notable distinction. Instead of the semi-incompetence of Michael Scott, I was trained by an entrepreneur named Sammy Chaiken. One of the first things Sammy taught me was that the Hollywood shibboleth of the fast-talking salesperson is a fiction. Instead, the best sales reps listen, probe, discover the specific needs of their clients, and endeavor to deliver something others cannot. I learned the importance of product superiority, or the lack thereof, of quality service, and of low pricing.
Given the challenges of the market and Sammy’s frugality around employee pay, I left his firm after two years to work at a billion-dollar company. Later I left sales for my more natural milieu of marketing. And the lucrative nature of that vocation is how I am now able to spend my time on this blog. And what did all of those classes, and the hundreds of thousands of dollars spent on them, mean? Almost nothing. I possess a few fond memories of those days and some friends, but the successes in my life came despite, not because of, my university experience.
I later obtained a Master of Business Administration from the right school in Chicago, but again, my real-world learning came on the job. Simple question: had I never attended a single university class, would my career have been negatively altered? My answer is no.
Why the personal digression, the very tyranny of the anecdote I oppose? Because this scenario is not anecdotal but has played out over millions of experiences within the United States. There are currently over 1 million individuals with a history degree, and of those, the vast majority do not use their education, and the hundreds of thousands of dollars spent on it, in their jobs. And history is only one of the many majors universities offer that have no connection to real-world opportunities. As Bryan Caplan notes in his astounding The Case Against Education: Why the Education System Is a Waste of Time and Money, education is not about getting a job; it is about signaling your perceived value to potential employers and limiting the benefits available to those who do not possess certain degrees. “There’s two ways to raise the value of a diamond. One of them is, you get an expert gemsmith to cut the diamond perfectly, to make it a wonderful diamond.” That adds value by making the stone objectively better, like human capital in the education context. The other way: “You get a guy with an eyepiece to look at it and go, ‘Oh yeah, yeah, this is great—it’s wonderful, flawless.’ Then he puts a little sticker on it saying ‘triple-A diamond.’” That’s signaling. The jewel is the same, but it’s certified.
Caplan then adds a piece about credential inflation, using the following analogy. You are at a concert, and you stand up to see better while everyone else remains seated. You now have the best view. But if everyone else stands up, you cannot see as well, and of course, if you sit down, you cannot see at all. The result is “credential inflation.” Today a college degree is a prerequisite for jobs that didn’t previously require one: secretary, rental-car clerk, high-end waiter. And to return to the concert analogy, if you’re unable to stand, you’re objectively worse off than before. “People who are in the bottom 25% of math scores—their odds of finishing college, if they start, are usually like 5% or 10%,” Mr. Caplan says. They end up saddled with debt and shut out of jobs they may be perfectly capable of performing.
Other offerings from universities have little to do with jobs. The first is the social networking available to students. It is hard to argue with that, as many make lifelong friends and even meet spouses on campus. But is college worth $150,000 as a cocktail party?
Caplan states, “The heralded social dividends of education are largely illusory: rising education’s main fruit is not broad-based prosperity, but credential inflation.” Caplan goes on to write, “The rise of the Internet has two unsettling lessons . . . First: the humanist case for education subsidies is flimsy today because the Internet makes Enlightenment practically free. Second: the humanist case for education subsidies was flimsy because the Internet proves low consumption of ideas and culture stems from apathy, not poverty or inconvenience. Behold: when the price of Enlightenment drops to zero, it remains embarrassingly scarce.” Currently, there is a popular online learning service called MasterClass; for about $15 per month, you can access about 80 online courses. Instead of learning astrophysics from some TA, or from a professor who only teaches so that she can write her books, you get Neil deGrasse Tyson. Instead of the actress who starred in off-off-Broadway remakes of Mamet, you get Academy Award winner Natalie Portman.
So colleges are not necessary for obtaining knowledge. Historians, or the dreaded history buffs, are those at the dinner party who drone on about Eisenhower’s dog while no one else at the table cares. If anyone did care, they could Google Eisenhower and save the $2,500 per class required to learn about him.
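To put those two price tags side by side, here is a minimal sketch using only the figures cited above, which are themselves rough:

```python
# Rough cost comparison using the figures cited above.
masterclass_monthly = 15.00    # MasterClass subscription, per month
college_class_cost = 2_500.00  # cited cost of a single university class

annual_subscription = masterclass_monthly * 12  # $180 per year
years_of_masterclass = college_class_cost / annual_subscription
print(f"One university class = {years_of_masterclass:.1f} years of MasterClass")  # ~13.9
```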
Another argument is that one learns critical thinking skills, something more difficult while sitting in one’s room and looking at Zoom broadcasts. Richard Warch, a highly successful president of Lawrence University in the Midwest who was named one of the 100 most effective college presidents, put the mission this way: “to provide an education fit for free men and women in a democratic society.” Students must “present and follow an argument, develop a standard of abstraction, challenge assumptions, question ruthlessly, and inquire rigorously.” Warch went on to state his concern that an Orwellian purpose may preclude this vision: “In Orwell’s Oceania, the party has two aims, conquer the whole surface of the earth and extinguish once and for all the possibility of independent thought.” So there are two scenarios. The first consists of training people to think freely, to think critically, to challenge assumptions. The second is to subscribe to a predetermined narrative that presents one view and squelches all others. Which do you believe is dominant within the academy today?
The 40-plus-year march of the academy towards something akin to Oceania has been steady in its progress, but in the wake of the murder of George Floyd, with the subsequent mass protests, boycotts, cancellations, name changes, monument defacing, and rewriting of the historical narrative of the United States, it has been put on steroids. The fundamental goal, the reworking of the limited-government, capitalistic United States into something far more akin to a socialistic entity, is not a new concept, but goes back to the late 1800s. Even the idea of using education as the Trojan Horse to replace the old order with the socialistic new one is not fresh. In his 1980 tome, A People’s History of the United States, Howard Zinn encapsulated the concept of America being evil. Zinn cleverly pushed his book into social studies classes as an “alternative” way to look at the United States. Alternative indeed. Zinn’s worldview, as stated in his work, is one of “executioners and victims.” The U.S. is the executioner. Liberal curriculum directors adopted the book as a moral counterweight to the idea of the United States as a good place. This same concept, using education to rework the foundations of American society, is even more prevalent within the academy.
Here is where Caplan, who gets so much right, gets two things wrong. First, though there is learning apathy around several subjects in the academy, the victimology so prevalent there is not one of them. Americans are fundamentally decent human beings who believe in justice. They are far more likely to listen to, and engage with, a professor who describes a victimized group and then implores his impressionable young students to help. Though there is apathy for much of what is taught in the University, there are enough malleable minds to teach, and enough acolytes to convert, to make an indelible impact on the future of the nation.
Second, it is Caplan’s contention that the leftist, ideological swimming pool of the modern academy has little effect on the minds of students; indeed, one of his key contentions is that the university has little real effect at all. But in 2016, avowed “democratic” socialist Bernie Sanders was the main challenger to the eventual nominee. Then, during one period in 2020, he was the front-runner. One can say that the Democratic Party rejected him both times, but had Sanders run as recently as 1992, when Bill Clinton obtained the nomination, he would have been, at best, a fringe candidate, not the runner-up. And had Bill Clinton run in 2020, he would have been seen as too moderate, with his views on limited government, balanced budgets, and support of NAFTA. When Nancy Pelosi and Barack Obama sit at the center of their party, flanked on the left by the likes of Elizabeth Warren and Alexandria Ocasio-Cortez, the shift had to come from somewhere. And we know in the case of Sanders, it came from the young. According to a March 13 column in the New York Times, “The youngest voters in Michigan, those 18 to 29 years old, gave the vast majority of their votes to Sanders. The next youngest voters (age 30 to 44) also backed Sanders, 52 percent to 42 percent. But they were swamped, in turn, by the next oldest voters (age 45 to 64) who backed Biden 62 percent to 26 percent.” In 1912, socialist candidate Eugene Debs ran as leader of his own party because neither of the two major parties would have supported him. He obtained 6% of the vote, the largest share for a socialist prior to Sanders. This leftward tilt did not emerge from the ether.
So what does this new narrative consist of? Instead of learning a nuanced lesson about a flawed but essential person such as Jefferson, students learn he owned slaves. End of story. Nothing about his founding of the University of Virginia. Little about the expansion of the United States in the Louisiana Purchase. Nothing about his decisions on the location of the capital, nor his role in France. And then there is that other thing. Jefferson is the one figure in history who took the fundamental values of the Enlightenment and articulated them into a concept of liberty never matched in the history of humanity. The idea of freedom was not new. But the idea of how freedom could be translated into the foundation of a nation was his. Yes, he owned slaves. Yes, slavery is an evil institution. But it was his words that inspired the eventual end of slavery, an end achieved through the sacrifice of hundreds of thousands of northern whites. Additionally, context matters: in 1776, the Chinese, Indians, Turks, French, Dutch, and British all held slaves.
The academy is no longer interested in educating but in indoctrinating. Here are a few samples of what students can now study.
- Taking Marx Seriously (Amherst)
- FemSex (Carleton College – should Minnesota be allowed to have universities?)
- Hidden Spaces, Hidden Narratives: Intersectionality Studies in Berlin (Colorado College – can we be more specific?)
- Kitchen Culture: Women, Gender, and the Politics of Food (Hamilton – Students will learn how the Tomato and the Apple have been oppressing blueberries, asparagus, and rutabagas).
- From Lenin to Pussy Riot: Gendering (Post) Soviet Russia (the University of Maryland – Putin keeps his shirt on for this one).
- The Politics of Kanye West: Black Genius and Sonic Aesthetics (Washington University – Do they still think he is a genius once they learn he supports Trump?)
- Lemonade: Black Women, Beyonce & Popular Culture (University of Texas)
- Politics of Struggle: Race, Solidarity, and Resistance
- Constructing Race
UCLA, one college in one system, has over 24 courses in its African American studies track.
Beyond fulfilling specific requirements, what could be done with this major apart from sociology, teaching, diversity administration roles, advocacy, and activism?
According to Education.org, there are currently 18.2 million students enrolled in post-secondary education. Of these millions, most will take courses in history, sociology, anthropology, or related work in the humanities. It is commonly acknowledged that liberal dogma reigns in moviemaking, in newspaper editorials, and, with rare exceptions, in T.V. newsrooms.
Yet how is it that Americans send their children to schools and colleges, spend tens of thousands of dollars annually per student, and never question the level of diversity of thought that students experience? In a 2005 study conducted by Stanley Rothman of Smith College and Neil Nevitte of the University of Toronto, “72 percent of those teaching at American universities and colleges are liberal, and 15 percent are conservative.” The website Econ Journal Watch, run by three economics professors, issued a 2016 article entitled “Faculty Voter Registration in Economics, History, Journalism, Law, and Psychology.” The premise of the article is quite simple: look up the publicly available voter registrations of professors in the academy. “The 40 universities we investigated were determined, in early 2016, by starting at the top of the U.S. News and World Report list ‘National Universities Rankings.’”
They then broke up their ratios by the fields of Economics, History, Journalism/Communications, Law, and Psychology. Of the 7,243 professors they looked up, 3,623 were registered Democratic and 314 Republican, for an overall D:R ratio of 11.5:1 (the arithmetic is sketched after the list below). By field:
- History – 33.5:1
- Economics – 4.5:1
- Journalism/Communications – 20.0:1
- Law – 8.6:1
- Psychology – 17.4:1
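As promised, here is a minimal sketch of that arithmetic, using only the head counts and per-field ratios reported in the article:

```python
# Reproducing the Econ Journal Watch arithmetic from the figures above.
democrats, republicans = 3_623, 314
print(f"Overall D:R ratio = {democrats / republicans:.1f}:1")  # ~11.5:1

# Field-level ratios as reported (the per-field head counts are not
# given in this piece, so these are quoted rather than recomputed).
field_ratios = {
    "History": 33.5,
    "Economics": 4.5,
    "Journalism/Communications": 20.0,
    "Law": 8.6,
    "Psychology": 17.4,
}
for field, ratio in sorted(field_ratios.items(), key=lambda kv: -kv[1]):
    print(f"{field}: {ratio}:1")
```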
According to the Audubon Society, there is an insidious bird called the Common Cuckoo (Cuculus canorus, for bird nerds). This particular avian is “the famous bird of Europe whose voice is imitated by cuckoo clocks (and whose call, coo-coo, gave the name to the entire cuckoo family). It is well known as a brood parasite: females lay eggs in the nests of smaller birds, and their hapless ‘hosts’ raise only young cuckoos. A common migratory bird across most of Europe and Asia, it regularly strays to the western Alaskan islands in late spring and early summer.” What the Audubon does not tell you is that the cuckoo young will shove the host bird’s young out of the nest, to their deaths. Imagine that the values we hold dear, the values upon which this Republic is built, are the host bird’s young. This willful ejection is what the modern University is doing today. We are the host birds who keep feeding the very means of destruction of what we hold dear, and we are not fully aware we are doing so.
One example is a book taught in many universities, including a Catholic university run by Augustinian monks, and prominently displayed on Barnes and Noble’s website to help the uneducated with their wokeness. The New Jim Crow, by Michelle Alexander, argues that the social and cultural problems surrounding African Americans are the result of an incarceration program, on the part of whites, that removes black men. “The nature of the criminal justice system has changed. It is no longer primarily concerned with the prevention and punishment of crime, but rather with the management and control of the dispossessed.” Alexander goes on to note, “Rather than rely on race, we use our criminal justice system to label people of color ‘criminals’ and then engage in all the practices we supposedly left behind. Today it is perfectly legal to discriminate against criminals in nearly all the ways that it was once legal to discriminate against African Americans.”
And here is the most insidious piece of Alexander’s, and by extension the academy’s, belief system. One need not be a member of the KKK or even profess bigotry. Liberals such as Alexander know what is in one’s heart: “racial caste systems do not require racial hostility or overt bigotry to thrive. They need only racial indifference.” In other words, if I do not subscribe to the narrative professed by Alexander, then I am guilty. If I hold a belief system shaped by the likes of Thomas Sowell or Jason Riley, two prominent African American conservatives, then I am a racist. This is akin to McCarthyism, where it was not enough to be an anti-communist; one had to be actively anti-communist, professing allegiance to the United States in a way the House Un-American Activities Committee approved, to be a “real” American. This intolerance was also apparent in the days of the Inquisition, in which indifference to the Catholic religion was seen as a sign of heresy. And we understand what fate awaited those heretics. Sir Thomas More was not killed because he opposed Henry VIII’s regime. He was murdered because he did not openly support it.
Here are two problems with Alexander’s narrative that are not acceptable to the academy and would not be found there. First, police tend to respond to calls. If the calls come disproportionately from one race, is that racism? Would it be better if police based responses on local percentages of race? And nothing in Alexander’s work refutes the gang and drug violence to which police are responding. Second, assuming police departments are targeting by race, and not by response, most of the incarcerated emanate from communities where liberals, progressives, and African Americans have held key positions of power for decades. The city of Detroit has had one white mayor since the 1970s. Philadelphia has had as many black mayors as white ones in the past 40 years. Of the previous five mayors of Chicago, three have been African American. In Baltimore in 2015, there was a black mayor. In Chicago during the Laquan McDonald fatality, there was a white mayor, but he had been chief of staff to Barack Obama. In Tacoma, Mayor Victoria Woodards decries the death of Manuel Ellis at the hands of the police department that she oversees. Keisha Lance Bottoms has been the mayor of Atlanta for the past three years, yet she is not taking accountability for the police actions in the killing of Rayshard Brooks. The problem with the “caste system” narrative extolled by Alexander and so many others in the academy is that the top of the caste too often is not white. The narrative of a bad shooting in the case of Ferguson suspect Michael Brown is a case in point. The “hands up” narrative did not survive scrutiny by Eric Holder, the Attorney General at the time, who was working for his boss, Barack Obama. Is Alexander suggesting that these men are at the bottom of a caste system?
Of course, in the modern academy, none of these points is discussed. There is nothing but the narrative. Those young, poor host chicks do not have a chance to argue before they are ejected from the nest, and neither do any opinions that contend with the prevailing orthodoxy of America as racist, sexist, classist, and imperialistic to the core. And our children are being taught this every day. The University’s brood parasites eject views such as limited government, the natural rights extolled in the Declaration, the separation of powers in the Constitution, and the freedom of speech in the Bill of Rights. The only difference is that the cuckoo eventually leaves the nest, while tenured professors continue their parasitic ways.
The thesis of this piece is that we should defund the University. From the inception of the Conservative Historian blog in 2012, I have asked conservative-minded parents to take less interest in university sports, the most beautiful dorms, or whether the cafeteria has a robust veggie selection, and to pay more attention to what the professors are teaching. My call was for them to take a hand in these matters as they would in any other matter of their lives. If a parent were to buy a home for $200,000, would they just write the check and have no say in the decoration or the furniture? It is almost as if these parents are more afraid of being uncool than of throwing away a significant portion of their life savings on wasted curriculums that drive ideological narratives.
But of course, it is not just the choice of parents. People who have no children, in fact, all of us, are on the hook. According to Caplan, “Government heavily subsidizes education. In 2011, U.S. federal, state, and local governments spent almost a trillion dollars on it…It is precisely because education is so affordable that the labor market expects us to possess so much. Without the subsidies, you would no longer need the education you can no longer afford.” Why are Bernie Sanders, Alexandria Ocasio-Cortez, and Elizabeth Warren demanding “free” education? Why, the better to indoctrinate more students at the cost of those who still believe in America.
The academy is ineffective in preparing us for our roles in the real world, but it is marvelously effective at creating a social justice warrior class that will leave the world’s most prosperous nation looking like something akin to a South American dictatorship.
It is past time we stopped this game and defunded the University.
A Revival of the Case for Term Limits
June 2020
When one of the most substantial pieces of legislation in decades, the Affordable Care Act (ACA), was passed in 2010, Nancy Pelosi famously stated, “We have to pass the bill so that you can find out what is in it.” This remark was, logically, met with ridicule, and it might have been funny had the stakes not been so high. This 2,000-page law, altering one-sixth of the U.S. economy, was crucial to the future of the country, yet members of Congress were not expected to read it. But it went even further than that. The 2,000 pages were not the law itself. Over four hundred times, the ACA left open not merely how the law would be implemented, but what the actual content of the law would entail. The ACA, as provided by Congress, was not a policy document but rather a statement of priorities, and the real governance was left to the executive branch to build.
On too many issues, ranging from the environment and immigration to war powers, regulation, and gun control, Congress punts the heavy lifting of legislation to the executive branch. Another congressional tactic relies on the courts in the hope that, as Chief Justice Roberts did with the ACA, the Supreme Court will clean up the mess. A few weeks ago, in Bostock v. Clayton County, the Supreme Court again wrote the legislation. Title VII of the Civil Rights Act speaks of “sex,” meant in the 1960s context as gender, and makes no mention of sexual orientation. In the 1960s, there was a prominent movement for black and women’s empowerment, but the Congress of 1965 was not about to undertake a decision on gay rights, much less transgender rights. If Congress, as constituted 55 years later, believed that Title VII should extend to sexual orientation, it could have passed a law declaring as much. Again, it punted.
The reason that Congress is so reluctant to perform its duties is twofold. By democratizing congressional selection, through primary systems and the 17th Amendment, members are now vulnerable on every vote on every piece of legislation. This exposure does not emanate from an inter-party opponent but rather from an intra-party rival. Veer too much towards the center, and a member is flanked by a more ideologically pure version of himself. Yet go too far to the flanks and risk being labeled a radical.
The other issue is simple fundraising. Again, the seeming virtue of taking party leaders entirely out of the mix, and increasing democracy, means that Congress is in perpetual campaign mode, driving endless fundraising. Limiting the majority’s power is why the founders envisioned the United States as a republic and not a democracy. It was also to avoid majoritarianism that they had state legislatures choose Senators, something changed by the 17th Amendment.
The first part of the member’s dilemma, avoiding hostile votes that can be used as primary fodder, and the second part, continual campaigns, both emanate from the same principle: hold onto the seat. In essence, Congress, the legislative body, no longer legislates. So what do its members do? Run for office, again, and again. The answer is to take them out of campaign mode – literally – with term limits. There was a time in our recent history, particularly from the mid-1990s to the mid-2000s, when term limits seemed an unstoppable force. But nothing can stem the tide of history faster than a politician fighting for their seat. After some initial successes, the movement slackened. It is time to revive it.
There is a term limit already in place, guaranteed by the 22nd Amendment. Ratified on February 27, 1951, the 22nd Amendment limits the presidency to two elected terms, or a maximum of 10 years. Lyndon B. Johnson could have exceeded eight years had he chosen to run in 1968. Scott Bomboy, writing for Constitution Daily, notes, “These doubts about unlimited presidential terms of office did not fade away after President Washington set the unofficial two-term precedent in 1796. Scholar Stephen W. Stathis explains in a 1990 paper that ‘Congress considered early versions of presidential term limit amendments in 1803 and 1808. The Senate approved term-limit resolutions in 1824 and 1826, only to be rejected by the House.’”
The fundamental concern was the fear that a president would use his office’s power to secure a third election, or a fourth, creating a president for life. Between Washington and the passage of the 22nd Amendment, there were several attempts to create a law limiting presidential terms, yet the necessary votes were never in place because presidents maintained the two-term standard. Before 1951, six presidents after Washington were elected twice and completed their terms without choosing to run again. “In 1876, the House passed a resolution that ‘the precedent established by Washington and other Presidents of the United States, in retiring from the Presidential office after their second term, has become by universal concurrence a part of our republican system of government,’” adds Bomboy.
It was Roosevelt’s setting aside of the Washington precedent that made an amendment on presidential term limits both necessary and desirable. Not coincidentally, it was Republicans who pushed for this amendment, with the help of Southern Democrats.
In “Two Cheers for the 22nd Amendment,” writer Thomas E. Cronin states, “The two-term limit is healthy for the two-party system. It helps prevent political stagnation. The two parties benefit and are rejuvenated by the challenge at least every eight years of nurturing, recruiting, and nominating a new team of national leaders.” But should this logic apply to Congress as well?
Writing in 1994, Dan Greenberg of the Heritage Foundation noted, “Term limits would ameliorate many of America’s most serious political problems by counterbalancing incumbent advantages, ensuring congressional turnover, securing independent congressional judgment, and reducing election-related incentives for wasteful government spending. Perhaps most important, Congress would acquire a sense of its fragility and temporariness, possibly even coming to learn that it would acquire more legitimacy as an institution by doing better work on fewer tasks.”
In a 2014 Washington Post column, George Will notes, “Congress increasingly attracts people uninterested in reversing its institutional anemia. They are undeterred by — perhaps are attracted by — the fact that they will not be responsible for important decisions such as taking the nation into war. As Congress becomes more trivial, its membership becomes less serious. It has an ever-higher portion of people who are eager to make increasingly strenuous exertions to hold offices that are decreasingly consequential.”
In a January 29, 2019 article for the Cato Institute, Doug Bandow states, “Term limits most directly prevent politicians from turning office‐holding into a career, spending 30 or 40 years as a congressman or senator, hanging on until they can barely function. Forcing rotation in office would also hinder the development of permanent relationships among members and interests/lobbyists. Even when these ties did develop, they would last only until the member’s term ends.”
The opposition is fierce, and it is not lost on the proponents of term limits that the very people who would have to vote for them are those with the most to lose: career politicians, on both sides of the aisle. There are also arguments proffered against term limits. Writing for the Brookings Institution in a January 18, 2018 article, author Casey Burgat states, “Despite widespread support, instituting term limits would have numerous negative consequences for Congress.” Burgat goes on to list the following: taking power away from voters, decreased congressional capacity, weaker incentives to build policy expertise, the removal of effective lawmakers, and an increase, rather than a decrease, in the influence of special interests. It is alarmingly easy to refute every one of these contentions.
Given the nature of gerrymandered districts in the case of the House, and the power of incumbency in the case of the Senate, there is already limited choice for voters. In the past six elections, the House incumbent reelection rate never dipped below 85%, and the Senate rate never below 79%. Yet this safety does not set aside the continuous need for campaigning. Instead, it is a result of members’ focus on keeping their seats rather than doing their jobs.
Congress has already abrogated its legislative prerogatives to the executive branch. This transfer has been proven not just by the passage of the Affordable Care Act but by Dodd-Frank, the power to make war, immigration, and a host of other major legislative initiatives. There is already a decrease in congressional capacity, because legislators either do not care to legislate or do not have the time amid continuous fundraising.
The argument about removing effective lawmakers, an example being Paul Ryan, is also erroneous. Nancy Pelosi has been in Congress for thirty-three years, and her intelligence and success are unquestioned. But successful at what? Her effectiveness as a legislator is unknown because she has done so little legislating. Meanwhile, Ryan, a lawmaker who could actually write a law, term-limited himself.
Then there is the special interest argument. Burgat states, “However, the term limit literature commonly finds that more novice legislators will look to fill their own informational and policy gaps by increasing reliance on special interests and lobbyists.” But if the goal is not to legislate but to get elected, the power of lobbyists and special interests will only grow. Would not a legislator in their 10th term and running for reelection be highly concerned about where the National Education Association stood on education, given that it can turn out the vote and raise money? It is also difficult to believe that a person who has been in office for 24 years would be more immune to policy proposals from the likes of the NRA or the ABA than would a congressperson in their third year.
Then there is the final argument, about the incestuous relationship between public servants and lobbying organizations. First, this goes on today without term limits. When the federal government controls $3 trillion of GDP directly and influences many times that amount, there will be lobbyists attempting to influence governmental decisions. Having more term-limited congresspeople available to staff lobbying jobs will neither increase nor decrease that fact.
Burgat begins his piece with a quote from Founder Roger Sherman, writing in an open letter in 1788: “Nothing renders government more unstable than a frequent change of the persons that administer it.” Yet after Sherman wrote these words, four of the first five presidents the American people elected chose to term-limit themselves.
In a 2019 work issued by the Federation of American Scientists entitled, “Congressional Careers: Service Tenure and Patterns of Member Service, 1789-2019,” the report states, “During much of the 19th century, the average tenure of Representatives and Senators remained relatively steady, with incoming Representatives generally averaging between two and three years of prior service in most Congresses, and Senators averaging between three and five years. In the late 19th and through much of the 20th century, the average tenure for Members in both chambers steadily increased. The average years of service for Members of the 116th Congress, as of January 3, 2019, when the Congress convened, was 8.6 years for the House and 10.1 years for the Senate.”
It is noteworthy that during the 19th century, when Congress was still a co-equal branch of government and passed landmark legislation, including the 13th, 14th, and 15th Amendments, its members’ tenures were far shorter than in the 20th and 21st centuries. Sherman was wrong on term limits, and they need to be implemented today.
The Myth of Publicly Funded Infrastructure
June 2020
There was a time when Presidents would send their State of the Union messages to Congress in writing. This is far preferable to the bizarre circus of a President addressing Congress from the podium, with members of his party jumping up and down like hundreds of jack-in-the-boxes. In one of these written accounts, which President John Quincy Adams delivered in December of 1825, the topic of internal improvements was raised: “Upon this first occasion of addressing the legislature of the Union, with which I have been honored, in presenting to their view the execution so far as it has been effected of the measures sanctioned by them for promoting the internal improvement of our country.” Adams later described his vision of these improvements: “Roads and canals, by multiplying and facilitating the communications and intercourse between distant regions and multitudes of men, are among the most important means of improvement. But moral, political, intellectual improvement are duties assigned by the Author of Our Existence to social no less than to individual man.” Even before this statement, the territory of Missouri, in the process of applying for statehood in 1820, declared, “Internal Improvements shall forever be encouraged by the government of this State.” Such support for infrastructure was a common provision among territories wishing to become states.
American leaders of this time, trained in the classics in a way modern students are not, used history as a link. The intellectual Adams was no exception: “The magnificence and splendor of their public works are among the imperishable glories of the ancient republics. The roads and aqueducts of Rome have been the admiration of all after ages, and have survived thousands of years after all her conquests have been swallowed up in despotism or become the spoil of barbarians.”
During this period, the executive was not the transcendent branch it has become by 2020. Instead, individual leaders in Congress, including Henry Clay, held as much influence and authority as presidents, if not more. It was Clay who defined internal improvements as part of the “American System,” developed during the early 1820s. His American System was a group of policies that included higher tariffs, the chartering of a bank of the United States, and internal improvements. In a speech delivered in 1832, Clay defended the system: “And I now say, preserve the protective system in full vigor, give us the proceeds of the public domain for internal improvements.” Jackson killed the national bank. Tariffs are still the subject of debate, but with the advent of income and corporate taxes, they are no longer the primary source of revenue as they were in Clay’s time. The concept of internal improvements, however, is still very much a source of mischief for politicians throughout the Republic.
Internal improvements, or infrastructure spending, are usually a political winner. These types of spending projects, or boondoggles, depending on one’s point of view, create jobs, foster commerce, and provide a platform for future growth opportunities. Of course, capitalism could do all of that as well, and more efficiently. But politicians, such as Adams, do not get as much credit, or as many votes, for indirectly creating the circumstances of prosperity. Instead, they get the goods by providing direct largesse to a targeted constituency.
No less than Republican Donald Trump, in a comment on April 7, 2020, stated, “We’re going to do — perhaps — infrastructure, which you wouldn’t have gotten approved before. And now people are looking to do it. And the beauty is we’re paying zero interest or very close to zero interest. In some cases, we’re actually paying zero. And the dollar is very strong, and people are investing in the dollar.”
Arguably the most touted of all American infrastructure projects was the Erie Canal. In a 2017 segment humbly titled “All Hail the Erie Canal,” CBS News states, “The canal also accelerated the western expansion of the nation. People and commerce were able to reach and develop what would become the American Midwest. A region that was isolated and landlocked was now connected, via the canal and the Hudson River, down to New York City and the world.” John Steele Gordon, in his Empire of Wealth, states, “The Erie Canal would prove the most consequential public works project in American History and make New York, both state and city, the linchpins of the American economy for more than a century.” Maybe. Before the construction of the canal, New York City was already the largest city in the young Republic, due to its excellent port, the Hudson River, and its central location.
There is no doubt the Erie Canal helped with this growth. But the larger questions are whether it had to be built with public funds and whether it was an overall success. As Carter Goodrich noted in his essay “The Revulsion Against Internal Improvements,” “Even New York’s brilliant financial success with the original Erie Canal was followed by great losses on the lateral or feeder canals of the state system.” The American System extolled by Clay led many states either to borrow too much or to float public bonds. “The inability to meet interest rates and principal on them led to more drastic decisions, including default,” adds Goodrich.
Many Southern states desired public works projects in the form of railroad development. In an essay entitled “A New Look at Antebellum Southern Railroad Development,” author James Ward states, “A larger proportion of investment in the South was raised in the public sector than in the private one. The South relied on the public coffers to a greater extent than did the rest of the country.” Southern railroads were not just about moving cotton closer to ports of export. As John Majewski notes in his Modernizing a Slave Economy: The Economic Vision of the Confederate Nation, “Promoters sometimes focused on military advantages – the use of railroads to quell slave rebellions or move troops – to justify the public investment.”
In Empire of Wealth, Gordon goes on to name a who’s who of American internal improvements: “The Erie Canal would prove to be the first of the long and continuing list of the mega projects – the Atlantic cable, the Transcontinental Railroad, the Brooklyn Bridge, the Panama Canal, the Hoover Dam, the Interstate Highway System, the Apollo Project – that would become so much a part of the American experience.” Essentially, Gordon’s argument is that these projects are as much about national unity, certainly the case with the moon landing, as about economic stimulus.
A believer in internal improvements, Eisenhower counted among his legacies the creation of the Interstate Highway System in 1956. One of its justifications was the Cold War, and this use of national security is a common tactic for politicians wishing to justify domestic spending. The reasoning behind the creation of the Interstate Highway System is often touted by progressives who want to use the federal government for large infrastructure projects, or what President Obama later referred to as “shovel-ready” projects.
What is notable, and something President Obama learned the hard way, is that the Interstate Highway System could not be built in the early 21st century. Between the needs of local entities, environmental concerns, and general bureaucracy, making similar internal improvements is now problematic. Also, the claim of the Interstate Highway System’s success belies the fact that the U.S. economy was far and away the largest in the world in 1956, and highly robust without this federal largesse. As for military value, America managed to win two World Wars without the Interstate Highway System.
This chicken-and-egg kind of thinking is backward. Nations do not build highway systems without some form of cash. Even when they borrow, they are taking from the potential earnings of future generations. Nothing in government happens without the prospect of future governmental taxation. In the case of the Interstate Highway System, the U.S. was already the largest economy on the planet and had been since the late 1800s. This infrastructure project was only possible because the nation contained entities capable of producing revenue. The government does not create income. It distributes and redistributes revenues from its citizens and their enterprises.
It is noteworthy, then, that the promulgator of the Interstate Highway System, built under the false patina of military exigency, turned around at the end of his presidency and sounded a warning about this military/political alliance: “In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist. We must never let the weight of this combination endanger our liberties or democratic processes. We should take nothing for granted. Only an alert and knowledgeable citizenry can compel the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals.” Eisenhower was right in this concept, but the glue linking the complex together is government. It is one thing for the government to oversee the military. But by using military exigencies as justification for large public works projects, the government becomes part of the complex and faces the questionable prospect of trying to regulate itself.
In the wake of the 2008 financial crisis, President Barack Obama proposed a nearly $1 trillion stimulus plan, of which $150 billion was to go to infrastructure. As Obama found out, shovel-ready is not a thing anymore. In a 2017 article on the Benzinga blog, author Wayne Duggan notes, “When it comes to economic stimulus, local governments may take years to begin actual construction even once they receive funding. The reason why such a small portion of the American Recovery and Reinvestment Act of 2009 ended up spent on infrastructure is that the projects are simply too slow to get off the ground to provide meaningful near-term stimulus.” Another proponent of infrastructure as stimulus, former Pennsylvania governor Ed Rendell, said, “If you are talking about stimulating the economy, ‘shovel ready’ doesn’t mean get the money on Monday and start on Tuesday.”
Even Obama himself admitted as much. In a 2010 New York Times Magazine interview conducted by Peter Baker, “Obama told me he had no regrets about the broad direction of his presidency. But he did identify what he called ‘tactical lessons.’ He let himself look too much like ‘the same old tax-and-spend liberal Democrat.’ He realized too late that ‘there’s no such thing as shovel-ready projects’ when it comes to public works. Perhaps he should not have proposed tax breaks as part of his stimulus and instead ‘let the Republicans insist on the tax cuts’ so it could be seen as a bipartisan compromise.”
American politicians’ desire for publicly funded works is not unique to this nation; as Adams noted, it goes back for centuries. Many such projects lose money, especially when completed more for public perception than for economic growth, and they can be the ruin of nations. The Fourth Dynasty of Egypt fell quickly after the construction of the Great Pyramids. The Alhambra’s cost meant less revenue for the Nasrid Dynasty to fight the burgeoning Christian powers in Spain. The Taj Mahal bankrupted the Mughal Empire and opened the way for British colonialism. Bourbon France never economically recovered from the construction of Versailles. For a historian, these sights are once-in-a-lifetime experiences, and today they generate significant tourism dollars. But for the ordinary people of the time, who endured the taxes and the subsequent economic dislocation, they were a disaster.
A ruler with authoritarian powers undertook all of these works, but that does not preclude an ambitious governor, or an American president with a supine Congress, from undertaking similarly expensive acts. The “Big Dig” in Boston, begun in 1991, was supposed to cost $2.8 billion and take seven years to build. It cost $8 billion, took sixteen years, and will not be fully paid off (if ever) until 2038. The California High-Speed Rail, a new bullet train between San Francisco and Los Angeles, was purported to cost $33 billion when initially proposed in 2008. As of May 2020, the latest projections put it at nearly $90 billion. Only a small amount of track has been laid, in a rural area of California, at a cost of $12 billion. The far more expensive route through the urban areas lies ahead. Given these projections, current California governor Gavin Newsom has been tepid in his support.
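To make the scale of those overruns concrete, here is a minimal sketch computing the cost multiples from the figures just cited (and only those figures):

```python
# Cost overruns for the two projects above, in billions of dollars,
# using only the estimates and latest figures cited in the text.
projects = {
    "Big Dig": (2.8, 8.0),               # (initial estimate, final cost)
    "CA High-Speed Rail": (33.0, 90.0),  # (initial estimate, latest projection)
}
for name, (estimate, latest) in projects.items():
    print(f"{name}: ${estimate}B -> ${latest}B ({latest / estimate:.1f}x the estimate)")
# Big Dig: ~2.9x; CA High-Speed Rail: ~2.7x so far, before the urban segments.
```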
This is not an argument for pure libertarianism, as if that could ever exist. The building of roads, along with police and military, should be in the purview of government, albeit the more local, the better. Instead, it is a caution: when politicians of any stripe begin using “infrastructure” as a panacea for economic ailments, do not believe them. One would do better to ask for a tax cut, deregulation, and the rollback of some of the more onerous and silly environmental edicts. Then one can build something cool all one’s own.
Joe Biden’s Choice and Henry Wallace in 1944
May 2020
At the Democratic Party Convention in 1944, Vice President Henry Wallace, former Agriculture Secretary and key supporter of Franklin Roosevelt’s New Deal, was dumped from the ticket. This says as much about Roosevelt’s health as it does about Wallace’s strong progressivism and how he was perceived by the conservatives within the Democratic Party. Why would a successful ticket be dismantled? And why the choice of Democratic stalwart Harry Truman in place of Wallace? In a 2013 article written for National Review, author Conrad Black, writing of Wallace’s selection as running mate in 1940, states, “FDR wanted the vice president to be someone who believed more emphatically in what he had been doing for eight years than did the incumbent vice president, former House speaker John Nance Garner of Texas.”
Yet by 1944, the selection of Wallace was seen to have been a mistake. According to historian Roy Jenkins in his biography Franklin Delano Roosevelt, “There was general agreement that Wallace would not do; FDR thought he would cut a million off the Democratic Vote.” Historian Arthur M. Schlesinger Jr. pronounced Wallace “an incorrigibly naive politician.” Journalist Peter Beinart wrote that Wallace had a “naive faith in U.S.-Soviet cooperation.” Yet Wallace had served as Agriculture Secretary for seven years prior to his election as Vice President in 1940. At that time, his views, which included support for farmers and a certain admiration for the Soviet Union, were known to Roosevelt.
So the question is: why was Wallace, a palatable running mate in 1940, not the right vice presidential candidate in 1944? Despite the concerns over Wallace’s ideology, it is difficult to imagine a hale and hearty Roosevelt losing an election in the middle of the greatest war ever fought, regardless of who was running with him. Yet this was not the 1932 FDR. “The beginning of Roosevelt’s decline came in March 1944. With a bronchial infection and a temperature of 104 degrees, he had little choice but to undergo a major checkup,” notes Jenkins. Did Roosevelt know what was coming? Almost certainly, but for a man who loved being president, his declining health could never be publicly, and barely privately, acknowledged. FDR’s condition was not going to be mentioned by his supporters with World War II still in doubt and an election coming up. In a 2015 book review in The New York Times, Lynne Olson states, “If the American public had known how gravely ill he was, his chances of re-election would have been greatly jeopardized, if not destroyed.” And did they know? Historian David Jordan, in his work FDR, Dewey, and the Election of 1944, states, “Franklin Roosevelt’s health had long been a matter of public knowledge and speculation.” And this speculation led to “another factor that weighed upon the Democratic leaders as they thought about a running mate for Franklin D. Roosevelt in 1944. That factor was Roosevelt’s health and the possibility that the convention might actually be choosing the next president when it named someone to the second spot on the ticket,” adds Jordan.
Harry S. Truman was, in many regards, the anti-Wallace. For the party bosses of the Democratic Party, Wallace was never going to be the choice. “He had a peculiar personality and his political views were anathema to large parts of the Democratic Party,” states Jordan. Truman, by contrast, was a consensus pick: in his time in Washington, much of it as a back-bench Senator, he had not made many enemies, and he had a reputation as a man of integrity through his work monitoring war expenditures. Because the weight of their choice was so much more consequential than in 1940, the Democratic leadership, and Roosevelt himself, needed someone who could bring agreement and not split the party.
This opens up all kinds of insights into vice presidential selection. Would H.W. Bush have chosen Dan Quayle if he had expected not to complete his term? Would John McCain have chosen Sarah Palin, or Al Gore Joe Lieberman? There are recent examples of top-of-the-ticket candidates making choices with a ready successor in mind. In 1992, Bill Clinton chose another Southern moderate, Al Gore, to round out the ticket, though Gore lacked some of Clinton’s personal baggage. And the cases of Michael Dukakis choosing Lloyd Bentsen, and George W. Bush choosing Richard Cheney, are examples of selecting running mates with experience greater than that of the person actually running for president. But none of these candidates, certainly not the hearty 46-year-old Clinton, or the long-distance runner W. Bush, imagined an inability to finish their terms.
In 2020, Joe Biden, if elected President, will not finish his first term. To have to write this is distressing. It is dejecting in that a once robust figure is not the man he once was. Disheartening in that it is a reminder that for all of us, there is a weakening and an end. And bleak in that after a process of 12 months, numerous debates, hundreds of speeches, and one candidate burning $500 million, a major party ends up with a man who is not capable of handling the full weight of presidential duties. Recent presidential candidates of BOTH parties make one yearn for the days of party bosses and smoky back rooms.
Biden’s history of gaffes and plagiarism is well known. But he was committing his malapropisms and stealing Neil Kinnock speeches before presiding over the Senate Judiciary Committee and running as Barack Obama’s Vice President. Here are a few of the gems. New York Times reporter Maureen Dowd reported that in September 1987, during an event at the Iowa State Fair, Biden mimicked entire portions of Kinnock’s speech from earlier in the year. At one moment, Biden repeated the line that he was the first “in a thousand generations” to graduate from college, gesturing to his wife in the exact same way Kinnock did, while also saying the same line about her education and lineage. Also in 1987, Biden claimed that he graduated in the top half of his law school class when he actually finished 76th of 85. In 2006, Biden stated, “In Delaware, the largest growth of population is Indian Americans, moving from India. You cannot go to a 7-11 or a Dunkin’ Donuts unless you have a slight Indian accent. I’m not joking.” Biden also described Obama thus: “I mean, you got the first mainstream African-American who is articulate and bright and clean and a nice-looking guy.” Try to imagine a Republican describing a minority candidate as “clean.” And these are examples from a 40-year career ranging from his initial political campaigns in the 1970s up to the Obama 2012 reelection campaign.
In 2012, when running for a second term, Biden debated the Republican vice presidential candidate, Representative Paul Ryan. Here is how Washington Post reporter Chris Cillizza described the debate: “The debate was SO dominated by Biden — for good and for bad — that Ryan was largely a bystander. If you liked aggressive Biden, that makes Ryan a loser. If you don’t like aggressive Biden, that makes Ryan a winner.” CNN described the debate this way: “We expected Ryan, not Biden, to bring a three-ring binder full of facts and figures to the debate. It’s not that the data-driven Ryan didn’t show up with an arm full of his statistics; it is just that Biden did so as well.” And a Republican strategist, quoted in the Wall Street Journal, stated, “If you like Joe Biden already, then you absolutely loved him tonight. If you are undecided or indifferent, you’re probably left wondering who this man is who is smirking, sighing, pointing, interrupting and badgering.”
Does anyone with sentient thought, or even a smidge of objectivity, think that the 2020 Joe Biden is the same man described just eight years ago? Biden may have cribbed from British politicians or used cringeworthy language, but there was a fighting spirit about all of it. That is different from an inability to string together two sentences, stumbling over easy words, or not knowing in which state he happens to be giving a speech.
Peggy Noonan caught the gist of the issue in saying of Biden that “not everyone ages the same,” to explain how Bernie Sanders seems to be on top of his game though a year older than Biden, or how Nancy Pelosi commands the House at an age even older than Sanders. What is unmistakable is not just that Biden is older, but that he is aged. One of the few to come out and say the obvious was Fox commentator Brit Hume, who stated, “I don’t think there’s any doubt about this. I have traces of this myself. I know what it feels like. Sometimes you’re confused, sometimes you can’t remember, ‘What are you supposed to do the next morning?’ — and I’m not running for president and it’s probably a good thing I’m not.”
All of this puts the choice of vice president in the same context as 1944, when an ailing Franklin Roosevelt selected Harry Truman, who would be president within a year. It makes Biden’s choice of running mate that much more critical. One has to go back to Richard Nixon naming Gerald Ford in 1973 to find the last time a president, in effect, chose the next president.
Domination of the Political Discussion
April 2020
One of the arguments of right-wing pundits is that the left controls the media, or as it is referred to on conservative platforms, the mainstream media. A contrary view to this supposition is provided by liberal Vox blogger Matthew Yglesias, who, in a 2018 article, calls this “The Hack Gap.” Yglesias states that “The hack gap has two core pillars. One is the constellation of conservative media outlets — led by Fox News and other Rupert Murdoch properties like the Wall Street Journal editorial page, but also including Sinclair Broadcasting in local television, much of AM talk radio, and new media offerings such as Breitbart and the Daily Caller — that simply abjure anything resembling journalism in favor of propaganda.”
The problem with Yglesias’s reasoning here is relatively simple: CBS, NBC, ABC, MSNBC, The New York Times, the Washington Post, USA Today, and a host of other media outlets, including Vox itself, counter with their own brand of journalism, much of which is also propaganda. Understood, personalities such as Sean Hannity and Rush Limbaugh gin up their listeners with the types of stories that will be best received. But it is the height of disingenuousness to believe the same thing does not happen on the left. Hannity is not a true journalist, but then neither is Don Lemon or Rachel Maddow.
In a 2020 study published by the American Association for the Advancement of Science, three scientists, Hans Hassell, John Holbein, and Matthew Miles, titled their paper “There is no liberal bias in which news stories political journalists choose to cover.” Yet later in the article, the writers noted that liberal journalists outnumber conservative ones by a 4:1 ratio. They subsequently wrote, “Although we do not find evidence of broad, systematic ideological bias or ideological bias depending on the ideology of the potential readership, one might expect that individual biases would shape response patterns. Put differently, while we do not find conservative or liberal candidates to be systematically disadvantaged overall, there are strong theoretical reasons to expect that political reporters will be more responsive to candidates with whom they share their political ideology.”
Another contention by Yglesias is that “the self-consciousness journalists at legacy outlets have about accusations of liberal bias leads them to bend over backward to allow the leading conservative gripes of the day to dominate the news agenda. Television producers who would never dream of assigning segments where talking heads debate whether it’s bad that the richest country on earth also has millions of children growing up in dire poverty think nothing of chasing random conservative shiny objects, from ‘Fast & Furious’ (remember that one?) to Benghazi to the migrant caravan.”
Another Vox writer, Carlos Maza, noted in May 2019 that “partisan mentality has an important secondary effect: influencing the coverage of mainstream news networks. One of the ways mainstream journalists try to avoid accusations of ‘liberal bias’ is by paying attention to what happens in conservative media. Which means that pseudo-scandals that get a lot of focus on Fox — Benghazi, Clinton’s email server, Rep. Ilhan Omar’s mention of 9/11 — end up getting taken seriously by mainstream news outlets.”
The charge of self-consciousness also cuts both ways. It is even harder to avoid self-consciousness with the type of accusations that emanate from the left. This website itself, the Conservative Historian, has been accused of being the height of “white privilege.” The charge also implies that the purpose of this site is a return to the preferred time of the 1950s. Taken together, these accusations imply that the purpose of this site is not to extol the values of small government, greater personal liberty, and capitalism, but rather to reverse gains made in the arena of civil rights. In other words, to the left, conservative principles are not merely wrong, and conservatives are not merely presenting one side; conservatives are simply immoral, or worse. The invocation of the 1950s carries its own implication: according to liberals, conservatives secretly pine for life before Brown v. Board of Education, the Civil Rights movement, Martin Luther King Jr., and the end of Jim Crow, which is patently not correct.
There are so many significant issues that go to the core of the nation and its future. What is the best course to elevate those on the lower rungs of the economy? What are the limits of regulation? What is the sustainability of our entitlement state? Where does the government have regulatory oversight? What are the roles of the states in a federalist system? For the left, the debate is moot, because to take a conservative position is not to debate the merits of small vs. large government but to exhibit discrimination, which abrogates any conversation. The goal is to make conservatives so self-conscious that they cease to be conservatives.
Vox is not the only liberal media outfit seemingly ruing supposed right-wing dominance. In a March 18, 2017 post by Tom Whyman, Vice.com notes, “Twitter isn’t wholly dominated by the left, but – Donald Trump and anyone with an egg avatar aside – left-wing views are certainly better-represented there than on any other major social network. However, left-wing Twitter has failed to translate into real-world influence. Twitter conversations aren’t very accessible to outsiders…On YouTube, by contrast, left-wing voices are seemingly non-existent – apart from that communist child – while right-wing voices dominate.”
For Yglesias, Maza, Whyman, and others on the left, the domination of political discourse by certain outlets such as Fox, YouTube, or station owner Sinclair translates into a popularity boost that influences elections. Without this media, contends Yglesias, the Republicans would move to a more centrist position: “Specifically, by exploiting semi-random variation in Fox viewership driven by changes in the assignment of channel numbers, they find that if Fox News hadn’t existed, the Republican presidential candidate’s share of the two-party vote would have been 3.59 points lower in 2004 and 6.34 points lower in 2008. Without Fox, in other words, the GOP’s only popular vote win since the 1980s would have been reversed, and the 2008 election would have been an extinction-level landslide.” Yglesias’s math is a little suspect, but his point is understood. Would it not be the mirror image on the right? If ABC, CBS, NBC, the Times, the Washington Post, MSNBC, George Soros and his millions, and Michael Bloomberg and his billions did not exist, then Mitt Romney, who lost by 4%, would be president.
As for the Democratic primary contest, Vermont Senator Bernie Sanders proposes what is essentially a socialist program of governmental control over healthcare, energy, education, finance, and manufacturing. Without Jim Clyburn and some serious work on the part of the DNC, Sanders would have won the 2020 nomination and, arguably, should have won in 2016. The likely nominee, Joe Biden, has abandoned many of the centrist positions he once held. On issues ranging from healthcare to crime to the Hyde Amendment, Biden has adopted more liberal views. Trump, by contrast, has governed as a surprisingly center-right president. Sanders would transform the U.S. economy. Trump will not do a single thing about entitlement spending. Biden wants to expand government healthcare to those aged 60. Trump renegotiated a trade agreement with Mexico and Canada to be more favorable to unions. So, who is pulling which side to the extreme?
There is also the argument of validity, or as Yglesias puts it, “Silly stuff can be a powerful tool.” Focusing on whether Obama took too many vacations or whether he was respectful to the troops is silly. It would have been better for the country if he had taken more vacations. Too bad Obama did not skip 2010. Conservative media does often focus on tropes, but so does the liberal press. It was the Washington Post that breathlessly broke the story of Dan Quayle misspelling the word potato. It was the New York Times that reported President Trump had an interest in a company whose drug he was promoting, only for the stake to turn out to be tiny. And it was every major news outlet that ran with the Julie Swetnick story against Brett Kavanaugh, a story proven to be utterly false. This last one, though, was not silly at all.
There is plenty of debate, and a gray area, about which media ideology has dominance. But what happens when the scope is broadened to all sources from which Americans obtain their political and historical information? The consumption of every outlet, every story, every op-ed that Yglesias and his supporters cite is a voluntary activity. One can choose to watch Fox or not. One can opt to listen to Rush Limbaugh. Though Limbaugh has a big audience in radio terms, 95% of the country chooses not to listen. There are so many choices that an American can now live in a political bubble, receiving only the echo chamber of agreement. But there is one notable exception. In the field of education, there is no divide but rather a monolith. For a CNN, there is a Fox News. For the New York Times, there is the Wall Street Journal. But for Big Education, there are public schools and only a handful of charter schools, voucher systems, and homeschooling alternatives.
According to the Education Week website, there are 132,853 schools in the United States. Of these, 98,158, or nearly 74%, are public schools; the remaining roughly 34,700 are private schools. The gap in student counts is even wider because public schools are larger. Per the Department of Education, about 56.6 million students will attend elementary, middle, and high schools across the United States in 2020. Of these, nearly 90% will be in public schools.
According to the Bureau of Labor Statistics, there are also 3.2 million full-time teachers staffing these schools. Of these, 3 million, or roughly 93%, are members of the National Education Association. In a Washington Post article dated June 3, 2015, the newspaper cited a Verdant Labs study showing that for every 87 Democratic high school teachers, there were just 13 Republicans. At the elementary level, it was 85 to 15. The same study showed that among the social sciences, history teachers are 88 Democrats for every 12 Republicans. The ratios only tilt further left when sociology and anthropology are added.
So, where does the most prominent supporter of teachers, the National Education Association, stand in politics? Here is a blurb about Bernie Sanders issued on April 8, 2020: “The passion and energy Senator Sanders brought to the campaign encouraged so many people to get involved in this process.” And how does the NEA feel about the Trump administration’s education secretary? “Betsy DeVos is the least qualified Secretary of Education in history, and her agenda consistently harms the students she’s charged with protecting. She needs to go.”
The least qualified in history? Harms students? In an appearance at a Washington, D.C. school, DeVos made a statement that was severely condemned by the NEA: “I visited a school on Friday and met with some wonderful, genuine, sincere teachers who pour their heart and soul into their classrooms and their students, and our conversation was not long enough to draw out of them what is limiting them from being even more successful from what they are currently. But I can tell the attitude is more of a ‘receive mode.’ They’re waiting to be told what they have to do, and that’s not going to bring success to an individual child. You have to have teachers who are empowered to facilitate great teaching.” She calls teachers “wonderful” and says they pour their hearts into their jobs. This is the bogey in the NEA closet? The person who wants greater choice in school selection and more empowered teachers is bringing irreparable “harm” to students?
There is also an entire NEA unit, called Education Votes, that serves primarily as a fundraising organization for leftist candidates. As of this writing, it has a front-page endorsement of Joe Biden for president. The NEA is currently and publicly on the left of every significant issue in the United States today, from health care to gun control to opposing Republican judicial nominees: “Politicized courts threaten students, schools, and communities. Republicans are reshaping the federal judiciary with conservative employees.” Is the assumption that the 3 million teachers who support the NEA will simply place these beliefs alongside conservative ones in the classroom?
Does the NEA, or do those who wish to provide greater school choice, have the best interests of students at heart? If the school choice argument is correct, more private schools might lead to fewer NEA members. What is not debatable is that fewer NEA members mean less revenue and less political influence for the NEA. In 1970, nearly 1 in 3 American workers belonged to a union. In 2020 that number is 1 in 10. This potential membership loss is the great fear of the NEA, and DeVos, with her visions of school choice, voucher programs, and funding of private schools, represents a threat to its education dominance.
Then there is college. In a 2005 study conducted by Stanley Rothman of Smith College and Neil Nevitte of the University of Toronto, “72 percent of those teaching at American universities and colleges are liberal, and 15 percent are conservative.” The rate goes even higher when history departments are broken out from the total.
In 2016, the website Econ Journal Watch, a blog run by three economics professors, issued an article entitled “Faculty Voter Registration in Economics, History, Journalism, Law, and Psychology.” The premise of the piece is quite simple: look up the publicly available voter registrations of professors in the academy. “The 40 universities we investigated were determined, in early 2016, by starting at the top of the U.S. News and World Report list ‘National Universities Rankings.’”
They then broke out the ratios by field (Economics, History, Journalism/Communications, Law, and Psychology), looking up 7,243 professors and finding 3,623 registered Democrats and 314 registered Republicans, for an overall D:R ratio of 11.5:1:
- History – 33.5:1
- Economics – 4.5:1
- Journalism/Communications – 20.0:1
- Law – 8.6:1
- Psychology – 17.4:1
And the most consequential part of this is the youth of the audience. Reaganesque conversions, such as the 40th president’s turn from New Deal Democrat to conservative stalwart, are rare. Paul of Tarsus is notable not just because he supported Christianity, but because he was once a persecutor. Most political minds are cemented by the time they are 30. The fact that Fox keeps advertising gold and runs a heavy dose of pharma commercials tells you something about its demographic.
The real story is about domination. According to a December 2019 TechCrunch article by Sarah Perez, 6% of Twitter users accounted for an astounding 73% of political commentary. Take all of the Fox viewership, combine it with CNN and MSNBC, and throw in the readership of all the national papers, and they account for fewer than 50 million people, or 15% of the population. But Big Education accounts for, at minimum, 90% of all students. And it has students’ attention for sixteen years. For sixteen years, students learn about the challenges of climate change, systemic racism, and the idea that healthcare is a natural right. Fine, but will they also learn about individual agency, the value of smaller government, Reagan’s successes as president, and some of the failures of the Obama administration?
In one high school in an affluent neighborhood, the economics textbook was edited by liberal New York Times blogger and economist Paul Krugman. Another affluent high school in the northern suburbs of Illinois conducted a civil rights seminar whose schedule included the teachings of Howard Zinn and a discourse on the value of a higher minimum wage.
What Yglesias is doing in his piece is the very thing of which he accuses his conservative counterparts. A Vox reader, more than likely a liberal, will be maddened, concerned, and agitated by this discourse on the dominance of opinions antithetical to their own – and thus buy more Vox!
Whether a person chooses to listen to Rush Limbaugh or to read Charles Blow in the New York Times op-ed section is just that – a choice – and vive capitalism and choice! Education is different. Aside from residents of some retirement communities, Americans pay property taxes, the vast majority of which go to fund schools. Paying taxes is not a choice. The last group who defied the U.S. government in such a fashion had General Sherman march through their farms.
And given the predominance of the left in public education, a student will get leftist views. In college, with a 33:1 ratio of left-to-right history professors, a student has few choices: go to a school without a humanities requirement, go to Hillsdale or Liberty, two of the 4,000 colleges in the U.S., or drop out. This last option will have consequences in subsequent job hunts. Until greater ideological equality is brought to education, it is not even a contest.
Top Ten Bad Historical Movies
April 2020
A few days ago, a list of the Top 20 best historical movies was posted to this site. In Zoroastrianism, there is Ahura Mazda, the wise lord. But there is also Angra Mainyu, the destroyer. This list is the Angra Mainyu of historical movie choices.
The core criterion for this list is the difference between education, entertainment, and indoctrination. This list depicts those movies, however well made or written, that were built expressly to extol liberal ideology and, because of that bias, are so riven with historical inaccuracies that they should only be taken as fiction, with any relation to persons living or dead a coincidence. One of the best historical movies ever made, Spielberg’s “Lincoln,” both educates and entertains. Warren Beatty’s “Reds,” however, wishes to glamorize communist sympathizer John Reed. Most movies out of Hollywood have a progressive bent. How many times are the villains portrayed as greedy corporate plutocrats screwing the little guy? The movies on this list intentionally play with history to make ideological points or, as in the case of “The Iron Lady,” to diminish conservatism and its adherents.
Documentaries present a challenge for a list like this because they are supposed to be historical by nature. Most documentaries are to history what the op-ed page of a newspaper is to news. There is fact, but those facts are cherry-picked. There is a documentary on this list, however, because of its notoriety, its filmmaker, and a Best Documentary Oscar.
- Southside with You
We have not seen this movie. Maybe it is one of the best historical biopics and rom-coms ever produced. But it is the very existence of this movie that is so hard to stomach. So much of Obama himself is smoke and mirrors. He is supposed to be a brilliant strategist but could not think of a way to pass his legislation in a consensus manner. He was supposed to be someone who would bring the country together, yet he was one of the most divisive presidents of our time. His supporters claim his intelligence is unrivaled, but without a teleprompter, he struggles to put two sentences together.
Obama is a man who wrote not one but two autobiographies before he was president. This movie is a part of that. It is about one small aspect of his life, his courtship of his future wife, Michelle Robinson. Perhaps the real point of this movie was a prelude to a run by the woman who is now Michelle Obama. As a purely commercial exercise, it is not incorrect to see an interest in this prominent and, by all perceptions, successful couple. Yet whenever a movie is made about a conservative couple, such as Ronald and Nancy Reagan, the film inevitably portrays the subjects negatively, more a horror movie than a traditional romantic comedy. Given Hollywood’s liberalism and love of the Obamas, our only surprise is that we have not received the sequels: “Washington With You,” “The White House With You,” and “Martha’s Vineyard With You.” The last part of the Obama franchise might be problematic, given that, due to climate change, their new $15 million mansion will be underwater by the time the cameras begin to roll.
- The Big Short
Were “The Big Short” fictional, it would be great fun. The viewer gets Margot Robbie in a bubble bath explaining complex financial transactions. She could be reading from the Congressional Record, and it would be entertaining. “The Big Short” uses all manner of celebrities to break down complicated jargon. Steve Carell as “Mark Baum” is tremendous. But that was not the point of the movie. The goal was to depict a greedy, capitalist society that plays fast and loose and always leaves “the little guy” holding the bag.
What Adam McKay, the creator of this and many other anti-right, anti-capitalist romps, seems to forget is that before capitalism, things pretty much sucked for the average Joe. A simple historical game: would a Russian serf from the 1800s, an English peasant from the 1600s, or a Chinese peasant under Mao wish to live in their own time or in 2019 America? And though some of the names have changed – Mark Baum is real-life Steve Eisman; yes, we read the book! – “The Big Short” still depicts a real time and place. Yet how can any movie about the housing crisis fail to mention that leftist politicians screamed racism whenever a loan was denied to a minority?
One of the ironies of our time is that the same man who drove the housing crisis, Barney Frank, lent his name to the bill, Dodd-Frank, that was meant to solve a dilemma of his own creation. Here is Barney Frank on his mistake. In August 2010, he said, “I hope by next year we’ll have abolished Fannie and Freddie … it was a great mistake to push lower-income people into housing they couldn’t afford and couldn’t really handle once they had it.” Of Barney Frank and the governmental push for loans to people who could not afford them, nothing is said in “The Big Short.” One other note: Adam McKay’s partner in crime on these anti-capitalist products is none other than funny-man Will Ferrell. In a perfect world, these pro-left actors and actresses would have their salaries and fame redistributed for just one year. After that, all of Hollywood would become a Conservative enclave.
- All the President’s Men
This selection may come as a surprise, as this is a well-crafted, well-acted movie that depicted a genuine low point in the American presidency. This is not a denial of its historical accuracy. Instead, this movie created a media scourge. There were famous reporters before Watergate, but Woodward and Bernstein received the star treatment in near real-time and were portrayed by the likes of Robert Redford and Dustin Hoffman. Redford was not just any actor. He was Sundance, Jeremiah Johnson, and Gatsby. The two reporters had received accolades for their work. But from the moment they became household names, every reporter wanted to be depicted by a movie star. The best path to Woodward-type fame and fortune is to find one’s own Watergate – but only from a Republican administration. They all need to be the crusading journalist. One can trace a clear line from this movie to the antics of Dan Rather, the bombastic nature of Sam Donaldson, the manipulations of Brian Williams, and the inanity of Jim Acosta.
Rather than report the news, the media now want to be the news. How is that possible unless there is some dragon to slay or dirty laundry to unearth? A free press is a bulwark of the American Republic and a necessity for the protection of citizens. But the media of today are advocates, not reporters. Instead of being in the news section, they want to own the op-ed page, and maybe get an invite to the Oscars.
- An Inconvenient Truth
Are documentaries technically historical movies? These works are often made directly for the audience of the present, but they can be used for history, and we include them here with the concern that some lazy social studies teacher will show them in class to abrogate the need for a comprehensive, well-thought-out, and impactful lesson plan. In 2006’s “An Inconvenient Truth,” for which Al Gore won an Academy Award, the first prediction was that the polar ice caps would melt within ten years absent drastic action. It is 2019. The ice caps are still there. So is Al Gore, who took the proceeds from this movie, built a 10,000-square-foot mansion, and did business with Al Jazeera, which is funded by oil money. The only truth in this work is that hypocrisy continues to be a human attribute.
- Reds
In Warren Beatty’s Oscar-bait biopic about communist sympathizer John Reed, the hero is depicted as an idealist who finds himself at the mercy of ruthless men who warp the utopian view of communism. Beatty, and much of the left, continues to believe that socialism and even communism would be the answer to Eden on Earth if only evil men did not ruin the dream.
They never seem to consider that communism creates ruthlessness, or that it singularly attracts men such as Stalin or Castro because, unlike democracy and capitalism, communist government control means total control. Even the trailer portrays noble men brought low. The reality was that Reed, and by extension Beatty, became just more of Lenin’s useful idiots.
- Dances with Wolves
Ok – not precisely historical, but meant to be so. Every European-descended individual, except for Kevin Costner’s hero, Lt. Dunbar, is a stark raving, vile lunatic. The Native Americans, even the enemies of the Sioux, are depicted as loving family people living in an idyllic setting ruined by the evil white man. Reality is a little different. According to one account, the Anasazi, a Southwestern Indian tribe, had nearly 50% of their population wiped out by another tribe in pre-Columbian times. Which is closer to Eden? A nation of 330 million where the most impoverished possess iPhones and air conditioning, or being woken in the middle of the night by a Pawnee attack?
- The Iron Lady
Most biopics either span the life of the historical figure or focus on a singular moment that encapsulates that life. But liberal filmmakers needed to tarnish one of the greatest prime ministers in British history, so they focused on her dotage. Understanding the value of history, liberals demean Thatcher in this way because to diminish her life is to demean her legacy of small government and personal liberty. “The Iron Lady” takes the King Lear approach to a life: show the subject as a senile fool and forget what made them great. If this type of depiction has a point, then by all means make a movie of Wilson after his stroke, when his wife ran the country. What about a movie showing FDR ordering the internment of Japanese-American citizens? There could be an exciting thriller about how the Obama administration covered up what happened at Benghazi. Wonder if Meryl Streep is up for playing Hillary Clinton?
- Che
What is the liberal fascination with Che Guevara? As crazy as the left can get, this was a thoroughly evil person. Guevara, the supposed champion of the downtrodden, was a supporter of the Castro regime, which has overseen thousands of political executions. He fomented rebellion and chaos wherever he went, whether in South America or Africa. The best news about Guevara’s life is that he, unlike Perón, Chávez, or Castro, never tasted ultimate power. Had he done so, it would have been a horrific bloodbath.
- Any movie made by Michael Moore
One of the recurring themes of the Conservative Historian is that ideal history is similar to a detective story: an intrepid investigator gathers as many facts as possible and makes the best, most reasonable determination about what happened. Another approach is to come up with a pre-set expectation and opinion, then search high and low for facts that fit this narrative. If those facts do not exist, make them up! Many documentaries, including the aforementioned “An Inconvenient Truth,” take this approach.
There is a third approach: call a work a documentary, fabricate facts to fit a pre-ordained narrative, and then inject dollops of sarcasm to mask the falsities. This is Moore’s guide to filmmaking. In a very clever move, Moore’s breakthrough work, “Roger & Me,” was not straightforward about the economic disruption caused in Michigan by General Motors’ reallocation of resources. Was GM losing market share? Were foreign companies dumping cheap automobiles on the US market? Were CAFE standards causing GM to make money-losing cars? Was GM’s labor force responsive to all of these changes? If one were looking for any of this in “Roger & Me,” the viewer would see none of it. Instead, we have Moore trying to get a meeting with Roger Smith as the foreground to the economic disruption in Flint, MI. At least when Al Gore discusses polar ice caps, he shows a polar bear. That is more fact than one is liable to find in a Moore doc.
With every subsequent “documentary,” Moore moved further away from any semblance of the truth. Were people in Flint adversely affected by GM’s decision? That would be correct. Is Cuba’s healthcare system superior to that of the US, as Moore contended in “Sicko,” his 2007 fantasy about healthcare? Seriously? Any possible connection between a Moore film and historical reality is purely accidental.
- The Birth of a Nation
Is this work a liberal piece? I would think that the left would shriek at the thought, and unlike the other works on this list, this film is not an overt attempt at spreading progressive values. It was made by D.W. Griffith as a piece of propaganda playing on what a group of white Southerners believed was a threat in the Jim Crow South: the possible power of the African American population.
Yet the origins of this era are clouded by liberal historians who wish to change the narrative. It was the Republican Party that freed the slaves, passed the 13th, 14th, and 15th Amendments, opposed Jim Crow laws, and worked with Johnson to pass the Civil Rights Acts in the mid-1960s. It was one of the darlings of the left, Woodrow Wilson, president at the time of this movie’s release, who believed in eugenics and re-segregated the federal civil service. And it was under the conservative president Calvin Coolidge that the power of the Klan broke and lynchings, once numbering in the 50s per annum, were down to 7 in Coolidge’s last year in office.
At the core of conservatism is a rejection of the identity politics that are part of this movie. Whether people are portrayed as victims or criminals, conservatives would prefer to judge them on their individuality and the choices they make rather than on gender, ethnicity, age, or class.
Coronavirus is not World War II
March 2020
On the morning of December 7, 1941, a Sunday, Japanese airplanes attacked the American naval and air base at Pearl Harbor. Most American students know this, or used to before “Intersectional Studies of Gay Women in Rwanda” overtook the curriculum. Some 2,400 Americans were killed, and another 1,000 were wounded. Of the dead, most were young sailors serving aboard battleships such as the U.S.S. Arizona and the U.S.S. Utah. This surprise attack kicked off our entry into World War II.
With its unprecedented isolation of Americans, the involuntary shutdown of businesses, and the use of private enterprise to make up for shortfalls of equipment, the Coronavirus response is being compared to World War II. This quote comes from The Hill website: “MSNBC’s Joe Scarborough said Tuesday that the coronavirus pandemic is more like World War II than the 9/11 terror attacks, with the ‘Morning Joe’ co-host citing ‘high-ranking officials’ who say the crisis ‘could be very bad.’”
Angela Merkel weighed in: “This is serious. And we must take it seriously. There has been no such challenge to our country since German reunification — no, not since the Second World War — that relies so heavily on us all working together in solidarity.”
Politico added, “As the globe confronts the Coronavirus pandemic, one urgent problem is the shortage of critical pieces of equipment, including high-quality masks, test kits and—perhaps most important of all—ventilators. It seems hundreds of thousands of lives might be saved, if only manufacturers could quickly ramp up the production of such equipment, perhaps by a factor of 100 or 1,000, within a few weeks. The United States has done something similar, on a nationwide scale, once before—eight decades ago during the emergency of World War II.” Axios, Fortune Magazine, and Britain’s The Telegraph have all weighed in on the analogy as well. As much as writers need a hook for their stories and politicians grasp for a suitable analogy, World War II is not the same.
418,000 Americans died in World War II, but a disproportionate number of them were young. As Herodotus noted, “During times of peace, the sons bury their fathers, but in war, it is the fathers who send their sons to the grave.” Of the roughly 140 million Americans living in the U.S. in 1945, some 12 million were in the services. Of these, the vast majority were young men. In contrast, the Coronavirus disproportionately affects older patients. According to the Centers for Disease Control and Prevention, “This first preliminary description of outcomes among patients with COVID-19 in the United States indicates that fatality was highest in persons aged ≥85, ranging from 10% to 27%, followed by 3% to 11% among persons aged 65–84 years, 1% to 3% among persons aged 55–64 years, <1% among persons aged 20–54 years, and no fatalities among persons aged ≤19 years.”
The timing of when the United States began offensive operations against the Axis was months in the planning. In the European Theater, the U.S. did not conduct Operation Torch, the invasion of North Africa, until nearly a year had passed after Pearl Harbor. The Battle of Midway was fought six months after Pearl Harbor, and Guadalcanal came two months after that great aircraft carrier battle. The Coronavirus, by contrast, was first noted in Hubei Province, China, in January 2020. As of March 4, three weeks before this writing, there were 158 cases in a nation of 330 million. By March 25, 2020, that figure had shot up to 50,000. The response is expected to be in near real-time.
Then there are the economic differences between World War II and the Coronavirus. The sheer volume of the U.S. effort to conduct the war, in two different theaters, was astronomical. By 1945, the United States had produced 2 million vehicles, of which 108,000 were tanks. Two hundred thousand combat airplanes were constructed, and another 24,000 planes for transport. The U.S. built 1,200 large ships of all kinds and 35,000 landing craft. This massive ramp-up of production was the real reason the U.S. emerged from the Great Depression, not the New Deal program, as is erroneously believed. The response to the Coronavirus, by contrast, of shutting down businesses, cancelling events, quarantining, and social distancing, will have a deleterious effect on employment. The U.S. response to World War II created work. The response to the Coronavirus prevents work.
The Trump administration has been castigated for lack of preparation for the Coronavirus. A few weeks ago, Peggy Noonan, in a harrowing fit of Monday morning quarterbacking, wrote, “After we spoke, something came to mind I’d written months after 9/11. Everyone was asking how we could have missed the signs. I remembered the words of astronaut Frank Borman when he was asked in 1967 why NASA had not been prepared for the catastrophic launchpad fire that killed three astronauts. It was ‘a failure of imagination,’ he said. No one imagined such a thing could happen on the ground. That’s where I think we have been the past few weeks in this epidemic, a failure of imagination.”
Her implication indicts the administration for lack of planning or, as she put it, a failure of imagination. It is not certain what this means. Writers and moviemakers have been imagining alien invasions for over 100 years. Does that mean we prepare for one? And was there ever a greater lack of imagination, or foresight, or preparedness than Pearl Harbor? The Japanese were perceived as a threat going back to World War I. The U.S. possession of the Philippines put the nation astride Japan’s access to the precious resources needed to fuel its ongoing conquest of China. An 8th grader with a map could have predicted the attack, but Roosevelt did not.
Contrast that with the Coronavirus. The U.S. did not need mass quarantine or country-wide closures with the bird flu, H1N1, or the Ebola scare of 2014. As of March 5, there were 158 cases in the U.S., and the outbreak was beginning to abate in China. As of this writing, just three weeks later, there are over 50,000 cases in the U.S. It would have been wonderful if the nation had been better prepared, but such preparation was hardly inevitable. The crush of events came on in a period of a few weeks. In World War II, after Pearl Harbor, the U.S. had several months.
After the initial attack on Pearl Harbor, Japan focused its efforts to the southeast, conquering the Philippines, New Guinea, the Dutch East Indies, Indochina, Malaya, and Singapore. There was no subsequent invasion of the United States homeland, nor were there further attacks on Hawaii. It was not until June of 1942 that the Japanese resumed offensives directly against the U.S. Germany and Italy, our two other opponents in World War II, attacked only through their submarine programs.
The population counts are very different as well. The total population of the world at that time was 2.3 billion. In 2020, the total is over 7 billion. As of this writing, total deaths due to the Coronavirus over the past three months stand at 16,000. American deaths alone over the roughly 30 days of one World War II battle were 19,000, with another 23,000 missing or imprisoned and 63,000 wounded. Let’s be clear: we all sincerely hope that the Coronavirus does not accelerate and become something more akin to the Spanish Flu epidemic of 1918 or, worse, the Black Death of 1347-1349. Instead, this is to put the numbers into some context. None of the people making these pronouncements were alive during World War II. The challenge is that many, including Merkel, do not seem to know anything about the history of that terrible conflict.