Effective Governance and the Executive


Political theory that becomes unmoored from political reality creates a dangerous disconnect—one that undermines the fundamental purpose of governance itself: human flourishing. This core truth has repeatedly emerged throughout American constitutional history. The Articles of Confederation, while theoretically sound in their protection of state sovereignty and checks against centralized power, ultimately failed in practice, proving inadequate for addressing the commercial, security, and unity challenges of the early republic. Similarly, the original constitutional provision for state legislatures to elect senators—a system designed to protect federalism and ensure deliberative selection—eventually collapsed under the weight of practical concerns about corruption, deadlock, and democratic representation.

Rogers’s defense of the unitary executive theory would benefit from greater consideration of its practical fit with the underlying purpose of our constitutional order. While intellectually rigorous and grounded in constitutional text and history, it risks becoming another case of constitutional theory that floats above the turbulent waters of actual governance. The daily evidence of administrative turbulence, limited accountability, and expansive uses of executive power contradicts the neat theoretical constructs that unitary executive proponents advance. Rogers himself seems to acknowledge this disconnect by explicitly limiting his analysis to descriptive rather than normative claims.

However, this separation between constitutional description and normative evaluation creates a fundamental problem. Any enduring analysis of the unitary executive must engage with how this structure functions amid contemporary political dynamics—the intense polarization, the weakening of institutional norms, the collapse of congressional oversight capacity, and the increasing concentration of power in the presidency. A constitutional theory that cannot account for these realities may confine its utility to academic discourse.

The fundamental test of any governance structure must be its capacity to advance human flourishing—to create conditions where citizens can lead secure, meaningful, and self-directed lives. By avoiding engagement with how the unitary executive theory operates within our current political landscape, Rogers sidesteps the most crucial question: Does this constitutional understanding actually serve the interests of the people? Without addressing this question, even the most elegant constitutional theory remains incomplete at best and potentially harmful at worst, perpetuating structures that may undermine the very liberties the Constitution was designed to protect.

The Unitary Executive in Practice

Two key aspects of Rogers’s defense of the unitary executive lack sufficient grounding in the political reality of today. First, he contends that a unitary executive fosters accountability. Second, he contends that this conception of executive power aligns with a robust system of checks and balances. When considered in light of the current political moment, neither rests on sturdy ground. 

On accountability, the kind of accountability Rogers hopes for under a unitary executive does not materialize when subjected to empirical scrutiny of electoral outcomes and political discourse. Ironically, the very concern that Hamilton expressed about a plural executive, which Rogers paraphrases as allowing “blame shifting” and confounding “the ability of voters to hold executives responsible for administration,” has manifested in a different form under the unitary model.


This manifestation takes the form of what might be called a “unitary dividend”—a phenomenon that parallels the concept of a “liar’s dividend” in public discourse. Just as the liar’s dividend allows those with a tendency toward dishonesty to strategically leverage their mendacity by selectively denying that they meant what they said, the “unitary dividend” enables administrations to expand executive authority to its furthest boundaries when politically advantageous, while strategically disclaiming responsibility when those same powers produce unfavorable outcomes. Recent administrations—both Biden’s and Trump’s first and second terms—have repeatedly leveraged this dividend, exercising extraordinary executive authority on certain issues while simultaneously arguing that other governmental actors bear the constitutional responsibility when their initiatives falter. The result is a fog of accountability that leaves the public uncertain where to direct their democratic judgment. Though the Biden administration never explicitly embraced Rogers’s conception of the unitary executive—defined as the “president [having] authority to direct all parts of the executive branch”—its operational approach often reflected precisely this understanding.

The COVID-19 pandemic provides a compelling case study of this unitary dividend in action under the Biden administration. President Biden exercised sweeping executive authority by instructing Anthony Fauci and other officials to implement what many considered an expansive and intrusive federal pandemic response, including the suspension and termination of federal employees who failed to comply with Fauci’s directives. Yet when politically expedient, the administration redirected blame toward various external actors—social media platforms, vaccine-resistant citizens, and perhaps most significantly, Congress. This strategic oscillation between asserting maximal executive authority and deflecting responsibility created precisely the accountability confusion that proponents of the unitary executive theory, including Hamilton, had hoped to prevent.

Similarly, despite asserting significant authority in directing the nation’s foreign policy apparatus, President Trump frequently redirected responsibility toward Congress when confronting unfavorable geopolitical developments. In 2017, congressional actions became the administration’s preferred explanation for heightened tensions with Russia, despite the executive branch’s constitutional primacy in foreign affairs. More recently, Trump has partially attributed ongoing trade volatility with China to Congress’s passage of the 2022 CHIPS and Science Act, creating a narrative of divided responsibility that obscures the administration’s own policy choices. This pattern of selective accountability coexists with the administration’s willingness to exercise far-reaching foreign policy authority that sometimes operates at the boundaries of established legal and political norms.

This historical pattern reveals a misalignment between the theoretical accountability promised by the unitary executive and the lived reality of our democratic system. Rogers, like many constitutional scholars, works from an idealized conception of voter capacity that assumes citizens possess sufficient information, political knowledge, and civic motivation to first identify the responsible governmental actors and then translate that assessment into electoral consequences. The empirical record tells a different story—one where voters consistently support parties that have implemented unpopular or even demonstrably harmful policies, where complex lines of authority remain obscured, and where the theoretical clarity of the unitary executive dissolves into the murky waters of modern media strategy and political messaging.

In short, the promise of democratic accountability through a unitary executive faces significant challenges in today’s political environment. A strong executive with centralized authority may paradoxically dilute accountability by creating a system where responsibility is more difficult to pin down. When executive power is concentrated in a single figure with broad authority and minimal means of accountability outside of the extreme of impeachment, the public may lack an effective means to evaluate the complex chain of decisions and delegations that occur within the executive branch. The tendency of recent administrations to develop and execute sophisticated media strategies may also shift narratives of responsibility away from the administration.

Checks and Balances

This lack of clear accountability invites a broader constitutional concern: How do our traditional checks and balances respond to emerging power vacuums and institutional ambiguity? The system of checks and balances envisioned by Rogers assumes that Congress and the courts act in defense of their respective institutional powers, thereby counteracting the vigorous exercise of executive power that the unitary executive entails. Political reality suggests otherwise. A unitary executive is not necessarily in conflict with a system of checks and balances. A Youngstown Sheet & Tube Co.-type analysis, sorting interbranch dynamics into three buckets, makes this clear. The first bucket involves each of the three branches fully and vigorously using their respective powers. In this case, checks and balances are firmly in place. The second bucket occurs when the executive operates under the unitary executive theory, yet only one other branch takes the equivalent approach. Here, the system of checks and balances may not immediately or adequately prevent an executive from exceeding their authority. But, on the whole, the system works as intended. Finally, in the third bucket, an executive operates to the full bounds of the unitary executive theory while the other two branches evidence deferential tendencies toward the executive. This scenario raises significant concerns as to the viability and sufficiency of the system of checks and balances. We may find ourselves here now, facing a governance landscape in which the legislature hesitates, the courts retreat, and the executive assumes expansive authority over technologies shaping the public sphere.

The current political reality reveals a Congress increasingly controlled by members more loyal to party than to institutional prerogatives. Congressional representatives often demonstrate greater allegiance to a president of their own party than to Congress as an institution, rendering legislative checks ineffective. When public confidence in political institutions erodes, as multiple polling sources indicate is occurring, traditional checks and balances may no longer function as intended.


Similarly, the judicial branch, particularly the Supreme Court, has demonstrated a complex relationship with executive power. While sometimes providing checks on executive overreach, the Court has also shown deference to presidential authority in key areas like national security and immigration. The current 6-3 conservative majority on the Court may further complicate this dynamic, potentially showing greater deference to certain uses of executive power while restricting others based on ideological rather than constitutional considerations.

Though the occurrence of this third bucket may seem like an anomaly, it warrants further and more frank analysis by Rogers.

Effective Governance and the Unitary Executive

The primary blind spot in Rogers’s piece rests with his selection of constitutional principles to defend the unitary executive; he omits effective governance. In my own work on the right to effective governance, I argue that “early Americans shared a belief that the underlying purpose of the government was to advance the well-being of the governed.” That belief informed the decision to abandon the flawed Articles of Confederation and adopt a structure more aligned with the social, economic, and political realities of the day. Rogers fails to explore whether a unitary executive, in practice, has advanced the general welfare. As an aside, effective governance need not and should not have a partisan valence—it merely refers to the capacity of the government to address the essential purposes for having a central authority, such as coordinated international and economic policy, provision of national defense, protection against invasions of fundamental rights, and the like.

Effective governance requires systems that can respond to complex challenges with appropriate speed, expertise, and democratic input. The unitary executive theory, while appealing in its theoretical clarity, may actually impede these goals in practice. As evidenced by recurring administrative failures across administrations of both parties, concentrating decision-making power in a single executive has not demonstrably improved government performance on some of the aforementioned basic aspects of effective governance.

The structural flaws of a unitary executive reveal themselves in systematic governance breakdowns that go beyond individual leadership qualities. When presidential authority dominates the executive branch, we often witness the deterioration of institutional expertise as career officials with specialized knowledge depart amid shifting political winds. The centralization of decision-making authority creates bottlenecks that prevent nimble responses to emerging challenges, as evidenced by the sluggish adaptation to numerous crises over the past two decades. Perhaps most concerning is the policy whiplash that occurs when each new administration reorients entire agencies around presidential priorities rather than enduring public needs—a phenomenon we’ve seen play out dramatically across trade policy, healthcare implementation, and immigration enforcement.

Consider the cascading failures we’ve witnessed in critical infrastructure oversight. The Federal Aviation Administration’s delayed response to alarming air traffic controller shortages exemplifies how centralized control can impair timely action. Despite years of internal warnings from technical experts, political appointees delayed implementing recommended staffing reforms. This pattern repeated itself with the East Palestine train derailment, where regulatory capture and centralized decision-making diluted safety standards that independent regulators had long advocated. These weren’t failures of individual leadership but predictable outcomes of a governance model that subordinates expertise to hierarchical control. More generally, a massive reduction in executive branch staff does not bode well for the ability of the federal government to efficiently and sufficiently respond to threats to public well-being. As Andrew Rudalevige, a political scientist at Bowdoin, relayed to The New York Times:

The damage caused to governmental expertise and simple competence could be long lasting. Firing probationary workers en masse may reduce the government employment headcount, slightly, but it also purged those most likely to bring the freshest view and most up-to-date skills to government service, while souring them on that service.

Similarly, troubling patterns emerge in how executive agencies handle fundamental rights. The due process failures in immigration proceedings stem directly from the executive branch’s ability to reshape entire enforcement apparatuses without meaningful checks. When presidents can unilaterally reconstruct agency priorities—as we’ve seen with abrupt shifts in enforcement targeting, detention policies, and asylum procedures—the consistent application of law gives way to political expediency. These compromises of fundamental rights aren’t bugs but features of a system that concentrates too much authority in a single elected official with incentives that often diverge from safeguarding individual liberties.

What these cases highlight is that the unitary executive theory fails to address a fundamental requirement of democratic governance: institutional capacity to deliver results that serve the public good while protecting core rights. A governance model focused solely on hierarchical control without equal attention to expertise, deliberation, and institutional knowledge may consolidate authority without improving outcomes. Historical experience suggests that effective governance emerges not from concentration of power but from thoughtfully designed systems that balance democratic accountability with professional expertise, rapid response capabilities with careful deliberation, and centralized coordination with distributed implementation.

Conclusion

Rogers’s exploration of the unitary executive makes valuable contributions to our understanding of constitutional design. However, its disconnect from contemporary political reality undermines its practical relevance. A more balanced approach would recognize that effective democratic governance requires not just clear lines of authority, but also robust institutional capacity, appropriate checks on power, and mechanisms to ensure that democratic accountability genuinely functions.



Reimagining College


These are trying times for higher education, and for more than temporary partisan reasons. Future demographic trends will exacerbate declining enrollment numbers. Facing budgetary shortfalls, colleges must cancel burdensome academic and athletic programs. Growing numbers of institutions will fail altogether, inflicting economic tragedy on the local communities that rely on them. There are no painless choices here; there will be losers in this readjustment process. But two new books present impending crises, new technology, and shifting consumer demand as opportunities for innovative reform.

Their provocative titles notwithstanding, Richard K. Vedder’s Let Colleges Fail: The Power of Creative Destruction in Higher Education and Kathleen deLaski’s Who Needs College Anymore? Imagining a Future Where Degrees Won’t Matter both articulate largely optimistic visions for higher education’s long-overdue course correction. Neither work will appeal deeply to readers who revere America’s traditional liberal arts undergraduate curriculum. Yet thoughtful and generous reading of both suggests creative and perhaps necessary means to harmonize new realities with an ancient heritage.

Russell Kirk once professed certainty that “if all schools, colleges, and universities were abolished tomorrow,” the young would nevertheless “find lucrative employment, and means would exist, or be developed, of training them for … work.” According to Kirk, the college exists not for vocational training per se, but for “liberal education,” which “defends order against disorder” by “cultivation of the person’s own intellect and imagination.” If pursued in this spirit, such education conduces to “order in the republic.” But much as lab-produced substitutes are an ineffectual mockery of real food, liberal education’s ancillary blessings cannot be reductively pursued as ends in themselves.

Even two generations ago, few Americans concurred with Kirk’s noble ideal. Today, six decades after Clark Kerr coined the term “modern multiversity” for their often conflicting array of interests, research universities are less coherent than ever. Kerr famously quipped at a meeting of Cal Berkeley’s faculty in 1957 that the major administrative challenges to the university were to “provide parking for the faculty, sex for the students, and athletics for the alumni.” For the typical large “R1,” this may be as complete a mission statement as possible. It is certainly the most honest. Even at many putative liberal arts colleges, the increasingly vocational focus of undergraduate education contributes to this confusion as both cause and effect. The three most common reasons given for college attendance in a recent New America survey—all with response rates above ninety percent—were “improve employment opportunities,” “make more money,” and “get a good job.” This is the consensus understanding of higher education’s purpose. But it may have been so for longer than we imagine.

Clayton Sedwick Cooper wrote Why Go To College? in 1912, spawning a subgenre now large enough to fill a library by itself. Cooper recounts the scene at an Ivy League graduation of a couple whose “homely” clothes, “deeply lined” faces, and “hard, calloused hands” identified them as farmers, watching with pride as their son led in the senior class. He imagined them “dedicating their lives to the task of giving [this] boy the advantages … they must have felt would separate him forever from their humble life.” Such scenes are a commonplace of American life. Attend any college’s commencement exercises; the families cheering loudest when their student’s name is read will be the spiritual descendants of those Yankee agrarians. Kirk’s noble admonition notwithstanding, America’s colleges have always served partly as entryways to its professional class. That is an essential function in a socially diverse, egalitarian republic. At any rate, people understand it as such—and in a democratic society, the people will have their way. Liberal education and vocational training must coexist somehow.


Richard Vedder, emeritus professor of economic history at Ohio University, has long been among higher education’s foremost conservative critics. He is not, though, a wanton agent of chaos. Let Colleges Fail is less a celebratory paean to higher education’s imminent disintegration than a call for its renewal. Vedder does consider the possible value of “creative destruction” in higher education, noting that publicly subsidized colleges and universities “lack strong incentives to improve outcomes,” reduce overheads, or cut prices. “Though we may mourn the loss of [individual] schools,” he writes, “we should accept and even rejoice in more closing in the years ahead as resources shift away from” failing institutions toward “educationally stronger ones.” In this vein, the book retreads some familiar ground on higher education’s excesses, abuses, and inefficiencies, though often with characteristically insightful data analysis.

Vedder suggests many reforms, ranging in scale and consequence from minor and benign to the most sweepingly ambitious. “Reform efforts must … reduce market ignorance in higher education,” he writes. Other merchants, such as “big-box stores,” do not advertise one price, then charge each customer unique and undisclosed discounted prices—why should colleges be permitted to do so? Such commonsense proposals would meet little popular resistance, at least in principle. More controversial, perhaps, would be his scheme for voucher-style tuition aid, “converting subsidies given to schools to payments made to students directly.” Then there are Vedder’s most original suggestions, such as halving tuition costs by moving the academic year to three fifteen-week semesters, eliminating summer vacation, and condensing the bachelor’s degree into three years. In this plan, faculty base pay is increased, but large lecture sections are tripled in size, the number of tenured instructors is reduced, and faculty pay is moved to a sliding scale pegged to student enrollment in their courses. Pray for the poor dean who is tasked with presenting this plan at the next faculty meeting!

Vedder’s timeliest proposals are for the restructuring of research universities. Private companies that grow unwieldy, he writes, “are constantly spinning off operations that do not fit well with their core activities—shouldn’t universities do the same?” Do teaching hospitals, vocational schools, advanced research labs, and professional football teams still belong under the same institutional umbrella? Could inefficient, high-cost dormitories and dining halls be replaced with private boarding houses or similar free-market arrangements? Can independent laboratories not turn research grants into knowledge as well or better without campus bureaucracy? Wherever possible, Vedder urges institutions to shed distracting encumbrances to their core purpose of educating students.

Kathleen deLaski’s book, though, suggests that even this foundational mission may undergo revolutionary transformation in the near future. Who Needs College Anymore? explores the possibilities for education at the dawn of what she terms the “skills-first age,” in which the bachelor’s degree will no longer serve as the primary signifier of employability. Her book is a surprisingly engaging tour of the present state and likely direction of “the alternative credentials market.” Central to deLaski’s narrative is the “micro-credential,” a trendy catch-all term for industry-certified short-term training programs. Many, such as intensive “bootcamps” to learn software coding languages, offer direct, non-degree paths into remunerative careers. But such credentials and the traditional campus are not exclusive models. Some institutions—the 250,000-student University of Texas system, for example—subcontract third-party providers to give students access to thousands of skills-based credential courses alongside their degree curriculum.

Two institutions that embedded “alternative credential” programs within their curriculum at their inception offer particularly illustrative models. Chartered in 1997, Western Governors University pioneered “competency-based” curricula, in which students “move through [self-guided] online course material” without any real-time instruction, then take assessments “to demonstrate mastery.” Chartered in 1912, Northeastern University in Boston was an even earlier pioneer. From the beginning, its undergraduate curriculum has required completion of a months-long off-campus work experience placement. Students are prepared for professional work environments with a mandatory general education course covering resume curation, interview etiquette, and the like. In both examples, once-uniquely innovative ideas are now commonplace in higher education. All major accreditors permit credit-bearing off-campus apprenticeships and competency-based curricula. In the same way, deLaski believes, “as less expensive alternate pathways become clearer and surer,” traditional bachelor’s degrees “will seem impractical for a new majority of learners.” But she asks, “Why does the degree have to be the only product colleges sell?” Campuses offer many advantages that could help savvy institutions adjust to a changing education landscape. deLaski suggests various means for higher-education institutions to offer non-degree credentials within, alongside, or as alternatives to their existing programs. In short, “the degree may be in trouble, but colleges can survive.”


This would be cold comfort to Russell Kirk. But consumer-driven schemes for vocational training may not be incompatible with a liberal education intended, as Kirk wrote, to cultivate “the person’s intellect and [moral] imagination, for the person’s own sake.” A few institutions already offer suggestive examples for combining the two. Affectionately known as “Hard Work U,” Missouri’s College of the Ozarks runs a work-study model that enables every student to gain in-house job skills and graduate debt-free. Its robust general education curriculum includes required two-part course sequences in Christian Worldview, American history and civics, and Western Civilization. Another intriguing example is LeTourneau University in Longview, Texas—a private, religious, four-year vocational college. Typical major programs include various branches of engineering, computer science, nursing, and business. But since 2015, LeTourneau’s Honors College has offered an excellent slate of liberal arts courses. Roughly five percent of students complete the full nineteen-credit concentration, but a greater number take a few honors courses as electives.

These are rare and modest examples. But if technological change and consumer demand augur seismic change for higher education, they may hold out hope to those who cherish the old liberal education. Market forces, legislatures, or both may, as Vedder recommends, require large public universities to reorganize, shedding non-core functions and renewing their focus on undergraduate education. We should hope so. Selective liberal arts colleges may continue more or less unchanged. But what of non-elite smaller institutions lacking the mysterious appeal of “prestige” or the security of large endowments? Imagine a struggling four-year, private institution with low admissions standards, reliant on athletics and vocational majors to drive recruitment. In a world of readily accessible, rapidly adaptive short-term credential programs, why enroll in a four-year vocational degree whose curriculum is updated rarely and belatedly? Such programs may appear increasingly cumbersome and costly, chiefly benefiting their tenured faculty. Suppose this college abandoned the bachelor’s degree and replaced its numerous putatively pre-professional and vocational major programs with a single liberal arts associate degree. Imagine a three-year program, the first two years devoted primarily to a “great books” curriculum alongside some foundational vocational training and summer internship options. In year three, the focus shifts primarily to job-specific training gained through the latest micro-credential courses, perhaps taken online or through intensive “bootcamps” off-campus. Students would receive guidance from a corps of counselors with up-to-date training in the “alternate credentials market”—a much expanded role for the “drop-in” career centers presently an afterthought on many campuses. In less time and with lower cost than the current bachelor’s degree, graduates of such a college might attain a “skills-based, job-ready” resume while also forming their minds and imaginations in a college-level liberal arts core curriculum.

This may be a fanciful hope. But if Kathleen deLaski is correct, new technology and probable consumer demands will permit such ambitious reimagination of college education very soon. How many administrators have the vision and courage to try such things? Richard Vedder suggests they may have no choice.



What Economists Understand About Tariffs


I have tremendous respect for Gene Callahan. His writing, especially on economics, has received well-deserved praise, and rightly so. In a recent essay at Modern Age, however, he joins the chorus of people accusing economists of just not understanding something when it comes to tariffs and the discussion surrounding them. He begins with an analogy of someone who, after being offended, is considering giving the offender “a good punch in the nose.” The man asks his friend, who happens to be a physicist, for moral advice. The friend instead describes the kinetics involved in a fist colliding with a face based on “the masses and velocities involved.” The physicist has obviously answered a question, but not the question that was asked.

The implication here is that economists speaking about tariffs are much like the physicist: we give precise, rationalistic answers to the outcomes of tariffs and then move from those answers to proffering advice to the practical question of “what should we do?” Callahan doesn’t advocate specific policies or endorse President Trump’s trade actions, but instead he urges economists to adopt a more open, civil approach to discussing tariffs. Fair enough—civil discourse is valuable and some economists, myself included, have occasionally been less than diplomatic on the topic. 

By chiding economists for being too rationalistic in our approach, however, he overlooks the simple fact that the same can be said about tariff proponents: they believe 1) that they can identify substantive social outcomes that “we” want, such as certain types of jobs in certain locations (e.g., manufacturing jobs in the Rust Belt), national economic “independence,” or increased tax revenues, 2) that they alone possess the knowledge of which policy buttons to push to bring about these assertedly desired outcomes, and 3) that the sequence of policy changes they advocate will bring about those outcomes. In doing so, they present their own rationalistic answers and similarly move from those to answering “what should we do?”


So in truth, we do not have the hyper-rationalist economists telling people what they should do versus cosmopolitan thinkers pondering bigger, moral questions. Instead, what we see are economists questioning each of the three points above, with an admittedly particular emphasis on the question of whether the policies will bring about the results their advocates claim.

Still, I will concede Callahan’s broad point that economists qua economists should cease proffering answers to normative questions, just as Nobel Laureate James Buchanan argued. We should not, however, allow the growth of the folk economics that Paul Rubin warned us about and that Frederic Bastiat and Henry Hazlitt provided antidotes against. In that respect, economists have a unique and necessary role to provide a “prophylactic to popular fallacies.”

Tariffs: Means, Not Ends

Economists excel at analyzing means to achieve given ends, though the economist qua economist is incapable of judging the ends themselves. With respect to tariffs, the Trump Administration has put forth five ends they contend tariffs will achieve: bolstering national security, raising revenue, repatriating jobs, reshaping supply chains, and negotiating better trade deals. With these ends in mind, economists can analyze their efficacy.

First, tariffs are often justified as protecting national security. This is especially true when they are unilaterally enacted by a president, even though recent court cases have rendered this less certain than it was a few weeks ago. No serious economist denies the possibility that tariffs and other forms of trade restrictions can enhance national security. Even staunch free-trade advocates such as Don Boudreaux acknowledge this possibility. Economists do raise two concerns that non-economists often overlook, however. First, reducing trade barriers may actually strengthen national security more than raising them. Second, the national security argument may be abused. Are foreign films or Apple’s overseas investments really threats warranting tariffs?

The right question isn’t whether tariffs in general can promote security but whether they do in particular circumstances—a question economists are well-equipped to analyze. For example, decades of protectionist policies—special tax abatements, protective tariffs, and mandatory purchasing agreements—have done little to save the domestic steel industry, with US Steel and Cleveland-Cliffs, among other steel companies, reporting loss after loss each year. At some point, the problem with the domestic steel market is not the supposed scourge of foreign competition.

Second, tariffs are proffered as a means of generating revenue. Recently, Trump and advisors such as Pete Navarro have claimed that tariffs could replace income taxes or generate $600-700 billion annually. Revenue collected through tariffs falls squarely within the realm of economists, namely, that of public finance economists. While calculating tax revenue is a seemingly straightforward task (simply multiply the effective tax per unit by the number of units sold), the reality is more nuanced. Because tariffs are taxes, both the buyers and the sellers will bear some of the burden of the tax. This means that the price that buyers pay for this good will increase by at least some amount because of the tariff. How much it increases and the subsequent decrease in the amount purchased will depend on decidedly economic factors.
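To make the incidence point concrete, here is a minimal sketch in Python. The linear demand and export-supply curves, and every number in it, are hypothetical, invented purely for illustration; the point is only that a tariff raises the buyer price by part of the tax, lowers what sellers receive by the rest, shrinks the quantity purchased, and therefore collects less revenue than the pre-tariff sales volume would suggest.

```python
# Hypothetical linear curves; all numbers are made up for illustration.
a, b = 100.0, 1.0   # inverse demand: price buyers pay = a - b*Q
c, d = 20.0, 1.0    # inverse export supply: price sellers receive = c + d*Q
t = 10.0            # per-unit tariff

# Pre-tariff equilibrium: a - b*Q = c + d*Q
q0 = (a - c) / (b + d)        # 40 units sold
p0 = a - b * q0               # market price of 60
naive_revenue = t * q0        # "tax per unit times units sold" at the old volume

# With the tariff, sellers receive the buyer price minus t: a - b*Q = c + d*Q + t
q1 = (a - c - t) / (b + d)    # 35 units: purchases fall once the tax bites
p_buyer = a - b * q1          # buyers now pay 65
p_seller = p_buyer - t        # sellers now keep 55
actual_revenue = t * q1

print(f"buyers absorb {p_buyer - p0:.0f} of the tariff; sellers absorb {p0 - p_seller:.0f}")
print(f"naive revenue estimate: {naive_revenue:.0f}; actual revenue: {actual_revenue:.0f}")
```

With these invented slopes, the burden splits evenly and revenue comes in 12.5 percent below the naive estimate; how the burden splits, and how far revenue falls short, depends on the relative elasticities of demand and supply, which is precisely the “decidedly economic factors” point.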

Additionally, there is the further complication of determining how much of the economic incidence will be paid by the sellers and how much will be paid by the buyers. President Trump has famously (and repeatedly) claimed that it is the foreign country that will pay all the tariffs. He said this with respect to building the wall on America’s southern border during his first term and again with the tariffs he’s imposed during his second term. But more recently, Trump admonished Walmart on social media and told them to “eat the tariff.” And yet if tariffs were entirely paid by foreign entities, then there should be nothing for Walmart to “eat” here.

Third, the promise that tariffs can “bring jobs back” ignores key facts: US manufacturing output is near record highs; most of the jobs “lost” in manufacturing were lost to productivity gains, not offshoring; few people want to do the manufacturing jobs themselves; and plenty of manufacturing jobs sit open at present. None of this sways those who fetishize the idea that we need to “bring back” manufacturing jobs. The “New Right” champions tariffs to revive these jobs, but their arguments recycle failed protectionist ideas. Economists focus on trade-offs: tariffs may protect some jobs but raise costs for other industries, costing them jobs. The net effect on jobs is, for reasons economists particularly understand, negative.

Fourth, tariffs are also pitched as a means of diversifying supply chains and reducing our reliance on countries like China, a concern that was starkly brought to light during the pandemic’s supply shortages. But to assert that tariffs will simply work, with no relevant unintended consequences, is to deny history. From the tire tariffs the US placed on China in 2009, for example, we can plainly see that the tire industries in Taiwan and Mexico were able to grow and mature into the sectors of their economies they occupy today. This sounds like a win. However, because of the higher prices for tires, trucking companies in particular shifted to using retreaded tires. Just how many additional accidents this will cause on the highway, as trucks experience more complete tire blowouts with retreaded tires than with brand-new tires, is an empirical question. Its link to the tire tariffs is not.

Finally, we come to the consideration of tariffs as a negotiating tactic. Here, we can turn to Adam Smith, who provided perhaps the most cogent defense of this strategy in his Wealth of Nations. To summarize, Smith points out that tariffs and other trade restrictions (or the threat thereof) can be used to convince other nations to lower their barriers against us. However, they should not be a permanent state of affairs, and they should only be used in situations where they are likely to work. Trump goes far beyond this and, in doing so, is actively pushing other nations away, which lowers tariffs’ overall effectiveness as a negotiating tool.

Economists and the Good Life

In his Modern Age piece, Callahan argues economists miss the value of contentment over endless consumption. Yet economists understand trade-offs in personal choices, too. Many, including myself, choose lower-paying academic jobs over lucrative private-sector roles because we value the sense of meaning we achieve through our work. Furthermore, many of us will retire at some point; surely, the quest for “more and more” is not served by ceasing to earn income. Some, like the excellent Walter Williams, joke that “if I should ever die, I want to have taught that day.” But in doing so, people like Williams demonstrate not their consumerist desires, but their great love for the craft of teaching.

Likewise, when Callahan criticizes Mike Munger for describing wealth as “the ability to obtain high quality, low cost products,” he again misses the mark. The implication is not that all people should always and everywhere zealously pursue maximizing their ability to buy more stuff. Munger is making the simple claim that, all else being equal, a person is wealthier when they can purchase more things. If we want people to have more access to the things that allow them to live healthier and wealthier (however they choose to define those terms), then we should eschew policies that make that more difficult. This is especially true of tariffs, which are widely recognized as being regressive in their application, even by members of the New Right such as Michael Lind.


The unfortunate reality is that tariffs, which are intended to help the poor by bringing back low-skill jobs and boosting the wages of low-skill workers, often have the exact opposite effect and enrich the already-rich at the direct expense of the poor. Tariffs open the door for cronyism, one of the most pernicious forms of the transference of wealth from the poor to the rich. Autarkist utopianism may be well-intentioned, but it is just as misguided as socialist utopianism. The facts do not support the dreams of tariff proponents, and economists have both the ability and duty to speak up.

Callahan’s call for economists to engage other disciplines is certainly reasonable, but even fostering an interdisciplinary spirit can never justify bad trade policy. Economists understand tariffs’ mechanics and historical failures better than most. Trump’s “Tariff Man” rhetoric paints them as a cure-all, but economic analysis reveals their true limits. Great economists such as Mises and Hayek warned our field against narrowness, and we should heed their advice. But on tariffs, our skepticism is grounded in evidence and expertise. Far from misunderstanding tariffs, economists are uniquely positioned to clarify their costs and benefits for a public often swayed by political promises.

Trade policy does need to be guided by more than knowledge of the past or theoretical understandings. It requires prudence, good judgment, and discernment. Economists can (and should) contribute to these discussions owing to our specialized knowledge of how markets work and how they respond to policy changes. With the specific ends in mind that the Trump administration has espoused, the reality is that protectionism failed in the past, is failing us now, and, because of the very forces that economists readily understand, will fail us in the future.



Can We Restore American Print Journalism?


Can print newspapers be saved? The total estimated print circulation of US newspapers peaked in the early 1990s, and has been declining ever since. Circulation is now below what it was before World War II.

Many celebrate the demise of the newspaper industry, especially given that digitally-based news (including via social media) provides such a low-cost, diverse alternative to an industry whose reputation has been shattered by the almost comical narrowness of its longstanding liberal bias. Yet as much as I share that frustration with so-obviously partisan reporters claiming to be the disinterested guardians of American democracy, something integral to the health of our republic has been lost as print newspapers arriving daily on the doorstep become an endangered species. As Alexis de Tocqueville declared in Democracy in America about print media and the American people: “The newspaper brought them together, and the newspaper is still necessary to keep them united.” Without newspapers, will democracy, as we are warned, “die in darkness”?

As a native Northern Virginian, I grew up reading the Washington Post in print—as its old (less savior-complex-sounding) slogan went: “If you don’t get it, you don’t get it.” My grade-school interest in the comics and sports sections—especially learning to decipher baseball box scores—in time matured into reading most of the newspaper every morning, a tradition I have maintained even while most other members of my millennial generation have dispensed with print media (and their expensive subscriptions). Though I am frustrated daily by the editorial decision-making of even the Post’s news desk, in fifteen minutes or less, I can gain a remarkably broad knowledge of the day’s international, national, and local news.

Much of the benefit becomes apparent when print is compared to the digital media that have largely replaced it. Unlike many of their digital brethren (though thankfully not this one), print stories and op-eds aren’t broken up every couple of paragraphs with distracting advertisements or eye candy. Moreover, whereas the newspaper is a single thing unto itself, one’s smartphone or laptop presents a constant, difficult-to-resist temptation toward the infinite scroll of social media or following countless Wikipedia threads or curious Google searches, a temptation that persists even when reading excellent online long-form journalism.

Though it’s true that podcasts, social media, and Substack blogs offer a more intellectually diverse information ecosphere than the staid predictability of the Washington Post or The New York Times, research indicates that consuming information via print results in far better comprehension. And to get one’s news primarily from social media (as many Americans now do) risks becoming overly-reliant on an often simplistic, emotive, and stove-piped form of discourse that is inimical to the kind of extended, thoughtful debate required for republican government. In our increasingly anonymous and atomized world, digital partisan echo chambers reduce our exposure to people (and ideas) different from our own, even though such persons may literally live next door to us.

As Tocqueville observed, the influence of the press in early American life was immense. “It is the power which impels the circulation of political life through all the districts of that vast territory. Its eye is constantly open to detect the secret springs of political designs, and to summon the leaders of all parties to the bar of public opinion,” wrote the French aristocrat. Even when newspapers were demonstrably partisan, as they typically were until journalistic objectivity became part of the professional brand beginning in the 1890s, their role was to both inform readers and bring them into conversation with their political representatives.

Despite his disdain for American journalists who could be “crude” and “artless,” Tocqueville recognized that a vibrant, independent press served as an important “ingredient of liberty” and even sustainer of civilization, because it was not beholden to elite bases of power. “In America there is scarcely a hamlet which has not its own newspaper.” That, of course, is no longer true—the Washington Post and Los Angeles Times are owned by billionaires, while the largest print dailies represent little more than the opinions of the “American aristocracy.” Over the last generation, many modestly profitable family newspapers have sold out to larger, publicly-traded corporations in the search for higher quarterly revenues, destroying many multi-generational local and municipal newspapers.


Robert D. Putnam famously catalogued the effects of this crisis in his now classic Bowling Alone. His research found that Americans who regularly read newspapers were more knowledgeable about current events, had higher membership and participation rates in local civic associations, volunteered more frequently, had higher voter turnout rates, and trusted their neighbors more than Americans whose sole source of news was television. Since then, further research has linked the closure of newspapers to declines in civic engagement, increases in government waste, and more intense political polarization. As local news dies off, Americans pay more attention to national politics, reducing competition in local campaigns.

American journalists are well aware of (and bemoan) this frightening trend and its effect on the health of our polity. Yet attempts to restore record-low trust in the fourth estate have been risible. After the November election, Washington Post owner Jeff Bezos announced that the paper’s opinion section would be “writing every day in support and defense of two pillars: personal liberties and free markets,” as if valorizing these principles would somehow persuade millions of disenchanted Americans of the Post’s editorial integrity. Since then, though, there has been little appreciable change in how the newspaper covers the issues of the day.

That every section of the prominent American dailies features content with a scarcely-hidden liberal bias suggests that the ideological rot of our nation’s media industry runs deep indeed, beginning with the training reporters receive in journalism schools. Journalists working even the domestic news or sports desks have been trained to look for stories that will further particular narratives about race, sex, gender, social class, and religion. One looks in vain, for example, for straight news stories from the Post or the Times sympathetic to religious conservatives.

Nevertheless, I know I am not the only one who yearns for a newspaper delivered daily to my driveway, one I may not always agree with, but which provides a significant chunk of information required to be an informed citizen in a free republic. The popularity of Walter Kirn’s County Highway, a 20-page broadsheet published six times per year, born during COVID lockdowns and serving as a “hand-made alternative to the undifferentiated blob of electronic ‘content’ that you scroll through every morning,” suggests I am not alone. Yet as impressive as that broadsheet may be, its infrequency can only make it a competitor with magazines such as The Atlantic or Harper’s rather than a replacement for outlets that inform us daily.

Admittedly, it would require an incredible amount of creativity (and financing) to replace America’s dying print media industry, including both local outlets that report the news citizens require to make informed decisions on local politics and publications that can appeal to a broad cross-section of our diverse nation. The Wall Street Journal is too urbane and elite to fill that role; the New York Post, despite some periodic strong reporting and opining, remains too much of a tabloid (and, like the Journal, is too focused on one part of the country).

It’s possible that those colleges and universities with a strong sense of the founding principles of our nation, such as Hillsdale College, which currently offers a minor in journalism, could offer journalism majors with the hope of restoring a profession that, since our nation’s founding, has been integral to republican self-government. It’s difficult, however, to imagine this being done at a sufficient scale to combat or replace legacy print media. Nor does it seem likely that investors could be persuaded to corral enough conservative journalistic talent to form competitors and replacements to dying papers.

Perhaps, then, what is most realistic for now is for Americans to fashion personal, bespoke means of consuming information that, as much as possible, incorporate the traditional benefits of paper-reading while resisting the worst tendencies of the digital age. One such way is to subscribe to actual print publications which, though they are not dailies (or even weeklies), still provide a means of reading long-form journalism and essays without the distractions of the Internet. Another is to be intentional about reading local online sources, whether it be the “local” section of larger papers, or the websites of those smaller local papers that remain. Setting careful guardrails around our consumption of social media, as Clement Harrold recently urged at First Things, would also be beneficial to the health of our republic.

There are even bipartisan legislative proposals to provide tax credits for those who subscribe to local news sources as a means of supporting outlets not as beholden to the partisanship of corporate legacy media. None of these, I acknowledge, is a particularly satisfactory replacement for what has been lost, given the decline and increasing irrelevance of print journalism.

For now, I remain one of an increasingly small percentage of Americans who still shell out hundreds of dollars a year for a print subscription to my hometown paper, willing to put up with its pervasive bias, in part, out of a (perhaps naive) desire of teaching my children how to read the news in print and a nostalgic pleasure in reading the comics and sports with them. The framers of our Constitution so valued the press that they included its protection in the First Amendment. If America is to uphold a tradition that predates our very founding—and avoid a future world of news that is entirely screen-dependent—we have our work cut out for us.



The Myth of Victimization


History will harshly judge the United States’ prolonged vacillation over whether to honor the Fourteenth Amendment’s command of color-blindness by government actors in the wake of Brown v. Board of Education, with the Supreme Court earning much of the blame. Brown vindicated Justice Harlan’s lonely dissent in Plessy v. Ferguson (1896), which proclaimed that “our Constitution is color-blind.” Yet, it took the Justices 45 years, from Bakke in 1978 to SFFA v. Harvard in 2023, to reject the erroneous notion that racial discrimination in pursuit of “diversity” is acceptable under the equal protection clause of the Fourteenth Amendment and federal civil rights laws. Granting “preferences” to favored racial groups is invidious discrimination—and therefore unconstitutional.

The hand-wringing and indecision reflected in Bakke, Grutter, Fisher I, and Fisher II are an embarrassment to the High Court, which finally reached the right result in the Harvard case. The Supreme Court’s earlier endorsement of the dubious “disparate impact” theory (concocted out of whole cloth by the EEOC in a clear misreading of Title VII of the Civil Rights Act of 1964) in Griggs v. Duke Power Co. (1971) remains uncorrected. Statistical imbalances are not the same as intentional discrimination, and it is ludicrous to suggest otherwise. Title VII explicitly declined to impose racial quotas. Section 703(j) of Title VII specifically states that employers are not required to grant preferential treatment to any individual to correct statistical imbalances in the workforce, and Democrat Hubert Humphrey, the Senate floor leader for Title VII, famously promised to eat the pages from the statute if Title VII were shown to authorize preferential treatment for any group. Like many promises made in Washington, DC, this one was never carried out.

Wall Street Journal columnist Jason L. Riley’s excellent new book, The Affirmative Action Myth, is a thorough and balanced post-mortem of the Court’s bungled jurisprudence. The Affirmative Action Myth is really two books in one, as evidenced by the subtitle Why Blacks Don’t Need Racial Preferences to Succeed, because Riley argues that “affirmative action” (which he correctly points out is “synonymous with racial favoritism”) and other progressive innovations created in the name of “civil rights” have actually harmed blacks more than helped them. Decades of black upward mobility were upended by quota-driven affirmative action and the welfare programs that have proliferated since the 1970s.

Thus, racial preferences are both improper and unnecessary. “The main purpose of this book,” Riley states, “is to explain how affirmative action has failed.” Riley deftly weaves together an accessible account of the Court’s tortuous decision-making and the failure of racial preferences to improve the status of affirmative action’s intended beneficiaries. He even digresses briefly into the Court’s dreadful busing decisions, using Lino Graglia’s aptly-titled book Disaster by Decree as a guide. As Riley proves, liberal largesse has made things worse for the black community. The Court’s affirmative action jurisprudence was infected by the same ideological flaw that has hijacked the civil rights movement in America. He cogently explains that the continued advancement of the black community in America will be hampered until that error is recognized and rejected.

Like many aspects of the Great Society’s social engineering programs, and their successors, affirmative action was well-intended. Beneficent motives, however, do not assure good results; as the saying goes, the road to hell is paved with good intentions. Affirmative action and other civil rights policies have been disastrous for blacks. The welfare state undermined the black middle class by encouraging fathers to abandon their families and incentivizing black women to have children out of wedlock, resulting in the proliferation of single-parent households headed by women. Family instability and, in particular, fatherless homes are strongly correlated with violent crime rates and other social pathologies.

As Riley demonstrates, all of this was a reversal of positive economic and educational trends experienced by black people between 1940 and 1960. Black Americans were making remarkable gains even under “peak Jim Crow.” Riley doesn’t claim that racism didn’t (or doesn’t) exist, only that the legacy of LBJ’s welfare programs and the advent of affirmative action have had detrimental effects on blacks that are often overlooked in favor of continued reliance on tired canards such as “systemic racism” and the presumed debility among blacks caused by the institution of slavery (which ended 160 years ago).

Riley does not deny the existence of racism in America, but he insists that it doesn’t explain the disparities within the black community.

As it is practiced today, “civil rights” is an industry in which many activists, scholars, bureaucrats, journalists, and organizations have a vested interest in perpetuating the myth of black victimization and helplessness. Riley argues (with extensive supporting footnotes) that “blacks have made faster progress when color blindness has been the policy objective.” Allowing equal treatment to be replaced by a regime of “oppression pedagogy” and identity politics, Riley suggests, is “one of our greatest tragedies.” Racial preferences “have been a hindrance rather than a boon for blacks,” he contends.

Riley makes a persuasive case. He reprises the work done by scholars such as Thomas Sowell, Walter Williams, Robert Woodson, Shelby Steele, John McWhorter, and Wilfred Reilly; as he notes, much of the research on this topic by center-right figures tends to be done by black academics, possibly due to white scholars’ well-founded fear of repercussions. (If you doubt this, recall the pariah treatment accorded Charles Murray, Amy Wax, Ilya Shapiro, and others who refused to genuflect to the prevailing orthodoxy.) Riley also draws upon the work of Stephan and Abigail Thernstrom, Richard Sander and Stuart Taylor Jr., and many others. Readers may be familiar with some of this work, but Riley usefully summarizes it and supplements it with census data, lesser-known academic studies, and historical and biographical profiles such as Hidden Figures, the book and movie about pioneering black mathematicians who helped NASA’s space program in the 1960s.

Riley devotes a compelling chapter to debunking the supposed efficacy of affirmative action, but his critique is not limited to the harmful consequences of racial preferences in higher education—in the form of the “mismatch” phenomenon and otherwise. Riley contends, “One tragic legacy of the affirmative action era is that the number of black college graduates is almost certainly lower today than it would have been without racial preferences that mismatch students with schools for diversity purposes.” Riley nimbly tackles the whole array of liberal shibboleths on race: critical race theory (and its leading proponents), reparations, the false narrative of the 1619 Project, “mass incarceration,” DEI, redlining, and more.

What these topics have in common is the premise—one that Riley debunks as a myth—that blacks are helpless victims of an oppressively racist system and cannot improve their status without special preferences and favored treatment. A more descriptive (but less catchy) title for the book would be “The Racial Victimization Myth,” because Riley explores the many facets of the false race narrative peddled by the Left.

The patronizing paternalism of the prevailing narrative harms blacks, Riley argues, because it instills a mentality of victimhood that fuels grievance and undermines effort and personal responsibility on the part of blacks. If the game is rigged due to “white supremacy,” and if “systemic racism” determines one’s fate, why bother to work hard, exercise self-restraint, adopt good habits (in the form of so-called “bourgeois values”), and so forth? This theme resonates powerfully throughout the book. History teaches that the path to upward mobility for minorities is assimilation into the mainstream culture, and the rigors and discipline of competition—the bedrock of meritocracy—foster a culture of striving instead of excuses, resentment, and despair. Treating black people as helpless victims discourages them from devoting themselves to achieving success through self-improvement.

One of Riley’s most effective rhetorical devices is the juxtaposition of attitudes about black self-reliance and personal responsibility from earlier eras and the contrived ideology of “anti-racism,” as exemplified by the writings of Ibram X. Kendi, Ta-Nehisi Coates, and Robin DiAngelo. Proponents of critical race theory (which Riley says “amounts to little more than a fancy justification for racial favoritism”) embrace what Riley calls “racial essentialism”: the notion that “anti-black bias in America is systemic … and must be eliminated root and branch before any significant narrowing of racial disparities can take place.” Not only is this premise contradicted by the well-documented progress blacks made in the first two-thirds of the twentieth century, even under Jim Crow, it is also contrary to the sentiments expressed by early civil rights leaders such as Booker T. Washington, W. E. B. Du Bois, Martin Luther King Jr., and even Malcolm X, all of whom emphasized the importance of self-reliance, hard work, and individual responsibility as indispensable to upward mobility. Riley notes that “Coates is waiting on white people to rescue black people. [Frederick] Douglass understood that black people must save themselves.”

Advocates of this concept, once dubbed the “politics of respectability,” called upon black Americans to adopt constructive manners, morals, and attitudes to achieve social and economic advancement, even in the face of discrimination. Other minority ethnic and racial groups, including Irish, Chinese, Japanese, and Jewish immigrants, overcame prejudice in America by adopting productive cultural habits—assimilation, in other words. Riley points out that this approach is anathema to modern-day civil rights activists, who scorn respectability politics as “ineffective and a waste of time. Studious black youngsters and other black people who adopt middle-class speech, dress, and behavior are accused of racial betrayal, or ‘acting white.’”

Critical race theory holds that “racism is mainly if not entirely to blame for black-white gaps in everything from income to incarceration to standardized test scores.” Riley strongly disagrees. Absolving blacks of any responsibility for improving their status in American society conveniently blames “white supremacy” for every aspect of dysfunctional black culture and encourages blacks to be dependent on favors bestowed by the welfare state. Riley states that affirmative action creates the “impression that black people are charity cases dependent on government programs.” Opponents of affirmative action and statist policies sometimes compare the retrogression of black advancement since the Great Society to a “return to the plantation.”

Like many aspects of the Great Society’s social engineering programs, and their successors, affirmative action was well-intended. Beneficent motives, however, do not assure good results.

Even mild suggestions for black self-improvement by sympathetic figures such as Barack Obama are rebuked by ideologues as “blaming the victim” and “talking down to black people.” Riley laments that “discouraging acculturation and assimilation in the name of racial solidarity is self-defeating.” The rejection of individual responsibility and the tendency to blame “whiteness” for all racial disparities have sabotaged the upward mobility and progress that blacks enjoyed before the civil rights era. Liberals understandably don’t want to acknowledge the body of data that Riley convincingly marshals. Hard work, thrift, sobriety, respect for authority, the nuclear family, recognizing the importance of education, and deferring gratification are not manifestations of white supremacy, but essential ingredients for success.

It is truly astounding to see how condescendingly leftist intellectuals deny black people any moral agency and encourage them to wallow in grievance and victimhood. Riley observes that “black politicians and activists have a vested interest in a narrative that accentuates black suffering.” Victims require saviors, and those promising to deliver salvation are often rewarded with status, money, and influence.

The Supreme Court, left-wing scholars, and self-interested activists are not the only villains in The Affirmative Action Myth. Riley exposes the activist role of the Equal Employment Opportunity Commission in “turn[ing] Title VII on its head,” and points out that presidents from both political parties have muddied the waters by issuing executive orders mandating quotas (as Lyndon Johnson did with federal contractors in Executive Order 11246) or supporting the expansion of the Great Society welfare programs (as Richard Nixon did). There are few “heroes” in Riley’s account, although Justice Clarence Thomas—a longtime critic of affirmative action—comes close.

Riley does not deny the existence of racism in America, but he insists that it doesn’t explain the disparities within the black community (such as West Indian blacks versus African-American blacks) or the retrogression since the 1960s: “The elimination of white racism, however desirable then and now, is not a prerequisite for black socio-economic advancement.”

Riley’s sobering concluding chapter is chock-full of hard truths—and not for the faint of heart:

One reason antisocial behaviors [in black populations] became more common in the post-1960s era is because they became more tolerated and more lavishly subsidized by the government. … Low-income blacks began to adopt counterproductive attitudes and habits that previous generations had rejected and strived to eradicate. Even more tragically, academics began to intellectualize this degeneracy instead of calling it out for what it is.

Providing grisly details, Riley condemns hip-hop culture and gangsta rap: “Too many young people have come to equate self-destructive behavior with black authenticity.” Put in economic terms, “government programs are no substitute for the development of human capital.”

Sadly, despite the proven failure of the victimization narrative, and its baleful consequences so readily apparent in our inner cities, the proponents of this model “have perhaps never been more celebrated in the academy and the media than they are today.” Affirmative action has never enjoyed popular support, and it has now been declared unconstitutional and illegal. That it still holds sway in the influential spheres of academia and the media, Riley laments, “ought to be of deep concern to anyone who cares about the future of the black underclass.” Indeed.

The Affirmative Action Myth is a timely and well-written book that contains an abundance of common sense, solid arguments, and carefully researched historical data. One can only hope that it is widely read and provokes a long-overdue change in direction in the area of civil rights and race relations. Sixty years of failed policies are enough.



US intelligence worker arrested for ‘trying to leak secrets to Germany’




The Bard, the Truckers, and Prince Trudeau


In early 2022, from the safety of my office in academia’s ivory tower, I periodically tuned in to the scenes of Canadian truckers gathering outside Canada’s Parliament. I also saw the disgust with which these protestors were greeted by elites in government and the media across North America. Canadian Prime Minister Justin Trudeau, so this narrative went, was being harried by a band of white nationalist truckers. But in the end, the narrative concluded, Trudeau deftly used emergency powers and the police to suppress the conspiracy.

The political furor over the trucker protests has died down, Trudeau has left the political scene in Canada, and the passage of time has left room for reflection. As a member of the elite myself, a tenured professor, I, of course, turned to the works of William Shakespeare.

The clash between the elite and the common people that played out in Canada reminded me again and again of Shakespeare’s play Pericles, Prince of Tyre. The play not only provides political insights into this episode in Canadian politics, but also exposes some of the sources of the uniquely broken relationship between elites and the common people in today’s technologically advanced democracies. The divide between the elites and the common people has been present since the first chieftain led a raid on his stone-age neighbors. Political communities need leadership. For almost as long, this divide has been exploited by political movements from populists in the Roman Republic to international communists in the twentieth century. In the information age, however, this divide has become starker than ever. Elites can wrap themselves in social media echo chambers that flatter their pride and cut them off from almost any contact with those they lead, upon whom the elite depend for their privileged lifestyles.

Unsurprisingly, William Shakespeare was no stranger to this class divide, and it is a constant theme in his plays. Perhaps ironically, through a play set in the pagan Greek world, Shakespeare reminds us of the Christian perspective on political authority, one based in servant leadership that today’s elites need to recapture. Of course, that task is made more difficult by ubiquitous social media and its distortion of reality. But perhaps the Bard’s enduring wisdom can show us how to achieve it in our own time.

Pericles, Prince of Tyre is one of Shakespeare’s least known plays. Rarely performed or discussed today, it was, however, his most popular play in his lifetime. Something in this play resonated with the people of England. It was an exciting adventure story, of course, but audiences also saw in it some reflections of their common experience. And even now, centuries after it was first staged, the kind of humble virtues Shakespeare extols in the play are necessary for good government in a free society.

The Canadian truckers dared to remind Parliament that their positions were granted by the will of the people, and the people set limits on their powers.

In the play, Shakespeare’s protagonist, the good Prince Pericles, is chased into exile by a powerful, evil king who wants the prince’s head. During his escape, Pericles is shipwrecked and washes ashore in a strange land. Though a prince, he is now without possessions, bereft even of the shirt on his back. As he wanders the desolate shore in search of aid, he comes upon a group of fishermen mending their nets and asks for their assistance. They offer to share their fire and food. Many of those who visited with the protesters outside of the Canadian Parliament reported a similar hospitality on display, as they welcomed those who joined them to ask questions and shared warm refreshments in the midst of a frigid Ottawa winter. This, of course, belied the picture painted by the media of the truckers and their supporters as a band of violent extremists.

The fishermen provide this same simple hospitality to the incognito Prince Pericles. As the small group converses by the fire, a nearby fisherman exclaims that he’s hauling in a huge catch that threatens to burst his nets, and the interlocutors rush to his aid. Instead of a great catch of fish, however, the net has plucked Pericles’s armor from the depths. As this unlikely story proceeds, Pericles dons the armor and uses it in a jousting tournament to earn the hand in marriage of a noble maiden. This is followed by the birth of their royal daughter, her near-death at sea, her kidnapping by pirates, her life in captivity, and her final tearful reunion with her mother and father. All’s well that ends well.

From my first reading of the play, I was captivated by the image of the fishermen catching Pericles’s armor. In many ways, this scene illustrates some permanent truths about politics, truths with which our political elites must become reacquainted. By virtue of their authority, leaders stand over ordinary people within the political community. This hierarchy, however, obscures the reality that those leaders cannot stand without the support of ordinary people. It is only through the labor of the common people that rulers and the political community are sustained with the necessities of life.

This is one of the truths that communism exploited so successfully in the twentieth century, when its ideologues promised a world in which the worker would no longer be exploited by the elites. Instead, communist revolutionaries used the popular power of this truth and the propaganda of equality to establish regimes dominated by an even narrower, more brutal, and unaccountable elite. The horrors of the resulting political systems, which fell disproportionately on ordinary people, are unmatched in modern history. Despite the nefarious purposes to which the communists have put this truth, it remains true. As the naked Pericles discovers on the seashore, the few elites cannot survive without the labor of the many.

On a spiritual level, Shakespeare’s audience watching this scene—a collection of both the elite and the common people—would have caught the reference to the first disciples, mending their nets by the Sea of Galilee when Jesus calls them to follow Him. And as the excited fisherman hauled up his weighty catch, the audience would have caught the reference to Peter incredulously “putt[ing] out into the deep” at Christ’s behest and the miraculous catch, by which Peter knew he was in the presence of the Son of God. That presence called him to humility and contrition: “Depart from me, for I am a sinful man, Oh Lord.” Jesus replies with that most common of all divine preambles: “Do not be afraid.” In a play on these biblical images, Shakespeare’s fishermen catch Pericles’s armor and set him on the path to the redemption of his honor and elite station. The prince’s return to power would have been impossible without the hospitality and honest labor of these simple fishermen; here, Shakespeare brings to our minds the model of servant leadership based in humility and a recognition of our common dependence on one another.

In many ways, the truckers who protested against Covid-era restrictions are much like Shakespeare’s humble fishermen—through their labors, these workers make all of our endeavors possible. Since the trucker protests, when I travel the interstates of the US, commuting to campus to teach and research political science, I’ve started to notice trucks and truckers more. Is there a class of laborers that is more overlooked and more essential than truckers? As we, the laptop class, the elite of our communities, rush to work, we’re annoyed by the delays they cause us. Yet how often do we stop to think that every piece of food we pluck from the grocery store shelf or piece of clothing we pluck from the rack was delivered by those same people? Without these bleary-eyed blue-collar workers, the shelves at the grocery store, and our stomachs, would be empty. Yet we overlook them so easily and see them as an inconvenience. Covid made this divide even starker, as the elites isolated themselves in the comfort of their homes and worked remotely, while those who live from paycheck to paycheck continued to work to provide the goods that made our splendid isolation possible.

All of Canada, and the world, stood up and took notice, however, when the convoy of truckers began its ponderous journey to Ottawa, to the seat of Canadian government, where Justin Trudeau, author of the vaccine mandate, sat secure in his political echo chamber. As the convoy crossed the country in the dead of winter, Canadians came out to the highway by the thousands to stand in the bitter cold and cheer the truckers on. The fact that so many came out in weather that would freeze exposed skin in a few minutes was a testament to their enthusiasm for the truckers’ cause. No doubt it galled Trudeau to see the opponents of his mandate mobilizing Canadians to an extent he could only dream of.

I was shocked and pleasantly surprised by the spirit that the truckers inspired in ordinary Canadians. In fact, as I watched the Canadian lockdowns and vaccine mandates broaden and deepen from my home in Texas, I began to suspect that, against all odds, Canadians still had the instincts of a free people. Historically, Canadians are a proud people. They are a people who sent thousands of their young men across the Atlantic to fight in both world wars, years before American boys would make the same journey. Their fighting prowess in both those wars made them a boon to their friends and a scourge to their enemies. They were, as one author expressed it, the “shock army of the British Empire.” In fact, many consider World War I the beginning of the Canadian nation, when a Canadian army that fought as one showed the world that it was one people. It was this fighting spirit that I thought had been blunted by decades of peace and ease. Then Canada’s truckers showed me that even the quiet, polite Canadians could only be pushed so far before fighting back.

We isolated elites are much less likely to see our errors, learn from them, and grow in humility if we don’t seek out the voices of those with whom we disagree.

When the truckers surrounded the Parliament buildings and began their protest, the Prime Minister refused to meet with them. The truckers had dared to break into the bubble that the elites had made for themselves. They dared to remind those elites that their positions were granted by the will of the people, and that the people set limits on their powers. The lockdowns had thickened the walls of the echo chambers that elites inhabit on social media in the information age, as algorithms flatter them into self-satisfaction and isolate them from criticism.

In a scene between Pericles and the lords of his court, Shakespeare provides an image of the kind of counsel that elites should seek. All of Pericles’s court wish him peace and comfort, save Lord Helicanus, who is determined to speak his mind to the Prince:

They do abuse the king that flatter him,
For flattery is the bellows blows up sin;
The thing the which is flatter’d but a spark,
To which that blast gives heat and stronger glowing;
Whereas reproof, obedient and in order,
Fits kings, as they are men, for they may err.

Helicanus concludes that the one who flatters “makes war upon your life.”

After the counselor finishes his invective against the flattery of the other lords, Pericles sends away most of the lordly companions, but bids Helicanus stay behind; the other lords no doubt presume the prince will dress down the insubordinate Helicanus in private. Helicanus thinks as much too when Pericles opens their private conversation by echoing the infamous words of Pontius Pilate: “Thou know’st I have power to take thy life from thee.” The brave Helicanus replies: “I have ground the axe myself; do but you strike the blow.” Seeing that the noble’s loyalty is as firm as his honesty, Pericles bids him:

Rise, prithee, rise;
Sit down, thou art no flatterer;
I thank thee for’t; and heaven forbid
That kings should let their ears hear their faults hid!
Fit counsellor and servant for a prince,
Who by thy wisdom makes a prince thy servant,
What would’st thou have me do?

The algorithms that curate our social media feeds are bottomless fonts of flattery. The elite class can draw all their information about the outside world and their standing in it from these whitewashed platforms. The continuous stream of affirmation can make them incredulous toward those who disagree with their decisions, labeling such dissenters radicals on the fringe of society, as Trudeau and his inner circle did with the truckers. We isolated elites are much less likely to see our errors, learn from them, and grow in humility if we don’t seek out the voices of those with whom we disagree.

Unfortunately, to borrow an analogy from the pandemic, today, with the help of social media, our political elites and their supporters have immunized themselves against those lessons. One way to break this cycle of flattery is a return to the wisdom of writers like Shakespeare, who provide timeless truths that can humble our egos and provide a north star to help us navigate even the treacherous currents of the information age.



Atonement in Hollywood and the Old West


After almost five years of on-and-off production and an extensive trial by tabloid, Alec Baldwin’s infamous Western epic Rust finally premiered this month. It’s the story of a tragic death, where a faulty gun kills an innocent bystander, setting off a saga of sin, grief, and atonement as the culprit—the murderer?—plays out his shoddy hand in a high-profile clash with the law. This isn’t just the plot of the film, however, but the real-world tragedy that happened behind the scenes. 

If you know one thing about this film, it’s that cinematographer Halyna Hutchins—a wife, a mother, and by all accounts a dedicated and earnest artist—was shot and killed when Baldwin’s prop revolver misfired on October 21, 2021. That’s about the most definitive thing one can say about the events of that day: Baldwin’s involuntary manslaughter charges were quickly dropped after a late filing in 2023; a subsequent grand jury indictment was dismissed with prejudice in 2024; and while Hutchins’ family settled an (undisclosed) wrongful death claim in 2022, civil cases between the family, Baldwin, and other members of the crew who interacted with the gun that day remain ongoing. In a case where intent is obviously absent and where all parties are reluctant to either cast or accept blame, the full truth of legal culpability, if it exists, will likely remain a mystery. 

But that doesn’t mean we can’t speak confidently on morality. 

Rust stands firmly in the tradition of the great American Western, a genre in which the actions of morally grey antiheroes are ultimately measured against a firm moral code. From The Searchers to Unforgiven, every iconic hero, vigilante, or outlaw teaches us that no matter your intent, killing leaves an indelible mark on a man. Yet it’s often a brutal world or an unjust law that forces him to reconcile his dark nature with his good conscience in the first place. As Baldwin’s own tale plays out in an undignified opera in the press, it’s poetically tragic that this film should offer a path of conscience, so that all parties involved may have a chance at peace.

Rust takes place in the waning days of the Wild West. It’s 1882, and the law has finally come to the burgeoning frontier town of Hayesville, Wyoming, where thirteen-year-old Lucas (Patrick McDermott) is raising his younger brother Jacob alone after their mother succumbed to illness and their father killed himself from grief. Still grieving himself, Lucas is nevertheless saddled with adult responsibilities—tending the farm, serving as both caregiver and provider—but when another boy bullies his brother, his willful youth wins out. He beats the boy to a pulp and makes an enemy of his victim’s surly father.

Since the father is now out a farmhand, he plans to take Lucas back to his ranch to work off the debt, indifferent to what that means for Jacob. “I’ll come collect you in the morning,” he says, though Lucas has no intention of going. So when Lucas takes his faulty gun and crests a steep hill the next morning to fire a shot at a wolf seen stalking their livestock, we’re left wondering whether he missed or hit his target. The beaten boy’s father drops from his horse. 

The law quickly and callously finds Lucas guilty of premeditated murder, sentencing him to hang despite his “age and circumstances.” He believes it’s morally abhorrent to flee and leave his brother behind, but he reluctantly runs when his estranged grandfather, legendary outlaw Harland Rust (Baldwin), comes to spring him from the slammer. The story unfolds as the reluctant duo rebuild a relationship from the ashes of their shattered family, outrunning a villainous bounty hunter and a brooding US Marshal on their way to freedom in Mexico.

Hutchins’ cinematography carries much of the film, capturing the desaturated landscape of a dying Western frontier and the shadowy silhouettes of the even shadier characters within it. Even so, Rust is not a stand-out Western, mostly due to lazy writing, as character dynamics go largely unexplored. 

From this point forward, there will be spoilers. 

Why is Rust so feared and revered in the West? We learn that he burnt down the banks that took his farm and that he abandoned his daughter (Lucas’ mother) after the rest of his family died. But beyond a vague sense of regret, we’re told very little about the moral transformation that led him back to Lucas. Why frame the story around a suspensefully open-ended murder/accident only to abandon that storyline, never folding it back into the plot? We’re told to presume it was an accident, but we never really find out whether Lucas is a murderer or a victim of a cold, utilitarian system. With a schmaltzy happy ending that sees Lucas and Jacob reunited, we’re not even expected to thoughtfully speculate. At the end, Rust, who is an admitted murderer, gives himself up to the marshal on condition that Lucas be permitted to cross the border. But why is a supposed man of the law so easily convinced to let the boy walk free? He’s gruff and talks in morally ambivalent platitudes—so we’re just expected to believe he’s painfully disillusioned with his role as a faceless enforcer of heartless laws. 

The world today is just as cruel as it was in the West. The brutality of the frontier is but a metaphorical complement to human nature, a timeless fixture of the world anywhere humans inhabit it.

Rust is filled with so many nods to cowboy classics, however, that it’s clear what the filmmakers were trying to do, even if the execution was less than stellar. Baldwin, who co-wrote the script, said he found inspiration in Clint Eastwood’s Oscar-winning Unforgiven, the story of a retired outlaw trying to live a peaceful life before his conscience, violent proclivities, and a merciless world pull him back to his old ways. We see implicit nods to The Searchers, not just in the vengeance-turned-redemption arc of an ex-Confederate trying to save his niece from her Comanche kidnappers, but in the iconic cinematography where darkened interiors open to vibrant outdoor backdrops, drawing a stark contrast between wilderness and civilization, and our own dark natures lurking subtly within the frame. Naturally, we also see echoes of Shane, in which the most classic line of the genre captures an ethos that has defined nearly every Western in the 75 years since: “There’s no living with a killing. There’s no going back from one. Right or wrong, it’s a brand. A brand sticks.” In the vein of these films, it’s possible to fill in the blanks.

At the end of the day, Rust and Lucas are both killers—directly or indirectly, “right or wrong”—grappling with the pain they’ve both received and inflicted in an unforgiving world. Rust knows it, he accepts it, and, to his shame, knows nothing can change it. He craves forgiveness, but doesn’t believe he deserves it; he doesn’t even know of whom he might ask it. The only absolution a man like Rust can have is to sacrifice himself to the hangman so that Lucas may have a chance to lead a good life. None of this depth is clearly articulated, but two solid performances from Baldwin and McDermott in Hutchins’ moodily lit fireside chat scenes are enough to bring the sentiment through.

The other great theme of the Western genre—the tension between unjust laws and true justice in a brutal world—comes through in Lucas. The boy can’t run from whatever feelings of shame and culpability may materialize as they eventually did for his grandfather. He may have escaped the noose, but he is still a killer; he will have to live with it, alienated from his home as an outlaw in Mexico. Though Lucas can’t run from his conscience forever, he can and must run from the law. And he simply doesn’t have the luxury to process his shame in real time. 

In an earlier time in the West, his case would likely have been handled differently. We hear fond talk of these times throughout the film: how earlier generations lived more ruggedly, how they necessarily had to impose a stricter code on themselves, and thus personally uphold it against others—a rudimentary, but in many ways more noble, form of civilization. “The only order that exists in this world is the order we impose,” the marshal says early on of his role as a lawman, not yet realizing that his eventual compassion toward Lucas is the truer hallmark of man-made order in a system of far-removed laws.

In Rust’s generation, perhaps this feud would have been carried on by Lucas and his victim’s son, fading into frontier lore. Perhaps a local elder would have seen the folly of an eye for an eye and told them to squash it. Or perhaps, with no authority to appeal to on the unsettled plains, it would simply have been forgotten as the way of the world. But for better or worse, with the railroad comes the cold, exacting, and impersonal force of the law. Is it right, given the circumstances, that the boy should hang? Or does the law, failing to grasp the ways of the world and people it governs, only perpetuate injustice? 

The film seems to suggest the latter, a nostalgia for a moral code superior to pure law. It’s this code which allows man to fully unchain his conscience from his desperate circumstances, as Rust apparently discovered as an outlaw. But it’s this theme that goes sadly unexplored for Lucas. Free in Mexico, he ought to reflect on his sins, above all on the boy he himself left without a father; no one will ever really have a happy ending.

It’s the hallmark of any great Western to show that while the law may enact justice by its own standard, it cannot induce feelings of shame, it cannot offer absolution, and it most certainly cannot offer the most deeply personal interaction of all: forgiveness. These are what truly define a man’s life, and a good law conforms to a man’s own moral impulses here. But when the standards of the law diverge from the nature to which they should conform, a man has no choice but to flee, even when his conscience bristles.

You can say the charges against Baldwin were dropped because he’s Hollywood royalty, and a liberal elite in good standing with America’s cultural arbiters. Or, you can say that’s the only reason the charges were brought in the first place. It doesn’t matter—Baldwin is a killer, and nothing can change that. But has this saga of never-ending court cases, where Baldwin has degraded himself on a press tour of deflection, frenzied self-defense, and shameless victim posturing, helped anyone find justice, let alone peace? 

The normal human reaction to killing, even among the outlaws of the West, is to feel a deep sense of shame, to grieve those whom you’ve harmed along with the piece of the soul you’ve lost, and to seek atonement and forgiveness. One ought to give Baldwin the grace, as a human being, to assume that he does feel this way. But in his own way, he was forced to flee the law in just as undignified a manner as Lucas. 

In our litigious world, where the law is often far from impartial, and the media hunts for every misstep publicly and in real time, Baldwin, too, lacked the luxury of speaking with the dignified shame of a human being who took the life of another. Earnest grief is an implication, shame a confession—anything not carefully filtered through lawyers and PR consultants is a potentially life-ending liability. After all, Baldwin, too, has a family to care for. 

That’s not to defend what he did. But despite legal technicalities, there is no punishment the law can provide that would remedy this tragic accident. Can we accept and atone for our gravest sins? Can we recognize the atonement of others and forgive them for the sins inflicted upon us? These are the only questions that matter, not a hair’s-width difference in finger placement on the trigger of a gun that should never have been loaded in the first place.

In many ways, the world today is just as cruel as it was in the West. After all, the brutality of the frontier is but a metaphorical complement to human nature, a timeless fixture of the world anywhere humans inhabit it. There can be no happy ending for anyone involved: Baldwin will never get a fair shake in the public eye, and the Hutchins family certainly can never be made whole, no matter the legal or civil outcome. The only path forward is to accept that sin is sin, no matter the intent, and that it is a perpetually sinful world we live in. It’s a deeply personal journey of reflection that should not and cannot play out in public. But it’s the only way anyone has a chance at ever finding peace.



What’s Behind the “Woke Right”?


Is there a right-wing version of “woke”?

Earlier this week, Rod Dreher used the term “woke right” in a worthy essay for the Free Press warning against the emergence of a dangerously radical trend that inclines right-leaning young people toward overt racism and antisemitism.

Almost immediately, however, Dreher repented of his use of the term. He had not realized that it was also used by people like James Lindsay, Konstantin Kisin, and others to criticize his own political friends. He quickly clarified that all he meant to include in the category were “the new racialists and anti-Semites of the right, especially the very online right, who I regard as malicious would-be totalitarians.” Not National Conservatives, postliberals, or Matt Walsh. And definitely not J.D. Vance!

In Search of Woke

It’s certainly fine for Dreher to clarify his meaning, but it’s hard not to chuckle at the fact that he redefined the term simply to match his own prior assessment of who does and doesn’t seem to be a racist and a “malicious would-be totalitarian.”

The affair reveals a problem with the “woke right” discourse generally. Plenty of people seem to think there’s something there worth examining (even if they use different terminology), but most are just looking to find a new name to call a group of people they already dislike without having to undertake any broader assessment. While some, like Christian apologist Neil Shenvi, do seem genuinely interested in understanding a phenomenon, far more are merely looking to wield an epithet. Instead of asking what is going on, they ask only: who deserves my indignation?

If we focus on the phenomenon itself, we might find that the lines are less clear than Dreher suggests. “Woke” is a term that captures vague sentiments better than concrete meanings. It is also a phenomenon with several elements, which can then be organized in a number of different ways. And people may share certain qualities and tendencies, but not others. Dreher wants to boil it down to an essential characteristic that makes it easier to point a finger at some people without having to examine others that he’s already committed to. Others, like Lindsay perhaps, define it more broadly because they do want to point their finger at Dreher’s friends. For Dreher, the essence of woke is “racism” (possibly combined with a subjective assessment of malice). Others, though, could say it’s the use of critical methods. Or an oppressor/oppressed framework. Or its bullying shutdown tactics, like cancel culture. Which is the defining woke value?

All of that is to say that there are significant problems with pretending “woke right” is an unambiguous category. There is, however, an interesting angle to the discourse that makes it worth pursuing. When applied to the left, “woke” has always evoked something beyond a set of political preferences, aims, or tactics. At its peak, at least, wokeness seemed to cut to the very being of its adherents. It seemed to define entirely their sense of self and their orientation to the social, ethical, and even spiritual order of the world. Thus, it has often appropriately been seen as having a quasi-religious character. The year 2020 saw public ceremonies with liturgies, confessions, creedal statements, etc. There was ecstatic emotionalism; a rigid and sometimes violent demand for conformity; a puritanical moralism that often accompanied a lack of basic personal morality and decency. Wokeness was an all-consuming Cause that gave people a sense of meaning that they had not found elsewhere in the world around them, which they believed to be evil beyond redemption.

Is there a similar tendency in pockets of the right? To treat their Cause not just as a worthy endeavor, but as existential? To believe that only in their Cause can they find personal meaning and, indeed, salvation? That was, according to the great political theorist Eric Voegelin, a key quality of modern ideology. And Voegelin may help enlighten us on the qualities of left-wing wokeness and the right-wing counterpart that so many have been trying to define.

Escape from the World

Somewhat controversially, Voegelin framed his critique of ideology in terms of its conceptual and historical linkage with the ancient heresy of Gnosticism. Regardless of how helpful the historical “gnostic” connection itself is, Voegelin’s use of it undoubtedly provides a powerful framework for understanding radical ideology—and it’s one that seems to have the “woke” phenomenon pegged.

Voegelin’s concept of gnostic political ideology is mostly associated with the utopian or semi-utopian end at which it aims—the attempt to “immanentize the eschaton,” in his memorable phrase.

But equally important—and particularly intriguing with wokeness in mind—is the way these ideologues confront the world in which they currently live and operate. All gnostic movements, he argued, start with a radical alienation from the actual order of the world as experienced, generally resulting from a sudden collapse of cultural and institutional authority. For the gnostic man, “the world has become a prison from which he wants to escape.”

This means the ideologue’s dissatisfaction is rooted in a belief in the complete “wickedness of the world” as it is experienced, rather than any flaw in himself or human nature. In contemporary terminology, he understands the world in terms of a “hegemonic” power that entirely imprisons the thoughts and actions of all except the few enlightened. It is thus something that must be entirely overcome and replaced, rather than something to be mitigated, sidelined, or worked through. In the modern context of political gnosticism, the structure of social order is so overpowering that only a revolutionary (or counterrevolutionary?) movement can break through its spell.

This (dis)orientation of the human being to his moral context seems to capture the woke political phenomenon, which takes extreme forms of critical theory as its conceptual starting point. Racism, for instance, is redefined from being a personal fault to a “systemic” one that necessarily infuses all aspects of common life, even if no one is actually doing or saying directly racist things. In that sense, it is commanding—it infuses our reality so fully that even those who do not want to be racist and do not think themselves to be racist actually are. No one is capable of resisting its evil influence except those who have the “secret knowledge” (gnosis) and work to destroy it in favor of a different, equally closed system of values.

Many “alt-,” “dissident,” or “new” right thinkers also tend to speak in such terms. They often present the various maladies of the modern world not as discrete pathologies flowing from the flawed nature of human beings, but as closed, commanding systems—structures that trap modern men and warp their minds in unperceived ways.

The source of cultural renewal is precisely in this freedom of the human being to encounter, reflect on, and respond to his circumstances.

Consider the concepts and metaphors that are often deployed in online discourse. The “red” and “blue” pills of The Matrix offer the choice to continue to live in the false, constructed reality or to gain secret knowledge that reveals true reality; references to “the regime” (relying on a tortured reading of Aristotle) suggest that all societies are governed by a coherent and closed system of values backed by power, and that political conflict consists in titanic clashes over which closed system we shall adopt. Alt-right celebrity Curtis Yarvin speaks of “the cathedral,” and masculinity advocates use the “longhouse”—both architectural metaphors that suggest the horizon is utterly closed around us, except for those few who have escaped; everyone is unknowingly dominated by the system.

Most of these metaphors revolve around the idea, also prevalent in more mainstream discourse, that “liberalism” (broadly construed to include classical and progressive forms) is a kind of essence that infuses our reality rather than a pattern of behavior or a set of concepts that—for both good and ill—have emerged in response to the Western experience. Thus, in the balance of partisan conflict hangs not only offices and policies, but our very state of being. If the hegemonic forces are not defeated and replaced, we are cut off from living the good life.

It seems to me that this belief—that the world as we experience it is utterly alien, all-powerful, and destructive of our ability to live a good life—does represent a kind of perverse common ground between some pockets of today’s right and the woke. Precisely how that vision manifests itself when it comes to race, “oppressor” categories, or particular political tactics is a less essential question. Both translate politics into a religious endeavor, with salvation depending on collective human action. It is not hard to see how such beliefs lead to rampant conspiracy-mongering, the impulse to rigorously police words and deeds, and the belief that political action ought to aim at a comprehensive control of private life and civil society, either by expanding government or by collective pressure campaigns. It also combines a rigid moralism when it comes to demands on others with a complete lack of self-reflection or self-restraint. Everything is permitted for me, since my Cause is just.

Whether or not such tendencies on the right amount to a “woke” right seems to me a trivial question of labels. But the mirror ideological approach is important and troubling.

The Freedom to Respond

This way of understanding the world has at its core a denial of human freedom—not freedom in a moralistic sense that demands everyone must be free to think or do this, that, or the other. Rather, the freedom being denied is the capacity human beings always have within them to respond to the world in which they operate in light of truths and traditions that may be widely denied. The immanent order in which we live—including social and political life—certainly has a powerful influence on the way we understand the world. We are historical beings who understand ourselves by engaging and reacting to the symbols of our time and place. But does the social and political framework around us entirely bind our moral and intellectual horizon?

Voegelin did not think so:

The spiritual disorder of our time, the civilizational crisis of which everyone so readily speaks, does not by any means have to be borne as an inevitable fate; […] on the contrary, everyone possesses the means of overcoming it in his own life. … No one is obliged to take part in the spiritual crisis of a society; on the contrary, everyone is obliged to avoid this folly and live his life in order. (emphasis added)

The source of cultural renewal is in this freedom of the human being to encounter, reflect on, and respond to his circumstances, drawing on alternative cultural resources that can never be fully abolished. It is precisely such a response that indirectly produces new cultural symbols.

Nothing can ultimately separate us from our ability to live in truth, and to critique and unmask the errors of the present age.

The passage quoted above also speaks to a potential counterargument. Since the right-wing figures under consideration are precisely pointing to some version of modernity and modern ideologies as the hegemonic power to be unmasked, it might be argued, they must be exempted from falling into Voegelin’s category. Left-wing wokeness, progressivism, and critical theory are indeed dangerous and powerful forces in the world. So if these are seen as the cause of the disorder, then the right is only “woke” to a very real and present problem—the same one Voegelin himself diagnosed.

There may be something to this: many of the right-wing figures under consideration are at least somewhat aware of the destruction that left-wing ideologies have unleashed on modern society. But that recognition does not prevent them from falling into the same patterns of thought. By treating these ideological forces as hegemonic and commanding, the right-wing ideologue signals his agreement about the basic, servile condition of the human person and about the power that the revolutionary thinks he possesses. On this view, cultures are built not by tradition or reflection, but by the will to power.

Voegelin was very clear that the ideologue, no matter how successful his Cause may be, does not actually have the ability to accomplish the full transformation of human life he seeks.

The gnostic revolution has for its purpose a change in the nature of man and the establishment of a transfigured society. Since this program cannot be carried out in historical reality, gnostic revolutionaries must inevitably institutionalize their partial or total success in the existential struggle by a compromise with reality; and whatever emerges from this compromise—it will not be the transfigured world envisaged by gnostic symbolism.

Those who are attuned to the errors and danger of left-wing ideology must contend with an essential question: what exactly is the consequence of such ideologies? Did they truly succeed? Are we now morally and spiritually trapped in a world created by them? Are we “blue-pilled” or living in a “longhouse”? If so, then we must acknowledge that their advocates were correct in their assessment of the human condition—that the will to power is indeed the guiding principle of our existence. A reasonable response would be for the enlightened few to learn from and mimic the revolutionary’s course of action.

Alternatively, are we living in an order that has been disturbed, confused, and partially transformed thanks to the destructive but ultimately futile efforts of the ideologues? In this case, we should be reassured that—whatever the revolutionary may think—nothing can ultimately separate us from our ability to live in truth, and to critique and unmask the errors of the present age. The most dangerous roadblock to cultural renewal, then, would be to accept the false premises about the human condition that ideologues like Marx, Gramsci, or Evola preached. Ultimately, their way of thinking about human beings is far more destructive of a healthy culture than their political advocacy.

Certainly, not everyone on the new right falls into such an extreme “gnostic” attitude, but some of its more “dissident” figures do. And certain terms, assumptions, and rhetorical devices that reflect this basic orientation often make their way into mainstream right-of-center discourse. Coming up with a list of who qualifies as “woke right” seems less important than recognizing the ideological tendencies. In interesting and disturbed times like these, labels and partisan coalitions aren’t as important as clear thinking and self-reflection.

