Religious Freedom Under the First Amendment: Three Supreme Court Cases and the Ambiguous Term “Religion”

Throughout the years, and in various Supreme Court cases, the distinction between “religious/sectarian” and “nonreligious/secular” has been rather ambiguous. In this essay, I will examine three separate Court cases in which the Court had to defend its verdict by employing what I deem “ambiguous” uses of the term “religion.” Moreover, I will argue that “religion” as a phenomenon is virtually impossible to define in any concrete, rigid manner. Given this reality, the Court’s decisions, when attempting to demarcate the line between that which is religious and that which is nonreligious, will always remain blurry. Hence, it is my position that ambiguity will remain ever present in the Court’s decisions so long as it continues to deal with that ambiguous phenomenon[1] known as “religion.”

Before examining the three cases, I will first begin by looking at the First Amendment and the surrounding historical context in which it was shaped, a context, as we shall later see, that set the trend for the Court’s various positions on “religion.”

The First Amendment was shaped in the eighteenth century, at a time when several principles were deemed essential to a peaceful, well-governed society: (1) liberty of conscience; (2) free exercise of religion; (3) religious pluralism; (4) religious equality; (5) separation of church and state; and (6) disestablishment. “While many of these terms carried multiple meanings in the later eighteenth century and several other terms were under discussion, these six principles were foundational for the American founders.”[2] The First Amendment—an amendment originally governing only Congress—was first applied to state and local governments via the Fourteenth Amendment’s due process clause in the pioneering case of Cantwell v. Connecticut (1940).[3] Its religion clauses are brief: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.”[4] The Founding Fathers initially feared a particular religious institution so close to the State that it would use the State to persecute any dissenting voices.[5] However, in their attempts to articulate a form of government that allowed religion to flourish, the Fathers left one fatal void: they failed to define “religion.” What constitutes a religion? Witte writes, “Nowhere is the word ‘religion’ defined in the Constitution or Bill of Rights…”[6] In fact, if “original intent” is observed, it becomes relatively clear that by “religion” the Fathers meant “a plurality of Protestant Christian faiths.”[7] That is, they probably did not mean to defend the religious freedom of Muslims, Buddhists, Hindus, or even Catholics. Nonetheless, a few scattered remarks from the period do survive which help us understand what “religion” was thought to be. In 1802, Thomas Jefferson wrote, “[R]eligion is a matter which lies solely between a man and his God.”[8] Here “religion” was thought to be (a) a private affair and (b) something involving a person and some deity.
On June 26, 1788, during the Virginia convention on the Constitution, the delegates wrote revealingly: “Religion, or the duty which we owe our creator, and the manner of discharging it…”[9] Here, again, “religion” was thought to be something between a person and his or her deity/Creator. What is alarming in these two remarks is the lack of precise terminology. As we shall see, the modern-day Court—from the 1940s onwards—has continued to wrestle with its definition of “religion,” having inherited this ambiguous legacy. I now turn my attention to three modern-day cases in which the demarcation between religious and nonreligious has continued, as in the past, along ambiguous lines.

In Frazee v Illinois (1989), the Court decided a case involving one William Frazee, who refused to accept a retail position that required work on Sundays. He claimed that, as a Christian, it was unlawful for him to work on the Lord’s Day. Frazee later applied for unemployment benefits and was denied. The Department of Employment Security’s Board of Review justified its refusal to grant unemployment benefits to Frazee by stating: “When a refusal of work is based on religious convictions, the refusal must be based upon some tenets or dogma accepted by the individual of some church, sect, or denomination, and such a refusal based solely on an individual’s personal belief is personal and noncompelling and does not render the work unsuitable” [489 U.S. 829, 831] App. 18-19.[10] In a rather fortunate series of events, the Supreme Court took up the case and overturned the decisions of the lower courts. Justice White, writing for the majority, stated:

“While membership in a sect would simplify the problem of identifying sincerely held beliefs, the notion that one must be responding to the commands of a particular religious organization to claim the protection of the Free Exercise Clause is rejected. The sincerity or religious nature of appellant’s belief was not questioned by the courts below and was conceded by the State, which offered no justification for the burden that the denial of benefits placed on appellant’s right to exercise his religion.”[11]

Essentially, the Court said that while it may be true that Frazee was not a part of any church or sect—for all they knew, he might have stayed home on Sundays only to watch Oprah and eat Bon-Bons—nonetheless, it was not the State’s job to verify the sincerity of religious beliefs, or, for that matter, to act as an arbiter in religious affairs. Put simply: if a person stated they were Christian, it was beyond the State’s power to attempt to prove or disprove the sincerity of those beliefs. The State was not a religious organization, and so could not pass judgment on the sincerity of any deeply held—or, for that matter, deeply faked—religious beliefs.

In the above case we see, once again, a continuation of ambiguity when it comes to the subject of religion. Mr. Frazee was not a part of any church or religious organization. And yet the Court overturned an earlier denial of unemployment benefits on the grounds that Sunday work, for Frazee, was an impermissible burden on his professedly religious conscience. Justice White himself acknowledged the difficulty of demarcating the religious from the secular: “Nor do we underestimate the difficulty of distinguishing between religious and secular convictions and in determining whether a professed belief is sincerely held.” Could one create a religion out of thin air, claim a free exercise violation, and win? In the post-Frazee world, it seems so. For here—as much as ever—the term “religion” is not clearly demarcated from the secular/nonreligious. If staying home and watching football on Sundays is, at some future point, considered a “religious act,” who would blame the Court for not knowing what to do? Nobody seems to know what religion/religious is to begin with. Next, I will look at yet another pesky issue: just how cozy could the secular State get with religious holiday displays?

Lynch v Donnelly (1984) was decided after the groundbreaking Lemon v Kurtzman (1971), the case in which the three-pronged “Lemon test,” a test used to determine whether a law has the effect of establishing a religion, was first formulated.[12] In the case we are now considering—namely, Lynch v Donnelly—the city of Pawtucket, R.I. came under fire for erecting a Christmas display on private property owned by a nonprofit organization, property located directly in the center of the city’s shopping district. Among the Santa Claus house, the Christmas tree, and other such holiday objects, there was also placed a crèche, or nativity scene. The crèche was challenged as an Establishment Clause violation: the State, funded by diverse taxpayers, was using its funds to “promote” a single religion, Christianity. The case went to the Supreme Court, where the Court concluded, “Pawtucket has not violated the Establishment Clause.”[13] What were the Court’s reasons for reaching this verdict?

The Court argued that the now-famous concept of a wall of separation between church and state was a “useful metaphor” but “not an accurate description of the practical aspects of the relationship that in fact exists.” In addition, the Court argued that the Constitution did not, in fact, “require complete separation of church and state”; rather, “it affirmatively mandates accommodation…”[14] The Court also recognized how ubiquitous religion was: religion was part of “American life.” Because it was the Christmas season, and because the crèche alone was not the singular focus of the display, the Court—echoing the “Lemon test”—ruled that “the city has a secular purpose for including the crèche in its Christmas display and has not impermissibly advanced religion or created an excessive entanglement between religion and government.”

As can be seen, the Court argued that religion was already mixed into the American way of life, thus admitting that the line between religious and secular was to be found “in the mix” somewhere. In other words, there wasn’t much of a line to begin with. Nonetheless, the Court still attempted to make that line materialize. Somehow, because of the “national tradition” and our desire to “depict the origins of that Holiday,” Christmas had become a rather secular holiday, with displays serving “legitimate secular purposes.” While the Court was busy employing the terms “secular” and “religious” without defining them, it had also snuck in some Orwellian doublethink by referring to Christmas as both a “religious” and, finally, a “secular” holiday. And so the ambiguity continues.

I now want to turn my eyes to my final case. In Employment Division v Smith (1990) the Court backpedaled on the “accommodationist” logic it had used in Lynch v Donnelly. In this case, the respondents were two members of the Native American Church fired from their place of employment for using peyote on religious grounds. Once fired, they applied for unemployment benefits and were denied. The Oregon Supreme Court ruled that denying them unemployment benefits for using peyote on religious grounds violated their right to exercise religion; the State, however, maintained that possession of peyote was a crime—so the case went to the United States Supreme Court. The Court focused, citing Sherbert v Verner, on whether the employees had a “constitutional right to unemployment benefits on the part of all persons whose religious convictions are the cause of their unemployment.”[15] Smith, one of the two members, argued that he was doing nothing different from what we saw done in Frazee. That is, “[i]f Frazee could get unemployment compensation for refusing to work on Sunday, his day of rest but not worship, Smith argued, surely he could get compensation for being fired for engaging in the arduous and ancient religious ritual of peyote ingestion.”[16] The Court, however, did not agree with Smith.
On the contrary, the Court argued that this case should be treated not as a case dealing with unemployment per se but rather as a case dealing with “free exercise” and compliance with “criminal laws.” In fact, the Court argued that Oregon’s law regarding the illegal use of drugs (of which peyote was one) was “neutral” and “generally applicable”; hence, in contrast to prior cases such as Frazee, the Court now argued that it was possible for the State to place a burden upon a religious person so long as it did so by means of a generally applicable law that did not single out any particular person or religion.[17] Applying the Court’s logic in Smith to Frazee, one could argue that Frazee did not deserve unemployment compensation since he refused to work on Sundays—and “mandatory Sunday-work is required of everyone, being generally applicable to all, religious or irreligious.”[18] Such a statement, however, was never made in Frazee. Why?

Returning to the second case I looked at—that is, Lynch v Donnelly—allow me to remind you that in that case the secular and the religious were commingled. In Lynch the religious became the secular by means of “tradition.” Since what was initially religious had been around so long, it was no longer really religious; it was, in fact, perfectly secular. “Christmas is not really a religious holiday; it is mostly a secular holiday with ancient, religious roots. But most of us don’t focus on the religious element, so it’s basically secular”—or so went the argument.

But not so in Smith. Here a couple of men, who were unquestionably religious, were not allowed to exercise their religious beliefs. Like Christians partaking of the Lord’s Supper—sipping on a toxin known as alcohol[19]—the men involved in the Smith case could not exercise their beliefs. Why? Because the State deemed their use of peyote, even in what was acknowledged to be a purely religious ritual, illegal. The line between religious and secular was assumed throughout the Smith case; there was no question that the two men were participating in a religious act. However, the relationship the State had with their so-called “religious activities” was vastly different from its cozy relationship with the mostly Christian activities we saw in Frazee and Lynch. In those cases, whatever was found to be religious was either explained away as mostly secular (Lynch) or deemed impossible to verify (Frazee)—in both cases the Court allowed the religious to exercise their religious beliefs, no matter how possibly fake (Frazee) or how assimilated into the secular culture (Lynch). What we saw in Smith, however, was what appeared to be a rather concrete, underlying assumption that the Court understood what it meant for something to be “religious.” But even here the “religious” was never defined. And so, despite the dogmatic rhetoric, the Court has yet to define what it means for something to be a religious act or a religion.

In 1912, James H. Leuba published a seminal study that included an oft-cited appendix listing more than fifty definitions of religion.[20] Today, more than ever, the religious is ubiquitous—we see it in law, in politics, in science classrooms, in our libraries, in our churches, etc. As then, so now, we don’t really know what “religious” means—if anything at all. There is a multitude of definitions available to us. Some, like the Founding Fathers, may see religion as that which involves some deity/Creator. Others, like Buddhists, may argue that no such deity is required by religion. Still others may argue that no such thing as God exists. Some may think a church or synagogue plays an essential part in what it means for something to be deemed religious; others, like Frazee, argue that religious acts do not have to involve such structures. Some may argue that religion has so infiltrated our society that it is no longer possible to clearly separate the two (e.g., Lynch). Some may argue that religion is relatively straightforward, involving practices—such as the use of chemical substances—that the State could, in theory, forbid (e.g., Smith). In all of these various cases, involving a plurality of definitions, the distinction between religious and nonreligious, sectarian and secular, remains forever indistinct to our eyes as we gaze into that abyssal sea of religious discourse “through a glass, darkly” (1 Cor. 13:12). So long as the Supreme Court continues to deal with this most notoriously difficult of issues—that is, the ambiguous phenomenon we call religion—so long will we be haunted by paradoxical court cases and unclear decision-making processes.


Written by Moses Y. Mikheyev

Dedicated to John Witte, Jr.




[1] It is entirely possible to argue that my use of the term “religion” is itself misleading; it may be argued that I should instead have written the plural “religions.” However, I use the term colloquially: it encompasses each and every “religion,” whether the various religions have anything in common or not. (Even here one detects a thoroughgoing ambiguity: what, in fact, do all religions have in common? Or do we simply group various phenomena that appear to be ceremonial as “religious”? What, then, is “religion”?)

[2] John Witte, Jr. and Joel A. Nichols, Religion and the American Constitutional Experiment, 4th ed. (New York: Oxford University Press, 2016), 62-3.

[3] Ibid., 98-9.

[4] Ibid., 1.

[5] Ibid., 30-1.

[6] Ibid., 95.

[7] Ibid.

[8] Ibid., 56.

[9] Ibid., 74.

[10] Frazee v. Illinois Dept. of Employment Security, 489 U.S. 829 (1989), URL=

[11] Ibid.

[12] The three-pronged approach is as follows: “a challenged law must (1) have a secular purpose, (2) have a primary effect that neither advances nor inhibits religion, and (3) foster no excessive entanglement between church and state” (Witte and Nichols, Religion, 163).

[13] Lynch v. Donnelly, 465 U.S. 668 (1984), URL=

[14] Italics mine.

[15] Employment Division, Department of Human Resources of Oregon v. Smith, 494 U.S. 872 (1990), URL=

[16] Witte and Nichols, Religion, 146.

[17] Ibid., 146-7.

[18] The quoted words are hypothetical, in case that was not made clear.

[19] “Respondents contend that the sacramental use of small quantities of peyote in the Native American Church is comparable to the sacramental use of small quantities of alcohol in Christian religious ceremonies” (Employment Division, Department of Human Resources of Oregon v. Smith, 494 U.S. 872 [1990], URL=).

[20] Jonathan Z. Smith, “Religion, Religions, Religious,” in Critical Terms for Religious Studies, ed. Mark C. Taylor (Chicago: University of Chicago Press, 1998), 281.


Why I Am a Democratic Socialist: Capitalism and Its Exchange of Ethics for Economics

I was born in Krasnodar, Russia, just before the collapse of the Soviet Union. My parents brought me to the United States a year before the collapse actually took place. I guess you could say I escaped communism by the skin of my teeth. Growing up, I heard all the ridiculously hilarious tales my dad told me about how he ran several businesses at once, all in secret from the government, just to make enough money to live relatively comfortably. To remain free from suspicion as to how he made his money, my father worked as a city bus driver. The pay was almost as ridiculous as the things he did to survive. By day, he would drive the bus; by night, he would jack it up, attach a machine to the odometer, and make sure it spun all night! He was paid by the number of miles he drove. In reality he didn’t drive many, but his “nightly business” gave the appearance that he drove people day and night. And so his pay was decent. He could put food on the table, as long as his odometer kept increasing its digits in leaps and bounds.

My dad, the entrepreneur, also grew flowers. He grew lots of them. Every March 8th—International Women’s Day, for those who don’t know—was my dad’s version of Black Friday. In those days, even under the oppressive Soviet communist regime, people practiced romance. They’d buy flowers for their lovers with money secretly stolen from the government. In those days, virtually the entire expanse of Russia could be considered “government property.” Anything and everything belonged to “the government.” Nobody really knew who “the government” was, but they all knew it was surely not them. And in this way every theft became theft from the government.

It would surprise me to learn that “the government” did not know people were stealing from “it.” Think about it. You give people a salary of sixty rubles a month. But rent is seventy rubles. Food is another twenty. It doesn’t take a mathematician long to figure out that people were surviving in Russia against all odds. In other words, you’d be an idiot to think that people who lived through famine were not hiding food. The mere fact that you could survive in Russia on such a salary should have been an impossibility. You had to be stealing. In fact, you, and you, and you over there—all of you—had to be stealing. The communist regime produced a lot of competent thieves—for they all, en masse, became thieves. This sort of regime could not possibly last long. And—thank God!—it didn’t. On December 26, 1991, the Soviet Union became no more. It vanished—and not a tear was shed…

In 1987, Gorbachev, then General Secretary of the Communist Party of the Soviet Union, tried to implement policies that would make the Soviet Union more democratic—a process known in Russian as “demokratizatsiya.” Implementing a democratic government proved harder than he thought, and so nothing ever really came of it.

No one living in those days would ever have called Russia a “democratic” country. The people had no say in government. They could not democratically choose to be a capitalist society, for example. They could not democratically choose to be a socialist society either. The rules that governed their world, their economy, came from a group of elites. And those rules and regulations favored the elites. The common people were left to fend for themselves. This was no America: there was no working democracy in Russia.

To be honest, I’ve always been fascinated with communism, socialism, and capitalism. There was something demonically sacred about communism. It was the thing everybody here feared. And I was born in a country full of communists. I was, to be blunt, “one of them.”

I never actually cared much for politics and economics growing up. Sure, I thought communism was a spooky word, but I never really studied any of it. It wasn’t until I graduated from nursing school and began working as a nurse at a hospital that I started thinking about the way our country did healthcare. And believe me, I was scared shitless. I didn’t know much. And I still don’t know much. (How many people have read the Affordable Care Act? I mean, please, the senators didn’t give a rat’s ass about it—and didn’t bother to read it—so why would a commoner bother to read several thousand pages?) Truth be told, I still don’t know much about healthcare. Moreover, none of the doctors I’ve met, nor any of the nurses I’ve spoken with, knew much about healthcare policy. No one I asked seemed to know what Obamacare was all about. Nobody seemed to know what it all meant. Not the health professionals, not the therapists, not the hospital staff, not the senators. It made me wonder: who the hell wrote this? Well, I never did find out…

The more I worked in the hospital setting, the more I became convinced that we had become a business trying to make money. We were all being forced to cut corners, to practice shitty nursing care, just so that somebody on top could make money and get their year-end bonus. Most people outside of nursing probably don’t have a clue as to what I’m talking about. Most people who aren’t doctors probably don’t know what I mean when I say: we have to do a million things at once. And that makes for unsafe nursing practice. In fact, in Washington State we have something called “ADO forms” (“assignment despite objection” forms) that we can fill out if we feel the hospital is not staffing us safely. Basically, filing one means: I accept this patient load, but in my professional judgment I cannot safely provide the quality of nursing care I could provide with a less acute assignment. In other words, the patients are too acute and I’m going to be swamped. And if I’m swamped, expect errors to be made. The hospital, then, would allegedly assume liability for any errors committed on that nurse’s shift. Such is a day in the life of a nurse.

I would like to make it clear that none of what I am saying here is controversial. I am not aware of a single hospital that staffs so safely that its nurses are satisfied. If you don’t believe that this is an ongoing issue, start reading a little about California state law regarding nurse-to-patient ratios. For different units, depending on acuity, nurses are assigned different patient loads. If you are a nurse working in the ICU, the law states that there shall be one nurse for every two patients. If you work on the telemetry unit, as of 2008, the nurse-to-patient ratio is 1:4. Those working on the medical-surgical floor are staffed 1:5. This is what the nurses, the healthcare professionals, and, eventually, state law decided was best when it came to staffing at hospitals. A law had to be made because staffing had become an issue. In the majority of states in the United States, there are no state-enforced ratios. The hospitals can literally do whatever the hell they want. If the CEO and the board of directors decide to cut staffing, well, there’s really not much the nurses and doctors can do about that. When was the last time you heard anyone making a fuss about safe staffing at hospitals? You’ve probably heard more about that new stadium the college kids want. Too bad our hospitals are falling apart. So long as we have that stadium built, we’ll all be merry. And so something as critical as healthcare—good, quality healthcare—is left on the sidelines, waiting for some kind of Jesus or Good Samaritan to come around and resurrect it back to life.

I apologize for boring my readers with nursing gibberish about something as essential to life as healthcare: I assure you, I mean well. It was only once I became an angry nurse—one who wanted to do something about the healthcare we provided—that I immersed myself in the Washington State Nurses Association, my state’s formal nursing union. You see, all of my “socialist” activities since have had their initial birth right there in a hospital setting. The rest of my story is history.

I was angry with hospitals. I was angry with staffing. I was angry that a young, athletic and fast guy like me could not keep up with the system. I could not keep up with charting. I could not keep up with dressing changes. I could not keep up with providing assistance with my patients’ activities of daily living. And certainly I could not provide a shoulder to cry on or a second opinion. I was way too busy to do any of the normal “human” stuff. This was a business, and we had to make money. Money, money, money. Everybody wanted some money.

I understood the need for money. But I never understood why something as simple as one additional staff member—let’s say a nursing assistant “valued at,” roughly, $40,000 per year—could not be provided just so that our unit could function well. Because, as all nurses know, one fall in the bathroom per year can cost your facility a million dollars in a lawsuit. And we heard about those lawsuits, believe me. And they could have been prevented. All you needed was an extra set of hands. We weren’t asking for much. But what do nurses and doctors know about healthcare, right? I mean, doesn’t the CEO know that a patient admitted to the hospital with a right-hemisphere stroke tends to be impulsive, is at high risk for falls, and, when left alone in the bathroom, is almost guaranteed to attempt to get back to bed—unsafely—on his or her own? That’s what happens when you have a stroke that affects that part of your brain. But I’m just a nurse.

I joined the union and ended up being one of the five nurses from the hospital who renegotiated our contract with the hospital. We went through the entire thing, line by line. Unlike the senators, we knew the thing inside out. It was highlighted to the point of becoming so saturated in color that the paper ignited our room in flashes of neon yellow. We underlined words we wanted changed; we looked up Washington State “codes”; we included clauses that we thought would serve the interests of our hospital’s nurses and patients. We did our best, no doubt about that. It was a long seven-month process. Unlike the members who worked “defending” the interests of the hospital, we were not getting paid a penny to be there. In fact, lunch, parking, and all other associated costs fell solely upon us. If we wanted to make this hospital a good place to be a patient and a good place to work, we had to want that. Really bad. And want that we did.

I remember a conversation I had with our labor attorney. She was a middle-aged woman with dirty-blonde hair and a gentle smile. She would sit there listening to all the nurses point out the strengths and weaknesses of the hospital. She’d let us rant for minutes and then interject with a brief, “I like that!” Then she would write down whatever we had said. During one of our lunch breaks, I asked her why she did what she did.

“Why defend nurses?” I asked her.

“Why not work for the hospitals and make the big bucks?” she asked me. “I could never do that,” she continued. “I could not do that ethically. Never.”

For Laura, making a little over forty thousand dollars a year was more satisfying than making six figures and helping destroy this country by allowing hospitals to become businesses more interested in money than in providing quality healthcare. The attorneys fighting unions in defense of the hospitals were essentially fighting for corporate interests. Those guys do not give a damn about you or your health. What matters to them is how much they can get away with while making a fat profit. That’s why hospitals hire the nation’s best attorneys.

Laura did not think that ethics should ever be compromised by economics. In fact, economics should always be subservient to ethics. If you were to be called a good person, you needed to act in the most ethical manner possible. And sometimes, especially in healthcare, this called for acting in a very un-economical manner. “People over profit,” as them dirty socialists say.

I didn’t use the word “socialist” above unconsciously. In fact, I used the term precisely because democratic socialists surrounded me the entire time I was working with the union. Over coffee, during and after meetings, so many of our conversations turned to politics, women’s rights, human rights, and economic equality. The people who surrounded me were some of the kindest individuals I had ever met in my life. These ladies were the very epitome of moral leadership. My sense of morality was being nourished and sustained by these conversations and by our work for the nursing union. Every time we met, I thought more and more about politics and economics…

Take Fran, for example. She was a jolly woman, somewhere in her sixties, who participated in women’s rights demonstrations and strikes outside of hospitals, and was a proud participant in the hippie movement during the ’60s and ’70s. She would let homeless people into her home, feed them, and rant non-stop about social justice. She never stopped talking. I began calling her “Frantic Fran.” And, as she knew by then, I refused to participate in her “Fran-tasies!” She read books by the socialist writers Chris Hedges and Cornel West. One night I invited her to come with me and see Hedges at an event at the Bing Crosby Theater. She graciously accepted and spent the night refusing to eat my popcorn, listening voraciously as Hedges critiqued the corruption in our government. Aside from her denying me the pleasure of sharing popcorn with one of my professional colleagues, I think Fran qualifies as a moral leader. She is compassionate, she is involved, she knows what she is talking about, and she cares deeply about the things she engages in. Her actions are the direct result of her thoughts and words.

Then there was Cheryl. She was the sweetest and gentlest of the bunch. She had a gorgeous smile and bright eyes. She radiated a certain grace. During one of the sessions in which we asked nurses to write down their concerns regarding hospital staffing, we were met by one or two nurses who were Republicans and, naturally, could not stand unions. Before we met them in person, Cheryl took me aside and said, “Look, Moses, some of these nurses here don’t like the union. They think it’s bad. They will try to hinder our progress. They will monitor our activities and may report us if we do anything that does not comply with hospital policy [such as talking to nurses about the union while they are actively working and involved in patient care]. In such cases, we’ll just smile, offer coffee and cookies, and move on. Moses, don’t be angry at them. You are doing them good—they simply don’t know what you are doing for them.” Cheryl was echoing that prophet’s words: “Father, forgive them, for they know not what they do.”

Somewhere during my life as a nurse, a college student studying theology, and a union member, I began reading Robert Reich’s works. The things Reich wrote about struck a chord with me. He was dealing with the same issues I was dealing with. This was real. This wasn’t economic theory. This wasn’t some bullshit Hollywood one-night flick. This was my life. These were my patients’ lives. I watched his documentary Inequality for All and found myself dumbfounded. (The documentary was recommended to me by our hospital’s own medical director. These damn socialists are everywhere!) The economy was rigged and nobody was doing anything about it. CEOs were barely squeezing their fat asses through the bank doors to cash their insane checks—checks they wrote to themselves. A select few were reaping the majority of the country’s money. Big Pharma was having a cakewalk buying out lobbyists, senators, and scientific studies left and right. In fact, the pharmaceutical industry became so successful at purchasing studies that in 2005 John Ioannidis, a Stanford epidemiologist, titled his paper “Why Most Published Research Findings Are False.” It has since become the most downloaded technical paper in the history of PLOS Medicine. This shit was pervasive. Our scientific community was being handed over to corporations that didn’t have any sense of ethics. They didn’t give a damn about right or wrong. They didn’t give a rat’s ass about false research, so long as they made a dollar or two.

They placed profit over people. They made ethics subservient to economics. They did what the democratic socialists feared all along: they neglected their sacred duty to be good, ethical people. They welcomed in the greedy, all-consuming hands of unfettered capitalism—a virus so sickening that even an ethical human being, once infected with it, fails to abide by simple, universal principles of right and wrong; simple things like “Don’t lie” fall on deaf ears. But that’s what happens when a society, a democratic community of people, allows economics to be the end-all, be-all of human flourishing. When ethics are thrown out the window, everything is permissible. And when unfettered capitalism pits economics against ethics, it doesn’t take an MLK to figure out which one goes flying out the window first.

At our meetings with the hospital we asked for safe staffing. All of us delivered “speeches” to the hospital’s attorney. Some of us spoke like Demosthenes. But that was, mostly, to no avail. The attorney, after one such speech, told the nurses to quit offering her “sound bites.” She didn’t give a shit about patient safety or the concerns of the nurses. We spent one such meeting discussing safety concerns for something like ten hours. The following day, the hospital’s human resources administrator sent out an email “summarizing” the efforts of the union (in my own paraphrase):

“The Washington State Association of Nurses is requesting that all employees pay union dues. We believe that employees should be allowed to exercise their right to choose whether they would like to be a member of the union and pay union dues. Therefore, we are not in agreement with the union. Negotiations are expected to resume on…”

Reading the email, I realized exactly what it felt like when biased journalism is passed off as dogmatic truth. Here was a summary of our activities that, while stating something technically true, failed to convey the atmosphere of those meetings. We were not emphasizing mandatory union membership for all new employees. We were not asking for wage increases all day. We were asking for safe staffing. And we spent the majority of our time giving reasons why. That, in short, was the real concern of the union and the nurses. (Excepting one nurse who was more concerned with money—she was the only bad apple on our team.)

The email hit me like a ton of bricks. I could now relate, subjectively, to people who read magazine articles about themselves only to find that the articles have (almost) nothing to do with them.

The emails the hospital sent out had a clear agenda: convince the nursing staff that the union was a thorn in their side. Despite this false propaganda—and it wasn’t explicitly evil, it was mostly subtle—we continued having staff meetings, served coffee, and discussed the need for a strong union at the hospital. Twice a month or so we’d meet at the collective bargaining table with the hospital. In between those times we’d meet separately with our attorney, union representatives, and nurses to prepare the issues we’d raise at upcoming meetings. We took notes, wrote “speeches,” and essentially came up with every argument and counterargument regarding safe staffing ratios at our hospital. We had scientific research papers showing that state-legislated nurse-to-patient ratios, such as California’s, actually saved hospitals money compared with similar states, thanks to a decrease in post-surgical infections secondary to better staffing.[1] An article published July 14, 2015, in Scientific American was titled “Widespread Understaffing of Nurses Increases Risk to Patients.”[2] The blurb below the online version read: “Emerging data support minimum nurse-to-patient ratios, but hospital administrations are reluctant to adopt them.” Such was the state of staffing nationwide. And as of the date of this writing (December 22, 2015), California remains the only state in the entire country that mandates nurse-to-patient ratios by unit. No other state does this. Why? Aren’t there laws in this country mandating how hospitals should run? Well, sort of. The law you are thinking of is probably the shitty Code of Federal Regulations provision on nursing services (42 CFR 482.23[b]). It states, and I am not joking, “The nursing service must have adequate numbers of licensed registered nurses, licensed practical (vocational) nurses, and other personnel to provide nursing care to all patients as needed.”[3] That’s it.
The nebulous and vague language is as weak as Tweety Bird facing Marshawn Lynch in beast mode. What in the hell does “adequate numbers” mean? We all know—and I am using the categorical “all” here—that hospitals nationwide are not being staffed adequately. And who is best positioned to determine what is adequate? The nurses. And I know that they know that they aren’t being staffed adequately. But the hospital administrators—about as detached from healthcare as a bedbug is from beauty care products—have no idea what the hell adequate staffing is. They sure as hell know how to make a buck or two, but don’t give me the nonsense that they understand nurses, doctors, or the needs of patients. They don’t.

It’s no surprise, then, that we never got safe staffing at our hospital. We never got ratios put into our contract. Of course we knew it was next to impossible, but a couple of us decided we’d let the administration know just how the nurses felt. Out of seventy-five nurses at the hospital, something like forty-five wrote small cards stating their support for our proposals regarding safe staffing and ratios. This wasn’t, in other words, something controversial at our hospital or something we, like a despotic regime, were trying to force upon a non-compliant majority. In fact, truth be told, we were the majority. But not all stories, as I’ve grown to learn, have happy endings. Ours certainly didn’t. However, there were a few things that we did change. We included a Washington State code that mandated safe staffing committees at hospitals—we copied and pasted it right into the contract. It would, theoretically speaking, give nurses some negotiating power when staffing went downhill. We could, in theory, at least point to the wording and say, “Look, the safe staffing committee doesn’t think these ratios are safe.” We did do that. We also, somewhat reluctantly, spent time negotiating our wages. In comparison to nearby hospitals, we were behind by something like twenty percent. The hospital ended up giving the entire nursing staff, across the board, a two-dollar-and-fifty-cent raise. They figured, for reasons unknown, that this was a good idea but increasing staff was not. I have no idea why they did what they did, but they did. So we took our money—and our real fight (i.e., safe staffing)—packed our bags, and headed home.

In all honesty, I left a better—more informed—person. I may have lost the battle, but I have not lost the war. I took the war with me, brought it home, brooded over it for weeks. The weeks turned into months. And here I am, months later, still contemplating all of these real issues. How is it that unions are so weak? How is it that, in America, we no longer care about unions? How is it that at my own hospital nurses were fighting against us—against their own?

Rewind my life a couple of years and you’d find me standing behind a cash register at a T.J. Maxx store in a white dress shirt, a tie, and khaki pants. I was on my way to nursing school, had finished my prerequisites, and was killing time during the long months I spent on the waiting list. After I signed on with T.J. Maxx, the managers gathered us into a room during orientation and played us a video. At that time—and this was a long time ago—they still used videocassettes. So here was this chubby, happy-go-lucky manager with whiskers and a thick Italian accent trying to teach us youngsters why unions were scary. He was probably just a talking head for the corporation doing his job. I now doubt he had any idea what he was talking about. He probably got a memo with a basic script, read it to us, and went home to a nice wife, two kids, and a dog, all in a wealthy suburb. Even then I never understood why corporations like T.J. Maxx feared unions. What was so bad about people uniting? What was so dangerous about people having power? What was so bad about a democratic process?

And—there—I said it: this was about democracy. The corporations hated democracy. They hated the fact that regular people like you and me could gather together and tell them—nay, demand—certain rights. We, as a collective bargaining unit, could voice our concerns; we, as a community, could have a say in the way we were treated, the wages we were paid, and our working conditions. And the two or three bigwigs at the top did not give a rat’s ass about your rights. So long as you left them alone and gave them 300x what their average employees made, they were happy. But they would not be happy for long. Why should the CEO make only 300x more than his or her average employee? Why not 400x more? Why not 500x more? Eventually the CEO asks himself: Why the hell should we even pay these fuckers at all?

Robert Reich, in his latest book Saving Capitalism: For the Many, Not the Few, writes about what he calls “the meritocratic myth.” This is a capitalist myth invented by the rich and wealthy to keep the little people sucking their thumbs for life. Essentially it goes like this: in a capitalist society, people are paid what they are worth. So, if you have a CEO making a billion dollars a year and a worker at her corporation making six dollars an hour, well then, so be it: the worker must only be “worth” that much. But there is a historical problem with this. (History tends to reveal all kinds of problems, in my experience.) As Reich writes,

“Anyone who still believes people are paid what they’re worth is obliged to explain the soaring compensation of CEOs in America’s corporations over the last three decades, relative to the pay of average workers—from a ratio of 2 to 1 in 1965, to 30 to 1 in 1978, 123 to 1 in 1995, 296 to 1 in 2013, and over 300 to 1 today. Overall, CEO pay climbed 937 percent between 1978 and 2013, while the pay of the typical worker rose just 10.2 percent.”[4]

Clearly, as history shows, CEOs are making more and more while the rest of us, once our wages are adjusted for inflation, are making less and less. How is it that in a capitalist society that calls itself a democracy, we have a large portion of people making these corporations what they are yet not being compensated for their work? Why is it becoming increasingly common to think that a CEO—somehow in isolation from all the employees working with him—is the only one worthy of his wages? If you think this is the case, congratulate yourself: you’ve bought into a myth they want you to believe. It’s like a child’s belief in Santa Claus. There’s no empirical proof for Santa’s existence, but it keeps the naughty kids in check. And all you naughty workers need to suck your thumbs and suck it up: life ain’t fair. The CEOs make a lot of money. Deal with it. But how do they continue to make so much money? One reason is that they have money, and money gives one access to power. Access to power gives one access to lobbyists and senators. You get those guys to write a bill favoring you, and you’re good to go. This is why even when CEOs screw up, they still get paid—for the majority of people in America believe all kinds of myths, and the myth of meritocracy is one of them. Take Martin Sullivan, for example. He made $47 million when he left AIG—a company whose share price dropped almost a hundred percent under his leadership. But CEOs are paid what they’re worth, you say? Thomas E. Freston, the CEO of Viacom, got a severance package of $101 million after being fired.[5] The list goes on ad infinitum.
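Reich’s numbers hang together arithmetically, and you can check them yourself. Here is a minimal back-of-the-envelope sketch in Python—my arithmetic, not Reich’s—assuming his 1978 and 2013 ratios can be combined directly with the 10.2 percent worker-pay growth (the small gap from his 937 percent figure presumably reflects rounding and slightly different underlying pay measures):

```python
# Rough cross-check of the CEO pay figures quoted above (my arithmetic).
# Assumption: the 30:1 (1978) and 296:1 (2013) ratios can be combined
# directly with the 10.2% growth in typical-worker pay over that period.
ratio_1978 = 30
ratio_2013 = 296
worker_growth = 1.102  # typical worker pay rose 10.2% from 1978 to 2013

# If the ratio widened that much while worker pay barely moved,
# CEO pay must have grown by roughly this factor:
ceo_factor = (ratio_2013 / ratio_1978) * worker_growth
ceo_growth_pct = (ceo_factor - 1) * 100

print(round(ceo_growth_pct))  # ~987%, in the ballpark of the quoted 937%
```

In other words, the two sets of numbers in the quote are mutually consistent; the myth-busting does not depend on taking any single figure on faith.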

You know you’ve complained about the lazy McDonald’s worker—and rejoiced when he was fired—but when was the last time you complained about the CEOs and their pay? The rich and powerful have always prided themselves on being able to make the little people wage war on littler people. Seldom do the poor gather together and wage war on the elite who enslave them, the ones who are responsible for the majority of their problems. Welfare is an issue? Are you kidding me? CEO severance packages are the issue. Stop comparing fleas with elephants in the room, pal.

And guess who’s paying for the CEO pay? You and I. You heard that right: you and I. Yes, we’re paying. “[C]orporations deduct CEO pay from their income taxes, requiring the rest of us to pay more proportionately in taxes to make up the difference. To take but one example, Howard Schultz, CEO of Starbucks, received $1.5 million in salary for 2013, along with a whopping $150 million in stock options and awards. That saved Starbucks $82 million in taxes.”[6] And you, my friend, the one residing in Washington State, subsidized his pay. We are responsible for the $82 million loss in tax revenue. Congratulate yourself. And next time you pay taxes, remember: some of that money is going to Howard Schultz. Literally.

And while the (mostly) Republican fan base protests an increase in the minimum wage, try to swallow the $26.7 billion paid out to the already rich Wall Street bankers in bonuses alone. This “would have been enough to more than double the pay of every one of America’s 1,007,000 full-time minimum wage workers that year.”[7] But enough about the majority of people residing in America. Who cares about those guys, right? All you have to do is work hard. Get a degree. You’ll be fine, they said. Well, that’s no longer true, either. “[B]etween 2000 and 2013, the real average wages of young college graduates declined.”[8] In the past, even a factory worker could provide for his stay-at-home wife and three kids. He could buy a small home in a good neighborhood and own two new cars. Today, that’s not the case. The worker is making shit, his wife is making shit working full-time, and they can’t afford children or good healthcare. The cars they drive are owned by some big bank. The house they live in is owned by the same bank. The degrees they both hold were bought with debt—owed to the same bank. Nothing is theirs. They are, no doubt, slaves to Wall Street. They work, they breathe, they live to pay some dude at the top. That’s the reality of modern America. But what happened? Did our GDP decrease? Did something happen that could explain this profound change in the economic reality of many Americans?

“Since 1979, the nation’s productivity has risen 65 percent, but workers’ median compensation has increased by just 8 percent. Almost all the gains from growth have gone to the top.”[9]

That, my friend, is what happened. No, it wasn’t the Mexicans; it wasn’t the Muslims; it wasn’t the immigrants; it wasn’t the bum that caused your problems. It was the rich and wealthy people mostly populating a small section in New York called Wall Street, and the rest of the Wall-Street-inspired, greedy CEOs.
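Incidentally, that $26.7 billion bonus claim a few paragraphs back is easy to verify with back-of-the-envelope arithmetic. A quick sketch, assuming (my assumption, not Reich’s) that “full-time minimum wage worker” means the federal minimum of $7.25 per hour at 2,080 hours per year:

```python
# Sanity check on the $26.7 billion Wall Street bonus figure.
# Assumptions (mine): federal minimum wage of $7.25/hour, full-time
# meaning 40 hours/week for 52 weeks (2,080 hours/year).
min_wage = 7.25
hours_per_year = 40 * 52           # 2,080 hours in a full-time year
workers = 1_007_000
bonus_pool = 26.7e9

annual_pay = min_wage * hours_per_year   # ~$15,080 per worker per year
cost_to_double = annual_pay * workers    # ~$15.2 billion total

# The bonus pool alone more than covers doubling every such paycheck:
print(bonus_pool > cost_to_double)  # True
```

Doubling every full-time minimum wage paycheck in the country would have cost roughly $15.2 billion—comfortably less than what Wall Street handed out in bonuses alone.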

Let’s play a little game of comparisons. Let’s have Reich take us back a couple of decades.

“Fifty years ago, when General Motors was the largest employer in America, the typical GM worker earned $35.00 an hour in today’s dollars. By 2014, America’s largest employer was Walmart, and the average hourly wage of Walmart workers was $11.22…The GM worker was not better educated or more motivated than the Walmart worker. The real difference was that GM workers a half century ago had a strong union behind them that summoned the collective bargaining power of all autoworkers to get a substantial share of company revenues for its members.”[10]

And there you have it: the solution to our current crisis. It’s been in American history—our history—for a very long time: we need strong unions.

Do we have any real evidence that unionization will save our nation? I think we do. As Reich points out,

“Some argue that the decline of American unions is simply the result of ‘market forces.’ But other nations, such as Germany, have been subject to many of the same ‘market forces’ and yet continue to have strong unions…In contrast to decades of nearly stagnant growth for most Americans, real average hourly pay in Germany has risen by almost 30 percent since 1985.”[11]

Here we see two countries going through the same technological revolution that has engulfed America, with vastly different economic results for the middle and working classes. One key difference is that unions are still alive and well in Germany.

Returning for a brief second to the minimum wage: critics—those people who live in castles made of cloud-stuff and write with immense knowledge from their ivory towers—claim that increasing it would result in massive unemployment. But has this actually been the case? Not at all.

“Research by Arindrajit Dube, T. William Lester, and Michael Reich confirms this. They examined unemployment in several hundred pairs of adjacent counties lying on opposite sides of state borders, each with different minimum wages (one at the federal minimum, the other at a higher minimum enacted by a state) and found no statistically significant increase in unemployment in the higher-minimum-wage counties…”[12]

There are other reasons why we need to increase our minimum wage. And this one involves the rich and wealthy yet again. (I hope you’re starting to see who’s really causing a lot of our problems.) This may not be news to you, but corporations want to pay their workers as little as they can. Why? Because they can make more money that way. But in paying their workers a minimum wage that is no longer the “living wage” it was meant to be, they make the rest of us taxpayers subsidize companies like Walmart and McDonald’s. Here’s how.

People who are paid minimum wage are usually on government-subsidized programs like food stamps and Medicaid. Where is the money coming from to pay for these programs? From you and me, yet again. Since McDonald’s doesn’t want to pay its workers living wages—wages they can survive on—it makes the rest of us pay for its employees. Next time you walk into a McDonald’s, remember that you are helping those workers get their paycheck. Walk in like you own the place (because you do).

If you think I’m kidding, here’s another statistic for you:

“[I]n 2012, 52 percent of fast-food workers were dependent on some form of public assistance, and they received almost $7 billion in support from federal and state governments. That sum is in effect a subsidy the rest of American taxpayers pay the fast-food industry for the industry’s failure to pay its workers enough to live on.”[13]

We the people have come to a crucial crossroads. If we were Jesus, this would be our “cleansing of the temple” moment. Somebody has got to stop this. Somebody has got to take a stand, defend unions and fair wages, and put an end to corporate greed.

In the 1950s, something like thirty percent of the private sector was unionized. Today, that number is somewhere around seven percent. And, not coincidentally, as union rates decreased, wages decreased along with them.[14] This issue is very personal for me, too. I moved from a unionized state (Washington) to a right-to-work state (Georgia). I knew I was in for a pay cut. What followed, however, could have come straight out of communist Russia. I went from making $30.32 per hour (along with an evening shift differential of $2.50) to making $27.66 (with no shift differential). That was the best pay I could get—and only with a letter written on my behalf to the CEO. They said the cost of living was cheaper in the South. I have found that to be mostly untrue. Starbucks is the same price everywhere you go. And that’s the reality I live every day. You make less, you spend more. During the time I spent looking for a job, I was offered wages as low as $24 per hour (at my previous hospital, new graduates started at over $27). The point being that I mostly live paycheck-to-paycheck now. I know exactly what they mean when they say, with a meh look on their face, “I live in a right-to-work state.” Wisconsin this year became a right-to-work state under its governor Scott Walker, a whore in bed with the Koch brothers. Slowly but surely, many Republicans, corporations, Wall Street, and the Koch brothers would like to make the minimum wage obsolete. It would make their lives so much easier. Then they could pay you anything they liked.
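For the number-minded, my own pay cut is easy to quantify. A quick sketch—taking the Washington figure as base pay plus the evening differential, since that was my actual hourly take:

```python
# Quantifying my own right-to-work pay cut. The Washington figure is
# base pay plus the $2.50 evening shift differential; Georgia paid a
# flat rate with no differential.
wa_pay = 30.32 + 2.50   # effective $32.82/hour in Washington
ga_pay = 27.66          # $27.66/hour in Georgia

cut_pct = (wa_pay - ga_pay) / wa_pay * 100
print(round(cut_pct, 1))  # 15.7 -> roughly a 16% effective pay cut
```

A nearly sixteen percent cut in effective hourly pay, for the same job, with more experience—so much for the cheaper cost of living making up the difference.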

In the latest research comparing wages in right-to-work states with those in non-right-to-work states, those living in right-to-work states (states located mostly in the South) made 3.2% less.[15] The next time you hear a (usually Republican) presidential candidate or senator talk about the beauty of the “free market” and the wonder of “right-to-work” laws, try to do what scientists do: follow the evidence. Don’t let anyone tell you something that is not grounded in reality. The people living in the American South are poorer than those living in the North. That’s the simple truth. If you wish to join their ranks, vote for any Republican of your choosing. If you wish to improve your working conditions and wages, vote for whoever is a big advocate for unions (“collective bargaining”)—and these folks are seldom Republican.

My essay must now come to a screeching end. It’s become longer than I initially intended. I thought it’d be short and sweet. Lo and behold, it is long and bitter. I hope my reader is more informed for having read it. I believe in democracy not because it is perfect, but because it is the best thing we have. I identify with many democratic-socialist ideals because I think they are the best we have. I believe in unions and in a democratic community of people that isn’t afraid of saying, “Yes, we believe in a government that is not afraid of regulating Wall Street and economics.” Call this a very, very weak form of socialism. Oh well. I’m not here to defend terms or to hide behind words. I identify with many of the concerns Bernie Sanders has. Having said that, I think it’s safe to call me a democratic socialist. I don’t think I, or people like me, have all the answers. We don’t claim to. I’m not saying my ideas will not create problems and unintended repercussions. All ideas do that. I am saying that this—all the words written above—is a good place to start. I am saying that we must not allow ethics to be replaced by economics. I am saying that we need a strong democracy. I am saying that we need to be able to have large—in Trump’s words, “y-uuuge”—unions. Finally, I hope that long after you’ve finished reading this, the stench of unfettered capitalism still fills your nostrils, causing you to spend a disproportionate amount of time sniffing the air and wondering what the hell went wrong…


Written by: Moses Y. Mikheyev



[1] P. G. Shekelle, R. M. Wachter, and P. J. Pronovost (eds.), “Making Health Care Safer II: An Updated Critical Analysis of the Evidence for Patient Safety Practices,” AHRQ Publication No. 13-E001-EF (Rockville, MD: Agency for Healthcare Research and Quality [US], 2013). See also “Safe-Staffing Ratios: Benefiting Nurses and Patients,” Department for Professional Employees, AFL-CIO, accessed December 22, 2015.

[2] Roni Jacobson, “Widespread Understaffing of Nurses Increases Risk to Patients,” Scientific American, July 14, 2015, accessed December 22, 2015.

[3] 42 CFR 482.23—Condition of Participation: Nursing Services, accessed December 22, 2015.

[4] Robert B. Reich, Saving Capitalism: For the Many, Not the Few (New York: Alfred A. Knopf, 2015), 97.

[5] Ibid., 104-5.

[6] Ibid., 105.

[7] Ibid., 111.

[8] Ibid., 117.

[9] Ibid., 123.

[10] Ibid., 126-7. Italics mine.

[11] Ibid., 127.

[12] Ibid., 136.

[13] Ibid., 137.

[14] Ibid., 89, 131.

[15] Elise Gould and Will Kimball, “‘Right-to-Work’ States Still Have Lower Wages,” Economic Policy Institute, Briefing Paper No. 395 (2015).