Calm Before the Storm

I was standing on the back deck of my uncle’s house, chatting with him while he grilled burgers and hot dogs. It was a blazing summer afternoon, with blue skies overhead.

My uncle scanned the sky. Then he turned to me and calmly stated, Once this food is done cooking, we’ll want to bring it inside. It’s going to rain soon.

I was incredulous. Sure, there were some clouds off near the horizon, but they weren’t the ominous variety that screamed Rain. There were no rumbles of thunder in the distance or flashes of lightning.

Nevertheless, I heeded his warning. And 20 minutes later, we were in the kitchen, watching the rain come down in sheets where we had previously been standing.

I was in awe of my uncle. How could he so easily tell that it was going to storm when I saw so few signs of it?

My uncle is not a meteorologist. A renowned surgeon and cancer researcher, he does his professional work far from any weather center. That work requires precision, ingenuity, and many long hours in operating rooms and labs.

And yet, in his limited spare time, my uncle seemed to have developed an uncanny ability to sense the impending danger in the skies ahead.

I was only a teenager at the time of this story, and I had no true vision for my future. Yet, this revelation hit me like a lightning bolt. If my uncle could make time to understand the weather, perhaps this was a skill I could pick up too.

So, I started studying radar maps and watching The Weather Channel. I took an introductory college meteorology course for fun, and I ended up with the top grade in the class. And when I worked as a news producer as a young adult, I would constantly pick the brains of the staff meteorologists to fill the gaps in my knowledge.

I was captivated by the idea of knowing what comes next. I was relieved to know I wouldn’t get caught off-guard by shifting weather patterns. I was confident in dressing properly for the elements.

But most of all, I was entranced by the details — particularly, the moments of change. I was mesmerized by the rush of fresh air from a cold front. I was enchanted by the smell of dew at dawn. And, of course, I was awestruck by the calm before a storm.

It became an obsession. And that obsession has persisted.


Lately, I’ve been thinking a lot about the calm before a particular storm.

This storm didn’t bring thunder, lightning, rain, or snow. In fact, it wasn’t a weather event at all. But it wreaked plenty of havoc, nonetheless.

This storm was a global pandemic.

We should have seen it coming. News of a mysterious virus plaguing China had made it around the world long before the virus itself did. But the vast distance gave many of us — particularly here in America — a false sense of security. It led us to believe that It won’t happen here.

It did, of course. And now, even with the worst of the pandemic behind us in this nation, our lives have been irrevocably changed.

I am moving forward, as so many of us are. Rather than dwell on what happened, I’m picking up the pieces from a lost year.

But despite all this progress, I find myself going back to a specific time. I keep circling the weeks and months right before the pandemic brought life to an abrupt halt.

Some may think that such a focus is foolish. They might exclaim that the moment is gone now and is not worth fixating on any longer.

And yet, I see things differently.

It helps me to ask what our world looked like while we were standing on our back deck, unaware that a storm was about to blow in. It helps me to think of what we might be able to recapture from those moments.

In some ways, we were at our most idealistic then. I know I was.

In the months before the pandemic, I was battling several crosscurrents. I was at a career crossroads. I was ramping up programming for the local alumni chapter I headed. And I was laser-focused on getting into better shape, physically and financially.

I was living life week-to-week, but with a distant goal in mind. I’d assumed that the world would stay roughly the same over time and that I’d gradually get to where I needed to be.

All this idealism sounds ridiculous in hindsight. Catastrophes have a knack for distorting our vision in this way.

And yet, those shattered illusions might be our best guide for the road ahead.


For all its benefits in a state of emergency, living from moment to moment is not a sustainable activity. If the trauma of a pandemic — or some other crisis — causes us to give up on long-term planning, our future will be as turbulent as our present.

And yet, reverting to our old ways is no simple task. It’s a challenge to head back into the fire after we’ve been burned.

This is the crossroads we find ourselves at now, as the worst of the storm has passed. Do we take our cues from the ravaged landscape around us, or do we harness the spirit that resonated in the air before the skies turned dark?

I have chosen my path.

I’m harkening back to that moment before the chaos and reclaiming the life I’d built in those days. Some of my priorities were out of scope, for sure. That much is clear now. But even with that disclaimer, I was coming into my own back then.

I want that feeling back. I want to believe that the trauma of a pandemic year hasn’t wiped it away for good. And I will do everything in my power to make it so.

I’m sure others feel this way as well. But that feeling might be blown away by the winds of opinion. It might be crushed by the prevalent demands to build something better out of the wreckage.

I’d encourage anyone in this predicament to be still for a moment. To picture the moments before the world turned sideways. And to consider whether that setting — that life — is something worth pursuing once again.

The calm before the storm is a snapshot of doom. But it can also be a moment of opportunity.

Let’s not let it slip by.

On Communication

At first glance, the situation seemed normal.

I was on the floor of an apartment bedroom, with another kid on the other side of the room. Between us lay some toys — miniature dinosaurs, trains, and cars.

It was the kind of scene that was commonplace when children spent time together. But this was no normal encounter.

For one thing, this apartment was in China. The place was comfortable enough, but still rather rudimentary.

And that kid I was hanging out with? He was the nephew of a family friend. Just like me, he was 10 years old. But he spoke no English. And I spoke no Chinese.

We stared at each other in silence for what seemed like an eternity. Both of us were perplexed by the situation we were in.

Then, the boy took one of the dinosaurs from the floor and guided it across the bed. As he did, I made dinosaur noises.

Of course, I didn’t really know what a dinosaur sounded like. The real ones predated me by millions of years.

But it didn’t matter. My sound effects made the boy smile, and then chuckle. Soon enough, we were having a blast, without sharing a single word.


Much has been made about the keys to success.

Some have pointed to talent and opportunity. Others champion focus and grit.

These are important attributes. But I think they all play second fiddle.

Communication is the most important skill there is. And yet, it seems to be the most overlooked one.

We have all kinds of acronyms to describe our performance — IQ (Intelligence Quotient) for smarts, EQ (Emotional Quotient) for social acuity, AQ (Adversity Quotient) for resilience. But all too often, we fail to assess our CQ — or communication quotient.

Perhaps we struggle to quantify the benefits. After all, the smartest people can solve the most pressing problems. The most socially affable people can draw a crowd. And the most resilient people turn setbacks into triumphs.

What awaits the best communicators? It’s hard to come up with concrete examples.

And yet, we know the devastating impacts of poor communication all too well. Failed communication can torpedo even the most promising venture. It can damage relationships, corrode trust and vaporize goodwill.

These are major issues. But we like to pretend we’re immune to them.

We’re not.


Not long ago, I went back to school to earn an MBA (Master of Business Administration).

My business school coursework gave me several new talents — the ability to read financial statements, to understand economic theory, and to enact pricing strategy, for instance. But the most impactful course I took was on business presentations.

At first glance, this seems strange. I already had a college degree in communications, and I’d spent three years working in the news media. Plus, I’d put together several PowerPoints in my marketing career and I’d written plenty of articles here on Words of the West.

But even with all this communication experience, I knew I had room for improvement. There were plenty of times when I had been called on the carpet for a work email that didn’t land the way I intended. And I often struggled to make the desired impact when speaking up at staff meetings.

This course wasn’t designed to address any of that. It was simply a primer on how to present to business executives. And yet, I found it transformative.

For the course didn’t just address the all-too-common fear of public speaking. It delved into the intricacies of eye contact. It established guidelines for speaking cadence. And it provided instructions on how to create a slide deck that tells a story.

These tips were more focused on the audience than the presenter. They were meant to ensure that the message landed properly.

That, of course, is the most important aspect of communication. Messages are only effective if others can decode them in the way we intended. The audience reaction is everything.

I had learned that skill, by necessity, on that day in China. Even with a language barrier between us, I’d managed to forge a friendship with a boy my age.

And yet, I seemed to have forgotten what I’d learned over all the intervening years. But that changed once I took the business presentations class.

I left the course on a mission to be a more effective communicator. I wanted to ensure that my messages landed with precision moving forward, no matter the medium.

This mission is still ongoing. But I’m encouraged by the progress I’ve made thus far.

Still, I wish this wasn’t a solitary quest.


For millennia, humans have evolved their communication techniques.

We started by making standardized noises, which evolved into language. Oral storytelling, cave paintings, and hieroglyphics came next, followed by the written word.

The advent of the printing press and — much later — the microphone spawned mass communication. Radio and television spread these messages ever wider. And ultimately, the Internet made communication both global and instantaneous.

Communication has never been more convenient. Today, we literally have the tools for it at our fingertips. And yet, we fail to use those tools properly.

This is particularly noticeable at our current moment of strife.

A health crisis has cost the world millions of lives. And an economic crisis has cost America millions of jobs. But it’s an ongoing communication crisis that is perhaps most profound.

A world connected as never before has, paradoxically, never seemed further apart. And as the dialogue breaks down, polarization only deepens.

I understand the temptation to eschew open lines of communication. Engaging with others can be tough work, particularly when we have little in common. And the risk of a blunder can seem to outweigh the rewards, making it tempting to avoid communication altogether.

But this laissez-faire attitude has its costs. We’re seeing these broadly now, through the radicalization of society. But we’re seeing them individually as well.

No, not all of us will end up in a room with someone who doesn’t speak our language. But many of us will find ourselves out of our depth at some point. Perhaps we already have.

Basic communication skills can aid us in these unsettling circumstances. But if we’ve let those skills atrophy, there’s no guarantee they’ll come back to us in time. We could quickly find ourselves up a creek without a paddle.

Fortunately, the power still lies in our hands. But it’s our obligation to do something about it.

So, let’s give communication the priority it deserves. Let’s make a choice to engage, even when it seems inconvenient. And let’s ensure our messages stick the landing.

The challenges we face are substantial. But if we communicate with precision, we stand a better chance of rising to the occasion.

Let’s get to it.

Of Words and Weapons

Sticks and stones can break my bones, but words will never harm me.

So goes one of the quintessential schoolyard retorts.

Kids can be brats at times, calling other kids names in order to get under their skin. The sticks and stones phrase has long given the aggrieved an opportunity to blunt these attacks.

Sure, it’s a mouthful. But that’s precisely the point. Its complexity gives the tormentor pause. And this lowers the temperature.

This pattern has repeated itself for years. But things are different now.


Not all words are created equal.

Some bring joy. Some bring sadness. And some are so inflammatory that they’re considered taboo.

Growing up, I knew what these off-limits words were. They were so scandalous that people referred to them by their first letter. The F-word. The S-word. The N-word.

I was not born with this knowledge, but I picked it up quickly.

For instance, when I was 7 years old, I asked my father about a word I’d read in Mark Twain’s Huckleberry Finn. My father implored me not to use that word — which was negro. In the same breath, he warned me to never use its uglier, more inflammatory derivative.

Looking back now, it strikes me just how strange this all was. In order to teach me which word not to use, my father needed to use it.

But I learned my lessons well. I steered clear of bad words with a precision that would have made Mormons proud. By the time I got to high school, my best friend — who dropped the F-words and S-words into most sentences — even ribbed me for being so square.

Truth be told, it was easy to avoid these terms. There was a rich ecosystem of synonyms I could draw from to avoid swearing. And that’s precisely what I did.

But these days, it’s trickier to steer clear of the landmines.


Trigger warning.

It’s one of the terms that’s emerged in this newfangled era.

Trigger warnings guard against information that might upset us. They prepare us for the shock, horror or emotional distress ahead.

The premise of this phenomenon is sensible. We shouldn’t be blindsided when facing disturbing topics, particularly since many of us have experienced trauma in our lives already.

Words can in fact harm us, particularly if they reopen wounds that haven’t fully healed. Trigger warnings are our last line of defense against such catastrophe.

Yet, as our society gets more polarized, the number of terms deemed worthy of a trigger warning only seems to grow. Racial slurs and descriptors of physical assaults aren’t the only sources of consternation anymore. Now, phrases that upset our worldviews make the list as well.

Some of these terms do have ties to partisan politics. Global warming became climate change thanks to a focus group put together by conservatives, for instance.

Still, many phrases with a trigger warning label lack obvious political ties. It’s the associations we draw from these terms that so deeply aggrieve us.

This leaves us with a bevy of words that have turned radioactive. And this time, there are no simple substitutes for them.

We can take the long way and describe the words without using them — a real-life version of the game Taboo. But in an era of dwindling attention spans, these efforts are likely to fall short.

And so, with no clear path forward, we avoid these terms — and their associated topics — altogether. And by doing this, we invoke a sense of shadow censorship.

That should trigger its own warning.


Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

So reads the First Amendment of the United States Constitution.

Scholars, justices, and activists have broken down those 45 words countless times. They’ve attempted to determine what rights people have to express themselves.

But these dry interpretations miss a key angle. Namely, the intent of those who put those 45 words on paper.

The First Amendment was part of the Bill of Rights — a set of personal freedoms afforded to all Americans. These rights were foundational, rebutting the censorship that was commonplace in the colonial era.

The founding fathers wanted us to use our voices without fear of being silenced. In their view, words were not weapons. And opening one’s mouth shouldn’t be treated as an act of war.

While the courts have imposed limitations in a few situations, freedom of expression largely remains intact today. Yet, we now find ourselves restricting our own speech.

By making more and more terms taboo, we are limiting discourse. We are narrowing our perspective. And we are failing to address crucial societal concerns.

Sure, shadow censorship might make us feel more secure and less aggravated. But ignoring the uncomfortable topics around us won’t make them go away. The elephant in the room remains.


It’s time to end the shadow censorship. It’s time to stop treating words as weapons.

Yes, some select words are truly vulgar. And we absolutely should avoid those words whenever possible.

But, by and large, words are not the concern. It’s the actions associated with those words that pose the gravest danger.

This is a point that we seem to miss.

Let’s consider what is really spurring the trigger warnings. Do these difficult phrases trigger emotional distress? Or do they trigger us to acts of aggression?

Both effects are troubling. But words shouldn’t shoulder all the blame for these adverse outcomes. We need to take some responsibility as well.

We have the agency to face our trauma head-on and to help the scars heal. We have the ability to keep dialogue from erupting into violence.

Taking phrases out of circulation doesn’t absolve us of these duties. It only deludes us further.

So, let’s stop with the smoke and mirrors. Let’s rid ourselves of the shadow censorship. And let’s commit ourselves to having important discussions, even if they might be a bit uncomfortable.

This is our best path forward. Let’s not squander it.

The Heart of Morality

I just want to do the right thing.

Many of us have uttered these words after doing something unorthodox.

Staying on the straight and narrow sometimes involves deviating from routine procedures or making personal sacrifices. And this can envelop us with a sense of cognitive dissonance.

Whenever we veer off-script, a conflict emerges between the norm we’re breaking with and the result we’re seeking. Reminding ourselves that we’re doing the right thing helps reconcile that conflict.

The right thing can seem like a nebulous term. But the code it’s based upon is not.

We call that code morality.

Morality means everything to us. It’s the standard we judge others on. And it’s what we critique ourselves on as well.

But how do we derive morality? How do we distinguish between what’s appropriate and what’s unjust?

Many of us believe the answer is basic logic. We claim that tapping into widely accepted values helps us determine what to do next. And we argue that morality is simply the process of following those markers.

It’s a tidy argument. But the truth of the matter is far more complicated.


The final months of 2001 were nothing short of surreal.

America had endured the September 11th attacks. Our military had sent troops to Afghanistan to root out those responsible for the violence. Our economy was in a recession and a sense of tension was in the air.

I was in my early teens at the time, which made these events particularly jarring. In an instant, my youthful innocence was ripped away. A sobering reality took its place.

I went through all the emotions that come with trauma in those months. I oscillated between anger, fear, and sadness. But mostly, I was filled with confusion.

The terrorists who organized the September 11th attacks had committed unspeakable acts — killing 3,000 innocent Americans, toppling skyscrapers, and blasting a hole in the Pentagon. And yet, they claimed they were in the right. They blamed America for a culture of sin. And they touted the morality of their actions.

All of this made no sense to me.

How would sending operatives 5,000 miles to indiscriminately kill civilians be viewed as moral? It violated one of the Ten Commandments from the Bible. (Thou shalt not kill.) And it ran afoul of the guidance of the Quran. (You shall not take life, except by way of justice and law.)

To me, it was as if these terrorists had stacked a crime on a crime. They had done more than just violate the code of morality. They had ripped the code to shreds. This made them evil, in my mind, and thereby worthy of purging.

So, as I slogged through adolescence and early adulthood, I was filled with thoughts of vengeance. I openly cheered the killing of Osama Bin Laden. And I turned a blind eye to the torture of detainees accused of terrorism.

It all seemed so clear to me. Anyone who so blatantly disavowed the code of morality had to be eliminated. I stuck by this logic, even as it took me to darker and darker places.

But then, some new examples of misaligned morals enveloped our society. And this time, the situation was far murkier.

The killing of unarmed Black teens by law enforcement — a longstanding problem — gained widespread attention following the death of Michael Brown in 2014. Protestors took to the streets in Ferguson, Missouri in pursuit of racial justice.

Those protests grew violent, with looting and mayhem. This led to a militarized law enforcement response. Police sprayed tear gas, threw smoke bombs, and fired rubber bullets at the protesters.

In the wake of this confrontation, both sides claimed they were in the right. Supporters of law enforcement said it was their moral duty to prevent looting and assault. The protesters considered racial justice to be their moral quest, a calling that superseded the laws they might break along the way.

Neither claim to morality was fully upheld. But neither was refuted either. And in the years since then, the debate over morality has only grown fiercer. It’s become a defining marker of our societal divisions.

It’s uncomfortable living in conflict like this. So, we keep seeking to close the gap.

We search for that one bit of logic that will neutralize the other side, settling this debate once and for all. And, in the process, we keep finding nothing but futility.

Perhaps it’s time we try a new approach.


On October 6, 1965, the Los Angeles Dodgers dropped the first game of the World Series to the Minnesota Twins. Many players had a hand in the result. But one man who never saw the field seemed to grab the most attention.

Sandy Koufax — the Dodgers’ best pitcher — was supposed to take the mound in Minnesota that day. But October 6th also happened to be the date of Yom Kippur — the holiest day of the Jewish calendar — that year. Koufax, who is Jewish, refused to pitch on that day.

Many criticized Koufax for abandoning his job at such an important juncture. It seemed immoral to some.

But Koufax’s choice might actually have been the purest example of morality at work.

Baseball was Koufax’s profession. He was a steady, dominant force in a sport that meant a great deal to him. But his faith also mattered. It was as much a part of his values as baseball was.

So, when Koufax found the two halves of his identity in conflict, he listened to his heart and made his decision.

Yes, Koufax let emotion — not logic — define his morality. That gave him the clarity and conviction he needed to see his decision through.


The example Sandy Koufax set might seem extreme. But it’s far from extraordinary.

When we drop everything to be there for family or friends in need, we’re following our moral compass. And we’re often doing this at the expense of our logical one.

In a vacuum, such choices make little sense. They’re inconvenient and they pull us away from proven patterns of success.

Still, we can’t imagine not making these decisions. They clearly seem like the right thing to do.

It’s our emotions that are guiding us to go the extra mile. It’s our feelings that are helping us be there in the moments that matter. It’s our hearts that are defining our sense of morality.

Our emotions help us distinguish right from wrong. And through this process, we realize what it truly means to be human.

As such, our mandate is clear.

We must stop relying on logic alone to delineate right and wrong. We must listen to our hearts as well.

It’s our obligation to look beyond our self-interest. It’s our duty to care about each other, be good to each other and be there for each other.

So, the next time we’re faced with a tough choice, let’s resist the temptation to break out the spreadsheets. Let’s give our hearts the chance to guide the way.

Eraser Marks

Have you heard of Julius Caesar? What about Alexander Hamilton?

There’s a good chance you have. And not because you had a salad for lunch or watched a Broadway musical at some point.

We know these names because we are students of history.

In America, we learn about the history of our own nation in school. We also learn of those societies that came before — such as the Roman Empire.

Reminders exist far beyond the classroom walls as well. Idioms, memes, and other colloquial wisdom weave the markers of history into the fabric of our culture.

These lessons allow us to capitalize on what those before us did well. They also allow us to avoid repeating what our predecessors did poorly.

It’s been this way for generations. But now, this arrangement is endangered.


The sea change effectively started in 2017.

America was emerging from the shadow of some contentious events. A brash outsider had won the United States presidency months earlier. And there was a growing clamor that foreign nations might have interfered in the presidential election.

Tensions were high. Then, two events sent the kindling ablaze.

In August, white supremacists marched on a Virginia college town. Then, in October, the New York Times published a sexual harassment investigation of Hollywood producer Harvey Weinstein.

At first glance, these events don’t seem directly connected. The white supremacists were spewing racist hate on one side of the country. On the other, an entertainment mogul was coming undone after years of mistreating women.

But if you look at the response to each of these events, the connection is clear. In both cases, the repudiation of these actions went to a new level. Symbols tied to racism started disappearing from the south, while Weinstein-produced movies vanished from entertainment services.

This was a turning point in what came to be known as Cancel Culture.

The message was clear. No longer would those on the wrong side of history simply face scorn. They might find themselves erased from the record altogether.

In these initial cases, the cancellations turned out to be prudent.

After all, the Confederacy lost the Civil War, and racial discrimination is against the law. So, maintaining symbols of a vanquished cause did little good.

And as for Harvey Weinstein, he was ultimately convicted of rape and sentenced to prison.

But Cancel Culture would grow in the ensuing years. And as the revisionist history exploded, we started to lose our way.


I am a proud alumnus of the University of Miami.

Like any institution, the university is not perfect. But it’s had a profound impact on my life. And it’s proven to be a valuable member of the surrounding community.

The university has made several transformational decisions in recent decades, including upgrading facilities and expanding its healthcare network across South Florida.

But a recent decision caused me to furrow my brow.

The university removed the names of several prominent figures from campus buildings, including that of founder George Merrick. The university claimed that an anti-racism stance fueled its decision.

On the surface, this decision seemed prudent. While Merrick donated 600 acres of land to build the university in 1925, he also spoke of keeping Black neighborhoods outside of greater Miami.

Viewed from a modern lens — or indeed, a humane lens — such ideals are repugnant. But in the 1920s, they were par for the course.

It was the heart of the Jim Crow era back then. And Miami was the newest outpost of the South — a coastal town built along a rail line extension.

Fidel Castro’s ascension in Cuba was still more than 30 years away. And it would be a decade before Dominican dictator Rafael Trujillo ordered the infamous El Corte massacre against Haitians.

Such events helped spur a wave of migration to Miami, turning it into the multicultural mecca we know it as today. But back in 1925, Miami was a mostly white city in a segregationist state.

Merrick’s views on city planning are not to be celebrated, for sure. But canceling him from the university is not necessarily the answer either.

Such an action effectively castigates one man for the sins of his time. It’s a move that even civic leaders think is unfair.

This is not the case of Alabama governor George Wallace openly defying the Civil Rights Act and bellowing Segregation forever. If George Merrick had lived in a more equitable era, there’s a chance he might have had a more progressive stance on racial relations.

But he didn’t. He lived in the South in the 1920s. And now, he’s being punished for that accident of circumstance.


There are few names more infamous than that of Adolf Hitler.

The Nazi leader led the genocide of 6 million people, spurred the rise of fascism in Europe, and sparked the Second World War. In most circles, he’s considered evil incarnate.

More than 75 years have passed since the fall of the Nazis. Most Germans these days have no firsthand knowledge of that despicable era. But they do know who Hitler was.

This is intentional. In the shadow of World War II, the Allied powers removed Nazi symbols from German buildings. But they didn’t scrub the regime’s atrocities from the history books.

The more German schoolchildren learned about the sins of prior generations, the less they’d be inclined to repeat them. At least that was the prevailing idea.

For the most part, this strategy has worked. Some pockets of right-wing extremism have bubbled up in Germany recently. But such scourges took many decades to re-emerge.

As I look at our society, I wonder why we are so set on deviating from this path. The actions of Confederate leaders — let alone the Nazi regime — are far worse than the thoughts of a George Merrick.

Cancelling Merrick for racist views — or any number of figures for the warts of their era — is a flawed approach.

Taking an eraser to the history books doesn’t wipe the slate clean. It simply leaves us with eraser marks.

Such actions deprive us of our database of missteps. They rob us of tangible signs of society’s progression. And they leave us prone to repeat mistakes that could otherwise have been avoided.

History is made of people. And people are flawed.

Julius Caesar got power-hungry and ended up assassinated. Alexander Hamilton’s hotheaded style led him to a fatal duel with Aaron Burr.

Those flaws ended their lives, but not their relevance. In fact, those flaws have become a crucial portion of their relevance.

This is the power of history when it’s left annotated but unvarnished. It offers us the chance to make tomorrow better than yesterday was.

So, let’s not give Cancel Culture a free pass. Let’s stop pretending that eraser marks can rectify the sins of the past. Let’s investigate those sins at face value. And then let’s resolve to do better.

The Advantage

The house always wins.

If you’ve ever been to Las Vegas – or anywhere else with a casino — you’ve probably heard this mantra.

It’s a word of warning. A shot across the bow to anyone who thinks they can tip the scales in their favor when they gamble.

Sure, some lucky people here or there might hit the jackpot on the slot machines or win at the table games. And those big winners might score loads of money.

But ultimately, whoever owns the place comes out on top.

For the laws of probability ensure that for every jackpot winner, there are plenty of others who put their money down and get nothing in return. And anyone who tries to take a shortcut to success — by counting cards in Blackjack, for example — is booted from the venue.

Yes, the moment we walk into a casino, the house has the drop on us. It has to so that the owners can cover those giant payouts — and make profits. If we fail to recognize this, we’re the suckers.

This is why I don’t gamble. It’s why you won’t see me feasting on $40 steaks, ambling up to the card tables, or hitting the slots.

I know which way the deck is tilted. And I ain’t playing.


There are plenty of jargony phrases in the business world.

Terms like EBITDA and revenue can delineate failure and success. Acronyms like IRR and EPS demonstrate how much money can be made on investments.

These words mean everything inside corporate boardrooms. But outside of the office, they’re little more than gibberish.

This is the way it should be. There are far more important things in life than discounted cash flows. Things like our health, our families, and our sense of belonging.

Even so, there is another concept that has use far beyond the business world. That concept is Arbitrage.

Arbitrage represents the advantage businesses seek. It’s the difference between what they pay for an item and the value they get out of it.

In order to make money and sustain success, businesses need to seize arbitrage wherever they can.

This is why casinos stack the deck against us. This is why companies devote entire departments to innovation. And this is why the titans of industry keep acquiring fledgling companies.

For years, I couldn’t stand this concept. I thought that working in the business world meant screwing over someone else. And I didn’t want to live that life.

I never saw the movie Wall Street as a kid. But I definitely would have taken issue with the character Gordon Gekko, who infamously proclaimed that Greed Is Good.

In fact, when I was asked to draw my fears at age 11, I didn’t sketch a Great White Shark or the Loch Ness Monster. I drew a man in a suit on a train platform, holding a briefcase and looking forlorn.

I had no idea how so many could willingly sacrifice their conscience for a taste of success. How could they sleep at night knowing that their gains were built on the misfortunes of others?

To me, arbitrage was nothing more than a mark of shame. But gradually, my thoughts on the matter have shifted.


Western Europe has many things going for it. Picturesque cities, renowned cuisine and timeless works of art — just to name a few.

But freedom of vocation is not on that list.

In many European countries, students must decide on their preferred profession as teenagers. Then they must pass entrance exams, which determine whether they can pursue higher education in that field. Beyond that, their professional future awaits.

There is little room for second-guessing or changing one’s mind. One adolescent decision carries the burden of destiny.

I’m thankful I wasn’t raised in such a system. Because truth be told, I had no idea what profession I would want to work in when I was 18 years old.

I entered college with aspirations of becoming a Hollywood screenwriter, only to find that I lacked passion for it. I shifted my focus to broadcast journalism, and I managed to work in the news media for three years. But then I pivoted again, ending up in the business world I’d once abhorred.

Such a winding path is quintessentially American. This is a nation where college dropouts can create trillion-dollar tech companies. It’s a culture where the words serial entrepreneur are celebrated, not reviled.

There is always an opportunity to try something new. To pull ourselves up by our bootstraps and try a new path.

But lost in such possibilities is any semblance of meritocracy. We can’t rely on the system to buoy us. We must seize whatever advantage we can.

We must embrace arbitrage.

Leveraging our advantage might mean doing something small, like focusing on our uniqueness during a job interview. Or it might mean something big, like dropping everything to seize an unprecedented opportunity.

These actions can help us. But by extension, they can deny others.

And yet, we must accept this subtle cruelty. Because our selfishness ensures our survival.

There has to be another way.


Elon Musk is a polarizing figure.

The multibillionaire is brash, bold, and highly controversial. Plenty of people are repelled by his ego, his antics, and his wild commentary.

Yet, Musk has his admirers as well. For the companies he’s created — Tesla and SpaceX, among others — seek to solve problems that could benefit all of us. Efficient vehicles and ubiquitous space travel can broaden our horizons and redefine our future.

Musk is a torchbearer for a new type of arbitrage. One where an entire society benefits from the advantage.

And while few of us could be Elon Musk — and many wouldn’t even want to — we can follow his lead.

We can use our talents to improve more than our own situation. We can seek out the advantages that benefit our community. And we can leverage arbitrage to bring a positive change to the world.

The pursuit of this type of advantage can steady us. It can provide us the sustenance we need to thrive, without compromising our conscience.

It’s still a zero-sum game. But it’s got far more room in the winner’s circle.

So, let’s be smart about the advantages we seek. And let’s do our best to spread those benefits far and wide.

Going Dry

It was a work of art.

A perfect glass of whiskey on the rocks.

The distiller’s name has evaded my memory. But the smooth taste of the libation has not.

I finished one glass, and then another. Then, I paid my bar tab and went back to my hotel room.

I haven’t touched alcohol since.


As I write this, it’s been more than three years since I tasted that whiskey. Technically, I could say I’ve been three years sober. But I struggle to use that word — sober — to describe myself.

For the way I parted with drinking doesn’t match the sobriety stigma. There was no killer hangover, no devastating hospital diagnosis, no trail of collateral damage to force my hand. I was able to coordinate my own exit.

In this case, it meant saying farewell to alcohol at The Happiest Place On Earth — Disney World. I’d traveled to Orlando for professional training right after New Year’s Day. And with lodging and transport taken care of, I decided to make Disney World my last drinking hurrah.

So, I spent an evening sampling a drink from each of the country pavilions at Epcot — beer in Germany, baijiu in China, a margarita in Mexico. A couple of nights later, I had those two glasses of whiskey at the hotel bar. Then, that was it.

One month without alcohol became two, then three. While I had said my break from alcohol would be temporary, I began to reconsider that stance.

I was having nightmares about returning to drinking. And the anxiety about falling off the wagon overshadowed any lingering desire for whiskey or beer. So, I made my split with alcohol official.

I wasn’t going back. But moving forward would prove tricky.


America and alcohol go hand in hand.

Our obsession with drinking dates to our nation’s origins. Many colonial settlers came from England and Scotland — two regions with a legacy of brewing and distilling. And while these settlers dumped tea into the Boston Harbor in protest of a tax, we’ve long paid surcharges for booze without much complaint.

Our relationship with alcohol has not always been healthy. There are tales of liquored-up outlaws going on rampages in the Old West. And the rise of the automobile has led to an epidemic of drunk-driving deaths.

But our only national temperance effort backfired spectacularly. While Prohibition was the law of the land in the 1920s and early 1930s, bootlegged liquor operations and speakeasy bars flourished. Organized crime outfits benefitted from this boom, and the collective love of libations only deepened.

Humiliated, the government repealed Prohibition in 1933. It had become clear that alcohol, for all its problems, would remain entrenched in our society. Indeed, many of our cultural norms — from dating to celebrating the new year — continue to involve sharing a drink.

When I decided to abandon this legacy, I found myself on treacherous footing.

Social life became surprisingly complex. I would often end up in alcohol-laden settings, turning down drinks left and right. And as I did, I faced incredulous questions from those around me.

How could I just swear off drinking? And why was I doing this if I wasn’t facing a crisis?

I knew why these inquiries were headed my way. My actions were unconventional.

Family, friends, and acquaintances were all trying to be respectful of my decisions — all while saving face.

Even so, the questions upset me.

I was feeling better than I ever had. And yet, time and again, I found myself on the defensive for the choices I had made.

I started withdrawing from social life to give myself a break. And when I did find myself in mixed company, I started announcing my aversion to drinking upfront.

It was draining. Demoralizing even. Then, a global pandemic hit.

Suddenly, social gatherings weren’t happening. And neither were the uncomfortable questions.

This was a relief at first. Even as my anxiety was soaring, this was one area where I could find a bit of solace.

Yet, as the months dragged on, I started to yearn for social life again. And now, as we emerge from the pandemic tunnel, I’m ready to reengage.

I just wish I could do so without being put on trial for going dry.


Behind every lifestyle choice we make is a mission.

My mission for going dry was to be mentally present for each moment of my life.

I didn’t get obliterated all that often in my younger days. But those times that I did still gnaw at me.

Losing control of my thoughts and actions was distressing. And the potential implications were terrifying.

By purging alcohol from my life, I wouldn’t have to worry about ever driving drunk. I wouldn’t need to concern myself with the harmful words I’d later forget ever having said. I wouldn’t be filled with humiliation after making a fool of myself.

These are all positive outcomes — both for myself and those around me. And yet, all too often, I feel like a pariah for choosing this path.

It shouldn’t be this way.

After all, plenty of people don’t drink alcohol. Some avoid imbibing because of their faith or their demons. Others make an active choice to abstain.

No matter the cause of our decision, we deserve better than to be cast into the shadows. We desire a kinder fate than the stain of scorn. We demand the benefit of the doubt in its place.

Social acceptance need not hinge on filling our bodies with poison. Irresponsible behavior need not be boundlessly lionized. And the implications of inebriation need not be ignored.

Yes, drinking will continue to be an important part of our society, our economy, and our culture for generations to come. But there can — and should — be room at the table for temperance too.

I yearn for that possibility.

I long for the day when sobriety is not a loaded term. I pine for the moment when the intricacies of social life are no longer dominated by what’s in our glass.

We are not there — not yet. But with a little more empathy and open-mindedness, we can be someday.

So, the next time you hear someone calling themself sober, don’t assume they have problems. It could just be that they have solutions.

Running On Empty

I’m just not feeling it today.

How many times have you said something like this? Plenty, I’m sure.

We’re not on our A-game all the time. There are instances where we’re out of sync. There are moments where we don’t feel up to the task.

This has been true since the dawn of humanity. And it will continue to be true for generations to come.

And yet, the ways in which we handle such instances have changed in recent years.

I’m just not feeling it has morphed into a code word. It’s become an invitation to abandon the task if we’re not at our peak.

Such a strategy has become widely accepted. It’s even celebrated.

But should it be?


The greatest ability is availability.

Football coaches live by this quote. But it applies far beyond the gridiron.

Just as the most legendary athletes have a penchant for staying in the game, the most accomplished among us tend to remain in the action.

That means showing up, even when we’re not at our best. It means giving our all, even when we know we don’t have much left to give.

It means running on empty.

Such a concept often gets a bad rap. It conjures images of bluffing our way through a task. It amplifies concerns about burnout.

These unsavory outcomes can occur when we run on empty. But they’re only one part of the tapestry.

Many people can run effectively on empty, without the side effects. A mix of preparation and passion can help them sail through, even when they’re not at 100%.

A famous example of this comes from Michael Jordan. The legendary basketball player was already a four-time world champion in June 1997, when his Chicago Bulls battled the Utah Jazz in the National Basketball Association Finals.

The teams had split the first four games of the series, setting up a pivotal Game 5. But on that morning in Utah, Jordan woke up severely ill. Instead of joining the team for the morning practice, Jordan stayed in his hotel room for much of the day. He only arrived at the arena an hour before the game. And he looked incredibly frail.

No one would have faulted Jordan for sitting out the game. But he suited up anyway — and he ended up putting on a performance for the ages. Jordan poured in 38 points, including the game-clinching basket. The Bulls went on to win another championship two nights later.

The “Flu Game” has become an indelible part of Jordan’s legacy. It proved that even when Jordan’s speed, strength, and stamina were stripped away, he could still get the job done. This was a testament to his athletic fundamentals, his competitive spirit, and his love of the game of basketball.

While we might not be Michael Jordan, we also have the ability to make an impact when the odds are stacked against us.

Not long ago, business people routinely battled jet lag to give important presentations halfway around the world. For generations, blue-collar workers have been able to put in long hours, even as their bodies ached. And for millennia, parents facing the roughest of days have managed to remain superheroes for their children.

Of course, these people would much rather be at the top of their game. But when they’ve found themselves far below that level, they’ve adjusted. They’ve been able to run on empty.


A few months before Michael Jordan’s “Flu Game,” I woke up with the stomach flu. After I made a mess in the bathroom, my mother held me out of school.

It took me a couple of days to recover, and I was miserable the whole time. I loathed the fatigue and nausea, of course. But I despised the feeling of helplessness as the world droned on without me.

When I made it back to school, I set a new goal for myself. Perfect attendance moving forward.

And by and large, I managed to achieve that. Over the next decade or so, I only missed a handful of school days. And hardly any were due to illness.

I wasn’t always at my best. But I showed up anyway. And I feel I was better for it.

These days, such a sentiment rings hollow.

Wellness has become a buzzword. And technology has allowed us to filter our persona to our heart’s desire.

Showing up on both the good and bad days no longer has cachet. If anything, it’s viewed as a waste of effort.

Now, not everyone is on board with this airbrushed reality. Some have rebelled against it, rallying behind the phrase If you don’t love me at my worst, you don’t deserve me at my best.

But even this saying is off-kilter. It implies that we should treat mediocrity as an ideal. And that just isn’t true.

Michael Jordan persevered in that “Flu Game” in Utah. But I’m sure he would have much preferred to be at full strength.

The same goes for any of us when we run on empty. We’d prefer a full tank, but we make do with what we’ve got.

It doesn’t take special talent to pull this off. All it takes is a bit of pride in our craft. And a commitment to stick with it through thick and thin.


Our tanks are all empty now.

After a year of illness, job loss, and isolation, we are a shell of what we once were.

It can be tempting to wave the white flag at a time like this. To hibernate until a brighter day emerges.

But such desires are foolish.

There is no escape from what we’ve experienced. The trauma is shared, and it permeates all corners of our existence.

We will only find the light if we do it collectively.

We must stop clinging to the ideal. And we must engage with what’s real instead.

We must run on empty.

Sure, this might feel awkward. But that discomfort is a hurdle we must clear to reach our destination. There is no other way.

So let’s stop bowing out when we’re not our best. Let’s stop looking for the emergency exit at every opportunity.

Running on empty is a feature, not a bug. It’s time we use it to its potential.

The Opinion Trap

Who cares what others think?

How often have we heard someone ask a question like this? Plenty of times, probably.

This question is rhetorical. The implied answer is that we shouldn’t put too much stock in what others have to say.

At first glance, this seems like well-intentioned advice.

After all, there are plenty of people out there, each with their own opinions. If we pander to the crowd, we lose a sense of ourselves. Or worse, we become co-opted by the views of others.

Better for us to promote our individuality. Better for us to wave off the background noise. Better for us to have faith in our own abilities.

And indeed, in a vacuum, such single-minded confidence might work.

But we don’t live in a vacuum. We live in the real world.


High school is an uncomfortable time. And yet, it can be an illuminating one.

Our bodies are transforming. Our minds are going through turbulence. And our social status is still being sorted out.

High school is the first time we’re faced with a real decision. Do we roll with the cool kids or linger among the outcasts?

It’s a cruel dilemma to be thrust upon an adolescent mind. For each decision has steep costs.

If we strive to be cool, we abandon our sense of individuality. We become an embodiment of the views and values of others.

But if we embrace our individuality, we find ourselves banished to the shadows. We miss out on many interactions with our peers. We risk the sting of loneliness at a time when we are ill-equipped to weather it.

My own high school days were marked by the tension between these fates.

I had already switched schools three times by the time I was 14, and I was aware of how difficult it could be to make new friends. Becoming a cool kid would appear to be my best path forward.

But many of my classmates were from a different background than I was. Plus, they were much more outgoing than I was able to be.

So, I tried to split the difference. I joined the baseball team, and I sat near the popular kids as they held court at lunch. But otherwise, I retreated to my own world.

This approach did little to ease my angst. And although I met one of my closest friends during high school, I don’t tend to look fondly on those days.

But perhaps I shouldn’t be so harsh. Maybe I shouldn’t consider the adolescent social status gauntlet so crude.

As it turns out, it’s a great primer for what comes next.


There are many definitions of adulthood. But the one I find most telling is The point at which one is self-sufficient, independent of their parents or guardians.

Yes, adulthood depends on self-sufficiency. And in a capitalist society, that means taking advantage of opportunities to financially sustain ourselves. Landing a steady job, selling enough of a product, or raising sufficient capital are three common ways to get there.

But where do those opportunities come from? They come from other humans.

Whether they’re representing a company or they’re simply consumers, other people are the linchpin to our success. Self-sufficiency is nothing more than a misnomer.

The fate of our future relies on the opinions of others. On their willingness to give us a chance, to provide us financing, to stick with us through thick and thin. This much is unavoidable.

But what of our credo of self-belief? What of our pledge to tune out what others think? How do we reconcile this contradiction?

I call this dilemma The Opinion Trap.

There are two main ways to confront The Opinion Trap. We can lean into it or we can attempt to escape it.

Those who lean in tend to follow the well-worn path. They actively seek the favorable opinions of others — particularly those who will provide them the opportunities they require. This might mean attaining certain educational milestones to stand out to hiring managers or working extra hours to impress their supervisors.

Such work can pay dividends. But it also diminishes the value of these individuals’ beliefs and opinions.

By contrast, some people have sought to escape The Opinion Trap. They’ve broken out from the corporate cycle and set off as entrepreneurs. These nonconformists are steeled by an intense belief in themselves. They’re determined not to let the views of others impact their fate.

And yet, on their way out of Dodge, many budding entrepreneurs are horrified to find The Opinion Trap lurking in their luggage.

Indeed, for their venture to take root, they need funding and a consumer base. And attaining both of those depends on the favorable opinions of others.

The Opinion Trap is insidious. And it is unavoidable.


If I were to pinpoint the moment I became an adult, I’d say it was the day I left my childhood home at age 18. But it wasn’t until I was 25 years old that I was financially self-sufficient.

My parents helped support me in college. And when my first job didn’t pay enough to cover my rent, my parents generously helped with the difference.

I was grateful to my parents for supporting me as I sought my footing in the world. But I also had aspirations of being self-sufficient.

So, when I exited the news media and moved across Texas, I was elated. Finally, I’d be able to sustain myself.

Then, I ran into the rough side of The Opinion Trap.

What I thought would be a two-week job search ended up lasting three months. With media experience all over my resume, I applied for a bevy of media relations and corporate communications positions. I figured this would be the most logical step forward.

But the opinions of the hiring managers filling those roles were unanimous. I was not qualified.

I will admit that these rebukes took me to a dark place. I had always believed in myself, but now I was questioning that faith. Was I really worthless all along, and was this just now coming to light?

Fortunately, I was able to step off this downward escalator before I hit rock bottom. Someone took a chance and offered me a digital marketing position. I didn’t know a thing about marketing at the time, but I got myself up to speed quickly. I’ve been in that industry ever since.

But even with the success I’ve seen, I’ve never fully recovered from that job search. My bouts with Imposter Syndrome — already prominent during my news media days — have only intensified. And I am continually worried that I will fall out of favor with the key decision-makers in my career.

With all this in mind, I’ve leaned hard into The Opinion Trap. I’ve taken on new responsibilities to stay in the good graces of my superiors. I’ve improved my customer service techniques to earn the trust of my clients. And I’ve gone back to business school to fill any perceived gaps in my marketing training.

These choices have paid dividends. But they leave the fate of my career — and my livelihood — squarely in the hands of others. If I run afoul of them in any way, I could end up out in the cold.

I have to live with that.

One way I do this is by escaping The Opinion Trap in all other aspects of my life. When it comes to my hairstyle, my exercise regimen, or the way I spend my free time, I rely solely on my own opinions. Even Words of the West is a venture where I follow my own nose. (Although the trust you put in me, dear reader, does loom large.)

For me, such a divide is necessary. It allows me to control the way I live my life, even if the way I sustain such a lifestyle relies on the good graces of others. That’s a compromise I can live with.

It’s on all of us to find the version of the middle ground that best suits us. To weigh the importance of outside perceptions against our own style. And to balance the two in a healthy manner.

There is no clear roadmap for this objective. We’ll each need to find our own way forward through trial and error.

It’s daunting. But it’s the only way to keep The Opinion Trap from eating us alive.

So, let’s step to it.

The Failure of the Fourth Estate

I entered the newsroom on a mission.

It was my first job after graduating from college. My first time interacting with the big, bad world of adulthood. And I was as idealistic as I was young.

I approached my new role as an evening TV news producer in West Texas with a sense of purpose and responsibility.

I would be providing information to improve the lives of my station’s viewers. What could be more important than that?

Sure, I had heard the doubters and the naysayers. The ones who stated that news was nothing but garbage. I was determined to prove them all wrong.

The path to this objective turned out to be a jagged one. I had my fair share of bumps in the road.

There was the time I bungled some breaking news. There was the election night coverage felled by a graphics mishap. And there was the time my boss chided me for featuring too many crime stories.

But I learned from my mistakes. I iterated. I improved.

By the time I left that job, I’d figured out how to handle breaking news. I’d successfully produced an election night newscast — during a presidential election year, no less. And I’d diversified my news coverage beyond a parade of mugshots.

Ultimately, my desire to stay in the media dwindled, and I left the industry behind.

Yet, I never blamed the media for my decision to leave it.

I never questioned the devotion of the reporters, anchors, and producers who poured their hearts into their work. I never questioned the integrity of journalists who often brought home smaller paychecks than Walmart associates. I never believed the claims of bias and corruption from the naysayers.

For years, I would continue to defend the media against all comers. But those days have come to an end.


The Fourth Estate.

It’s an old term for the role of the media. So old, in fact, that many have not heard of it.

The term comes from eighteenth-century England. In those years, there were three estates of British society: the clergy, the nobility, and the commoners. The press — the Fourth Estate — disseminated information between all three.

Of course, the colonists in North America didn’t think much of this system. They broke away from England, forming a nation that separated church and state. They also removed the formal distinction between nobility and commoners in favor of representative democracy.

And yet, the fledgling nation left the Fourth Estate intact.

The role of the media has been cherished ever since America’s earliest days. Journalists have been given the liberty to disseminate information and hold power to account. And they’ve been largely protected from censorship.

Journalism has chronicled the growth of this nation. It has helped expose corruption. And it has even restored our dignity at times.

But it also has an insidious side. And that element has never been more apparent.


March 2020 was a surreal month.

A deadly virus spawned a global pandemic. And in America, life as we knew it abruptly stopped.

As Americans sequestered themselves, many turned to the news for assistance. With so much fear and uncertainty percolating, the Fourth Estate would be our truth-teller.

But the truth we were provided came with an angle. A dark, insidious angle.

As the lockdowns set in, there were endless reports of overrun hospitals. There were harrowing tales of medical professionals reusing contaminated protective gear. And there were the chilling images of refrigerated trucks acting as makeshift morgues.

The sights and sounds of the first wave were jarring enough. But as we sought further guidance, the media provided us with little reassurance.

The point of the lockdowns had been to limit interpersonal contact. Public health officials believed this would keep the virus from spreading and hospitals from getting further overrun.

Journalists seemed to latch onto this message. And, as we sought guidance for everything from getting exercise to grabbing groceries, the media pounced.

There was the example of the young woman who defeated the virus, only to drop dead after a run. There were all the tutorials about the safest way to scrub down groceries. There were all the other anecdotes of someone doing something mundane and ending up on a ventilator, or worse.

The underlying message was supposed to be clear. Stop trying to play the angles. Follow the public health guidance. Stay home. Stay safe.

But the grisly examples used to drive this point home were outliers. And they painted an alarmist picture, causing undue dread. Even I, the media veteran, had a panic attack after scrubbing down groceries.

There was no denying it. The Fourth Estate had failed us.


Back in 1906, the media changed forever.

That was the year muckraking journalist Upton Sinclair published The Jungle — an insider account of conditions inside meatpacking plants.

The revelations in that book were horrifying. So horrifying, in fact, that they led to a spate of new regulations on both factory labor and food processing.

By showing how the sausage gets made, Sinclair had reformed major swaths of society. He had proved that the media could do more than bear witness. It could effect positive change.

That revelation proved true, time and again. It was the media that exposed the Watergate break-in. It was the media that showed police brutally beating civil rights demonstrators in Alabama. It was the media that held the government accountable for its bungled response to Hurricane Katrina.

Each of those bombshells had us looking on in horror. But the collective outrage forced our country to move forward.

And yet, I don’t view Sinclair’s work as a net positive. At best, it was a mixed success.

For while The Jungle might have ushered in a new age of investigative journalism, it introduced a new element to the mix.

Sensationalism.

No longer was reporting the facts enough. To stand out, a story had to spark emotion.

After all, that’s what the reader — or listener or viewer — wanted. That’s what would grab their attention and keep them coming back for more.

There is no doubt that the media became more sensationalist in the 20th and 21st centuries. If it bleeds, it leads has been a well-known adage for years. And when I was cutting my teeth in the news industry, I was constantly told to find stories with a good hook.

But now, in the wake of a global pandemic, it feels like media sensationalism has reached a tipping point. The overpromotion of cautionary tales and the incessant parade of gloomy headlines have crushed the psyche of millions. They have heightened anxiety, sown distrust, and even led to despair.

In fact, I believe our society will emerge from this pandemic worse off than it needed to be, thanks to the work of the media.

I’m not the only one with these views. A scholarly article from the National Bureau of Economic Research identified a negativity bias amongst journalists. And even The New York Times took note of its findings.

I found some of the explanations for this phenomenon to be lacking. No, people don’t want incessant negativity in the stories they encounter. If they did, Disney films would never have become a commercial success.

But the main point of the research still rings true. The media has failed us with a barrage of sensationalism. It has exploited our emotions too many times. And it has left a trail of psychological harm in its wake.

The Fourth Estate has failed us.


It’s time for the media to change its tune.

It’s time for journalists to treat readers, listeners, and viewers with the dignity they deserve. It’s time for the industry to recognize the damage caused by playing to emotions. And it’s time for the media to handle that power responsibly.

The Fourth Estate can be great again. For our sake, it must.