On Toughness

I dug into the batter’s box and stared out toward the infield.

Each of the three bases had a teammate of mine standing on it. With one mighty swing, I could bring them all home.

It was the scenario every baseball player dreams about. But it was up to me to write that storybook ending.

So, I set my stance. I readied my bat. I stared intently at the pitcher as he wound up and released the ball.

The pitch veered my way. It wasn’t going to be hittable, so I tried to let it pass.

But the baseball kept riding closer and closer to my hands — until it clanged off the base of my right thumb.

The home plate umpire immediately shouted Hit Batter and pointed toward first base. I jogged in that direction, as my teammate on third base jogged toward home plate.

By the time I got to first base, my hand was beginning to throb. I looked over at my coach — who was standing nearby — and grimaced.

Hurts so good, don’t it? he asked. Shake it off. You drove in a run.

I took those words as gospel. And I paid the pain in my right thumb no further mind.


That pitch clanged off my thumb more than half my life ago.

And yet, I still remember the incident like it was yesterday.

For it was my first foray into toughness. The first time that taking it on the chin — or the hand, as it were — brought anything other than unbridled agony.

This time, such an act brought applause and praise. And I was enthralled by the adulation.

So, I made toughness part of my persona. I stopped bemoaning my aches and pains. And I started treating them as badges of honor instead.

My rationale was straightforward. If John Wayne didn’t complain about bumps and bruises, neither should I. If Chuck Norris could dust himself off after taking a blow, so could I.

That meant bouncing back to my feet each time I fell. It meant postponing a trip to the doctor or urgent care if something was nagging me. It meant keeping that upper lip stiff and complaints to a minimum.

I thought that my grit and resilience proved how tough I was. But it turns out I knew far less about toughness than I thought.


I sat on the floor and carefully unstrapped my protective walking boot. As I stared down at my right leg, I flexed my foot upwards and downwards.

With each movement, I felt the tendons around my ankle tighten in resistance. The pain made me grimace.

It had been like this for days, ever since my surgery.

My refurbished ankle was wrapped in bandages like a burrito. And most of the day, those bandages were shielded by my bulky walking boot. My entire lower leg had become an enigma to me.

Those few moments where I shed the boot to change clothes were precious. The flexing exercises were my only opportunities to get a sense of my recovery.

And I didn’t like what I felt.

The blunt ankle discomfort I’d experienced before the operation was gone. But now this intense tendon tightness had taken its place.

My range of motion was in shambles. And so was my confidence.

There would be no quick return to form. I would need weeks of physical therapy and plenty of patience to get my ankle functional again.

And even with all this work ahead of me, there was a chance that the tightness and pain would linger. There was a chance I’d never be as I once was.

I had brought all this on myself. For I had elected to have this surgery, without a hint of hesitation.

The choice seemed as natural as could be. I had been hobbled by a couple of ankle injuries and viewed the process of going under the knife as a Second Level Risk. I yearned for improved mobility and accepted the potential downsides of my decision.

But I hadn’t understood the depth of those consequences until this moment. It was only when that tendon tightness started to take hold that I truly felt the full gravity of what I’d done.

As I stared into the abyss of uncertainty, I realized I had two options. I could throw in the towel and accept my compromised state. Or I could devote myself to a lengthy rehabilitation without any guaranteed returns.

I chose the latter.

It’s been quite some time since I made that choice to face the darkness. That decision hasn’t affected my physical recovery all that much.

Even so, this experience has changed the way I see the world. And it’s shifted the way I see myself.


Several years ago, here on Words of the West, I shared the saga of Jim Stockdale.

Stockdale, a U.S. Naval pilot, spent seven years as a prisoner of war in North Vietnam. He emerged from the ordeal with a Medal of Honor. And he was later elevated to the rank of Vice Admiral.

Surviving seven years of wartime captivity required plenty of physical resilience. Stockdale absorbed the blinding pain of torture, enduring starvation and sleep deprivation along the way.

But it was Stockdale’s mental fortitude that proved most critical to his survival. Other prisoners gave in to despair or fell prey to delusions of an imminent rescue. But not Stockdale.

Stockdale stared right into the abyss, determined yet realistic. He would later define his mental model with clarity and eloquence.

You must never confuse faith that you will prevail in the end — which you can never afford to lose — with the discipline to confront the most brutal facts of your current reality, whatever they might be.

These words have come to be known as The Stockdale Paradox. And they’ve become an ethos for everything from psychological resistance training to business strategy.

Yet, they can also serve as the definition of toughness. They can, and perhaps they should.

You see, toughness is not about ignoring the alarm bells of your central nervous system. It’s not about popping back up off the mat when you’ve been knocked down.

No, toughness is about assessing our impairments and vulnerabilities, accepting their continued presence, and finding the courage to carry on.

Toughness requires us to rewire our brains. It demands that we take a sledgehammer to the concept of psychological safety. It forces us to lean into uncertainty at a seemingly unbearable level.

These are not small asks. But they are attainable.

My recovery from ankle surgery serves as a small example of this. My tribulations appear as a drop of water next to Stockdale’s ocean. But the experience has proven my mettle in a way that no baseball to the thumb ever could.

I now know what true toughness is. And that knowledge will serve me well for the rest of my life.


Many of us will never experience true toughness.

We will never come face to face with our own mortality in a faraway Prisoner of War camp. We will never need to ask ourselves if we’ll be able to walk normally again.

Our lives will remain unencumbered. And for that, we don’t owe anyone any apologies.

But there is one thing we can still do. One change that we simply should make.

We can stop conflating grit and resilience with toughness.

We can. We should. We must.

Shaking off bruises is commendable. Getting back on our feet is notable. But it doesn’t make us tough.

No, dear reader, that moniker demands a higher pedestal. So, let’s take it off the ground and lift it back up to where it belongs.

Second Level Risk

Are you sure you want to do this?

The words filled me with dread. But before I could reply, the technician continued.

Because if this repair doesn’t take, we’ll be out of options. Your device is considered vintage.

I took a moment to try and unpack these words.

I struggled to comprehend how my laptop computer could be a relic. This wasn’t a dusty Remington typewriter from the 1970s. I’d gotten it — new — less than 10 years earlier.

The transaction had cost a small fortune. And I had a hard time believing the computer was now vintage.

But technology moves fast. New editions of the laptop had hit the market since I got mine. Versions with new processors, updated displays, and a completely redesigned keyboard.

This would prove to be a problem, as I desperately needed to fix some busted keys.

I could either take a leap of faith with the technician, hoping he could get the misaligned keycaps back in place. Or I could decline the repair and make do with a compromised keyboard.

It wasn’t much of a decision.

I’d like you to try to fix it, I replied. It’s not working well for me right now, so leaving it as is doesn’t seem like an option.

The technician nodded and took the laptop to a back room. After a few minutes, he returned triumphant. The keyboard was fully intact once again.


When I entrusted the technician with my computer, I was taking a risk.

This was an opportunity to make something broken whole again. But it was also a final roll of the dice.

There were no guarantees that the repair would work. And there was the possibility of inflicting further, irreversible harm to my keyboard.

Such an outcome wouldn’t be beneficial to anyone.

I would be left with a mangled computer. The technician’s reputation would be tarnished. And the manufacturer would face the potential of legal action — if I were so inclined to pursue it.

And so, the technician seemed hesitant — unwilling, even — to proceed. The risk seemed too big to ignore. And the status quo seemed more enticing.

I was decidedly not on board with this thinking.

You see, the computer technician put all risks in the same grouping. But I don’t.

Indeed, I consider the history behind the status quo when making these calls.

If everything is going well, a repair would indeed appear risky. Sure, tinkering might provide new capabilities or unlock new features. But it could also screw up something that was working just fine.

I call this type of scenario a First Level Risk. And I rarely consider it worthwhile.

But if something is already damaged or off-kilter, the risks of a repair seem less stark. Sure, another layer of damage would cause further headaches. But living with a compromised status quo is hardly palatable.

I call this scenario a Second Level Risk. And I’m more willing to take it on.

So yes, I instructed the technician to repair my computer with little hesitation. I made a similar choice regarding surgery for an injured ankle. And yet another to get some rodent-damaged wiring replaced in my vehicle.

I couldn’t imagine making do with what I had. I couldn’t imagine jumping through hoops to maneuver around the damage. (Or not jumping at all, when it came to my ankle.)

Fixing the damage seemed like the only salve. Even if that fix was far from a sure thing.

Second Level Risks were worthwhile.


When I was growing up, I would often go shopping for furniture with my parents.

The store had an As Is showroom. And we would always scour it for discounted furniture.

The As Is items changed out frequently. But they tended to have one thing in common — defects.

Many found these defects acceptable — or at least acceptable in exchange for a lower purchase price.

But to the best of my recollection, my family did not.

I was too young to have an informed opinion back then. But now, decades later, I find myself continuing my family’s legacy.

I don’t want anything of mine to be As Is. I don’t want to be hindered or compromised.

And so, I do what I can to avoid that fate. I entrust others with the task of making me whole.

Until recently, it hadn’t occurred to me how unusual such a decision is.

Indeed, many in our society will gladly take a First Level Risk. But they’ll avoid a Second Level one.

Take my late grandfather as an example.

This was a man who enlisted in the United States Navy at age 17, during the waning months of World War II. He could have stayed in high school until the summer of 1945, likely avoiding the risk of ever being drafted into the conflict. But instead, he decided to put his life on the line for his country.

Shipping off to the Navy during a global war was perhaps the most commendable of First Level Risks. But it was a substantial risk, nonetheless.

My grandfather was placing all kinds of trust in his commanding officers to make it through the ordeal. And that faith ultimately paid off.

You would think such unwavering trust would flow into other risky decisions my grandfather faced. But it didn’t.

All too often, my grandfather would try to fix household appliances himself, or leave them in a compromised state. Good enough was sufficient for him — even if neglected or MacGyvered repairs put parts of his house in structural danger. Entrusting trained professionals with a solution was just too risky.

In hindsight, my grandfather’s allergy to Second Level Risks seems comical. But in practice, it’s all too understandable.

For America is built upon the pattern my grandfather espoused. We’re implored to take big risks to seize bigger opportunities. But we’re also indoctrinated on the value of self-sufficiency.

Embracing only Second Level Risks is an affront to all of this. If we play it safe when things are going well, we’ll leave countless opportunities on the table. And if we turn to others when things are broken, we lose autonomy.

As such, many have followed my grandfather’s pattern. They’ve taken chances when it wasn’t strictly necessary. And they’ve avoided taking chances when the situation could have called for it.

While I understand the sentiment, I also find it a bit baffling.

Are we really that comfortable with spinning the wheel on those First Level Risks, when we stand to lose something that’s already working well? And if we are, shouldn’t the Second Level Risks seem doubly enticing?

The answers tend to be Yes and No, respectively. But it’s time we flip them around.

It’s time to listen to reason. It’s time to follow common sense. It’s time to manage our risk tolerance.

We have less to lose with Second Level Risks than we do with First Level ones.

So, let’s stop throwing away a good thing in pursuit of more. And let’s take the calculated risks we need to fix something that’s gone rotten.

This is the sensible way to make decisions. It’s about time we adhered to it.

Into The Fire

On the evening of April 23, 2005, a young man in a suit and tie strode across the stage at a convention hall in New York City.

The man stood next to the commissioner of the National Football League and posed for the cameras. His dream of becoming a pro football player had just become reality.

For many, this might seem like a triumphant moment. But throughout the experience, the man in the suit did not smile.

He had an axe to grind.

The man on the stage that night was named Aaron Rodgers. A standout college quarterback for the California Golden Bears, he had gone into the NFL Draft with high hopes.

Rodgers expected the San Francisco 49ers to call his name with the draft’s first overall pick. He would then move across the San Francisco Bay from his college campus, sign a lucrative contract, and take the reins as the storied franchise’s next quarterback.

But the 49ers chose another quarterback instead. And the teams that followed San Francisco selected players at positions other than quarterback. As the hours passed, Rodgers appeared visibly despondent.

Finally, a team called Rodgers’ name, with the draft’s 24th pick. But it was probably the last one he wanted to hear from.

The Green Bay Packers were everything the San Francisco 49ers weren’t. Based in the NFL’s smallest host city, they played outdoors in the frigid Wisconsin winters. They had won only one championship in the past 35 seasons. And they had a future Hall of Famer — Brett Favre — as their quarterback.

Rodgers would need to bide his time to get his opportunity. And so, he did.

Rodgers played sparingly in 2005, 2006, and 2007. But then, the Packers and Favre parted ways. And suddenly Rodgers was at the helm of Green Bay’s offense.

The Packers had a lackluster season in 2008. But Rodgers showed poise, preparedness, and promise.

He built on that foundation in 2009, leading Green Bay back to the playoffs. Then, in 2010, Rodgers led the Packers to a Super Bowl championship.

Over the subsequent 12 seasons, Aaron Rodgers won four league Most Valuable Player awards. And he led the Packers to the playoffs nine times.

Rodgers might not have had the evening he wanted at the 2005 NFL Draft. But things have turned out well anyway.


Aaron Rodgers’ story is well known, in part because it’s so uncommon.

Franchise quarterbacks just don’t tend to have the journey that Rodgers did. They don’t fall to the 24th pick. They don’t wait as the heir apparent for three full seasons.

Instead, they follow the path of Peyton Manning.

Manning, a college standout for the Tennessee Volunteers, was the first overall pick in the 1998 NFL Draft. Named the starter from Day One, Manning struggled through his debut season with the Indianapolis Colts. But he was downright dominant thereafter.

Manning led the Colts to the playoffs in his second season. The team then returned to the postseason in 10 of the 11 seasons that followed, winning one Super Bowl championship and losing another. Along the way, Manning won 5 MVP awards and established himself as one of football’s premier quarterbacks.

NFL teams have tried to follow the Manning blueprint for years. They’ve chosen talented college quarterbacks at the top of the draft and thrown them into the fire. If these young signal callers don’t make it through the inferno with aplomb, team executives will cut their losses and move on.

This whole process is counterintuitive.

You see, the National Football League is perhaps the least appropriate place for snap evaluations. For any new entrant to its ranks faces a steep learning curve.

The dimensions of NFL fields might be no different than those found at the amateur levels. But the players are faster. The play diagrams are more complex. And the competition for each roster spot is fierce.

A player with top-notch skills and a championship pedigree at the amateur levels can still find himself humbled in the pros. It’s that tough to level up.

The burden is that much tougher for rookie quarterbacks. They must orchestrate an entire offensive attack against the best defenses they’ve ever faced. And if these quarterbacks were high draft picks, they likely took over a struggling team — one without a culture of making key plays. (The teams who lost the most games in the prior season pick first in the draft.)

Add it all up, and it’s ridiculous to expect mastery from the start. Yet increasingly, that’s what teams demand.

Consider the case of Tua Tagovailoa.

The quarterback entered the pros with a sterling resume. He came off the bench to lead the Alabama Crimson Tide to a championship in his first collegiate season, then dominated college football over his next two. Considered a sure thing, Tagovailoa was selected by the Miami Dolphins with the 5th pick of the 2020 NFL Draft.

Tagovailoa started his rookie year on the sidelines, but he quickly found his way into the starting lineup. He proceeded to win 6 of his 9 starts and lead the moribund Dolphins to the brink of the playoffs. He followed that up with another solid campaign — and winning record — in his second year.

Tagovailoa played about as well as could be expected. He mastered the NFL learning curve, winning games consistently. He got a previously putrid Miami offense across the goal line frequently. He didn’t turn the ball over often.

And yet, many pundits have called Tagovailoa a bust. In their eyes, even with all his accomplishments, Tagovailoa hasn’t proved his worth as an NFL franchise quarterback.

This is the nonsense that Aaron Rodgers avoided when he slid to the 24th pick in the draft. He wasn’t saddled with an underperforming team and asked to work instant magic.

Rodgers got to learn the ropes out of the spotlight. And once he finally got his shot, it was with a team poised to succeed.

The fire still burned hot. But Rodgers was ironclad.


I’ve never played a down of professional football.

And yet, I’ve been both Aaron Rodgers and Tua Tagovailoa.

My Tua Tagovailoa turn came first. Two months and a day after my college graduation, I took the helm of an evening newscast in Midland, Texas.

I’d never produced a newscast on a local TV station before. But my resume looked good enough — dotted with some solid internships and time volunteering for my university’s TV channel.

So, I was offered a producer job. And once I accepted, I was thrown into the fire.

The results were solid, but not spectacular. I made a few early mistakes and was generally slow in reacting to breaking news. Even after fixing those early hiccups, I was never able to get my newscasts above third place in the local rankings.

I ultimately left the news business long before it would have left me. But, in hindsight, I was never Peyton Manning material in that industry.

My second career has ultimately proven more successful. But its arc has been Aaron Rodgers-esque.

You see, when I left the news media, I figured I’d land a role in corporate communications. My skills, pedigree, and track record seemingly lined up well for those positions.

But hiring managers didn’t see it that way. And so, I spent three months unemployed — growing more despondent by the day.

Ultimately, I did land a marketing role. But I knew next to nothing about the discipline.

So, I spent several years learning the ropes. I leaned on supervisors and tenured colleagues to check my work and highlight my blind spots.

This process started with that first marketing job. But it continued as I moved to a new role with a different company. It even carried through when I enrolled in business school.

Eventually, I felt confident enough to take command. I became more strategic and innovative. I took on initiatives I once considered too risky. And I racked up a raft of career accomplishments.

That voice of doubt still lives rent-free in my head. But my track record tells a far different story.

I am an accomplished marketer. But I don’t think I’d have become one if I were thrown into the fire and left to burn.


The journey I’ve taken is mine alone. But my story is hardly unique.

Most of us will find the Aaron Rodgers path more fruitful than the Tua Tagovailoa one.

This shouldn’t come as a surprise.

For we rarely enter a new venture as a finished product. There remains much for us to learn. There are still many ways in which we can grow.

Our participation can be viewed as a long-term investment — for employers and for ourselves. It’s something that will inevitably start slow and uncertain. But it’s also something that provides a valuable return over time.

Many professional roles are set up in this way. But many others are not.

So, whether we’re an NFL quarterback or a TV news producer, we find ourselves up against it. We’re expected to show our full value from the moment we walk in the door. And all too often, we disappoint.

It doesn’t have to be this way. Indeed, it shouldn’t be.

It’s abundantly clear that the into the fire method does more harm than good. It inhibits growth. It makes late bloomers irrelevant. And it causes employers to short-circuit non-immediate returns by pulling the plug too early.

No one wins. So, let’s abandon this losing game.

Let’s do away with the snap judgments. Let’s give each other some grace. And let’s see what good a little more runway gives us.

Life’s as much about opportunities as it is about moments. Let’s not set them ablaze.

On Transportation

On a chilly, muggy morning, I stood on the edge of a street in Downtown Dallas.

In my outstretched hand was a paper cup filled with water. To my left were dozens of runners, making their way down Main Street. Above me was a noisy highway viaduct.

I was grateful for the viaduct on this morning. For there was a chance of rain, and its cover would keep me dry.

The runners would also likely be grateful for a brief respite from the elements during their race.

But on most other days, what lay above us was a hot-button topic.

The viaduct, you see, connects two highways. One of them meanders through Dallas’ vast northern suburbs and continues for about 80 miles until it crosses into Oklahoma. The other connects Dallas to Houston, roughly 250 miles to the southeast.

When the structure went up in 1973, it was likely met with little more than a shrug. Development hadn’t reached this part of downtown, and the neighborhood that abutted it — Deep Ellum — was a slum. Stitching the highways together made perfect sense.

But now, plenty of activists want it demolished.

They see the viaduct as a divider, separating a reborn Deep Ellum from Dallas’ Downtown. And they think removing the highway will solve the problem.

Spoiler alert: It won’t.


The discussion over removing an elevated highway from Dallas is a local issue. It could impact city neighborhoods, as well as drivers traveling through town.

The story should begin and end there. But it doesn’t.

You see, this topic has gotten the ear of an activist posse based miles and miles from Dallas, Texas. A posse that seeks to replace urban interstates with parks, boulevards with bikeways, and side streets with pedestrian promenades.

This posse has zeroed in on several American cities as targets.

St. Paul, Minnesota. Kansas City, Missouri. New Orleans, Louisiana. Atlanta, Georgia. And yes, Dallas, Texas.

All these cities are far from this posse’s base. And yet, the posse sees itself as a savior meant to right the wrongs these municipalities endured.

The leaders of this activist posse point to an acknowledged fact. Highways have, in fact, torn apart city neighborhoods. But the proposed “cure” of effectively banishing all motorized transportation in cities is several bridges too far.

Hatching a universal urban future in the image of a Brooklyn hipster enclave is not righteous. It’s not idyllic.

If anything, it’s shortsighted and delusional. It’s opening a Pandora’s Box of unsavory side effects.

Let’s look at why that is.


If you were pressed to choose one word that defines America, what would it be?

Freedom? Democracy? Fireworks?

All are good choices. Yet, I wouldn’t pick any of them.

My one-word definition of America is Movement.

It’s been at our core from the start.

Movement was behind Daniel Boone’s Wilderness Road. Movement was behind Manifest Destiny and the Oregon Trail. Movement was behind the Transcontinental Railroad, the jumbo jet, and — yes — the Interstate Highway network.

Our willingness to uproot ourselves in search of better opportunities, better resources, and a better life is well-known. And the innovations spawned by this commitment transformed America from a fledgling nation into a superpower.

Transportation was part and parcel of this narrative. Indeed, many cities in America’s interior grew and blossomed with the advent of steamships and train tracks.

Cities like St. Paul, Minnesota. Cities like Kansas City, Missouri. Cities like New Orleans, Louisiana. Cities like Atlanta, Georgia. Cities like Dallas, Texas.

The advent of the automobile helped these cities grow ever further. No longer did homes and businesses need to be within a stone’s throw of the port or depot. The footprint could expand exponentially.

The incursion of high-speed highways eventually cut into this growth, of course. It divided some neighborhoods and left visible scars on the city grids.

But I would argue such disruption amounted to a setback, rather than a crisis, in these cities.

After all, these metropolises were forged by transportation. And now, the encroaching ribbons of blacktop provided their residents new opportunity.

Opportunity to get fresh goods from other corners of the country, quickly and efficiently. Opportunity to build a new house on a generous plot of land without sacrificing that steady job downtown. Opportunity to get away to that city, mountain village, or beach town without spending half the day on a crowded, slow-moving train.

You see, transportation is part of the culture in broad swaths of America. But it runs so much deeper than that.

Indeed, so many aspects of cities that the activist posse members loathe turn out to be more feature than bug in the wild. Urban sprawl, supermarkets, parking lots outside malls and sports arenas — these have value for the people using them.

Sure, such constructs create massive hurdles for those without sufficient transportation access in these regions. But those hurdles were, sadly, not caused by the advent of transportation. And as such, its removal will do little to level the playing field.

Why does all this matter? Well, let’s consider what happens when we remove modes of transportation from cities built upon them.

Let’s say we tore out a highway — such as that one in Dallas — and replaced it with nothing. Some of those scars on the cityscape might heal. But they’d be replaced by a fresh nuisance — gridlock traffic.

People are not going to suddenly uproot their lifestyle just because a highway is gone. If they’re used to traveling to — or through — the city center, they’ll keep doing it.

But with less room for all those vehicles, the remaining roadways would quickly clog. And this would be a nightmare for everyone.

Travel times would increase. Emergency services would have trouble getting through. Trucks would face delays ferrying goods to stores.

It would look a lot like that view across the river from the Brooklyn hipster’s neighborhood. An endless parade of headlights and taillights. A cacophony of car horns.

Perhaps this is why some in the activist posse want motorized transportation banned. Shifting cities back to the good old days would seemingly make neighborhoods vibrant, while exiling the ills of transportation culture.

But there were no good old days for cities built on transportation. So, rewriting history will only serve to punish countless residents. It will force substantial sacrifices with only fleeting rewards in return.

It will backfire. Badly.


There’s a 5-mile path in Dallas’ Uptown neighborhood that I’ve moseyed down from time to time.

It’s called the Katy Trail, and it was built on an old rail line. It’s elevated over street level, providing a nice respite from the hustle and bustle of the city below.

The Katy Trail is just one example of an urban trail oasis. The BeltLine in Atlanta, Georgia is another. So is the River Line in Milwaukee, Wisconsin.

I am thankful these trails exist. But I’m also glad the rest of the space in these cities doesn’t look like them.

There is a need for recreational activities in cities. And there is a need for vibrant neighborhoods.

But there is also a need for transportation. A need to get around town, and out of it. A need for people to get essential goods and services in a timely fashion.

Is it worth giving all that up so that some faraway hipster activist can live out their own idyllic urban fantasy? I don’t think so.

So, yes. I was grateful for that highway viaduct in Dallas once. I still am.

But more than that, I’m fearful of what might happen if it were gone.

Inputs and Outputs

I worked two jobs in college.

Chances are, we’ve heard this phrase before.

We might have even lived it.

I can claim that as true. Sort of.

You see, I did work two jobs to help me with such month-to-month expenses as food and gasoline. But not at the same time.

The first job was with my university’s admissions department. But it was from a heady position.

My role was to digitize prospective students’ college application documents. That meant splaying the packets of materials out on my desk, removing the staples, running each page through a scanner, and then stapling the packets back together.

It was boring, tedious work.

I was terrified of getting a paper cut, stapling my fingers, or accidentally mixing up documents from the applicants. And so, I came back to the dorms mentally exhausted each evening — just in time to start on my homework.

I can’t remember if I lasted a few days or a few weeks in that job. But at some point, I quit.

By the time the next school year came around, I had a new job. This time, I was an administrative assistant for a tutoring program for underprivileged youth.

The program took place at the university, so its offices were on-campus. My job was to check program attendance, file papers, gather the mail, and do a host of other menial tasks.

My tenure there lasted three years, severed only by my graduation from the university.

So yes, I worked two jobs in college. But the mileage varied.


What was behind the differing outcomes in my collegiate job history?

After all, both jobs were of similar administrative ilk. They both paid about the same and required the same hours.

Yet, I ran for the hills from one and stuck around for another. Why was that?

I believe the answer comes from three words: Inputs and Outputs.

You see, most jobs involve these. But some apply them more dynamically than others.

In the college admissions support role, the inputs were a set of paper documents. The outputs were the digitized files, plus the paper backups.

My job was to transform those inputs into outputs. But it relied wholeheartedly on both aspects.

If the inputs weren’t there, I had nothing to work on. That would leave me without any outputs — and without pay.

And so, I yearned for that stack of unprocessed papers on my desk to be as tall as possible. All while dreading the repetitive task of going through it.

With the admin assistant job, the inputs varied. There was always something to help with, but it wasn’t always the same thing.

I was able to practice creativity, to a degree. Efficiency wasn’t just about doing one task faster and more accurately. It was about providing as many outputs to my employer as possible.

And even for a fresh-faced college student like me, that was enlightening.


Over the past two centuries, there have been two dominant paradigms for work in the western world.

One is the Assembly Line Model. The other is the Innovator Model.

The Assembly Line Model was made famous by Henry Ford. His factory workers would each focus on one specialized task, repeating it as quickly and accurately as possible. When these tasks were performed in parallel, they’d yield a finished product in record time.

The Innovator Model is almost entirely the opposite. Tasks would vary widely, all in the context of a challenging end goal.

It’s easy to put each model into a bucket. To relegate the Assembly Line Model to manufacturing and the Innovator Model to high-tech software.

But that would be a grave mistake.

Industries and salaries don’t determine which bucket each of our job functions falls into. Only one question does.

Is there a predetermined input?

In the case of my two college jobs, the answer to that question was clear. Only the admissions job had such an input. The other role was far more varied.

But oftentimes, the situation is much murkier. We might have some base inputs. But we’re not solely wedded to them.

In these scenarios, our choices tend to diverge along three paths.

Some of us will stick to the inputs we’re given, sacrificing opportunity for reliability.

Others will shun the inputs, going rogue to make their own way to success.

And still others will split the difference, iterating off inputs in hopes of maximizing outputs.

I have taken this third path in my professional life after graduation.

As a TV news producer, I relied on the stories my assignment editors and reporters uncovered. But I also scrounged for material to round out the newscasts. Material that helped balance the needs to inform, inspire, and entertain my station’s viewer base.

As a marketer, I’ve relied on several things — technology, revenue targets, and product development, to name a few. But I’ve proactively viewed my work from a consumer perspective, identifying gaps and filling them.

Through it all, I’ve strived to be transparent, compassionate, and collaborative. I’ve sought to provide unique value to my employers, but in a manner where my contributions could be replicated by others. I tried to be invaluable, yet not entirely irreplaceable.

It’s a blueprint that’s worked wonders for me. But I needn’t be alone in reaping the benefits.


Business news these days is bleak.

Week after week, tales of stock market downturns, interest rate increases, and stubbornly high costs seem to take center stage. And this has led to a spate of layoffs.

Tech companies are reducing staff at levels not seen in two decades. Other employers are cutting their workforces at rates not seen since the Great Recession.

This has all led to a lot of heartache. Tens of thousands of workers have suddenly found themselves without a livelihood, searching for new roles in an unsteady economy.

It’s a sobering moment, to be sure. But this inflection point also provides a unique opportunity.

We now have the chance to reinvent the way we approach work. To be more than a connector between inputs and outputs. To be scrappy and fill the gaps that existing systems and processes leave behind. To propel our role, our employer, our industry forward.

Such attributes will not guarantee security or success. But they’ll put us in a far better position to get where we want to be.

Yet, even in this moment, many of us are still yearning for reliable inputs. Whether we’re hanging onto our roles or looking to land a new one, we have little appetite for being transformational. It just seems too risky.

I understand the sentiment. But it’s sorely misplaced.

The more we settle for turning the same tired inputs into outputs, the more we make ourselves forgettable. The more we depend on others, without providing unique value in return. The more we put ourselves in jeopardy of becoming redundant.

Hiding in plain sight isn’t the safe play. Not in a game that awards extra points to the bold and the determined.

So, let’s switch tactics. Let’s put our stamp on the work we do.

Let’s take agency. Let’s be transformational. Let’s dare to make vision reality.

Inputs needn’t define our destiny. That responsibility can, should, must fall on us.

It’s time to grab the reins.

Survive and Advance

They were a juggernaut.

The 2014-2015 Kentucky Wildcats men’s basketball team had top-end talent up and down the roster. Led by a legendary coach, the team had elite-level prowess, talent, and competitive drive. And this made them a nightmare to compete against.

The Wildcats could beat you with offensive skill. They could smother you defensively. And they could outlast you with superior depth.

The college basketball season is a grind, and even the best teams end up with a few blemishes along the way. But not Kentucky.

The Wildcats finished off the regular slate with a 31-0 record. Only 7 of those games were decided by fewer than 10 points.

As they entered postseason play, a sense of inevitability reigned.

All Kentucky had to do was win 9 more games. That would make them the first men’s team to go 40-0 in a season.

The Wildcats rolled through their conference tournament and the early rounds of the national tournament. But once they reached the Final Four (the national semifinals), something strange happened.

Kentucky’s opponent — the Wisconsin Badgers — matched the Wildcats blow for blow, before pulling away in the final minute.

The Badgers won by 7 points. And just like that, Kentucky’s season was over.

There would be no national championship. No coronation as the best team ever. Kentucky’s ballyhooed players would watch the title game along with the rest of us.

The Wildcats had played 1,574 minutes of masterful basketball that season. But the 1,575th minute cost them everything.


College basketball is full of peculiarities.

Pro basketball has evolved into a spectacle, with elite players competing in modern arenas blaring hip-hop beats.

But college ball remains rugged and antiquated. Games take place in old-school fieldhouses, with cheerleaders and pep bands providing the soundtrack. Jump ball confrontations are replaced by an alternating possession arrow. And, in certain circumstances, players must make one free throw to get a chance at a second. (The dreaded 1 and 1.)

These oddities are widely forgiven, though. For the college basketball season ends with perhaps the most iconic tournament in sports.

The NCAA Tournament — widely known as March Madness — pits the top 68 teams in the country against each other. Teams face off in single-elimination games, with the winners moving on and the losers going home. This continues until there is one team left standing.

In theory, March Madness is not all that different than other postseason tournaments. Both the college and professional versions of American football have a single-elimination tournament at the end of their seasons. The knockout stage of soccer’s World Cup uses the same format.

But none of these tournaments have the size or scope of the NCAA Tournament. And none are as inherently cruel to elite teams as March Madness.

You see, to win it all, college basketball teams must win 6 games in a row. Those 6 wins must come against other great teams, under the brightest of lights.

This requires a mindset shift. It requires teams to embrace three simple words.

Survive and advance.

Indeed, it’s the scrappiest and most desperate teams that have the edge in March. This has led to all manner of surprises over the years — with “Cinderella” teams knocking out more highly regarded opponents.

Kentucky was able to avoid such an upset in the early rounds of the 2015 tournament. But the sand ran out in the Final Four.

Wisconsin proved to be scrappier than the Wildcats with the game on the line.

The Badgers survived. They advanced.


I often think about the 2014-2015 Kentucky Wildcats. The team that had it all yet walked away with nothing.

It’s tough to know what to make of them.

Generations of evidence show that The Two T’s — talent and teamwork — provide a winning combination. Darwin’s theory of evolution states that the fittest species survives, adapting to adversity more deftly than its foes.

Yet, the loss to Wisconsin defies both trends. The Badgers were no slouch that season, but they weren’t at Kentucky’s level. If both teams were firing on all cylinders, Wisconsin would seemingly be toast.

But they weren’t. The Badgers took the Wildcats’ best shot and prevailed.

In the wake of this outcome, what should we do?

Should we cast off Darwin and The Two T’s, declaring them false prophets? Absolutely not. That would be as foolish as denying the existence of gravity because a party balloon floated toward the ceiling.

Should we shrug our shoulders and chalk this all up to an anomaly? Perhaps. But it doesn’t help us make heads or tails of what happened.

No, the best course of action is to consider what the Kentucky Wildcats could have done better. And then to avoid those same pitfalls in our own life.

The answer to that is clear.

For whatever reason, the Kentucky Wildcats failed to take stock. They failed to consider what they had, and what would be needed to protect it.

This led them to get outscrapped at the worst possible time.

We must not follow suit.


As I write this, another college basketball season is in full swing.

Some teams have risen to the top. Others have stumbled but have some time to right themselves.

Indeed, March Madness is months away for college basketball. But for the rest of us, Selection Sunday is upon us.

We’re heading into a new year rife with uncertainty. Persistent inflation and accelerating layoffs are all over the headlines. The long tail of a pandemic and societal divisiveness each linger beneath the surface.

For quite a while now, we’ve relied on our attributes to thrive. The parallel rise of the tech and venture funding industries has provided ample growth opportunities. When it came to our lives, our careers, and our financial futures, we had leverage.

But now, the tables are turning.

Those around us are battening down the hatches. Growth is turning to maintenance. Excess opportunities are drying up.

In the wake of all this, we need to do what the Kentucky Wildcats didn’t. We need to adapt.

Instead of deciding which options best maximize our talents, we should consider how we can hang on to what we have.

We must be scrappy. We must be gritty.

We must survive and advance.

I’m ready to rise to the moment. Are you?

The Time Shift Fallacy

As I entered the arena, I was in for a surprise.

I knew that I was there for a pro hockey game. And I knew that my favorite team would be wearing modified throwback jerseys.

But what I didn’t know was that nearly the entire game experience would be retrofitted.

The sound system blared 1990s music. The scoreboard showed TV commercials for such bygone brands as Kay Bee Toys and Circuit City. The Zamboni drivers wore Zubaz pants.

For a moment, I was transfixed. My mind had traveled back to the days when Wayne Gretzky and Mario Lemieux were on the ice. My body seemed to follow suit.

But then, reality snapped me back.

That star player who scored a hat trick (three goals) that night, leading to a cascade of hats from the stands? He was a baby in the late 1990s.

Those high-powered smartphones we were using to check the game stats? They were years from being invented back in that decade.

And the arena I was sitting in? Well, the team didn’t even start playing there until the early 2000s.

Yes, I was in an alternate reality. One that capitalized on nostalgia without sacrificing the comforts of modernity.

For a night, it worked. But when the clock struck 12, the experience turned into a pumpkin.

And an uncomfortable reality lingered.


Retro night at the hockey game isn’t the only time we’ve thrown it back.

Indeed, remnants of the past are all over our present.

Fashion from the 1990s has been back in style recently. And several cultural figures from that era have had a renaissance.

This should come as no surprise. Generational revitalizations are like clockwork in our society.

Styles from the 1980s re-emerged in the 2010s. And figures from the 1970s found new life in the 2000s.

Still, this is the first time I’ve experienced both the original and the remix. And the nostalgia has brought both glee and alarm.

At first glance, there’s not much to airbrush from the 1990s. The Cold War had ended. The American economy was humming. Aside from the O.J. Simpson trial and the Monica Lewinsky affair, there was not much to wring our hands about.

But dig a bit deeper, and the story is less tidy.

You see, the 1990s introduced the world to a film called Forrest Gump. The movie follows the title character on an accidental journey through many key moments in 20th century America.

In one such scene, Gump is trying to go to class at the University of Alabama when he finds a crowd gathered outside a building on campus. It turns out the commotion is over the racial integration of the university. Several Black students are heading to class, protected by the National Guard. And the crowd, while calm, is hostile to their cause.

During the commotion — including grandstanding by the segregationist governor George Wallace — Gump can be seen on his tiptoes, staring in on what’s going on. He later picks up a book that one of the students inadvertently dropped and hands it back to her.

In the moment, the scene seemed quaint. A relic from a moment in American history.

But recently, real-life imagery of another pivotal moment has seen some new light. The moment was the integration of North Little Rock High School in Arkansas. The era was the 1950s. And the peering onlooker was Jerry Jones.

Jones was an awkward teenager back then. But today, he’s the billionaire owner of the Dallas Cowboys — one of the world’s most famous sports teams. That makes him plenty visible.

As such, the response has not been kind. Instead of viewing the image as quaint, many have directed ire at Jones. Why was he there? And why didn’t he do more to help the bullied Black students?

The answers matter. But the questions are even more significant.


History is written by the victors.

So goes an adage that’s attributed — often controversially — to Winston Churchill.

For decades, we took such commentary at face value. But these days, we’re adding a new twist.

You see, there are now two dominant positions when it comes to historical artifacts. There are those who seek to amplify the flaws of those who came before us. And there are those who seek to wipe those blemishes away.

Thanks to this, turning points in our history — such as desegregation — are no longer taken at face value. They’ve become flashpoints.

Never mind the foolishness of viewing 20th century actions with a 21st century lens. The outcome is set in stone.

Those in the photos, recordings, and writings of yesteryear are sure to be canceled one way or another. They are certain to be construed as villains or heroes, even if they went through those eras as bystanders.

This principle is evident when it comes to Jerry Jones and that photo from Little Rock. But what about that scene from Forrest Gump?

If the movie were being made today, would that plot point have been altered? Might it have been cut?

The answer would most likely be Yes.

Indeed, plenty of comedy routines from the 1990s are now considered “over the line.” A prominent 1980s song spoke of asking a doctor for a woman’s gynecological photos. A classic 1970s movie featured an Italian American saying the N-word.

None of that would fly today.

This is the reason the cultural staples of the present are so carefully varnished. And it’s the reason why we curate our trips down memory lane, through such experiences as retro night at a hockey game.

It seems sensible. It seems safe.

But it’s not working.


Back at the arena, I took in the sights and sounds of retro night with wonder.

But down the row from me, a young girl was perplexed.

The girl didn’t understand all the 1990s references. And her mother was struggling to describe them to her.

I couldn’t blame either of them.

The girl was born years after 90s mania had subsided. Like a Soviet defector encountering McDonald’s for the first time, she had no ability to generate the warm fuzzies others did.

And her mother experienced that mania in real time. She was processing the Disney World version of the 1990s at the same time she was trying to explain it. That proved too tall a task to master.

This one example explains the Time Shift Fallacy.

All our varnishing, cleansing, and massaging of the past can’t substitute for the real thing. Those of us who lived through it know better than to be bamboozled. And those who didn’t are in no position to understand, appreciate, or judge.

It’s fair to question the faults of the past using the glare of a modern lens. Such enlightenment is necessary. And efforts to avoid such inquiries are corrosively reckless.

Yet, it’s not fair to categorically dismiss all those who committed such faults. Dictators and madmen deserve our scorn for their atrocities, to be sure. But teenage onlookers captured in photos from yesteryear might not.

We might find movies reprehensible for racist dialogue. We might find songs offensive for sexist content. And indeed, we might think twice before sharing these bygone staples in contemporary settings.

But it must end there.

We mustn’t have the gall to think we can time shift, even for a moment. We mustn’t have the hubris to think we can sanitize the past. And we mustn’t categorically mistake the sins of ignorance for malice.

Yesterday is gone. The window for changing it has closed.

Let’s make today great instead.

The Extension Trap

The images were horrifying.

In the heart of Chicago, railroad tracks were on fire.

This seemed to be disastrous for America’s third-largest city. Track fires would jam up rail traffic, disrupting commuters and putting a halt to freight deliveries. And the flames could easily threaten nearby structures — a possibility that had literally burned Chicago before.

But appearances can be deceiving.

Indeed, the flames were no accident. Maintenance crews had intentionally set the tracks ablaze to preserve them.

An arctic blast had hit Illinois, sending temperatures well below 0. And in those conditions, exposed metal can shrink.

Narrower tracks cannot properly hold train wheels. They make derailments likely.

Setting the tracks on fire caused the metal to expand, canceling out the damage from the biting cold. The trains kept running, and life kept churning.

Those blazing railroad tracks kept everything in equilibrium.


Several years later, another picture of fiery metal made the rounds.

This time, a metal dumpster was on fire. And the image of it was all over the Internet.

Now, an inferno of a trash receptacle doesn’t mean much on its own. Burning trash is still trash.

But what those bins represented? That certainly struck a chord.

The dumpster fire images were referencing WeWork, a once ballyhooed company that had hit a rough patch.

WeWork had started as an office co-working company — one of the first of its kind. It was a darling of the start-up world and a tempting target for venture funding.

The ingredients for success were there. And the company began to scale.

But once WeWork announced plans to go public, the wheels fell off.

Investors started digging into WeWork’s finances, and they didn’t like what they saw.

The company appeared to be spending far more money than it brought in, and there seemed to be no end in sight for this pattern.

WeWork’s CEO and co-founder dismissed these concerns, stating that the company was doing far more than running a business. It was sparking a movement — a physical social network that replaced Me with We.

To this end, WeWork had already created a co-living brand called WeLive and an education concept called WeGrow. There were plans for banks, shipping, and airlines as well.

Venture investors had long looked beyond these red flags of excess. But public investors were less easily mesmerized. They wanted a return on their investment, and they saw right through the house of cards.

The fallout was brutal. WeWork saw its valuation plummet, canceled its Initial Public Offering, and laid off thousands of its workers. WeLive and WeGrow were put on ice. And the CEO was forced to resign.

There are plenty of reasons for WeWork’s collapse. Case studies and TV dramas will likely cover them for years to come. But I’d like to focus on just one.

WeWork’s failed, in part, because the burgeoning company fell into The Extension Trap.

WeWork expanded too fast, without a plan for sustaining such growth. Worse still, it pitched itself as a lifestyle movement before ensuring its core business was viable.

There was only one way out of this trap. WeWork was forced to shrink like those Chicago rail tracks, simply to get to where it should have been all along.

The company does still exist today, and it’s now publicly traded. But that damage from its foray into The Extension Trap? It’s likely to linger for years.


The WeWork dumpster fire and the Chicago track fire have each been on my mind recently.

For as I write this, winter is setting in. And as the temperatures plummet, the world around us gets visibly smaller.

Indeed, signs of withering are everywhere. The economy is teetering, with high interest rates and higher inflation spooking investors. And several companies have started to lay off many of their workers.

As the cold, hard reality of these cuts sinks in, the rationale remains consistent. We expanded too fast, and now the winds have changed.

On its face, such an explanation makes sense. This is the way modern markets work; investors and businesses are simply operating within those parameters.

But, come on.

Is this really the way we want to live? Are these really the values we want to espouse?

I would say not.

When it comes to eating, a cycle of binging and purging is labeled a disorder. It’s a problem — one not to be practiced or written off as trivial.

So why do we give a free pass for this behavior more broadly? Why do we keep taking the bait when we clearly know better?

It’s maddening. But it doesn’t have to be inevitable.


The start of winter, with its shorter days and location at the tail end of the calendar, can seem like the lean times.

Paradoxically, it’s also the season of excess.

This is the time of the year where we overextend ourselves. Where we fill our calendars with gatherings. Where we indulge ourselves with sweets. Where we empty the coffers while shopping for gifts.

For several weeks, we lure ourselves into The Extension Trap, in the name of holiday spirit.

Of course, we can’t sustain this behavior. So once the holiday lights dim and the ornaments go back into storage, we adjust back to our regular patterns. And we do our best to ignore the pain this readjustment causes us.

It doesn’t have to be this way.

We can resolve to stop this madness. To say No more often. To choose not to overextend ourselves.

It’s a singular action, a drop in the bucket in the grand scheme of things. But as more and more of us head in that direction, that ripple can become a wave. And perhaps, these expectations of overextension will go away.

And it doesn’t have to stop there.

Investors are people. So are members of the C-Suite. They too have lives outside of the office. They too have families and social circles.

If our movement crosses the tipping point, it can influence their decisions. And it can shift the contours in which we operate.

That would truly be a paradigm shift. But it can’t happen unless we make the first move.

So, let’s be bold. Let’s be brave. Let’s be smart.

Let’s practice moderation and steer clear of The Extension Trap.

It’s our best path forward.

Playing it Back

As I picked up the cup, I felt it slide.

My grip seemed strong, and my focus was top-notch. Yet, gravity was inclined to foil my efforts.

My reflexes took over, clutching the cup tighter. My hands trembled momentarily, but I was able to steady myself.

Crisis averted, I thought. Or maybe not.

I looked down at my custom football jersey, now splashed with beer. When my hands had trembled, some of the liquid had left the cup — and ended up on one of my most expensive pieces of clothing.

It was the cruelest of ironies. I don’t drink; I was bringing the beer to my mother, sitting at a table nearby. And yet, I’d paid the price for chivalry.

Back at the table, with the beer now handed off, my mind began racing. I was counting the seconds until I could get home and carefully place the jersey in the wash. And I was reliving my quasi-disaster, playing it back over and over to see where things went wrong.

I was stuck on a road to nowhere.


If I could turn back time.

This is more than a famous Cher song. It’s a common lament. A wish with no chance of being granted.

For time moves in but one direction — forward. Attempting to re-litigate the past is foolhardy.

And yet, we continue to try.

There’s a reason why time travel movies are so popular. There’s a reason fashion trends cycle every few decades. There’s a reason why songs about regret — including that Cher tune — persist.

We are obsessed with playing it back. We are consumed by the thought of one tweak yielding a different outcome.

We’d rather not look at the spilled beer on our cherished jersey. We’d rather not sweep up the shattered glass from the kitchen floor. We’d rather not face the conundrum we find ourselves in.

Far better to picture an entirely different reality.

Even if conjuring such illusions amounts to little more than wasted energy.


I sat in the classroom, staring at the whiteboard.

My business school professor was introducing the concept of decision trees, and I was mesmerized.

Not by the myriad probabilities and the complicated math. All of that was over my head.

No, the concept itself had me enthralled.

You see, I had long dreamed of seeing all the possibilities in front of me and choosing the optimal one. For I had obsessed over the moments that caused bad outcomes, imagining how they could have gone better.

I tended to do this more with the little things in my life than the big ones. I rarely played back my decision to move to a new state or to jump to a new vocation.

But that trek down a muddy path that got my shoes dirty? That money I wasted because I forgot to use a discount code? I’d chew on those missteps for months.

Now, I had a visual aid for this fixation. I could draw the branches and vividly explore the alternatives.

I could make the imperfect art of playing it back a bit smoother.

And so, my games of what if intensified. What was once an arcane exercise turned into a data-driven endeavor. One whose futility was masked by ferocity.

Nothing could deter me from this sorry crusade. At least not until the day I spilled some beer on my cherished football jersey.

For my mother caught me in this sad spiral. And she would have none of it.

Stop reliving it, she scolded me. We’ll get the jersey clean and move on.

It wasn’t exactly earth-shattering advice. But it changed my approach entirely.

For my mother’s words exposed an underlying truth. This obsession with playing it back, with decision trees, with alternatives — it wasn’t about hiding in the past for me. No, I kept going to the tape as a means of control.

If I could find the root cause of bad outcomes, I could avoid them in the future. At least that was the thought.

But things happen, regardless of my attempts to avoid them. It would be far better for me to focus on my response than to keep digging for the root cause.

With that ethos in tow, I find myself playing it back less often.


In September 2008, the Miami Dolphins and the New England Patriots met for a football game in Massachusetts.

The game was billed as a massive mismatch. New England had won 21 straight games in the regular season, had dominated the division both teams played in, and had played in the most recent Super Bowl. While the Patriots were missing their injured star quarterback, they still had Bill Belichick — the best head coach in the National Football League.

In the days leading up to the game, Belichick prepared meticulously. He watched hours of game film, noting the Dolphins’ patterns and tendencies. And he formed a game plan to exploit those tendencies.

But once the game started, it was Belichick who was exploited.

The Dolphins rolled out a new offensive formation. The running back would line up where the quarterback normally did, taking the snap directly. He would then rush to the outside behind a convoy of blockers. Or he might zip it to a nearby wide receiver if the defense left that receiver open.

Miami hadn’t used this formation — the Wildcat — in any of its prior games. Belichick hadn’t prepared for it, and neither had the New England defense.

The Dolphins ran roughshod over the Patriots, earning the victory on the way to a division title. New England ended up missing the playoffs.

This game showed how playing it back has its limits.

Video footage has revolutionized football, taking coaching, scouting, and player safety to the next level. But it can’t tell all.

There’s always a surprise looming that the tape can’t find. A Wildcat formation, if you will.

How teams react to that sudden adversity makes all the difference. The players, coaches and staff who can steady themselves through the fog tend to be the ones who claim victory. Those attached to the past find themselves weighed down by it.

The same dichotomy awaits us. Memory is a potent tool. But it’s not all-powerful.

The past isn’t always prologue. And dwelling on what’s written can lower the horizons of what we’ve yet to write.

So, let’s move away from playing it back. Let’s get off the what if carousel. Let’s swap out the rehash for the response.

We’ll be better for it.

Against The Grain

Just say no.

If you turned on your television back in the 1980s, you likely heard those three words.

They came from First Lady Nancy Reagan. And they were part of the War on Drugs campaign.

The United States was in plenty of shadow conflicts at the time. The Cold War was ever present. The War on Poverty appeared to be a lost cause. The War on Inflation had yielded a brutal recession.

But the War on Drugs was getting plenty of outsize attention. Because the future of our kids was at stake.

Now, the future of our kids had been at stake plenty of times before. Teenagers tend to be rebellious, after all. And those signs of rebellion — rock and roll music, dancing, rollerblading — have traditionally come under fire from buttoned-up older generations.

But this was different. This time, the offender was a public health hazard. One that we’d turned a blind eye to for far too long.

So, our nation took dead aim. Arrests for possession accelerated. Sentence lengths for dealing skyrocketed. And the crisis abated.

Or at least that was what we told ourselves.

For we were already onto the next frontier — Big Tobacco. Over the course of the 1990s, the sight of teenagers smoking went from normal to noteworthy.

Advertising for cigarettes declined — per government decree — and buying a pack became much more tedious. As a result, fewer young people gave it a try.

This seemed like a massive success. But there was no time to celebrate. For once again, it was on to the next challenge.

The new enemy arose around the time I reached my teenage years. This one wasn’t a pill, a powder, or a cigarette. It was online poker — a game my peers were flocking to, despite not having the money to back their bets.

Legislators had long dealt with this problem by restricting access to gambling venues, through licensing and age minimums. But the Internet opened a gateway for teens to walk through. And walk through, they did.

So, the authorities cracked down. They started going after the owners of poker websites, while putting out Public Service Announcements about the dangers of gambling.

It didn’t work out as intended.

For it turned out that the online poker fiasco was just the tip of the iceberg. Technology was opening a Pandora’s Box of issues for adolescents — including new ways to access drugs and inhale nicotine.

Fending off those myriad issues turned into a giant game of whack-a-mole. Those leading the charge were a step behind.

Just say no wasn’t quite as straightforward as it seemed.


Why did Nancy Reagan’s initiative go so awry?

Was it the messaging? The tactics? The inability to anticipate the whims of youth?

All these issues likely played a role. But I believe the biggest fault lies at the root.

Just say no trivialized the concept of abstinence. It made quitting seem as easy as flipping a light switch — a simple task with instant results.

But it’s never quite that simple.

It turns out that abstinence campaigns are asking a lot of us. They’re demanding that we break with habit and go against the grain. All while ignoring the related challenges that are sure to arise along the way.

And those challenges are doubly prominent with adolescents. After all, teenagers are naturally primed to go against the grain. That’s the impetus behind the rule bending and troublemaking that gives older generations such distress.

Asking teenagers to rebel against their rebelliousness on a dime can be straight up delusional. Yet, this is precisely what we tried with Just say no.

No wonder it flopped.


How can I help?

These four words were meant to be my compass.

So said the internship coordinator at CBS News on my first day there.

I was meant to be continually useful, searching for projects to assist with whenever I had a free moment. Saying no was not an option.

I was barely beyond my own adolescence at this point. Fresh off rebellious years that proved to be anything but, I was keen to answer the call.

So, I set up green screen backdrops. I reordered archive tapes. I watched arcane news clips until I knew them by heart.

It wasn’t a glamorous role, but it fulfilled the mission. It proved I was helpful, useful, and perhaps worthy of a future job opportunity.

Still, I finished those eight weeks unsettled. For it seemed to me that finding a footing in TV news — or any other industry — meant never saying no to anything.

It didn’t matter if the pay was too low, the risk was too great, or life was getting in the way. Declining an opportunity might slam the door on your career before it could even get established.

This mentality is now pervasive in our society. Openness and flexibility are cornerstones of our culture.

That’s often a good thing. But not always.

You see, agreeableness requires sacrifice. We put aside our own needs to cater to the demands of others.

The benefits of this trade — acceptance, opportunity, prosperity — make it palatable. But we can only truly flourish if we look out for ourselves as voraciously as we do for others. And sometimes that means going against the grain.

It means just saying no.


Several years back, I got an invite to a fancy gala.

It had all the fixings. Black tie. Hors d’oeuvres. And a guest list that featured several friends.

I had every reason to go. I would get to dress up and live it up with people I cared about.

There was only one problem: I didn’t want to go. At all.

So, I went against the grain. I declined the invite, without offering an excuse. And I didn’t regret it.

That gala was the first time in a while that I remember actively saying no to something. But it wouldn’t be the last.

Indeed, I’ve declined all manner of invites and requests in subsequent years. I’m selective when I do this — I don’t want to jeopardize my career or my friendships. But the days of me being an automatic Yes have long passed.

And I have flourished as a result.

Perhaps this is the Just say no that we can get behind. One where our own compass guides the way, rather than one foisted upon us by others.

This method won’t be perfect. But it holds the promise of being better than the status quo.

Going against the grain is never easy. But sometimes it’s needed.

When it is, let’s do it right.