The Engineer’s Paradox

In late October of 2012, something strange happened.

A hurricane that was churning in open waters took a hard left and barreled toward the Northeast coast of America.

By the time the storm made landfall on the Jersey Shore, it was considered post-tropical. But that moniker did nothing to dull Superstorm Sandy’s wrath.

Trees toppled in the wind. Low-lying areas flooded. And millions of people lost power.

The devastation was especially pronounced in the New York City area. When I visited months later, there were still debris piles in several places around the city. And if you looked at walls and embankments, you could see where the water had risen to during the storm.

Still, the recovery efforts were mostly complete. With one exception.

New York City’s subway system was snarled by the rushing water. And even after the tunnels had dried out, problems persisted.

Signaling systems faltered, tunnel walls were left unstable, and delays piled up.

The Metropolitan Transportation Authority would ultimately spend the better part of a decade refurbishing the worst-hit areas of the system. This process led to more delays, more grumbling, and potshots from the peanut gallery.

Pundits would point to the age of the tunnels, budget cuts from years gone by, and inadequate maintenance when they lampooned the New York City Subway in those days. They likened it to a dinosaur on life support – a relic from the past that wasn’t worthy of such a costly resuscitation.

It was a convenient argument. But a shortsighted one.


When I was young, I was a hardcore transit nerd.

And I was particularly obsessed with the New York City Subway.

Growing up in the area gave me ample opportunity to ride the city’s 250 miles of subway lines. And I was amazed by the entire operation.

Tunnels wound through the city in a complex maze, with a variety of station designs. Several long, elaborate transfer passages connected the platforms of different subway lines as well.

The New York City Subway seemed like its own subterranean ecosystem. Riders were protected from the rain and the wind, from the summer heat and the winter chill.

Other cities had their mass transit systems too. And each of them had their own unique flair.

Boston boasted underground trolley cars on one line. Washington had long escalators that descended from the street into bunker-like stations. Chicago had miles of elevated train lines punctuated by quaint wooden platforms.

But none of these features were as elaborate or imaginative as the New York City Subway system.

Much of this was by design. Three separate companies laid most of the Big Apple’s tracks back in the early 1900s. The companies had to navigate densely populated areas of town while building around each other’s tunnels and stations. It was quite the operation.

I didn’t think much about these details as a kid. I was just happy to ride the trains and stare out the windows.

Still, as an adult, it’s hard not to marvel at what New York created – and when.

You see, these subway tunnels were built in an era before smart machinery, computers, or Artificial Intelligence. Renderings and surveys were done by hand. Most construction required a human touch.

And yet, despite all those restrictions, the system remained structurally sound for decades. It took a black swan weather event for any cracks to show.

That is a testament to the power of engineering. To following the exacting principles of measurement, mathematics, and physics to a T. To committing to the creation of something that lasts.

That rigidity offers us a sense of security – even beyond the subway turnstiles. Adherence to those principles provides the peace of mind that the structures around us won’t fall apart and put our lives at risk.

It’s a powerful benefit that we all enjoy. But at what cost?


Many of those childhood subway rides led me to the Museum of Modern Art in the heart of New York City.

My grandparents served as volunteers in the museum back then. They answered visitor questions on the weekends. And during breaks in their shifts, they would show me around the galleries.

I was often perplexed by how different the artwork looked. Piet Mondrian’s work was full of rigid blocks and lines. Andy Warhol’s artistry tended to depict Campbell’s soup cans. And Jackson Pollock’s creations were mostly paint splatter.

My grandparents explained that each artist saw the world differently, and they put those perspectives on canvas. Art was a form of expression for them, free of rules and inhibitions.

This concept terrified me. Surely, there had to be a right answer, or some guidelines for the artists to follow. How else would they know if their work was viable, let alone successful?

I was too young to recognize it, but I was thinking like an engineer.

The irony of all this isn’t lost on me. For not only was I a poor prospect for engineering back then – as proved by my horrendous math and science grades – but I was also sharing this opinion half a block from the 5th Avenue/53rd Street station.

This was a split-level station located deep under the street. Once you descended the long escalator, you’d find westbound trains on the upper-level platform and eastbound trains on the platform beneath them.

At first glance, this design made sense. 53rd Street was rather narrow, and it was flanked by tall buildings. Stacking the tunnels deep underground seemed like a necessary engineering decision.

Yet, at the next station to the east – at 53rd Street and Lexington Avenue – both tracks were on the same level, with a central platform between them. The street was just as narrow, and the buildings flanking it were just as tall. But the engineers had gone with a different design.

This all makes me wonder if I’d given the engineers who designed the New York City Subway enough credit. Sure, they’d passed the most critical test – creating something precise enough to withstand the test of time. But they’d also mixed in just enough reasoned creativity to make the system interesting – especially to young transit nerds like me.

Perhaps the choice wasn’t between regulations and rebellion. Perhaps there was room for a shade of gray.

And perhaps that silver lining was a necessity.


I am the son of teachers.

My parents are now retired. But they spent the bulk of their working lives in the classroom, teaching an entire generation of children.

I was in grade school myself for the first half of this endeavor. And I was planting a flag for my own career in the second half. So, I rarely talked shop with my parents.

But little morsels of information still made their way to my ears. And one of those morsels – from my father – has stuck with me.

No two students learn the same way, he recounted. It’s best to be adaptable, and to tailor your lesson to each student, so that the entire class can take the information in.

I think about this often, particularly in the case of engineering. After all, that discipline seems to fly in the face of adaptability.

How have so many generations of engineers earned their stripes without buckling under the weight of rigidity? It’s a question that seems to defy response.

Or does it?

Yes, the clues to unlock the Engineer’s Paradox lie deep beneath the streets of New York City, where only Superstorm Sandy’s floodwaters can reach.

Down there, it was possible to build a mass transit marvel – a century-old masterpiece that stood the test of time – while maintaining just enough creativity to keep things interesting. Down there, the surety of tradition could mesh with the promise of advancement in the best possible way.

Engineering is precise. Humanity is erratic. But against all odds, that pairing manages to sustain itself year after year. And we all get to reap the benefits.

I’m grateful for that.

The Functional Innovator

On a hot day in St. Louis, Missouri, a concessionaire had a problem.

The vendor was selling ice cream at the World’s Fair of 1904. But with the temperature rising and customers lining up in abundance, he was running out of cups to hold the frozen treat.

Enter Ernest Hamwi.

Hamwi was serving up crisp, waffle-like pastries the next stand over from the ice cream vendor. And he sprang into action to help his neighbor.

Hamwi rolled some of his pastries into cones and delivered them to the ice cream stand. The vendor then filled those cones with scoops of ice cream, giving patrons an edible cup for their delicacies.

The ice cream cone had an audience.


Few people know the name Ernest Hamwi. Or the name of the pastries he turned into ice cream cones. (They were called Zalabis, for anyone curious.)

But plenty of people are familiar with the broad strokes of Hamwi’s creation. They’ve heard the story of the hot day at the World’s Fair of 1904, of the cup shortage, and of the waffle cone substitute.

What they likely don’t know is that ice cream cones actually existed before Hamwi repurposed his Zalabis. Indeed, an Italian immigrant in New York City came up with a design for them in the 1890s, and he patented that design before the World’s Fair even came to Missouri.

Hamwi was the second mover in the ice cream cone space. And yet, his story stuck.

Why is that? Well, it comes down to two factors.

First, Hamwi’s quick thinking took place within a hub of innovation — the World’s Fair. Both wide-eyed patrons and inquisitive journalists were all around him.

Second, Hamwi deftly created a company to capitalize on his innovation. And other enterprising businessmen across St. Louis followed suit.

Ice cream cones became as synonymous with the city as toasted ravioli. Travelers marveled at the creation. And it started to spread through the Midwest, and then the rest of the nation.

This mixture of widespread attention and a corresponding business plan sparked the ice cream cone’s meteoric rise.

Ernest Hamwi might not have held the patent, but he had the golden ticket.


The origin story of the ice cream cone is more than a quaint tale.

It’s a powerful reminder of the errors we make when considering innovation.

You see, we tend to become enamored with the OG – with the original inventor of an item or system.

We consider that figure to be part visionary, part Santa Claus. We see them as the bestower of a great gift we can partake of.

But oftentimes, it’s not the original inventor who effectively transforms our lives. That honor belongs to the functional innovator.

The functional innovator is the one who brings the invention to the wider market. The one who incorporates the new contraption into existing systems, who expertly drums up demand, and who sets a tolerable price point.

Ernest Hamwi was the functional innovator of the ice cream cone. He might not have been the first to create the delicacy, but he was the first to make it broadly viable.

Similarly, Henry Ford was not the one to invent the automobile. But he was the one who turned it from a plaything for the rich to something available to all.

And Steve Jobs did not invent the personal computer or the mobile phone. But he ultimately made both ubiquitous in American society.

The functional innovator is part visionary, part connector. They can see where an emergent innovation can fit in society, and they can deftly execute a plan to make it so.

Yet, if they don’t have the force of personality of a Steve Jobs, or the business buzz of a Henry Ford, they can wind up all but forgotten.

And that omission has consequences.


As I write this, society is in the early stages of a tectonic shift.

In just a few years, Artificial Intelligence – or AI – has gone from the unknown to the ubiquitous. It’s in our jobs, in our schools, and in our lives.

Signs of this transformation are all around.

A computer chipmaker – Nvidia – has become the world’s most valuable company, thanks to its prowess at powering AI. Corporate giants are racing to partner with AI-specific companies – such as OpenAI – or to create their own models. And seasoned professionals are striving to get up to speed with the new technology, before they’re made redundant by it.

But for all the excitement about AI, something is missing.

A functional plan.

You see, despite all the talk of what AI could possibly do to enrich our lives, there’s little attention focused on what it currently does.

That’s left for us to determine. And both companies and individuals have been stumbling in the dark to try and find the answer.

This is a strange place to be in. It’s as if our garages are full of Model T’s, but the roads are filled with horses and buggies. And it’s been this way for a few years.

What this movement needs is a functional innovator. Someone who not only understands the nuances of AI but also has a concrete plan for its tidy placement within our culture.

The pioneers of the technology can’t fill this gap. They have the genius of technological creation, but not the business skills to compellingly bring their creation to the masses.

The AI companies can’t fill this gap either. They’re – understandably – too versed in the Silicon Valley valuation game to take on this critical task.

And the businesses and individuals experimenting with AI likely can’t fill this gap either. If they could have, we’d already be beyond this morass.

No, the functional innovator for AI-powered technology is still out there somewhere. Their moment has yet to arrive, and the world has yet to change because of it.

That’s why the sizzle is still greater than the steak when it comes to AI. That’s why the potential remains so great, yet the risk remains so profound to all of us.

I hope this changes someday. Hopefully someday soon.

But when it does, I hope this pioneer is not forgotten.

Functional innovators matter. Let’s not pass up another opportunity to recognize them.

When Words Kill

In August of 1940, Leon Trotsky was sitting in his study in Mexico City.

It was a peaceful moment. A quiet moment. One that would soon be brutally disrupted.

For there was another man in the study with Trotsky – Ramon Mercader. And as Trotsky started reading an article, Mercader hit him in the head with a mountaineering axe.

The blow proved fatal to Trotsky. And it caused outrage far beyond the Mexican capital.

For Trotsky was no average citizen. He was a prominent writer and thinker, who also happened to be living in exile.

Yes, Trotsky – who helped form the U.S.S.R. – had fled the country when he got on the wrong side of Joseph Stalin. And that separation seemingly negated the threat Trotsky posed to the Stalin regime.

Indeed, all Trotsky had left were his words. But those words still got him brutally murdered half a world away — at the behest of the regime.

Words, it seemed, could kill.


Congress shall make no law respecting the establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or of the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

So reads the First Amendment of the United States Constitution.

There are only six words in that clause dedicated to the right to speak in America. But one of those six words is freedom.

For years, we’ve pointed to that right. We’ve considered it to be tougher than Teflon. Something that differentiates the United States from other nations.

The power of this right was made evident by looking across our southern border. Leon Trotsky – an exile from a nation 7,000 miles away from Mexico – still found himself in mortal danger there for something he’d said.

Such an outcome would be considered unconstitutional in America. It simply wouldn’t be allowed.

Or would it?


Twenty-three years after Trotsky’s demise, United States President John F. Kennedy was riding in a motorcade in Dallas, Texas, when a bullet ripped through his skull. He was rushed to Parkland Hospital and quickly pronounced dead.

Kennedy’s accused assassin – Lee Harvey Oswald – was himself murdered during a prisoner transport days later. And conspiracy theories continue as to whether someone else pulled the trigger on the shot that felled the president. So, we don’t know the motive behind the murder of the leader of the free world.

But what is clear is that Kennedy’s words, as much as anything else, led to his demise.

You see, John F. Kennedy was barely halfway through his term as president when he was killed. Most of the signature actions we associate with him — such as space exploration and Civil Rights legislation — hadn’t occurred yet. Lyndon Johnson would ultimately take those across the finish line.

We associate those actions with Kennedy because of his words. Because of his speeches and addresses.

It was Kennedy who declared We choose to go to the moon in this decade. It was Kennedy who spoke pointedly against the advance of the U.S.S.R. on multiple occasions. It was Kennedy who spoke out against segregation following the attacks on protesters in Birmingham, Alabama.

Any of those words could have driven an aggrieved opponent to violence. And they ultimately did.

Words could kill.


About four and a half years after President John F. Kennedy was assassinated, Dr. Martin Luther King, Jr. was gunned down in Memphis.

Much like Kennedy, King had towered over the 1960s. His activism in the service of the Black community had been extensive, and it led to extensive Civil Rights legislation.

But if we take a closer look at King’s contributions to that movement, it’s his words that come to the fore.

It was King’s Letter from Birmingham Jail that set a roadmap for his activism. It was King’s I have a dream speech that captivated so many — making his cause their cause.

Still, that cause – equal rights – was considered controversial in parts of our society. Indeed, an entire swath of the country was built on a platform that directly conflicted with that ideal.

So, an assassin decided to put an end to King’s letters and speeches. He spotted King on the balcony of a Memphis motel. And the assassin shot him.

Two months later, it was John F. Kennedy’s brother Robert who would meet an untimely demise. In the ensuing decades, Presidents Gerald Ford and Ronald Reagan would survive assassination attempts.

A stark reality was coming into view.

The carnage in Dallas wasn’t a one-off. It was the start of a trend.

Sure, people could say what they wanted in America. But if they were prominent enough, those words might keep them from seeing tomorrow.

This concept has come back into focus in recent years. President Donald Trump survived two assassination attempts while campaigning for his second term. And recently, conservative pundit Charlie Kirk was gunned down during an event at a Utah college.

Many have labeled these incidents as political violence, but that misses the point. These individuals were all targeted for exercising their First Amendment rights. And many of them paid the ultimate price.

Freedom of speech, it seems, only practically extends so far.


Back in 2008, I studied abroad in Chile.

Amid the peaks of the Andes and the serene beauty of the Pacific Ocean, I noticed something else – hordes of students engaging in protest.

As a college student who was months away from voting in his first U.S. Presidential election, I was intrigued by this development. But several people told me to stay away.

The police don’t play around here, they told me in Spanish. They show up in riot gear and use tear gas and water cannons. It will burn your eyes and ruin your clothes.

I was horrified by these descriptions. But the locals told me it had once been much worse.

During the reign of Augusto Pinochet, protestors weren’t merely sprayed with tear gas. They were whisked off to secret detention sites, never to be seen again.

It had been 18 years since that dictatorship ended and the disappearances ceased. But I could still see the wariness in the eyes of so many of the locals.

They were exceedingly kind. But they were also reserved. Even after four Pisco Sours, they were unlikely to speak their minds.

It was only those students – too young to remember the Pinochet era – who dared to speak up and face the tear gas.

This is the long shadow that censorship carries.

Rules and regulations, rights and freedoms – they supposedly set the groundwork for discourse. But once we see the blood splatter, or find our acquaintances whisked away, those guidelines go out the window.

The walls close in, and we clam up. A single bullet effectively silences multiple voices.

I worry about this fate befalling America.

Sure, the outliers and the extremists might continue to yammer on, even in the wake of violence. That is their prerogative, and the carnage will not deter them.

But what about the rest of us? Will we feel the same liberty to speak our minds, as we see the bloody corpses of orators on our screens?

I doubt it.

I’m not sure if there’s a tidy way out of this conundrum. It’s hard to feel secure when violence is an omnipresent threat.

My only hope is for more of us to face our fears head-on. To leave the cocoon of self-censorship, and to share our thoughts with the world — as I’ve done here at Ember Trace for nearly a decade.

It’s a risk, yes. But it’s one worth taking.

Words can kill. But they can also change the world.

It’s high time we let them.

Lest We Forget

On December 7, 1941, America changed forever.

Shortly before 8 AM local time, Japanese forces launched a surprise attack on the naval base at Pearl Harbor in Hawai’i. More than 2,400 people were killed as the base was largely destroyed.

The United States had stayed out of the early stages of World War II. But in the wake of this attack, it was clear that such avoidance could not continue.

In a speech the next day, President Franklin Delano Roosevelt referred to December 7th as A date which will live in infamy. He signed a declaration of war shortly thereafter.

American troops would soon find themselves fighting on the shores of Europe and the islands of the Pacific. Their efforts would prove fruitful, as the Allied forces defeated the Axis powers less than four years later. And that occasion was met with wild celebrations in the streets across our nation.

But even though America prevailed in World War II, it didn’t forget what was lost. For years, December 7th was a solemn day. A moment to reflect and to memorialize those taken from us.

Or so I’ve heard.


The Macy’s Thanksgiving Day parade.

It was a staple in our household during my childhood.

On Thanksgiving morning, my sister, my parents, and I would watch on TV as giant balloons were paraded through the streets of New York City. Marching bands and a host of other performers joined in the festivities.

And then, at the end, Santa Claus made an appearance.

I found this all a bit confusing. We hadn’t even had our first bites of turkey and pumpkin pie, but Santa was already drawing attention to the next holiday. It made no sense.

But my parents assured me that this was all by design. Thanksgiving was the start of the holiday season, they stated. Everything from here on out would be focused on Christmas.

And that included December 7th.

I recall precisely zero mentions of the Pearl Harbor attack as that date passed each year. No memorials. No moments of silence. No recognition whatsoever.

I suppose this seemed logical to many at the time. It had been more than 50 years since the Japanese attacked, and our shores had remained secure ever since. The fall of the Berlin Wall had ushered in an era of unprecedented peace. There was little incentive to look back.

Yet, as I advanced through school — and read about the Pearl Harbor attacks in my history textbooks — I started to question this approach.

I heard my teachers stating Those who ignore the lessons of history are destined to repeat it. And I hoped that wasn’t about to become the case.

It was.


I was a week into my eighth-grade studies when the September 11th attacks occurred.

Amid the feelings of shock and anger, I remember a sobering sentiment of regret that made the rounds.

We should have known that something this terrible could happen, as unlikely as it might have seemed. After all, it had only been a handful of years since a domestic terrorist destroyed a government building in Oklahoma.

So yes, the regret was instantaneous.

But it was a comment on the news that got to me that evening, even as I sat safe and secure at home.

This is the first time the United States has been attacked since Pearl Harbor.

Pearl Harbor. The attack so many of us had refused to memorialize, because it got in the way of Christmas planning.

Maybe if we all had paid a bit more attention to that attack, we’d have acknowledged the grim possibility of this one. We would have been more vigilant, more prepared, and better able to respond.

But no. We buried that tragedy. And now, we found ourselves paralyzed by another.

I felt shame as I considered all this. I felt regret.

And I felt a determination to never let the 11th of September become just another day on the calendar.


For many years, it was hard to forget the events of September 11th.

We saw the cavernous site of the World Trade Center in New York. We navigated through bolstered security at airports, arenas, and other public venues. We lowered our flags to half-staff each year and held solemn memorials.

Where were you on that day and how did you find out about the attacks? was a common line of inquiry when we met new people. Whether we were young or old, from the East Coast or the West Coast, that wretched day was something we all shared.

Even as our national discourse moved on to new topics — a war in Iraq, a financial crisis — we never quite lost the thread of what we’d endured on that sunny September day in 2001. We celebrated the rebuilding of Lower Manhattan and the Pentagon. We honored those we lost with museums and memorials.

And then, it all faded away.

A new generation took the fore. One raised in an online world and forged in an era of pandemics and protest.

The events of September 11th were too antiquated for this group. The dangers of America lay in plain sight, not in the schemes of foreign terrorists.

And so, September 11th became just another day. The solemn pageantry became nothing more than background noise.

Our nation moved on. And my mission sputtered.

Or so it seemed.


How do we remember?

The common answers to this question are imagery and stories. A picture of Michael Jordan soaring to the hoop becomes a brand logo, and a cornerstone of our culture. A tale of Greek soldiers stowed away in a giant wooden horse at the gates of Troy becomes a timeless legend.

Yet, even with the recent advancements in technology, imagery and stories have a shelf life. They can fade or get distorted, tainting our memory.

Actions, on the other hand, never lose their luster. The precision of repetition can reinforce recollections. All while providing an example for others to follow.

I believe in a commitment to action, and in honoring my promises. And so, even as the world moves on from commemorating September 11th, I still will.

I will take a moment to honor the memory of those lost. To revisit my own difficult emotions from that day. And to reiterate my pledge to live filled with humility and grace.

It’s worth the somber silence. It’s worth the canceled social events and rescheduled professional meetings.

September 11th is not just another day to me. It cannot ever be that way.

I hope I’m not the only one.

Rebrand Risks

The logo caught me off guard.

A red, white, and blue ball appeared to be tilted upward. Beside it, the word pepsi appeared in blue, lowercase text.

Gone was the brand identity I’d grown up with. The one I’d seen on bottles in my high school cafeteria a few years earlier.

No more ball with a symmetrical white swoosh. No more PEPSI in capitalized italics.

Is this even the same soda? I mused. I endeavored to buy a bottle to find out.

Relief washed over my face when that familiar Pepsi flavor hit my tastebuds. The new logo was all I’d need to adjust to.


A few years later, I was driving home from work when I passed a Wendy’s location.

Only I didn’t quite realize it was a Wendy’s.

You see, this location didn’t have the familiar Old West font on its sign. Instead, Wendy’s was written in red script. And the illustration of a girl with pigtails above the wording had gone from cartoonish to semi-realistic.

Those same feelings of unease washed over me for a moment. But then I passed a McDonald’s.

The golden arches on the sign looked the same as they ever had. And I knew that the soda fountain inside would have the same script Coca-Cola logo I remembered from my childhood.

The calming sense of reassurance took over. All was still right with the world.


Why did these logo changes from Pepsi and Wendy’s set me off so badly?

It’s difficult to know for sure. But I believe the issues had roots in my childhood.

You see, logos were some of the first things I learned. Even before I could read or do basic math, I knew what the golden arches meant. The same with the Coca-Cola script, and a host of other brand marks.

The logos of that era were my frame of reference to the world. And I didn’t want that frame of reference to shift. Ever.

Of course, Pepsi and Wendy’s had bigger worries than my sentimentality. They were eternally in the shadow of Coca-Cola and McDonald’s — fighting for relevance and revenue.

While their rivals saw value in keeping brand marks consistent, it was actually riskier for Pepsi or Wendy’s to keep the status quo than to rebrand. So, they upset the apple cart.

In Pepsi’s case, they’d done this repeatedly. The logo I was nostalgic for was actually the fourth iteration of the company’s brand mark. Wendy’s had not rebranded before, but it was a far newer company than the soda maker.

I would eventually read business school case studies about this scenario. I would eventually become attuned to design trends through my marketing career. And I would ultimately fall in love with the art of the rebrand.

But I never completely forgot those moments when I spotted the new Pepsi and Wendy’s logos for the first time. That sense of unease lingered deep in my bones.

Good thing it did.


When I was a young adult, I was far from self-assured.

But if you asked me what my favorite restaurant was, I wouldn’t blink.

Cracker Barrel, I’d blurt out enthusiastically.

My reasoning was simple. Where else could you get a quality breakfast of chicken fried steak and eggs for a mere $12?

I was obsessed with that dish, and a host of other Southern staples on the menu. And I didn’t have the highest of salaries. So, I made my way to Cracker Barrel whenever I could.

This pattern broke a few years later. Despite how my heart felt about Cracker Barrel dishes, my stomach simply could not handle them. (Probably on account of all that butter and cream.)

I hadn’t thought much about the chain for more than a decade after that. But then, shortly before I sat down to write this article, Cracker Barrel changed their logo.

And it led to massive uproar.

At first glance, the new logo didn’t look all too controversial to me. The brown wording had been replaced by black. But the yellow background and the font that read Cracker Barrel remained intact.

What had disappeared was the caricature of a man sitting on a chair beside the wording, with his left forearm resting atop a barrel. The words Old Country Store on the bottom of the logo were also now missing.

That all was a bit jarring to me, but not outlandishly so. It simply looked like Cracker Barrel was simplifying its look.

It was only when I started reading some reports on the rebrand that I understood the revolt. The logo wasn’t all that Cracker Barrel was fixin’ on changing.

Indeed, the chain had remodeled many of its restaurants to match the streamlined logo. Walls were painted white and stripped of most accessories, such as rolling pins.

The “modern” look made Cracker Barrel resemble one of those overpriced big-city brunch places. That’s the clientele the chain seemed to want to attract — for relevance and revenue.

This all reminded me of a rebrand TGI Friday’s undertook several years back. The “pieces of flair” made infamous in the movie Office Space were removed. The logo was streamlined. And the menu was revamped with more upscale dishes.

That rebrand was mostly met with a shrug. And surely, management at Cracker Barrel thought their rebrand would see the same reaction, at worst.

But they were wrong.

You see, Cracker Barrel brass had conveniently forgotten where their restaurants were located — predominantly in the South and the lower Midwest. They’d conveniently failed to notice that their restaurants were more likely to sit along rural highway interchanges than in the core of big cities. And they’d neglected to consider how an elaborate revamp would play in that environment.

This was more than tilting the ball on the Pepsi logo or modernizing the script on a Wendy’s wordmark. For Cracker Barrel’s core market, this was an outright betrayal. A signal that the chain was too good for people who wanted an affordable Southern-style meal in a homey environment.

And so, a consumer revolt broke out. And after a few weeks, Cracker Barrel was forced to retreat.

The old logo would return.


What can we learn from all this?

From the tinkerings of Pepsi. From the refreshed wordmark of Wendy’s. From the foibles of Cracker Barrel.

The biggest takeaway is that rebranding is a risk. No matter the perceived upsides, the downsides can be more severe.

It’s easy to forget this as a marketer. Like many of my contemporaries, I tend to value the opportunity of a shiny new megaphone more than the dangers of change. I tend to override the unease of seeing the familiar upended.

But maybe it’s time for me to tap back into that emotion. Perhaps it’s time for all of us to do the same.

The world is complicated, and so are emotions. Respecting that complexity — rather than blindly plowing ahead with our plans — seems prudent.

Let’s do so.

The Convenience of Privacy

As I walked into the restroom at Dallas-Fort Worth International Airport, I did a double take.

Gone were the sticky floors and uncomfortable noises. In their place was something more humane.

Each toilet sat inside its own private room, with a floor-to-ceiling door displaying a red or green light. Red meant occupied; green meant available.

Similarly, the urinals were arrayed in cubicles. Each one sat fully out of view from the next one.

This was all a welcome surprise to me. I went from dreading this restroom trip to relishing it.

A few hours later, I stepped off a plane and into the Chicago O’Hare Airport terminal.

Once again, the spirit moved me. And once again, I found myself in the nearest restroom.

This experience was far less pleasant.

There were no private rooms for the toilets. Just ubiquitous metal stall dividers, with latched doors that were barely hanging on. And the urinals sat in a row, without any partitions between them whatsoever.

I solemnly did my business, washed my hands, and trudged over to baggage claim. But as I waited for my luggage, my mind was racing.

How costly would it be for Chicago O’Hare Airport to upgrade its restrooms? Or at least put some partitions between the urinals? Don’t they understand the virtue of privacy?

Alas, I fear they do not.


It’s often been said that death and taxes are the only certainties in life. But there are really two more.

To survive, we must take in nourishment daily. And we must also rid ourselves of the waste from that process.

Eating and drinking do not require an audience, per se. But over time, a communal audience for those activities has become close to obligatory.

But the other activities? They’re meant to be solitary. They’re too messy and unsanitary to be considered otherwise.

Some of this solitude is self-provided. Most homes contain bathrooms, allowing us to relieve ourselves in peace.

Yet, much of our day is spent outside of our homes. Namely, in communal settings where nature’s call might still arrive. Because of that, many public spaces include restroom facilities.

This might seem obvious to the point of being an afterthought. But consider the implications.

It costs money to maintain public restrooms. Toilet paper isn’t free. Neither are janitorial salaries or maintenance bills.

But it’s also nearly impossible to charge money for restroom use. People would revolt at such a notion.

So, businesses and government entities are left to take a financial loss on restroom provisions – hoping, at best, to make up the shortfall somewhere else.

This explains the haphazard look of some facilities, such as the Chicago O’Hare Airport restrooms.

But it doesn’t explain everything.


Some time ago, I was driving down a Texas highway when I noticed a series of billboards.

They appeared every 10 miles or so, each featuring a smiling beaver with a mileage countdown. They also included clever puns about restroom usage.

The billboards were for Buc-ee’s, the now famous travel center chain. But back then, Buc-ee’s wasn’t a national phenomenon. If anything, it was gaining regional notoriety for its restrooms.

You see, Cintas had given Buc-ee’s an award for maintaining America’s Best Restroom. And the company was celebrating this accolade by begging travelers to try their restrooms out.

I eventually found myself in a Buc-ee’s restroom. And it did not disappoint.

Toilets were in their own private rooms. Urinals were in secluded cubicles. There were ample supplies of hand sanitizer and soap. And janitors were steps away, ready to spring into action if needed.

It immediately dawned on me that this arrangement was not financially sustainable – especially when you add in the cost of all those billboards advertising the restrooms. But as I walked out of Buc-ee’s moments later with $50 worth of merchandise and Beaver Nuggets, I realized where the funding really came from.

Still, I had no complaints. I’d spent a lifetime relieving myself behind roadside shrubs or in grungy gas station restrooms. Buc-ee’s seemed much better.

I imagine that this was the spirit behind the restroom revamps at Dallas-Fort Worth International Airport. By ponying up for facilities improvements, airport management could make travelers more comfortable — and more apt to spend money before boarding their flights.

The leaders of Chicago O’Hare Airport clearly felt differently. Hope was not a business plan for them. And they maintained their facilities accordingly.

Privacy, it seems, has a double standard.

But should it?


How much does a urinal partition cost?

My mind was still pondering this question as my bag appeared on the luggage belt in Chicago O’Hare Airport.

A quick Internet search provided the answer. Roughly $300 per partition.

That means, in a typical restroom with 5 to 7 urinals, the four to six partitions between them would cost $1,200 to $1,800 to install. A decent amount, no doubt. But hardly an exorbitant one.

And yet, the number of establishments that refuse to take on that cost is staggering.

I’ve started adding these overly public restrooms to a Demerit List. A list that now includes the restrooms at Chicago O’Hare Airport.

And once a restroom makes the list, I’ve tried to avoid returning to it ever again.

You see, I find the situation unconscionable. Why would entities avoid paying a grand or two for some urinal partitions, when they’re likely paying twice as much to arrange the toilets in stalls?

But more than that, I view this no-partition arrangement as a broken promise.

For if I were to relieve myself out in the open, I would be — rightfully — assessed a ticket for public lewdness. But somehow, in a communal space, I’m expected to have momentary amnesia about that standard?

No.

I’m owed convenience. I’m owed discretion.

And so is everyone else who sets foot inside a public restroom.

It’s time that we set some standards for privacy. And it’s time that we invest properly in those standards.

Put up those partitions. Install those doors. Do all we can to protect the sanctity of solitude.

This is more than an obligation. This is a right.

Let’s ensure that it’s properly honored.

Raising the Bar

Our first encounter was a strange one.

I’d just completed a five-mile race. And as I reached for the Gatorade and bananas past the finish line, another racer asked me how I’d done.

I flatly mentioned my accolades. Sixth overall, top in my age division. An age division that, it turned out, we shared.

Wow, the other racer replied. I bested my time from last year, when I won the division. And yet, I wasn’t able to catch you.

This race keeps getting faster and faster.

I immediately felt a twinge of guilt.

This runner had clearly set his sights on this race. He was aiming for a first-place divisional medal.

I, on the other hand, had no idea what I was doing. I’d never raced anything longer than a 5K before. And I’d shown up at the starting line in a pair of trainers that were past their prime.

Things didn’t exactly click either once the starting horn sounded. I decided to keep pace with a blonde woman in a sports bra – who I later learned was a local elite racer from a different age division. I was unwittingly setting myself up for an epic meltdown, miles from the finish line.

But that meltdown never came. Despite huffing and puffing, I maintained my pace past mile three and mile four. Meanwhile, the blonde elite dropped back. By the end, I was running all alone.

And so, by blind luck, I’d wrestled a spot atop the medal stand from someone who had worked hard for it. I was an imposter. A thief.

Or so I thought.


A couple months after the race, I showed up at a high school track well before dawn.

I was training for a half-marathon at this point. And a friend had convinced me that joining Track Tuesday workouts with the local elites would help on race day.

The blonde woman I’d bested in the five-miler wasn’t at the track. But several other top-notch athletes were.

I’d heard their names atop race leaderboards before. And I was intimidated.

What if I wasn’t fast enough to keep up? Would I be invited back?

The anxiety was First Day of School level.

But then I saw a familiar face. The guy I’d talked to after that five-mile race.

He joined the group for the warmup laps, before peeling off to start his own workout. And my anxiety disappeared.

The next Track Tuesday, he joined us on our warmups once again. The same for the Track Tuesday after that.

I eventually approached him and explained that I was training properly now. My days of bumbling into races and stealing podium spots from him were over.

His response floored me.

Oh, don’t worry about that. That was my first race back after a hiatus. And what you did there helped set the bar for me in terms of what I need to do to get back in race shape.

I’d never considered that competitors could help each other in this way. That they could inspire each other to dig deeper and aim higher.

That together, they could raise the bar for everyone.

This revelation set the stage for my training with the local elites. It provided the impetus for me to keep showing up.

I knew I stood no chance of besting these amazing runners in a race. But they inspired me to get the most out of myself. And they inspired each other to do the same.

While my competitive running career was ultimately shortened by injuries, that group has kept on inspiring the running community. They’ve continued to put stellar race performances on the board.

They’re still raising the bar.


I am the firstborn of my generation.

My sister and my cousins are all younger than me. Many by several years.

Growing up, I had a sense that all eyes were on me. If I were to mess up, I could send an entire generation of relatives down the wrong path.

This was a heavy weight to bear. And I carried it solemnly.

That is, until my father started his family tree project.


These were the days before high-speed internet. And the concept of consumer-facing DNA test kits was still a ways off.

So, my father interviewed most of his living relatives to fill out the tree – either in-person or over the phone.

And the stories that came out of those discussions captivated me.

For instance, I learned that the paternal branch of my family tree — the one that has carried my last name through the generations — is essentially a rags-to-riches story. My great-grandfather was raised by a single mother who peddled goods along the beach.

Their small family didn’t have much. And society looked harshly upon my great-great-grandmother for being a single mother.

My great-grandfather would overcome these modest beginnings. He would graduate from high school and go on to run a grocery store. One of his sons — my grandfather — would become a doctor. And one of his sons — my father — would become a businessman and then a private school teacher.

In less than a century, my family had worked their way up through society.

After hearing all this, I started to look at my de facto role differently.

I didn’t need to be the perfect guiding light for my younger relatives. I just needed to raise the bar a little more, so that they could have something to eclipse.

I’ve leaned into this task with gusto. To this day, my most frequent advice to my sister and cousins is:

Don’t try to be me. Be better than me.

And they have.

My sister and my cousins are all amazing people who have done incredible things. These days, they inspire me twice as much as I could ever inspire them.

This is raising the bar at its best. In the right circumstances, achievement can become a flywheel. And the rising tide can lift all boats.


There’s a lot of discourse out there these days about the dangers of striving.

While this country was built on the spirit of upward mobility, its promise has seemingly faded.

Giving more effort no longer leads to the promise of achievement. Instead, it saddles you with more expectations and higher demands. It sets you on a collision course with your breaking point.

At least that’s what the critics say.

There is some merit to this argument. After all, it’s harder than ever to keep up with modern society. And burnout seems to have become a more ubiquitous affliction than the common cold.

Yet, I still reject the premise of this theory.

Sure, raising the bar is hard, demanding, and often unfulfilling. And it can dent our ego to see our hard work get erased by those who come after us.

But without that first step, there can be no leap. There’s no spark. There’s no momentum. There’s no possibility of stretching beyond the horizon.

We need that hope, that inspiration. We need the push to meet the moment.

Our families need it. Our affinity groups need it. Our society needs it.

So, let’s keep striving. Let’s keep giving our best. Let’s keep raising the bar.

We’ll all be better for it.

The Twisted View of Risk

There’s still time to get Apple Care on that new iPhone.

Those were the associate’s words, as he handed the man across the table from me his new iPhone.

But the associate wasn’t finished.

We now have monthly or yearly plans. You can add it for as long as you own the phone, unlike those old two-year Apple Care plans. And if you break your phone, you’ll be covered.

The customer was unmoved.

If my phone gets messed up, he said, I’ll just buy a new one.

The associate looked perplexed.

You’d rather pay another $800 for a new device than $30 for a screen repair? The math doesn’t add up.

It didn’t add up to me either. After the iPhone owner left the store, I told the associate that I would have gotten the new Apple Care plan.

And yet, as I left the store, I started to doubt my certainty.

After all, I’d made some travel arrangements the night before. Airline tickets, rental cars, hotel reservations — the whole deal.

Each of those purchases came with an option to buy insurance for cancellations, delays, and other mishaps. And in each case, I’d declined that coverage.

I was no better than the guy with the brand-new iPhone. Risk did not factor in.


There’s an infamous scene in the sitcom Family Guy that sticks with me.

The dim-witted protagonist — Peter Griffin — opens his front door and encounters a salesman hawking volcano insurance. Even though he lives far from any volcanic zones, Griffin uses his wife’s rainy-day fund to pay for the coverage.

Predictably, this ends up landing the Griffin family in financial ruin.

I think many of us look at insurance this way. We see it as a scam — one that leeches our hard-earned money in pursuit of bad outcomes.

We don’t want to think that we’ll get in a car wreck, or break our smartphone, or get food poisoning and miss our flight. And we definitely don’t want to throw money at those possibilities ahead of time.

We’d rather delude ourselves into a false sense of security, floating down a river of cost efficiency and good vibes.

I, more than anyone, know how ridiculous this is.

I’ve worked in the insurance technology space for more than a decade. I’ve seen the data. I’ve learned the nuances of coverage. And I know that the overall system — while imperfect — is far from a scam.

Plus, I’ve seen the benefits of insurance in my own life. Car insurance helped make me whole after a Dodge Ram smashed into one of the doors of my SUV. And event insurance reimbursed my entry fees when I had to drop out of the New York City Marathon due to injury.

Yet, I still find myself declining event insurance these days. And I still tend to favor cost efficiency over broad coverage when it comes to my car insurance policy.

The risk is there. The logic is not.


I don’t know much about the man who walked out of that store with a new iPhone.

But I would imagine he holds a few stocks and bonds. And if he were to find himself in Las Vegas, he’d likely take a turn at the slot machines.

This is not typecasting. It’s oddsmaking.

You see, there’s been a surge in recreational investors this decade – particularly in the wake of the pandemic. And Vegas casinos have always been popular with young adults.

I’d be hard-pressed to imagine Mr. I Don’t Need Apple Care bucking those trends. Despite his insurance frugality, he seems squarely within the target audience for both.

Investing and gambling share common traits. Particularly the dopamine high of making a windfall, and the delusion that such happy outcomes will befall us frequently.

That’s what draws people in. That’s what keeps people at the table. And that’s what leads people to believe that risk is negligible.

This couldn’t be further from the truth.

Risk is multiplied in each of these forums, in great part because control is out of our hands.

We don’t get to run the companies we invest in. And gambling is nothing more than a game of chance.

It’s likely that we’ll lose at least some of what we put in. And it’s unlikely we’ll be compensated for that loss.

Insurance might not be popular. But it turns out to be much more practical than the cool activities.

If only we’d come to our senses.


A great many historical figures are known for only one thing.

Sigmund Freud is not one of them.

The father of modern psychoanalysis helped people around the world discover their unconscious, unveiled the Oedipus complex, and redefined sexuality. And along the way, he introduced us to The Pleasure Principle.

The Pleasure Principle is an instinctive drive to seek pleasure and avoid pain. It guides us as we work to satisfy our biological and psychological needs. And it might explain our twisted view of risk.

You see, The Pleasure Principle aims to insulate us fully from bad outcomes. But in a world that’s often random and cruel, that mandate is an impossibility. Risk is always lurking, like a shadow. And pain is never far behind.

Our brains can’t square with this reality. After all, we can only keep our guard up so long before we wear ourselves out.

So, we let The Pleasure Principle cast a spell of delusion. We get addicted to the prospect of winning while turning our backs on the mere mention of losing.

This dynamic is what draws us to invest, to place bets, and to chase dreams with vigor. And it’s also what leads us to get repulsed by mentions of warranties and insurance policies.

Simply put, we’d rather wish away the bad outcomes than have a plan for mitigating them.

Such thinking is foolhardy. But it’s a Freudian principle. It’s darn near inevitable.

That is, unless we lean into the skid.

Yes, if we play Jedi Mind Tricks on ourselves, we just might cajole ourselves back to our senses.

That means treating insurance like an investment, rather than a nuisance. It means expecting the worst, rather than the best. And it means treating positive outcomes as a happy surprise.

Such a shift requires some emotional jiu jitsu, and a fair dose of pessimism. But hedging our bets in this way can help keep us from losing it all.

That seems like a winning strategy to me. Let’s make it a reality.

The Regression Fallacy

In June of 2000, the greatest golfers in the world gathered in Northern California.

One of the sport’s four major events – the United States Open – was being held at Pebble Beach Golf Links that year. And these elite golfers were seeking to claim the $800,000 winner’s prize.

The stage was set. The players were in place.

And the golf…largely underwhelmed.

You see, Pebble Beach Golf Links is as treacherous as it is picturesque. Its location atop steep oceanside cliffs leaves it susceptible to wind, fog, and the rest of Mother Nature’s fury.

These factors can cause a flying golf ball to go awry. And they were all in full force that week in June of 2000.

For all their greatness, the world’s pro golfers simply could not contend with the elements.

Indeed, after four days trekking around the 18-hole course, all but one golfer scored worse than the course’s benchmark score — better known as par. The second-place golfers were three over par, the fourth-place golfer was four over par, and the fifth-place golfers were five over par.

But that first-place golfer? He didn’t just break the average. He obliterated it.

Tiger Woods finished the tournament a whopping 12 under par. He made it around the golf course in 15 fewer strokes than the second-place finishers.

The performance set a record that still stands in professional golf. A record for margin of victory in a major championship.

Tiger’s tournament seemed to break all the rules. The rules of physics. And the law of averages.

It was truly a unicorn event.

Or was it?


Regression to the mean.

If you’ve taken a statistics class in your lifetime, you’ve likely heard this phrase.

The idea is straightforward. Anyone taking on a task can perform it exceptionally well — or exceptionally poorly — one time, or even two. But given enough opportunities, their overall performance will average out about where that of others does.

Regression to the mean implies that humans are interchangeable. It states that over the long run, no one is truly that exceptional or that inept. Our bodies, our minds, and the environment we inhabit are not designed to maintain such long-term deviation.

To borrow a phrase from a different stick-and-ball sport — baseball — all it takes is more at bats for the phenomenon to play out.
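
The concept can be seen in a quick simulation — a hypothetical sketch of my own, not anything from the record books. Model each performer's result as fixed skill plus random luck, select the round-one standouts, and watch their round-two average slide back toward the field:

```python
import random

random.seed(42)

# Each observed score = persistent skill + one-off luck.
N = 10_000
skills = [random.gauss(0, 1) for _ in range(N)]

def observe(skill):
    # Luck is redrawn on every attempt.
    return skill + random.gauss(0, 1)

round1 = [observe(s) for s in skills]
round2 = [observe(s) for s in skills]

# Pick the top 1% of round-one finishers (partly skill, partly luck)...
top = sorted(range(N), key=lambda i: round1[i], reverse=True)[: N // 100]

# ...and compare their averages across the two rounds.
avg_r1 = sum(round1[i] for i in top) / len(top)
avg_r2 = sum(round2[i] for i in top) / len(top)

print(f"Top 1% average, round 1: {avg_r1:.2f}")
print(f"Same group,     round 2: {avg_r2:.2f}")
```

The round-two average falls well below the round-one figure, because the lucky component doesn't repeat. But notice that it stays well above the field mean of zero — the skill component persists. That second observation is exactly the wrinkle the Tiger Woods story illustrates.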

I want to believe in this concept. After all, it can simultaneously provide hope for those in a rut and humility for those on a roll. What a perfect balance.

But it turns out that it’s too perfect. It’s too neat and tidy for a world that often defies explanation.

I mean, think about it. The best golfers on earth had plenty of at bats at Pebble Beach. They played 72 holes of golf over four days, in a variety of challenging conditions.

Yet, despite that fact, Tiger Woods refused to regress to the mean. Instead, he calmly lapped the field.

And he wasn’t done.

A month later, he went over to Scotland and won the Open Championship by 8 strokes. A month after that, he won the PGA Championship in Kentucky. Eight months after that, he claimed the green jacket at the Masters Tournament in Georgia.

It was the famous Tiger Slam. An unprecedented sweep of golf’s signature events, all within a one-year span. He would go on to win each major event twice more over the remainder of the decade.

The at bats didn’t matter. The law of averages was not about to catch up to Tiger Woods.

Nothing was.


Around the time Tiger Woods was making waves in the golf world, a different athlete was dominating that other bat and ball sport. But in a far different way.

Randy Johnson was an intimidating force on the pitching mound. Standing 6 feet 10 inches tall, the left-hander came after hitters with a 100-mile-per-hour fastball and a wipeout slider. When batters saw Johnson scowling down at them, with his long hair flowing in the breeze, they surely felt fear and doubt.

This aura was at its apex during the Tiger Slam era.

Indeed, in 2000, Randy Johnson struck out 347 batters and won 19 games. In 2001, he struck out 372 batters, won 21 games, and led the Arizona Diamondbacks to a world championship.

Johnson won baseball’s National League Cy Young Award in each of those seasons, as well as the ones bookending them (1999 and 2002).

I saw him pitch in person during that final Cy Young season. He proceeded to give up a home run to a batter taking his first swing in the big leagues.

Everyone in the stadium was stunned that the rookie could catch up to Johnson’s fastball. After all, the pitch was that much better than the rest of the heaters in the league. It took something special to drive it 400 feet.

Randy Johnson is long retired now. But these days, there are dozens of pitchers that throw as hard as the Hall of Famer once did.

And while some hitters do knock those blazing fastballs over the fence, most of them head back to the dugout shaking their heads.

The numbers bear this out. Back in 2000, the overall batting average for major league hitters was .270. In 2024, it was .243.

Instead of pitchers regressing to the mean, the mean itself is receding.

The thing is, it’s still difficult to blow a fastball past a major league hitter repeatedly. I couldn’t do it if I tried. And most likely, neither could you.

But a growing cohort of hurlers do have that magic in their arm. And they show no sign of returning to that label we call average.

Regression to the mean? It’s a fallacy.


So why all this talk of Tiger Woods and Randy Johnson? What does it mean for everyone else?

After all, 99 percent of us don’t play professional sports. Our “at bats” come in the form of emailed assignments, sales opportunities, and so on.

The law of averages surely applies to us, right?

Well, maybe not.

You see, whether we shuck corn or trade stocks, whether we construct buildings or run companies, our performance is sure to vary.

Some of us will be savants, riding the tailwinds of innate skill and good fortune to prosperity. Others of us will be forced to iterate in the wake of challenging headwinds.

But regardless of which path we find ourselves on, we’re unlikely to regress to the mean.

There’s just too much of a natural differential between those who’ve got it and those who don’t. There are just too many people predisposed to defy the norms.

This is the state of play we find ourselves in. And it makes our next move critically important.

We must do better than to count on regression to save us. We must do more than pray for our rotten luck to turn around, or to expect our good fortunes to fade.

We must instead lean into our strengths, and pivot away from our weaknesses. We must channel our inner Tiger Woods. And we must avoid attempts to catch up with the Randy Johnson fastball.

Following this strategy will only widen the gap. It will add data points to the edges of the graph, while leaving the middle ever hollower.

But that’s the whole point.

There is not much to gain in being average. So, let’s head full bore toward excellence.

We’ll be better for it.

Losing the Thread

Early on the morning of June 6, 1944, Allied forces converged on the beaches of Normandy.

As they emerged from the English Channel, Nazi forces opened fire from higher ground. A bruising battle ensued.

Ultimately, the Allies prevailed. That victory helped turn the tide of World War II.

The Normandy invasion is known as D-Day. And when I was young, there were plenty of ceremonies honoring the veterans who risked everything to make it happen.

These tributes were noble. They were honorable. And I wanted no part of them.

Why would I?

After all, I was a kid living in peacetime modernity. I’d never been to the shores of France, or any shores other than those of my own country.

Plus, June 6th was at the start of summer. My mind was fixated on going to the beach, building a sandcastle, and letting the ocean waves cool me down.

I had no frame of reference for why D-Day was worth pausing my summer festivities for.

I was losing the thread — and fast.


On September 21, 2001, I walked up to a police checkpoint with my father.

I showed a police officer my school ID, while my father pulled his driver’s license out of his wallet.

The officer looked each document over carefully, before ushering us past the barricade.

We proceeded to walk a mile down eerily quiet streets before reaching Ground Zero.

Eleven days earlier, Ground Zero had been known as the World Trade Center in New York City. Twin skyscrapers that dominated the skyline.

Now, in the wake of the September 11th attacks, those towers were reduced to rubble. A mess of twisted steel, ash, and debris lay in the area, measuring roughly 60 feet high.

My father and I were required to stay a block away from the rubble. Recovery efforts were still ongoing, and civilians weren’t to be part of that initiative.

Even still, the sight of that twisted steel was jarring enough to haunt me forever. As was the coating of dust on a scaffold support my father touched. It was inches thick.

That scene reinforced just how much my life had changed. I’d been adjacent to the horrors of September 11th as they occurred – close enough to sense my own demise. Yet, I was still 7 miles from the towers, relying on hearsay and TV news for information about what had transpired.

Now, it was all too real. An area that had seemed familiar weeks earlier now looked like a foreign war zone.

It didn’t matter what hoops I’d need to jump through to enter a building or board an airplane moving forward. I’d happily run through the security wringer to avoid seeing something like this ever again.

I was never losing this thread.


It’s been a generation since the September 11th attacks. And it’s been four generations since D-Day.

And with that passage of time has come a sense of detachment.

Many young adults fail to take security at buildings or transportation venues as seriously as I do. They view it all as a wasteful hassle. An unneeded expense for an imaginary threat that may never transpire.

This viewpoint alone is troublesome. But the dissonance goes deeper than that.

Some have taken it upon themselves to attempt assassinations of prominent figures, to openly support enemies of our nation, or even to espouse Nazi ideologies. Indeed, extremism has emerged from the shadows on all sides of the ideological spectrum, with devastating consequences.

Many pundits have blamed our polarized society for these developments. Others have pointed to the whiplash of a pandemic, racial reckoning, and inflationary crisis in quick succession.

Those might be accelerants, much like gasoline on a blaze. But they’re hardly the initial spark.

No.

Losing the thread is what’s to blame. Plain and simple.

When we get too far from the nightmares of yesterday, we find it all too easy to ignore their lessons. We wander aimlessly past the washed-out guardrails of decency and common sense. And we revisit the darker corners of our humanity.

The world becomes a more chaotic and confrontational place. The door is open to danger. And the next nightmare is suddenly upon us.

It’s a tragic inevitability. Or is it?


There’s a picture on my wall at home. It’s in black and white, and it features 124 corpsmen of the United States Navy.

There in the front row, two to the left of the flagbearer, is my grandfather.

The photo is postmarked March 8, 1945. U.S. Naval Training Center. Great Lakes, Illinois. It was two weeks after my grandfather’s 18th birthday.

History books would later note that World War II was in its final stages at this time. But as my fresh-faced grandfather wrapped up basic training, he didn’t know that. No one did.

It would be nearly two more months until Adolf Hitler killed himself in a bunker near Berlin. Another three months would pass before Hiroshima was obliterated by an atomic bomb.

As my grandfather shipped off to a base in California, he was preparing to put his life on the line.

That never transpired.

A foot injury in the barracks kept my grandfather from seeing combat. And the subsequent surrender of the Axis powers allowed him to complete his military service in the earliest days of postwar peace.

And yet, years later, my grandfather was unequivocal about why he joined the Navy. And he made clear to me that he’d have made the same decision 10 times out of 10.

My grandfather pointed out that he loved America and hated Fascism. The last thing he wanted to see was a Mussolini or Hitler reigning over our shores.

The horrors of the Holocaust – made evident to much of America after Hitler’s death – only strengthened my grandfather’s resolve. As a first-generation American, he found ethnic cleansing to be particularly abhorrent.

I took in this wisdom wholeheartedly. And because of it, I started to pay D-Day a bit more mind.

I read up on the invasion. I white-knuckled through the opening scene from the movie Saving Private Ryan. And I started acknowledging the tributes to the aging veterans in my midst.

The cost of forgetting had been made clear to me. And I wasn’t about to pay that toll.

Deep down, I think that’s what drives me to keep remembering the aftermath of September 11th.

Yes, the trauma of that attack still impacts me – and it probably always will. But as time goes on, and many born after that fateful day grow up, the power of Never Forget fades. And the risk of recurrence accelerates.

My mission is to help prevent that. To keep us from losing the thread even more than we already have.

But it takes more than my best effort.

It’s on all of us who have been through tribulations to share our stories. To keep those bright stars of tomorrow from backsliding into the failings of yesterday.

This process is inconvenient, even painful. But it’s necessary.

For without the thread of history, our society is destined to wind up lost.

Let’s get back on the map again.