The Boolean Trap

I got into my SUV and turned the ignition.

But before I threw it into reverse, I tapped a button on my smartphone.

The phone was sitting in one of the cupholders beside me. And thanks to the magic of Bluetooth technology, it could stream music or podcasts straight through the car speakers.

I could be my own DJ. And I often was.

But not today.

The Bluetooth, you see, was not connecting properly. Sure, the little screen on the center console of my vehicle said it was connected, but no audio was streaming.

I set my sights on fixing the issue.

I toggled the Bluetooth switch on my phone’s settings off and on. I turned off the SUV and refired the ignition. I rebooted my phone.

By now, I’d wasted enough time troubleshooting that I was late for work. So, I put the vehicle in reverse and made the drive in silence.


That evening, I picked the thread up anew.

Sitting at my dining room table, I fired up my laptop, headed to the automaker’s support website and searched for help documentation.

It took a few minutes of dogged searching even to find my entertainment system on the site. The automaker had moved to a different system in newer vehicles, and most articles were for that system.

And the few support documents for my system were useless. They encouraged me to try what I’d already attempted. Plus, the site provided no way of reporting any issues that hadn’t been covered.

It felt as if the automaker was thumbing its nose at me. All the possible issues with this entertainment system are on this page. And if you find something else, you’re the issue.

I felt offended. I was enraged. I screamed into the void.


I had now wasted countless hours on this issue. I’d searched and toggled and stressed myself into oblivion — all to find a resolution to something that was working just a day earlier.

And yet, there was one thing I hadn’t attempted — resetting my car’s entertainment system.

It wasn’t for lack of trying. I’d gone through the settings menu on the console extensively. I’d combed those support documents until I had them memorized. No master reset option seemed to exist.

So, the next morning, I called the closest dealership and made a service appointment.

When I brought my SUV in, I explained the issue in full. The service tech listened intently. But he furrowed his brow when I mentioned the words console reset.

There’s not really a simple way to do that, he explained. I could unplug the battery for 10 to 15 seconds, and then reconnect it. That’s a hard reset. But I can’t guarantee it will fix the issue.

It was worth a shot. I gave the tech the go-ahead to try. He took my keys and drove the vehicle over to a service bay.

A short time later, I got the SUV back. Sitting in the dealership parking lot, I tried to connect my phone via Bluetooth. The connection went through.

My nightmare was over.


As children, we learn about prominent thinkers. People whose innovations and discoveries have direct impacts on our lives.

Albert Einstein is synonymous with mass-energy equivalence. Sir Isaac Newton is acclaimed for formulating the laws of motion and gravity. Thomas Edison is renowned for inventing the light bulb. And Henry Ford is feted for revolutionizing the automobile.

George Boole doesn’t sit on this Mount Rushmore. But perhaps he should.

Boole was a 19th century English mathematician who didn’t even get to celebrate his 50th birthday. But in his short lifespan, he developed something that has come to underpin all corners of western society — Boolean logic.

Boolean logic is an algebraic system built on two values – true and false. It evaluates logical expressions and classifies each one accordingly.

If an expression holds true, it gets coded as a 1. If it doesn’t, it gets coded as a 0.

That series of 1’s and 0’s can blaze a trail through complicated problems, arriving at a final answer step-by-step.

If you think 1’s and 0’s sound like the language of computers, you’re onto something. Computer systems have been built on Boolean logic since the 1930s, and the associated if-then logic is now synonymous with that technology.
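To make the idea concrete, here is a minimal sketch in Python. The condition names are invented for illustration; a real diagnostic flow would be far more involved.

```python
# A minimal sketch of if-then Boolean logic, with invented condition names.
def diagnose(paired: bool, streaming: bool) -> str:
    if paired and streaming:        # 1 AND 1: everything works
        return "Working"
    if paired and not streaming:    # 1 AND 0: the case my manual never covered
        return "Connected but silent"
    return "Not connected: try re-pairing"  # 0: the documented case

print(diagnose(paired=True, streaming=False))  # prints: Connected but silent
```

The trouble, as my Bluetooth saga showed, starts when reality lands on a branch nobody bothered to write.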

Perhaps that’s why we don’t give George Boole his due. Or perhaps the century between his discovery and the computer age caused us to lose the thread.

Regardless, we are fully immersed in the Boolean world today. We’re accustomed to navigating true-false strings and if-then statements to troubleshoot just about anything, from our health to the strange noise coming from the refrigerator.

This works well. Until it doesn’t.


In the early 2000s, a technology journalist named Chris Anderson introduced a new theory to the world.

Anderson saw how the computer age and the growth of the Internet had democratized the decisions consumers could make. In the Golden Era of network television, Americans had three options of what to watch on a given evening. But now, people around the globe could enter any search query they wanted into Google.

These searches tended to follow a power-law distribution rather than a Bell Curve. A small number of search terms got most of the volume, while a vast number of rare terms each drew a trickle.

But those low-frequency searches in the tail of the distribution mattered too. Search engines still returned results for them. And savvy businesses had ample opportunities to serve those audiences as well.
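The shape is easy to simulate. The numbers below are invented, but the rank-frequency pattern (a Zipf-like distribution, assumed here for illustration) is the one Anderson described:

```python
import random

# A toy model of long-tail demand. The numbers are invented;
# query frequency is assumed to fall off roughly as 1/rank.
ranks = range(1, 1001)
queries = [f"query_{r}" for r in ranks]
weights = [1 / r for r in ranks]

sample = random.choices(queries, weights=weights, k=10_000)
head = set(queries[:10])
head_share = sum(q in head for q in sample) / len(sample)
print(f"Top 10 of 1,000 queries draw about {head_share:.0%} of traffic")
print("The other 990, the long tail, split the rest")
```

Run it, and roughly 40 percent of traffic lands on the top ten queries. The remaining majority is scattered across the tail, which is exactly where Anderson saw opportunity.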

Anderson’s theory came to be known as The Long Tail. He wrote a WIRED article and a book about it. And many business professionals came to treat it with reverence.

Including me.

Early in my marketing career, I used long tail theories to create content for my clients’ websites. I was working at a startup agency at the time, supporting several small home remodeling firms.

A few years earlier, those businesses would have relied on the Yellow Pages and word of mouth referrals to stay viable. But thanks to The Long Tail and digital marketing, they now had a sustainable path to growth.

Long tail theory succeeded in filling the gaps of Boolean logic. It acknowledged that the world is messier than if-then statements can account for. And it resolved to clean up the mess.

But as technology has evolved and the economy has fluctuated, long tail theory has faded into the background. Innovators have favored tightening the Boolean engine over sweeping up the bits it misses.

This is what led to my odyssey to get my vehicle’s entertainment system fixed. There was no roadmap for me to follow because if-then logic didn’t account for the issue.

Out of sight, out of mind. Until it wasn’t.


You can’t fit a square peg into a round hole.

This proverbial wisdom has held for generations. And despite the attempts of innovators, streamliners, and futurists, it’s sure to endure for many more.

You see, ceding all infrastructure to Boolean theory is not a viable solution. It’s a trap.

Long tail concerns will not evaporate when swept under the rug. They will fester, agitate, and afflict. They will drive us to frustration, loss of trust — or worse.

This corrosion has gone on far too long already. And it’s imperative that we keep the rot from settling in further.

It’s time that we give an audience to the edge cases once again. It’s time to inject independent judgment into the fringes of the logic machine. It’s time to account for all the outcomes we can imagine and consider solutions for the ones we can’t.

This process will be clunky and inefficient. It won’t produce the two tidy outcomes we’ve grown so accustomed to seeing in our systems.

But it will remove the daylight between our lived experience and the systems we rely on. It will allow us to optimize our outcomes at every turn.

And shouldn’t that be what matters?

Boolean logic is a great thing. But it needn’t be the only thing.

Let’s go for better.

Learning Experiences

It was a simple dish.

Eggs, sliced potatoes, and onions – all bonded together and cooked in a skillet. Kind of like a quiche without the cheese.

The delicacy was known as Tortilla Española. I’d sampled it at restaurants across Madrid as a teenager. Now, as an adult, I wanted to prepare it in my own kitchen.

I recalled my father making the dish from scratch a few times after my return from Spain. So, I asked him for the recipe. Then I gathered the requisite ingredients.

I peeled the potatoes and cut them into even slices. I diced the onions. I scrambled some eggs in a bowl.

I added olive oil to a cast iron skillet and fired up the stove. I poured the ingredients into the skillet and let them settle.

I took another glance at my father’s recipe. The next task was to flip the tortilla over, so that it could cook evenly.

But how?

I had a glass lid on the skillet, but it wasn’t stable enough to stand on its own while inverted. And I didn’t have a similar-sized pan to flip the tortilla.

The sizzling sound from the skillet reminded me that there was no time to run to the store for supplies. I was going to have to do this the old-fashioned way.

I took the silicone spatula and dug into the bottom of the tortilla. I lifted it up, rotated my wrist…and caused a mess all over the stovetop.

Perhaps the tortilla wasn’t quite set enough. Perhaps my wrist flick wasn’t all that precise.

Regardless, the solid disk had disintegrated into a shapeless pile of egg and potato bits, with some onions mixed in. Most of it was still in the skillet, but some had landed around it.

My dish was ruined.

I did my best to salvage what was left – letting the eggs cook through and then consuming some of it. The rest went into Pyrex containers stashed in the refrigerator.

I’d be having my failure for dinner for nights to come.


Not long after, I told my father what happened.

Did you consider flipping the tortilla onto a plate? he asked.

I hadn’t.

I’d made a multi-meal mess and wasted hours of prep work. All because I didn’t pull a plate out from the cabinet during the moment of truth.

I was filled with regret at first. But then I remembered another of my father’s axioms.

You can make a mistake. Just don’t make the same one twice.

This was not a failure. It was a learning experience.

It was on me to grow from the experience. To do better next time around.

As it turns out, next time looked a bit different. I never did make Tortilla Española in my kitchen again. But my cooking habits for similarly complex dishes were vastly improved.

No longer was I blinded by the mouth-watering outcomes of my craft. I instead devoted extra effort to preparation.

That way, I wouldn’t panic when the burners were on. And I’d be better able to adapt.

I don’t believe I would have been able to lean into that approach if everything hadn’t happened the way it did.

The botched flip. The meals upon meals of messed up results. My father’s introduction of a ready alternative. All helped me to internalize the lesson and rise from the ashes of disaster.

The story still has its scars. I cringed a bit while writing it just now.

But I have no regrets.


What is school for?

Marketing guru Seth Godin asked this question at the onset of a TEDx talk some years back.

Godin went on to explain how the modern iteration of American education came about.

Public school districts and standardized tests were not the natural evolutions of one-room classrooms and reclusive boarding academies. They were the vehicles of industrialist ambition, meant to confer obedience and consistency across the youth population.

The modern system of schooling seemed sensible in the early 20th century, when scores of pupils parlayed their diplomas into factory jobs. It also served its purpose in the middle of that century, when vigilance in the face of nuclear war was paramount.

But obedience and consistency seem antiquated these days, in an era where college dropouts can create trillion-dollar companies and financial strategists tend to think outside the box.

Yet, the top-down, cookie-cutter educational experience continues to proliferate. Children are expected to maintain excellence from as early as Kindergarten. There is no other option.

It’s all a bit difficult for me to comprehend.

You see, my own youth is merely decades in the rearview. But it might as well have been in the Stone Age compared to the present reality.

My teachers gave me a fair amount of free rein in the classroom and the recess yard through elementary school. I was supervised, sure – even graded on the homework I turned in. But I wasn’t restrained.

The goal was to let me stumble upon knowledge organically, and therefore absorb it fully. This meant literal stumbles were accepted, not shunned.

So, I made mistakes. Lots of mistakes. Both in the classroom and out of it.

But by feeling the consequences of these missteps, I was able to move beyond them. I was able to learn, grow, and adapt. And I was able to keep the sting of regret from holding me back.

It’s a throughline that carried directly to adulthood. It drove my response to the Great Tortilla Española Disaster in my kitchen, and countless other setbacks.

And it’s becoming a novelty.


What happens when the leash is too short?

We don’t need to imagine the answer. Examples are all around us.

Many of my peers now have children of their own. And in talking with them, I get a distinct sense that they’re under a microscope.

They’re expected to provide the best experience for their kids at all times – or else risk the branding of bad parent. And they’re expected to short circuit any signs of failure in their offspring.

Failure, you see, represents divergence. It puts daylight between a child and their peers. It forges a gap between expected marks and mandated ones when it comes to reading, arithmetic, and reasoning. It’s the first skid down a slippery slope.

Modern parents don’t intuitively believe this, of course. None of them hold their infants and muse They better not screw anything up 65 months from now, or they’re toast.

No, this edict is foisted upon parents by their children’s schools, which are chock full of militant rigor and ongoing assessment.

Add in the societal pressure to bring these values home, and parents find themselves in an impossible position. It’s as if they’re meant to choreograph their children’s lives, rather than provide sturdy guardrails for growth.

This might all seem mundane. But the long-term effects could be catastrophic.

Indeed, what happens if an entire generation is shielded from the consequences of failure? How will they develop resilience?

I shudder to think about how the next generation might handle a kitchen mishap down the road – let alone anything more substantial.

Adversity is a great teacher. It’s the only real instructor for moments like these. Moments that we will inevitably encounter in our lifetimes.

And yet, adversity is being kept out of reach. Left on the top shelf of the cabinet until it’s too late for us to locate it.

Let’s change that.

Let’s stop being so allergic to failure and shackled by regret. Let’s start reframing our missteps as learning experiences instead. And let’s teach future generations to do the same.

Sometimes wrong is the first step to right. Commit to the journey.

Consolidated Options

It was darn near Pavlovian.

As the players jogged off the field and into the dugout, the fans in the stands focused their eyes on the scoreboard high above the right field wall.

It was cap shuffle time.

An image of a baseball appeared on the scoreboard. Then suddenly, a stylized baseball cap appeared, covering it up. Two identical-looking caps emerged on the big screen to flank it.

Music blared from the stadium speakers as the baseball caps shuffled around the screen. All the while, the fans tried to keep track of the cap with the ball underneath it.

Finally, the music stopped. The baseball caps froze in place across the scoreboard, the numbers 1, 2, and 3 displayed underneath them.

At the top of the screen, a question now appeared. Which cap has the ball?

There was a momentary pause. Then a murmur rose to a dull roar.

Two! Two! Two!

A few seconds later, the cap over the number 2 on the scoreboard lifted. The baseball re-appeared.

The crowd went wild.


The cap shuffle has long been a staple at ballparks.

It’s a cost-effective way to keep fans engaged, even when the ballplayers are off the field. And it’s an easy contest to win.

Now, that’s not to say the shuffle is easy to follow. The scoreboard maneuvers can flummox even the fans with the keenest eyes and sharpest attention spans.

But those who lose the ball get a second chance. With only three options to choose from, guessing is simple. And the roar of the crowd can nudge those guesses into the educated column.

Indeed, I’ve rarely kept track of the winning cap when I’ve gone to the ballpark. I’ve guessed nearly every time. But I’ve rarely guessed wrong.

The wisdom of the crowd carried me through.


The cap shuffle is just a bit of amusement. No more. No less.

But it illustrates an entrenched element of our society – The Rule of Three.

The Rule of Three is a principle that was first articulated by the Boston Consulting Group (BCG) in the 1970s. It states that in most corners of commerce, there are only three significant competitors. Think Chrysler, Ford, and General Motors in the automotive space. Or Burger King, McDonald’s, and Wendy’s in the fast-food sector.

The market might have started out with more competitors in these industries. But over time, those three frontrunners rose from the fray.

Such market domination has as much to do with human nature as business strategy. You see, our brains can only consider three to four options at a time. We simply cannot process a Big Six of automakers, fast-food proprietors, or nearly anything else.

But the Rule of Three only partially explains the world we live in. For while there might technically be three dominant options in just about any industry, only two of them tend to get the lion’s share of attention.

Consider soft drinks. In Texas, Dr Pepper is an immensely popular option. But once you leave the state, it’s barely relevant. Coca-Cola and Pepsi carry the mail.

The same is true in the world of computer operating systems. Linux is one of the top three options in that realm, but it doesn’t hold a candle to Apple and Microsoft.

Binary choice reigns supreme. For better or for worse.

The better refers to reliability for consumers, and a predictable revenue flow for providers. When there are only two dominant choices, each party knows what to expect.

But the worse feeds directly from those advantages. With so few dominant options, consumers must contend with the trappings of monopoly power – including higher prices and lower levels of innovation. And the main providers must contend with each other – leading to polarization and its associated ugliness.

Sound familiar?

Yes, American politics also follows the Rule of Three. Two parties have reigned supreme for generations, while a smattering of independent politicians have sat on the periphery. This dynamic has made rhetoric more extreme and consensus harder to come by with each passing year.

Representative democracy now seems to embody only the most sinister corners of American existence. Elections feel like a choice of the least bad option.

And when those perceived least bad selections make it to the seat of power, precious little gets done. Accomplishment requires compromise. And compromise is a bridge too far.

This quagmire has proved demoralizing to many Americans. And the murmurs of their discontent have now risen to a dull roar.

Give us more choices, they say. Get rid of the two-party system.

It’s a seemingly sensible plea. But appearances can be deceiving.


What would a multi-party political scene look like?

We don’t have to dive into fantasyland to imagine this. Real world examples exist an ocean away.

Countries such as Germany and Israel have long relied on a parliamentary system for governance. There, citizens largely vote for parties, rather than individual politicians.

There are plenty of parties for voters to choose from, and diverse parliamentary bodies. To govern effectively in this environment, parties have traditionally formed coalitions with relatively like-minded legislators – offering a smidge of compromise in order to pool votes.

But recently, that strategy has become less of a sure thing. Voters in some of those nations have given fringe parties with extreme views a seat at the table. And traditional parties have focused on differentiating themselves in response.

Consensus has been harder to find. Coalitions have been fewer and further between. And government productivity has gone down.

The byproducts of this shift are far from pretty. Economies have stagnated. Protests have proliferated. And snap elections have become commonplace.

This is what politics would look like in America without the two-party system. But since voters select individual politicians in our nation, the dysfunction would be on another level.

Without compromise, coalitions, or consensus, bureaucracy could grind to a halt. With gridlock overwhelming funding deliberations, government shutdowns would be inevitable. Without a shared sense of accountability, dereliction of duty would weaken the nation.

Expansive choice is no panacea. Far from it.

It’s time we get used to that fact.


When I was young, my parents would ask me a question each evening.

Do you want one bedtime story, or two?

Bedtime was non-negotiable. But I still had some say over the proceedings.

I often went with the second choice. I’d listen intently to a rendition of one children’s book, then another. And by the end, I’d be down for the count.

I didn’t give this ritual much thought at the time. But I sure do now.

You see, I don’t have children of my own. But I know that kids can be a handful after the sun sets.

Crankiness, mania, hyperactivity – all are possible as youthful energy wanes. Children need their rest, but good luck getting them to acquiesce to it.

This is why my parents’ bedtime system was so brilliant. By consolidating options, they made the wind-down manageable for everyone. And they set me up for success.

I think the same is true for consolidated options in general. We might want more than Coca-Cola and Pepsi, or Republicans and Democrats. We may yearn to see seven caps shuffling on the scoreboard.

But what we’ve got is manageable. What we’ve got is reliable. What we’ve got is familiar.

It might not work to our specifications. It might barely work at all. But it works.

And that’s no small thing.

Tragic Misconceptions

It was a jarring sight.

A Toyota sedan missing all four wheels. The disk-like rotors were fully exposed to the elements, as a small rock kept the rest of the chassis off the ground.

Some bad actors had made off with the tires and rims in the dead of night. An inner-city occurrence that was all too frequent.

Only this car wasn’t in the inner city. It was parallel parked along a tree-lined street in a suburban neighborhood. My neighborhood.

Oh God, I mused as I passed the disabled vehicle. Am I safe here?

I thought back to a few nights earlier, when I’d taken an evening stroll on that same street. I don’t remember seeing the Toyota sedan parked there yet. But I don’t remember seeing much of anything at all.

You see, the streetlights were out in that area. The sidewalk was pitch black.

I wasn’t worried about criminals attacking at that moment. I was more concerned about tripping over a rogue tree branch or colliding with an aloof squirrel.

But now, I recognized the error of my ways.

I should have been more vigilant. I should have reported the extinguished streetlights – on that street and every other across the neighborhood. I should have been prepared to face down thugs on every corner.

Or maybe not.


The disabled car sat on that rock for a couple of weeks before it was towed away.

All the while, I scanned the neighborhood for other signs of mischief.

I started walking the neighborhood with a flashlight, protecting myself against a potential ambush. I perused postings on Ring and Nextdoor, looking for the patterns of local perpetrators. I pondered enrolling in a Concealed Carry course.

But trouble never came to my doorstep. Just like lightning, it only struck once.

This left me in a strange purgatory.

My neighborhood had proven to be about as safe after the wheel theft as it was before it. But that incident was too brazen to ignore. It had skewed my judgment.

No matter what the numbers stated, I could never truly feel safe there again.


Wrong place, wrong time.

It’s the predominant explanation for tragedy.

We do not tend to court misfortune. Yet, it sometimes finds us anyway — in the most random fashion possible.

There’s no way to truly rationalize these brutal occurrences. Wrong place, wrong time is all we have for an explanation.

But there’s a hidden implication in this statement. Namely, an acknowledgement that a right place and a right time exist somewhere else.

The quest for that somewhere else has served as our societal North Star for generations.

It has led us from colonial encampments to the wild frontier. It has led us back to the cities and then out to the suburbs. It has spurred innovation and infrastructure, but also White Flight and gentrification.

Yes, the legacy of the quest for somewhere else is a complicated one. For the world is not as straightforward as we’d like it to be. And the green grass on the other side of the fence is sure to turn brown once we trample all over it.

Our quest for utopia is a recipe for disaster. And yet, we commit ourselves to baking the cake.

We condemn the Southside, the South Bronx, and South Central. We exalt the fancy enclaves with the elite public schools and the well-funded police forces.

We wrap ourselves in the illusion of safety. And when the veneer is stripped away, we feel the full weight of the betrayal. Just as I did when I saw the wheel-less Toyota sedan a mere 500 feet from where I lay my head at night.

It’s an insidious pattern. And we’re to blame for it.


Our society is obsessed with rankings.

We’re always eager to see how the football team we root for, the college we attended, or the price we paid for gasoline compares to the other options out there.

Fortunately, several organizations exist to satiate our list-mania. One of them is WalletHub.

The personal finance company is best known for its credit card recommendation tools. But it also publishes rankings of the safest cities in America.

WalletHub’s most recent annual edition was released a few weeks before I sat down to write this article. So, naturally, I gave it a thorough read.

The first few cities didn’t raise any eyebrows. They were in predominantly rural states with low populations.

But when I saw the city ranked #6 on the list, I gasped.

That city was Yonkers, New York.

While I’ve been a Texan for my entire adult life, I spent my childhood in Yonkers. I grew up in a decently-sized house with a front yard and a backyard — luxuries most residents of nearby New York City did not have.

The surrounding neighborhood was hilly, shaded by tall trees that dumped bushels of leaves every fall. The streets were quiet. The neighbors were too.

It had all the appearances of a nice place. But appearances can be deceiving.

When I was just 6 years old, someone stole my father’s car from right in front of our house. A few years later, a nearby home was burglarized. Shortly after that, someone drove across the front lawn of our across-the-street neighbor before plowing into a retaining wall.

It was all more than a bit unsettling.

I wanted to believe that my home was safe. That I didn’t have to worry when I closed my eyes at night.

But each time the blue police lights lit up our street, I doubted that premise. And each time my father installed an alarm system or trimmed the hedges a little lower, uncertainty proliferated.

I moved away from Yonkers many years ago. And my parents eventually sold my childhood home.

Several months after they left the city, a man in a parked car shot a Yonkers police officer approaching his vehicle. The officer’s partner returned fire, leading to an extended shootout. Terrified onlookers told news reporters that it felt like the wild west.

The whole incident took place on the same block where I grew up. If I were still there, I could have watched it unfold from my childhood bedroom.

Yet, despite that shooting and all the criminal activity I witnessed before it, Yonkers found its validation. Despite its star-crossed legacy as the site of the fire that killed Malcolm X’s widow, the arrest of the Son of Sam killer, and the early misdeeds of the rapper DMX, Yonkers was ultimately lauded as a beacon of safety.

What gives?


Signal and noise.

It’s the central paradox of statistics.

As we accumulate data, we yearn to find meaning in its patterns. But some of those associations ultimately don’t hold water. They’re the noise that the proven conclusions — the signal — must compete with.

The officer-involved shooting near my childhood home is a prime example of this. It spooked the neighborhood, no doubt. But it also was the first time in 30 years that a Yonkers Police Officer was shot in the line of duty.

In the grand scheme of things, it was not signal. It was noise.

The prior criminal incidents I witnessed on that block also fell into the noise column. While each was unnerving, they took place far too infrequently to cause real concern.
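To put rough numbers on that intuition, a back-of-the-envelope rate tells the story. The figures below are assumptions for illustration, not real crime statistics:

```python
# Back-of-the-envelope only. These counts are assumptions for
# illustration, not actual crime data for any neighborhood.
incidents = 3        # memorable incidents over one childhood
years = 15           # rough span they occurred across
households = 80      # rough size of the surrounding blocks

rate = incidents / (years * households)
print(f"~{rate:.4f} incidents per household per year")  # ~0.0025
```

A vivid memory compresses those years into a single impression. The arithmetic stretches them back out.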

My childhood neighborhood, it seems, has long been a predominantly safe place. It just wasn’t perfectly safe.

The same can be said about my current neighborhood. And many others across our nation.

It’s that variance that gets me — that gets many of us.

Safety is such an existential need that we seize upon any sign of imperfection. One lapse is too many, and two is catastrophic.

But this trend is not feasible or productive. It leads us to overestimate bad outcomes and succumb to paranoia. It fosters tragic misconceptions of the places we frequent, and the people we share those places with.

We need to let go of those delusions, and to choose a more sustainable path instead. We need to recognize the risk of a wheel theft or a crash into a nearby retaining wall for what it is – low, not zero – and calibrate our responses accordingly. We need to stop casting out the good with the bad.

This will be an uncomfortable shift for many of us. Myself included.

But it’s a necessary one.

We will never find a true sense of security without making peace with our surroundings.

It starts with us. Let’s get to it.

Constants and Variables

His name was Glauber Contessoto.

Sporting wildly matted hair and a thick beard, he stood out from the crowd. Mostly because of his nickname – The Dogecoin Millionaire.

Contessoto, you see, had gone to the extreme with his investing strategy. He had stopped focusing on stocks, bonds, and savings to grow wealth. And he’d put his money into Dogecoin instead.

It was an odd strategy.

Dogecoin, you see, had started as a parody of the emerging cryptocurrency trend. It was a tender sporting the image of a snarky Shiba Inu.

Much like hippies trading in beads, Dogecoin was not meant to be taken seriously by a wide audience. It was mostly a meme.

But Contessoto didn’t care. He was inspired by the potential of cryptocurrency. And he went all in.

His timing could not have been better. Contessoto’s $250,000 investment grew fourfold in roughly 70 days, making him an overnight millionaire.

This would have been a good time to cash out. To stash the winnings in a nest egg or reinvest them in traditional markets.

But Contessoto didn’t do that. He doubled down on his bet on Dogecoin. And he actively encouraged other investors to follow suit.

What followed next was all too predictable. Cryptocurrency markets saw a correction, and the value of Dogecoin started to plummet. The fall wasn’t quite as steep as the rise, but the tender ultimately lost 90% of its value.

It was enough to make a Dogecoin Millionaire suddenly worth only $100,000. Contessoto’s strategy had most certainly not paid off.


When I was a teenager, I’d often head to the convenience store down the street from school. I’d reach into my wallet for some allowance money, trading that cash for a newspaper and a bottle of Coca-Cola. And I’d stuff those items in my backpack.

I didn’t ride the bus in those days. So, when the last class of the day was over, I’d park myself somewhere in the lobby. I’d pull the brick-like cell phone out of my backpack, raise the antenna and dial my mother.

I’m ready for a ride home, I’d exclaim. Then, I’d put the phone back in my bag and pull out the newspaper and Coca-Cola. By the time my mother arrived, I’d read most of the articles and finished all of the soda.

These days, the waiting game is far less prevalent. I have my own vocation, my own transportation, my own living quarters.

And yet, I do occasionally find myself sitting in the lobby – waiting for a doctor’s appointment or to board a flight. Just like the old days, warding off boredom is my responsibility.

But instead of reaching into a bulky backpack for a newspaper and a bottle of soda, I now reach for my pocket. My mobile phone now fits there with ease. And it can do so much more than dial numbers.

Indeed, I can read news articles, schedule a dinner order, check the weather forecast, and even watch the ballgame – all from my phone screen. And if I need to buy something, I can do it with a tap of the device as well.

My smartphone is now one of the most essential accessories I have. Much of my daily life routes through its screen. And because of that, I always ensure it’s well protected, well maintained, and well charged.

This quantum leap in functionality hit the market in a flash. Apple released its first iPhone while I was still technically a teenager, and it contained many of the same capabilities back then as it does now.

I was only a handful of years removed from holding court in the school lobby. I probably could have ditched the newspaper for my phone screen.

But I didn’t.

You see, much like others, I was amazed by what Steve Jobs presented. But I was also disoriented by it.

What changes would I need to make to my daily habits with this new technology in hand? Which rituals would stay, and which would be usurped? How would I measure my own progress in the new normal?

These were tough questions without ready-made answers. So, I waited three years to get my first iPhone. And it took me three more years to cede my entertainment and commerce needs to its mighty screen.


Solve for X.

Those three words were prevalent in algebra class.

I’d long been accustomed to moving in straight lines with my studies. To memorize these facts, to read those chapters, to divide this by that.

Now, I was being asked to solve a mystery. To use the principles of arithmetic to determine what number the letter X represented.

I was annoyed at first. Why was I being asked to go through all this rigamarole? What purpose did it serve?

Perhaps sensing this frustration, my teacher gathered the class.

Algebra, the teacher stated, was not just about solving for X. It was about what X and the numbers around it stood for.

X represented a variable. Something that could be altered as circumstances shifted.

But the numbers around it? Those were constants. No matter what value X held, they would stay the same.

Deductive reasoning relied on both factors, my teacher explained. Change was an ongoing, volatile element of our world. But we could best understand its effects by holding something constant as we sought to isolate the variables.
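A minimal sketch makes the idea concrete. In an equation like 3x + 5 = 17, the 3 and the 5 are the constants; hold them fixed, and x can be isolated:

```python
# A minimal sketch: in a*x + b = c, the values a, b, and c are
# constants. Holding them fixed isolates the variable x.
def solve_linear(a: float, b: float, c: float) -> float:
    """Solve a*x + b = c for x, assuming a is nonzero."""
    return (c - b) / a

print(solve_linear(3, 5, 17))  # 3x + 5 = 17  ->  x = 4.0
```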

This description continues to resonate today. In fact, it illustrates my slow adoption of the smartphone ecosystem.

You see, the iPhone might have been able to combine three pieces of technology – and one newspaper – from my arsenal instantly. But it would be a journey to get me there.

I’d need to weigh the changes against the constants to keep from getting lost. So, instead of trying everything at once, I’d adopt features one at a time.

My music listening habits would be the first to change, followed by my shopping habits, and then my news reading ones. Such sequencing would allow me to alter one variable at a time while the rest stayed constant. To try each adaptation on for size, and only proceed when comfortable.

Moseying down the pool steps took longer than a cannonball off the diving board would have. But it served me well.


There’s a lot of clamoring these days about disruptive innovation, hot trends, and emergent opportunities. Futurists get plaudits. Nascent solutions get buzz. And figures like The Dogecoin Millionaire get rich.

It can seem as if leaning into the next big craze is the best way forward. As if changing all the variables at once is our only true path.

It’s not.

There is value in expanding our horizons, to be sure. But we’re more likely to maximize that value if we keep some constants in place along the journey.

This is the pattern of change we’re most comfortable with. It’s the pace of change that most fits our natural rhythms. And it’s the approach to change that best helps us hedge against risk.

This approach might not yield us new status, riches, or acclaim. But it will keep us from losing our ability to reason along the way.

And that is certainly a gift worth maintaining.

So next time you’re feeling the pressure to dive in, take a moment to consider the constants. And govern yourself accordingly.

Power Dynamics

As I stared at my phone’s home screen, frustration washed over my face.

The neat grid of app icons I’d perused just hours earlier was now an indecipherable mess.

I had updated the phone’s operating system overnight. And the new OS seemed to have put all the app icons in dark mode.

The white space on each app tile was now a dark gray. And the app icons were now a faded array of colors. This made the apps for Ford, AT&T, Venmo, Garmin and The Weather Channel appear interchangeable.

This was a first world problem of the highest order. But it was still a problem.

Indeed, I felt as lost navigating the screen at 6 AM as I had at 1 AM, when I’d stumbled to the kitchen for a glass of water. I knew the general direction of where I was headed, but getting there required a lot of squinting and some tentative movements.

This had to stop.

I turned to the phone settings screen and tried to revert the darkened icons. But this turned off dark mode entirely — making all the apps on my phone blindingly bright and draining the phone’s battery in the process. I rolled back that change quickly.

I thought about complaining to Apple, which was behind this phone update. Hey, maybe don’t tether dark mode to the app icons, or at least let us opt out of that view.

But I knew better.

This was Apple, after all. The company that once had Think Different as its tagline. The poster child of the closed ecosystem.

Apple wasn’t going to make it easy for me to file a consumer complaint. And even if I persisted, they weren’t likely to take that complaint into account.

The power dynamics were not in my favor.


If I had asked people what they wanted, they would have said faster horses.

Such were the musings of Henry Ford. While it’s uncertain if he said these words verbatim, there’s no doubt that he thought along these lines.

Ford came of age in the first era of capitalism. Adam Smith’s The Wealth of Nations had been published in 1776, and it placed market dynamics front and center.

Without demand, Smith stated, there would be no impetus to create goods. And without those goods to sell, there would be no commerce.

Smith described these forces as guided by an Invisible Hand. And the term soon became ubiquitous.

The United States had also come to be in 1776. And as it established its economy, it deferred heavily to the power of consumer demand.

There was a heavy focus on producing items that the populace had expressed a need for. And on bringing those items to market at a fair price.

It was The Invisible Hand in action.

Innovation had trickled into the fold over the ensuing decades. But such efforts mostly focused on efficiency of production, or the quality of finished materials.

The machines in east coast textile mills helped turn more cotton, silk, or wool into clothing each day. The steel from Andrew Carnegie’s foundries helped build taller buildings and sturdier bridges.

The transportation needs of the people wearing that clothing and crossing those bridges to get from building to building? Those were accounted for by horses, steamships and railroads.

Those were the methods consumers used. As such, those would remain the areas of focus for businesses in the market.

Until Henry Ford turned the whole system on its head.

Ford had a grand vision for the automobile. The motorized wagon had cropped up in Europe, and it had recently found its way to America. Still, it was mostly a novelty for the rich, with no sign of widespread demand.

Ignoring these headwinds, Ford set out to create a reliable vehicle – the Model T. Then he rolled out new production techniques to assemble that vehicle at scale. He offered the vehicle at an appealing price point. All while unleashing messaging sure to spur interest.

Ford’s efforts ushered in the age of the automobile. Horse-drawn travel faded away. Suburbs became viable. The road trip became a thing.

And the second era of capitalism found its spark.

By succeeding with something the market hadn’t asked for, Henry Ford had usurped control.

No longer were consumers pulling the strings. Ford was the one who knew best what was needed. And he ran his company accordingly.

Consumers didn’t always like this, and some did voice their complaints. But as the automobile fast became ubiquitous, those complaints mostly fell on deaf ears.

The power dynamics were not in their favor.


Roughly a century after the Ford Model T hit the market, Steve Jobs took the stage at an Apple keynote. Partway through his presentation, he unveiled the iPhone.

Apple’s first smartphone didn’t come out of left field the way Ford’s automobile had. Consumers had already been using mobile phones for some time. And some of those phone models had email and text messaging capabilities.

But Jobs paid little attention to what consumers had expressed demand for. Instead, he spurred Apple to create something entirely novel.

The result was a pocket-sized supercomputer. One that embedded messaging and phone calls into the touchscreen. And one that allowed for additional functions through programs called phone apps.

Apple didn’t make the iPhone as affordable as Ford had made the Model T. And it took time for consumers to flock to the device.

But once they did, they ended up giving more than their money to the tech behemoth. They handed over leverage as well.

Indeed, the iPhone ended up transforming the way many went about their everyday lives, from accessing entertainment to paying bills to ordering food. Phone apps helped re-imagine these processes.

Many of these apps were built and managed by third parties. But Apple still controlled access to them through a proprietary App Store found on each iPhone.

Third party programs would have to conform to Apple’s standards to remain in the App Store. Consumer demands carried little weight. What Apple wanted, Apple got.

The same held true for the iPhone’s underlying software. Apple could redesign it at will – by, say, making all app icons appear in dark mode – and then deploy the update to all phone users. The consumer had no say in the matter.

The power dynamics were not in their favor.


A day after the darkened phone icons wrecked my morning, I got a notification.

Check out the guide to your new operating system.

I scrolled through the tutorial, learning how to style text messages and customize my lock screen.

Suddenly, there it was. A tip for customizing my app icons on the phone’s home screen.

I followed the instructions. The process was anything but intuitive, but I got my icons to appear as before.

As I stared at my phone, I felt a mix of emotions.

I was relieved that I wouldn’t have to squint at my phone anymore to open the right app. But I was annoyed that it took a dose of fortune to get back something that never should have been taken from me.

I feel this way all too often in life. And I’m certain that many others do as well.

Our leverage has been taken from us in the name of innovation. And we’re forced to jump through hoops for the privilege of being strong-armed.

It’s a pernicious cycle. But it doesn’t have to be a self-fulfilling one.

We can demand more from those we buy from. We can buoy alternatives to send a message. And we can model behavior that shows more equitable power dynamics between buyer and seller.

None of this will be easy. And some of it might demand some sacrifice.

But it will prove worthwhile.

Power dynamics have gotten out of hand. It’s time to flip the script.

The Immersion Fallacy

The rain was coming down in torrents.

A hurricane had come ashore in South Carolina. And now the entire state was getting drenched. Including the hilly Upstate region.

This development was inconvenient enough. But a big-time college football matchup was set to be played Upstate, featuring the Notre Dame Fighting Irish and the Clemson Tigers.

Both teams were undefeated going into the matchup. The game was slated for a primetime kickoff slot, with the promise of a national TV audience.

A hurricane was not going to disrupt proceedings.

And so, the pageantry of the weekend went on. Fans rolled into town, and so did ESPN’s College Gameday.

The premier college football preview show set up a stage in the middle of Clemson University’s campus. And despite the rain and wind, the show went on as planned, with hosts bantering from behind a desk.

I was watching at home, and things didn’t look so bad at first. The canopy over the stage and the protective gear over the cameras likely had something to do with that.

But then, I saw the crowd behind the stage. Throngs of college students appeared to be nearly blinded by the windswept rain. And the ground they were standing on had become a boggy mess.

Suddenly, the cameras zoomed in on one student with a particularly youthful face. His shoes were off, and his pants were cuffed below the knee, Tom Sawyer style.

With the eyes of America on him, the student took off his shirt. Then he took a step back and leaped, faceplanting into a pile of mud.

The crowd went wild. But as I watched from my couch, I had a different reaction.

Horror.


Many of us have acute fears. Stimuli that cause us to panic, shut down and lose function.

Mine is mud.

The slippery byproduct of water and dirt repulses me like nothing else. I fear slipping on it, getting it on my clothes, or tracking it into my home or my vehicle.

This aversion is quite on brand for me. I am a neat freak. And nothing is as stubbornly messy as mud.

But the lengths I go to when avoiding this substance are somewhat extreme.

I’ve turned down opportunities to cruise in ATVs before, for fear of getting mud on my clothes. I’ve avoided hiking or running on dirt trails for weeks after a rain event, just to keep my shoes clean. And back when I was playing baseball as a kid, I was too frightened to slide on a wet field.

I realize this behavior is totally irrational. Getting dirty is not the end of the world. And there are plenty of proven ways to clean the mess off.

Yet, I can’t help myself.

I’m not alone in this regard. While I haven’t met anyone who avoids mud the way I do, I know plenty of people who have gone to irrational lengths to avoid their own fears.

But that’s starting to change.

There is an abundance of services out there to reform the spooked. Services that dub themselves immersion therapy.

The premise is straightforward. Immersing someone in the stimuli they fear can reduce their anxiety. It can show the worst outcomes to be unlikely or nonexistent. And the process can break the spell of fear.

And so, many have covered themselves in insects, touched the scaly skin of snakes, or listened to the boom of fireworks. They’ve done all this to face their fears head on.

Perhaps this is what that college student at Clemson University was doing when he bellyflopped into a mud pit on national television.

But I wasn’t about to follow his lead.

I knew better.


What is a fear anyway?

Is it an aversion we’ve picked up through experience? Or something we’re born with?

Many point to the first explanation. They see our origins as blank slates, onto which societal stressors – such as xenophobia or bullying – and individualized stressful experiences – such as dog bites or near-drownings – are projected.

This theory posits that fears are accumulated, rather than innate. Which makes it possible to unburden these fears through methods like immersion therapy.

It’s a neat theory. A tidy one. And one that might be too good to be true.

Indeed, I’ve come to believe that the second explanation for fear is more accurate. I assert that fear is part of our DNA from Day One.

There’s plenty of evidence behind this assertion. Infants can curl their bodies in a protective stance long before they can crawl, talk, or understand language. And many physical changes to human genetic code over millennia have helped shield against lethal dangers.

Fear is an element of our survival. One that keeps us from becoming an unwitting snack for a lion or from wandering aimlessly off a cliff’s edge. It’s an inextricable part of us.

Even the most societal-oriented fears can fall under this definition. It’s true that no one is born racist. But the fear of abandonment from the pack is most certainly innate.

Redirecting the source of that existential fear from the pack to the outsider is a predictable shift. Why let the fear become a self-fulfilling prophecy when it can be used to keep our pack’s competitors at bay?

We gain security and acceptance in this process, without experiencing any of the pain of our actions. It’s a no-brainer, on the most primal of levels.

Yes, fear is an inextricable part of us. It always has been. And it always will be.


So, what does this all mean for immersion therapy?

Is it a farce? A sham? A load of nonsense?

Yes and no.

It’s undeniable that immersion therapy has some positive outcomes. Those who are terrified of spiders, or heights, or whatever else can find equilibrium around the same stimuli. They can live life more freely and fully.

These are all good outcomes. Desired outcomes, really.

But these fears have not been cured in the process. Arachnophobes remain arachnophobes, even if they no longer turn ghostly pale in the presence of spiders. Acrophobes are still, at their core, apprehensive of heights.

No, what immersion therapy has actually done is reframe the fear. Instead of reacting to the previously distressing stimuli, the brain has been trained to ignore them. The reaction that the phobic experiences – the one visible to others – is gone.

Yet, the fear itself remains in some far corner of the phobic’s brain.

This is not a trivial distinction.

For our society has consistently misrepresented fear. We’ve determined that it’s something that can be rooted out. That must be rooted out.

And so, we’ve waged multifaceted campaigns to create a world where racist, homophobic, and anti-faith impulses cease to exist. We conduct wide-scale immersion therapy to promote a world that is more equal in terms of acceptance and opportunities.

We make progress. We inch closer to the finish line. And then the ugliness rushes right back in.

This whole process is demoralizing for those crusading against the darkness of fear. They can feel like Sisyphus – pushing a boulder up a hill, only to see it tumble back down in the end.

But perhaps a shift in perspective can get them off this hamster wheel of misery.

Perhaps those crusaders can abandon their pursuit of the root cause of fear. And perhaps they can focus on redirecting its manifestations instead.

This means eliminating racist, homophobic, or anti-faith actions – all while acknowledging that the underlying Fear of the Other will remain.

The crusaders can still turn to immersion as their preferred tactic. But they must recognize that their efforts simply constitute a rewiring, not a demolition. The ignition coil can be manipulated, but the engine remains in place.

Such a compromise might be a hard pill to swallow, particularly for those with the purest of ideals. But it’s a necessary one. Particularly if we want to attain the objectives we strive for.

The immersion fallacy is real. We must govern ourselves accordingly.

Embedded Insecurities

It’s a three-story building.

Tan brick facades. Double-hung windows. A distinctly 1920s look.

On each of the edifice’s four sides, a set of doors provide entry. Above them, four Roman columns support a structure holding a modest clock.

The building is quaint. Not majestic.

And yet, it’s of great historical importance.

This building, you see, is the Old Collin County Courthouse. It sits in the center of a leafy square in downtown McKinney, Texas. A bevy of shops and restaurants surround the square in all directions.

Long before Dallas’ suburban sprawl overtook McKinney, this was the heart of Collin County. It’s where residents would come to conduct business and pick up supplies. It was a gathering place.

That spirit is still alive in the shops and restaurants surrounding the square – a refreshing oasis from the strip malls so prevalent in greater Dallas.

It’s still alive 32 miles west in Denton, where another set of shops and restaurants surround the Old Denton County Courthouse. And it’s still alive 28 miles west of there in Decatur, where some modest establishments buttress the Wise County Courthouse.

In fact, a similar scene can be found in many of Texas’ 254 county seats. Nearly every town has its county courthouse – or former courthouse – on a square, with shops and eateries around it.

The same can be said for municipalities outside the Lone Star State. When I visited the town in rural Missouri where my father was born, it had the same setup as McKinney. So too have towns I’ve frequented in North Carolina, Nevada and Vermont over the years.

This is no coincidence.

The courthouse square setup is an American staple. And while its utility might have faded in the era of 15-gallon gas tanks and Walmart supercenters, its importance most certainly has not.


Did you hear?

Those three words represented the start of seemingly every conversation when I was in high school.

Gossip was the name of the game, and we all fancied ourselves to be Michael Jordan.

It would be harsh to fault us for these delusions. Adolescence is a near-impossible assignment. A quest to find the answers within while complying with the abstract ideals of coolness.

It’s a confounding mission. One that could demoralize and distress even the strongest willed of teenagers.

And we were no match for it.

So, we shifted our gaze. We galvanized around the stumbles our peers made on the journey. The land mines that we could avoid, now that others had triggered the trip wires.

We gossiped.

Most of this gossip reached the halls of my high school the old-fashioned way. Someone witnessed something – or claimed to – and shared it with the group.

But a nascent technology called social media had also found our cohort. And suddenly, some of the fodder for gossip was originating online.

Things, of course, are far different these days. Online rumors are now the norm, not the exception. And social media-based discourse has gotten so toxic that it’s spawned a new name – cyber harassment.

This has led to severe effects for modern-day adolescents. And those effects have led some states to consider bans on social media for minors.

I understand where this movement is coming from. Several young people have taken their own lives because of cyber harassment. It’s tragic, and I feel for their families and friends.

But I do wonder if the proposed bans will have the desired effect. For the root cause of the toxicity afflicting adolescent culture is not social media – or even the Internet itself.

It’s gossip.

And gossip is firmly rooted in our society.


Back to that county courthouse in McKinney, Texas for a moment.

The building sits mostly vacant now. Courtrooms and county offices reside in an expansive building five miles away.

The modern courthouse is surrounded by parking lots and a highway. A supermarket and several other stores sit a couple exits down the highway, along with a movie theater and an assortment of restaurants.

The highway is now the central corridor for McKinney residents. Anyone looking to pick up supplies, take in mass entertainment, or conduct official business sets their vehicle’s GPS for U.S. 75. The shops and restaurants around the old courthouse – while still frequented – are off the beaten path.

This modern arrangement has its advantages. Residents can gather supplies from store shelves, pay for them at a self-checkout kiosk, and load them into their car in the parking lot – all without making eye contact with another human being. Efficiency reigns supreme.

But at what cost?

You see, back when the highway didn’t exist and the courthouse was based downtown, the luxury of secluded shopping simply did not exist.

Anyone heading for supplies was going to have to head to the courthouse square. They were going to have to engage with the store clerk, even if just to hand over payment. They were going to see other locals milling about. And those other locals were going to see them.

Any misstep in this adventure would be harshly scrutinized.

Whispers would softly spread around town. And judgmental stares would brand the afflicted like a hot iron.

Yes, the gossip mill was as much a part of life as maintaining a vocation and putting food on the table. Commerce on the courthouse square took two forms of tender – dollar bills and embedded insecurities.

People measured their success not only by what they had, but by how it compared to what others had. The fear of inadequacy loomed large.

Treks to the courthouse square offered opportunities to disprove that notion. To put on airs, to act proper, to take the pulse of where one really stood. And hopefully not to be confirmed as a pariah in the process.

These days, that style of commerce has faded. But if we think the associated demands have not, we’re kidding ourselves.

People are still dealing in embedded insecurities. They’re still keeping up with the Joneses and yearning to gain acceptance.

But now, they’re doing all this online. They’re depending on an unsavory place where judgment converges from all angles at warp speed.

Yes, everything from neighborhood forums to social media mom groups to websites like People of Walmart lives in cyberspace 24/7. And all of it turbocharges the courthouse square effect.

McKinney, we have a problem.


How do we solve the puzzle? How do we reconcile our desire for validation with the risks of critique-based abuse?

These questions have dogged us for a couple decades, if not longer.

Some have proposed attacking the riddle’s central premise. By ridding ourselves of embedded insecurities, by affirming that we are adequate and no one else’s perceptions are worth a damn, we can sidestep the strife entirely and live happily ever after. Or so they say.

It’s an appealing concept. But not a realistic one.

You see, embedded insecurities are not a bug of our society. They’re a feature of our existence. They’re hard-wired into our brains for a reason.

Like just about any other species, we rely on a group for security. Without the power of the pack, we are so much more vulnerable to so many threats.

We stand little chance of warding off these threats time after time on our own. Fight or flight only gets us so far.

So, we find sanctuary in numbers. We conform to shared rules and make ourselves presentable to the masses. All while harboring anxiety about triggers for rejection.

Drowning out this impulse won’t cure us of its effects. It will only accentuate them.

No, the key is to channel those embedded insecurities. To balance those inevitable questions of adequacy with constructive answers. To openly engage and to grow from the interactions.

And to do all this away from cyberspace. Far afield from the trolls, keyboard warriors, and endless scrolls that do us no favors.

It’s time to engage with each other in public again. Human to human, with our five senses as a guide.

It’s time to pick up on cues – both verbal and nonverbal – and to adapt our behavior accordingly. To be honest without being cruel. To find a common denominator of acceptance, even with those we disagree with.

The courthouse square might no longer be the physical center of society. But its spirit still can be.

Let’s make it so.

Gardens to Tend

Swish!

It was the telltale sign of a good shot in basketball. The audible marker of an orange ball grazing the nylon strings of a net.

Growing up, I’d watch a fair number of basketball games on television with my friends. Michael Jordan or Kobe Bryant would launch that orange ball into the stratosphere. And as gravity brought it closer to the basket, I would wait for that sound.

Swish meant success. And success is what kept me watching.

My friends would often focus on other aspects of the game – the crossover dribbles, the thunderous slam dunks, the gaggle of celebrities sitting courtside. But I was fixated on that swish.

It sounded cathartic. It provided more context than the often-blurred TV picture could.

And it was something of a novelty.

You see, I didn’t just watch basketball with my friends. I sometimes played it with them as well.

These informal games or shootarounds often took place on outdoor courts in local parks. We brought the ball. The park had the rest.

Well, some of the rest.

You see, the park courts wouldn’t be confused for the glamorous ones Jordan and Bryant dominated on television. Instead of hardwood, there was blacktop. And instead of nets, there was…nothing.

There’d be no swish sound to indicate a made basket. There might be the clank of the rim or the thud of the backboard if a ball didn’t make its way through the hoop cleanly. But on the purest of shots, you’d hear nary a thing.

This bothered me. So, one day, I asked my father why the nets were missing.

I think those hoops used to have them, he replied. But then someone stole them. And that will keep happening if the city puts new ones up. So, they’re leaving them be.

I was floored.

I’d never considered that public basketball courts could be anything but a net benefit. I’d never contemplated how others could use that public access for bad intent.

But now the blindfold was off. And there was no going back.


Several years ago, Malcolm Gladwell took aim at country clubs.

The acclaimed author and podcast host had traveled to Los Angeles for business. But when he ventured out for a morning run, he found himself relegated to a narrow dirt path wedged between a busy boulevard and the high fences of a golf club.

Those fences infuriated Gladwell. So, he made a podcast episode about them, and lambasted what they represented.

In the episode, Gladwell questioned why a group of golfers got exclusive access to the outdoors in a car-dominant city. He pointed out that in Canada and Scotland, golf course grounds were open to the public on certain days, or in certain parts of the year.

Surely, America could follow this pattern too, Gladwell argued.

This reasoning appealed to me. For I’ve long detested country club culture.

The exclusivity. The snobby attitudes. The idea of paying dues to get outdoor access.

None of it jibed with my own experience.

When I was growing up, I swam in the ocean at public beaches. I hiked through public nature preserves. And I played those aforementioned basketball games in the park with my friends. All without paying a dime.

These adventures were formative in my life. And I felt others deserved similar opportunities.

But I realize now that things were never quite so simple.

I might have moseyed about in my youth, enjoying myself for free. But there were others who looked after the spaces I frequented. A full roster of folks who kept those locales tidy and kept me safe.

There were workers at the park who mowed the grass and cleared the trash. There were lifeguards at the beach who saved swimmers from drowning. There were forest rangers who ensured the trails at the nature preserve were safe for hiking.

These officials took their jobs seriously, and they acquitted themselves well. But one must wonder if they felt as if they were rolling a boulder up a hill.

You see, the open nature of these parks, preserves, and beaches meant their work was quickly undone. Even after their garden was tended, a new crowd would converge on the space and lay waste to it once again.

It was as if they were fastening a fresh basketball net to the rim each day, with full knowledge that it would be gone by evening. No amount of salary or plaudits makes this work rewarding. And in a vacuum, the arrangement itself hardly seems to make much sense.

So, no, the answer to the country club problem isn’t as simple as Gladwell made it. But a better solution is out there.

We just need to shift our perspective to find it.


A few years ago, I captained a neighborhood kickball team.

The team didn’t play all that well. But where we played was anything but shabby.

All the games in our league took place at a nearby sports complex. The fields were well-designed and meticulously maintained. They were far better than some of the fields I played high school baseball on.

And the scene outside the lines was no less impressive. Knowledgeable referees oversaw the kickball contests. And representatives from the recreation department kept an eye on the proceedings, resolving any situations that might arise.

This all floored me at first.

For we were technically playing ball in a city park. A public space, open to all.

And yet, none of the associated chaos had found its way here. It was all so…organized.

Perhaps this had something to do with where we were. Namely, a small suburb outside of Dallas. There was plenty of space to be found all over town, and thus little impetus to run this complex into ruin.

But I think the orderliness could also be attributed to the fine print of the kickball league. All teams had to pay a fee to register. (The community manager for the neighborhood I live in covered those costs for our squad.)

On top of that, all property owners in this suburb paid taxes and fees – with that money supporting both the recreation department and the sports complex.

These costs, while not exorbitant, sent a powerful message.

Yes, you can play ball here. No tall fences will keep you out. But if you think you can desecrate these fields on our dime, you’ve got another thing coming.

This suburb was not letting anyone shield their eyes from those who tend the garden. That process was instead shared, allowing order to rise from chaos.

Perhaps this is the model Malcolm Gladwell is looking for. Perhaps this is the scenario my younger self would have thrived in.

There’s only one way to know for sure.

So, let’s stop treating public spaces like entitlements. And let’s start treating them as gardens to tend instead. Let’s mind the space as if it were our own. And let’s respect those tasked with maintaining it.

A little shift can go a long way. Let’s forge that path.

The Custer Bias

In June of 1876, the 7th Cavalry Regiment of the United States Army faced a conundrum.

Scouts had found a large encampment of Native Americans along the Little Bighorn River in the Montana Territory. This tribal encampment violated laws confining Native Americans to reservations. And the 7th Cavalry’s mission was to force them to comply.

Relations between native tribes and United States Army installations in the region were not good. Several skirmishes had already broken out between the two, and the 7th Cavalry had every reason to believe this encampment would be hostile to its demands.

And so, the regiment’s leader – Lieutenant Colonel George Custer – charted an attack. The regiment would be split into three battalions, encircling the encampment. The troops would trap the natives into compliance.

It was a bold strategy, but also a risky one. Custer had no idea how many warriors might be among the tribe, or how those ranks compared with his own. He also knew far less about the terrain than the natives did.

By pressing ahead, Custer was taking a chance. And anyone who’s taken an American History class knows the rest.

The first battalion got bogged down by Lakota Sioux and Cheyenne warriors just as Custer’s battalion was trying to flank the encampment. Native warriors spotted Custer’s men and attacked them with superior numbers.

The battalion had nowhere to retreat to, and not enough firepower to press on. It was systematically cut down. Every member of its five companies – including Custer – was killed.

The Battle of Little Bighorn was effectively over. But the legend of Custer’s folly was just beginning.

For generations, Americans would hear of Custer’s Last Stand. It was the ultimate cautionary tale of risk gone wrong.


Why would Lieutenant Colonel George Custer attempt such a bold maneuver? Why would he so unabashedly put the lives of his men in danger?

Military historians have been trying to answer this question for decades. For Custer wasn’t exactly a novice when he reported to the Montana Territory. He was an accomplished military leader who had led Union Army brigades in the American Civil War.

The volunteers under Custer had repelled Confederate forces at just about every turn, including the Battle of Gettysburg in July 1863. He knew what he was doing.

Or did he?

You see, at Gettysburg, Custer’s volunteers faced off with a Confederate cavalry force twice their size. Custer’s brigade was somewhat detached from the heart of the Union Army, and the Confederate cavalry caught him by surprise.

Undeterred, Custer led counterattack after counterattack with his own cavalry. The vicious fighting stalled the Confederate brigades, effectively preventing them from rendezvousing with other columns of fighters.

Once the cannon fire of the main battle could be heard in the distance, the Confederate cavalry retreated. Custer had won.

Custer had taken a massive risk exposing his cavalry so extensively. The chances of his men getting overrun were as good as the chances of them prevailing.

It was effectively a coin flip. But the coin came up in Custer’s favor. The risk paid off.

This certainly gave Custer confidence. Confidence to assume even more risk.

That attribute was what allowed him to rise to the rank of Lieutenant Colonel, and to get posted to the Montana Territory. And it’s likely what made his daring plan at Little Bighorn seem anything but.

Live by the sword. Die by the sword.

Call it The Custer Bias.


Lieutenant Colonel George Custer met his demise nearly 150 years ago. So, why discuss his travails in such detail?

Well, I believe they’re just as relevant in this era as in any other.

We are a far different country now. A global superpower, fueled by big business.

But today’s industrial leaders are just as happy taking risks as Custer was. If not more so.

This sustained boldness is perhaps most notable in big tech, where companies can shift from hypergrowth to cost cutting on a dime. But it’s also present in manufacturing, thanks to the rise of just-in-time inventory processes. It’s present in retail, where large brands venture into new product lines or sales channels time and again. And it’s present in dozens of other industries.

There are many reasons why this behavior is so prevalent among corporate leadership. Many have pointed out that investors don’t stand still. And neither do competitors.

That’s all accurate. But the biggest reason leaders lean into risk-taking? It’s The Custer Bias.

Think about it. Just about any corporate leader has already taken a risk to get where they are. Maybe they were an entrepreneur who defied the odds to found the business they now helm. Or perhaps they rose through the corporate ranks, trying something bold to fuel their breakthrough.

With those bold moves in their rearview, these corporate leaders are keen on rolling the dice once more. For all the market forces out there – consumers, competitors, investors – are yearning for them to push the envelope.

Of course, it could all go sour. And if it does, they could lose everything they’ve attained.

Such an outcome would sting. But not as much as the status quo does for others.


It’s easy to idolize the maverick leaders. To deify a Steve Jobs or a Howard Schultz or a Reed Hastings.

But for each one of them, there’s someone who failed at their mission. Someone who took a risk to create a tech company, or revitalize a small coffee shop, or disrupt the entertainment industry. And someone whose risk didn’t pay off.

These failures lurk in the shadows. You can’t prove a negative, and we have little patience for tales of what could have been.

And so, we comb through the success stories. We search for patterns and commonalities – all while forgetting about the inherent randomness.

Yes, success can seem inevitable if you weed out the duds. And that delusion can make risk-seeking appear less dangerous. Safe, even.

This is undoubtedly a tragedy. But perhaps not in the way you might think.

Those burned by a risk gone bad will surely suffer – regardless of whether it’s the first or fifteenth risk they’ve taken. But those who avoid risk will suffer even more.

For there is no safe passage for the risk averse these days. Those who play it safe will still find themselves under the direction of renegades.

They’ll find themselves reporting to dice rollers afflicted with The Custer Bias. In this modern era, how could they not?

Yes, it is all too possible to stay away from the fire and still get burned. But what if such a fate wasn’t so inevitable?

It’s time to turn the tables on The Custer Bias. To be less cavalier with the risks we take. To pay more heed to the role of chance. And to avert our eyes from the shine of favorable outcomes.

Such actions run counter to our nature. But they’re essential to our survival.

So, let’s stop following the path of Lieutenant Colonel George Custer. Our story deserves a better ending.