Some years ago, I was in conversation with Sadhguru Jaggi Vasudev. His sense of humour had the audience eating out of his hand. As moderator, I bowled one of my questions at him about “gut instinct”. What did he make of the idea?
He hit it out of the park. “The gut is full of shit,” he said. The audience roared. I remember looking at him and thinking: What a rockstar.
But I’ve often wondered since: is the gut just full of shit?
A crazy thought experiment occurred to me a few days ago. What if I ignored the gut entirely, for at least a week, and made my decisions as an algorithm would? Spoiler alert: I gave up the experiment on Day 2.
What got me thinking about the algorithm model was that most of us now look to the internet — Google, social media, and lately, AI programs such as ChatGPT — for help with our dilemmas. We assume that these tools, with their terabytes of data and “smart” algorithms, will squeeze out answers that are free of human folly.
Ostensibly, algorithms are trained to be relentlessly logical and painfully impartial. They don’t tire, don’t judge, and don’t end their day wondering if they made the “right” decision.
We, on the other hand, sit on piles of memories, biases and hunches. We sift through snippets of lived experience, often clouded by emotion. Backed into a corner, we depend on something we can barely define: a surge of feeling that we call “gut instinct”.
Studies suggest that this wordless prompt evolved as a survival tactic, drawing on memory and experience to prod the human in moments of key decision-making: “Yes, that is a good move” or “No, something’s not quite right here”.
Sometimes, that instinct is spot-on, as when it tells you another person is “shifty” despite no clear evidence. Sometimes, it is more indulgent than accurate (“You deserve that third samosa; you’ll make up for it later”).
Is it flawless? Hardly. It is, in fact, intriguingly unpredictable.
So I thought, why not take this variable out of the equation and try living “by the book” instead? If there’s data on everything, why not let it lead?
I started by setting some ground rules. Every decision would be based on logic and historical data: input, output, statistical probability. My software program of choice was Notion.
The first day kicked off with breakfast. Usually, I crave a mug of chai followed by eggs and buttered toast. But algorithms don’t have cravings. The data on optimal nutrition insisted on a soulless protein shake and egg whites. The buttered toast and chai were out of bounds.
Next, Notion suggested I tackle the inbox. Typically, I let my gut guide me, responding to whatever seems most urgent, pausing to chat with friends, and occasionally ignoring emails from people I find annoying.
The algorithm didn’t approve. Unread emails were arranged, instead, by “priority”, based on sender history, response likelihood, and the algorithm’s understanding of productivity. The program suggested I had become about 15% more efficient between breakfast and lunch. But I missed dipping in and out of conversations for no reason but pure enjoyment. Algorithms, it turns out, don’t do enjoyment.
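For the curious, this is roughly the arithmetic such a “priority” sort implies. The sketch below, in Python, rests entirely on my own assumptions: the fields, the weights and the example addresses are invented for illustration, and are not anything Notion or any email program actually exposes.

from dataclasses import dataclass

@dataclass
class Email:
    sender: str
    subject: str
    past_replies: int        # rough stand-in for "sender history"
    reply_likelihood: float  # 0.0 to 1.0, an estimated chance of a response

def priority_score(email: Email) -> float:
    # Blend history and likelihood; both weights are arbitrary assumptions.
    history = min(email.past_replies, 10) / 10
    return 0.6 * history + 0.4 * email.reply_likelihood

inbox = [
    Email("old.friend@example.com", "Long time!", past_replies=8, reply_likelihood=0.9),
    Email("newsletter@example.com", "Weekly digest", past_replies=0, reply_likelihood=0.1),
]

# Highest "priority" first: the ordering that squeezed the fun out of my morning.
for email in sorted(inbox, key=priority_score, reverse=True):
    print(f"{priority_score(email):.2f}  {email.sender}  {email.subject}")

Notice what the score cannot see: whether the mail from the old friend would make the day better.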
Then came a big decision. A friend called, someone I hadn’t heard from in months. Operating on gut instinct, I would have picked up without hesitation, knowing full well that it would steer me off course for a bit but also knowing that I would thoroughly enjoy the conversation.
Algorithms don’t do catch-ups. They assess cost and benefit. So the data flagged the call as low value. Per my new system, I should have ignored it. Of course, I couldn’t.
I answered, and you know what? We laughed, reminisced, and I hung up feeling genuinely happy. ROI? Impossible to measure. Value? Incalculable.
By Day 2, the limitations of living like an algorithm were inescapable. There are things algorithms excel at: organising files, managing time blocks, optimising workflows. But they don’t “care”: not about friends, or the joys of an impromptu afternoon coffee break. In that sense, they’re uniquely clueless.
Because life cannot always be about efficiency. The most precious bits of it, in fact, are made up of our weird detours, irrational whims and spontaneous side quests.
This little adventure has taught me to appreciate algorithms for what they’re good at: efficiently cutting through noise when the path is clear. But life rarely hands us clean datasets or clear-cut walkways.
Which is a core difference, isn’t it? When it comes to opportunities, aspirations, goals: the machines can compute; we create.
I will continue to turn to the bots to help make sense of the world. But my gut will remain that quiet, instinctive wayfinder on murkier paths. It often has this uncanny way of cutting through the fog.
(Charles Assisi is co-founder of Founding Fuel. He can be reached on [email protected])