Monday, December 30, 2013

Looking for a New Year's Resolution? Start a journaling habit

Starting a new habit is as traditional a part of a New Year's celebration as eating black-eyed peas. This year, consider starting a journal. It's a great way to mark accomplishments and things you're grateful for (or things you'd rather avoid in the future). I have been keeping an online journal for the past two years, and it's been very helpful for finding mistake patterns as well as for logging accomplishments - a big asset when you enter the annual review process.
Use paper, a word doc, or my favorite - the 3Minute Journal app (disclaimer: I have been involved in building the app). Plan to set aside a couple of minutes at the end of each weekday (or every day) to write down the most significant event that happened and answer a few questions about it. Within a month or so, you'll have a rich repository of data about yourself that you can use to track your inner work life.
Ask a friend to try it with you!

Thursday, December 19, 2013

Breaking Bad creator describes rejections from networks

From Vince Gilligan, creator of Breaking Bad, on the Rich Eisen podcast, discussing the fact that several networks passed on the show before AMC picked it up.

Then your studio and you have got to find a distributor or a broadcaster. You have to find your AMC. And that was a bit of a process. Getting to AMC involved several "no thank yous" along the way. Which is not atypical. Every movie, every TV show, every book you ever loved, probably all the ones you hated too, even were said "no" to by a half dozen people or more. But all it takes is the one "yes."

Wednesday, December 18, 2013

Starting a business? Disliking your customers is a crucial mistake

This story is from Steve Blank's post "How Do You Want to Spend the Next 4 Years Of Your Life?," part of his ongoing series of advice to founders:

[As a startup founder,] now that you’ve gotten to know your potential channel and customers, regardless of how much money you’re going to make, will you enjoy working with these customers for the next 3 or 4 years?

One of the largest mistakes in my career was getting this wrong. I used to be in startups where I was dealing with engineers designing our microprocessors or selling supercomputers to research scientists solving really interesting technical problems. But in my next to last company, I got into the video game business.

My customers were 14-year old boys. (see 1:30 in the video) I hated them. It was a lifelong lesson that taught me to never start a business where you hate your customers. It never goes well. You don’t want to talk to them. You don’t want to do Customer Development with them. You just want them to go away. And in my case they did – they didn’t buy anything.

So you and your team need to feel comfortable being in this business with these customers.

The video Steve refers to is below:

Monday, December 16, 2013

The mistake of the glowing hockey puck

In a recent interview, Hank Adams, the CEO of Sportvision, the company behind the yellow first-down line superimposed onto the field during TV broadcasts, discussed the company's first augmented-reality project: the glowing hockey puck. The puck's blue aura was intended to help viewers keep track of the fast-moving object as it slid along the ice. But like many innovations, it was greeted with disdain by the core fan base, which was happy with things as they were. Adams spoke to NPR's Audie Cornish about it:

Adams: It glowed, and we actually embedded electronics in the puck. It was such a phenomenon.... It captured popular attention. Some people loved it, some people hated it....

Cornish: Over time, people looked at it unfavorably. By the end of the two year [trial period]...they quit using the puck, and a lot of hockey purists still complain about it.

Adams: They do... for the hardcore hockey fan, they felt that it was over the top. It's something that, if we ever did it again, we'd be a lot more subtle about it, probably do it during replay. We did it in those cases live, during a live broadcast. We'd be a little smarter about how we went about it.

Thursday, December 12, 2013

Negative results are decreasing in scholarly papers

One of the side effects of our fear of mistakes is the discrediting of negative findings. On the few occasions when I played craps in a casino, I noticed how poorly the other players at the table reacted when I bet the "don't pass" line - essentially, betting on the dice roller to fail - even though the outcome of a roll is perfectly random and the expected payout of the two bets is virtually identical (the don't pass bet actually carries a marginally lower house edge).

The craps example demonstrates how dysfunctional it is to deny the negative. As Edison said, "[Negative results are] just as valuable to me as positive results. I can never find the thing that does the job best until I find the ones that don't."
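If you're skeptical of the claim about the payouts, it's easy to check with a quick simulation. Here's a rough Monte Carlo sketch in Python (my own illustration, with the standard craps rules; the published house edges are about 1.41% for the pass line and 1.36% for don't pass):

```python
import random

def roll():
    """One roll of two dice."""
    return random.randint(1, 6) + random.randint(1, 6)

def play_pass():
    """Resolve one pass-line bet; returns the profit on a 1-unit wager."""
    come_out = roll()
    if come_out in (7, 11):
        return 1
    if come_out in (2, 3, 12):
        return -1
    point = come_out
    while True:
        r = roll()
        if r == point:   # made the point: win
            return 1
        if r == 7:       # seven-out: lose
            return -1

def play_dont_pass():
    """Resolve one don't-pass bet; a come-out 12 is a push (bet returned)."""
    come_out = roll()
    if come_out in (2, 3):
        return 1
    if come_out in (7, 11):
        return -1
    if come_out == 12:
        return 0
    point = come_out
    while True:
        r = roll()
        if r == 7:       # seven before the point: win
            return 1
        if r == point:   # point made: lose
            return -1

random.seed(42)
n = 200_000
ev_pass = sum(play_pass() for _ in range(n)) / n
ev_dont = sum(play_dont_pass() for _ in range(n)) / n
print(f"pass: {ev_pass:+.4f} per unit, don't pass: {ev_dont:+.4f} per unit")
```

Both numbers come out slightly negative and very close to each other - the house wins either way, and by nearly the same amount, so scorning the don't-pass bettor makes no mathematical sense.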

Given the above, reading the abstract of this 2012 paper was unsurprising but somewhat discouraging. The paper, "Negative results are disappearing from most disciplines and countries" by Daniele Fanelli, published in the March 2012 issue of Scientometrics, documents a significant increase in the share of scholarly papers reporting that their study results supported the stated hypothesis rather than disproving it:

This study analysed over 4,600 papers published in all disciplines between 1990 and 2007, measuring the frequency of papers that, having declared to have “tested” a hypothesis, reported a positive support for it. The overall frequency of positive supports has grown by over 22% between 1990 and 2007....

Fanelli notes some fascinating cultural differences in reporting negative findings, and included these wise words of warning:

A system that disfavours negative results not only distorts the scientific literature directly, but might also discourage high-risk projects and pressure scientists to fabricate and falsify their data.

Yes indeed.

[Hat tip @Mangan150]

See some prior posts on negative data in research: "Free the Dark Data in Failed Scientific Experiments," "Web site offers scientists access to lessons from failed experiments."

Wednesday, December 11, 2013

Football quarterback relies on a "database" of failed decisions

From Sports Illustrated's Monday Morning Quarterback column. Nick Foles of the Philadelphia Eagles is having a great season, including a remarkable run with no interceptions - but he almost did throw one:

On Sunday, trying to get some insurance for a 24-21 lead with four minutes left, Foles had a 2nd-and-7 at his 34-yard line, and he faced a heavy rush. Instead of throwing it away, Foles floated one down the middle of the field into coverage. Cornerback Patrick Peterson picked it off—and there went the Foles streak. But a late flag came flying, and Tyrann Mathieu was called for holding wideout Jason Avant.

[Translation of above for people who don't follow American football - the quarterback dropped a few steps behind the line of scrimmage and looked to pass. As guys from the other team came close to tackling him for a loss of yardage, he threw the ball inadvisedly down the middle of the field, where there were lots of opposing defenders. One of them caught the ball for an interception. However, the referee called a penalty on another player for holding, and the play was negated and the interception didn't count.]

“Man, horrible throw, horrible decision,” Foles said from Philadelphia an hour after the game. “When I saw the flag and heard the call, I said, ‘Thank you God.’ I learned my lesson there. But that’s what I try to do: I build a database with decisions like that, and I learn from them. If I get that same look [defensive formation] the next time, I’ll make a different throw, or I’ll throw it away.”

Do you have a database for your decisions that don't work out, and what you'll do differently the next time you're faced with the same situation?

Monday, December 9, 2013

Failure guru Amy Edmondson deconstructs the Healthcare.gov fiasco

Amid all the breathless news coverage of the failed rollout of the Obamacare Healthcare.gov website, we now have some genuine analysis, courtesy of one of my heroes, Amy Edmondson of Harvard Business School ("The Mistakes Behind Healthcare.gov Are Probably Lurking In Your Company, Too"). She may be more qualified than anyone to weigh in, given her deep research experience in learning from mistakes and failure in very complex situations (including healthcare). A couple of potent excerpts:

Healthcare.gov is a good example of the importance of learning small and fast, rather than rolling out a risky new product or service launch all at once. Cycling out in phases includes the expectation of early failures – and demands all hands on deck to learn from them along the way. A roll-out, in contrast, implies that something is all set, ready to go — like a carpet. All it needs is a bit of momentum to propel it forward. For complex initiatives, of course, this is simply not the case. Getting people motivated enough to change is not the real challenge; it’s getting them engaged enough to learn — to become part of a discovery process.

and...

Managers must make it clear that they understand that excellent performance does not mean not making mistakes — it means learning quickly from mistakes and sharing the lessons widely.

Thursday, December 5, 2013

Story: don't pretend to be something you're not

This brief story was related by Dolf van den Brink, chief executive of Heineken USA, and was published in Adam Bryant's Corner Office column in the New York Times:


One big mistake I made came from listening to a lot of the advice I heard before I took the job. People told me: “Dolf, you need to be strong. You need to command respect because this is a tough environment.” I was 32, and I probably looked 28, so I tried to behave and look older than I was. After three months I was losing weight, we weren’t getting any traction, and I was drained. My wife said to me: “Just be yourself. Stop pretending.” I started wearing casual clothes and just started being myself.

Wednesday, December 4, 2013

Story: Cultural ignorance leads to "Fruitgate"

Here's a great story about a cultural faux pas from Deb Weidenhammer as presented in the New York Times You're The Boss blog. Weidenhammer is CEO of Auction Systems & Appraisers.

The day before “Fruitgate,” as we now term it in my China office, I hosted a luncheon at a famous restaurant in Shanghai. I invited members of my professional association, including government officials and several high-ranking private sector professionals.

Over a very pleasant lunch, we talked about the differences between business practices in America and China. While there were no breakthroughs, it was a fun few hours of cross-cultural sharing. I left the event feeling secure that I hadn’t made any major mistakes in my hostessing.

When I arrived at the office the following day to prepare for my return to the United States, the Chinese office manager told me that a call had come in from the association’s chairman with the message that I was invited to go fruit picking the next day with several of the other members.

Thinking the invitation a courtesy, I asked the office manager to let the chairman know I would be unable to make the event, as I was returning to America. The message was relayed, and I boarded my flight believing the invitation was evidence of the good job I had done at lunch.

When I returned to the Shanghai office a few weeks after the incident, I learned from my most senior Chinese manager that I had made a grave mistake — perhaps my biggest to date in China. In my Western view, I had a scheduling conflict. I had appointments in the United States and a precious seat on a flight home. An American business executive would have understood.

But in the view of my Chinese contacts — as it has been explained to me several times since — I came off as arrogant, believing myself too important to change my schedule to attend the event. There was clearly no ill will on my part, but perception is reality. I was dead wrong in declining the request....

The damage was done, and I am still paying the price. Since Fruitgate, I have heard tell of my reputation for being conceited and self-important from more than two dozen people who had no firsthand knowledge of the situation. Despite my ongoing efforts to patch things up, I heard yet again about my legendary arrogance just a few weeks ago.

What makes it worse is that I actually would have enjoyed the experience of picking fruit in China’s countryside. My inattention to my guanxi [network] in that single moment will take years to repair.

So here’s my advice: whatever else you do in China, always say yes to fruit picking.

There are several more cultural faux pas stories on the site, including this one from me and a comic story from Josh Neufeld.

Monday, December 2, 2013

Best Books of the Year 2013

Here's my yearly "best of" list. It's been a great year for mistake books, as you can tell from the Bookshelf feature we've run during the second half of the year. I'd like to thank everyone who suggested books, and please keep the suggestions coming. As always, these are books I read this year - some were published before 2013.

1. Antifragile: Things That Gain from Disorder, Nassim Nicholas Taleb. At times long-winded, petty and vindictive, it also packs more vital, energizing ideas into one chapter than most great books contain in their entirety. Read this book, and you will appreciate all the more why economists cannot be trusted with the economy, the many ways in which philosophy trumps science, and the wide applicability of the "turkey problem." (Yes, think Thanksgiving.)

2. The End of Competitive Advantage: How to Keep Your Strategy Moving as Fast as Your Business, Rita Gunther McGrath. In this great synthesis of trends that have been emerging for the past two decades, McGrath demonstrates how many of the core assumptions of Michael Porter-style strategic thinking (industry-based competition, the idea of long-term competitive advantage, etc.) are now outmoded, and how competitiveness now means the ability to nimbly discover, exploit, and move on from briefer periods of advantage.

3. The Logic Of Failure: Recognizing And Avoiding Error In Complex Situations by Dietrich Dörner. Written and published in the 1990s, this is a terrific and compassionate look at how people are befuddled by complex problems - those with many contributing factors, side effects, and time lags between cause and effect - and how we can learn to manage these problems better. For example, think before you act, make small changes, and carefully track results.

4.  Intuition Pumps And Other Tools for Thinking, Daniel C. Dennett. A huge book with all sorts of tools to help people find their way around intellectual problems. My favorite tool is the first: Making Mistakes.

5. What I Learned Losing a Million Dollars (Columbia Business School Publishing), Jim Paul and Brendan Moynihan. Very few types of work are as rich a source of mistake stories as investing. In this human, funny and compassionate work, Paul recalls his rise to prominence as a commodities investor, and then his rapid fall. In the second half of the book, he dissects his decisions and discusses the psychological factors that made them go so wrong. [This was one of the many interesting books recommended by Taleb in Antifragile.]

And may I present one more item for your holiday shopping consideration? Contact me at mistakebank (at) caddellinsightgroup (dot) com for bulk pricing on the paper edition.

Thursday, November 28, 2013

Story: Can't anyone here set a thermostat?

Following on from last week's post on Dietrich Dörner's The Logic Of Failure: Recognizing And Avoiding Error In Complex Situations, I had this experience at a client site recently.

The conference room was too warm, so I took off my jacket. Our host went over to adjust the thermostat near the door. Within fifteen minutes the room was too chilly, and on went my jacket again. Then the director came in and adjusted the thermostat up. Ten minutes later it was too warm, and I took my jacket back off. We were in the room two hours and it never settled into a comfortable temperature despite several more readjustments.

The moral is that, as Dörner pointed out, people have difficulty understanding control situations in which effect lags cause. In this case, the room took a while to cool down after the thermostat was set lower. The host tried to compensate for this by turning the dial down further than needed, hoping for a faster response, when in fact it resulted in a far cooler room. The director then overcorrected in the other direction, and the room got too warm again. It's good to have a healthy dose of humility when dealing with complex situations. We control much less than we think we do, and the controls we put in place often deliver more change than we want. Better to watch and wait for a while, and make small, incremental adjustments.
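To see how easily oversteering happens, here's a toy simulation in Python (my own sketch with made-up numbers, not a model of any real HVAC system). The room temperature drifts toward the thermostat setting with a lag; a controller that makes big corrections based on how the room feels right now oscillates forever, while one that makes small nudges settles down:

```python
def simulate(adjust_size, steps=600, check_every=10):
    """Room starts too warm; the controller wants roughly 21 degrees C.

    Every `check_every` minutes the controller looks at the *current*
    temperature and moves the setting by `adjust_size` degrees.
    Returns the temperature swing (max - min) over the last 200 minutes.
    """
    temp, setting = 24.0, 24.0
    history = []
    for t in range(steps):
        temp += 0.1 * (setting - temp)  # the room lags behind the setting
        if t % check_every == 0:
            if temp > 21.5:             # too warm right now -> turn it down
                setting -= adjust_size
            elif temp < 20.5:           # too cold right now -> turn it up
                setting += adjust_size
        history.append(temp)
    tail = history[-200:]
    return max(tail) - min(tail)

impatient = simulate(3.0)  # big corrections, like our conference room
patient = simulate(0.5)    # small, incremental nudges

print(f"swing with big corrections: {impatient:.1f} C")
print(f"swing with small nudges:    {patient:.1f} C")
```

With these (invented) parameters, the impatient controller never settles - it chases its own lagged effects up and down by a few degrees - while the patient one converges and stays put. That's Dörner's point in miniature: regulate the process, not the moment.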

Tuesday, November 26, 2013

Failure Forum: Learning from Redline Entertainment, Best Buy's Media experiment

This first appeared on Matt Hunt's blog. Reposted by permission.

In the early 2000s Best Buy launched their first entertainment media label, Redline Entertainment.  The goal was to help grow the organization vertically into the entertainment industry.
As an entertainment label, Best Buy would sign artists to create new material, produce the content, and distribute the final products into retail outlets – Best Buy stores and others.  This was a new area for Best Buy, but it was tangential to their core business of selling electronics, appliances, and media.  Coming off a successful national expansion, Best Buy had strong momentum and was hungry for opportunities to continue growing the business.
Jennifer “JJ” Schaidler had unknowingly altered her career when she took the lead role in building out Redline Entertainment.  As with many innovation projects, the team had a good plan in place, but without all of the pieces together it would be almost impossible to test individual hypotheses.  Redline’s failure didn’t hinge on just one element but on a series of organizational errors that JJ documented for the company in what came to be known as the Redline Whitepaper.
What makes JJ’s story unique is that she was willing not only to own her mistakes but to document them and share them with others inside the organization.  Many executives would shrink from this idea as career suicide, but not JJ.  During my tenure at Best Buy, JJ’s Redline Whitepaper provided an example of what good innovation work should look like: build your hypothesis, test your hypothesis, and share the results of your tests – good or bad.  This is JJ’s story.
1. So Redline Entertainment was going to be Best Buy’s entertainment media label.  How did that idea come about?
Our Senior Vice President of Entertainment, Gary Arnold, had the idea of growing our business by creating our own label and developing our own content.  This was right around the time that Best Buy had purchased Musicland (including Sam Goody) and Future Shop in Canada.  The idea had come from two concepts:  1) that the combined entities offered a huge distribution channel and 2) the artists were becoming increasingly frustrated with their labels and their binding contracts.  The plan was that Best Buy could go straight to the artists and offer distribution but allow them to own their masters.
At that same time, Best Buy was defining the ecosystems that they wanted to grow and expand into diverse businesses –even non-retail businesses.  Entertainment was one of those ecosystems.  Starting a label seemed like an adjacent idea where we could bring the leverage of the enterprise with all of the storefront assets.  We had been developing direct relationships with the artists and had connections within the manager community.
A critical piece to the puzzle was that Redline also had a distribution relationship with RED distribution (no affiliation).  RED was the independent arm of SONY distribution and they were in theory able to get Redline products into Target, Wal-Mart and all the rest of the music retailers.  That footprint would allow us to offer the same distribution as a major label.  The advantage would come from additional marketing and advertising from the Best Buy entities.
Ultimately the competitive nature of the other retailers was the undoing of Redline.  They knew that the products from Redline came from Best Buy and they didn’t want to support a competitor.
2. When you committed to the project did you consider what would happen if you failed?  What kind of odds were you giving yourself for success?
I actually thought there was a likelihood of failure but it didn’t concern me.  My perspective was that the company was growing so fast that it would find a place for me.  In retrospect, I should have been more concerned.  Only after I had taken this new role was it clear to me that going back to my old role as Vice President of Advertising wasn’t an option.
3. Were there ever expectations set by the company for what would happen if things didn’t work out?
No.  Truthfully we had never done these types of innovation projects before so it wasn’t discussed.
4. How long did the project last?  What was the ratio of the time spent planning vs. executing?
Around two years.  The planning phase was really getting the business plan approved and that took 3-6 months.  Execution or “signing” of artists and projects were started before the plan had full approval.
5. Was there ever a clear indication that the project wasn’t going to succeed?
Yes, there were several factors that popped up where we knew that we had problems: 1) our inability to get significant radio air play for our artists – radio was still a driving force behind sales, 2) the resistance/refusal from Target / Wal-Mart to buy Redline products, and 3) the lack of incoming revenue while signing on new projects.  If we were starting our own external company we would expect there to be a lag while building the portfolio of business but within a corporation there quickly needed to be something that was showing a positive return.
6. What was the most difficult task in shutting the business down?
For me the most difficult task was letting go of our people.  The truth was that they didn’t do anything wrong.  It wasn’t their fault.  Many people did find other roles at Best Buy so we were pretty successful at transitioning but for those that didn’t make the transition it was painful.
7. After you had shutdown Redline you did something that had never been done before at Best Buy, you wrote a formal whitepaper on what had been learned through the project.  Can you explain why and how that happened?
In a budget presentation with the President, he commented that “I’d be happy to lose $7m dollars on Redline if we really learned something from it.”  In addition, he was always referencing the Clay Christensen book, The Innovator’s Dilemma.  I read the book and believed that Best Buy was exactly in that classic problem.  So I wrote the white paper as a way of illustrating to the company that we would need to change how we do innovation if we wanted to succeed.  At that same time Best Buy had hired the consulting firm Strategos to help build out an innovation process.  I participated in that work and witnessed many of the same problems repeating themselves.  When Best Buy hired Kal Patel, he read the white paper and encouraged others who were trying to innovate to read it.  It ended up taking on a life of its own.  That was good because in one sense – it was a $7m white paper.  Too bad I didn’t get any royalty payments on it!
8. Have you used the lessons from Redline’s failure in your work since then?
I still get emails from time-to-time from people asking me to send it to them.  The frequent comments are that not much has changed since it was written in 2002.  The bottom-line is that innovation inside of large organizations is very difficult.  It takes people that are willing to take risks and willing to fail.  When a company is growing and has the funds to support innovation it makes it less risky.  Public corporations that need to report quarterly profits are also extremely tough.  When the numbers aren’t looking good, new ideas that just haven’t had enough time to turn a profit are the easiest to cut.  In our estimation Redline needed five years.  It only had two.  There was no appetite to wait that long.
Following her role with Redline JJ went on to lead many other strategic initiatives at Best Buy, including the initial launch of the Best Buy & Carphone Warehouse joint venture – Best Buy Mobile.  She continues to take risks in order to drive innovation in her work and in her career by continually defining new opportunities.  JJ is currently General Manager for Brightstar – the world’s largest specialized wireless distributor and mobile service company.

Monday, November 25, 2013

Founders who sold or didn't sell reflect on their decisions

In the New York Times, this article provides a very cool window into the minds of entrepreneurs who sold (or didn't sell) their companies. The founders' recollections offer a glimpse into some deep stuff, including how our significant decisions look in retrospect, what counts as a mistake, and so on. Here's PayPal co-founder Max Levchin recalling his next startup experience:

His next company, Slide, was a different story. It made social apps and sold to Google for $228 million. Google shut it down a year later.

“The honest truth about Slide was we were a five-year-old company that had wandered through the desert for a long time wondering what business to be in,” said Mr. Levchin, who later started a new software company, HVF. “I wanted to top PayPal and it didn’t work.”

And here's Ben Horowitz on selling the company he co-founded, Opsware:

“I spent eight years, all day every day, trying to build this thing, and all of a sudden it’s gone, it’s just over,” he said. “It’s a little bit like something dies.

“That decision was one of the most isolated and alone decisions you ever make,” said Mr. Horowitz, who now advises entrepreneurs as a venture capitalist at Andreessen Horowitz. “On the surface it looked good, but I tell you after I sold the company I had total seller’s remorse.”

And Philippe Courtot on cc:Mail:

Mr. Courtot received a second acquisition offer, this time from Lotus Development for $55 million in cash.

Under Lotus, cc:Mail grew from four million users to 24 million, until IBM acquired Lotus in 1995 and shut down cc:Mail. Microsoft Mail eventually became Outlook.

“I should not have sold,” said Mr. Courtot, who is now chairman and chief executive of Qualys, a security company that went public last year. “That was my biggest regret. We could have moved much, much faster and brought it to the cloud. But such is life.”

Thursday, November 21, 2013

Mistake Bank Bookshelf: "The Logic of Failure"

This week’s selection on the bookshelf is an amazing book I discovered thanks to a tweet from Roxanne Persaud (@commutiny), The Logic Of Failure: Recognizing And Avoiding Error In Complex Situations by Dietrich Dörner. It was published in English in 1995, but is still (apart from a couple of dated references to the difficulty computers have playing world-class chess) amazingly current.

Dörner is professor emeritus of psychology at the University of Bamberg. In the book, he discusses many psychological simulations that illustrate how difficult it is for people to manage in complex environments. In the experiments, subjects are given a complex objective – say, to manage the well-being of an African tribal community by allocating water, seeds, etc. – and the ability to make periodic interventions. Because of the many psychological biases and blind spots we have, this task is very difficult, and very few participants can keep things even as good as they were before the simulation began (many runs end in the collapse of the society). Participants develop tunnel vision, overcorrect for mistakes, and act before thinking. I have done some of these exercises and have suffered a similar fate. The few successful subjects observe before acting, develop an understanding of the interrelation of the system’s parts, and manage side effects. The average performance in these simulations should keep us humble about our ability to intervene successfully in the developing world.

One factor complicating the ability to manage complex systems is the time lag between cause and effect that these systems demonstrate. When a change is ordered, the result may not be seen for days/weeks/months, and may even be obscured by other factors. Dörner’s illustration of the difficulty subjects had with a relatively simple task containing such a lag (regulating the temperature in a room by adjusting a dial) will give pause to anyone thinking about proposed technological solutions to manage global warming.

There is a powerful amount of insight in this book. Some examples:

If, the moment something goes wrong, we no longer hold ourselves responsible but push the blame onto others, we guarantee that we remain ignorant of the real reasons for poor decisions, namely, inadequate plans and failure to anticipate the consequences….

This tendency to “oversteer” is characteristic of human interaction with dynamic systems. We let ourselves be guided not by development within the system, that is, by time differentials between sequential stages, but by the situation at each stage. We regulate the situation and not the process….

Clear goals will give us guidelines and criteria for assessing the appropriateness or inappropriateness of measures we might propose.

The Logic of Failure is a wise book with ample lessons. While it points out where our instincts get us into trouble, its point of view is generous and sympathetic. As a companion to Kahneman’s Thinking, Fast and Slow or Taleb’s Antifragile, it is highly recommended.

Tuesday, November 19, 2013

A few companies are suing ex-employees for using "negative know-how" in a new job

This is from the book I'm currently reading, Orly Lobel's Talent Wants to Be Free: Why We Should Learn to Love Leaks, Raids, and Free Riding:

Thomas Edison once protested, "I haven't failed. I've simply found 10,000 ways that do not work." In the world of trade secrets, a remarkable example of the controversial expansion of the types of information and knowledge that can be deemed secret in the battles of our talent wars is negative know-how [JC emphasis]--the knowledge of what not to do. An example of negative know-how is when an ex-employee will not undertake a series of failed ways to get to a certain chemical result, but tests other unknown ways until she strikes success [as we've pointed out before, trial and error is not random]. Claiming theft of negative know-how is described as one of the strangest developments in trade secret law. When courts protect negative know-how as the property of an ex-employer the consequence for inventors who move to a new firm can be liability for not repeating past mistakes and failures. 

Certainly, if companies are claiming ownership of their former employees' knowledge of what not to do, learning from mistakes must really be important. Right?

Monday, November 18, 2013

A mini-mistake story by Venture Capitalist Fred Wilson

Fred posted on absorbing losses in the VC market last week, and he started his post with a tiny, compelling story:

When I was early in my career, I casually mentioned to an older VC that I had yet to lose money on an investment. He replied "that's not good, you aren't taking enough risk." I have gone on to lose a lot of money over the years. And made a fair bit too.

The mistake young Fred made was not testing the boundaries. To his mentor, not losing was a sign of over-conservatism, with the result of missing out on potential big wins. This reminds me of the hockey skating story Ashley Good related earlier this year.

Sunday, November 17, 2013

Musician Dev Hynes: "The best can be something you didn't think you could even do."

From Melena Ryzik's interview of producer/songwriter Dev Hynes in the NY Times:

Are you in a position to turn people down, as a producer?

Yeah, but I kind of hate doing it, because my whole thing is: I want to try everything in the world. So there’s not many people that I want to ever turn down, because I think: “Cool, yeah. So we make a bad song. That’s like the worst thing that can happen.” Which, in the scheme of bad things in the world, is not that bad. That’s why I make a lot of music and do a lot of different things. No one’s dying from bad collaborations. But the best that can happen is something that you didn’t think you could even do.

Thursday, November 14, 2013

A.A. Milne talks the value of strategic sloppiness

In a recent discussion with Paul Schoemaker, he mentioned the value of "strategic sloppiness" - in other words, not being too perfect; allowing some disorder to come into your work to allow collisions that may generate new solutions. The epitome of this approach is Alexander Fleming's unclean lab that spawned the discovery of penicillin.

Then I stumbled across this quote from Winnie-the-Pooh author A.A. Milne that sums up the value of strategic sloppiness. It is quoted in the new book Talent Wants To Be Free by Orly Lobel:

One of the advantages of being disorderly is that one is constantly making exciting discoveries.

Wednesday, November 13, 2013

How to develop a new, productive habit

When you find a pattern of mistakes, disrupting that pattern often involves making a new, more productive habit. For example, when I ran into trouble keeping all my tasks and meetings straight, I adopted the Getting Things Done method. It took a number of weeks till the method was ingrained in my daily routine.

Building a new habit isn't easy - we can slip up even if we know the habit is in our best interests. That's because building a habit - i.e., making something automatic - requires a lot of cognitive energy, something our brain actively tries to conserve. But it can be done. These tips came from a post on the Penguin Books blog by author Kelly McGonigal. She is the author of The Willpower Instinct. For more explanation, see the original post.


  1. Choose a tiny habit - changing one small thing at a time is easier than trying to make a huge change all at once. 
  2. "I will" power is stronger than "I won't" - focus on positive changes rather than negative ones.
  3. Find your "want" power - reminding yourself why you are making this change will help you maintain your enthusiasm.
  4. Expect resistance - part of you will question what you are doing; use that as fuel to continue, not as a reason to stop.
  5. Forgive your mistakes - you won't be perfect, and beating yourself up will only make it easier to give up on what you are trying to do.
I am reminding the folks trying 3Minute Journal that the journaling habit will also take time, and that these lessons may help.

Monday, November 11, 2013

A journaling tool to track mistakes and accomplishments... and more

The #2 most read post on this site (out of more than 500) is "Why Journal Your Mistakes?" From that original post:

When something goes awry, all you need to do is write it down. Classify it as a Mistake and move on. Then, weeks later, after the intensity and emotions of the moment have dissipated, you look back at it, think about it. What happened? Think about your role - recognize that mistakes and failures are owned by groups, but self-improvement is your task alone. (This is having a sense of agency.) What could you have done differently that could have affected the outcome? Next time you face a similar circumstance, how will you handle it?

Since I wrote that post, I've learned that there are many more reasons to journal - tracking accomplishments/setbacks, as a way to measure the quality of inner work life (from The Progress Principle by Teresa Amabile and Steven Kramer); increasing mindfulness (Chade-Meng Tan's Search Inside Yourself); even measuring gratitude.

Dave Kaylor helped me put these ideas into a site that I've been using for more than a year to track my own progress. It's made a great difference in my outlook and day-to-day effectiveness. We've now released the tool so others can use it. It's called 3-Minute Journal and you can use it for free. If you'd like to be one of the early users of this and provide feedback so we can continue to improve the tool, please sign up here: 3-Minute Journal.

There is a startup guide available on the 3-Minute Journal blog. That's where I'll be posting on how to use the tool and things we find out through this beta process. 

Thursday, November 7, 2013

Mistake Bank Bookshelf: "Brilliant Blunders" by Mario Livio - the consequences of scientific mistakes

This week's entry in the bookshelf is Brilliant Blunders: From Darwin to Einstein - Colossal Mistakes by Great Scientists That Changed Our Understanding of Life and the Universe, by Mario Livio.

Livio profiles the work of Charles Darwin, Lord Kelvin, Linus Pauling, Albert Einstein and Fred Hoyle. A greater roster of 19th and 20th century scientists would be difficult to create. Livio first sketches out the subject's research, then examines a large mistake each made. These stories don't follow the typical inventor narrative - where the scientist makes a mistake and then, by overcoming it or following where it leads, achieves a breakthrough. In each case, the mistakes either follow or sit alongside a breakthrough. They are significant and frequently held onto tenaciously by the scientist, even in the face of evidence to the contrary. One blunder, Einstein's concept for a cosmological constant, was accepted (especially given its esteemed author), then discredited by evidence of an expanding universe and has, lately, somewhat come back into fashion. If you want to learn about how complex and convoluted scientific progress can be, read this: "Why Einstein Was Wrong About Being Wrong."

My favorite section of Brilliant Blunders concerns Linus Pauling's attempts to decode the structure of DNA - the major scientific race of the 1950s. Pauling, the most honored chemist in the first half of the 20th century, was widely expected to solve the mystery of genetic reproduction. We all know how that story ended. There are traces of the Innovator's Dilemma in Pauling's story - too tied to the theories that made his reputation, and complacent, he was outflanked by a nimbler, disruptive competitor (Crick and Watson and the frequently neglected Rosalind Franklin). There are lots of lessons in Pauling's story about how our biases contribute to mistakes. And I've never read a pithier postmortem review:

Pauling's wife, Ava Helen, asked him after all the hoopla surrounding the Watson and Crick model had subsided: "If that was such an important problem, why didn't you work harder on it?"

Possibly the greatest message from Brilliant Blunders is that scientific progress (indeed much important progress) requires competition, collaboration and dialogue (for more on this, see one of my favorite books, Smart World). Each of the scientists, despite his genius, made significant mistakes that, if unchallenged, would have greatly slowed the advance of their fields of study. Instead, others disputed, probed and created alternative explanations that, in combination with the breakthroughs from the "blundering" giants, provided a much greater understanding of our world.

Monday, November 4, 2013

What can we learn from "Disaster Lit"?

Interesting interview over at HBR.org of author Neil Swidey, about his upcoming book, Trapped Under the Sea: One Engineering Marvel, Five Men, and a Disaster Ten Miles Into the Darkness, about a diving disaster in Boston Harbor in the 1990s. In the interview, Swidey's book is lumped into a genre called "disaster lit" including books like Into Thin Air and The Perfect Storm. Interestingly, Swidey points out Krakauer's assertion in Into Thin Air that "The urge to catalogue the myriad blunders in order to ‘learn from the mistakes’ is for the most part an exercise in denial and self-deception."

And while that may seem to be an indictment of this site, I tend to agree with Krakauer's assertion - if you are talking about disasters that occur when attempting to climb Mount Everest. His theme is that amateurs have no place climbing Everest, because of the inherent, uncontrollable risk the mountain and environment offer. The risk in climbing Everest is not an issue of inadequate human design - Everest is simply more powerful than your plan. Trying to insure against disaster by studying others' mistakes causes you to take a much larger risk - underestimating the mountain and the power of randomness.

We can learn from many classes of mistakes and disasters: when we make the same mistake over and over again, when we ignore "near misses," when we lack a clear objective. But not all.

I would agree that there's little to learn from the Everest disasters that can help you avoid future ones, if you are inclined to challenge that mountain. If the mountain wants to take you, it will.

Thursday, October 31, 2013

Story: Startups, don't focus on competition. Instead, "make your own business work"

Very cool post from Brett Martin in which he provides a candid postmortem on his tech startup company, Sonar. There are lots of great mistake stories in the piece; here's one:

In the run up to SXSW 2012 when the insider media had fabricated Highlight as heir to the throne and some of our more fair weather investors had written us off, my confidence was against the ropes. We reordered our roadmap to rush out comparable features but were now BEHIND. I put on my best brave face but inside my gut was rotting away. I still remember thinking on the flight to Austin “fck, we had it, and now we are going to lose it.”

Oops! Highlight never went anywhere but we definitely wasted a ton of energy and sleep “responding to the threat” when we should have been figuring out how to make our own business work.

Lesson Learned:

Be steady at the wheel. The only way one startup can kill another startup is by getting into the other’s head and leading them off a cliff.

If you don’t believe me, try this proof. Are your competitors releasing a bunch of the same features that you have on your roadmap? Yes? Do you know what consumers want*? No? Great, then neither do your competitors. Get back to figuring out what users want!

*Hint: If you did, you would already have traction.

Hat tip Failcon.

Tuesday, October 29, 2013

To reduce the pain of failed projects, get small

I enjoyed Gretchen Gavett's post last week on the HBR Blog Network entitled "The Hidden Indicators of a Failing Project." In it, she discusses how to determine whether projects are going bad (before costly, late public failures, such as the launch of the healthcare.gov website).

Gavett rightly points out that we have biases that prevent us from admitting that our project may not be going as well as we'd like - such as the urge to avoid the recriminations and criticism that come with calling out a project that is going off the rails. The "quiet fixing" mentality also rules, as one of Gavett's sources states: "people actually think they can turn [a failing project] around, so they don’t bring it up." She passes along several pieces of advice to help diagnose problem projects: e.g., cast a wide net of knowledge, revisit requirements regularly, etc.

In my view, the most effective way to prevent big, expensive project failures is to break projects up into smaller chunks. Large projects have large, abstract goals and take a long time to complete - and a long time before end customers get a look at what was delivered (see: healthcare.gov). In uncertain situations (i.e., most projects), it is better to have clear goals than a completely defined plan.

When projects are decomposed into smaller deliverables, each chunk can be specified at a level to deliver value to the end-customer - instead of abstract deliverables such as diagrams, specs, etc. The customer (as opposed to project team members) determines whether the project meets requirements. Smaller projects with clear objectives are easier to measure. Due to this clarity, failures are not only less frequent, but are discovered more quickly and are more contained. The inevitable changes to project requirements are absorbed more easily because smaller pieces can be adapted cheaply. The epitome of this type of approach is the Toyota Production System, which pushes improvement responsibility to the lowest possible level on the factory floor, and through many many iterations of tiny projects, adapts a highly complex production process to the changing needs of the global car market.

So, to reduce the cost and pain with large project failures, do one thing: get small.

Thursday, October 24, 2013

Mistake Bank Bookshelf: "Creative Confidence" by Tom & David Kelley

This week we are profiling the Kelley brothers' book Creative Confidence: Unleashing the Creative Potential Within Us All. Without question, one of the purposes of this site is taking lessons about embracing failure and iteration learned and applied in the creative community and pulling them into mainstream business (while perhaps also reinforcing the ideas to folks in the creative space as well). The Kelleys are perfectly positioned to contribute to this. David Kelley is the founder of the legendary design firm IDEO (creator of products such as the Palm V PDA and Crest's stand-up toothpaste tube) and co-founder of the Stanford d.school. Tom Kelley is a partner at IDEO and teaches at Cal Berkeley's Haas School of Business.

Let me first say that Creative Confidence is a handsome book, printed on heavy, glossy magazine-style paper, and including color drawings, pencil sketches, graphics and photographs. It's a book that provides a memorable experience in print - I'd highly recommend you purchase it in that format. I imagine it could be amazing on iPad, but not so much on your Kindle (sorry).

In the book, there is extensive discussion of the role of failure in innovation. Faithful readers of this site will not find much new in this discussion (for example, the vital research of Carol Dweck), but it is a good summary for those exploring the topic.

Creative Confidence has many many tips for improving your own creativity (for example, building "karaoke confidence" - the ability to discount "fear of failure and judgment"), increasing your creative output, and facilitating brainstorming sessions.

Probably the most useful part of the book for me was its take on feedback. It's a vital topic - most negative feedback inhibits people's creativity and innovation, by summarily rejecting many new ideas ("that won't work," "we couldn't do that here"). But positive feedback unbalanced by critique is just as bad, enabling poor projects to linger or allowing promising projects to stray away from a success path.

Feedbackers need to be kind, but also crisp and clear. Feedback recipients need to be open and careful listeners, as well as shrewd editors (some feedback will be off the mark, other feedback right on the mark - how to tell the difference is vital). The Kelleys recommend an "I like/I wish" tool for providing feedback, including in a group setting. "I like" covers the positive things you drew from the prototype/talk/meeting/etc. For example, "I liked how your talk covered the early creation of beer in Mesopotamia. I wish you had brought samples for us to try!"

This advice is superb and indicative of the quality of the book as a whole. Creative Confidence is fun and engaging, and will help you be more creative and innovative if you follow its advice.

Tuesday, October 22, 2013

Business owners should aim for this type of failure

A great piece from Jeff Haden in Inc. magazine. Jeff points out that business owners rarely "fail" inside their businesses, as the people who work there have every incentive to agree with the boss. This can create what Jeff calls a "king and queen" mentality that discourages employees from giving them negative feedback. This mentality can also cause bosses to rationalize away business failures, and instead blame their employees' performance.

As a result, he suggests that business owners find something to fail at, to remember what that feels like and to cultivate an attitude of humility they can take back into their leadership roles.

Failures could include setting stretch goals for sports or fitness activities, learning a new skill, or even trying to do one of your employees' jobs. While there may be an element of deliberate mistakes involved, Jeff is really focusing on the humility and understanding created by trying your best and coming up a bit short, something that most big bosses rarely experience.

Thursday, October 17, 2013

Tips for providing negative feedback

In the post from yesterday on encouraging productive responses to mistakes, I cited giving timely, precise feedback as an important component. But giving useful negative feedback is hard, so I was happy to read this post from Michael Roberto (based on this Fast Company article) with good tips for doing just that.

In addition to the tips listed, I'd add yet another: announce your intention. Ask, "Is it OK if I give you some feedback?" This will allow the recipient to prepare him/herself for what's to come. Their answer of "yes, sure" will open the door for a useful dialogue, more so than if you lay it on them without warning.

Wednesday, October 16, 2013

How to encourage productive reactions to mistakes in the workplace

A terrific paper came out late last year on the psychological impact of employee mistakes and how to promote more productive approaches to dealing with mistakes or failures at work. The paper is entitled "Guilt By Design: Structuring Organizations to Elicit Guilt as an Affective Reaction to Failure," by Vanessa Bohns of the University of Waterloo and Francis Flynn of Stanford Graduate School of Business.

Bohns and Flynn contrast two reactions to failure: one is guilt, and the other is shame. The authors conclude that the guilt response is more productive than the shame response, because, as they write, "guilt is more likely to inspire employees to rectify their mistakes rather than to dwell on them or react in other nonconstructive ways." Guilty feelings derive from a knowledge that others are let down, while shameful feelings signal that the person is inadequate in some way. Bohns and Flynn conclude that the guilt feeling will "increase motivation and performance" when dealing with failure and the shame feeling will decrease it - resulting in outcomes such as ignoring, hiding or blaming others. So managers should aim to promote guilt versus shame when dealing with failures in their organizations.

I (and some colleagues with whom I discussed the paper) had difficulty viewing "guilt" as a behavior managers should encourage for any reason. As I thought about it, I found it easier to think of guilt as a label encompassing a productive approach to mistakes and failure and shame as a label describing an unproductive approach. (Apologies to the authors if I've totally corrupted their arguments.)

The paper asserts that aspects of company culture and organization ("social cues") can inspire the productive approach or encourage the negative. If reporting a mistake leads to a verbal beat-down, you'll be less likely to share what you experienced, even if it affects others.

I pulled out three more keys on encouraging productive response to failure. These are aligned with approaches discussed elsewhere on the site and in the book:


  1. Autonomy and control - workers who have more say and control over their work are more likely to respond productively when things go wrong.
  2. Feedback - a culture of rich, candid, timely feedback (even negative feedback) elicits good responses to failure. Bohns and Flynn rightly point out that most managers give negative feedback poorly or not at all; and most employees are incented to minimize/avoid negative feedback.
  3. Appreciating impact on colleagues - being aware that mistakes affect our colleagues (the authors term it "outcome interdependence") causes us to seek to correct them and share information. 
Bohns and Flynn write that "these job characteristics will not always make people feel good." This is true in the short run. Confronting failure is unpleasant and scary. But a culture that is willing to confront mistakes without stigmatizing the individuals who make them is a far better (and likely far more successful) place to work in the long run.  


Tuesday, October 15, 2013

Startups - is it all about execution... or timing?

Mistake stories are amazing resources because you can easily get 30 minutes of valuable dialogue out of a 3-minute story. They are that full of information and insight. One reason is their complexity - they defy easy conclusions or snap judgments. Here's an example - two stories that seem to demonstrate exact opposite truths!

The first is from Dilbert cartoonist Scott Adams, from his upcoming book, "How To Fail At Almost Everything and Still Win Big." The excerpt is from the Wall Street Journal:

In the 1970s, tennis players sometimes used rosin bags to keep their racket hands less sweaty. In college, I built a prototype of a rosin bag that attached to a Velcro strip on tennis shorts so it would always be available when needed. My lawyer told me it wasn't patentworthy because it was simply a combination of two existing products. I approached some sporting-goods companies and got nothing but form-letter rejections. I dropped the idea.

But in the process I learned a valuable lesson: Good ideas have no value because the world already has too many of them. The market rewards execution, not ideas. From that point on, I concentrated on ideas that I could execute. I was already failing toward success, but I didn't yet know it.

The second is the high-drama depiction of the Twitter founding story as excerpted in the New York Times magazine. Albert Wenger, a venture capitalist and early investor in the company, reacted to the Times excerpt:

So why does the Twitter story remind me [that life is unfair]? Because it demonstrates the relative importance of hitting upon the right thing at the right time over early execution. This goes a bit against one of the historic ideas held dear in venture capital that execution matters more than ideas. And yes it remains true that an idea alone is worthless, you have to build something. But beyond that it turns out that building the right thing at the right time will let you get away with all sorts of mistakes. Conversely, hypothetically perfect execution but too early or too late or on the wrong variant will not get you very far.

Who's right? It may depend on your own situation. Certainly, if you're involved in a startup, you could do worse than invest a half-hour discussing these stories and the relative impacts of idea quality, timing, luck and execution with your co-founder. But you may also consider these additional words from Wenger: "Somewhere somebody right now is building the next big thing and most likely it is not you. Just accept that and you’ll be happier."

Monday, October 14, 2013

Dump the participation trophies - let's be candid with kids on success and failure

I coached my kids' soccer teams before they turned 10 (and needed better coaching than I could provide!). One season, my older son's team lost every game they played, most by large margins. It was a very difficult season - probably for us coaches and parents most of all.

This story came back to me when I read "Losing Is Good For You" in the New York Times (thanks Rita McGrath for pointing it out). In this opinion piece, author Ashley Merryman criticizes the trend toward recognizing kids for participation rather than accomplishment.

I have heard this argument before, mostly from highly-competitive parents who dismiss the idea of recognition for anything other than ultimate victory. Merryman's argument is more complex and useful. She asserts that participation trophies dilute the excitement of winning, the toughening power of losing, the honor of competition, and the impetus to improve, no matter what your current abilities are:

When children make mistakes, our job should not be to spin those losses into decorated victories. Instead, our job is to help kids overcome setbacks, to help them see that progress over time is more important than a particular win or loss, and to help them graciously congratulate the child who succeeded when they failed.

I'm with Merryman. Competition is a complex mixture. Winning is fun, and losing is information. Improvement from game to game is also important and valuable. These lessons are important for young people, midcareer adults, everyone.

For the last game of our lost soccer season, our team had to travel nearly an hour and a half to the opposing team's field. It was a very hot May day. Some of our parents decided not to brave the trip, so the team was short-handed. In fact, we had to play one man short on the field the entire game.

Our guys fought terribly hard, and never gave up. They lost 7-0. After the game the other team saluted our players for their grit and determination. Our guys were happy and proud, as were we coaches. It was one of the biggest wins in my coaching experience.

Friday, October 11, 2013

Mistake Bank Bookshelf: "Intuition Pumps and Other Tools For Thinking"

This week’s entry in the Bookshelf is Intuition Pumps And Other Tools for Thinking, by Daniel C. Dennett. A philosopher and longtime teacher at Tufts University, Dennett has created a book with loads and loads of useful tips and hints for getting your head around challenging problems and arguing complex points. But one chapter alone makes it an essential tool for anyone who frequents this site: Chapter 1, “Making Mistakes.”

Dennett sees mistake-making as a means to explore ideas and move projects forward:

Sometimes you don’t just want to risk making mistakes; you actually want to make them – if only to give you something clear and detailed to fix. Making mistakes is the key to making progress. Of course there are times when it is really important not to make any mistakes – ask any surgeon or airline pilot. But it is less widely appreciated that there are also times when making mistakes is the only way to go…. I often find that I have to encourage [students] to cultivate the habit of making mistakes, the best learning opportunities of all. They get “writer’s block” and waste hours forlornly wandering back and forth on the starting line. “Blurt it out!” I urge them. Then they have something on the page to work with.

And this:

The chief trick in making good mistakes is not to hide them – especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are.

I marked highlight after highlight in this chapter. It alone is worth the price of Intuition Pumps. And then you get all the other dozens of “pumps” for free!

Thursday, October 10, 2013

Where does human judgment fit among all these quantitative models and tools we're making?

Some interesting thoughts here regarding complexity in the systems and processes we encounter every day, and our growing reliance on information systems and models to make sense of them. Important stuff when thinking of how mistakes happen and how we can learn from them.

Story: Put things where they belong, a lesson for father and son

One day I was talking to my son George and urging him to use the work-in-progress folder that we had created for him to put all of his assignments in, so they would not get lost. He was resisting, mostly because he's 12 and has little to learn from anyone, especially his parents ;). I used my best reasoning to convince him, telling him that if he has a place for those important things, he will know where they are when he needs them. Finally, I had to basically insist that he use this tool that we provided for him.

Later that day I was gathering my work things together to do some work while at George's soccer practice. I had trouble finding the earpiece for my cellphone. I looked in all the places I thought it might be but couldn't find it. I eventually gave up, and took George to soccer practice. I finished my work there, and as I was packing up my computer, putting it back into my backpack, I noticed that my earpiece was sitting in the bottom of the computer sleeve. It was there, just not in the right spot. So the moral of the story for me was: practice what you preach. Put things where they belong and then they will be there when you need them.

Wednesday, October 9, 2013

Limitations of the "Swiss Cheese" analogy of complex failure

Kind readers of this site have pointed me to several papers related to complex-system failure. I'm grateful for their references - it's helped me process through a lot of my thinking on the subject. Please keep them coming!

One theme I've seen is that of the "Swiss Cheese analogy" for system failure. That is, individual activities or process steps are like slices of Swiss cheese - and the holes are individual errors. When the holes in a bunch of slices in a row line up, that is a systemic failure.

By this analogy, as long as we put actions in place to make sure the holes don't line up, we can avoid large failures. Those actions would be things like peer review, creating redundancy & backup systems, etc. This makes intuitive sense. Just prevent the holes from lining up! But there are two significant cases for which this analogy breaks down.
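
The analogy can be made concrete with a toy simulation. All the numbers here (layer count, hole probability) are invented for illustration, and the layers are assumed to fail independently, which real systems rarely do:

```python
import random

def system_failure_rate(num_layers=5, hole_prob=0.1, trials=100_000, seed=42):
    """Estimate how often every defensive layer fails at once.

    Each layer independently has a 'hole' (an individual error) with
    probability hole_prob; a systemic failure occurs only when the
    holes in every slice line up on the same attempt.
    """
    rng = random.Random(seed)
    failures = sum(
        all(rng.random() < hole_prob for _ in range(num_layers))
        for _ in range(trials)
    )
    return failures / trials

# Five 10%-porous layers make systemic failure rare (about 0.1**5);
# drop to three layers and the rate jumps a hundredfold.
print(system_failure_rate(num_layers=5))
print(system_failure_rate(num_layers=3))
```

The sketch shows how quickly the failure rate collapses as independent layers are added, which is exactly why the "keep the holes from lining up" logic feels so persuasive.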

First is a system that is constantly evolving. The processes and practices that were put in place yesterday did not take into account the change that happened today. For example, a product sales plan is derailed by the increased adoption of a substitute product. Or a doctor's diagnosis is affected by having been involved in a minor car accident on the way to work.

In such cases, new holes are being punched into the cheese. There are new failure states being created all the time, and reliance on process and behavior-type tools won't cover all the contingencies. Worse than that, a false confidence in these tools may allow people to overlook the brand new holes that have appeared.

The second situation is that of a highly unlikely but powerfully impactful event (a Black Swan). In this case, the probability of the event is so small that planners tend to overlook it (or at least underestimate its likelihood - what's the difference between 0.1% and 1%, anyway?). But when it occurs, the hole in the Swiss cheese is so huge that our safety measures can't cover it up.
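
The parenthetical question has a concrete answer once impact enters the picture. A quick expected-loss sketch, with a catastrophe cost invented purely for illustration:

```python
# Expected annual loss = probability of the event x cost when it hits.
# The cost figure below is made up for illustration.
catastrophe_cost = 500_000_000  # a Black-Swan-sized hit

for annual_prob in (0.001, 0.01):
    expected_loss = annual_prob * catastrophe_cost
    print(f"p={annual_prob:.1%}: expected loss ${expected_loss:,.0f}/year")

# Misjudging 1% as 0.1% understates the expected loss tenfold:
# $5M per year of risk gets treated as $500K per year.
```

So the difference between 0.1% and 1% is not a rounding error; when the downside is enormous, it is the whole planning question.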

For the second situation, there is a wealth of information in Nassim Taleb's book Antifragile and on his Facebook page.

The first situation is the one I'm thinking about right now. How do we create an environment where we both create processes to avoid holes lining up, and constantly keep everyone on the lookout for new holes?

Monday, October 7, 2013

4 pages of wisdom: "How Complex Systems Fail"

Roxanne Persaud (@commutiny) has shared an important paper in looking at mistakes and failure. It is "How Complex Systems Fail" (link - PDF) by Dr. Richard Cook, director of the Cognitive Technologies Laboratory at the University of Chicago. The 1998 paper is brief - only 4 pages long - and is in bullet form. In it, Dr. Cook summarizes his thoughts about complex human-developed systems - examples include power generation/distribution systems, health care, transportation.

Some of the headlines represent things I'm thinking about a lot these days: "Complex systems are intrinsically hazardous systems" - that is, failures will occur and not all can be avoided.

"Catastrophe requires multiple failures" - it occurs when "small, systemic failures join to create opportunity for an accident." This means that near misses are important and cannot be ignored - the right combination of near misses means disaster.

"Change introduces new forms of failure." Complex systems evolve, and what worked in the past may not work in the future due to the context and actors changing.

It's a terrific paper. One headline, "Human practitioners are the adaptable element of complex systems," makes me realize how important it is for workers in complex environments (that means all knowledge workers, managers, salespeople, customer service reps, nurses, etc.) to be aware they work in a complex environment. How many of those millions of workers don't even realize they are part of a system that breaks often, and that they can help repair by being cognizant of small mistakes and changes, and by sharing those lessons?

Sunday, October 6, 2013

Doctor realizes she is no different from her patients when it comes to changing habits

From the New York Times, Dr. Danielle Ofri on the difficulty of breaking the habit of the annual checkup.

We doctors constantly lament how difficult it is to get our patients to change their behavior. We rant about those who won’t take their meds, who won’t quit smoking, who never exercise. But the truth is, we are equally intransigent when it comes to changing our own behaviors as caregivers....

The problem is, most of us are just like our patients — we often ignore good advice when it conflicts with what we’ve always done.

I thought about this as I read the latest recommendations from the Choosing Wisely campaign — a project led by the American Board of Internal Medicine to inform doctors and patients about overused and ineffective tests and treatments. Medical groups were asked to list five things in their field that are often overutilized but don’t offer much benefit.

Last month, my specialty group — the Society of General Internal Medicine — released its Choosing Wisely recommendations. No. 2 was: “Don’t perform routine general health checks for asymptomatic adults.”

This runs counter to a basic pillar in medicine that doctors and patients remain strongly attached to: the annual checkup. This is our chance to do screening tests and vaccinations and to discuss a healthy lifestyle. Anecdotally, we all can cite examples of checkups that uncovered serious illness. But the scientific evidence shows that on balance, the harm of annual visits — overdiagnosis, overtreatment, excess costs — can outweigh the benefits.

Yet, I still do them. Each time I see a healthy patient, I close the visit by saying, “See you in a year.” It’s a reflex.

After the research was initially published last year, I grappled with the evidence, or lack thereof, reaching a conclusion that I mainly still supported the annual visit, if only because it establishes a solid doctor-patient relationship. But seeing these new, strongly worded recommendations, I may have to re-evaluate. At the very least, I should take a moment to think before I reflexively recommend the annual visit. But I know that I might still end up doing the same thing, despite the evidence.

Humans are creatures of habit. Our default is to continue on the path we’ve always trod. If we doctors can recognize that impulse in ourselves, it will give us a dose of empathy for our patients, who are struggling with the same challenges when it comes to changing behavior.

Dr. Ofri is very unlikely to stop recommending the annual checkup. As she writes, changing habits is difficult. Without a determined objective ("I will urge my patients to follow the guideline and not schedule a routine physical if they don't show any symptoms"), she will almost certainly revert to what she's always done. Her article is a fascinating demonstration that ignoring recommendations is not limited to patients. Doctors are just as prone to wave off advice, even advice they believe to be best for themselves and their patients.

But if she is serious about the Choosing Wisely recommendation, she should first set a clear objective to do it, and not simply "take a moment to think before I reflexively recommend." What would she say to her patients who treated one of her recommendations that way?

Thursday, October 3, 2013

Duke "Anatomy of an Error" course: is every mistake human error or a process defect?

I've been thinking about how to teach "mistake appreciation," for lack of a better term, to knowledge professionals. As such, I am very attuned to discussions of errors/mistakes/failures, including definitions and prescriptions. So I was very interested when @whatsthepont shared the Duke "Anatomy of an Error" online course (discussed earlier this week). Here's a graphic from my recent materials (adapted from Amy Edmondson's work):



Health care is a business that sits squarely in the complex domain - the center of the spectrum. And as you can see, failures are unavoidable. How important, therefore, is it to teach people involved in this process how to understand, prevent and learn from errors? Very important, of course. So the Duke course is admirable.

However, my concern about the course is that it seems to hold a point of view that if workers acted carefully enough, and process designers were thorough enough, errors in health care could be eliminated completely. Here's an example of what I'm talking about from the course:

Because humans are fallible, we must rely upon systems and back-ups to prevent or detect errors before they can cause harm. Unfortunately, our systems are not always designed well to achieve this. System and design factors that can lead to bad outcomes include:
Complexity:
  • Too many steps
  • Too many people (communication issues)
Workload:
  • Too heavy or too light (performance is best when workload is moderate)
  • Too much reliance on human vigilance/monitoring
Poor design:
  • Focus on functionality, while ignoring the real-life user 

This presumes that dealing with all the bulleted items will avoid all possible bad outcomes. This thinking is just wrong. Inherent in a complex process are changing interactions between different people and the environment. Things evolve. Health care is no different. As a result, unexpected outcomes can occur even if workers operate with the utmost care and the processes have been designed to the utmost quality. Because things change around them.

This is all the more reason to take a more open view of error than one where errors are created simply by tired workers or inadequate processes. Thriving with this open view requires a culture that does not run on "stored patterns of pre-programmed instructions," but that is mindful, aware, and welcomes observations and insights from anyone in the organization (not just doctors).

In addition to "Anatomy of an Error," Duke Medical Center would be well advised to teach its workers to be very aware of surprises and disappointments - to use a spy-novel term, to "expect the unexpected." If well-trained, well-rested staff perform a well-designed process and something still goes awry, that is not a defect; it is vital information. It means something has not been taken into account, that the environment has changed, and that a cool, clearheaded review is in order. It could be the appearance of a new strain of resistant bacteria, a new designer drug, or simply a unique patient situation. The worst possible step is to hide the error or to engage in "quiet fixing."

This is the next level of training that groups like Duke need to embark on. They should be commended for training their staff on this weird topic of errors and mistakes. But they can't stop there; there's much left to do.