Thursday, October 31, 2013

Story: Startups, don't focus on competition. Instead, "make your own business work"

Very cool post from Brett Martin in which he provides a candid postmortem on his tech startup, Sonar. There are lots of great mistake stories in the piece; here's one:

In the run up to SXSW 2012 when the insider media had fabricated Highlight as heir to the throne and some of our more fair weather investors had written us off, my confidence was against the ropes. We reordered our roadmap to rush out comparable features but were now BEHIND. I put on my best brave face but inside my gut was rotting away. I still remember thinking on the flight to Austin “fck, we had it, and now we are going to lose it.”

Oops! Highlight never went anywhere but we definitely wasted a ton of energy and sleep “responding to the threat” when we should have been figuring out how to make our own business work.

Lesson Learned:

Be steady at the wheel. The only way one startup can kill another startup is by getting into the other’s head and leading them off a cliff.

If you don’t believe me, try this proof. Are your competitors releasing a bunch of the same features that you have on your roadmap? Yes? Do you know what consumers want*? No? Great, then neither do your competitors. Get back to figuring out what users want!

*Hint: If you did, you would already have traction.

Hat tip Failcon.

Tuesday, October 29, 2013

To reduce the pain of failed projects, get small

I enjoyed Gretchen Gavett's post last week on the HBR Blog Network entitled "The Hidden Indicators of a Failing Project." In it, she discusses how to determine whether projects are going bad (before costly, late public failures, such as the launch of the healthcare.gov website).

Gavett rightly points out that we have biases that prevent us from admitting that our project may not be going as well as we'd like - such as the urge to avoid the recriminations and criticism that come with calling out a project that is going off the rails. The "quiet fixing" mentality also rules, as one of Gavett's sources states: "people actually think they can turn [a failing project] around, so they don’t bring it up." She passes along several pieces of advice to help diagnose problem projects: e.g., cast a wide net of knowledge, revisit requirements regularly, etc.

In my view, the most effective way to prevent big, expensive project failures is to break projects up into smaller chunks. Large projects have large, abstract goals and take a long time to complete - and a long time before end customers get a look at what was delivered (see: healthcare.gov). In uncertain situations (i.e., most projects), it is better to have clear goals than a completely defined plan.

When projects are decomposed into smaller deliverables, each chunk can be specified to deliver value directly to the end customer - instead of abstract deliverables such as diagrams, specs, etc. The customer (as opposed to project team members) determines whether the work meets requirements. Smaller projects with clear objectives are easier to measure. Due to this clarity, failures are not only less frequent, but are discovered more quickly and are more contained. The inevitable changes to project requirements are absorbed more easily because smaller pieces can be adapted cheaply. The epitome of this approach is the Toyota Production System, which pushes improvement responsibility to the lowest possible level on the factory floor and, through many, many iterations of tiny projects, adapts a highly complex production process to the changing needs of the global car market.

So, to reduce the cost and pain of large project failures, do one thing: get small.

Thursday, October 24, 2013

Mistake Bank Bookshelf: "Creative Confidence" by Tom & David Kelley

This week we are profiling the Kelley brothers' book Creative Confidence: Unleashing the Creative Potential Within Us All. Without question, one of the purposes of this site is to take lessons about embracing failure and iteration, learned and applied in the creative community, and pull them into mainstream business (while perhaps reinforcing those ideas for folks in the creative space as well). The Kelleys are perfectly positioned to contribute to this. David Kelley is the founder of the legendary design firm IDEO (creator of products such as the Palm V PDA and Crest's stand-up toothpaste tube) and co-founder of the Stanford d.school. Tom Kelley is a partner at IDEO and teaches at UC Berkeley's Haas School of Business.

Let me first say that Creative Confidence is a handsome book, printed on heavy, glossy, magazine-style paper, with color drawings, pencil sketches, graphics and photographs. It's a book that provides a memorable experience in print - I'd highly recommend you purchase it in that format. I imagine it could be amazing on an iPad, but not so much on your Kindle (sorry).

In the book, there is extensive discussion of the role of failure in innovation. Faithful readers of this site will not find much new in this discussion (for example, the vital research of Carol Dweck), but it is a good summary for those exploring the topic.

Creative Confidence has many, many tips for improving your own creativity (for example, building "karaoke confidence" - the ability to discount "fear of failure and judgment"), increasing your creative output, and facilitating brainstorming sessions.

Probably the most useful part of the book for me was its take on feedback. It's a vital topic - most negative feedback inhibits people's creativity and innovation by summarily rejecting new ideas ("that won't work," "we couldn't do that here"). But positive feedback unbalanced by critique is just as bad, enabling poor projects to linger or allowing promising projects to stray from the path to success.

Feedback givers need to be kind, but also crisp and clear. Feedback recipients need to be open and careful listeners, as well as shrewd editors (some feedback will be off the mark, some right on it - telling the difference is vital). The Kelleys recommend an "I like/I wish" tool for providing feedback, including in a group setting. "I like" covers the positive things you drew from the prototype/talk/meeting/etc.; "I wish" offers a suggestion for improvement. For example, "I liked how your talk covered the early creation of beer in Mesopotamia. I wish you had brought samples for us to try!"

This advice is superb and indicative of the quality of the book as a whole. Creative Confidence is fun and engaging, and will help you be more creative and innovative if you follow its advice.

Tuesday, October 22, 2013

Business owners should aim for this type of failure

A great piece from Jeff Haden in Inc. magazine. Jeff points out that business owners rarely "fail" inside their businesses, as the people who work there have every incentive to agree with the boss. This can create what Jeff calls a "king and queen" mentality that discourages employees from giving owners negative feedback. It can also cause bosses to rationalize away business failures and blame their employees' performance instead.

As a result, he suggests that business owners find something to fail at, to remember what that feels like and to cultivate an attitude of humility they can take back into their leadership roles.

Failures could include setting stretch goals for sports or fitness activities, learning a new skill, or even trying to do one of your employees' jobs. While there may be an element of deliberate mistakes involved, Jeff is really focusing on the humility and understanding created by trying your best and coming up a bit short, something that most big bosses rarely experience.

Thursday, October 17, 2013

Tips for providing negative feedback

In the post from yesterday on encouraging productive responses to mistakes, I cited giving timely, precise feedback as an important component. But giving useful negative feedback is hard, so I was happy to read this post from Michael Roberto (based on this Fast Company article) with good tips for doing just that.

In addition to the tips listed, I'd add one more: announce your intention. Ask, "Is it OK if I give you some feedback?" This will allow the recipient to prepare him/herself for what's to come. Their answer of "yes, sure" will open the door for a useful dialogue, more so than if you lay it on them without warning.

Wednesday, October 16, 2013

How to encourage productive reactions to mistakes in the workplace

A terrific paper came out late last year on the psychological impact of employee mistakes and how to promote more productive approaches to dealing with mistakes or failures at work. The paper is entitled "Guilt By Design: Structuring Organizations to Elicit Guilt as an Affective Reaction to Failure," by Vanessa Bohns of the University of Waterloo and Francis Flynn of Stanford Graduate School of Business.

Bohns and Flynn contrast two reactions to failure: one is guilt, and the other is shame. The authors conclude that the guilt response is more productive than the shame response because, as they write, "guilt is more likely to inspire employees to rectify their mistakes rather than to dwell on them or react in other nonconstructive ways." Guilty feelings derive from the knowledge that others have been let down, while feelings of shame signal that the person is inadequate in some way. Bohns and Flynn conclude that the guilt feeling will "increase motivation and performance" when dealing with failure, while the shame feeling will decrease it - resulting in outcomes such as ignoring, hiding or blaming others. So managers should aim to promote guilt rather than shame when dealing with failures in their organizations.

I (and some colleagues with whom I discussed the paper) had difficulty viewing "guilt" as something managers should encourage for any reason. As I thought about it, I found it easier to think of guilt as a label for a productive approach to mistakes and failure, and shame as a label for an unproductive approach. (Apologies to the authors if I've totally corrupted their arguments.)

The paper asserts that aspects of company culture and organization ("social cues") can inspire the productive approach or encourage the unproductive one. If reporting a mistake leads to a verbal beat-down, you'll be less likely to share what you experienced, even if it affects others.

I pulled out three more keys to encouraging a productive response to failure. These are aligned with approaches discussed elsewhere on the site and in the book:


  1. Autonomy and control - workers who have more say and control over their work are more likely to respond productively when things go wrong.
  2. Feedback - a culture of rich, candid, timely feedback (even negative feedback) elicits good responses to failure. Bohns and Flynn rightly point out that most managers give negative feedback poorly or not at all, and most employees are incented to minimize or avoid negative feedback.
  3. Appreciating impact on colleagues - being aware that mistakes affect our colleagues (the authors term it "outcome interdependence") causes us to seek to correct them and share information. 

Bohns and Flynn write that "these job characteristics will not always make people feel good." This is true in the short run. Confronting failure is unpleasant and scary. But a culture that is willing to confront mistakes without stigmatizing the individuals who make them is a far better (and likely far more successful) place to work in the long run.


Tuesday, October 15, 2013

Startups - is it all about execution... or timing?

Mistake stories are amazing resources because you can easily get 30 minutes of valuable dialogue out of a 3-minute story. They are that full of information and insight. One reason is their complexity - they defy easy conclusions or snap judgments. Here's an example - two stories that seem to demonstrate exactly opposite truths!

The first is from Dilbert cartoonist Scott Adams, from his upcoming book, "How To Fail At Almost Everything and Still Win Big." The excerpt is from the Wall Street Journal:

In the 1970s, tennis players sometimes used rosin bags to keep their racket hands less sweaty. In college, I built a prototype of a rosin bag that attached to a Velcro strip on tennis shorts so it would always be available when needed. My lawyer told me it wasn't patentworthy because it was simply a combination of two existing products. I approached some sporting-goods companies and got nothing but form-letter rejections. I dropped the idea.

But in the process I learned a valuable lesson: Good ideas have no value because the world already has too many of them. The market rewards execution, not ideas. From that point on, I concentrated on ideas that I could execute. I was already failing toward success, but I didn't yet know it.

The second is the high-drama depiction of the Twitter founding story as excerpted in the New York Times Magazine. Albert Wenger, a venture capitalist and early investor in the company, reacted to the Times excerpt:

So why does the Twitter story remind me [that life is unfair]? Because it demonstrates the relative importance of hitting upon the right thing at the right time over early execution. This goes a bit against one of the historic ideas held dear in venture capital that execution matters more than ideas. And yes it remains true that an idea alone is worthless, you have to build something. But beyond that it turns out that building the right thing at the right time will let you get away with all sorts of mistakes. Conversely, hypothetically perfect execution but too early or too late or on the wrong variant will not get you very far.

Who's right? It may depend on your own situation. Certainly, if you're involved in a startup, you could do worse than invest a half-hour discussing these stories and the relative impacts of idea quality, timing, luck and execution with your co-founder. But you may also consider these additional words from Wenger: "Somewhere somebody right now is building the next big thing and most likely it is not you. Just accept that and you’ll be happier."

Monday, October 14, 2013

Dump the participation trophies - let's be candid with kids on success and failure

I coached my kids' soccer teams before they turned 10 (and needed better coaching than I could provide!). One season, my older son's team lost every game they played, most by large margins. It was a very difficult season - probably for us coaches and parents most of all.

This story came back to me when I read "Losing Is Good For You" in the New York Times (thanks Rita McGrath for pointing it out). In this opinion piece, author Ashley Merryman criticizes the trend toward recognizing kids for participation rather than accomplishment.

I have heard this argument before, mostly from highly-competitive parents who dismiss the idea of recognition for anything other than ultimate victory. Merryman's argument is more complex and useful. She asserts that participation trophies dilute the excitement of winning, the toughening power of losing, the honor of competition, and the impetus to improve, no matter what your current abilities are:

When children make mistakes, our job should not be to spin those losses into decorated victories. Instead, our job is to help kids overcome setbacks, to help them see that progress over time is more important than a particular win or loss, and to help them graciously congratulate the child who succeeded when they failed.

I'm with Merryman. Competition is a complex mixture. Winning is fun, and losing is information. Improvement from game to game is also important and valuable. These lessons are important for young people, midcareer adults, everyone.

For the last game of our lost soccer season, our team had to travel nearly an hour and a half to the opposing team's field. It was a very hot May day. Some of our parents decided not to brave the trip, so the team was short-handed. In fact, we had to play one man short on the field the entire game.

Our guys fought terribly hard, and never gave up. They lost 7-0. After the game the other team saluted our players for their grit and determination. Our guys were happy and proud, as were we coaches. It was one of the biggest wins in my coaching experience.

Friday, October 11, 2013

Mistake Bank Bookshelf: "Intuition Pumps and Other Tools For Thinking"

This week’s entry in the Bookshelf is Intuition Pumps And Other Tools for Thinking, by Daniel C. Dennett. A philosopher and longtime teacher at Tufts University, Dennett has created a book with loads and loads of useful tips and hints for getting your head around challenging problems and arguing complex points. But one chapter alone makes it an essential tool for anyone who frequents this site: Chapter 1, “Making Mistakes.”

Dennett sees mistake-making as a means to explore ideas and move projects forward:

Sometimes you don’t just want to risk making mistakes; you actually want to make them – if only to give you something clear and detailed to fix. Making mistakes is the key to making progress. Of course there are times when it is really important not to make any mistakes – ask any surgeon or airline pilot. But it is less widely appreciated that there are also times when making mistakes is the only way to go…. I often find that I have to encourage [students] to cultivate the habit of making mistakes, the best learning opportunities of all. They get “writer’s block” and waste hours forlornly wandering back and forth on the starting line. “Blurt it out!” I urge them. Then they have something on the page to work with.

And this:

The chief trick in making good mistakes is not to hide them – especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are.

I marked highlight after highlight in this chapter. It alone is worth the price of Intuition Pumps. And then you get all the other dozens of “pumps” for free!

Thursday, October 10, 2013

Where does human judgment fit among all these quantitative models and tools we're making?

Some interesting thoughts here regarding complexity in the systems and processes we encounter every day, and our growing reliance on information systems and models to make sense of them. Important stuff when thinking of how mistakes happen and how we can learn from them.

Story: Put things where they belong, a lesson for father and son

One day I was talking to my son George and urging him to use the work-in-progress folder that we had created for him to put all of his assignments in, so they would not get lost. He was resisting, mostly because he's 12 and has little to learn from anyone, especially his parents ;). I used my best reasoning to convince him, telling him that if he has a place for those important things, he will know where they are when he needs them. Finally, I had to basically insist that he use this tool that we provided for him.

Later that day I was gathering my things together to do some work at George's soccer practice. I had trouble finding the earpiece for my cellphone. I looked in all the places I thought it might be but couldn't find it. I eventually gave up and took George to soccer practice. I finished my work there, and as I was packing up my computer, putting it back into my backpack, I noticed that my earpiece was sitting in the bottom of the computer sleeve. It was there, just not in the right spot. So the moral of the story for me was: practice what you preach. Put things where they belong and they will be there when you need them.

Wednesday, October 9, 2013

Limitations of the "Swiss Cheese" analogy of complex failure

Kind readers of this site have pointed me to several papers related to complex-system failure. I'm grateful for the references - they've helped me work through a lot of my thinking on the subject. Please keep them coming!

One theme I've seen is that of the "Swiss Cheese analogy" for system failure. That is, individual activities or process steps are like slices of Swiss cheese - and the holes are individual errors. When the holes in a bunch of slices in a row line up, that is a systemic failure.

By this analogy, as long as we put actions in place to make sure the holes don't line up, we can avoid large failures. Those actions would be things like peer review, creating redundancy & backup systems, etc. This makes intuitive sense. Just prevent the holes from lining up! But there are two significant cases for which this analogy breaks down.
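
To make that intuition concrete, here's a minimal simulation sketch (in Python, with hypothetical hole probabilities chosen purely for illustration) of why stacking independent defensive layers works: the chance of every hole lining up shrinks multiplicatively.

```python
import random

def system_fails(layer_hole_probs):
    """One trial: a latent error causes harm only if it slips through
    every defensive layer (each layer's 'hole' lines up independently)."""
    return all(random.random() < p for p in layer_hole_probs)

# Hypothetical numbers: four defensive layers (say, peer review, testing,
# monitoring, a backup system), each missing a given error 10% of the time.
layers = [0.10, 0.10, 0.10, 0.10]

trials = 1_000_000
failures = sum(system_fails(layers) for _ in range(trials))

print(f"Simulated systemic failure rate: {failures / trials:.5f}")
print(f"Analytic rate (product of hole probabilities): {0.10 ** 4:.5f}")
# Both land near 0.0001. Stacking independent layers works -- but only
# while the hole probabilities stay fixed and independent, which is
# exactly the assumption the two cases below break.
```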

First is a system that is constantly evolving. The processes and practices that were put in place yesterday did not take into account the change that happened today. For example, a product sales plan is derailed by the increased adoption of a substitute product. Or a doctor's diagnosis is affected by having been involved in a minor car accident on the way to work.

In these examples, new holes are being punched into the cheese. There are new failure states being created all the time, and reliance on process- and behavior-type tools won't cover all the contingencies. Worse than that, a false confidence in these tools may allow people to overlook the brand-new holes that have appeared.

The second situation is that of a highly unlikely but powerfully impactful event (a Black Swan). In this case, the probability of the event is so small that planners tend to overlook it (or at least underestimate its likelihood - what's the difference between 0.1% and 1%, anyway?). But when it occurs, the hole in the Swiss cheese is so huge that our safety measures can't cover it up.
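
To see why that rounding matters, here's a quick back-of-the-envelope sketch (the dollar figure is hypothetical, chosen just for illustration): for a high-impact event, the gap between 0.1% and 1% is a tenfold difference in expected loss, and both are material.

```python
# Hypothetical figures, purely for illustration: a rare event that would
# cost $100 million if it occurred.
impact = 100_000_000  # dollars

for annual_prob in (0.001, 0.01):  # the "0.1% vs 1%" a planner shrugs at
    expected_loss = annual_prob * impact
    print(f"p = {annual_prob:.1%}: expected annual loss = ${expected_loss:,.0f}")

# p = 0.1%: expected annual loss = $100,000
# p = 1.0%: expected annual loss = $1,000,000
# A tenfold difference, and material either way -- yet safety measures are
# typically sized for the everyday holes, not for one this large.
```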

For the second situation, there is a wealth of information in Nassim Taleb's book Antifragile and on his Facebook page.

The first situation is the one I'm thinking about right now. How do we create an environment where we both put processes in place to keep holes from lining up and keep everyone constantly on the lookout for new holes?

Monday, October 7, 2013

4 pages of wisdom: "How Complex Systems Fail"

Roxanne Persaud (@commutiny) has shared an important paper on mistakes and failure. It is "How Complex Systems Fail" (link - PDF) by Dr. Richard Cook, director of the Cognitive Technologies Laboratory at the University of Chicago. The 1998 paper is brief - only 4 pages long - and is in bullet form. In it, Dr. Cook summarizes his thoughts about complex human-developed systems - examples include power generation/distribution systems, health care, and transportation.

Some of the headlines represent things I'm thinking about a lot these days: "Complex systems are intrinsically hazardous systems" - that is, failures will occur and not all can be avoided.

"Catastrophe requires multiple failures" - it occurs when "small, systemic failures join to create opportunity for an accident." This means that near misses are important and cannot be ignored - the right combination of near misses means disaster.

"Change introduces new forms of failure." Complex systems evolve, and what worked in the past may not work in the future due to the context and actors changing.

It's a terrific paper. One headline, "Human practitioners are the adaptable element of complex systems," makes me realize how important it is for workers in complex environments (that means all knowledge workers, managers, salespeople, customer service reps, nurses, etc.) to be aware they work in a complex environment. How many of those millions of workers don't even realize they are part of a system that breaks often, and that they can help repair by being cognizant of small mistakes and changes, and by sharing those lessons?

Sunday, October 6, 2013

Doctor realizes she is no different from her patients when it comes to changing habits

From the New York Times, Dr. Danielle Ofri on the difficulty of breaking the habit of the annual checkup.

We doctors constantly lament how difficult it is to get our patients to change their behavior. We rant about those who won’t take their meds, who won’t quit smoking, who never exercise. But the truth is, we are equally intransigent when it comes to changing our own behaviors as caregivers....

The problem is, most of us are just like our patients — we often ignore good advice when it conflicts with what we’ve always done.

I thought about this as I read the latest recommendations from the Choosing Wisely campaign — a project led by the American Board of Internal Medicine to inform doctors and patients about overused and ineffective tests and treatments. Medical groups were asked to list five things in their field that are often overutilized but don’t offer much benefit.

Last month, my specialty group — the Society of General Internal Medicine — released its Choosing Wisely recommendations. No. 2 was: “Don’t perform routine general health checks for asymptomatic adults.”

This runs counter to a basic pillar in medicine that doctors and patients remain strongly attached to: the annual checkup. This is our chance to do screening tests and vaccinations and to discuss a healthy lifestyle. Anecdotally, we all can cite examples of checkups that uncovered serious illness. But the scientific evidence shows that on balance, the harm of annual visits — overdiagnosis, overtreatment, excess costs — can outweigh the benefits.

Yet, I still do them. Each time I see a healthy patient, I close the visit by saying, “See you in a year.” It’s a reflex.

After the research was initially published last year, I grappled with the evidence, or lack thereof, reaching a conclusion that I mainly still supported the annual visit, if only because it establishes a solid doctor-patient relationship. But seeing these new, strongly worded recommendations, I may have to re-evaluate. At the very least, I should take a moment to think before I reflexively recommend the annual visit. But I know that I might still end up doing the same thing, despite the evidence.

Humans are creatures of habit. Our default is to continue on the path we’ve always trod. If we doctors can recognize that impulse in ourselves, it will give us a dose of empathy for our patients, who are struggling with the same challenges when it comes to changing behavior.

Dr. Ofri is very unlikely to stop recommending the annual checkup. As she writes, changing habits is difficult. Without a determined objective ("I will urge my patients to follow the guideline and not schedule a routine physical if they don't show any symptoms"), she will certainly revert to what she's always done. Her article is a fascinating demonstration that ignoring recommendations is not limited to mere patients. Professional doctors are just as prone to wave things off, even things they believe to be best for themselves and their patients.

But if she is serious about the Choosing Wisely recommendation, she should first set a clear objective to do it, and not simply "take a moment to think before I reflexively recommend." What would she say to her patients who treated one of her recommendations that way?

Thursday, October 3, 2013

Duke "Anatomy of an Error" course: is every mistake human error or a process defect?

I've been thinking about how to teach "mistake appreciation," for lack of a better term, to knowledge professionals. As such, I am very attuned to discussions of errors/mistakes/failures, including definitions and prescriptions. So I was very interested when @whatsthepont shared the Duke "Anatomy of an Error" online course (discussed earlier this week). Here's a graphic from my recent materials (adapted from Amy Edmondson's work):

Health care is a business that sits squarely in the complex domain - the center of the spectrum. And as you can see, failures are unavoidable. How important, therefore, is it to teach people involved in this process how to understand, prevent and learn from errors? Very important, of course. So the Duke course is admirable.

However, my concern about the course is that it seems to hold a point of view that if workers acted carefully enough, and process designers were thorough enough, errors in health care could be eliminated completely. Here's an example of what I'm talking about from the course:

Because humans are fallible, we must rely upon systems and back-ups to prevent or detect errors before they can cause harm. Unfortunately, our systems are not always designed well to achieve this. System and design factors that can lead to bad outcomes include:
Complexity:
  • Too many steps
  • Too many people (communication issues)
Workload:
  • Too heavy or too light (performance is best when workload is moderate)
  • Too much reliance on human vigilance/monitoring
Poor design:
  • Focus on functionality, while ignoring the real-life user 

This presumes that dealing with all the bulleted items will avoid all possible bad outcomes. This thinking is just wrong. Inherent in any complex process are changing interactions between people and the environment. Things evolve. Health care is no different. As a result, unexpected results can occur even if workers act with the utmost care and the processes have been designed to the utmost quality - because things change around them.

This is all the more reason to take a broader view of error than one that attributes it simply to tired workers or inadequate processes. And to thrive with this open view requires a culture that does not work by "stored patterns of pre-programmed instructions," but that is mindful, aware, and welcomes observations and insights from anyone in the organization (not just doctors).

In addition to "Anatomy of an Error," Duke Medical Center would be well advised to teach its workers to be very aware of surprises and disappointments - to use a spy novel term, "expect the unexpected." If well-trained, well-rested staff perform a well-designed process and something still goes awry, that is not a defect; it is vital information. It means something has not been taken into account, that the environment has changed, and that a cool, clearheaded review is in order. It could be the appearance of a new strain of resistant bacteria, a new designer drug, or simply a unique patient situation. The worst possible step is to hide the error or to engage in "quiet fixing."

This is the next level of training that groups like Duke need to embark on. They should be commended for training their staff on this weird topic of errors and mistakes. But they can't stop there; there's much left to do.