Apple R&D and Steve Jobs Methodology: Long-term vision

Let us continue our discussion of the Steve Jobs methodology. We discussed user-centric design as the fundamental tenet of new product development under Jobs. Let us now focus on the long-term vision. I will be gleaning information from several sources to see if we can build a better picture. Let us start off with the transcript of an interview with ex-Apple CEO John Sculley.

Steve had this perspective that always started with the user’s experience; and that industrial design was an incredibly important part of that user impression. And he recruited me to Apple because he believed that the computer was eventually going to become a consumer product. That was an outrageous idea back in the early 1980s because people thought that personal computers were just smaller versions of bigger computers. That’s how IBM looked at it.
emphasis added

However, a long-term vision is not enough.  One has to break it down into manageable steps.  This is likely more art than science because many of the necessary technologies are not quite ready at the time the vision is generated.  So the idea is to have a clear vision in mind while working through the steps over the long term.

He was a person of huge vision. But he was also a person that believed in the precise detail of every step. He was methodical and careful about everything — a perfectionist to the end.

The key capability for an R&D manager, then, is to look at roadmaps of the different technologies required to achieve the long-term vision and identify what to invest in, and when.  Even when technologies do exist, they may not be implementable because of the difficulty of integrating them into a user-centric design.  So, R&D managers need to be able to combine a vision with technology roadmaps AND integrate them into a user-centric design:

On one level he is working at the “change the world,” the big concept. At the other level he is working down at the details of what it takes to actually build a product and design the software, the hardware, the systems design and eventually the applications, the peripheral products that connect to it.

Let us now use the Wired article The Untold Story: How the iPhone Blew Up the Wireless Industry to dig a bit deeper into this concept of a long-term vision aligned with technology plans and user experience.

In 2002, shortly after the first iPod was released, Jobs started thinking about developing a phone. He saw millions of Americans lugging separate phones, BlackBerrys, and — now — MP3 players; naturally, consumers would prefer just one device. He also saw a future in which cell phones and mobile email devices would amass ever more features, eventually challenging the iPod’s dominance as a music player. To protect his new product line, Jobs knew he would eventually need to venture into the wireless world.
emphasis added

Many people were talking about convergence back then.  But vision is not enough; it needs to be aligned with technology roadmaps.

If the idea was obvious, so were the obstacles. Data networks were sluggish and not ready for a full-blown handheld Internet device. An iPhone would require Apple to create a completely new operating system; the iPod’s OS wasn’t sophisticated enough to manage complicated networking or graphics, and even a scaled-down version of OS X would be too much for a cell phone chip to handle. Apple would be facing strong competition, too: In 2003, consumers had flocked to the Palm Treo 600, which merged a phone, PDA, and BlackBerry into one slick package. That proved there was demand for a so-called convergence device, but it also raised the bar for Apple’s engineers.

One way to explore a vision as complex as a convergence device is to experiment.  It is not enough to just trust one's gut feeling about the vision; it is important to take small steps.

So that summer, while he publicly denied he would build an Apple phone, Jobs was working on his entry into the mobile phone industry. In an effort to bypass the carriers, he approached Motorola. It seemed like an easy fix: The handset maker had released the wildly popular RAZR, and Jobs knew Ed Zander, Motorola’s CEO at the time, from Zander’s days as an executive at Sun Microsystems. A deal would allow Apple to concentrate on developing the music software, while Motorola and the carrier, Cingular, could hash out the complicated hardware details.

However, the results of experiments may not be as successful as hoped.

Of course, Jobs’ plan assumed that Motorola would produce a successor worthy of the RAZR, but it soon became clear that wasn’t going to happen. The three companies dickered over pretty much everything — how songs would get into the phone, how much music could be stored there, even how each company’s name would be displayed. And when the first prototypes showed up at the end of 2004, there was another problem: The gadget itself was ugly.

R&D managers need to have faith in their long-term vision in order to learn from failures and continue.  As we have discussed, Nokia had a prototype touch-screen smartphone in 2004 as well, but chose not to pursue it any further.  Here is what Jobs did:

Jobs delivered a three-part message to Cingular: Apple had the technology to build something truly revolutionary, “light-years ahead of anything else.” Apple was prepared to consider an exclusive arrangement to get that deal done. But Apple was also prepared to buy wireless minutes wholesale and become a de facto carrier itself.

But the faith in the vision should always be supported by robust technology plans (or R&D plans in general):

Jobs had reason to be confident. Apple’s hardware engineers had spent about a year working on touchscreen technology for a tablet PC and had convinced him that they could build a similar interface for a phone. Plus, thanks to the release of the ARM11 chip, cell phone processors were finally fast and efficient enough to power a device that combined the functionality of a phone, a computer, and an iPod. And wireless minutes had become cheap enough that Apple could resell them to customers; companies like Virgin were already doing so.

Even a robust technology plan is not enough.  One has to bring it together into a user-centric design.  Finally, maybe we could use a process like spiral development to support this endeavor.

They built a prototype of a phone, embedded on an iPod, that used the clickwheel as a dialer, but it could only select and dial numbers — not surf the Net. So, in early 2006, just as Apple engineers were finishing their yearlong effort to revise OS X to work with Intel chips, Apple began the process of rewriting OS X again for the iPhone.

For more, please continue on to the next component of the Steve Jobs methodology: Engaged Leader.


When to rely on gut feelings

We have discussed papers and empirical data showing that reliance on gut feelings often produces sub-optimal results. Now we have a great explanation of why we should be careful about depending on intuition from the behavioral economist Dan Ariely (in the McKinsey Quarterly interview Dan Ariely on irrationality in the workplace):

One way to think about it is the following: imagine you stand on a field and you have a soccer ball and you kick it. You close your eyes and you kick it and then you open your eyes and you try to predict, where did the ball fall? Imagine you do this a thousand times; after a while you know exactly the relationship between your kick and where the ball is. Those are the conditions in which intuitions are correct—when we have plenty of experience and we have unambiguous feedback.

That’s learning, right? And we’re very good at it. But imagine something else happened. Imagine you close your eyes, you kick the ball, and then somebody picked it up and moved it 50 feet to the right or to the left or any kind of other random component. Then ask yourself, how good will you be in predicting where it would land? And the answer is: terrible.

The moment I add a random component, performance goes away very quickly. And the world in which executives live in is a world with lots of random elements. Now I don’t mean random that somebody really moves the ball, but you have a random component here, which you don’t control—it’s controlled by your competitors, the weather; there’s lots of things that are outside of your consideration. And it turns out, in those worlds, people are really bad.
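Ariely's kicked-ball thought experiment can be sketched as a quick simulation. The specific numbers below (a 50-unit kick range, a uniform random displacement) are my own illustrative assumptions, not from the interview; the point is only that the same amount of practice yields sharply different prediction accuracy once a random component is added:

```python
import random
import statistics

def intuition_error(kicks: int, noise: float, seed: int = 1) -> float:
    """Mean absolute prediction error after `kicks` practice kicks.

    `noise` models someone moving the ball a random distance after it
    lands (the uncontrolled component Ariely describes).  All constants
    here are illustrative assumptions.
    """
    rng = random.Random(seed)
    # Practice phase: kick with random effort, observe where the ball lands.
    efforts, landings = [], []
    for _ in range(kicks):
        effort = rng.uniform(0.1, 1.0)
        efforts.append(effort)
        landings.append(50 * effort + rng.uniform(-noise, noise))
    # The learned "intuition": average distance produced per unit of effort.
    learned_rate = sum(landings) / sum(efforts)
    # Test phase: predict fresh kicks using the learned intuition.
    errors = []
    for _ in range(1000):
        effort = rng.uniform(0.1, 1.0)
        actual = 50 * effort + rng.uniform(-noise, noise)
        errors.append(abs(actual - learned_rate * effort))
    return statistics.mean(errors)

# Unambiguous feedback: intuition becomes nearly perfect.
print(intuition_error(1000, noise=0.0))
# Same amount of practice, but the ball is moved randomly: intuition stays poor.
print(intuition_error(1000, noise=50.0))
```

With zero noise, a thousand kicks make the learned rule essentially exact; with the ball randomly displaced, the prediction error stays large no matter how much practice the kicker gets, which is Ariely's point about executive environments.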

So what is the solution?  We should experiment more and test our gut feelings before we go all out and implement a pervasive solution.

This actually, I think, brings us to the most important underutilized tools for management, which [are] experiments. You say, I can use my intuition, I can use data that tells me something about what might happen, but not for sure, or I can implement something and do an experiment. I am baffled by why companies don’t do more experiments.

I think the reason many R&D executives I know do not experiment more is the lack of information – both about the factors driving the decision and the potential impacts of the decision.  For example, executives are normally forced to rely on gut feelings to decide future R&D investments.  It is difficult to experiment because R&D projects are interlinked; it is difficult to see the impact of changing one program on all the other linked programs.  Funding decisions also need to satisfy a multitude of often conflicting requirements.  There are no tools to quickly understand the impact of investments on staffing or on competitive position.  Even when information is available, it is normally at the wrong level of detail to actually make a difference.  We need tools to help executives experiment effectively in R&D management.


Apple R&D and Steve Jobs Methodology: User Centric Design

I have been fascinated with Apple’s R&D successes (Platform-based approach, Portfolio Management, etc.).  I have always suspected that Steve Jobs is a significant contributor to the R&D success at Apple.  So I was thrilled to find a treasure trove of information on the Steve Jobs Methodology at the website Cult of Mac (in the transcript of an interview with ex-Apple CEO John Sculley, On Steve Jobs).  I think we can all learn a lot from this article:

  1. User experience centric design: See below
  2. A long-term platform-centric vision to support said user experience (and perseverance to take risks to achieve the vision)
  3. Leadership thoroughly engaged in R&D (e.g. Facebook, Google, Microsoft with Bill Gates, etc.)
  4. Small product development teams with real respect across the organization
  5. Understanding and focus on a niche (or align the entire company strategy around that niche)
  6. Align hiring with product platforms / niche strategy (For Apple, hire the best)
  7. The CEO defines and drives company culture!
I plan to write a post about each of these aspects over the next few weeks.  Let us get started with user experience.
We have discussed research papers and empirical evidence showing that collaborations with customers do not always pay off.  This is especially important when the customer is not familiar with potential solutions to the problems they may be facing.  My favorite example is about Windows: if Microsoft had done a customer survey back in the 80s about what products it should be developing, would many customers have suggested Windows?  Probably not.  Steve Jobs seems to have known this early on:

Steve from the moment I met him always loved beautiful products, especially hardware. He came to my house and he was fascinated because I had special hinges and locks designed for doors. … 

Steve in particular felt that you had to begin design from the vantage point of the experience of the user. He always looked at things from the perspective of what was the user’s experience going to be? But unlike a lot of people in product marketing in those days, who would go out and do consumer testing, asking people, “What did they want?” Steve didn’t believe in that. He said, “How can I possibly ask somebody what a graphics-based computer ought to be when they have no idea what a graphic based computer is? No one has ever seen one before.” He believed that showing someone a calculator, for example, would not give them any indication as to where the computer was going to go because it was just too big a leap.

One more lesson is that this user-centric design has to be based on a long-term vision – not just the next step.  This is important because a small step in user-centric design is not going to build a long-term differentiator:

Steve had this perspective that always started with the user’s experience; and that industrial design was an incredibly important part of that user impression. And he recruited me to Apple because he believed that the computer was eventually going to become a consumer product. That was an outrageous idea back in the early 1980s because people thought that personal computers were just smaller versions of bigger computers. That’s how IBM looked at it.

Even user-centric design is not enough, though; one has to fine-tune the idea and carve out a precise niche:

What makes Steve’s methodology different from everyone else’s is that he always believed the most important decisions you make are not the things you do – but the things that you decide not to do. He’s a minimalist.

I cannot overemphasize the importance of User-Centric Design.  It drives long-term competitive advantages.  But more important, it is the foundation of the new software-driven digital world.  So the culture of user-centric design, along with the digital revolution, helped Apple succeed in the consumer electronics space (iPod, iPhone, etc.) – as opposed to Japanese businesses such as Sony, whose culture revolved around analog components:

The Japanese always started with the market share of components first. So one would dominate, let’s say sensors and someone else would dominate memory and someone else hard drive and things of that sort. They would then build up their market strengths with components and then they would work towards the final product. That was fine with analog electronics where you are trying to focus on cost reduction — and whoever controlled the key component costs was at an advantage. It didn’t work at all for digital electronics because digital electronics you’re starting at the wrong end of the value chain. You are not starting with the components. You are starting with the user experience.

I have been learning about how difficult it is to change cultures.  The analog-centric culture still remains strong in Japanese consumer electronics companies.  For example, they are ceding the entire user experience design to Google Android in the smartphone segment and accepting the fact that they cannot compete!

And you can see today the tremendous problem Sony has had for at least the last 15 years as the digital consumer electronics industry has emerged. They have been totally stove-piped in their organization. The software people don’t talk to the hardware people, who don’t talk to the component people, who don’t talk to the design people. They argue between their organizations and they are big and bureaucratic.

In summary, what does an R&D manager need to remember?

  1. There is no substitute for true product planning / development.  We cannot just ask the customer what to do (at least not all the time).  We have to actually define what the user experience is going to be and then develop a product around it.
  2. R&D managers have to link future technologies to customer experiences and develop a product development plan.  This requires bridging technologists with marketing / product management and is rather hard to do.  If one had resources like Intel's, one could just hire stand-ins!
  3. R&D strategy and vision have to be broad (platforms) and long (far into the future).
  4. User-Centric Design has to be imbued into the company culture.  Or else it will not work…
Others?  Or continue on to the next component of the Steve Jobs Methodology: Long-term Vision.

Potential R&D savings from the Defense budget

New York Times Op-Chart The Pentagon’s Biggest Boondoggles has thoughts on cutting defense R&D costs (and quite a bit of historical data).  It also mentions the GAO study that we had discussed in an earlier post:

Listed below is just a sampling of what systems could be ended without endangering America; indeed, abandoning some of them might actually enhance national security. These cuts would generate only small savings initially — perhaps just several billion this fiscal year, as contracts would have to be wound down. But savings would swiftly rise to more than $50 billion annually thereafter.


Too much positivity increases Risk

The article Organizational Culture: An Overlooked Internal Risk in Business Week has useful data about how employees tend to hide bad news:

  • Nearly half of executive teams fail to receive negative news that is material to company performance in a timely manner because employees are afraid of being tainted by being the bearer of bad news.
  • Only 19 percent of executive teams are always promptly informed of bad news material to company performance.

The article points out that corporate culture drives this behavior and that the employees' intent is not likely malicious.  Clearly, there is a confirmation bias in most organizations.  There are likely to be several reasons for this:

  1. It affirms your preexisting emotions (you wanted the meetings to go well and believe in the strategy)
  2. It reflects well on your own performance (it's your job to communicate in a compelling way)
  3. It is not incorrect (generally speaking, the meetings went very well)
  4. Perhaps you don't believe your CEO is interested in hearing contrary feedback

The survey shows that breaking down this communication barrier brings measurable benefits to the organization.  The article suggests that managers should encourage employees to speak up, help eliminate the fear of retaliation (through actions, not just words), and educate employees on how to speak up and escalate issues constructively.

So what specifically can be done?  We can encourage skepticism and questioning in R&D teams.  We can reward failure to encourage risk taking and the communication of bad news.  Furthermore, a questioning environment has actually been shown to drive innovation.


What Makes Teams Smart

An interesting paper in Science extends the concept of intelligence to teams and defines collective intelligence.  Based on a study of 699 people working in small teams of two to five, the researchers found that team intelligence is driven by three factors:

  1. Social sensitivity of team members increases team intelligence.  The more sensitive team members are to social cues such as facial expressions, the better the team performs.
  2. Teams where everyone participated in the discussion were more intelligent.  If a few people dominated the discussion, the collective intelligence went down.  This is something R&D managers should keep in mind: it is very easy for managers to dominate team discussions.
  3. Teams with women members were more intelligent than others.  This is likely because women tend to have higher social sensitivity than men.

Equally interesting is the list of factors that do not drive team intelligence (via the MIT Sloan Review post What Makes Teams Smart):

Interestingly, the researchers found that collective intelligence wasn’t strongly correlated with the average intelligence of the individuals in the group — or with the intelligence of the smartest person in the group. They also found, as they wrote in Science, “that many of the factors one might have expected to predict group performance — such as group cohesion, motivation, and satisfaction — did not.”

So what can you do to improve team performance?  Check out the article How to Keep Your Team Loose.  Or this one on where to focus to drive performance.


Hotbeds of Innovation

The Strategy + Business article Hotbeds of Innovation has some useful benchmarking information about how large corporations are accessing innovation from outside.  We have talked about Intel and others in the past.  Here are some more:

Called “ecosystem investing” by some innovation executives, it refers to the increasingly complex network of suppliers and innovators supporting large companies.
In this model, well-established U.S. companies are creating strategic partnerships with startups and small companies whose technologies and skills can help the large companies expand their own capabilities.

The idea is to gain access to the technology through strategic partnerships and alliances:

The goal of the incumbents is to systematically target emerging technologies and “harvest” ideas without having to take on the risk of acquiring the smaller companies. Sometimes the large company takes an equity stake, and its top executives may sit on the small company’s board or mentor its top management. Alternatively, it may seek to license the small company’s technology or buy its products and distribute them to global markets.

Here are a couple of results.  First from J&J:

Ortho-McNeil Inc., a J&J division, invested the modest sum of US$40 million in Metabolex Inc., a privately held biopharmaceutical company based in Hayward, Calif., so the two companies could collaborate on the development of compounds used to treat type 2 diabetes. …
In June 2010, Ortho-McNeil received an exclusive worldwide license to commercialize several Metabolex drugs, including the diabetes compound, for about $330 million. That’s far less than the $1 billion a pharmaceutical company typically spends to develop drugs internally, and far more than Metabolex could have expected to bring in on its own.

Second from Intel:

Intel was able to dramatically increase the clout of its ecosystem investment strategy recently when it teamed up with 24 other venture capital (VC) firms as part of the company’s “Invest in America” alliance, Intel’s commitment to promote U.S. competitiveness by supporting technology development and creating jobs for college graduates. Intel put up a mere $200 million of its own money, but the VC firms pledged to match that investment, for a total of $3.5 billion over several years.


Some best practice info about R&D and innovation

The post Six Myths of Corporate R&D at the Corporate Executive Board has a convenient list of best practices for encouraging innovation as opposed to incremental improvements.  I have arranged them in three categories; my comments are in parentheses:

  1. Encourage learning
    1. Organize R&D functions to encourage learning instead of alignment with corporate strategy (I am not sure the two are mutually exclusive.)
    2. Encourage R&D staff to form informal networks inside and outside the corporation.  (Good point, but difficult to do.  IP control will need to be a constant focus.)
  2. Take more risks with investments
    1. Increase investment in breakthrough ideas as opposed to product improvements (The real answer is a balanced portfolio of investments.  The right balance depends on the type of business and competitive environment.)
  3. Be more flexible with early stage opportunities
    1. Be flexible with metrics, such as return on investment, for early stage opportunities (Clearly, it is difficult to estimate the value of breakthrough ideas.  However, it is also very difficult to identify which ideas are breakthroughs…)
    2. Be flexible with project reviews of early stage opportunities.  Focus instead on customer value and  related scenarios.  Review early stage opportunities as a portfolio and mitigate risks at a portfolio level.
    3. Be flexible with project management and related processes.

More proof that innovation is a buzz word

The article How to Develop a Social Innovation Network, in my opinion, teaches you how not to pursue innovation.

“Customers already use social technologies to wrest power away from large corporations. Now employees are adapting social technologies in pursuit of innovations to support these empowered customers; Forrester calls these employees HEROes (highly empowered and resourceful operatives)”

Innovation is not about buzzwords (HEROes!) and definitely not about social networking.  We talked about the innovative masses yesterday and realized that even if there are many end users out there with some interesting ideas, filtering them for quality and adapting them to mass-market standards is not normally cost-effective.
 
Furthermore, sorting through ideas of varying maturity (from social media such as Facebook) to identify innovations is difficult at best and a complete waste of time at worst.  I remember holding an innovation challenge for the employees of a large technology company.  Employees could win a prize by describing their innovative ideas in two or three sentences.  One employee suggested the company should turn trash into oil.  I asked how one does that.  The employee said she was only going to give ideas; how to make them work was the company's problem!  🙂


Tapping the Innovative Masses

MIT’s Technology Review has some interesting survey results in the article Tapping the Innovative Masses:

“We found that 6.2 percent—representing 2.9 million people, or two orders of magnitude more than are employed as product developers in the U.K.—created or modified consumer products over the past three years and spent 2.3 billion pounds per year, more than double what the U.K. firms spent on consumer-product R&D.”

A lot of this tinkering was quite sophisticated, such as adding new spin cycles to washing machines.  The question is how to tap into this huge volume of product ideas – some of them innovative.  In many cases, even if the ideas are not innovative, they can significantly cut down on development time.  However, the volume of ideas generated by such a process is so large that most companies will have trouble keeping up with them.  Also, since most tinkerers do not have quality control processes, building products on their developments is quite difficult as well.  Finally, as we have discussed in the past, collaborations with customers often tend not to be as valuable as they could be.

Fortuitously, NY Times shares Microsoft’s approach to accessing innovations from the masses in the article Microsoft’s Effort to Build Apps and Reward Engineers:

“Because the platform is new, developers have to learn its ways before writing many of those apps. So to add them quickly, Microsoft has taken an unusual step. It has relaxed a strict rule and will let employees moonlight in their spare time and keep the resulting intellectual property and most of the revenue, as long as that second job is writing apps for Windows Phone 7-based devices.”

Clearly, Microsoft has not tapped the mass of people out there, but they are encouraging their employees to innovate and keep a larger fraction of the economic value they generate.  Their approach is quite attractive because:

  • Innovations come from employees, so Microsoft can have some confidence in their quality
  • Because apps are separate from Microsoft products (Windows Phone 7), Microsoft has an easier time separating intellectual property and brand image
  • Because revenues from the apps can be tracked separately, it is possible for Microsoft to compute the economic value (Microsoft offers 70% to employees and keeps 30%)
  • App development will help employees identify useful updates and upgrades to the OS, and guide innovation
However, some key concerns remain.  The first is maintaining employee focus on their jobs as opposed to moonlighting:

“Engineers work all hours; they don’t punch a 9-to-5 clock,” Professor Cusumano said. “Normally, you want your employees to pour their passions into their jobs. If they do something else on the side, you don’t cheer them on.”

The second is bandwidth.  If employees are developing the OS full-time as part of their job and then developing apps during their free time, will they have enough energy left to be creative and innovative?  How much workload is too much?