Questions and Answers from IRI's Members.
Q: What is your design thinking methodology?
Design thinking is increasingly a standard process for product innovation and design; the key is empathy for the user. Although the process includes a research phase, the discussion is typically about user needs and does not describe a way to include new materials, technologies, or methods of making. IRI wants to benchmark members to determine if this is a common gap or if there are best practices in this area.
We received responses from 18 IRI member organizations ranging in size from less than $100 million in sales (2) to greater than $50 billion (6). The largest number of respondents were in Chemicals, Gases, and Advanced Materials (5); Industry Machinery, Equipment & Products and Consumer Products accounted for three each. The rest were distributed among a variety of industries (Figure 1). Asked about their primary customers, 12 identified themselves as business-to-business companies; the rest indicated that they sold primarily to end users.
In this space, we publish the results of mini-surveys administered through IRI's e-mail-based community forum in response to member queries. Questions and responses are selected for completeness, relevance, and broad interest; questions and freeform responses may be edited for format and grammar and to fit the available space.
The community forum is available to IRI members via the IRI website; from the home page, select Collaboration Center, then Community Forum. For more information about IRI membership, visit our website, http://www.iriweb.org.
All respondents reported using design thinking at least a little; two said they used the methodology "a lot." Two-thirds (12) reported that they used a process that included the general design thinking steps (empathize, define, ideate, prototype, test). Among those who did not use a process that included those steps, one said the steps had not been formalized in the company and others reported using a different set of steps, such as "define, ideate, prototype, test" or "define requirements, develop prototype, test, validate." One respondent noted that his or her organization did not use the term "design thinking," but that it did "try to define new products/projects based upon customer value propositions, which take into account the five steps of design thinking." For this respondent, "design thinking just seems to be a new way of describing good new product development processes." Other responses suggested other variations or described the reasons for modification in their companies. The empathize step appeared to present unique challenges; respondents noted:
* While we do this to some extent, we often miss the empathize step--as a B2B company, we have more than one customer to consider. We have the person who installs our product, the person who uses our product, the management of the person who uses our product, and the C-level. Each person in this value chain has wants and needs. Often we focus on only one and ignore the voices of the others.
* We don't spend a lot of time on empathize; our process kicks off with a definition of customer needs.
A large majority (11) said that they felt the design thinking process does not adequately address the evaluation of new technologies, materials, and processes in new product development. Of those, eight thought the process should be modified to include such evaluation. These respondents suggested where in the process new technology evaluation could happen:
* During ideation, new solutions could be conceived that take advantage of new materials or other technological advances.
* We see design thinking to be more about identifying the need and the opportunity independent of the solution. The ideate phase is then free to include any of these factors.
* Since we are in the materials science industry, we do well with materials--where we fail, or are less than perfect, is in process and new technology. This is a shortcoming we are working on through efforts in outside technology and open innovation.
* In research and in benchmarking during development.
* Curiosity-led discovery and hunch evaluation.
* Creation of a minimum viable product or product concept model.
* Rigorous physical testing.
* Ideate and prototype.
Finally, we asked respondents what ideas, experiences, or materials they had to share regarding new technology evaluation in design thinking. This question yielded a trove of perspectives and resources:
* We have found that by beginning with a technology-agnostic problem statement and working to understand the customer need, openings to include new materials, technologies, and the like in proposed solutions arise naturally.
* It should encompass leadership awareness training. For example, see www.designofthinking.com.
* Active efforts in open innovation can help drive more emphasis on technology, process, and materials. This has the natural effect of bringing those considerations into your design thinking, as long as the efforts are connected to the broader technology organization rather than confined to a silo.
* Incorporating industrial and experience design should drive to ideating around new ways of delivering experiences that may involve undeveloped technology. Quick and dirty prototypes can help identify opportunities for new technology that can justify increasing the technology readiness level.
* Like any methodology, the base approach often lacks sufficient detail for highly technical decisions. A more detailed approach must be added for the technical questions.
* A constant challenge is the timeline. To adequately search, identify, and evaluate new technical approaches often takes more time than is allowed for ideation and prototyping.
* As always, the challenge is whether we truly understand customer needs, many of which may be unarticulated. Typical VoC doesn't get you there; you really need to study markets and similar applications, and often observe customers directly to see where they find value.
* Doblin's ten types of innovation. Duncan Wardle's design thinking innovation tool kit.
* It involves an iterative process of hypothesize, test, and redesign, while trying to control time and costs.
Q: What is your approach to STEM?
Procter & Gamble recently created and filled a Diversity & Inclusion role leading a science, technology, engineering, and math (STEM) Coalition. The intent of this work is to step-change P&G's internal and external presence supporting STEM engagement and proficiency for K-12 and college students, with a more strategic approach. The business intent is to strengthen the company's pipeline of diverse talent for P&G functions that require STEM mastery. This work will also strengthen P&G's reputation in key areas (Diversity, Gender Equality, Innovation) in line with P&G's Citizenship agenda, and the company expects it will strengthen employee health and culture by channeling untapped passion to serve. P&G is seeking to understand similar programs and activities across the industry.
We received responses from 18 IRI member organizations from a wide variety of industries, ranging in size (in terms of sales revenues) from less than $100 million in annual sales to greater than $50 billion. All 18 reported that their organizations have active programs to work on STEM proficiency in their communities. Of those, 12 were focused on local efforts, 7 worked nationally, and 2 had internationally active programs (note that some programs worked across multiple levels). Spending on these programs ranged from less than $100,000 each year (9) to more than $1 million (4).
Respondent organizations had some clear goals for that spending, from promoting STEM interest, support, and proficiency to creating stronger bonds with the community (Figure 2). Beyond the list of possible goals provided by the survey, three respondents identified additional goals:
* Support secondary education
* Encourage technology interest
* Build a capable future workforce with the technical skills needed by the business
Respondents measured progress toward those goals using a variety of metrics. The most common, attracting eight responses each, were increased student awareness of STEM career options and a growing talent pipeline (Figure 3). Four respondents indicated having no metric at all, and another four selected "other." The metrics cited by these respondents included numbers of STEM majors and graduates, number of students contacted, number of program participants, and employee engagement with the program.
Caption: FIGURE 1. With which industry does your organization primarily associate?
Caption: FIGURE 2. What is the purpose of your STEM proficiency program? (Select all that apply.)
Caption: FIGURE 3. What metrics do you use to measure the effectiveness of your investment in STEM? (Select all that apply.)