
Keys to a membership survey.

When we talk about quality in association management, we define success as meeting or surpassing our members' expectations. As a tool for discovering those expectations, surveys are quite useful--they are always revealing and sometimes eye-opening.

For example, from the survey we'll analyze in this article, the National Speakers Association, Tempe, Arizona, was surprised to learn that an overwhelming number of members wanted NSA to establish minimum membership criteria--even though many themselves might not qualify.

NSA also discovered that what appeared to be considerable demand for voice mail service actually represented only a few loud voices. Survey data help confirm perceived needs and can represent the views of less vocal members.

Respondents ranked professional development as the most important issue for NSA to address over the next five years. NSA is taking a hard look at how to accomplish this objective through its monthly audiotape series, annual convention, educational meetings, Professional Speaker magazine, and research services.

The message NSA hears is clear: Just as members must deliver quality in their pursuits, associations must deliver value to retain current members, attract new members, and flourish in any economy.

Surveys are not panaceas; they have limitations. They take time, money, and knowledgeable planning and effort. NSA foresaw changes in the industry and needed more information to prepare for them. NSA 1991-1992 President Edward E. Scannell, 1991-1992 President-Elect Al Walker, and Executive Vice President Barbara Nivala, CAE, realized that a formal long-range strategic plan also required accurate member data. With the board's support and my research and statistical assistance, NSA set out to conduct a member survey in 1992. The steps NSA followed in that process can be adapted for use by any association.

Determine information needs

To get the project started, NSA's executive vice president and president-elect provided important background, including

* a verbal description of NSA operations;

* a verbal description of the issues and problems NSA needed to consider;

* the kinds of information that would resolve these problems;

* the potential value, risk, and opportunity this information would provide;

* specific time requirements; and

* resource allocation for my time and for the production, printing, mailing, tabulation, and analysis of the survey responses.

The executive vice president, president-elect, and I carried out this first step--including informal discussions with the board of directors--in about three weeks. Our overall time frame was tight, since NSA wanted to present the results at the annual meeting. Going from initial go-ahead to final results took us less than four months; six months is a much more comfortable estimate for most surveys.

Based on the preliminary information we collected, we developed four primary objectives (see sidebar, "Meeting Our Objectives"). These objectives became the cornerstone around which we designed the survey. With the objectives defined, we decided that NSA's strategic planning committee, chaired by the president-elect, was best suited to give further input. (We'll look at that input when we talk about designing the survey.)

Design your sample

The sample is the group to which you administer your survey. You have a number of questions to answer about two basic surveying issues: sampling method and sample size.

Sampling method. Associations often use either full population or random sampling. The size of NSA's membership--3,100 professional speakers--made it feasible to include the entire population in the survey, which has the advantage of being most accurate. When you include only a fraction of a population in the survey sample, you risk creating errors in the data. If we had chosen to survey only members who had called in the last six months, for example, we might have under-represented people who don't care for voice mail.

If you can survey only a segment of the population, you must ensure that the segment you select is truly representative of the whole membership. Choosing a random sample makes that possible.

Note that random does not mean arbitrary. An arbitrary sampling can introduce tremendous error. Check a book on statistical methods in your local library for simple directions and a random number table to choose a random sample.
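
To make that concrete, here is a minimal sketch, not part of NSA's process, of drawing a simple random sample with a short Python program instead of a random number table. The roster file name and the sample size of 500 are illustrative assumptions.

```python
import random

# Illustrative: the membership roster as a plain text file, one member ID per line.
with open("membership_roster.txt") as f:
    members = [line.strip() for line in f if line.strip()]

# Draw a simple random sample without replacement. Seeding the generator
# makes the draw reproducible; 500 is an illustrative sample size.
random.seed(1992)
sample = random.sample(members, k=500)

print(f"Selected {len(sample)} of {len(members)} members for the survey mailing.")
```

The point of the exercise is that every member has the same chance of being chosen, which is what distinguishes a random sample from an arbitrary one.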

Sample size. If you determine that surveying the entire membership is not feasible, then survey enough people so that you receive at least 100 responses. A response of 100 generally provides enough statistical precision for meaningful analysis (roughly a 10 percent margin of error at a 95 percent confidence level). The closer the sample size is to the membership size, the more accurate the survey will be. Choose the largest sample possible within your time and budget limitations.

Time, budget, and analysis requirements determine sample size. Simply put, the more responses you have, the more accurate your data are; more sophisticated statistical tests are most reliable with a large number of responses. However, processing more data can be costly (see the section on layout and printing), and running additional tests takes more time. When determining an ideal sample size, consider how many questions you'll be asking, how many responses you expect to receive, and how many relationships among the data you want to test.
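
The accuracy trade-off can be made concrete with the standard margin-of-error formula for a proportion, 1.96 x sqrt(p(1 - p)/n) at a 95 percent confidence level. The sketch below is illustrative only and uses the conservative assumption p = 0.5; it was not part of NSA's calculations.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95 percent margin of error for a proportion with n responses."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 250, 500, 1000):
    print(f"{n:>5} responses -> +/- {margin_of_error(n):.1%}")
```

Roughly, 100 responses give about a 10 percent margin of error, while 1,000 responses bring it down to about 3 percent, which is why the largest affordable sample is usually worth the cost.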

NSA had always received a high response from mail surveys, and it had a greater than 30 percent response on this one. One thousand respondents times about 200 possible responses per survey meant a potential 200,000 pieces of data to input into the computer. Since the association wanted to have the research report completed by the annual convention, we printed the survey in a format that could be electronically scanned. This enabled us to have response data scanned onto a computer disk in a couple of hours, rather than spend several weeks on manual input, and it also helped eliminate human error.

A final consideration regarding sample size and survey design is the statistical software that will be used to analyze the survey results. For the NSA project, all statistical processes and analyses were performed in-house. Our software allowed a virtually unlimited sample size but did require us to rearrange survey data for calculating some relationships. Software that is now available has removed this requirement and will make future processing much faster.

Choose a delivery method

A mail survey was the most practical and cost-effective choice for NSA, because the association has historically had good response to mail surveys. Other possible avenues include focus group research, personal interviews, and telephone interviews.

While these are less accurate than mail surveys--because the increase in human involvement leads to increased error--the more personal approaches often have a higher response rate and obtain information that a mail survey might not get.

If you are having trouble defining exact research objectives, focus group research and personal interviews often can help clarify the direction the final survey should take.

Design the instrument

At an NSA strategic planning committee meeting, I conducted a focus group session to determine what information we needed to meet each of the four stated objectives (see sidebar, "Meeting Our Objectives").

I then developed a set of questions for each objective, taking care to eliminate bias, reduce the sensitivity of certain questions, and encourage the highest response rate and the most accurate answers. Some questions were easy and straightforward to design: Gender is either male or female, for example.

Questions requiring some kind of numeric response needed ranges of possible responses. For example, we wanted to know how many fee-paid speeches each person gave in 1991. It was important to offer ranges that would meaningfully segment the number of speeches while taking into account the question's sensitivity. The more sensitive the question, the more carefully you need to word it in a nonthreatening way while also ensuring the respondent's anonymity. We were surprised and pleased at both the number of responses and the honest answers to the most sensitive questions.

We included both open and closed questions, some with simple yes-or-no answers, others allowing respondents to write whatever they wanted.

Some questions offered scales, and the respondent selected a value between one and five. Others gave range choices and asked the respondent to choose one. Some questions had one answer; some allowed multiple responses.

Test the survey

The next step was to have about 10 other people review the questions and survey instructions for clarity, difficulty, overgeneralization, level of recall necessary, overemphasis, leading questions, and loaded questions. Any of these problems can introduce bias in the process, leading to inaccurate information in your final analysis.

If a question requires too high a level of recall--such as asking which conventions someone attended in the last 10 years--the response may not be accurate. A leading question is, "Don't you think the association has done a good job?" A loaded question is, "Do you think that a man or a woman would be a better NSA president?" Having too many questions on the same topic wearies the respondent and reduces response level and accuracy.

At first, you can test the survey on anyone handy and willing to help--family, friends, and people either in or out of the targeted population.

After you digest those comments and suggestions and make appropriate changes in the survey, test it again. We tested our second draft on NSA's board of directors. Because of the sensitivity of the questions and consequent need for anonymity, the pretest data were not analyzed and published. We used only comments about survey construction and design.

Lay out and print the survey

You have several layout options for the survey and response form. The design choice depends on your budget. NSA chose a route that may seem expensive up front but in the end cost the least and introduced the least amount of error. Here were the possible choices:

1. Respondents mark answers directly on a printed survey that cannot be scanned. Responses must then be manually input into a statistics program. Advantage: This is the least expensive to design and print. Disadvantage: Data entry is labor-intensive and has a high possibility of error.

2. Respondents mark answers on separate computer answer sheets that can be electronically scanned onto a computer disk. Advantages: Data do not have to be manually input into the computer; the answer sheet is the least expensive format that can be scanned. Disadvantage: Separate answer sheets have a high incidence of error, since respondents can accidentally mark the wrong answer.

3. Respondents mark answers directly on the survey, which has been specially designed for electronic scanning. Advantages: This is most convenient for respondents; produces the least number of errors; encourages a high response rate; and lends the most professional image. Disadvantage: It is moderately expensive because it must be printed on special paper with special inks and computer formatting. Estimate $1,500-$2,000 to print 3,500 surveys. NSA chose the third method.

Mail the survey

You may choose to mail your survey, as NSA did. To reduce mailing costs, the survey package can be part of another mailing, such as your magazine or newsletter. However, it usually will get more attention mailed alone. Three things belong in the package: a cover letter, the survey, and a postage-paid return envelope. Not providing a postage-paid, pre-addressed return envelope greatly reduces response rate by reducing the perceived anonymity of the respondent and by making it more of a hassle to respond. Response rate is important because the higher the response, the more accurate the results.

Because NSA wanted the results in time for the annual meeting in July, we sent the survey as a separate mailing.

Analyze your results

With the data collected and in the computer, you can begin analyzing the results. If you work with a research firm, make sure it returns all data to the association in an ASCII file or other common format. This ensures that you can run further analyses at a later time.
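
As an illustration of why a common format matters, the sketch below loads a comma-delimited ASCII export into memory for later analysis. The file name and the assumption of a header row followed by one row per respondent are hypothetical, not NSA's actual layout.

```python
import csv

# Hypothetical comma-delimited ASCII export: a header row of question codes,
# then one row per respondent.
with open("nsa_survey_responses.csv", newline="") as f:
    responses = list(csv.DictReader(f))

print(f"Loaded {len(responses)} responses with {len(responses[0])} fields each.")
```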

With about 200 possible responses on each of roughly 1,000 returned surveys, NSA has hundreds of thousands of pieces of information to work with. We'll be gleaning information from that survey for the next two to three years.

I suggested that NSA's president-elect determine which statistical relationships were most important for NSA at this time. He decided that he wanted to know the relationships between gender and income, convention attendance and income, satisfaction with NSA and gender, and satisfaction with NSA and convention attendance. In all, he identified approximately 100 relationships to analyze. Based on those relationships, I performed statistical analyses that included tallies, cross tabulations, and several other tests.
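
A tally and a cross tabulation are simple counting operations, as the hedged sketch below shows. It reuses the hypothetical response file from the previous sketch, and the column names gender and income_range are assumptions for illustration, not NSA's actual question codes.

```python
import csv
from collections import Counter

with open("nsa_survey_responses.csv", newline="") as f:
    responses = list(csv.DictReader(f))

# Tally: how many respondents chose each income range.
income_tally = Counter(r["income_range"] for r in responses)
print(f"Income tally: {dict(income_tally)}")

# Cross tabulation: joint counts of gender and income range, the kind of
# relationship the president-elect asked us to examine.
crosstab = Counter((r["gender"], r["income_range"]) for r in responses)

for (gender, income), count in sorted(crosstab.items()):
    print(f"{gender:<8} {income:<18} {count}")
```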

Report your results

The survey report is your opportunity to communicate the value of the project to the board and membership. Don't cut corners. A good report has several parts.

1. Executive summary. This part of the survey report provides the major highlights of the project, addressing the objectives and the key results from each objective. The summary should take only a few minutes to read.

2. Questionnaire with tallies. Show each question with the tally for each possible response.

3. Statistical report. This narrative section of the survey results explains the relationships, demonstrating findings with appropriate graphs and charts.

I presented the results to NSA's board of directors in June, and the association's president-elect announced the findings to the membership at the annual convention in July. He distributed an executive summary to all in attendance; the summary was also mailed to the entire membership with the monthly magazine.

NSA's volunteer leadership is now able to use the survey results to make better decisions for the future. Committee chairs, for example, can use the survey results that affect their particular committee concerns, such as professional development and member services.

An interesting result of the survey is that 67 percent of those responding believe NSA should establish minimum membership requirements. We considered asking in the original survey what those criteria should be but opted not to: such a follow-up question seemed to assume an affirmative response to the first question, and that assumption could have biased the answer to the first question. NSA decided instead to conduct a follow-up survey to determine what minimum requirements members wanted.

Research results can identify interesting opportunities for associations both now and in the future. Associations need to better market themselves and their services to members and potential members. Use your research to provide quality service and products.

The Perils of Polling

When you consider a survey of this type and scope, be aware of four pitfalls.

1. Inaccurate sampling. It is most accurate to survey the entire population of interest--your association's membership. If that is not feasible, then you must select a sample using very specific methods to ensure that those surveyed are representative of the entire population.

2. Improper survey design. The survey must have objectives and the questions must follow specific guidelines, staying focused on the objectives as well as getting accurate information from the respondent. This means that the questions must not be biased or influence the respondent's answer one way or another. "The association is meeting all your needs, right?" is a biased question. NSA asked, "Overall, how well does the association meet your needs? Extremely well, very well, well, not very well, not at all."

3. Unrealistic expectations. Research surveys should be a major source of information for the decision-making process. But research alone usually does not provide all the information you need to make every decision.

Associations have to look at the total picture. For example, consider the association's mission: Where are you going, how will you get there, and are the members behind you? Also keep in mind the economic climate: Is the industry growing or depressed?

Finally, pay attention to resources: Does the organization have the resources to accomplish what the members want? Are you using resources efficiently? And are members willing to pay for what they want?

4. Lack of a knowledgeable resource. Research conducted by untrained people can actually do damage by leading the association to make inaccurate decisions. It can cost more to make the wrong decisions than to pay a professional to ensure that the information is correct.

Meeting Our Objectives

NSA defined four objectives, which became the cornerstone around which it designed its survey. Then the strategic planning committee brainstormed multiple ideas and questions that were used to develop the final survey questions.

Objective 1: Determine how well NSA meets its members' needs.

* How well does NSA meet members' needs as professional speakers?

* Rate the value of current services.

* Attendance at conventions and educational programs.

* Local chapter activity; why people belong or why they don't.

* Rate quality of publications.

* Desire for minimum membership criteria.

* Most important issues for next five years.

* How competitive is NSA with other groups?

* How well does NSA meet professional development needs?

Objective 2: Determine which services NSA should improve, change, add, or drop.

* Which products or services should be added?

* How to pay for added services.

* How often do members read the publications?

* Who else reads the publications?

* What topics should NSA address in its publications?

Objective 3: Determine how NSA members view its operation.

* Rate communication and experience with staff.

* How can NSA better serve the needs of its members?

* Rate communication and experience with members of the board.

* Rate elected and volunteer leadership, boards, and committees.

Objective 4: Profile the NSA membership; gather professional and personal information.

* Age, gender, education, office and residence locations.

* Type of business; employed or self-employed.

* Number of paid speeches; number made for free.

* Gross income from speaking, training, consulting, and product sales.

* Take-home pay.

* Number of employees, subcontractors, and free-lancers.

* Types and topics of presentations.

* Rate bureau experiences.

Sarah Layton, based in Orlando, Florida, holds a doctorate in business. Layton, who is president of Mishalanie, Layton & Associates, presents keynote speeches called "The Passion for Quality" and consults in strategic quality planning and market strategies.
