2003 SURVEY OF SMALL BUSINESS FINANCES
PREPARED FOR:
BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM
PREPARED BY:
NORC
AUGUST 16, 2004
Table of Contents
1. INTRODUCTION
2. CATI QUESTIONNAIRES
3. TRAINING
4. CONTACT MATERIALS
5. SAMPLE
6. DATA COLLECTION
7. INTERVIEWERS
8. SYSTEM INTEGRATION
9. DATA DELIVERY
10. APPENDIX
1. INTRODUCTION

This document describes the procedures and results of the second pretest of the 2003 Survey of Small Business Finances (SSBF). The SSBF is a study of the factors affecting the availability of credit to small businesses. The main purpose of the second pretest was to test the CATI instruments and data collection procedures for the study.
Pretest One. The first SSBF pretest was conducted March 10 - April 6, 2004. A final report for SSBF pretest one was delivered to the FRB on May 13, 2004; a copy of that report is available from the FRB or NORC.
Pretest Two. The second pretest was conducted in May-June 2004, about one month following the end of the first pretest. The data-collection period for pretest two was May 10 - May 27, about two and a half weeks. The primary purpose of pretest two was to test the changes made to the instruments after pretest one, and to identify any remaining bugs in the questionnaire programs and other computer systems used for data collection.
The list of specific objectives for pretest two was similar to, though less extensive than, the list for pretest one:
Assessing completion rates was not an objective of either pretest one or two. For each pretest, NORC was contractually obligated to complete 50 main interviews with small business owners whose businesses were determined to be eligible for the survey based on a brief screening interview. In order to ensure the completion of 50 main interviews in the short data collection period allowed for each pretest, NORC selected larger-than-necessary pretest samples.
The body of this document is organized into nine sections, corresponding to the main steps of the survey. In each section, we describe what we did for the second pretest, how it worked, and finally, the changes we propose, if any, for the main survey. The Appendix contains documents that are either newly developed as a result of our experience conducting the second pretest, or significantly modified from those used in the second pretest.
2. CATI QUESTIONNAIRES

The 2003 Survey of Small Business Finances uses two separate computer-assisted telephone interviewing (CATI) questionnaires: a screening questionnaire to identify firms that meet the eligibility criteria for the survey (see Section 5 for a description of the target population), and a main interview questionnaire administered to respondents who complete the screener interview and qualify for the study. The performance of each of these questionnaires in pretest two is discussed in this section.
2.1 Screening Questionnaire
The screening questionnaire is a brief instrument designed to identify firms that meet the eligibility criteria for the survey. During the pretest, we were particularly concerned with how this instrument performed in terms of helping interviewers navigate past gatekeepers to reach an appropriate survey respondent, the wording and order of the questions, and the performance of the zip code look-up, particularly its accuracy and response time. Each of these issues is discussed in the following sections.
Based on observation and interviewers' comments after pretest one, NORC modified the screener for pretest two. Those changes are described in the sections below.
2.1.1 Introduction Script
Pretest two interviewers reported being more successful than in the first pretest at getting past gatekeepers, locating respondents and securing cooperation. A number of changes made between the pretests made the introduction more effective:
Though much improved from pretest one, getting past gatekeepers, finding qualified respondents and securing cooperation remain substantial challenges for this study. The following changes came from debriefing pretest two interviewers and from internal team discussions; they will be implemented in time for the main study and are designed to improve cooperation and responsiveness:
2.1.2 Zip Code Look-Up

The zip code look-up worked well in pretest two for both the screening and main interviews. Interviewers reported negligible response time, and respondents confirmed that the program returned the correct city and state. No revisions to the zip code look-up are planned for main production.
2.1.3 Question Order
The screening questionnaire was designed first to identify the owner or owner proxy; next, to confirm that we are calling the correct firm; third, to determine the eligibility of the firm; and finally, to collect the information from eligible firms needed for the worksheet mailing. The question order in the screening interview seemed appropriate as designed.
During the pretest debriefing, the interviewers did not report any problems with the order of the screener questions. Also, review of the item frequencies revealed that the bulk of the ineligible firms were identified early in the series of eligibility questions, with few firms deemed ineligible on the last two questions (Table 1).
Table 1. Screener Eligibility Questions: Response Distributions

Question | Yes (%) | No (%) | DK/RF (%) | N
---|---|---|---|---
A3. BUSINESS OPERATING IN DEC03 | 97 | 3 | 0 | 332
A3.1 BUSINESS CURRENTLY IN OPERATION | 98 | 2 | 0 | 323
A5. FIRM OWNED 50%+ BY ANOTHER FIRM | 6 | 93 | 0.6 | 317
A6. FIRM IS FOR PROFIT | 95 | 5 | 0.3 | 294
A7. FIRM OWNED BY GOVERNMENT AGENCY | 0 | 100 | 0 | 279
A8.1/A8.2 FEWER THAN 500 PEOPLE* | 98 | 0.3 | 2 | 264

*Working owners and non-owner employees combined
2.1.4 Question Wording
The wording of the screener questions worked well. The expanded response options to the first question (A1), which asks for the owner, provided interviewers with an adequate response frame compared to pretest one. The additional responses were:
For pretest two, we took a different approach to confirming that we had reached the firm's headquarters. We first identified whether a firm had one location or more than one. Most small businesses are single-location organizations (82% in pretest two screening), and for those firms we could safely assume we were calling the headquarters or main office (since there are no other locations that could be called for that firm).
For multiple-location firms, we asked whether the D&B-provided physical address was the firm's headquarters, a branch location, or neither. In pretest two, all 49 respondents asked this question were able to answer it, i.e., there were no DKs or RFs. Finally, if a respondent said that the D&B-provided physical address was neither the firm's headquarters nor any branch location, we asked if the address had ever been the HQ or a branch.
The new questions worked well. Interviewers had no difficulty administering them, and the questions were effective in identifying whether a record represented a firm's headquarters, in which case the interview continued, or a branch or other location, in which case the interview was terminated and coded as ineligible.
For question A10.2, "What is the single most important problem facing your business today?", the pretest two list of response codes appears to have been inadequate: 183 responses (66%) were coded Other. For main production, we recommend reviewing the Other responses from pretest two to determine whether the response frame needs to be changed or expanded. We will also review whether interviewers need more training in working with respondents to find the code that correctly captures a response.
2.1.5 Sensitive Questions
We did not expect any of the screener questions to be considered sensitive by respondents, as the screener generally confirms information that is publicly available about the firm. Our experience in pretest two supported this expectation: very few respondents refused to answer an eligibility question or did not know the answer (Table 2). Also, during the debriefing, interviewers did not report that respondents considered any of the screener questions sensitive.
Table 2. Screener Eligibility Questions: DK and RF Counts

Question | DK | RF | Total
---|---|---|---
A3. BUSINESS OPERATING IN DEC03 | 0 | 0 | 0
A3.1 BUSINESS CURRENTLY IN OPERATION | 0 | 0 | 0
A5. FIRM OWNED 50%+ BY ANOTHER FIRM | 2 | 0 | 2
A6. FIRM IS FOR PROFIT | 1 | 0 | 1
A7. FIRM OWNED BY GOVERNMENT AGENCY | 0 | 0 | 0
A8.1 NUMBER OF WORKING OWNERS | 3 | 1 | 4
A8.2 NUMBER OF WORKERS | 5 | 0 | 5
In pretest one, interviewers observed that many respondents seemed uncomfortable providing an email address, and that this part of the interview was occasionally awkward. For pretest two, we softened the language used to ask respondents for an owner's email address and provided an optional interviewer prompt explaining the (benign) purpose of the request. Interviewers said that they sometimes read the prompt and that, on the whole, this part of the interview went more smoothly than in pretest one.
In pretest two, half (50%) of respondents provided an email address in the screening interview. Just two respondents refused to answer the question, and one respondent said he or she did not know if the firm's owner had an email address.
2.1.6 Screener Logic
The logic in the pretest two version of the screening questionnaire worked well, and was an improvement over pretest one, especially for the first question. As discussed, for pretest two we added response options for A1 to allow interviewers, in appropriate situations, to ask for a qualified owner-proxy before three calls had been made to the firm. The logic proved very effective when, for example, the owner would not be available during the entire data-collection period but the interviewer was speaking to a qualified owner-proxy.
The exceptions to the three-call-attempt rule for finding the owner were presented during training, and trainees were instructed that in almost all cases the rule still applied. We cannot verify this directly, but based on observation we believe that interviewers followed the three-call rule except in the designated situations in which it could be circumvented, e.g., when an owner had designated a proxy or the D&B-provided owner would never be available during the production period.
In pretest two, we verified that we had the correct physical address of the firm's headquarters. A physical address is not necessarily the same as a mailing address - the latter can be a rural route or P.O. Box. The wording of the questions in this section emphasized that a physical address was required, and we added interviewer prompts to read when a respondent indicated that the firm's physical address was a rural route or P.O. Box. There were no DKs or RFs to the question asking whether the firm's physical location was the street address, city, state and zip provided by D&B.
Through observation and interviewer debriefing, it was clear that respondents understood the logic of being asked about the firm's physical address, and that it was appropriate to collect address information after qualifying the respondent for the main survey. (This logic was not changed from pretest one.)
Following the section on the firm's physical address, the screener asked the respondent to provide an address to which Federal Express could send a package to the owner. This logic worked well. Respondents generally understood that they needed to provide an address to which FedEx would deliver - as with the physical address, rural routes and P.O. boxes do not qualify - and that the objective was to get the package to the owner (as opposed to an owner-proxy).
2.1.7 Question-by-Question Specifications (QxQs)
For the QxQ for A8.1, we added an instruction to ask respondents to count each active owner, even if he or she only worked part-time for the firm, as one owner in answering this question.
2.2 Main Interview Questionnaire

This section discusses the performance of the main interview questionnaire during the second pretest. Of particular concern were the introduction script, the institution look-up, question order, question wording, questionnaire logic, and the QxQs.
2.2.1 Introduction Script
As with the introduction script in the screening questionnaire, the introduction for the main questionnaire was found to be too long in pretest one. Interviewers said respondents did not need all the information in the script, and the script's length sometimes annoyed respondents. NORC proposed that the script be shortened for pretest two.
In pretest two, the main interview introduction script was changed in ways similar to the screener introduction script: it was made shorter and designed to work harder at getting past gatekeepers to the owner. The specific changes were:
For pretest two we also added questions to capture the name and title of a designated proxy, to handle a situation that occurred in pretest one: the owner has designated a proxy to take the entire main interview, the gatekeeper directs the interviewer to the proxy, but the proxy is not available. With the new questions, the name and title of the owner-designated proxy are readily available to the next interviewer working the case.
With these changes, Section A worked well, with the several exceptions discussed below.
2.2.2 Branch/Institution Look-Up

The significant changes made to the branch/institution look-up following pretest one made the procedure far more user-friendly in pretest two. Interviewers said that, other than the institution file being somewhat out of date, the procedure worked well.
Throughout pretest two, NORC worked with the FRB to test the logic in SKIP57. The primary purpose of SKIP57, which follows the zip code, branch and financial institution look-ups in subsection H, is to ensure that a distance question is asked in the correct situations and is not asked unnecessarily in others. The distance question should be asked when the firm's main office and the most-often-used branch of the financial institution being discussed are located in the same MSA or the same county.
While the concept is fairly straightforward, getting the logic of SKIP57 to work correctly proved challenging. The skip needs to handle a number of scenarios, including unusual ones, such as when a respondent cannot identify a financial institution's location because he or she deals with the institution entirely by telephone or over the Internet. When the zip codes of both the institution and the firm are known, another look-up function determines if a match exists for either MSA or county.
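The matching step lends itself to a compact statement. Below is a minimal sketch in Python of the logic as described, assuming hypothetical ZIP_TO_MSA and ZIP_TO_COUNTY lookup tables; it is illustrative only, not NORC's production CATI code.

```python
# A minimal sketch of the SKIP57 matching step, assuming hypothetical
# zip-to-MSA and zip-to-county lookup tables; not NORC's production code.

ZIP_TO_MSA = {"60515": "1600", "60532": "1600"}       # zip -> MSA FIPS code
ZIP_TO_COUNTY = {"60515": "17043", "60532": "17043"}  # zip -> county FIPS code

def ask_distance_question(firm_zip, branch_zip):
    """Return True when the distance question should be asked.

    The question is asked only when both zip codes are known and the
    firm's main office and the branch used most often fall in the same
    MSA or the same county.
    """
    if not firm_zip or not branch_zip:
        # e.g., the respondent deals with the institution entirely by
        # telephone or over the Internet and cannot give a location
        return False
    msa_a, msa_b = ZIP_TO_MSA.get(firm_zip), ZIP_TO_MSA.get(branch_zip)
    cty_a, cty_b = ZIP_TO_COUNTY.get(firm_zip), ZIP_TO_COUNTY.get(branch_zip)
    same_msa = msa_a is not None and msa_a == msa_b
    same_county = cty_a is not None and cty_a == cty_b
    return same_msa or same_county
```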
NORC continued to work on SKIP57 up to the start of main production, and the logic of SKIP57 is being carefully checked and tested in the interim data deliveries to the FRB during main production.
During the pretest debriefing, interviewers reported that they could usually find the reported branch in the institution look-up table, and that the response time for the look-up was good. Some interviewers expressed surprise at the large number of respondents who knew the zip codes of their banks. In fact, in one in four (25%) situations in which a respondent was asked for the zip code of the branch of the financial institution he or she uses most often, the answer was DK (Table 3). In 71% of such situations respondents were able to provide a zip code.
Table 3. Branch Zip Codes: Provided, DK and RF by Question

Question | Total | Zip Provided | DK | RF
---|---|---|---|---
H5_1 | 68 | 61 | 5 | 2
H5_2 | 49 | 33 | 15 | 1
H5_3 | 26 | 13 | 12 | 1
H5_4 | 14 | 8 | 5 | 1
H5_5 | 8 | 4 | 2 | 2
H5_6 | 3 | 1 | 2 | 0
H5_7 | 1 | 0 | 1 | 0
H5_8 | 1 | 0 | 0 | 0
TOTAL | 169 | 120 | 42 | 7
TOTAL % | 100% | 71% | 25% | 4%
The zip code database appears to be highly accurate. As shown in Table 4, of the 120 situations in which a respondent provided the zip code of a branch of a financial institution, the database returned a city and state that the respondent confirmed as correct for that branch location 93% of the time.
Table 4. Zip Code Database: Matches Recorded and Confirmed

Question | Zip Provided | Match Recorded* | Match Confirmed*
---|---|---|---
H5_1 | 61 | 61 | 61
H5_2 | 33 | 32 | 31
H5_3 | 13 | 13 | 13
H5_4 | 8 | 7 | 1
H5_5 | 4 | 4 | 4
H5_6 | 1 | 1 | 1
H5_7 | 0 | 0 | 0
H5_8 | 0 | 0 | 0
TOTAL | 120 | 118 | 111
TOTAL % | 100% | 98% | 93%

*H5_1 is asked when the interviewer records a match. This question confirms that the branch is in the city and state associated in the database with the zip code.
Respondents were given the opportunity to provide a false name for a financial institution or other source of credit, as a way to protect the confidentiality of the source. These fake names were excluded from the institution look-up. In pretest two, in seven cases a respondent chose not to provide the real name of an institution or other credit source.
Table 5 shows the number of bank IDs captured at each pass. At the FRB's request, NORC can provide additional analysis of how well the subsection H look-up functions performed in pretest two.
Table 5. Bank IDs Captured at Each Look-Up Pass

Question | Bank ID Captured | Multiple Bank IDs Captured
---|---|---
H6_1 | 46 | 17
H6_2 | 17 | 6
H6_3 | 7 | 5
H6_4 | 4 | 1
H6_5 | 0 | 3
H6_6 | 0 | 0
H6_7 | 0 | 0
H6_8 | 0 | 0
TOTAL | 71 | 32
2.2.3 Question Order
The order of the questions in the main interview questionnaire appeared logical to both interviewers and respondents in the pretest, except for A4.1 and A4.1.1, as noted above.
2.2.4 Question Wording
Generally, question wording in the main questionnaire was fine in pretest two. As a result of the pretest, the team made a number of punctuation and other minor wording changes to some questions, and corrected typos. In a few instances, a word or two could be dropped from a question without altering its meaning.
The stems, substitutions and optional reads added to the questionnaire after the first pretest worked well in pretest two. They reduced repetition and made the wording more natural and less stilted. Interviewers appreciated these changes, which made interviews a bit less burdensome. The changes fell into three categories:
2.2.5 Questionnaire Logic
Except where noted elsewhere in this section, the logic of the main questionnaire was appropriate. In subsection C, an instruction was added so that when ownership was split equally between or among owners, partners or shareholders, the interviewer asks to talk about the respondent first, rather than another owner. The reason for the prompt is to enable the questions in subsection U to be about the respondent, and not another individual.
In addition, based on pretest two, a break-point screen was enhanced and will be given greater emphasis in future training sessions. In this pretest, knowledge of the break-point screen was fairly low: a fair number of interviewers were unsure of when to invoke the break-point screen and how to use it.
2.2.6 Question-By-Question Specifications (QxQs)
In general, pretest interviewers found the QxQs in the main interview helpful for defining financial terms to respondents. The primary negative comments were that QxQs need to be written in shorter sentences, and reformatted to break up large blocks of text to make it easier to retrieve the needed clarification.
In addition, the FRB identified a number of factual errors in the QxQs, especially for some of the financial subsections. NORC will make these changes to the QxQs in time for main production - improving their accuracy based on FRB feedback, and improving their usability based on interviewers' feedback.
3. TRAINING

This section describes the training of telephone interviewers for pretest two.
All pretest two interviewers had participated in pretest one, including the five days of SSBF-specific training that preceded the first pretest. Since they were seasoned SSBF interviewers, extensive training for pretest two was not necessary. Accordingly, NORC was able to cover all the training objectives in one day.
The primary objective was to present the changes to the instruments made since pretest one, and give interviewers the opportunity, through a mock interview, to familiarize themselves with the new versions of the CATI instruments. In addition, the session provided additional training and practice on getting past gatekeepers and securing cooperation.
The training session took place at NORC's Downers Grove Telephone Center on May 10, 2004. Participants included the ten interviewers, six members of NORC's project management staff, and two staff from the Federal Reserve Board. The materials for the training consisted of an agenda, training guide, and interviewer reference manual. The agenda is shown below. The training guide and interviewer reference manual were essentially unchanged from pretest one, except for:
Module | Topic | Duration | Time
---|---|---|---
1 | Welcome, Introductions, Overview of Training, Purpose of Pretest Two | 15 min | 8:30 - 8:45
2 | Overview of Changes Made for Pretest Two | 30 min | 8:45 - 9:15
3 | TNMS Changes and the Revised Screening Questionnaire | 60 min | 9:15 - 10:15
4 | The Revised Main Interview Questionnaire - Section I | 45 min | 10:30 - 11:15
5 | Main Interview - Subsections E: Use of Deposit Services, and F: Use of Credit and Financing | 60 min | 11:15 - 12:15
6 | Main Interview - Subsection MRL: Most Recent Loan | 30 min | 12:45 - 1:15
7 | Main Interview - Subsection G: Use of Other Financial Services | 15 min | 1:15 - 1:30
8 | Main Interview - Subsection H: Relationships with Financial Institutions | 45 min | 1:30 - 2:15
9 | Main Interview - Subsections L: Trade Credit and M: New Equity Investments | 15 min | 2:15 - 2:30
10 | Main Interview - Subsections P: Income & Expenses, R: Assets, S: Liabilities, U: Credit History, T: Respondent Payment Information | 60 min | 2:45 - 3:45
11 | Gaining Cooperation: Lessons Learned from Pretest One | 45 min | 3:45 - 4:30
12 | Interviewer Observations/Wrap Up | 30 min | 4:30 - 5:00
Starting with module three, interviewers used a round-robin mock interview to practice using the revised screener and main questionnaires. The mock interview was designed to highlight questions that had changed since pretest one, and during the round-robin trainers called out the changes, to further reinforce the new material. The round-robin mock script was based on interviewing a C-Corporation.
The training stayed on schedule; all the material was covered by the end of the day. Because the trainees were already experienced conducting SSBF interviews, they had relatively few questions, and they could focus on absorbing the new material.
A number of training issues emerged from talking with interviewers during the pretest two debriefing session. These issues, listed below, will be considered for subsequent rounds of training for new SSBF interviewers:
4. CONTACT MATERIALS

This section describes the materials that were mailed to respondents in pretest two. Respondents received a pre-screening advance mailing, and eligible respondents received a worksheet mailing. Each mailing is discussed in detail in this section, which also describes the project websites.
4.1 Pre-Screening Mailing to Business Owners
The pre-screening advance mailing consisted of a cover letter from the NORC project director, a letter from Federal Reserve Board Chairman Alan Greenspan, a buck slip, and a general information brochure with answers to frequently asked questions about the study. Other than the buck slip, the materials were modified versions of the documents used in the 1998 SSBF, and were identical to the documents used in 2003 SSBF pretest one. (For a detailed description of how the documents changed from 1998 to 2003, please see the pretest one report.)
For pretest two, NORC tested a "buck slip" in the pre-screening mailing (see Appendix). Slightly smaller than a No. 10 business envelope, with the FRB seal and three bulleted statements, the buck slip was intended to increase the incidence of recipients who came away from the pre-screening mailing with two critical pieces of information:
The buck slip was printed on green paper (suggesting paper currency) and was likely to be the first item a recipient saw once he or she opened the envelope. Of the 600 pretest two respondents, half (n=300) were randomly assigned to have the buck slip as part of their pre-screening mailing; the other half received an identical mailing, minus the buck slip.
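For reference, the random half-sample assignment described above can be sketched as follows; the seed shown is a hypothetical choice for reproducibility, not a documented project parameter.

```python
# A sketch of the half-sample random assignment used for the buck-slip
# experiment; the seed is a hypothetical choice, not a project parameter.
import random

def assign_buck_slip(cases, seed=2004):
    rng = random.Random(seed)
    shuffled = list(cases)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2                 # 600 cases -> 300 per arm
    return shuffled[:half], shuffled[half:]   # (buck slip, no buck slip)
```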
The findings from the buck-slip experiment are shown in Table 6. Column A shows results for the 300 cases that had a buck slip as part of the pre-screening mailing; column B shows results for the 300 cases that did not get a buck slip. Directionally, the data suggest that a buck slip might increase response rates slightly. The number of calls required to complete a screening interview with a firm that got a buck slip, however, is slightly higher, on average, than for a firm that did not get a buck slip, 5.2 calls vs. 4.8 calls. At best, the results are inconclusive. They do not demonstrate a strong positive effect of adding a buck slip to the pre-screening mailing.
Table 6. Buck-Slip Experiment Results

Measure | A. Buck Slip | B. No Buck Slip
---|---|---
Screener completion rate | 43% | 41%
Total completion rate (screener x main) | 12% | 10%
Average number of calls per completed screener | 5.2 | 4.8
Average number of calls for all cases | 6.7 | 6.3
N | (300) | (300)
Pretest two interviewers commented that respondents did not always remember receiving the pre-screening mailing, although respondents were not systematically asked if they had received it.
Results from pretest two provided no indication that changes to the pre-screening mailing are needed for main production.
4.2 Worksheet Mailing
The worksheet mailing was sent to firms that were determined to be eligible for the survey on the basis of the screening interview. As in pretest one, the pretest two worksheet mailing consisted of the following documents:
Two other changes were made to the project director's letter, to help recipients digest all the materials in the packet and not feel overwhelmed by the volume of materials:
Table 7. Records Used by Respondents During the Main Interview*

Record Type | N | %
---|---|---
Worksheet | 23 | 33
Completed IRS forms or attachments | 19 | 27
Financial statements or accounting reports | 12 | 17
Bank statements | 7 | 10
Other written records | 1 | 1
From memory | 34 | 49
Total | (70) | 100

*Except for "From memory," multiple responses allowed.
Pretest two findings, including the interviewers' debriefing, gave us no reason to change anything about the worksheet mailing.
We received worksheet packages back from 29 respondents (44% of the 66 completed main interviews). We have not yet reviewed these returned packages, so at this point we do not know how many contain completed or partially completed worksheets and how many contain other records, such as tax forms. All 29 packages are from pretest two, though it is possible that some were returned by respondents who did not complete the entire main interview.
4.3 SSBF Internet Sites
The pre-screening and worksheet letters from the project director refer respondents to two project websites, one hosted by the FRB (www.federalreserve.gov/ssbf) and the other hosted by NORC (www.norc.uchicago.edu/ssbf). The FRB website for the SSBF was online for pretest two, although we do not know how many respondents went to the site. The NORC site for the SSBF was still under development; pretest two respondents who visited the URL would have seen an "Under Construction" notice.
NORC's SSBF website went live on June 8, 2004. NORC had hoped to have the website available during pretest two, but developing the site took longer than planned, mainly because of the requirement that the site meet all Section 508 standards. Section 508 is a set of federal standards for making websites accessible to visually impaired users. One of the standards, a "Skip to content" link above the navigation bar on each page, proved difficult to program with NORC's website development software (Microsoft FrontPage). Ultimately, NORC found a solution by bringing together NORC's web developer with her FRB counterpart.
Other issues that delayed release of the SSBF website were:
The NORC project website provides a wealth of information to SSBF respondents, including frequently asked questions (FAQs), a letter from the project director, and information about NORC, the FRB and the survey itself. As a persuasive tool, the website has copies of endorsement letters from the National Business Association, the Small Business Administration, and the National Federation of Independent Business.
The NORC website has downloadable copies of the four versions of the financial worksheet required for the main interview. The main purpose of posting the worksheets is to let respondents see the kinds of information they will be asked to provide. Though the worksheets are downloadable, they print on 8.5 x 11-inch paper - too small for many respondents to use easily. NORC has posted a notice on the website that the intended size of each worksheet is 11 x 17 inches, with instructions to call the toll-free project hotline to request that full-size worksheets be sent through regular mail.
At the FRB's request, NORC added a page to the website explaining its privacy policy.
NORC reviewed the possibility of adding a counter to its site (which would track the number of site visitors), but the counters available for free all use persistent cookies, which are small files stored on a visitor's computer. The FRB was not in favor of counters that use persistent cookies, so NORC's SSBF website does not have one.
5. SAMPLE

This section describes the pretest sample and the procedures for frame acquisition and sample selection for pretest two.
5.1 Sample Frame

The initial frame, sent to NORC by D&B, consisted of 8,037,890 records. The initial data sent by D&B were a limited-content abstract of the standard DMI file. Each record consisted of the data elements listed in Table 8.
Table 8. Data Elements in the D&B Limited-Content Abstract

Data Elements | Data Elements
---|---
DUNS Number | Employee Here
State FIPS Code | Code for Estimated Range
County FIPS Code | Primary SIC Code
MSA FIPS Code (prior to 2000) | Status Indicator
Physical Location Zip Code | Subsidiary Indicator
Sales Volume | Manufacturing Indicator
Code for Estimated Range | Legal Status a.k.a. Business Type
Employee Total | Transition Code
Code for Estimated Range | D&B Report Date
NORC first purged the abstract file of 81 records with out-of-scope primary Standard Industrial Classification (SIC) codes. NORC next categorized employee total, and purged an additional 16,506 records with 500+ employees from the file. The remaining 8,021,303 records comprised the sampling frame for both pretests one and two.
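The two purges amount to simple record filters. The sketch below illustrates the steps under stated assumptions: the field names and the out-of-scope SIC prefixes shown are hypothetical placeholders, since the report does not list the actual codes.

```python
# A sketch of the two frame purges described above; field names and the
# out-of-scope SIC prefixes are hypothetical placeholders.
OUT_OF_SCOPE_SIC_PREFIXES = ("43",)   # hypothetical example

def build_sampling_frame(records):
    frame = []
    for r in records:
        if str(r["primary_sic"]).startswith(OUT_OF_SCOPE_SIC_PREFIXES):
            continue                   # out-of-scope industry (81 records)
        total = r.get("employee_total")
        if total is not None and total >= 500:
            continue                   # 500+ employees (16,506 records)
        frame.append(r)                # records with unknown employment are retained
    return frame

# 8,037,890 records - 81 - 16,506 = 8,021,303 records in the sampling frame
```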
The process of acquiring the pretest sample frame from D&B presented numerous difficulties. These difficulties are described in the NORC report on SSBF pretest one.
5.2 Sample Selection
The original specifications for both pretest samples called for 500 cases each to yield 50 completed interviews. Ultimately, however, an additional 250 cases were randomly selected from the tentative pretest two cases for possible use in pretest one. Additional sample therefore had to be selected from the more than eight million records of the limited-content abstract described in Section 5.1 to ensure that sufficient sample would be available for pretest two. Before selecting an additional 1,000 cases from the frame of 8,021,303 firms, the previously selected 1,000 pretest cases were removed from the frame, leaving 8,020,303 firms for selection.
Following the procedures used to select the 500 cases originally allocated to each pretest, NORC employed systematic, stratified random sampling to select the second 1,000 cases. As in pretest one and the first 250 cases of pretest two, six strata were created by crossing employment size (0-19 and unknown, 20-499) with business type (sole proprietorship, partnership, corporation). Four of the six strata were allocated sample sizes of 166; the other two were allocated 168. The frame was sorted by SIC code within strata, and selections were then made systematically within strata.
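A minimal sketch of this selection procedure follows. Which strata received 166 versus 168 cases is not stated above, so that assignment, like the field names, is hypothetical.

```python
# A minimal sketch of the systematic, stratified selection described
# above; the 166/168 assignment and field names are hypothetical.
import random

def systematic_sample(stratum, n):
    """Random-start systematic selection from one stratum."""
    stratum = sorted(stratum, key=lambda firm: firm["sic"])  # implicit SIC stratification
    interval = len(stratum) / n
    start = random.uniform(0, interval)
    return [stratum[int(start + i * interval)] for i in range(n)]

ALLOCATIONS = {  # (employment size, business type) -> stratum sample size
    ("0-19/unknown", "sole proprietorship"): 166,
    ("0-19/unknown", "partnership"): 166,
    ("0-19/unknown", "corporation"): 168,
    ("20-499", "sole proprietorship"): 166,
    ("20-499", "partnership"): 166,
    ("20-499", "corporation"): 168,
}  # 4 x 166 + 2 x 168 = 1,000 cases

def select_sample(strata):
    """strata maps each (size, type) key to its list of frame records."""
    sample = []
    for key, n in ALLOCATIONS.items():
        sample.extend(systematic_sample(strata[key], n))
    return sample
```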
In total, 600 firms were selected for pretest two: the 250 that had been set aside from the first 1,000 pretest cases and 350 from the second 1,000 pretest cases. The 250 cases resulted from randomly ordering (uniform distribution) the second 500 cases originally planned for pretest two, and the 350 cases from randomly ordering the second 1,000 pretest cases; each set was then split into replicates of 50 cases. In all, pretest two consisted of 12 replicates. Though they were ultimately not needed, the remaining cases from the second 1,000 selections were set aside for future pretest use.
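Replicate formation, as described above, can be sketched as follows; this is illustrative only.

```python
# A sketch of replicate formation: randomly order the selected cases
# (uniform distribution), then split into consecutive groups of 50.
import random

def make_replicates(cases, size=50):
    ordered = list(cases)
    random.shuffle(ordered)
    return [ordered[i:i + size] for i in range(0, len(ordered), size)]

# 600 pretest two cases -> 12 replicates of 50 cases each
```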
Though each of the 12 replicates is a valid random subsample of the original systematically sampled cases, due to time constraints there was no attempt to control the number of cases that were selected into each replicate by stratum or primary SIC distribution. The procedures will be modified for the main sample selection such that each replicate is representative of the target population within each stratum, preserving the implicit SIC stratification.
The sample of the second 1,000 cases was checked to ensure that the sample sizes within strata equaled the allocated amounts. The output was also spot-checked to make sure that the systematic approach was correctly implemented. Finally, the distribution of SIC codes in the sample was compared with the distribution in the frame. The sample passed each of these checks.
Note that for the main sample, any previously selected firms will first be removed from the updated DMI frame. Then the entire sample (37,600 firms) will be drawn simultaneously.
5.3 Sample Data Quality
Analyzing the quality of the sample data was an objective of the first pretest rather than the second; nevertheless, we provide some basic reporting on data quality below.
In this section, we analyze the quality of the address data for the firms in the sample, based on the results of processing these data with SmartMailer™ software, which is designed to standardize address data and determine their completeness based on United States Postal Service standards.
SmartMailer looks for the subject address in the given city and state. If the address is found, the zip code is verified and a ZIP+4 is obtained if available. Often, amplifying address information is found, e.g., an apartment, suite, or floor number; this new information is provided in a separate field, without overwriting the existing address. If SmartMailer cannot verify a zip code, it suggests an alternative zip that matches the address, city and state. If SmartMailer cannot verify the address at all, it flags the address but does not change any of the information. SmartMailer does not verify the resident or occupant at the subject address.
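The flow just described can be modeled compactly. The sketch below is illustrative pseudologic with a hypothetical `directory` stand-in for the USPS address data; it is not SmartMailer's actual API.

```python
# A simplified model of the address-check flow described above; this is
# illustrative pseudologic, not SmartMailer's actual API.

def check_address(street, city, state, zip_code, directory):
    """Classify one address roughly the way Table 9 categorizes outcomes.

    `directory` maps (street, city, state) to a dict with keys such as
    "zip", "zip4", and "unit" (apt., suite, or floor number).
    """
    entry = directory.get((street, city, state))
    if entry is None:
        return {"outcome": "not found"}   # address flagged, left unchanged
    result = {"outcome": "validated", "zip4": entry.get("zip4")}
    if entry["zip"] != zip_code:
        result["outcome"] = "zip corrected"
        result["suggested_zip"] = entry["zip"]   # alternative zip suggested
    if entry.get("unit"):
        # amplifying information goes in a separate field, without
        # overwriting the existing address
        result["unit"] = entry["unit"]
    return result
```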
The results from applying SmartMailer to the addresses of the 600 pretest two cases are summarized in Table 9.
Table 9. SmartMailer Results for the 600 Pretest Two Addresses

Outcome | N | %
---|---|---
Address validated by SmartMailer as complete and accurate | 281 | 46.8
Address lacked apt., suite, or floor number that SmartMailer could not provide | 70 | 11.7
Address contained spelling or formatting error corrected by SmartMailer | 100 | 16.7
Address lacked apt., suite, or floor number that SmartMailer could provide | 71 | 11.8
Address could not be found by SmartMailer | 78 | 13.0
Total | 600 | 100.0
Of the addresses supplied by D&B, 13% could not be found at all by SmartMailer. That is a relatively small, manageable percentage, though it is somewhat higher than the corresponding figure for the 750 pretest one addresses (7.2%).
As Table 9 shows, less than half (46.8%) of the addresses supplied by D&B were validated as complete and accurate by SmartMailer. The most common problem with the addresses supplied by D&B was a spelling or formatting error; these were all corrected by SmartMailer. The next most common problem was a missing apartment, suite or floor number; SmartMailer was able to provide the missing information for about half (50.3%) of these cases.
6. DATA COLLECTION

This section describes the data collection procedures used in pretest two and the results of data collection.
6.1 Data Collection Procedures
The data collection procedures evaluated in pretest two were the callback rules and disposition codes used in the Telephone Number Management System (TNMS), other TNMS design features, and the procedures for mailing respondent materials.
6.1.1 Callback Rules
Because of the much shorter data collection period in the pretests, NORC reduced the callback intervals in TNMS to approximately half of what they will be in the main survey, so that more calls would be made in a shorter period of time. For example, if the main-survey callback interval after a third ring-no-answer is 24 hours, we reduced it to 12 hours. Although NORC was able to complete the desired number of pretest interviews in the allocated time, pretest interviewers felt that cases were called back too frequently and after too short a period of time. This resulted in wasted calls and risked annoying respondents.
In the first pretest, managing the sample was made unnecessarily complicated by the rule that sent cases to locating after four ring/no-answer results. On reflection, these cases were going into locating too soon, requiring supervisors to determine which cases had been sent to locating as a conscious decision and which were there because of the programmed rule. Cases in locating by rule were routinely and effectively sent back for additional calls at different times and days; cases in locating by decision were worked by interviewers with locating experience. Although we did not change the callback rules for the second pretest, the rule will be modified for the main survey to trigger on seven ring/no-answer results instead of four.
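The two rules discussed in this subsection can be sketched as follows; the interval values shown are illustrative, not the actual TNMS parameters.

```python
# A minimal sketch of the two callback rules discussed above; the
# interval values are illustrative, not the actual TNMS parameters.
MAIN_SURVEY_CALLBACK_HOURS = {"third_ring_no_answer": 24}
PRETEST_DIVISOR = 2   # pretest intervals were roughly half the main-survey values

def pretest_callback_hours(event):
    return MAIN_SURVEY_CALLBACK_HOURS[event] / PRETEST_DIVISOR   # 24 -> 12

def needs_locating(ring_no_answer_count, threshold=7):
    """Send a case to locating after `threshold` ring/no-answer results.

    The pretests used threshold=4; the main survey will use 7.
    """
    return ring_no_answer_count >= threshold
```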
6.1.2 Disposition Codes
NORC used standard disposition codes for this pretest, modified slightly to handle study-specific circumstances (e.g., proxy refusals). At the first pretest debriefing, the interviewers made the following suggestions regarding disposition codes:
In addition, pretest two interviewers identified three more disposition codes that should be added to the list:
These additional codes indicate that contact was made - as opposed to a non-contact call attempt in which an interviewer immediately reached someone's voice mail.
6.1.3 Other TNMS Design Issues
In addition to changes to the callback rules and disposition codes, pretest one interviewers also suggested the following changes, which were made for pretest two.
6.1.4 Mailing Procedures

For pretest two, the NORC Mail Center followed the procedures established in pretest one. These procedures covered pre-screening advance materials, worksheets and related materials, and respondent incentives. For a detailed description of these procedures, please see the 2003 SSBF Pretest One Report.
To determine which firms should be sent a worksheet mailing, NORC captured the outcome of the screening interviews electronically in NORC's SurveyCraft Telephone Number Management System (TNMS). In pretest one, a programmer created the files for the worksheet mailing manually; this process was automated for pretest two.
There is one file for each worksheet type (i.e., sole proprietorship, partnership, C-corporation, and S-corporation), and a fifth file containing cases of unknown organization type. The worksheet mailing for companies of unknown organization type contained a copy of all four worksheets, along with a cover page explaining why four worksheets were enclosed and asking the respondent to complete the appropriate version. Of the 253 pretest two respondents to whom we mailed worksheets, 17 had unknown organization type.
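The file-building step can be sketched as a simple partition of the eligible cases; the field names below are hypothetical.

```python
# A sketch of splitting eligible screener completes into the five
# worksheet-mailing files; field names are hypothetical.
from collections import defaultdict

WORKSHEET_TYPES = {"sole proprietorship", "partnership",
                   "c-corporation", "s-corporation"}

def build_worksheet_files(eligible_cases):
    files = defaultdict(list)
    for case in eligible_cases:
        org_type = case.get("org_type")
        if org_type in WORKSHEET_TYPES:
            files[org_type].append(case)
        else:
            # unknown type: gets all four worksheets plus a cover page
            files["unknown"].append(case)
    return files
```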
For pretest two, the screening questionnaire was revised to collect a physical address rather than a mailing address, and an interviewer prompt was added reminding interviewers to probe for a street address if a respondent gave a Post Office box or rural route as the physical address. These changes were implemented to reduce the number of non-deliverable Federal Express mailing addresses. Interviewers reported that no respondents provided a P.O. Box or rural route when asked A11.1.6, the question requesting a physical address to which Federal Express can deliver.
Pretest two mailings went smoothly and NORC plans to use the same procedures for main production.
6.2 Results of Data Collection
6.2.1 Overall Results
Table 10 shows the results of data collection, starting with a sample of 600 cases. Of the 600 cases, 253 (42%) were completed eligible screeners, and 66 (11%) were completed eligible main interviews. The production goal for pretest two was a minimum of 50 completed main interviews.
Table 10. Overall Results of Data Collection

Outcome | N | %
---|---|---
Total sample | 600 | 100%
Completed screeners | 322 | 54%
Eligible completed screeners | 253 | 42%
Completed main interviews | 66 | 11%
Of completed screeners, 79% (253 / 322) were eligible. This ratio is in the range of previous SSBF rounds, in which between 70% and 80% of firms completing the screener were eligible for the main interview.
NORC completed 322 screeners and 66 main interviews. An additional nine main interviews were partially completed. The pretest was not a test of the response rates that might be achieved in the main study, for the following reasons:
6.2.2 Cumulative Production Results
Table 11 summarizes screening activity by day for pretest two. Screening started on May 10, 2004 and ended on May 27, 2004. As the table shows, Saturdays were very low-production days. Hours per completed screener (0.9) were nearly double what NORC had budgeted, probably due in part to the shortened intervals between callbacks (which increased the number of calls).
Table 11. Screener Production by Day (All Figures Cumulative to Date)

Day, Date | Hours | Complete Eligible | Complete Ineligible | Total Complete | Hours per Complete
---|---|---|---|---|---
Monday, May 10, 2004 | 34.0 | 27 | 9 | 36 | 0.9 |
Tuesday, May 11, 2004 | 72.3 | 73 | 23 | 96 | 0.8 |
Wednesday, May 12, 2004 | 113.8 | 111 | 33 | 144 | 0.8 |
Thursday, May 13, 2004 | 149.0 | 148 | 39 | 187 | 0.8 |
Friday, May 14, 2004 | 176.3 | 180 | 42 | 222 | 0.8 |
Saturday, May 15, 2004 | 180.3 | 181 | 43 | 224 | 0.8 |
Monday, May 17, 2004 | 202.3 | 205 | 52 | 257 | 0.8 |
Tuesday, May 18, 2004 | 220.3 | 213 | 54 | 267 | 0.8 |
Wednesday, May 19, 2004 | 235.8 | 217 | 56 | 273 | 0.9 |
Thursday, May 20, 2004 | 250.7 | 221 | 56 | 277 | 0.9 |
Friday, May 21, 2004 | 260.5 | 229 | 62 | 291 | 0.9 |
Saturday, May 22, 2004 | 260.5 | 229 | 62 | 291 | 0.9 |
Monday, May 24, 2004 | 274.4 | 244 | 66 | 310 | 0.9 |
Tuesday, May 25, 2004 | 286.6 | 252 | 69 | 321 | 0.9 |
Wednesday, May 26, 2004 | 289.4 | 252 | 69 | 321 | 0.9 |
Thursday, May 27, 2004 | 292.4 | 253 | 69 | 322 | 0.9 |
Table 12 summarizes main interview activity by day for pretest two. Hours per complete (2.9) were appreciably lower than budgeted (5.0), and lower than the HPC for pretest one (3.7). This may be due to the experience and caliber of the interviewers (see Section 7), to the changes made to the instruments after pretest one that improved readability, flow and cooperation (see Section 2), or both. As with screening, Saturdays were unproductive for completing interviews.
Table 12. Main Interview Production by Day (All Figures Cumulative to Date)

Day, Date | Hours | Completes | Hours per Complete
---|---|---|---
Tuesday, May 18, 2004 | 13.3 | 6 | 2.2 |
Wednesday, May 19, 2004 | 37.0 | 11 | 3.4 |
Thursday, May 20, 2004 | 64.3 | 22 | 2.9 |
Friday, May 21, 2004 | 86.5 | 28 | 3.1 |
Saturday, May 22, 2004 | 91.0 | 28 | 3.3 |
Monday, May 24, 2004 | 114.2 | 39 | 2.9 |
Tuesday, May 25, 2004 | 141.0 | 45 | 3.1 |
Wednesday, May 26, 2004 | 155.8 | 49 | 3.2 |
Thursday, May 27, 2004 | 188.8 | 66 | 2.9 |
6.2.3 Call Dispositions at the End of Pretest Two
The results of pretest two screening, by final case disposition, are presented in Table 13, and the results of main interviewing are presented in Table 14. These tables also show the average number of call attempts for each final disposition code.
Compared with pretest one, pretest two ended with cases spread across more disposition codes, for both screening and the main questionnaire. In pretest one, NORC supervisors reviewed every pending case (and its call history) before assigning a final disposition code. In pretest two, pending cases did not undergo such rigorous review at the end of the field period; the reason for the different treatment is that immediately following pretest two, NORC had to start working on main production, including a training session with a large number of trainees and making and testing changes to the CATI instruments.
Table 13. Results of Pretest Two Screening by Final Case Disposition

Last Disposition | Number of Cases | % | Cum. Number of Cases | Cum. % | Avg. Number of Calls
---|---|---|---|---|---
HOSTILE REFUSAL | 3 | 0.5% | 3 | 0.5% | 6.3 |
REFUSAL CONVERSION FAILED | 11 | 1.8% | 14 | 2.3% | 4.1 |
HOSTILE REFUSAL - SUSPEND SCREEN | 4 | 0.7% | 18 | 3.0% | 4.8 |
SUPERVISOR REVIEW | 24 | 4.0% | 42 | 7.0% | 6.2 |
SUPERVISOR REVIEW - SUSPEND SCREEN | 9 | 1.5% | 51 | 8.5% | 5.4 |
INELIGIBLE/OWNER SCREENED | 42 | 7.0% | 93 | 15.5% | 4.0 |
INELIGIBLE/PROXY SCREENED | 17 | 2.8% | 110 | 18.3% | 8.1 |
INELIGIBLE/DK RESPONSE/OWNER SCREENED | 5 | 0.8% | 115 | 19.2% | 9.0 |
INELIGIBLE/REF RESPONSE/OWNER SCREENED | 1 | 0.2% | 116 | 19.3% | 1.0 |
UNCONFIRMED FIRM NAME | 4 | 0.7% | 120 | 20.0% | 7.8 |
REGULAR BUSY | 3 | 0.5% | 123 | 20.5% | 8.7 |
RING NO ANSWER | 30 | 5.0% | 153 | 25.5% | 12.2 |
ANSWERING MACHINE NO MESSAGE LEFT | 3 | 0.5% | 156 | 26.0% | 8.7 |
LOCATING PROBLEM | 46 | 7.7% | 202 | 33.7% | 2.8 |
ANSWERING MACHINE MESSAGE LEFT | 40 | 6.7% | 242 | 40.3% | 13.5 |
OWNER/PROXY TO CALL 800 NUMBER | 6 | 1.0% | 248 | 41.3% | 10.7 |
LOCATING NEEDED (NON - CONTACT RULE) | 10 | 1.7% | 258 | 43.0% | 11.1 |
HUNG UP DURING INTRO | 10 | 1.7% | 268 | 44.7% | 9.6 |
PROXY REFUSAL | 2 | 0.3% | 270 | 45.0% | 17.0 |
OWNER REFUSAL | 3 | 0.5% | 273 | 45.5% | 5.3 |
GATEKEEPER REFUSAL | 1 | 0.2% | 274 | 45.7% | 1.0 |
OWNER NOT AVAILABLE/NO CALLBACK ESTABLISHED | 4 | 0.7% | 278 | 46.3% | 8.5 |
ADVANCE LETTER REMAIL | 1 | 0.2% | 279 | 46.5% | 1.0 |
PROXY REFUSAL - SUSPEND SCREEN | 5 | 0.8% | 284 | 47.3% | 12.2 |
OWNER REFUSAL - SUSPEND SCREEN | 18 | 3.0% | 302 | 50.3% | 7.8
GATEKEEPER REFUSAL - SUSPEND SCREEN | 8 | 1.3% | 310 | 51.7% | 9.3 |
PRIVACY MANAGER | 3 | 0.5% | 313 | 52.2% | 3.7 |
COMPLETE - ELIGIBLE | 253 | 42.2% | 566 | 94.3% | 5.0 |
SOFT APPOINTMENT - SUSPEND SCREEN | 26 | 4.3% | 592 | 98.7% | 10.2 |
SOFT APPOINTMENT | 3 | 0.5% | 595 | 99.2% | 11.0 |
LANGUAGE BARRIER | 1 | 0.2% | 596 | 99.3% | 2.0 |
INELIGIBLE GONE OUT OF BUSINESS | 4 | 0.7% | 600 | 100.0% | 2.8 |
Table 14 contains the results of data collection for the main interview, by final case disposition. As with the previous table, it also shows the average number of calls by final disposition. These results confirm the impression of the pretest interviewers, reported during the debriefing, that once a respondent starts the interview, he or she usually completes it.
The 66 cases counted as "complete" in this table were not subject to the completeness test required by the FRB for main production; rather, they made it all the way through the main interview questionnaire. Note that eleven cases were deliberately not worked, since NORC had achieved its production goal of 50 completes.
Table 14. Results of Main Interviewing by Final Case Disposition

Last Disposition | Number of Cases | % | Cum. Number of Cases | Cum. % | Avg. Number of Calls
---|---|---|---|---|---
VIRGIN | 11 | 4.8% | 11 | 4.8% | 0.0
HOSTILE REFUSAL - SUSPEND SCREEN | 1 | 0.4% | 12 | 5.2% | 7.0 |
SUPERVISOR REVIEW | 8 | 3.5% | 20 | 8.7% | 6.6 |
SUPERVISOR REVIEW - SUSPEND SCREEN | 9 | 3.9% | 29 | 12.6% | 5.4 |
REGULAR BUSY | 1 | 0.4% | 30 | 13.0% | 9.0 |
RING NO ANSWER | 7 | 3.0% | 37 | 16.1% | 9.6 |
ANSWERING MACHINE NO MESSAGE LEFT | 2 | 0.9% | 39 | 17.0% | 3.0 |
ANSWERING MACHINE MESSAGE LEFT | 21 | 9.1% | 60 | 26.1% | 10.8 |
OWNER/PROXY TO CALL 800 NUMBER | 4 | 1.7% | 64 | 27.8% | 6.5 |
LOCATING NEEDED (NON - CONTACT RULE) | 3 | 1.3% | 67 | 29.1% | 6.7 |
HUNG UP DURING INTRO | 2 | 0.9% | 69 | 30.0% | 4.0 |
PROXY REFUSAL | 1 | 0.4% | 70 | 30.4% | 2.0 |
OWNER REFUSAL | 9 | 3.9% | 79 | 34.3% | 5.3 |
GATEKEEPER REFUSAL | 2 | 0.9% | 81 | 35.2% | 4.0 |
OWNER NOT AVAILABLE/NO CALLBACK ESTABLISHED | 10 | 4.3% | 91 | 39.6% | 6.8 |
PROXY REFUSAL - SUSPEND SCREEN | 2 | 0.9% | 93 | 40.4% | 5.5 |
OWNER REFUSAL - SUSPEND SCREEN | 18 | 7.8% | 111 | 48.3% | 5.4
GATEKEEPER REFUSAL - SUSPEND SCREEN | 3 | 1.3% | 114 | 49.6% | 2.7 |
COMPLETE - ELIGIBLE | 66 | 28.7% | 180 | 78.3% | 5.5 |
SOFT APPOINTMENT - SUSPEND SCREEN | 32 | 13.9% | 212 | 92.2% | 8.1 |
SOFT APPOINTMENT | 17 | 7.4% | 229 | 99.6% | 6.6 |
INELIGIBLE GONE OUT OF BUSINESS | 1 | 0.4% | 230 | 100.0% | 1.0 |
6.2.4 Owner-Proxies
Of the 66 completed main interviews, nine (14%) were completed wholly or partly by proxies. Table 15 shows the question at which a proxy took over the interview from the owner. In two cases, an owner-designated proxy completed the entire interview beginning with question A1. In two other cases, a proxy took over from the owner at N1, the first question in the Records subsection that leads into the subsections about the firm's financial history.
Note that Table 15 shows three proxies taking over the interview at BP1. BP1 is not an actual question; it is the reference an interviewer sees when he or she invokes the break-point screen. Interviewers were trained not to record BP1, but rather to record the number of the last question asked of the owner before the break-off. That three responses were BP1 indicates the need for additional training on how interviewers should handle break-offs to a proxy.
Table 15. Question at Which a Proxy Took Over the Interview

Question Number | Number of Proxies
---|---
A1 | 2 |
D5 | 1 |
BP1 | 3 |
N1 | 2 |
P1 | 1 |
The proxy titles (below) suggest that owner-designated proxies were well qualified to take the survey beginning at the financial subsections following Section I; the titles generally indicate senior positions. One proxy, however, was a bookkeeper. We have not checked the size of this firm - it is possibly a small firm in which a bookkeeper would be the appropriate proxy for an owner.
Proxy titles were (one proxy per title):
7. INTERVIEWERS

NORC recruited twelve trainees to conduct pretest one; of the twelve, all but one qualified to work on pretest one. During the month between pretests, some SSBF interviewers tested changes to the CATI instruments and performed other tasks at the Downers Grove call center. Two interviewers dropped out between pretests and were not replaced, so nine interviewers worked on pretest two. Of those nine interviewers (Table 16):
Table 16. Interviewers for Pretest Two by Source
Source | Number of Interviewers |
---|---|
Experienced NORC staffers | 4 |
Accountemps | 1 |
Office Team | 2 |
Spherion | 2 |
TOTAL | 9 |
All pretest two interviewers were experienced pretest one interviewers.
All nine interviewers completed a six-hour course on general telephone interviewing practices and procedures. (Experienced NORC interviewers had completed this introductory training at the start of their employment with NORC; the other interviewers completed it immediately before the SSBF-specific training.) In addition, interviewers completed five days of SSBF-specific training before pretest one. Finally, interviewers received an additional day of training at the start of pretest two. (See Section 3 for a description of pretest two training.)
Pretest two interviewers had proven themselves in pretest one. They were experienced SSBF interviewers coming into pretest two and, through observation and other measures, all performed well in pretest two. Accordingly, there was no formal evaluation of interviewers' performance during pretest two.
Generally, pretest two interviewers were bright and motivated; a number of them have extensive, specialized knowledge of small-business accounting techniques and/or uses of credit. As of this writing, two pretest two interviewers have been promoted to SSBF telephone supervisor, and several of the others remain among our best, most productive interviewers on the project.
At the conclusion of pretest one, NORC formally evaluated interviewers' performance. The evaluation led to a number of ideas about SSBF best-practice hiring; these ideas are presented in detail in the report for pretest one. The key findings from pretest one can be summarized as follows:
NORC conducted a pretest two interviewer debriefing on May 26, 2004. The half-day session was held at NORC's Telephone Center in Downers Grove, Illinois and was attended by NORC project staff, John Wolken and Traci Mach from the FRB, and most of the pretest two telephone interviewers.
The agenda for the debriefing can be found in the Appendix. The session was organized into eleven modules covering different aspects of pretest interviewing, including what worked and what did not work, screener issues, main questionnaire issues, the TNMS, training materials, job aids, and gaining cooperation.
Participation in the session was high, with all but one interviewer offering observations and comments. The discussion was informal and wide-ranging. FRB and NORC project staff felt that good information and feedback were collected during the debriefing, which will be useful in improving systems, materials, and processes for the main survey. The findings from the debriefing session have been cited throughout this report.
8. SYSTEM INTEGRATION

In a survey as complex as the Survey of Small Business Finances, it is critical not only that each task go smoothly, but that the interfaces between tasks work smoothly as well. In this section, we discuss the interface between sampling and IT in getting the pretest sample loaded into the production systems, between IT and the Mail Center for the mailing of respondent materials, and between the Telephone Center and Accounts Payable for the payment of respondent incentives.
Early in the process of sample design, NORC sampling and IT staff worked together to identify the information from Dun and Bradstreet that should be preloaded into NORC's Case Management and Telephone Number Management systems. In general, this information included data used for reporting (such as organization type and size), for sample management (such as replicate and batch number), and for gaining cooperation (such as Standard Industrial Classification, and owner's name). Sampling and IT staff also agreed upon a data layout and a schedule for data delivery that would give IT enough time to load the sample information into the systems before data collection was to begin.
The close communication and joint planning of sampling and IT staff appears to have paid off, because this interface worked very smoothly in both pretests one and two.
As described above, the SSBF involves three separate mailings to respondents: 1) a pre-screening mailing to all sampled firms, 2) a worksheet mailing to firms determined to be eligible based on screening, and 3) an incentive mailing to firms that complete the main interview and select the $50 incentive. For the pre-screening mailing, the sample control file was read into NORC's Case Management System (CMS). NORC Mail Center staff accessed the sample information in the CMS and ran the company names and addresses through SmartMailer software (see Section 6.1.4). This process ran very smoothly in pretest two, and NORC plans to follow the same process for the main survey.
The worksheet mailing requires information about which firms screened in as eligible for the survey. The outcome of the screening interviews was captured, electronically, in NORC's SurveyCraft Telephone Number Management System (TNMS), and automatically transferred to the Case Management System (CMS). In pretest one, a programmer manually created the files for the worksheet mailing. In pretest two, the process was automated and ran smoothly; for main production we will continue using the automated process.
For the incentive mailing, case information from the TNMS was saved in a file and emailed to Accounts Payable. Of the 66 pretest respondents who made it through the entire main interview questionnaire, 52 (79%) requested the $50 incentive option.
This incentive choice was captured electronically in the TNMS during data collection. After pretest data collection ended, IT staff produced a file from the TNMS containing the information necessary for mailing a $50 check to each firm that selected this incentive option, including the owner's name and mailing address. Accounts Payable produced the checks and sent them, via interoffice mail, to the Mail Center, which applied postage by meter and mailed the checks to respondents. The process went smoothly and will be replicated for the main study.
9. DATA DELIVERY

NORC delivered the following to the FRB for pretest two:
10. APPENDIX

Worksheet Mailing Cover Letter

John Smith
ABC Company
123 First Avenue
Tempe, AZ 85283-1432
Dear John Smith:
You recently spoke with an NORC interviewer about the Survey of Small Business Finances. Thank you for sharing your time and information. You received this package because your firm has been selected for the main survey. Your participation in the main survey will make it possible to inform policymakers about the credit needs and the availability of credit for businesses like your own. Even if you are not currently using credit, your responses can help to ensure that credit will be available to you in the future.
We value your participation. As a token of appreciation for your participation in the main interview, we invite you to choose either $50 or Dun and Bradstreet's Small Business Solutions® information package, which retails for $199.
The enclosed worksheet makes the interview go faster. Side two of this letter lists the materials included in this package; one of them is your 11" by 17" SSBF worksheet. Ideally, we would like you to complete the worksheet prior to the interview, which will reduce the time the interview requires.
You may not need to complete the entire worksheet. Side 1 of the worksheet provides space to indicate the financial services you have used in the past year, and then name the financial institution(s) where you acquired these services. While the form is designed for all types and sizes of small businesses, many businesses only use one or two sources for one or two services. Side 2 provides space for balance sheet information and indicates where you can find this information in your current tax records.
We will accommodate your schedule. An NORC interviewer will call you soon to conduct the interview, which typically takes 30 to 45 minutes. Participation is voluntary and you may skip any question you choose, but we encourage you to participate so that we can gain an accurate picture. We want to reassure you that your responses will be kept confidential. Because we understand that your time is valuable, your interviewer will work with you to arrange a convenient time; if necessary, the interview can be broken up into shorter sessions.
After you have completed the interview, please use the enclosed prepaid envelope to return your worksheet to NORC. You may also fax records to 1-866-435-5637. If you have any questions about the study or need a different worksheet, please call our hotline at 1-800-692-4192. If you prefer, you can visit us online at norc.uchicago.edu/SSBF for more information, endorsements, and downloadable copies of the worksheets. You can also e-mail your questions to me at [email protected].
Again, thank you for participating in this important study. We look forward to speaking with you soon.
Sincerely,
Carol-Ann Emmons
Project Director, Survey of Small Business Finances
National Opinion Research Center
Materials in This Package (Side Two of the Letter)

- Explains the importance of participating.
- Answers questions you may have about the study.
- Simplifies and speeds up the interview. Provides space to record financial services used and the sources of these services on side 1, and to record financial data on side 2.
- Provides a detailed description of how the data are used, with examples from previous rounds of the study.
- Explains NORC's pledge and code of ethics for maintaining the confidentiality of information provided by participants.
- Describes the members and responsibilities of the Board of Governors.
- Describes the information package you may choose as an alternative to receiving $50 as a token of our appreciation.
When we call to conduct the interview, we will ask which gift you prefer. For more information about the information package, see the enclosed brochure, or view examples of the available reports by visiting D&B's Small Business Solutions® at www.dnb.com/smallbusiness/ssbf.
Pretest Two Interviewer Debriefing Agenda

Module | Schedule | Minutes | Topic (Moderator; Note-Taker)
---|---|---|---
1 | 8:30 - 8:45 | 15 | Warm-Up and Ground Rules (Bill Sherman)
2 | 8:45 - 9:15 | 30 | Pretest Two vs. Pretest One: Compare and Contrast (Carol Emmons; Bill as note-taker)
3 | 9:15 - 9:30 | 15 | New, Unusual or Difficult Situations (Mireya Dominguez; Bob as note-taker)
4 | 9:30 - 9:45 | 15 | GKs, Gaining Cooperation and Finding Rs (Bob Bailey; Terri as note-taker)
5 | 9:45 - 10:00 | 15 | Screener (Terri Kowalczyk; Carol as note-taker)
 | 10:00 - 10:15 | 15 | Break
6 | 10:00 - 10:30 | 30 | Main Quex (Terri Kowalczyk; Bob as note-taker)
7 | 10:30 - 10:45 | 15 | TNMS (Mireya Dominguez; Bill as note-taker)
8 | 10:45 - 11:00 | 15 | Break-Offs and Multiple Respondents (Mireya Dominguez; Bob as note-taker)
9 | 11:00 - 11:10 | 10 | Job Aids (Bill Sherman; Mireya as note-taker)
10 | 11:10 - 11:30 | 20 | Training (Carol Emmons; Terri as note-taker)
11 | 11:30 - 12:00 | 30 | Wrap Up and Other Issues (Carol Emmons)