Version 1.0.0 09/01/2022
The Asthma Call-back Survey (ACBS) is funded by the National Asthma Control Program (NACP) in the Asthma and Community Health Branch (ACHB) of the National Center for Environmental Health (NCEH). The state health departments jointly administer the ACBS with the National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP), Division of Population Health (DPH).
The NCEH and the NCCDPHP greatly appreciate the efforts of the BRFSS staff in each ACBS-participating state.
Kanta Sircar, PhD, MPH, PMP
Acting Branch Chief
Commander, US Public Health Service
Asthma and Community Health Branch
Division of Environmental Health Science and Practice
National Center for Environmental Health
Centers for Disease Control and Prevention
4770 Buford Hwy, NE, Mailstop S106-6
Atlanta, GA 30341 USA
Phone: (770) 488-3388
Machell G. Town, PhD
Branch Chief
Population Health Surveillance Branch
Division of Population Health
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
4770 Buford Hwy, NE, Mailstop S107-6
Atlanta, GA 30341 USA
Phone: (770) 488-4681
Public health surveillance is the ongoing, systematic collection, analysis, interpretation, and dissemination of data regarding a health-related event for use in planning and delivering public health action to reduce morbidity (disease) and mortality (death) and to improve health. Data disseminated by a public health surveillance system can help in the formulation of research hypotheses, as well as aid the following actions:
Guiding immediate action in a public health emergency.
Measuring the prevalence of a disease.
Identifying populations at high risk for disease.
Monitoring disease outbreaks.
Planning, implementing, and evaluating prevention/control strategies for diseases, injuries, and adverse exposures.
Monitoring behavior that increases health risk.
Asthma is one of the nation’s most common and costly chronic conditions. It will affect about
42.5 million US residents during their lifetime (Table 1-1 Lifetime Asthma Population Estimates in thousands by Age, NHIS, 2020 | CDC). In 2020, about 8.5 million adults and 1.8 million children had an asthma attack, which can be life threatening (2020 National Health Interview Survey (NHIS) Data | CDC). More than 4,100 people died from asthma-related complications in 2020 (Most Recent National Asthma Data | CDC).
Managing asthma and reducing the burden of this disease requires a long-term, multifaceted approach that includes patient education, behavior changes, asthma-trigger avoidance, pharmacological therapy, frequent medical follow-up, and the development of best practices that translate the findings of asthma-related research into sound public-health practice. In this way, disease-related data can help state and local health departments evaluate the need for their asthma control programs and interventions.
CDC’s National Asthma Control Program (NACP) plays a critical role in addressing the health risks that US residents face from this disease. The program funds states, cities, and schools to improve asthma surveillance, train health professionals, raise public awareness, and educate individuals with asthma and their families. The NACP is a function of the Asthma and Community Health Branch (ACHB), Division of Environmental Health Science and Practice in the National Center for Environmental Health (NCEH).
Surveys conducted by the National Center for Health Statistics collect data on asthma prevalence, asthma-related deaths (mortality), and several indirect indicators of asthma-related illness (morbidity), such as hospitalizations. These data provide a good basis for analyzing national trends, but not trends at the state level.
State health agencies acquire and use resources to reduce behavioral health risks and the diseases that may result from them. ACHB saw the need to expand existing data systems and develop new systems to make data readily available at a state or local level and provide asthma data with more detail.
In 1984, CDC established the Behavioral Risk Factor Surveillance System (BRFSS), a state-based system of health surveys administered and supported by the Division of Population Health in the National Center for Chronic Disease Prevention and Health Promotion. Beginning with 15 states in 1984, the BRFSS is now conducted in all states, the District of Columbia, and participating US territories. The BRFSS is a telephone survey that obtains information on health risk behaviors, clinical preventive health practices, and health care access, primarily related to chronic disease and injury. The BRFSS population is drawn from a random, representative sample of noninstitutionalized adults in each state. States use BRFSS data to identify emerging health problems, establish and track health objectives, and develop and inform health-related policies and programs.
In 2000, ACHB added questions about current and lifetime asthma prevalence to the core BRFSS survey. Since 2001, states have also had the option of adding an adult Asthma History Module to their survey, and in 2005 a Child Asthma Prevalence Module was added to the questionnaire (which requires the use of the Random Child Selection Module as well). However, many states do not choose to add these modules because of cost or because they have more-pressing needs for other health-related data.
Using the BRFSS to collect additional information on asthma met two of ACHB's three objectives to improve asthma surveillance. First, the BRFSS provides data for states and metropolitan statistical areas in the 50 states, the District of Columbia, and participating US territories. Second, it is a timely data source; data become available shortly after the end of the calendar year of data collection.
The third ACHB surveillance objective is to increase the content detail of asthma surveillance data. Efforts to meet this objective began in 1998 when ACHB began creating a new survey with more detailed asthma content, called the National Asthma Survey (NAS) (SLAITS - National Asthma Survey (cdc.gov)). A few pilot tests of the survey were conducted in 2001 and 2002. The first survey used the State and Local Area Integrated Telephone Survey, an independent survey mechanism that was an offshoot of the National Immunization Program survey at CDC. The NAS complemented and extended survey work from the National Health Interview Survey, National Health and Nutrition Examination Survey, and the BRFSS. It added depth to the existing body of asthma data, helped address critical questions surrounding the health and experiences of persons with asthma, and could provide data at the state and local levels.
In 2003 and early 2004, data were collected by the NAS in a national sample and in four states, but this proved to be a complex and costly process. Therefore, in 2004, ACHB considered using the BRFSS to identify respondents with asthma for further interviewing on a call-back basis because the BRFSS includes a much larger sample size in each area than that of the NAS. Respondents who answered “Yes” to questions about current or lifetime asthma during the BRFSS interview would be eligible for the subsequent asthma survey.
In 2005, the original NAS questionnaire was modified to eliminate items already on the BRFSS and to add some content requested by the individual states. The BRFSS provided respondents for the call-back survey in three asthma grantee states (Minnesota, Michigan, and Oregon) for the call-back pilot. ACHB increased the size of each state’s BRFSS sample to 10,000 respondents, hoping to obtain at least 1,000 respondents with asthma to call back. However, this increase in sample size was very expensive, costing an additional $500,000 per state. Consequently, since 2006, the state BRFSS sample has not been increased for the ACBS.
States that plan to conduct the ACBS among adults with asthma no longer need to add the Adult Asthma History Module to the BRFSS, since the questions on the call-back survey provide more detailed answers. Nevertheless, if states wish to include children in the call-back survey, they must also include both BRFSS child modules: Random Child Selection and Childhood Asthma Prevalence Module.
The ACBS has been implemented through the BRFSS every year since 2006. With the 2011 survey, the BRFSS weighting methodology was changed significantly, and cell phone samples were added to the traditional landline phone samples. The new weighting methodology, iterative proportional fitting (also known as "raking"), replaced the poststratification weighting method that had been used with previous BRFSS data sets. Because of these two methodological changes, data from 2010 and earlier are not comparable with data from 2011 and later. Since the ACBS is methodologically linked to the BRFSS, ACBS data are subject to the same two methodological changes. Consequently, ACBS data from 2010 and earlier should not be compared or combined with ACBS data from 2011 and later.
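The raking procedure itself can be illustrated with a minimal sketch of iterative proportional fitting. The category names, targets, and convergence settings below are invented for the example; the actual BRFSS raking uses many more demographic margins:

```python
def rake(weights, margins, targets, max_iter=50, tol=1e-8):
    """Iterative proportional fitting: rescale record weights so that
    weighted totals match the target totals on each raking margin.

    weights: initial weight per record
    margins: dict of dimension -> list of category labels, one per record
    targets: dict of dimension -> {category: target weighted total}
    """
    w = list(weights)
    for _ in range(max_iter):
        max_change = 0.0
        for dim, labels in margins.items():
            # Current weighted total in each category of this margin.
            totals = {}
            for wi, lab in zip(w, labels):
                totals[lab] = totals.get(lab, 0.0) + wi
            # Scale each record's weight toward the margin's target total.
            for i, lab in enumerate(labels):
                factor = targets[dim][lab] / totals[lab]
                max_change = max(max_change, abs(factor - 1.0))
                w[i] *= factor
        if max_change < tol:
            break
    return w

# Hypothetical example: 4 records raked to sex and age-group margins.
weights = [1.0, 1.0, 1.0, 1.0]
margins = {"sex": ["M", "M", "F", "F"],
           "age": ["18-44", "45+", "18-44", "45+"]}
targets = {"sex": {"M": 6.0, "F": 4.0},
           "age": {"18-44": 5.0, "45+": 5.0}}
raked = rake(weights, margins, targets)  # weighted totals now match targets
```

After raking, the weighted total for each sex and age category equals its target, which is the defining property of the method.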
In addition, while the BRFSS initiated cell phone samples in 2011, not all ACBS-participating states included the cell phone sample in the ACBS. In 2011, only 6 of the 40 states included the cell phone sample in the ACBS; therefore, the ACBS used only the landline samples. The landline sample weight was used to produce the ACBS weight, and only landline data were included in the 2011 public-release file.
Detailed information for ACBS data from 2011–2019 can be found in the documents titled “History and Analysis Guidance” at: CDC - BRFSS - BRFSS Asthma Call-back Survey.
Data from the ACBS 2011 landline-only file are methodologically comparable with data from the landline-only files from 2012 and later but are not comparable with the ACBS Landline and Cell Phone (LLCP) data. Data from the ACBS 2012 LLCP files are methodologically comparable only with ACBS LLCP files from 2013 and later. From 2015 forward, ACBS public-release files only include states collecting both landline and cell phone samples for both adult and child data.
In 2020, ACBS protocol required that states collect both landline and cell phone (LLCP) samples for both adults and children. The adult ACBS LLCP public file includes the 28 states/territories that collected both samples and met data-quality standards. Furthermore, many states/territories collected child data; however, only 7 of those that collected both samples met the data-quality standards.
Questionnaires, tables, data files, and documentation for the ACBS can be accessed at CDC - BRFSS - BRFSS Asthma Call-back Survey.
The BRFSS cooperative agreement provided ACHB funding for the 2020 ACBS. Any state or territory can apply for funds to implement the ACBS. States must include both the Childhood Asthma Prevalence Module and the Random Child Selection Module to include children in the call-back survey. The BRFSS sample size will not be increased for the ACBS. To produce a sufficient number of respondents for detailed analysis, it is recommended that a state conduct the ACBS for at least 2 consecutive years. States participating in the 2020 ACBS are shown in the following table.
States | Adult LLCP | Child LLCP |
Arizona | | DNCCD |
California | + | * |
Connecticut | | |
Florida | | * |
Georgia | | |
Hawaii | | * |
Illinois | + | * |
Indiana | | * |
Iowa | | DNCCD |
Kansas | | * |
Kentucky | + | * |
Maine | | * |
Maryland | + | * |
Massachusetts | | * |
Michigan | | * |
Minnesota | | |
Missouri | | * |
Montana | | * |
Nebraska | | * |
Nevada | + | DNCCD |
New Hampshire | | * |
New Jersey | | |
New Mexico | | * |
New York | | * |
Ohio | | * |
Oregon | | DNCCD |
Pennsylvania | | * |
Rhode Island | | * |
Texas | | |
Utah | | |
Vermont | | |
Wisconsin | | * |
Puerto Rico | | * |
LLCP = Landline and Cell Phone Combined Sample; DNCCD = Did Not Collect Child Data
* Child data not included in the public use file due to having < 75 completes (See Data Anomalies).
+ Adult data not included in the public use file (See Data Anomalies).
Blank = Included in public use file
BRFSS collects data in all 50 states as well as in the District of Columbia and participating territories. BRFSS questionnaires, data, and reports are available at CDC - BRFSS. The most recent BRFSS data user guide can be found at: The BRFSS Data User Guide June 2013 (cdc.gov).
From the parent survey (BRFSS), the ACBS inherits a complex sample design involving multiple reporting states/territories. These factors complicate the analysis of the ACBS. Additionally, some states stray from traditional BRFSS and ACBS protocol; these variations should be considered prior to analysis of these data. Information on the BRFSS deviations can be found in the document titled Comparability of Data, which can be accessed at: BRFSS Comparability of Data 2016 (cdc.gov).
Several states did not collect ACBS data in some months of the 12-month collection period. This may be an issue when investigating seasonal patterns in the data. States with more than 3 months of no collected interviews are noted below. States missing 6 or more months of ACBS data in the 12-month collection period are excluded from the public use data file.
2020
California did not collect adult or child interviews in January, February, March, April, May, and August. California adult data are excluded from the public-release file due to having too few records (<200) to produce reliable weights. California child data are excluded from the public-release file due to having too few records (<75) to produce reliable weights.
Kansas did not collect adult interviews in January, February, March, and October.
Kentucky did not collect adult interviews in January, February, March, June, July, October, November, and December. Also, Kentucky did not collect child interviews for January, February, March, April, May, June, July, September, October, November, and December. Neither Kentucky’s adult nor child data can be included in the public-release file due to missing over 6 months of ACBS interviews.
Maryland did not collect adult interviews in January, February, March, April, May, June, July, August, and September. Also, Maryland did not collect child interviews in January, February, March, April, May, June, July, August, September, October,
and December. Neither Maryland’s adult nor child data can be included in the public-release file due to missing over 6 months of ACBS interviews.
Massachusetts did not collect child interviews in June, September, October, and November.
Montana did not collect child interviews in January, February, March, and October.
Nevada did not collect adult interviews in January, February, March, April, May, June, July, August, September, October, and November. Nevada adult data are excluded from the public-release file due to missing over 6 months of ACBS interviews.
New Jersey did not collect adult or child interviews in January, February, March, and April.
New Mexico did not collect child interviews for January, February, March, April, and May.
Pennsylvania did not collect adult interviews in January, September, November, and December. Also, Pennsylvania did not collect child interviews in January, July, September, November, and December.
Puerto Rico did not collect adult interviews in January, February, March, April, May, June, July, August, and September. Also, Puerto Rico did not collect child interviews in January, February, March, April, May, June, July, August, September, and October.
Several states varied from ACBS protocol in ways that affected the weighting procedures.
2020
Massachusetts conducted the call-back only for a version 1 sample for its child data set. Massachusetts is not included in the child public-release file due to having too few records (<75) to produce reliable weights.
New York conducted the call-back only for a version 2 sample for its adult and child data sets. New York is not included in the child public-release file due to having too few records (<75) to produce reliable weights.
Puerto Rico did not collect landline data for its child data set and is excluded from public release both for having too few records (<75) and for failing to collect both landline and cell phone data, as the ACBS data-collection guidelines require.
Arizona, Iowa, Nevada, and Oregon did not collect child ACBS data.
Adult data for California, Kentucky, and Nevada were not included in the public-release data file because of having too few records (<200) to produce reliable weights.
Maryland only collected data in September, October, November, and December and was excluded from the public-release file.
Illinois adult and child data are not included in the public-release data file because the state did not submit the correct data file due to vendor error, and the data could not be extracted for weighting.
Among weighted states, in Arizona, Connecticut, Florida, Georgia, Hawaii, Iowa, Indiana, Maryland, Minnesota, New Hampshire, New Jersey, New York, Ohio, Pennsylvania, Rhode Island, Texas, Utah, Vermont, and Wisconsin, more than 10% of the BRFSS records for adult respondents with asthma did not record information about call-back participation. For these states, weighting was done using a Modified Adjustment Factor Method.
Among weighted states, in Minnesota, Utah, and Vermont, more than 10% of the BRFSS records for child respondents with asthma did not record information about call-back participation. For these states, weighting was done using a Modified Adjustment Factor Method.
For additional information on weighting the ACBS records, refer to the document “Asthma Call-back Weighting Methods,” which can be requested from NCEH/EHHE/ACHB (asthmacallbackinfo@cdc.gov).
The Institutional Review Board (IRB) in some states required that asthma be mentioned when the BRFSS respondent was asked to participate in the ACBS. Other states required that asthma not be mentioned. Some state IRBs required that BRFSS respondents be specifically asked if their BRFSS responses could be linked to their ACBS responses. Other state IRBs did not. If a state required active consent to link the responses from the two interviews, the PERMISS variable on the data file will be coded one (1) for yes. If consent was denied, the ACBS was not conducted and there will be no record in the file. Wording for specific consent scripts can be obtained from each participating state.
Several states asked the ACBS consent questions directly after the asthma questions in the core of the BRFSS survey. Other states asked the consent questions at the end of the BRFSS interview.
The data collection period runs from January of the BRFSS sampling year through the end of March of the following year. Approximately 10% of ACBS interviews are completed in the year following the BRFSS sampling year. The variable IYEAR_F identifies the year of the call-back interview.
Information about survey disposition codes, item non-response, and complete and incomplete designation can be found in the ACBS Summary Data Quality Report. Similar information about the BRFSS can be found in the BRFSS Summary Data Quality Report, which can be accessed at Behavioral Risk Factor Surveillance System 2020 Summary Data Quality Report (cdc.gov).
When the intent of an analysis is to compare those who have asthma with those who do not, the appropriate file to use is the BRFSS file. The sample size is larger, and responses to BRFSS questions are available for all respondents, with and without asthma. The percentage of ACBS adult respondents, nonrespondents, and BRFSS-eligible asthma respondents by demographic group and state/territory can be found in the Behavioral Risk Factor Surveillance System 2020 Summary Data Quality Report (cdc.gov).
When the intent of an analysis is to compare subpopulations of those with asthma, the appropriate file to use is the asthma call-back file.
Data record
The ACBS record for a respondent consists of the entire BRFSS interview record, followed by the ACBS data. There is no need to merge the ACBS data with data from the BRFSS interview. The ACBS codebook, however, does not include the BRFSS portion of the data. BRFSS codebooks can be accessed at: CDC - BRFSS Annual Survey Data after selecting an individual survey year.
Skip patterns
The Asthma Call-back questionnaire has multiple and complex skip patterns. Each of the skip patterns has been coded into subsequent questions using individual value codes to identify the source response that caused the question to be skipped. These additional codes do not appear in the questionnaire but are in the codebook. This skip coding allows the analyst to clearly determine an existing skip pattern and easily decide the denominator appropriate for any given analysis or statement without tracing skip patterns in the questionnaire. For more information on coding skip patterns, see the document Coding Skip Patterns, which can be requested from NCEH/DEHSP/ACHB (asthmacallbackinfo@cdc.gov).
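As an illustration, the hypothetical snippet below shows how such an embedded skip code lets an analyst form the correct denominator without tracing the questionnaire. The code values here are invented for the example and are not the actual ACBS codes:

```python
# Hypothetical follow-up question: usual response codes (1 = Yes, 2 = No)
# plus an extra value code (here 6) meaning "skipped because the source
# question was answered No".
SKIPPED = 6

responses = [1, 2, 6, 1, 6, 2]  # invented records for the example

# The appropriate denominator excludes records skipped past the question.
denominator = [r for r in responses if r != SKIPPED]
pct_yes = 100.0 * denominator.count(1) / len(denominator)  # 50.0
```

The same filtering generalizes to any ACBS question: the codebook's skip codes identify exactly which records belong in the denominator.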
Calculated variables
Not all the variables that appear on the public use data set are taken directly from the ACBS questionnaire. CDC prepares a large set of calculated variables that are added to the actual questionnaire responses. Most of the variables on the ACBS file are calculated variables. The calculated variables are created for the user's convenience. The procedures for the calculated variables vary in complexity; some only combine codes from one or
two questions, while others require sorting and combining selected codes from multiple variables.
At the time of the call-back interview, the respondent is asked to confirm the responses to the two asthma questions from the BRFSS interview. Not all respondents agree with the responses that were recorded from the initial interview.
The calculated combined call-back asthma variables _CUR_ASTH_C and
_EVER_ASTH_C are not identical to the BRFSS asthma variables ASTHNOW and ASTHMA3 (CASTHNO2 and CASTHDX2 for children) or the BRFSS adult calculated variables _CASTHM1 and _LTASTH1.
The combined call-back variables _CUR_ASTH_C and _EVER_ASTH_C use the BRFSS responses when the respondent agreed with them and the ACBS responses at the time of the call-back interview when the respondent did not agree with the BRFSS responses.
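The rule described above can be sketched as a small function. The function name and call pattern are illustrative only; the actual CDC derivation of _CUR_ASTH_C and _EVER_ASTH_C is documented in the codebook:

```python
def combined_response(brfss_resp, confirmed_at_callback, acbs_resp):
    """Sketch of the combined call-back variable rule: keep the BRFSS
    response if the respondent confirmed it at the call-back interview;
    otherwise use the response given at the call-back interview.
    (Illustrative only, not the actual CDC algorithm.)
    """
    return brfss_resp if confirmed_at_callback else acbs_resp

# Respondent confirms the BRFSS answer: BRFSS response is kept.
kept = combined_response(1, True, 2)      # 1
# Respondent disagrees: the call-back response is used instead.
replaced = combined_response(1, False, 2)  # 2
```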
Questionnaire changes
There were no changes to the ACBS questionnaire for 2014 through 2020.
2013 ACBS questionnaire changes included:
Inhaler medications Brethaire, Intal, and Tilade were deleted since all have been discontinued.
Inhaler medications Alvesco and Dulera were added.
Nebulizer medications Combivent Inhalation Solution and Perforomist/Formoterol were added.
2012 ACBS questionnaire changes included:
The name for INH_MEDS 25 was changed to Flex Haler.
Three inhaler medication questions were deleted (ILP01, ILP02, and ILP07).
Response categories for inhaler medication question ILP03 were changed.
The question PILLX (how long taking a specific pill) was deleted and PILL01 (on daily use) was added.
Three questions on nebulizers were added.
Some skip patterns and help screens were revised in the medication section.
The content of the Work-related asthma section was completely revised.
The time reference period for the activity limitation variable was changed from 12 months to 30 days.
Statistical issues
Unweighted data on the ACBS represent the actual responses of each respondent before any adjustment is made for variation in respondents' probability of selection, disproportionate selection of population subgroups relative to the state's population distribution, or nonresponse. To produce the ACBS final weight, the BRFSS final weight is adjusted for loss of sample between the BRFSS interview and the ACBS interview.
Weighted ACBS data represent results that have been adjusted to compensate for nonresponse at the BRFSS interview and at the ACBS interview. For further details regarding the ACBS final weight, refer to the document entitled Asthma Call-back Weighting Method, which can be requested from NCEH/DEHSP/ACHB (asthmacallbackinfo@cdc.gov).
Use of the ACBS final weight is essential when analyzing these data. If weights are not used, the estimates produced will be biased.
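A toy example makes the bias concrete. Assuming invented final weights in which respondents with asthma happen to represent fewer people than those without, the unweighted estimate is twice the weighted one:

```python
def prevalence(flags, weights=None):
    """Weighted prevalence of a 0/1 indicator; unweighted if weights omitted."""
    if weights is None:
        weights = [1.0] * len(flags)
    return sum(w for f, w in zip(flags, weights) if f) / sum(weights)

# Hypothetical data: two respondents with asthma carry smaller final
# weights (represent fewer people) than the two without.
has_asthma = [1, 1, 0, 0]
final_wt = [100.0, 100.0, 300.0, 300.0]  # invented final weights

unweighted = prevalence(has_asthma)          # 0.50 -- biased upward here
weighted = prevalence(has_asthma, final_wt)  # 0.25
```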
In 2020, all states implementing the ACBS included both the BRFSS landline and cell phone samples; therefore, the public use file was released for the combined landline and cell phone sample. The data file includes landline and cell phone data from the subset of those states that met data-quality standards.
The ACBS child files must have a minimum of 75 completes to produce a reliable child weight. The public use file for children was released for the combined landline and cell phone samples (included 4 states/territories that met data quality standards).
In the 2020 ACBS LLCP file, the ACBS final weight was produced using the BRFSS landline cell phone weight (_LLCPWT for adults and _CLLCPWT for children). The 2020 ACBS LLCP final weight variables are:
The procedures for estimating variances described in most statistical texts and used in most statistical software packages are based on the assumption of simple random sampling (SRS). The data collected in the ACBS, however, are obtained through a complex sample design; therefore, the direct application of standard statistical analysis methods for variance estimation (including standard errors and confidence intervals) and hypothesis testing (p-values) may yield misleading results.
SAS SURVEYMEANS, SURVEYFREQ, SURVEYLOGISTIC, and SURVEYREG
can be used for tabular and regression analyses.
SUDAAN can be used for tabular and regression analyses and has additional options.
Epi Info's C-sample can be used to calculate simple frequencies and two-way cross-tabulations.
SPSS Complex Samples can be used to produce frequencies, descriptive analysis, cross-tabulations, and ratios as well as estimate general linear, logistic, ordinal, and Cox regression models.
STATA can produce cross-tabulations, means, logit, and general linear regression models.
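For a rough sense of how much the simple-random-sampling assumption understates uncertainty, the sketch below inflates the SRS standard error of a proportion by the square root of the design effect (Kish's approximation). This is only an illustration; actual ACBS analyses should use the survey procedures listed above with the final weight and design variables:

```python
import math

def srs_se(p, n):
    """Standard error of a proportion under simple random sampling."""
    return math.sqrt(p * (1.0 - p) / n)

def design_adjusted_se(p, n, deff):
    """Approximate design-based SE: inflate the SRS SE by sqrt(deff)."""
    return srs_se(p, n) * math.sqrt(deff)

# With p = 0.5, n = 400, and a design effect of 1.5, the naive SRS SE
# understates the design-based SE by about 22%.
naive = srs_se(0.5, 400)                      # 0.025
adjusted = design_adjusted_se(0.5, 400, 1.5)  # ~0.0306
```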
Analytic issues
Although the overall number of respondents in the ACBS is more than sufficiently large for statistical inference purposes, subgroup analyses (including state-level analysis) can lead to estimates that are unreliable. Consequently, users need to pay particular attention to the subgroup sample when analyzing subgroup data, especially within a single data year or geographic area. Small sample sizes may produce unstable estimates. Reliability of an estimate depends on the actual unweighted number of respondents in a category, not on the weighted number. Interpreting and reporting weighted numbers that are based on a small, unweighted number of respondents can mislead the reader into believing that a given finding is much more precise than it actually is.
ACBS follows a rule of not reporting or interpreting point estimates based on fewer than 50 unweighted respondents (e.g. percentages based upon a denominator of < 50) or for which the Relative Standard Error is greater than 30%. For this reason, and to protect confidentiality of these data, the FIPS County code is not included on the ACBS public use data record.
When data from one time period are insufficient, data from multiple periods can be combined as long as the prevalence of the factor of interest did not substantially
change during one of the periods. One method that can be used to assess the stability of the prevalence estimates is shown in the following steps:
Compute the prevalence for the risk factor for each period.
Identify a statistical test appropriate for comparing the lowest and the highest estimates at the 5% level of significance. For example, depending on the type of data, a t-test or the sign test might be appropriate.
Test the hypothesis that prevalence is not changing by using a two-sided test in which the null hypothesis is that the prevalence is equal.
Determine whether the resulting difference could be expected to occur by chance alone less than 5% of the time (i.e., test at the 95% confidence level).
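As an illustration of steps 2 through 4, the sketch below applies a two-proportion z-test to hypothetical lowest and highest period estimates. It ignores the complex sample design, so in practice the standard errors should come from design-based survey software:

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-sided two-proportion z statistic using the pooled standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1.0 - pooled) * (1.0 / n1 + 1.0 / n2))
    return (p1 - p2) / se

# Hypothetical lowest (10%) and highest (15%) period estimates:
z = two_prop_z(0.10, 1000, 0.15, 1000)
# |z| > 1.96 rejects "prevalence is equal" at the 5% level, so the
# periods should not be combined in this hypothetical case.
stable = abs(z) < 1.96
```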
When combining multiple years of ACBS data for the purpose of subgroup analysis, the final weight will need adjusting and the file year will need to be added as an additional stratum on the complex design specification. However, when combining multiple years of data for the purpose of examining trends, reweighting is not appropriate. For more information on reweighting combined years, see the document Reweighting Combined Files, which can be requested from NCEH/DEHSP/ACHB (asthmacallbackinfo@cdc.gov).
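One common approach, sketched below as an assumption (the official ACBS procedure is described in the Reweighting Combined Files document), is to divide each year's final weight by the number of years combined and carry the file year into the stratum identifier:

```python
def combine_years(records_by_year):
    """Sketch of combined-year reweighting for subgroup analysis
    (an assumption, not the official ACBS method): divide each final
    weight by the number of years combined, and pair the file year with
    the original stratum so the year acts as an additional stratum in
    the complex-design specification.

    records_by_year: dict of year -> list of (stratum, final_weight).
    """
    k = len(records_by_year)
    combined = []
    for year, records in records_by_year.items():
        for stratum, wt in records:
            combined.append({"stratum": (year, stratum), "weight": wt / k})
    return combined

# Hypothetical two-year file: each final weight is halved.
combined = combine_years({2019: [(1, 10.0)], 2020: [(1, 30.0)]})
```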
Provided that the prevalence of risk factors did not change rapidly over time, data combined for two or more years may provide a sufficient number of respondents for additional estimates for population subgroups (such as age/sex/race subgroups or state populations). Before combining data years for subgroup analysis, it is necessary to determine whether the total number of respondents will yield the precision needed, which depends upon the intended use of the estimate. For example, greater precision would be required to justify implementing expensive programs than what would be needed for general information only.
The table below shows the sample size required for each of several levels of precision, based on a calculation in which the estimated risk factor prevalence is 50% and the design effect is 1.5.
Precision desired | Sample size needed |
2% | 3600 |
4% | 900 |
6% | 400 |
8% | 225 |
10% | 144 |
15% | 64 |
20% | 36 |
Precision is indicated by the width of the 95% confidence interval around the prevalence estimate. For example, precision of 2% indicates that the 95% confidence interval is plus or minus 2 percentage points around 50%, that is, 48% to 52%. As shown in the table, this high level of precision requires a sample of about 3,600 persons.
When a lower level of precision is acceptable, the sample size can be considerably smaller.
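The table can be reproduced (to rounding) from the standard formula n = deff * z^2 * p(1-p) / d^2, with p = 0.50, deff = 1.5, and z = 1.96 for a 95% confidence interval:

```python
def required_sample_size(precision, p=0.5, deff=1.5, z=1.96):
    """Sample size for a 95% confidence interval of half-width `precision`
    around a proportion p, inflated by the design effect."""
    n_srs = (z ** 2) * p * (1.0 - p) / (precision ** 2)
    return deff * n_srs

# Reproduces the table above to rounding:
n_2pct = required_sample_size(0.02)   # ~3600
n_10pct = required_sample_size(0.10)  # ~144
```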
The design effect is a measure of the complexity of the sampling design that indicates how the design differs from simple random sampling. It is defined as the variance for the actual sampling design divided by the variance for a simple random sample of the same size (Frazier 1992; Kish 1965). For most risk factors in most states, the design effect is less than 1.5. If it is more than 1.5, however, sample sizes may need to be larger than those shown in the table above.
The standard error of a percentage is largest at 50% and decreases as the percentage approaches 0% or 100%. From this perspective, the required sample sizes listed in the table above are conservative estimates. They should be reasonably valid for percentages between 20% and 80% but may significantly overstate the required sample sizes for smaller or larger percentages.
Compared with face-to-face interviewing techniques, telephone interviews are easy to conduct and monitor and are cost efficient, but telephone interviews do have limitations. Telephone surveys may have higher levels of non-coverage than face-to-face interviews because some US households cannot be reached by telephone. While approximately 99% of households in the United States have telephones, several studies have shown that the telephone and non-telephone populations are different with respect to demographic, economic, and health characteristics (Groves 1979; Blumberg 2018; Federal Communications Commission 2020). Although the estimates of characteristics for the total population are unlikely to be substantially affected by the omission of the households without telephones, some of the subpopulation estimates could be biased. Telephone coverage is lower for population subgroups such as Blacks in the South, people with low incomes, people in rural areas, people with less than 12 years of education, people in poor health, and heads of households under 25 years of age (AAPOR 2017). Nevertheless, post stratification adjustments for age, race, and sex, and other weighting adjustments used for the BRFSS and ACBS data minimize the impact of differences in non-coverage, under-coverage, and nonresponse at the state level.
Despite the above limitations, prevalence estimates from the BRFSS correspond well with findings from surveys based on face-to-face interviews, including studies conducted by the National Institute on Alcohol Abuse and Alcoholism, CDC's National Center for Health Statistics, and the American Heart Association (Frazier 1992; Hsia 2020). A summary of methodological studies of BRFSS can be found at: CDC - BRFSS BRFSS Data Quality, Validity, and Reliability.
Surveys based on self-reported information may be less accurate than those based on physical measurements. For example, respondents are known to underreport weight. Although this type of potential bias is an element of both telephone and face-to-face interviews, the underreporting should be taken into consideration when interpreting self- reported data. When measuring change over time, this type of bias is likely to be constant and is therefore not a factor in trend analysis.
American Association for Public Opinion Research (AAPOR). The Future of U.S. General Population Telephone Research. Available at: https://www.aapor.org/Education-Resources/Reports/The-Future-Of-U-S-General-Population-Telephone-Sur.aspx. Accessed 20 December 2021.
Blumberg SJ, Luke JV. Wireless substitution: Early release of estimates from the National Health Interview Survey, January–June 2018. National Center for Health Statistics; 2018. Available at: https://www.cdc.gov/nchs/data/nhis/earlyrelease/wireless201812.pdf. Accessed 20 December 2021.
Federal Communications Commission USA. Universal Service Monitoring Report. 2020; DOC 369262A1.pdf (fcc.gov). Accessed 20 December 2021.
Frazier EL, Franks AL, Sanderson LM. Chapter 4: Behavioral risk factor data. In: Using Chronic Disease Data: A Handbook for Public Health Practitioners. Centers for Disease Control and Prevention; 1992. 4.1-1.17
Groves RM, Kahn RL. Surveys by Telephone: A National Comparison with Personal Interviews, Academic Press; 1979.
Hsia J, Zhao G, Town T, et al. Comparisons of estimates from the Behavioral Risk Factor Surveillance System and other national health surveys, 2011–2016. Am J Prev Med. 2020;58(6):e181–e190. Available at: https://doi.org/10.1016/j.amepre.2020.01.025. Accessed 20 December 2021.
Kish L. Survey Sampling. John Wiley & Sons; 1965.