CSR-SSRC Survey Research Methods Seminars

Permanent link for this collection: https://hdl.handle.net/2022/21121


Recent Submissions

  • Item
    Developing and testing quality data collection instruments
    (2020-02-27) Yahng, Lilian; Clark, Ashley
    A well-designed and tested data collection instrument is one of the most powerful tools that researchers in health and medicine, business and public policy, education, and the social sciences have for obtaining accurate measurements of attitudes, opinions, beliefs, and behaviors. Harnessing that power requires careful attention to a wide range of design issues, from wording questions so they are easily understood, to selecting response categories, to ordering questions, to formatting them for a screen or on paper. In this seminar, we will review best practices in the development and testing of data collection instruments administered online, by mail, or by an interviewer. We will provide numerous examples of how to design and evaluate questions and how to implement commonly used testing procedures, including in-depth cognitive interviews and field pretests, to help ensure that measurements are accurate and reliable.
  • Item
    What is ‘good research’?: Designing and evaluating quality studies
    (2019-11-08) Clark, Ashley; Yahng, Lilian
    Today, there are more ways than ever for researchers in health and medicine, business and public policy, education, and the social sciences to design, conduct, and analyze studies that measure behaviors, attitudes, opinions, and beliefs. But with greater access to data and technologies comes increased potential for poorly designed research that yields inaccurate findings. Poorly worded and formatted questions, reliance on easy, cheap samples of unknown quality, and low participation rates among some subgroups loom large. In this seminar, we will review potential threats to data quality using the widely cited Total Survey Error (TSE) framework, focusing on coverage, sampling, nonresponse, and measurement error; a stylized summary of the framework is sketched below. Examples will highlight potential threats in each of these areas and how they may be addressed. This seminar should benefit researchers engaged in primary data collection as well as secondary data users.
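    One standard textbook summary of the TSE framework (a stylized rendering, not taken from the seminar materials, and assuming the error sources contribute independently) decomposes the mean squared error of a survey estimate into bias and variance contributions from each source:

        \mathrm{MSE}(\hat{y}) = \Bigl(\sum_{s} B_s\Bigr)^{2} + \sum_{s} V_s,
        \quad s \in \{\text{coverage},\ \text{sampling},\ \text{nonresponse},\ \text{measurement}\}

    where B_s and V_s are the bias and variance contributed by source s. The practical point is that total error falls only by attacking the dominant source; a larger sample shrinks the sampling-variance term alone.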
  • Item
    Technological tools and best practices for conducting web surveys
    (2019-10-09) Tharp, Kevin; Yahng, Lilian
    Web surveys are now widely used to collect cutting-edge scientific data in health and medicine, business and public policy, education, and the social sciences. They have brought exciting new capabilities that expand what researchers are able to do and can improve efficiency and data quality. Yet they also present new challenges in survey design and usability and must keep pace with rapidly changing technology. In this seminar, we will present best practices for laying out and designing web surveys and for implementing them to maximize participation and data quality. We will show examples of available features for designing, administering, and analyzing web surveys using off-the-shelf and custom survey systems.
  • Item
    Developing and Testing a Framework for Understanding Public Support of "Fracking"
    (2015-06-24) Alcorn, Jessica; Schenk, Olga; Graham, John; Rupp, John; Carley, Sanya; Lee, Michelle; Zhang, Yu; Clark, Ashley
    As citizens in many U.S. states have interacted with increased natural gas production through unconventional gas development (UGD), the public’s attitudes towards the practice have not been straightforward. Previous research on acceptance of energy technologies, and of UGD specifically, highlights a wide range of factors that may drive public attitudes, notably sociodemographic characteristics and political ideology. Decades of research have also sought to explore the dimensionality of environmental attitudes; however, the theoretical frameworks and methods from this literature have not been employed in analyzing attitudes towards specific kinds of energy development. In this study, we conducted a survey of adult U.S. residents in six states where UGD is underway or geologically promising: three with high and/or growing production (Ohio, Pennsylvania, and Texas) and three with little or no UGD (New York, Illinois, and California). We analyze both whether respondents distinguish among different components of the advantages and disadvantages of UGD and which factors influence their attitudes (an illustrative dimensionality check is sketched below). Our analysis comports with prior research indicating that those who identify as White or male perceive fewer disadvantages of UGD. As expected, older respondents and those who receive royalty payments perceive more advantages of UGD, and those who identify as members of the Democratic Party perceive more disadvantages. However, the role of education in shaping attitudes remains somewhat unclear. With respect to dimensionality, we find that respondents generally do not differentiate among the various aspects of potential advantages and disadvantages of UGD.
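    A dimensionality check of this kind can be approximated with an exploratory factor analysis. The sketch below is illustrative only, assuming a hypothetical file of Likert-scaled advantage/disadvantage items (the file and column names are invented); it is not the authors' code or data.

        # Illustrative sketch: do advantage (adv_*) and disadvantage (dis_*)
        # items load on distinct factors? Data and column names are hypothetical.
        import pandas as pd
        from sklearn.decomposition import FactorAnalysis
        from sklearn.preprocessing import StandardScaler

        df = pd.read_csv("ugd_survey.csv")  # hypothetical survey extract
        items = [c for c in df.columns if c.startswith(("adv_", "dis_"))]

        X = StandardScaler().fit_transform(df[items].dropna())
        fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0).fit(X)

        # If respondents do not differentiate among aspects, most items of each
        # type load heavily on a single factor rather than spreading across many.
        loadings = pd.DataFrame(fa.components_.T, index=items,
                                columns=["factor1", "factor2"])
        print(loadings.round(2))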
  • Item
    What Is The Impact of Smartphone Optimization on Long Surveys?
    (2015-06-24) Sarraf, Shimon; Brooks, Jennifer; Cole, James; Wang, Xiaolin
    Each year, an increasing number of college student survey respondents access online surveys using smartphones instead of desktop computers (Sarraf, Brooks, & Cole, 2014). In 2011, about 4% of respondents to the National Survey of Student Engagement (NSSE) used a smartphone; by 2015 the proportion had increased to about 27%. The widespread adoption of smartphones among college students has prompted discussions among institutional and higher education researchers about data quality and appropriate survey formats. Some research suggests there is little or no substantive difference for users accessing web surveys on tablets as compared to desktop machines, but the user experience on a smartphone is recognized to be significantly different from that on other devices (Buskirk & Andrus, 2012). This has led some researchers to optimize their online surveys for smartphones; however, research is limited on how this may affect survey data quality. Using results from a ten-institution experiment embedded in the 2015 NSSE, the current study details the impact of smartphone optimization on a survey with over one hundred questions. The research questions center on how various data quality indicators are affected by optimizing a survey for smartphones, including item skipping, completion time, scale factor structure, and response option differentiation (straight-lining); simple versions of two of these indicators are sketched below. Based on their recent experience, the presenters will offer insights into developing a smartphone-optimized version of a relatively long survey.
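    Two of the indicators named above, item skipping and straight-lining, are easy to state concretely. The sketch below uses simulated responses (not NSSE data) to show one simple way each might be computed per respondent.

        # Illustrative sketch: per-respondent item-skipping and straight-lining
        # indicators on a simulated 100-respondent x 10-item Likert matrix.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)
        resp = pd.DataFrame(rng.integers(1, 6, size=(100, 10)).astype(float))
        resp = resp.mask(rng.random(resp.shape) < 0.05)  # ~5% skipped items

        skip_rate = resp.isna().mean(axis=1)        # share of items skipped
        straightliner = resp.nunique(axis=1).eq(1)  # only one distinct answer given

        print(f"mean skip rate:  {skip_rate.mean():.3f}")
        print(f"straight-liners: {straightliner.mean():.1%}")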
  • Item
    Examining the Impact of Mobile First and Responsive Web Design on Desktop and Mobile Respondents
    (2015-06-24) Tharp, Kevin
    Mobile First and Responsive Web Design are two approaches that survey researchers can use to improve the experience of smartphone users, who make up a growing proportion of web survey respondents. The Mobile First approach uses a layout optimized for smartphones that also serves as the basis for the desktop design, while Responsive Web Design allows more dynamic adjustments based on browser size. We conducted experiments with each design within the past year. Both experiments attempted to better meet the needs and expectations of mobile respondents with designs that could also be employed for desktop respondents with minimal differences in layout. In each experiment, a random sample of respondents was assigned the experimental version of the survey, while the remaining respondents were assigned a more traditional, desktop-focused design. The Mobile First experiment was conducted first. Smartphone users assigned the Mobile First layout were less likely to break off and had shorter completion times than smartphone users assigned the traditional design; they also gave the Mobile First design higher ratings in a short post-survey evaluation questionnaire (a sketch of a breakoff comparison appears below). Desktop users, however, took longer to complete the Mobile First design than the traditional design, rated it lower on professional appearance, and commented on the excessive need for scrolling. We found no significant differences in response distributions across layout versions and devices. Our second experiment was designed to address these concerns of desktop users while maintaining the benefits for smartphone users. When this additional experiment is completed, we will compare both designs, considering data quality, response rates, evaluation scores, cost, and development time.
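    As a concrete illustration of the breakoff comparison, the sketch below runs a chi-square test on a 2x2 table of breakoffs by layout for smartphone respondents; the counts are invented placeholders, not the experiment's results.

        # Illustrative sketch: breakoff by layout for smartphone respondents.
        # Counts are invented placeholders, not the experiment's actual data.
        import numpy as np
        from scipy.stats import chi2_contingency

        #                  broke off  completed
        table = np.array([[40,        460],    # Mobile First layout
                          [70,        430]])   # traditional layout

        chi2, p, dof, expected = chi2_contingency(table)
        rates = table[:, 0] / table.sum(axis=1)
        print(f"breakoff: mobile-first {rates[0]:.1%} vs traditional {rates[1]:.1%}")
        print(f"chi-square = {chi2:.2f}, p = {p:.4f}")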
  • Item
    A Comparison of Online Panels with GSS and ANES Data
    (2015-06-24) Zack, Elizabeth; Kennedy, John
    In the past five years, researchers have increasingly used low-cost data collection methods to conduct surveys. Survey software platforms such as Qualtrics and SurveyMonkey allow researchers to create and distribute questionnaires easily and cheaply. Similarly, low-cost methods are available to recruit survey participants, including online panels, Amazon’s Mechanical Turk (MTurk), and Google Consumer Surveys. With these tools, researchers are much less dependent on professional survey researchers to conduct surveys. More importantly, however, survey researchers are now using non-probability online panels as a substitute for probability samples. In 2010 and 2013, AAPOR released task force reports that analyzed the challenges of using online panels and non-probability samples for high-quality survey research. In 2014, a book on the use of online panels in survey research included chapters written by respected survey researchers (Callegaro et al., 2014). In the past five years, at least 20 peer-reviewed methods articles have been published on the use of Mechanical Turk for social and behavioral science research. Despite the cautions raised about the appropriate use of online panels, they are being used more often (e.g., the YouGov panel and CBS News). In this presentation, we will discuss the results of experiments that compared response distributions for questions asked recently in the ANES and GSS to similar questions administered to MTurk samples and a Qualtrics online panel. In addition, we will show how simple multivariate models do and do not differ when estimated on the probability and non-probability samples (an illustrative comparison is sketched below). This presentation will contribute to continuing research into the appropriate uses of online panels for survey research.
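    One way to make the model comparison concrete is to fit the same specification on each sample and place the coefficients side by side. The sketch below is hypothetical throughout: the file names, variables, and model are invented stand-ins, not the presenters' analysis.

        # Illustrative sketch: the same simple logit fit on a probability-sample
        # extract and a panel extract. All names and files are hypothetical.
        import pandas as pd
        import statsmodels.formula.api as smf

        samples = {"GSS": pd.read_csv("gss_subset.csv"),
                   "MTurk": pd.read_csv("mturk_subset.csv")}

        formula = "supports_policy ~ age + education + C(party_id)"
        fits = {name: smf.logit(formula, data=df).fit(disp=0)
                for name, df in samples.items()}

        # Side-by-side coefficients show where the samples do and do not diverge.
        print(pd.DataFrame({name: fit.params for name, fit in fits.items()}).round(3))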