dc.contributor.author Donaldson, Devan Ray
dc.coverage.temporal The data were collected between August 2015 and February 2016. en
dc.date.accessioned 2017-08-10T14:19:29Z
dc.date.available 2017-08-10T14:19:29Z
dc.description.abstract This package contains the data and associated analyses for a study that examined the benefits of acquiring Data Seals of Approval (DSAs) from the point of view of those who have them. This package includes data from a series of 15 semi-structured interviews with representatives from 16 different organizations, in which participants described the benefits of having DSAs in their own words. The four files in this package are: 1) the coded interview transcripts and description of codes (i.e., codebook) in NVivo for Mac Version 11.3.2 (1888) file format (.nvpx), 2) the raw dataset that lists the frequency with which each benefit was mentioned by DSA board members and non-DSA board members, in IBM SPSS Statistics 24 file format (.spv), and 3) the processed/analysed data from the Mann-Whitney U tests, in two file formats (.doc and .spv). en
dc.description.sponsorship This project was funded by a Research Data Alliance United States Data Share Fellowship from the Alfred P. Sloan Foundation. en
dc.language.iso en en
dc.subject Data Seal of Approval, Benefits, Trust in Digital Repositories en
dc.title The Perceived Value of Acquiring Data Seals of Approval Study Dataset and Associated Files en
dc.type Dataset en
dc.description.methodology This study investigates the value of acquiring DSAs from the perspectives of those who have them. To avoid data-collector characteristics and bias that could affect what data were collected and how they were analysed, only researchers without prior experience in the development of the DSA were selected for the team that handled data collection and analysis. Specifically, the study included a data collector who is knowledgeable about repository standards but did not play an active role in the development of any of them. The data collector also does not serve as a formal third-party auditor, in contrast to existing research on this topic. Approval for conducting this research was received from the Indiana University Human Subjects Office. Our findings are drawn from data collected during interviews conducted between August 2015 and February 2016. All participants were at organizations whose repositories had successfully acquired DSAs. We selected these individuals because only digital repository staff members at institutions that had successfully acquired DSAs would be able to speak from experience about the actual benefits of having them. The list of 64 acquired seals on the DSA website constituted the sampling frame. The first author recruited participants by emailing representatives from each of the repositories that had acquired the DSA and inviting their participation, and sent follow-up emails on two separate occasions to increase participation. As a result of these efforts, we recruited 15 representatives from these repositories, a response rate of 23%. The primary purpose of the semi-structured, 30-minute interviews was to understand the value of the audit process and certification from the perspective of actual digital repository staff members.
The first author asked respondents to discuss: how they learned about the DSA certification, how they decided to undergo audit, how they prepared for it, what the process was like, any lessons learned, the perceived value of the audit process, and the perceived value of certification since attaining it. No incentives for participation were provided. All interviews took place by telephone or via Skype and were audio recorded and later transcribed. Transcripts were then coded using NVivo, a qualitative data analysis software tool. Prior to analyzing the transcripts, the first author developed a codebook based primarily on the list of benefits on the DSA website, while remaining open to identifying additional themes that emerged from the transcripts. The first author and a hired graduate student coded the transcripts. We calculated inter-rater reliability using Cohen's Kappa and achieved a score of 0.87, indicating a high level of agreement beyond what would be expected by chance. Some participants had played an active role in the development of the DSA, either as past or present DSA board members. This was seen as a potential threat to the validity of the data: DSA board members could subconsciously or consciously over-report the benefits of acquiring DSAs based on their knowledge of and experience with the standard. For this reason, additional data analyses were performed. Specifically, a series of Mann-Whitney U tests were performed to detect whether any statistically significant differences existed between DSA board members and non-DSA board members regarding the frequency with which they reported benefits of acquiring DSAs. All statistical analyses were performed using IBM SPSS Statistics 24. en
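The two statistical steps described above (Cohen's Kappa for inter-rater reliability, then Mann-Whitney U tests comparing the two participant groups) were performed in NVivo and IBM SPSS Statistics 24. As a minimal illustrative sketch only, they can be reproduced in Python with SciPy; all data values below are invented for demonstration and are not the study's actual data.

```python
# Sketch of the two analyses named in the methodology. Assumes SciPy is
# installed; the toy data below are hypothetical, not the study's data.
from scipy.stats import mannwhitneyu

def cohens_kappa(rater_a, rater_b):
    """Cohen's Kappa for two raters assigning categorical codes:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Two coders applying codebook categories to transcript segments (toy data)
coder1 = ["reputation", "process", "reputation", "funding", "process"]
coder2 = ["reputation", "process", "funding", "funding", "process"]
kappa = cohens_kappa(coder1, coder2)

# Frequency with which each participant mentioned benefits (toy data),
# split by DSA board membership, compared with a two-sided Mann-Whitney U
board_members = [5, 7, 6, 8]
non_board_members = [4, 3, 5, 2, 4]
u_stat, p_value = mannwhitneyu(board_members, non_board_members,
                               alternative="two-sided")
```

The Mann-Whitney U test is an appropriate choice here because the per-participant mention counts are small samples with no normality assumption, and the test compares the two groups on ranks alone.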
dc.description.file To access the coded interview transcripts and description of codes (i.e., codebook), you need NVivo for Mac Version 11.3.2 (1888). To access the raw dataset that lists the frequency with which each benefit was mentioned by DSA board members and non-DSA board members, you need IBM SPSS Statistics 24. To access the processed/analysed data from the Mann-Whitney U tests, you need IBM SPSS Statistics 24 to access the .spv file and Microsoft Word to access the .doc file. en
