Master's Theses


Recent Submissions

Now showing 1 - 20 of 87
  • Item
    SEX INFLUENCES ON THE GUT MICROBIOME IN A MOUSE MODEL OF AN ENVIRONMENTAL ALLERGEN CHALLENGE
    ([Bloomington, Ind.] : Indiana University, 2023-05) Alford, Rachel R.; Silveyra, Patricia
    Asthma is a chronic inflammatory disease of the airway that compromises lung function and affects millions of people worldwide. Disparities in sensitivity between men and women to the initiation and exacerbation of asthma phenotypes have been identified; however, it remains unclear whether mediators such as the gut microbiome contribute to such differences. Commensal microbial communities found throughout the body are necessary for initiating immune responses, and they contribute to the development of chronic inflammation. In this study, we used a mouse model of house dust mite (HDM) challenge and identified sex differences in the composition of the gut and lung microbiota. Our aim was to discover to what extent the microbiome is altered in response to the induction of allergic asthma in mice challenged with HDM. For this, fecal pellets and whole lung tissue were collected from male and female C57BL/6 mice intranasally exposed to 25 µg of house dust mite extract (HDM) in 50 µL of phosphate-buffered saline (PBS) or 50 µL PBS (control) daily for 5 weeks (n=4-6/treatment group); samples were collected before starting treatment and at the end of week 5. DNA from fecal pellets was extracted using the ZymoBIOMICS®-96 MagBead DNA Kit and analyzed with the ZymoBIOMICS® Targeted Metagenomic Sequencing service (Zymo Research, Irvine, CA) to determine the 16S microbiome. DNA isolation from whole lung tissue and 16S rRNA sequencing were performed by the Microbiome Core Facility of the University of North Carolina at Chapel Hill. Community composition changes were compared visually and statistically using stacked taxa bar plots and Firmicutes-to-Bacteroidetes ratios. Nonmetric Multidimensional Scaling illustrated structural alteration of the microbiota, which was verified through PERMANOVA testing. We concluded that the gut microbiome underwent statistically significant sex-specific changes in composition and structure, while the lung microbiome showed significant differences as a result of treatment, not sex.
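    As an illustration of the PERMANOVA step described in this abstract, the sketch below shows one common way to test a Bray-Curtis distance matrix against a grouping factor in Python with scikit-bio. The abundance table, sample IDs, and group labels are fabricated placeholders, not data from the thesis.

```python
# Hypothetical sketch of the PERMANOVA workflow described above; all
# data here are fabricated. Requires numpy, scipy, and scikit-bio.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from skbio.stats.distance import DistanceMatrix, permanova

rng = np.random.default_rng(0)

# Toy taxa-abundance table: 12 samples (mice) x 50 taxa.
counts = rng.poisson(lam=10, size=(12, 50)).astype(float)
rel = counts / counts.sum(axis=1, keepdims=True)  # relative abundances

# Bray-Curtis dissimilarity between all pairs of samples.
bc = squareform(pdist(rel, metric="braycurtis"))
dm = DistanceMatrix(bc, ids=[f"mouse{i}" for i in range(12)])

# Grouping factor to test, e.g. sex (placeholder labels).
grouping = ["female"] * 6 + ["male"] * 6

print(permanova(dm, grouping, permutations=999))  # pseudo-F and p-value
```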
  • Item
    EXAMINING ENERGY INDICES IN RECREATIONAL RUNNERS: AIMING TO BETTER ADVISE PERIODIZATION
    ([Bloomington, Ind.] : Indiana University, 2023-05) Davel, Kathryn A.; Gruber, Allison H.
    Recreational running is a widely practiced form of physical activity, and it continues to gain popularity due to the associated cardiovascular, musculoskeletal, and mental health benefits (Lavie et al., 2015; Smith et al., 2020). Running stands out among other forms of physical activity for its simplicity: the activity has no age restrictions and requires no team, equipment, or technical training in a specific skill. Nevertheless, running is not a risk-free sport. Up to 92.4% of runners will experience a running-related injury (RRI) each year (van Gent et al., 2007). RRIs diminish the pleasure of exercise altogether and are associated with undesirable consequences, including temporary or permanent discontinuation of running (van der Worp et al., 2015). Around half of those injured will stop running for at least a year, while others cease any form of physical activity indefinitely (Fields et al., 2010). For recreational runners, the annual rate of RRIs has remained high despite increasing attention toward injury prevention. Specifically, attention is currently given to intrinsic modifiable risk factors, such as malalignment, tissue recovery times and procedures, and presumed injury-associated running techniques, as well as extrinsic modifiable risk factors, such as running surface, training errors, and shoe age, fit, and type (Aderem & Louw, 2015). To date, researchers have found evidence supporting the modification of only one of these risk factors: training error.
  • Item
    Effects of Eccentric Training Intervention on Performance During the Sprint, Drag, Carry Event of the Army Combat Fitness Test
    ([Bloomington, Ind.] : Indiana University, 2023-05) Davel, Jacob J.; Moscicki, Brian M.
    The purpose of this study was to determine the effect of introducing eccentrically focused movements into a normal military training program on Army Combat Fitness Test (ACFT) performance. The fourth event of the test, the Sprint Drag Carry, was the focus of the research. Eccentric training has been shown to elicit a sprint performance benefit (10,13,19,27) as well as decrease injury prevalence (17,20,28,34,36). It was hypothesized that including eccentric movements would improve performance in this 250 m shuttle sprint event. Twenty-two Cadets from the Indiana University ROTC program were recruited to participate in a 6-week training program. Participants were randomly assigned to one of two groups to perform either 3 x 5 repetitions of a Nordic complex (a combination of the Nordic hamstring curl and reverse Nordic hamstring curl) or 3 x 15 repetitions of a tempo-based body-weight squat. Participants took a pre- and post-intervention ACFT to measure the change in performance. Overall ACFT scores as well as three of the six events showed significant improvement from pre to post: a 3.1% (p<.001) improvement in overall score (out of 600), a 6.7% (p=.001) improvement in hex bar deadlift, a 2.7% (p=.045) improvement in 2-mile run time, and a 10.4% (p<.001) improvement in Sprint Drag Carry performance, with no differences between groups. No injuries were reported during the training period. The results demonstrate an observed improvement from pre- to post-test; however, it is more likely a result of a dedicated training program designed by the Army to improve ACFT performance than of the addition of the eccentric movements.
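    For readers unfamiliar with this two-group, pre/post design, the sketch below shows one common analysis: paired t-tests within each group plus an independent t-test on the change scores between groups. Every number is a fabricated placeholder, not the Cadets' data.

```python
# Illustrative analysis of a two-group pre/post design like the one
# above; every number is fabricated. Requires numpy and scipy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Sprint Drag Carry times (s) for two hypothetical groups of 11 Cadets.
nordic_pre = rng.normal(110, 8, 11)
nordic_post = nordic_pre - rng.normal(11, 3, 11)  # ~10% faster
squat_pre = rng.normal(110, 8, 11)
squat_post = squat_pre - rng.normal(11, 3, 11)

# Within-group improvement: paired t-tests.
print(stats.ttest_rel(nordic_pre, nordic_post))
print(stats.ttest_rel(squat_pre, squat_post))

# Between-group difference in improvement: t-test on change scores.
print(stats.ttest_ind(nordic_post - nordic_pre, squat_post - squat_pre))
```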
  • Item
    THE EFFECT OF READING DIRECTION AND HISTORY OF MULTIPLE CONCUSSIONS ON THE KING-DEVICK TEST
    ([Bloomington, Ind.] : Indiana University, 2022-05) Cuiriz, Edson; Murray, Caroline; Kawata, Keisuke
  • Item
    CHARACTERIZING LEAD EXPOSURE IN HOUSEHOLDS THAT DEPEND ON PRIVATE WELLS FOR DRINKING WATER
    ([Bloomington, Ind.] : Indiana University, 2022-05) Alde, Alyson; Gibson, Jacqueline MacDonald; Kamendulis, Lisa; Weigel, Margaret
    Evidence accumulated over the past several decades indicates that there is no safe level of lead exposure for young children. Although the Safe Drinking Water Act has measures in place to maintain the safety of municipal water supplies, no such protection exists for private wells. Recent research suggests U.S. children relying on private well water may be at increased risk of lead exposure compared to those with access to a regulated community water supply. However, no prior studies have investigated this risk through concurrent collection of water and blood samples to test for associations between lead in water and lead in blood. To assess these associations, we collected blood and water samples from 89 participants in North Carolina homes relying on private wells for their drinking water. We also collected dust samples to account for the potential risk of exposure to lead from paint and from lead-containing dirt tracked into the home. All environmental samples were analyzed for lead using inductively coupled plasma mass spectrometry. A multivariable regression was performed to examine the association between well water and blood lead, controlling for lead in dust and other factors that may be associated with lead exposure risk. We found that although water lead levels were not directly associated with blood lead, use of a water filter was associated with a 32% decrease in blood lead (p<0.05). Additionally, use of a filter was significantly associated with a decreased risk of high lead levels in drinking water (p=0.01). Furthermore, we found significant racial disparities in access to water filters. Among participants identifying as African American or Native American, 38% had a water filter, compared to 83% of those identifying as other races (p<0.001). This study highlights that those who get their drinking water from a private well and do not filter their water may be at increased risk of exposure to lead. Further research will be required to more fully understand the association between lead exposure and well water, with a focus on communities that do not have access to water filters.
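    As a concrete picture of the multivariable regression described above, the sketch below fits an ordinary least squares model of log blood lead on water lead, dust lead, and filter use with statsmodels. The data frame, column names, and coefficients are hypothetical, not the study's; it also shows how a coefficient on a log outcome translates into a percent change.

```python
# Sketch of a multivariable regression of (log) blood lead on water
# lead, dust lead, and filter use; the data frame, column names, and
# coefficients are hypothetical. Requires numpy, pandas, statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 89
df = pd.DataFrame({
    "water_pb": rng.lognormal(0.0, 1.0, n),  # ug/L, illustrative
    "dust_pb": rng.lognormal(1.0, 0.8, n),
    "has_filter": rng.integers(0, 2, n),
})
df["log_blood_pb"] = (0.1 * np.log1p(df["water_pb"])
                      - 0.39 * df["has_filter"] + rng.normal(0, 0.3, n))

fit = smf.ols("log_blood_pb ~ np.log1p(water_pb) + np.log1p(dust_pb)"
              " + has_filter", data=df).fit()
# On a log outcome, expm1(beta) is the proportional change: a filter
# coefficient near -0.39 corresponds to roughly a 32% decrease.
print(fit.params)
print(np.expm1(fit.params["has_filter"]))
```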
  • Item
    SPECIFICITY OF TMS EFFECTS ON M1 AND S1
    ([Bloomington, Ind.] : Indiana University, 2022-05) Park, Jung Hyun; Block, Hannah
    The region of the human brain known as the sensorimotor cortex is composed of the primary motor cortex (M1) and the primary somatosensory cortex (S1). M1 codes for voluntary movement and S1 processes touch, position sense, and related sensory information. While the neighboring cortices of M1 and S1 are interconnected and reciprocally interact, they are also thought to be separately modulated by stimulation. This study asks if stimulating M1 or S1 has distinguishable effects on motor and tactile processing. Participants received non-invasive transcranial magnetic stimulation (TMS) in a repetitive pattern thought to inhibit cortical activity (continuous theta burst stimulation, or cTBS). This intervention was delivered over M1 or S1. A third group received sham stimulation. Pre- and post-cTBS, motor excitability was assessed with single TMS pulses over M1 and tactile processing was assessed with the 2-point discrimination test (2PD) or the grating orientation test (GOT). Results did not detect a distinguishable effect of cTBS on either motor or tactile processing. This could indicate that the cTBS protocol is not focal enough to cause distinguishable effects in M1 vs. S1, or it could indicate that a larger study would be needed to have sufficient power to detect such effects.
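    Because the abstract notes that a larger study might be needed for sufficient power, the sketch below shows how a follow-up sample size could be estimated with statsmodels. The effect size is an assumed placeholder, not an estimate from the thesis.

```python
# Hypothetical power calculation for a follow-up study; the effect
# size is an assumed placeholder, not an estimate from this thesis.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,  # assumed Cohen's d
    alpha=0.05,
    power=0.80,
)
print(f"~{n_per_group:.0f} participants per group")  # ~64 for d = 0.5
```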
  • Item
    PREVALENCE OF MUSCULOSKELETAL INJURIES AMONG WORLD CLASS DRUM AND BUGLE CORPS: A cross-sectional study
    ([Bloomington, Ind.] : Indiana University, 2021-07) Pohlman, Jennifer; Docherty, Carrie L.
    Drum and Bugle Corps (drum corps) is an evolving and physically demanding activity that has grown in recognition over the years. Drum corps is a performing art: a musical marching unit consisting of brass, percussion, and colorguard sections. There are limited studies available regarding the health of marching artists, particularly those involved in drum corps. Specifically, the burden of musculoskeletal injury (MSI) in drum corps is currently unknown. Therefore, the purpose of this study was to determine the prevalence and correlates of MSI in a single drum corps participating at the World Class level in the Drum Corps International (DCI) competition circuit during the 2018 and 2019 seasons. A retrospective cross-sectional design was used to compare demographic and pre-participation variables between injured and non-injured participants. MSIs were evaluated and recorded by corps medical staff and extracted from the electronic medical record (EMR) for our analysis. MSI prevalence was summarized by season, type of injury, and participant characteristics. Multiple logistic regression was used to identify demographic and pre-participation correlates of MSI. Overall MSI prevalence was 32.2% (N=149) and 26.2% (N=149) in the 2018 and 2019 seasons, respectively. Higher BMI (adjusted OR: obese = 0.88 [95% CI: 0.25-3.17]) and fewer years of DCI experience (adjusted OR = 0.55 [95% CI: 0.37-0.84]) were associated with higher odds of MSI in the 2018 season. However, these factors were not associated with MSI in the 2019 season. In conclusion, this study determined the MSI prevalence occurring in a World Class DCI drum corps to be considerable, with substantial variability between seasons. Lower BMI and attributes associated with more years of drum corps experience may be associated with lower odds of MSI.
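    For context on the multiple logistic regression described above, here is a minimal sketch of fitting such a model and converting coefficients to adjusted odds ratios with 95% confidence intervals. The data frame, column names, and values are hypothetical, not the corps' medical records.

```python
# Sketch of a multiple logistic regression for MSI status with
# adjusted odds ratios and 95% CIs; the data frame and columns are
# hypothetical. Requires numpy, pandas, and statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 149
df = pd.DataFrame({
    "msi": rng.integers(0, 2, n),        # 1 = injured during season
    "bmi": rng.normal(24, 4, n),
    "dci_years": rng.integers(0, 6, n),  # years of DCI experience
})

fit = smf.logit("msi ~ bmi + dci_years", data=df).fit(disp=False)
summary = pd.concat([np.exp(fit.params), np.exp(fit.conf_int())], axis=1)
summary.columns = ["adj_OR", "2.5%", "97.5%"]
print(summary)
```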
  • Item
    EFFECTS OF READING DIRECTION ON KING-DEVICK TEST PERFORMANCE
    ([Bloomington, Ind.] : Indiana University, 2021-06) Carr, Jessica; Kawata, Keisuke
    The King-Devick test (KDT) is a concussion assessment tool that requires individuals to read single-digit numbers as fast and as accurately as possible on either test cards or a tablet. Although considerable research has been conducted on the KDT, no current research addresses the effects of reading direction on KDT times. The purpose of this study was to compare performance on the KDT read from left to right versus right to left. Twelve subjects were recruited to participate in this study. Inclusion criteria included being between the ages of 18 and 26, being enrolled as a student at Indiana University, and being a fluent English speaker. Exclusion criteria included proficient fluency in any language that reads right to left or top to bottom, a history of being diagnosed with a concussion, any visual, ocular, or brain injury within the past 12 months, any history of an eye movement disorder, any noncorrected visual impairment, and a diagnosis of attention deficit/hyperactivity disorder, attention deficit disorder, dyslexia, dyscalculia, or a language processing disorder. Subjects were randomly assigned to one of two forms of the KDT: the traditional assessment or a reverse reading direction version (R-KDT). Participants then completed their version of the KDT twice, and the researcher recorded the total time and number of errors on the subject data collection sheet. The effect of reading direction on KDT performance was examined using an independent t-test, with significance set a priori to α = 0.05. The mean time of the control group was 42.7 ± 3.6 sec and that of the experimental group was 50.7 ± 9.7 sec. There was no significant difference between the groups when comparing subjects' times. When comparing the number of errors, the range for the control group was 0-4 and for the experimental group was 0-1; there was no significant difference between the groups in number of errors. Overall, there was no significant effect of reading direction on KDT performance. Further research is needed with a larger and more diverse sample size before any conclusions can be made.
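    A minimal sketch of the independent t-test described above is given below, using fabricated completion times rather than the study's raw data.

```python
# Minimal sketch of the independent t-test described above, using
# fabricated KDT completion times (s), not the study's raw data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
kdt_standard = rng.normal(42.7, 3.6, 6)  # left-to-right group
kdt_reversed = rng.normal(50.7, 9.7, 6)  # right-to-left (R-KDT) group

t_stat, p_value = stats.ttest_ind(kdt_standard, kdt_reversed)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # significant if p < 0.05
```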
  • Item
    EFFECT OF HISTORY OF CONCUSSION ON KING-DEVICK TEST PERFORMANCE
    ([Bloomington, Ind.] : Indiana University, 2021-06) Scheuer, Casey; Kawata, Keisuke
    The King-Devick Test (KDT) is a concussion assessment tool that requires individuals to read single-digit numbers from left to right as fast and as accurately as possible. Although considerable research has been conducted on the KDT as a whole, little research has analyzed the effect of concussion history on KDT performance. The purpose of this study was to determine whether a history of concussion predicts worse performance on the KDT. Ten participants were recruited for this study, with inclusion criteria being 1) between the ages of 18 and 26 and 2) enrolled as a student at Indiana University. Exclusion criteria included 1) visual, ocular, or brain injury within the past 12 months, 2) history of an eye movement disorder, 3) noncorrected visual impairment, and 4) diagnosis of attention deficit/hyperactivity disorder, dyslexia, dyscalculia, or a language processing disorder. Based on self-reported concussion history, participants were sorted into one of two groups: no concussion history (NoHx) or a history of concussion (ConHx). The participants completed the KDT twice; completion time and number of errors were recorded, and the two times were averaged to obtain the participant's overall time score. An independent t-test was used to examine the effect of concussion history on KDT performance. For all tests, significance was set to α = 0.05. The average time for the NoHx group to complete the KDT was 42.71 ± 3.62 seconds, and the average time for the ConHx group was 44.15 ± 6.71 seconds. There was no significant difference between the groups when comparing performance (p = 0.687). Although not significant, the results suggest a trend toward slower KDT times among those with a concussion history compared to those without. Further research is needed with a larger sample size to better determine the effect of concussion history on KDT performance.
  • Item
    N-Acetyl-Cysteine Supplementation does not Alter the Erythropoietin Response in Trained Endurance Athletes in Acute Hypoxia
    ([Bloomington, Ind.] : Indiana University, 2021-05) Noble, Tyler; Chapman, Robert F.
    To assess the potential effect of N-Acetyl-Cysteine (NAC) supplementation on the magnitude of the EPO response to acute hypoxia in trained endurance athletes, 10 trained male endurance athletes engaged in a placebo-controlled crossover design featuring two conditions: a placebo (PLA) condition and an NAC condition. Each condition featured a day of baseline testing, followed by an 1800 mg/day supplementation period lasting eight days and, finally, a six-hour acute hypoxic exposure (FIO2 = 15.8%, ~2500 m). Serum EPO (EPO), total hemoglobin mass (Hbmass), and hematocrit (Hct) were measured during the baseline visit of each condition, and again immediately prior to hypoxic exposure, to evaluate changes in hematological metrics in response to each supplementation condition. Additionally, EPO was measured every two hours during hypoxic exposure to evaluate the EPO response in each condition. Eight days of NAC supplementation prior to acute hypoxic exposure did not significantly alter EPO (IU/L) between baseline and pre-hypoxic exposure (10.4±2.9 vs. 9.9±1.5; p=0.52) when compared with the PLA condition (10.9±2.5 vs. 10.2±2.0; p=0.23). Additionally, no significant changes were seen in Hbmass (g/kg) in the NAC condition (15.75±0.97 vs. 16.01±0.68; p=0.58) compared to the PLA condition (15.58±0.72 vs. 15.84±0.97; p=0.24). Finally, no significant changes were seen in hematocrit (%) in the NAC condition (46.2±1.6 vs. 46.6±1.9; p=0.57) compared to the PLA condition (45.8±2.7 vs. 46.4±2.2; p=0.23). EPO increased at each time point after tent entry in both conditions, but the magnitude of increase (NAC Δ 32.3±23.6% vs. PLA Δ 31.4±20.9%) was not significantly different between conditions. NAC supplementation prior to acute hypoxic exposure has no significant impact on EPO concentration in trained endurance athletes. The large intra-subject variation in EPO response to hypoxia between conditions may have concealed any impact of NAC supplementation on the EPO response.
  • Item
    Skin Carotenoid Accumulation in Response to a Two-Week Sweet Potato Snack Added to the Usual Diet
    ([Bloomington, Ind.] : Indiana University, 2021-01) Erickson, Taylor L.; Fly, Alyce D.
    A lack of consistency exists in the literature regarding the most appropriate time to measure the effectiveness of nutrition interventions by way of pressure-mediated reflectance spectroscopy (RS). Non-invasive methods like pressure-mediated RS are increasingly used to objectively measure fruit and vegetable intake in different populations, particularly in the context of nutrition interventions. This study evaluated changes in skin carotenoid scores (SCS) from baseline both during and following a two-week intervention promoting increased red-orange vegetable consumption on top of a participant's usual diet. Specifically, it investigated the general time frame in which measurement of SCS would be most advantageous for determining the effectiveness of a nutrition intervention. To test the hypothesis that SCS values would differ from baseline both during and following the intervention, participants took part in a three-phase nutrition intervention that included a 1-week baseline period, a feeding period in which participants were fed a sweet potato snack 3 times a week for two weeks on top of their usual diet, and a 4-week monitoring period. SCS were measured throughout the three periods with pressure-mediated RS. The scores were analyzed using a linear mixed model with repeated measures for participants over time to determine whether skin carotenoids increased from baseline to the follow-up points of the intervention and post-intervention periods. The results showed that the change in mean SCS from baseline over time was significant. While SCS during the intervention period were not significantly higher than baseline, SCS at post-intervention were significantly higher than baseline on average. These data suggest that 2 to 3 weeks after the beginning of a two-week intervention may be a period of interest when measuring the efficacy of such an intervention. On this basis, future studies should be designed so that participants' final measurements are made approximately 2 to 3 weeks following a similar dietary intervention.
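    To make the "linear mixed model with repeated measures" concrete, the sketch below fits a random-intercept model of SCS over time with statsmodels. The data, effect size, and column names are illustrative placeholders, not the study's measurements.

```python
# Sketch of a linear mixed model with repeated SCS measurements per
# participant; the data, effect size, and column names are
# illustrative only. Requires numpy, pandas, and statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_subj, n_weeks = 20, 7
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_weeks),
    "week": np.tile(np.arange(n_weeks), n_subj),
})
# Toy effect: SCS drifts upward a few weeks after the feeding period.
df["scs"] = 250 + 5 * (df["week"] > 3) + rng.normal(0, 10, len(df))

# Random intercept per participant; week as a categorical fixed effect.
fit = smf.mixedlm("scs ~ C(week)", df, groups=df["subject"]).fit()
print(fit.summary())
```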
  • Item
    The Psychological Readiness of Athletes After Sustaining an Injury
    ([Bloomington, Ind.] : Indiana University, 2020-06) Blake, Kirsten; Raglin, John
    The purpose of this study was to determine whether there are differences in the psychological readiness of an athlete when returning to play based on the player's status on the team (starter vs. non-starter) or severity of injury (short-term vs. long-term). A total of 14 collegiate athletes who sustained an injury that kept them from participation for at least 24 hours were included in this study. The participants completed the Athlete Fear Avoidance Questionnaire (AFAQ), the Injury-Psychological Readiness to Return to Sport Scale (I-PRRS), the Profile of Mood States (POMS), and demographic information within 72 hours of becoming injured. The participants completed those same questionnaires a second time after completing their prescribed rehabilitation and being cleared to return to play. A total of 10 participants completed both sets of questionnaires and were included in the data analysis. There was a statistically significant difference between the immediately-after-injury scores and the return-to-play scores for the AFAQ, but not for the I-PRRS and POMS. There were no statistically significant differences between starters and non-starters or between athletes with short-term and long-term injuries on any of the 3 questionnaires. Information about psychological readiness could help athletic trainers create appropriate rehabilitation programs that address an athlete's psychological readiness, and doing so could possibly reduce the athlete's time until he or she is able to return to play.
  • Item
    A Comparative Analysis of Race and Mattering in Leisure Literature
    ([Bloomington, Ind.] : Indiana University, 2020-05) Rubinstein, Cassandra Faye; Mowatt, Rasul
    The purpose of this thesis was to examine the progression of discourse on race within leisure studies scholarship through the lens of racecraft and the construct of mattering. The Journal of Leisure Research as well as Schole were examined within the periods of the 1990s (1989-2000) and the 2010s (2009-2019). Articles were chosen based upon their employment of the keywords of community recreation, youth development, and race within both time periods, yielding a total of 99 articles that were examined. A discourse historical approach (DHA) was utilized to assess the impact of the socio-political context on leisure research as well as the development of discourse on race. Through DHA techniques and the concept of racecraft, this project classified articles under five overarching themes: faint mentions of race, racialization in the negative, improper terminology use, intentionality of race, and inadequate lens of problem/solution. Based upon the findings of this thesis, leisure literature has displayed minimal progression in its conceptualizations of race. Leisure studies scholarship reflects the dominant discourse through its latent ideology of racism that maintains marginalization of various racialized ethnic groups. It is posited that, without institutional examination and targeted mitigation efforts, the field of leisure will continue to uphold a detrimental racial order with an underdeveloped political and historical stance on race.
  • Item
    Functional Balance Measurements in Ballet Dancers With Varying Visual Input
    ([Bloomington, Ind.] : Indiana University, 2019-05) Wiese, Kelley Rock
    Ballet dancers have been found to exhibit superior balance when compared to non-dancers. However, many believe this heightened balance is achieved by reliance on vision. Ballet dancers spend most of their time rehearsing in a studio with mirrors, providing consistent visual input. After a brief transition to the stage environment, which alters visual information, dancers perform in front of an audience. The purpose of this study was to test if varying environmental factors diminished functional performance in ballet dancers. Healthy college-aged female dancers majoring in ballet performance in a Bachelor of Science program from a large Midwestern university volunteered to participate in this study. A repeated measures design was utilized. The independent variable was environment at three levels: studio facing the mirror, studio facing away from the mirror, and stage. The dependent variables were time in balance during a passé en relevé (seconds), 95% ellipse area (cm²) during a double pirouette, and number of maximum fouetté rotations. Change in environment elicited statistically significant changes in all dependent variables. Tukey post hoc testing revealed that performing on stage negatively impacted balance time, ellipse area and fouetté rotations. Time in balance during the passé en relevé decreased on the stage compared to both studio conditions (0.61 and 0.72 seconds). 95% ellipse area during the double pirouette was larger on the stage compared to the studio without mirror (3.86 cm² larger). Number of fouetté rotations decreased on the stage compared to the studio with mirror (1.43 rotations fewer). This study identified decreased balance and turning performance of ballet dancers on the stage in comparison to the studio. This may be due to the dramatic transition between studio and stage environments without allowing proper acclimation for ballet dancers to alter their performance or balance strategies with different visual input. It is important to recognize this deficit and to adapt training strategies to maximize performance abilities on stage. To do this, increased training opportunities on the stage should be integrated into daily schedules and prior to performances.
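    A simplified sketch of the Tukey post hoc comparison across the three environments follows; it treats the conditions as independent samples (the thesis used a repeated measures design), and all balance times are fabricated. scipy.stats.tukey_hsd requires SciPy 1.8 or newer.

```python
# Simplified Tukey HSD across three environments; this treats the
# conditions as independent samples (the thesis used repeated
# measures), and all balance times are fabricated.
import numpy as np
from scipy.stats import tukey_hsd  # SciPy >= 1.8

rng = np.random.default_rng(6)
studio_mirror = rng.normal(2.8, 0.5, 12)     # time in balance (s)
studio_no_mirror = rng.normal(2.7, 0.5, 12)
stage = rng.normal(2.1, 0.5, 12)

result = tukey_hsd(studio_mirror, studio_no_mirror, stage)
print(result)  # pairwise mean differences with adjusted p-values
```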
  • Item
    Effects of Chronic Omega-3 Fatty Acid Supplementation on Erythrocyte Deformability and Muscle Microvascular Oxygenation in Endurance Trained Cyclists
    ([Bloomington, Ind.] : Indiana University, 2018-08) Campbell, Allison J.; Chapman, Robert
    The ability to perform endurance exercise depends on the transport of oxygen to the active skeletal muscles. For oxygen to perfuse into the skeletal muscle, the flexibility of red blood cells is crucial in determining passage through narrow capillaries; thus, erythrocytes must deform to increase regional blood flow through the microvasculature. It has been shown that during submaximal exercise and in conditions of acute hypoxia, erythrocyte deformability is decreased; however, supplementation with omega-3 polyunsaturated fatty acids (PUFAs) has been shown to increase erythrocyte deformability. PURPOSE: The purpose of this study was to determine the effects of chronic omega-3 fatty acid supplementation on erythrocyte deformability and tissue oxygenation in highly trained cyclists during submaximal exercise in both normoxic and hypoxic conditions. METHODS: The protocol was performed on thirteen highly trained cyclists (VO2 max ≥55 ml/kg/min). The subjects were divided into treatment (2 g of DHA, 3 g of EPA, and 100 mg of Vitamin E) and placebo (safflower oil) groups. The subjects' pre- and post-supplementation visits followed identical protocols, with one visit occurring in normoxia and the other in acute hypoxia (FIO2 = 15%). Each visit involved 2 submaximal cycling bouts of 3 minutes, at 25% and 50% of peak power, and one bout at 75% of peak power to exhaustion. Erythrocyte elongation index (EI) and near-infrared spectroscopy (NIRS) measures were recorded for each trial. RESULTS: There were no significant differences in EI when comparing baseline values to measurements taken post-exercise in the hypoxic condition prior to supplementation. After supplementation, there were no significant differences in baseline EI compared to pre-supplementation values. After the hypoxic exercise trial, EI at 20 Pa of shear stress was significantly greater within the PUFA group post-supplementation (pre: 0.574 ± 0.004; post: 0.580 ± 0.003; p<0.05). EI at 20 Pa was significantly greater post-supplementation in the PUFA group compared to the placebo group (PUFA: 0.580 ± 0.003; placebo: 0.574 ± 0.006; p<0.05). There were no significant differences in any of the NIRS measures at any common workload. CONCLUSION: Erythrocyte deformability did not decrease with exercise in hypoxia, nor did baseline levels change with omega-3 supplementation, and only a marginally significant difference in post-exercise erythrocyte deformability measures was demonstrated. There was no effect on microvascular tissue oxygenation measures.
  • Item
    Acute Effects of Sleep Deprivation on Ocular-Motor Function as Assessed by King-Devick Test Performance
    ([Bloomington, Ind.] : Indiana University, 2018-06) Coon, Sarah; Kawata, Keisuke
    Less than 10% of high school and college student-athletes obtain the recommended amount of sleep each night. Ocular-motor function, specifically saccade movement, is known to be negatively affected by sleep deprivation. The King-Devick Test is a concussion assessment tool that measures neurocognitive and saccadic function. However, how King-Devick Test performance is affected by sleep deprivation remains unknown. Thus, the purpose of this study was to investigate the acute effects of sleep deprivation on ocular-motor function as assessed by the King-Devick Test. We hypothesized that those who were sleep deprived would have an increased, or worsened, time performance on the King-Devick Test compared to those in the control group. Forty-two college-aged subjects (18-26 years old) who regularly slept 7-9 hours per night and had normal or corrected-to-normal vision were recruited to the study. Exclusion criteria were any sleep disorders or neurocognitive dysfunctions, and pregnancy. Participants were randomly assigned to one of 2 groups: control and sleep deprivation. One week prior to testing, participants were fitted with an ActiGraph and given a sleep log to objectively and subjectively record sleep durations, respectively. There were 3 test sessions with 1 intervention over a 24-hour period: Test 1 (7am), Test 2 (7pm), overnight sleep intervention, Test 3 (7am). King-Devick measurements of time in seconds and cumulative errors made were recorded for each test session. Individuals in the sleep deprivation group underwent sleep restriction of 3.5 hours during the intervention, while the control group slept 8 hours. A repeated measures analysis of variance was used to identify a significant time by group interaction as well as a main effect of time, and Bonferroni post-hoc testing was used to locate the significant difference(s). Data were analyzed using SPSS Statistics Version 23, and the level of statistical significance was set to p < 0.05. The repeated measures analysis of variance showed a statistically significant Group x Time interaction for King-Devick speed [F(1.7, 67.6) = 8.840, p = 0.001] and a significant main effect of time [F(1.7, 67.6) = 12.736, p < 0.001]. Bonferroni post-hoc analysis revealed that the control group continued to improve their saccadic performance over time [Time 1 (42.0 ± 4.4), Time 2 (40.2 ± 4.2), Time 3 (38.4 ± 4.2); Time 1 vs. 2, p<0.001; Time 1 vs. 3, p<0.001; Time 2 vs. 3, p=0.028], while such improvement was not seen in the sleep deprivation group after acute sleep deprivation [Time 1 (41.3 ± 5.8), Time 2 (38.7 ± 6.0), Time 3 (40.9 ± 7.3); Time 1 vs. 2, p<0.001; Time 1 vs. 3, p=1.00; Time 2 vs. 3, p=0.002]. In conclusion, acute sleep deprivation mitigated the learning effect seen in the control group, indicating that acute sleep deprivation may cause a substantial decrease in neuro-ophthalmologic efficiency. Sleep deprivation negatively impacts ocular-motor function, specifically saccadic movement, as illustrated by the loss of the expected learning curve in King-Devick performance in the sleep deprivation group. Relying on saccades and cognitive function, the King-Devick Test is a commonly used concussion assessment tool, so this result has an immense clinical implication: administering the King-Devick Test to someone in a sleep-deprived condition may distort the results. A baseline test could be inaccurately high which, when compared to a post-injury result, may create a false negative. Further, a post-injury test in a sleep-deprived condition may increase the performance time even in the absence of a concussion, leading to a false positive.
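    The thesis ran its analysis in SPSS; the sketch below shows one way the same group x time (mixed) ANOVA with Bonferroni-corrected post hocs could be run in Python with pingouin. All times and group labels are fabricated placeholders.

```python
# Sketch of a group x time (mixed) ANOVA with Bonferroni-corrected
# post hoc tests, one common way to analyze a design like the one
# above; the times are fabricated. Requires pandas, numpy, pingouin.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(7)
rows = []
# Control keeps improving; sleep-deprived regresses at Test 3 (toy).
for group, drifts in [("control", (0, -2, -4)), ("sleep_dep", (0, -2, 0))]:
    for subj in range(21):
        base = rng.normal(42, 4)
        for test, drift in enumerate(drifts, start=1):
            rows.append({"id": f"{group}{subj}", "group": group,
                         "test": test,
                         "kd_time": base + drift + rng.normal(0, 1)})
df = pd.DataFrame(rows)

print(pg.mixed_anova(data=df, dv="kd_time", within="test",
                     between="group", subject="id"))
# Older pingouin versions name this function pairwise_ttests.
print(pg.pairwise_tests(data=df, dv="kd_time", within="test",
                        between="group", subject="id", padjust="bonf"))
```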
  • Item
    Relationship between substance use and the onset of Spinal Cord Injuries: A medical chart review
    ([Bloomington, Ind.] : Indiana University, 2018-06) Eldridge, Lori Ann; Piatt, Jennifer
    Spinal cord injuries (SCI), although known to have a long-term impact on the individual who sustains the injury, can have a more debilitating impact on an individual with a co-occurring substance use disorder. In fact, previous research has identified substance use as a primary risk factor for sustaining an SCI. Although the majority of SCI and substance use disorder research typically focuses on treatment of the individual post-injury, it is unclear what form of substance an individual engages with prior to sustaining the SCI (e.g., cannabis, opioids, alcohol). This study reviewed 20 medical charts of individuals ages 18 and older who had sustained an SCI and received medical care at a level 1 trauma center in Indiana, United States. Results showed that SCIs were sustained from the following causes: falls (45%), violence (25%), vehicular accidents (25%), and other (5%). Data revealed an 80% rate of positive toxicology or self-reported substance use immediately prior to the onset of the SCI. Males were positive for (or reported use of) more than one substance at a rate of 25% at the time of injury. The following substances were identified as used prior to injury, listed from most to least prevalent: opioids (37.5%), alcohol & marijuana (25%), methamphetamines & benzodiazepines (12.5%), and cocaine and synthetic cathinone (6.25%). Additionally, the rate of unemployment prior to injury was 80% for the total population sampled and 81.25% for individuals with positive substance use reported immediately prior to injury.
  • Item
    Acute Subconcussive Effect on Plasma Neurofilament Light Polypeptide in Collegiate Soccer Players
    ([Bloomington, Ind.] : Indiana University, 2018-06) Wirsching, Angela; Kawata, Keisuke
    Research continues to point to the efficacy of using blood biomarkers as tools to detect brain parenchymal damage in the periphery of the body. While emerging evidence supports the use of neurofilament light polypeptide (NF-L) as a specific blood biomarker of cerebral trauma, the cogency of its use with respect to repetitive subconcussive head impacts remains elusive. Therefore, the purpose of this study was to investigate the acute effects of subconcussive head impacts on plasma NF-L levels in a collegiate soccer cohort. To be considered for inclusion, participants were required to have at least three years of soccer heading experience and be between the ages of 18 and 26. Conversely, participants were excluded if they had a history of head injury during the year prior to the study or a history of neurological disorders. In an effort to test our central hypothesis that plasma levels of NF-L would increase in proportion to the amount of subconcussive head impact experienced, thirty-four participants were distributed between kicking-control and heading groups and assessed at four time points (pre-intervention, 0h-post, 2h-post, and 24h-post intervention). At each time point, a blood sample was collected from each participant. Between the pre-intervention and the 0h-post time point, each participant completed an intervention based on random group assignment. Participants in the heading group performed 10 soccer headers, while the kicking-control group performed 10 kicks. The ball was released by a JUGS machine approximately 40 feet from the participant at a speed of 25 mph, in one-minute intervals, inducing an average impact of about 30g, similar to that of a long throw during a soccer game. Our results support previous findings that NF-L expression markedly increases cumulatively in proportion to subconcussive head impacts, with a significant increase at 24h post-heading. Subconcussive head impacts gradually increased plasma NF-L expression, as illustrated by a significant time by group interaction, F(1, 31) = 9.17, p = 0.0049, while the kicking-control group remained consistent throughout the study time points. A significant difference was revealed at 24h post-heading (3.68 ± 0.30 pg/mL) compared to pre-heading (3.12 ± 0.29 pg/mL; p = 0.0013; Cohen's d = 1.898). The heading group (3.68 ± 0.30 pg/mL) showed a significantly higher level of plasma NF-L than the kicking-control group (2.97 ± 0.24 pg/mL; p = 0.038). There are two chief findings from this study: 1) there was no significant increase in NF-L expression until the 24h post-heading assessment in the heading group, and 2) NF-L expression remained consistent across all time points in the kicking-control group. These findings suggest potential cumulative effects of subconcussive head impacts. As research continues to reflect the negative neurocognitive effects of subconcussive head impacts, it is imperative that a gold standard for determining diagnosis, treatment, and prognosis be established. Our study further supports the efficacy of using biomarkers, specifically NF-L, in the triage and management of brain injuries.
  • Item
    Changes in Running Gait Complexity During a Cross-Country Season in Collegiate Runners
    ([Bloomington, Ind.] : Indiana University, 2018-06) Vollmar, Jacob E.; Gruber, Allison H.
    Center of mass (COM) acceleration complexity has been shown to decrease during a single fatiguing run. However, no studies have investigated how COM acceleration complexity changes over the course of a running training program and before the onset of a running-related overuse injury (RROI). The purpose of the present study was to determine whether the COM acceleration complexity of collegiate cross-country athletes changed over the course of their season training prior to the onset of an RROI. Thirty athletes wore a triaxial, research-grade accelerometer secured over the posterior aspect of the pelvis during all continuous training runs. The accelerometers were worn for the entire cross-country season. Participants completed a daily online survey to report any musculoskeletal pain or injuries. An RROI was assessed by a trainer and defined as any musculoskeletal pain or problem that resulted in a reduction or stoppage of normal training. Control entropy (CE) analysis was used to assess the complexity of the resultant COM acceleration collected by the wearable accelerometer. Participants who developed an RROI and matched (by gender and age) uninjured controls were compared. Seven participants developed an RROI. No change in COM acceleration complexity was seen prior to the diagnosed RROI (p = 0.64). The unchanged COM acceleration complexity may be explained by similar training workloads between the start of the season and immediately prior to RROI onset (p = 0.20).
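    Control entropy is a specialized, rate-controlled complexity measure; as a generic illustration of how the complexity of an acceleration signal can be quantified, the sketch below computes sample entropy of a synthetic trace. This is explicitly not the control entropy algorithm used in the thesis.

```python
# Generic sample entropy of a synthetic acceleration signal, shown
# only to illustrate complexity quantification; the thesis used
# control entropy, a different (rate-controlled) measure.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn: -log of the ratio of (m+1)- to m-length template matches."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def matches(length):
        t = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        # Chebyshev distance between every pair of templates.
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=-1)
        return np.sum(d <= r) - len(t)  # exclude self-matches

    return -np.log(matches(m + 1) / matches(m))

rng = np.random.default_rng(8)
accel = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.3 * rng.normal(size=500)
print(sample_entropy(accel))  # lower values indicate a more regular signal
```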
  • Item
    Family Perceptions on Stressors Present for an Older Adult when Transitioning from the Community into a Long-Term Care facility
    (2017-12) Daly, Kaitlin
    This study examined family perceptions of stressors present for an older adult in transition from the community into a long-term care facility. With the rise of older adults from the "baby boomer generation" moving into long-term care facilities, this study aimed to better understand the circumstances surrounding an older adult's departure from the community setting. During this transition, the family member's support often plays a role in the stress level of the older adult. This study asked family members about the overall circumstances that led to the older adult's transition into the long-term care facility. Family member participants were recruited from a senior community in the Midwestern United States. Family members qualified to be interviewed if their older adult had moved into the long-term care facility within the past three years. Data were collected by administering the Reintegration to Normal Living Index (RNL) and the Survey for Identifying Stressors Present for an Older Adult Transitioning from the Community into Long-Term Care. A total of 30 family members participated in the study. Results indicated that family members perceived that older adults experienced stress in the transition into long-term care. Factors that increased stress included not being able to physically care for oneself, falls, and decline in mental status. Results also showed that family members perceived that if the older adult was not satisfied with how their self-care needs were met, they were less likely to participate in recreational activities.