Clinical Research
This section includes Class of 2023 Embark Projects within the Clinical and Translational research areas. Many of these projects were initiated by OUWB clinical faculty across a wide range of clinical practice areas.
Iatrogenic nerve injuries during surgeries of the neck: a systematic review
Tajuldeen Al-Hasani1, Jickssa Gemechu, Ph.D.2, Varna Taranikanti, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, Michigan
INTRODUCTION
Iatrogenic nerve injuries in the neck can present with a wide spectrum of symptoms, ranging from mild pain, numbness, or weakness to devastating consequences such as permanent irreversible damage, disability, or even death. Surgical interventions in the neck region, especially those requiring radical neck dissections, carry a significant risk of iatrogenic injury to several vital nerves passing through the neck, such as the vagus nerve, phrenic nerve, brachial plexus, cervical plexus, and ansa cervicalis.
METHODS
A three-step search strategy was utilized in this systematic review. An initial limited search of PubMed and Embase was undertaken to identify text words contained in the titles, abstracts, and index terms used to describe relevant articles. The reference lists of all identified reports and articles were hand-searched for additional studies. The databases searched included PubMed, Embase, Cochrane Library, Scopus, Web of Science, Northern Lights Abstracts, and ProQuest Dissertations & Theses.
RESULTS
Our results show that lymph node biopsy in the posterior triangle of the neck carries the highest risk of iatrogenic nerve injury in the neck. In thyroidectomy and parathyroidectomy, the most commonly injured nerve is the superior laryngeal nerve. Recurrent laryngeal nerve injury is uncommon, but when it occurs it carries the most devastating postoperative effects and warrants the earliest possible intervention to lessen them.
CONCLUSIONS
If a neurological deficit is noticed immediately after an operation, the patient should be closely monitored with neurological, neurophysiological, and neurosonographic methods. If no improvement occurs after 3 months, the injured nerve should be explored. If neurosonography shows a complete separation or a neuroma in continuity, an operation should be performed immediately. The most commonly affected nerve is the accessory nerve, injured during lymph node biopsy in the posterior triangle of the neck.
Neuroplasticity: Molecular And Cellular Changes In Cerebral Cortex After Traumatic Brain Injury (Systematic Review)
Qasim Alameri, Ph.D.1, Gustavo Patino, M.D./Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, Michigan
INTRODUCTION
Neuroplasticity, also known as brain plasticity or neural plasticity, is the ability of nervous system cells to adapt and change in response to interactions of the living organism with the environment, not only under normal conditions but also under conditions that stress brain tissue, such as infection, emotional stress, and trauma. This process includes recovery from brain injury by inducing changes in the physiology and connectivity of individual neurons.
METHODS
This project was designed as a systematic review as follows:
Databases searched: PubMed, Embase, Cochrane Library, Scopus, SportDiscus, Web of Science, Google Scholar, Northern Lights Conference Abstracts, ProQuest Dissertations & Theses. Search terms: brain plastic, trauma, injury, neuroplastic, neuronal-plasticity, athletic injuries, accidental injuries, nervous system trauma, and their synonyms/variations.
Inclusion criteria: primary research articles pertaining to neuroplasticity in response to sports-related trauma, written in English and published between 1980 and 2020.
Exclusion criteria: studies of patients with preexisting brain injury/trauma; non-scientific publications (reviews, editorials, comments, news items, etc.); non-English studies; books and book chapters; and research articles on non-human subjects.
RESULTS
Results are not shown here because they are presented in figures and tables that illustrate the mechanisms of neuroplasticity.
CONCLUSIONS
Neuroplasticity research is still in the early stages of uncovering these complex, multilevel pathways, and this process will take time. We recommend more research focused on recovery rather than on the initial injury to the cerebral cortex.
Studies of the human cerebral cortex are limited by the increased risk of adverse outcomes from intervention. We recommend conducting research in animal models (especially primates) to better understand the process of regeneration in the nervous system and the potential to reverse the outcomes of TBI.
Elderly AML Patients: Effects of Comorbidities and Choice of Treatment on Overall Survival: A Beaumont Experience
Bilal M. Ali, B.S.1, Emma Herrman, M.D.2, James Huang, M.D.3, Mohammad Muhsin Chisti, M.D.4
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Internal Medicine, Beaumont Health Royal Oak, Michigan
3Department of Pathology, Beaumont Health, Royal Oak, Michigan
4Department of Hematology and Oncology, Beaumont Health, Royal Oak, Michigan
INTRODUCTION
First-line therapy for acute myeloid leukemia (AML) is the 7+3 regimen, which often cannot be used in elderly patients because of its intensity. Venetoclax plus a hypomethylating agent (HMA) is approved for AML treatment in these patients. We investigated the efficacy of this treatment in a community setting, where patients do not have the same resources available to them as at a large academic center. The primary outcome was survival of patients greater than 60 years of age with a diagnosis of AML who received 7+3 therapy versus those who received venetoclax + HMA. Secondary outcomes included characteristics of those who received the two therapies.
METHODS
A retrospective chart review was conducted for patients aged 60 or older with a diagnosis of AML who were seen by the Beaumont Hematology and Oncology Group and received treatment. Patients were seen between 09/2019 and 09/2020.
RESULTS
Of the 23 patients who received 7+3 as initial treatment, 13 died, with a median time to death of 1.98 years; among the 26 patients treated with HMA/venetoclax, 17 died, with a median time to death of 0.71 years (P = .039). Initial treatment with HMA/venetoclax was associated with a 2.19-fold greater hazard of mortality compared with 7+3 (HR 2.19; P = 0.0403). When comparing treatment choice by comorbidities and age, those who received 7+3 as initial treatment had fewer comorbid conditions (4.5 versus 6.1; p=0.04) and were younger (62 versus 75.5; p<0.0001) than those who did not.
CONCLUSIONS
As expected, median survival for those who received HMA + venetoclax was shorter than for those who received 7+3. Our hospital used this regimen in the appropriate patients: those who were older or had more comorbidities.
The Routine Use and Cost Analysis of Acid-Fast Bacilli and Fungal Cultures in Foot and Ankle Surgery: A Retrospective Study
Margaret Bohr, B.S.1, Robert Dean, M.D.2, Zein El-Zein, M.D.2, Megan Audet, M.D.2, Paul Fortin, M.D.2, Zachary Vaupel, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Beaumont Health, Royal Oak, Michigan
INTRODUCTION
Infection is a significant complication seen in orthopaedic surgery. In such cases, samples are sent to be cultured for aerobes, anaerobes, acid-fast bacilli (AFB) and fungi. AFB and fungi are generally slow-growing, difficult to culture, and rarely the pathogen, whereas aerobic and anaerobic cultures are routinely positive and guide antibiotic treatment. The goal of this study is to evaluate the value of routinely ordering AFB and fungal cultures in the setting of foot and ankle surgery.
METHODS
A retrospective chart review was conducted to determine the number of positive AFB and fungal cultures out of the total foot and ankle samples tested. Between 2014 and 2019, 322 patients were identified who underwent surgery for foot and ankle infection. Each chart was reviewed to identify the results of the microbiological tests performed. To determine the value of ordering AFB and fungal cultures, the costs associated with each were provided by our institution’s microbiology lab.
RESULTS
Of the 322 patient charts reviewed, 434 AFB and 525 fungal cultures were performed. No cultures were positive for AFB (0%), and 22 (4.2%) were positive for fungi. The total labor and material costs were calculated to be $38,767. The AFB cultures cost $23,967, the positive fungal cultures cost $2,371, and the negative fungal cultures cost $36,395.36.
CONCLUSIONS
This 322-case series of surgically managed foot and ankle infections showed 0% and 4.2% positivity rates for AFB and fungal cultures, respectively, which were associated with significant financial costs. Additional analysis is needed to determine best practices for obtaining vs. declining to culture for AFB or fungal species, including assessing patient outcomes in the series of fungal-positive cases. Future work may include observing this relationship in other subspecialties of orthopaedic surgery in order to establish a broader protocol for ordering atypical cultures that is clinically and cost effective.
Dietary Pattern Appears Not to Affect C-Reactive Protein and White Blood Cell Count In Obese Hispanic Women, Unlike In White Women: An Examination of NHANES Data From 1999-2010
Anna Bruins, B.S.1, Jacob Keeley, M.S.1, Virginia Uhley, Ph.D., RDN2,3, Kyeorda Kemp, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, Michigan
3Family Medicine and Community Health, Beaumont Health System
INTRODUCTION
Obesity is associated with elevated levels of the pro-inflammatory mediator C-reactive protein (CRP) and elevated white blood cell (WBC) counts, but reduced levels of protective compounds such as enterolignans. Studies show plant-based diets may help mitigate the effects of inflammation in chronic disease, including obesity. However, less is known about the differential effects of diet on inflammation in women of color compared to white women. The primary goal of this study is to investigate the role of adherence to a majority healthy plant-based diet on inflammation levels in obese and non-obese Hispanic women compared to obese and non-obese non-Hispanic white women.
METHODS
Data regarding CRP, WBC count, and the enterolignans enterodiol and enterolactone were collected for non-obese/obese Hispanic women and non-obese/obese non-Hispanic white women using the National Health and Nutrition Examination Survey (NHANES), years 1999-2010. Adherence to majority healthy plant-based, less healthy plant-based, and animal-based diets was determined based on two 24-hour recall interviews. Foods were assigned to these categories using the study design of Satija et al. Participants were sorted into groups based on BMI and majority diet adherence. Two-way ANOVA with post-hoc comparisons using Tukey adjustments was run, accounting for the NHANES survey sampling design.
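For illustration, the two-way ANOVA described above can be sketched in Python as follows. The data are synthetic placeholders (not study values), and the NHANES complex survey sampling design is omitted for brevity; post-hoc pairwise comparisons could then be made with statsmodels' pairwise_tukeyhsd.

```python
# A minimal sketch of the two-way ANOVA, with synthetic stand-in data;
# the real analysis also incorporated the NHANES complex survey
# sampling design, which is omitted here.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Synthetic placeholder data: CRP (mg/dL) by obesity status and diet pattern.
df = pd.DataFrame({
    "crp":   [0.2, 0.3, 0.5, 0.4, 0.6, 0.9, 0.3, 0.4, 0.7, 0.5, 0.8, 1.1],
    "obese": ["no", "no", "no", "yes", "yes", "yes"] * 2,
    "diet":  ["healthy_plant", "less_healthy_plant", "animal"] * 4,
})

# Two-way ANOVA: CRP as a function of obesity status, dietary pattern,
# and their interaction.
model = smf.ols("crp ~ C(obese) * C(diet)", data=df).fit()
print(anova_lm(model, typ=2))
```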
RESULTS
Obese women demonstrated higher levels of CRP and WBC count compared to their non-obese counterparts when matching for dietary pattern regardless of ethnicity. As expected, CRP and WBC increased in obese non-Hispanic white women as dietary pattern moved from healthy plant-based to animal-based (pCRP=0.002 and pWBC=0.017). However, CRP and WBC expression was similar in obese Hispanic women regardless of dietary pattern.
CONCLUSIONS
The results indicate that diet influences inflammation in obese Hispanic women differently than in obese non-Hispanic white women. Further research should be done to elucidate the reasons behind these differences and to inform optimal diet type for obese women based on ethnicity.
Tumor Location's Impact on Cardiac Toxicity in Women Who Received Partial Breast Irradiation
Sara Diltz, B.S.1, Muayad Almahariq, M.D./Ph.D.2, Joshua Dilworth, M.D./Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Radiation Oncology, Beaumont Health System, Royal Oak, Michigan
INTRODUCTION
Accelerated partial breast irradiation (APBI) is an appropriate modality for select women with low-risk breast cancer. While APBI delivers radiation dose to a smaller volume of breast tissue compared to whole breast irradiation, dose to the heart and coronary vessels may be relatively high, depending on the proximity of the treatment device to these structures. The primary goal of this study is to determine if the risk of adverse cardiac events depends on tumor location in women receiving APBI.
METHODS
Data from patients who received APBI from 1993 to February 2017 were gathered from EPIC, Mosaiq, and Axis databases at William Beaumont Hospital. ICD-9 and ICD-10 codes were collected to document cardiac events. Univariate and multivariable analyses were conducted to correlate patient age, smoking status, Deyo-Charlson Comorbidity Index (DCCI) score, receipt of systemic therapy, and tumor location (right outer, right inner/central, left outer, and left inner/central) with subsequent cardiac events. Results were reported as hazard ratios (HR) with 95% confidence intervals. P values of less than 0.05 were considered significant.
RESULTS
DCCI and patient age, but not smoking status or receipt of chemotherapy, predicted cardiac events. Left inner/central tumor location was independently correlated with increased cardiac events at borderline significance (HR=2.06 [1.00, 4.24], p=0.05).
CONCLUSIONS
Treatment of tumor beds located in the inner/central portion of the left breast is associated with a higher risk of developing cardiac toxicity. These data support careful contouring and avoidance of cardiac structures during APBI treatment planning.
Micronutrient deficiencies in Beaumont Integrative Medicine fatigue patients
Jessica Dorschner, M.S.1, Maureen Anderson, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Integrative Medicine, Beaumont Health System, Royal Oak, Michigan
INTRODUCTION
Fatigue is a common patient concern at primary care visits and can persist after addressing etiologies such as anemia, infection, malignancy, depression, and cardiopulmonary disorders. Micronutrient deficiencies have also been investigated in fatigue patients, particularly individual vitamins such as C, D, B12, and E and the mineral zinc. Fewer studies have investigated broad panels of micronutrients in fatigue patients. The Beaumont Integrative Medicine Clinic evaluates many patients with fatigue and assesses micronutrient levels using SpectraCell testing, which measures 31 vitamin, mineral, and metabolite levels concurrently.
METHODS
A retrospective chart review was conducted of 50 Beaumont Integrative Medicine patients from 2014-2018 who reported fatigue as their primary concern and received micronutrient testing. Age at testing, sex, and fatigue-related comorbidity data were also collected. The percent prevalence of borderline and frank deficiencies was calculated for each micronutrient.
RESULTS
The Beaumont Integrative Medicine fatigue patient population was 80% female and 20% male, with a median age at testing of 55.5 years. The most prevalent nutrient deficiencies, each seen in nearly one third of the study population (prevalence of 30%), were in vitamins B5 (pantothenate), B12, and D; coenzyme Q10; zinc; oleic acid; and chromium. Regarding fatigue-related comorbidities, the prevalences of depression, anxiety, and insomnia tended to be higher in the Beaumont Integrative Medicine fatigue patient population than the prevalences cited in the literature for the general population.
CONCLUSIONS
Among Beaumont Integrative Medicine fatigue patients, the most prevalent nutrient deficiencies were in vitamins B5, B12, and D; coenzyme Q10; zinc; chromium; and oleic acid. All of these nutrient deficiencies except for oleic acid have been previously linked to fatigue in the literature. Since no controls were included in our study, it is unknown whether these deficiencies are otherwise common among all Beaumont Integrative Medicine patients or the wider population.
Outcomes of DA R-EPOCH versus R-CHOP in treating patients diagnosed with Double-Expressor lymphoma
Phat Duong, B.S.1, Ishmael Jaiyesimi, D.O.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Hematology and Oncology, Beaumont Health, Royal Oak, Michigan
INTRODUCTION
Double-expressor lymphoma (DEL) is a subtype of diffuse large B-cell lymphoma (DLBCL) that is associated with poor prognosis. The standard treatment for DLBCL is rituximab, cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP). It is hypothesized that the aggressive nature of DEL warrants a more intense regimen such as dose-adjusted etoposide, prednisone, vincristine, cyclophosphamide, and doxorubicin plus rituximab (DA-R-EPOCH). A comparison of outcomes between these two treatments is needed to guide clinical decisions.
METHODS
A retrospective review of the Beaumont Health system from 2014 to 2019 was conducted. The primary objectives were progression-free survival (PFS) and overall survival (OS). The secondary objective was to identify the demographics of DEL patients.
RESULTS
Between 2014 and 2019, 44 patients were diagnosed with DEL, of whom 34 received treatment: 10 received DA-R-EPOCH and 24 received R-CHOP. There was no difference in race, ethnicity, or disease stage between the two groups. The median age was 66.4 and 75.2 years for the DA-R-EPOCH and R-CHOP groups, respectively. There was no difference in PFS between the two arms (hazard ratio, 0.66; 95% CI, 0.21 to 2.02; p=0.46), with a 3-year PFS rate of 56% (95% CI, 21% to 81%) for DA-R-EPOCH compared to 38% (95% CI, 18% to 59%) for R-CHOP. The difference in OS was not statistically significant (hazard ratio, 1.03; 95% CI, 0.33 to 3.24; p=0.96), with a 3-year OS rate of 60% (95% CI, 25% to 83%) for DA-R-EPOCH and 62% (95% CI, 40% to 78%) for R-CHOP.
CONCLUSIONS
In this study, the more aggressive regimen DA-R-EPOCH did not improve OS or PFS compared to the standard R-CHOP approach. A study involving a larger number of patients and multiple centers is needed.
Demographics and Survival in AML Patients over 60 Years of Age: A Single Institutional Analysis
Damilola Gbadebo, B.S.1, Nwabundo Anusim, M.D.2, Ishmael Jaiyesimi, D.O.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Beaumont Health System, Department of Medical Oncology, Royal Oak, Michigan
INTRODUCTION
Acute Myeloid Leukemia (AML) is a malignancy of the myeloid cell line. Patients diagnosed with AML typically exhibit symptoms of neutropenia, anemia, and thrombocytopenia. The diagnosis of AML is based on greater than 20 percent myeloid blasts in the bone marrow (1). Risk factors for acquiring the disease include, but are not limited to, age, sex, smoking, exposure to certain chemicals, radiation, genetic predisposition, and treatment with certain chemotherapeutic agents (2).
AML is most frequently diagnosed among people between ages 65-74, with a current relative survival rate of 28.7%. Men account for the majority of these cases, at a rate of 5.2 per 100,000 persons compared to 3.6 per 100,000 persons for female patients (3).
Our study assessed whether outcomes for AML patients have improved since 2015 with the utilization of novel chemotherapeutic agents, particularly for Beaumont Hospital patients.
METHODS
A total of 400 patient charts were analyzed; 19 patients from 2010-2014 and 11 patients from 2015-2020 met criteria for inclusion. Survival time was calculated from the date of diagnosis to the date of death. A Welch t-test was used to assess statistical significance.
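A minimal sketch of the Welch t-test used here, with placeholder survival times rather than study data:

```python
# Welch's t-test comparing survival times between the pre-2015 and
# post-2015 cohorts; values below are hypothetical placeholders.
from scipy import stats

pre_2015_months = [10.2, 8.5, 14.0, 6.3, 11.1]   # placeholder survival (months)
post_2015_months = [5.1, 4.8, 7.2, 3.9]          # placeholder survival (months)

# equal_var=False requests Welch's t-test (no equal-variance assumption).
t_stat, p_value = stats.ttest_ind(pre_2015_months, post_2015_months,
                                  equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```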
RESULTS
There were 19 patients in the pre-2015 group (2010-2014), with an average age of 69.58 years and a survival time of approximately 10 months. The post-2015 group (2015-2020) had 11 patients, with an average age of 76.93 years and a survival time of approximately 5 months. Comparison of the two groups gave a p-value of 0.2267.
CONCLUSIONS
The results did not support the hypothesis that patients treated after 2015 would have better survival. This could be due to the post-2015 patients having a higher age at diagnosis, or to the lack of power of the study. The second aim was not addressed due to the lack of patients who met the criteria.
Googling Depression: A Critical Appraisal of Online Health Information
Brianna Gibney, B.S.1, Misa Mi, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, Michigan
INTRODUCTION
Mood disorders such as depression can easily be misunderstood due to the stigma and misconceptions surrounding mental health. Oftentimes people turn to the internet for insight. Unfortunately, online health information is unregulated and may contain misinformation and confusing content. The aim of this study was to evaluate the quality, content, and readability of the most popular depression websites in the United States.
METHODS
Eight of the top depression websites in the United States were identified through a comprehensive, multiple-query search. Using modified methods from previously published research, the websites were evaluated based on user-friendly design, credibility, accessibility, literacy, engagement, content, and cultural sensitivity.
RESULTS
Data analysis revealed that all websites provided basic depression information, user-friendly designs, accessibility features, opportunities for reader engagement, and social media presence. Most sites were credible; however, two sites were missing an author, editor, and references. The average readability level was high, at an 11.5-grade level based on the SMOG index. However, out of all categories, cultural sensitivity fared worst. Only two sites were offered in multiple languages, and only one mentioned at-risk minority groups. Cultural stigmas related to depression were never mentioned.
CONCLUSIONS
The results of this study highlight the demand, intensified by the COVID-19 pandemic, for culturally sensitive, multilingual depression resources online. Further efforts are needed to create accessible and easy-to-understand depression resources for all health consumers, regardless of educational and/or cultural background. Improving public understanding of depression through online resources can help reduce the stigma around mental illness and health disparities among diverse patients.
Incidence of Myopathy in Post COVID-19 Patients undergoing Statin Therapy
Jithin John, B.S.1, Eduardo Leon, B.S.1, Ramin Homayouni, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, Michigan
INTRODUCTION
Statin-associated myositis, marked by an increase in creatine kinase (CK) levels, has been documented since statins first came into use. While some suggest avoiding statins in individuals with COVID-19, recent literature suggests that statin use is protective against SARS-CoV-2 infection due to its antioxidant and anti-inflammatory properties. Literature on the role of statin therapy in COVID-19 cases, especially regarding myopathy, remains scarce. We hypothesized that different statin therapies prior to SARS-CoV-2 infection are associated with an increased risk of myopathy and an increase in blood CK levels.
METHODS
In this retrospective study, medical records were extracted from the Beaumont Health Epic electronic medical record (EMR) system. The study was limited to adult patients with a PCR diagnosis of COVID-19, an updated problem list within one year prior to their infection, and at least one encounter within three months after infection. Patients with “new myalgia” and an increase in CK values were identified using inclusion and exclusion criteria.
RESULTS
Using a multivariate linear regression model adjusted for several factors, we found no association between the type of statin therapy prior to SARS-CoV-2 infection and the development of new myalgia or increases in blood CK levels from normal ranges. In contrast, we found that the development of new myalgia, but not an increase in CK level, after SARS-CoV-2 infection was significantly associated with female gender irrespective of the type of statin therapy.
CONCLUSIONS
We found no differences between the type of statin therapy and the onset of new myalgia post-COVID-19 or elevated CK levels from normal pre-COVID ranges. Further research is needed to fully understand the risk factors and benefits of statin therapy in COVID-19 patients. This study highlights the importance of individualized clinical decision-making and the need for additional research in this area.
The Effect of BMI on the Onset of Radiation Cystitis: A Pilot Study
Kelsa G. Kazyak, M.S.1, Bernadette M.M. Zwaans, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Urology at Beaumont Health System, Royal Oak, Michigan
INTRODUCTION
Radiation cystitis (RC) is a debilitating adverse side effect of pelvic radiation therapy. Despite advancements within the field of radiation oncology, the location of the bladder makes it difficult to avoid during radiation, and approximately 5-10% of cancer survivors with a history of pelvic radiation are at risk. While RC is not common, the long-term effects can be crippling, and we hope to provide more information on the risk factors to help survivors and inform clinicians.
The primary goal of this study is to determine whether increased Body Mass Index (BMI) prior to the start of radiation therapy affects the onset of RC. A secondary goal is to begin to characterize patients diagnosed with RC.
METHODS
A list of patients diagnosed with RC between 2010-2022 was extracted from EPIC using the diagnosis codes N30.40, N30.41 (RC with and without hematuria), and N30.90, N30.91 (cystitis, undefined with and without hematuria). We identified 51 male patients aged 18-90, with a diagnosis of prostate cancer and a history of pelvic radiation. Retrospective chart review was used to collect BMI and RC onset dates. Patients were divided into two groups based on when they developed RC and their BMIs were subsequently evaluated.
RESULTS
A chi-squared test of independence showed no significant association between initial BMI and onset of RC, χ2(2, N = 51) = 1.42, p = 0.49. Among patients with an initial BMI classified as healthy (M = 4.26, SD = 2.77), overweight (M = 5.19, SD = 3.29), and obese (M = 6.13, SD = 4.16), there was no significant difference in onset of RC, F(2, 28) = 0.978, p = 0.38.
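The two tests reported above can be sketched as follows; the contingency table and onset values are illustrative placeholders, not the study's data:

```python
# Chi-squared test of independence plus one-way ANOVA, mirroring the
# analyses above; all counts and values are hypothetical.
from scipy.stats import chi2_contingency, f_oneway

# Rows: healthy, overweight, obese BMI; columns: early vs. late RC onset.
table = [[6, 5],
         [10, 9],
         [11, 10]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.2f}")

# One-way ANOVA comparing years to RC onset across BMI categories.
healthy = [4.1, 3.9, 5.0]
overweight = [5.3, 4.8, 5.5]
obese = [6.0, 6.5, 5.9]
f_stat, p_anova = f_oneway(healthy, overweight, obese)
print(f"F = {f_stat:.3f}, p = {p_anova:.2f}")
```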
CONCLUSIONS
The results from our study suggest that BMI at time of radiation does not increase the risk of developing radiation cystitis. These results are limited due to data constraints and more research is encouraged.
High Dose Rate Brachytherapy monotherapy versus External Beam Radiotherapy with HDR Brachytherapy Boost for Unfavorable Intermediate Prostate Cancer Patients
Doyle Lang, B.S.1, Benjamin Willen, M.D.2, Daniel J Krauss, M.D.2, Sirisha R. Nandalur, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Beaumont Health, Royal Oak, Michigan, United States
INTRODUCTION
Prostate cancer is the most common male malignancy by incidence in the world. Treatment differs by the patient’s risk stratification. For the treatment of unfavorable intermediate-risk prostate cancer, external beam radiotherapy with high-dose-rate brachytherapy boost has been the accepted treatment, but high-dose-rate brachytherapy as monotherapy has been proposed as a potentially viable treatment option. External beam radiotherapy delivers high-energy photon or particle radiation through normal healthy tissue to reach the tumor. High-dose-rate brachytherapy involves placing radioactive sources directly into the tumor. Studies are needed to compare toxicity profiles and relative outcomes between the two treatment options.
METHODS
A retrospective analysis of 51 matched pairs of patients who received external beam radiotherapy with high-dose-rate brachytherapy boost or high-dose-rate brachytherapy as monotherapy was conducted. The Kaplan-Meier method was used to estimate overall survival (OS), cause-specific survival (CSS), loco-regional recurrence (LRR), disease-free survival (DFS), and distant metastases (DM).
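A minimal sketch of a Kaplan-Meier estimate for one endpoint (overall survival) in the two groups, assuming the lifelines library; durations and event flags are synthetic placeholders:

```python
# Kaplan-Meier estimate per treatment group; data are placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "months": [24, 60, 48, 36, 72, 30, 55, 40, 66, 28],
    "died":   [0, 1, 0, 1, 0, 0, 1, 0, 0, 1],
    "group":  ["HDR mono"] * 5 + ["EBRT + HDR boost"] * 5,
})

kmf = KaplanMeierFitter()
for name, grp in df.groupby("group"):
    # Censored patients (died == 0) are handled via event_observed.
    kmf.fit(grp["months"], event_observed=grp["died"], label=name)
    print(name, kmf.median_survival_time_)
```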
RESULTS
There were no significant differences in overall survival, cause-specific survival, loco-regional recurrence, distant metastases, or freedom from biochemical failure between patients treated with HDR brachytherapy monotherapy and those treated with EBRT with HDR boost.
CONCLUSIONS
The results support the hypothesis that similar toxicity profiles and treatment outcomes exist for patients treated with HDR brachytherapy monotherapy compared to EBRT with HDR boost. As a treatment option that avoids external pelvic radiation, HDR brachytherapy monotherapy can be seen as a viable option for unfavorable intermediate-risk prostate cancer patients.
A Retrospective Review of Catheter-Directed Therapy for Patients with Intermediate Risk Pulmonary Embolisms
Austin Lehew, B.S.1, Michael Tucciarone, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2William Beaumont Hospital, Troy, Michigan
INTRODUCTION
Each year, hundreds of patients present to the William Beaumont hospital system with intermediate-risk pulmonary embolisms (IRPE), with common symptoms of chest pain, shortness of breath, and hypoxia. Conventional treatments include anticoagulants alone, or a more aggressive approach with systemic thrombolytics, which carry a higher risk of major bleeding and death. However, in the last few years, catheter-directed therapies (CDT) have been developed to mechanically remove clots in a minimally invasive fashion. The goal of this study is to determine the safety and efficacy of novel CDT at Beaumont Troy and Royal Oak.
METHODS
A retrospective chart review was conducted for 199 IRPE patients who underwent CDT at William Beaumont Troy and Royal Oak from 2018-2022. Pre- and post-procedure mean pulmonary artery pressure (mPAP, mmHg) and follow-up right ventricular systolic pressure (RVSP), which estimates mPAP, were obtained for each patient. Incidence of major bleeding requiring transfusion and death within 30 days of intervention was also obtained.
RESULTS
Of the 199 patients included in this study, 10 (5.0%) died within 30 days of intervention and 22 (11.1%) had a major bleed requiring a blood transfusion within 30 days of intervention. There was a statistically significant immediate improvement in mean pulmonary artery pressure of 7.98 mmHg (p<0.001), as measured by right heart catheterization.
CONCLUSIONS
Though there was a higher rate of major bleeding than anticipated, the results support the hypothesis that novel CDT are relatively safe and produce good health outcomes, evidenced by the immediate reduction in right heart strain, which correlates with PE-related morbidity. The continued use and advancement of this technology may result in better health outcomes for IRPE patients.
A Retrospective Study on the Incidence of CKD Diagnosis Post COVID-19 Infection with Variations in Glycemic Control
Eduardo Leon, B.S.1, Jithin John, B.S.1, Nick Ludka1, Ramin Homayouni, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, Michigan
INTRODUCTION
The COVID-19 pandemic has emphasized that the virus can cause multi-organ complications, especially in patients with pre-existing conditions such as type-2 diabetes mellitus (T2DM), who are at higher risk for poor outcomes. Some patients may also develop T2DM post-infection due to the virus's effects on insulin secretion and blood glucose regulation. Glycated hemoglobin (HbA1c) is a useful tool for assessing blood glucose levels over time, diagnosing diabetes, and monitoring disease management. Elevated levels of HbA1c have been linked to diabetic nephropathy (DN) and chronic kidney disease (CKD), both of which are potential complications of COVID-19 infection. This study aims to investigate the development of CKD in patients with varying HbA1c levels and other comorbidities after SARS-CoV-2 infection, hypothesizing that COVID-19 may accelerate the development of CKD in at-risk patients with higher levels of HbA1c.
METHODS
This retrospective study analyzed electronic medical records of adult COVID-19 patients from January 2020 to September 2021, focusing on patient demographics, HbA1c values, COVID-19 severity, and comorbidities. The study followed up on patients for 8 months after COVID-19 diagnosis to investigate the development of CKD in those with different HbA1c levels. Statistical tests were performed to assess significant differences in CKD development.
RESULTS
Severe COVID-19 and high HbA1c were significant risk factors for CKD onset. Severe COVID-19 had an OR of 4.48 (95% CI, 3.36-5.52) and HbA1c had an OR of 1.57 (95% CI, 1.46-1.69). Male gender, atrial fibrillation, hypertension, hypothyroidism, age, and non-white race were also associated with increased risk of developing CKD after infection.
CONCLUSIONS
Severe COVID-19 illness may exacerbate the rate at which uncontrolled blood sugars can lead to CKD. Management of modifiable factors such as blood sugar, hypertension and hypothyroidism may decrease risk of CKD development after COVID-19.
Evaluating High Frequency Remote Monitoring of Temperature Using Wireless Temperature Sensor Patches Versus Standard-of-Care Temperature Monitoring in Cancer Patients
Ryan Lindstrom, B.A.1, Kelly Mayhew, M.S.2, Christopher Flora, Ph.D.2, Muneesh Tewari, M.D./Ph.D.2, Sung Choi, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Hematology/Oncology, Michigan Medicine, Ann Arbor, Michigan
3Department of Pediatric Hematology/Oncology, Michigan Medicine, Ann Arbor, Michigan
INTRODUCTION
Collection of real-time temperature data has been shown to help detect fevers in patients undergoing chimeric antigen receptor T cell (CAR-T) or hematopoietic stem cell therapy (HCT) earlier than traditional vital sign collection times in the hospital. Earlier detection of fever allows for earlier intervention and less threatening sequelae. For real-time temperature data to make an impact, however, it must be consistently received. Our goal is to analyze the best methods for consistent real-time temperature data collection in cancer patients undergoing CAR-T or HCT therapy.
METHODS
This was an analysis of a prospective study of 61 patients undergoing CAR-T (n=22) or HCT (n=39) therapy who underwent at least one week of inpatient monitoring. Patients were given an FDA-approved high frequency remote monitoring (HFRM) wearable sensor (TempTraq®, BlueSpark Technologies), worn in the axilla. Each temperature measurement count, taken every 2 minutes using the sensor, was noted and compared to standard-of-care (SOC) temperature measurement counts pulled from the medical record.
Both HFRM and SOC temperature counts from all patients were averaged and plotted as box plots. HFRM and SOC counts were also broken down into 7-day periods and averaged. These numbers were then compared using a two-sample t-test assuming unequal variance, with a Bonferroni correction applied at a significance threshold of p < 0.05.
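A sketch of the weekly HFRM-vs-SOC comparison, assuming scipy and statsmodels; the weekly count lists are hypothetical placeholders:

```python
# Welch t-tests per 7-day window with a Bonferroni correction across
# the weekly comparisons; counts below are illustrative only.
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

weekly_hfrm = [[4800, 5100, 4650], [4900, 4700, 5000]]  # counts/patient/week
weekly_soc  = [[28, 31, 25],       [30, 27, 29]]

p_values = [ttest_ind(h, s, equal_var=False).pvalue
            for h, s in zip(weekly_hfrm, weekly_soc)]

# Bonferroni correction across the weekly comparisons, alpha = 0.05.
reject, p_adj, _, _ = multipletests(p_values, alpha=0.05,
                                    method="bonferroni")
print(list(zip(p_adj, reject)))
```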
RESULTS
HFRM produced a significantly greater average number of temperature counts per patient than SOC. HFRM also produced significantly more counts than SOC from the week before therapy through three weeks post-therapy. HFRM did not appear to have significantly more temperature measurement counts than SOC from the fourth week onwards.
CONCLUSIONS
HFRM produces significantly more temperature counts than SOC during the most vital periods of CAR-T and HCT therapies.
A Prospective Sonographic Evaluation of Peripheral Intravenous Catheter Associated Thrombophlebitis
Nicholas Mielke, B.S.1,2, Steven Johnson, D.O.2,3, Patrick Karabon1, Amit Bahl, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Emergency Medicine, Beaumont Health, Royal Oak, Michigan
3Department of Anesthesia and Critical Care, Keck School of Medicine of the University of Southern California, Los Angeles, California
INTRODUCTION
Thrombophlebitis associated with peripheral intravenous catheters (PIVCs) is a poorly described complication in the literature. Given the limited accuracy of current assessment tools and poor documentation in the medical record, the true incidence and relevance of this complication are misrepresented. We aimed to identify risk factors for the development of thrombophlebitis using an objective methodology coupling serial diagnostic ultrasound and clinical assessment.
METHODS
We conducted a single-site, prospective observational cohort study. Adult patients presenting to the emergency department who underwent traditionally placed PIVC insertion and were being hospitalized with an anticipated length of stay greater than two days were eligible participants. Using serial, daily ultrasound evaluations and clinical assessments via the phlebitis scale, we identified patients with asymptomatic and symptomatic thrombosis. The primary goal was to identify demographic, clinical, and IV-related risk factors associated with thrombophlebitis. Univariate and multivariate analyses were employed to identify risk factors for thrombophlebitis.
RESULTS
62 PIVCs were included between July and August 2020. 54 (87.10%) developed catheter-related thrombosis, with 22 (40.74%) of the thrombosed catheters characterized as symptomatic. Multivariate Cox regression demonstrated that a catheter diameter relative to vein diameter greater than one-third [AHR 5.41 (1.91, 15.4), P=0.0015] and an angle of the distal tip of the catheter against the vein wall ≥ 5 degrees [AHR 4.39 (1.39, 13.8), P=0.0116] were associated with increased likelihood of thrombophlebitis.
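A sketch of the multivariate Cox model behind the adjusted hazard ratios above, assuming the lifelines library; column names and values are hypothetical stand-ins for the catheter-to-vein ratio and tip-angle variables:

```python
# Cox proportional-hazards fit on placeholder data; exp(coef) in the
# summary corresponds to the adjusted hazard ratios reported above.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_observed":        [1, 2, 3, 2, 4, 3, 5, 2, 3, 4],
    "thrombophlebitis":     [1, 0, 1, 1, 0, 1, 0, 1, 1, 0],
    "ratio_over_one_third": [1, 1, 0, 1, 0, 1, 0, 0, 1, 0],
    "tip_angle_ge_5deg":    [1, 0, 0, 1, 0, 1, 1, 0, 0, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_observed", event_col="thrombophlebitis")
cph.print_summary()  # exp(coef) column gives adjusted hazard ratios
```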
CONCLUSIONS
Our study found that a larger catheter-to-vein size ratio and a steeper catheter tip angle increased the risk of thrombophlebitis. Catheter size relative to vein size is a modifiable factor that should be considered when inserting PIVCs. Additional larger prospective investigations using objective methodologies are needed to further characterize complications in PIVCs.
Outcomes of Locally Advanced Lung Cancer Patients Treated with 60 Gy vs. 70 Gy
Batoul Nasser, M.S.1, Muayad Almahariq, M.D./Ph.D.2, Inga Grills, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Beaumont Health, Royal Oak, Michigan
INTRODUCTION
The standard form of treatment for locally advanced lung cancer is the conventional dose of 60 Gy radiation therapy (RT) delivered concurrently with chemotherapy. However, five-year overall survival is still less than 20%. These outcomes remain poor due to both distant and locoregional recurrence. The literature suggests that locoregional control and overall survival are strongly associated with radiotherapy doses higher than 60 Gy. However, when escalating doses exceeding 70 Gy were prescribed, overall survival rates plateaued. The primary goal of this study is to determine whether there is a marked difference in clinical outcomes (i.e., mortality, disease progression, local recurrence) when administering either the standard 60 Gy or a simultaneous integrated boost (SIB) to 70 Gy, concurrently with chemotherapy, to patients with locally advanced lung cancer. A secondary goal is to determine whether our findings support or contradict the results of previous studies that examined escalating doses of radiotherapy.
METHODS
A retrospective cohort review of 165 patients with locally advanced lung cancer treated at Beaumont hospitals between 2009 and 2019 was performed; patients were analyzed as receiving either standard 60 Gy or a lung SIB to 70 Gy, delivered concomitantly with chemotherapy. The following patient data were collected and evaluated: performance status, radiation dose, clinical outcomes (overall survival, locoregional control, progression-free survival), status, and history of heart disease, hypertension, kidney disease, and diabetes.
RESULTS
Compared to patients receiving conventional 60 Gy, patients receiving a lung SIB to 70 Gy demonstrated a trend toward improved local control (p=0.057).
CONCLUSIONS
The results demonstrated a trend toward improved local control with lung SIB to 70 Gy compared to conventional 60 Gy. Therefore, lung SIB to 70 Gy is a safe approach to deliver higher doses of radiation therapy to gross tumor while sparing normal tissue.
Impact of Structured Reporting Template on the Quality of HRCT Radiology Reports for Interstitial Lung Disease
Han G. Ngo, B.S.1, Girish B. Nair, M.D.2, Sayf Al-Katib, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Division of Pulmonary and Critical Care Medicine, Beaumont Health, Royal Oak, Michigan
3Department of Diagnostic Radiology and Molecular Imaging, Beaumont Health, Royal Oak Michigan
INTRODUCTION
This QI study compared the completeness of HRCT radiology reports before and after the implementation of a disease-specific structured reporting template for suspected cases of interstitial lung disease (ILD).
METHODS
A pre-post analysis of HRCT radiology reports of the thorax at a multicenter health system was performed. Data were collected over 6-month intervals before (June 2019-Nov 2019) and after (Jan 2021-June 2021) the implementation of a disease-specific template. Use of the template was voluntary. The primary outcome measure was the completeness of HRCT reports, graded based on the documentation of ten descriptors. The secondary outcome measure assessed which descriptors' use improved after the intervention.
RESULTS
521 reports before and 557 reports after the intervention were reviewed. Of the 557 post-intervention reports, 118 (21%) used the implemented structured reporting template. The mean completeness score was 9.20 (SD = 1.08) in the pre-intervention group and 9.36 (SD = 1.03) in the post-intervention group, a difference of -0.155 (95% CI [-0.2822, -0.0285], p < 0.0001). Within the post-intervention group, the mean completeness score was 9.25 (SD = 1.07) for unstructured reports and 9.93 (SD = 0.25) for template reports, a difference of -0.677 (95% CI [-0.7871, -0.5671], p < 0.0001). After the intervention, the use of two descriptors improved significantly: the "presence of honeycombing" from 78.3% to 85.1% (p = 0.0039) and the "technique" from 90% to 96.6% (p < 0.0001).
CONCLUSIONS
The use of an ILD disease-specific template significantly increased the completeness of HRCT radiology reports and improved the use of two descriptors: technique and presence of honeycombing. A more substantial impact of structured reporting might have been observed with greater utilization of the voluntary template. Further research on how to improve the voluntary uptake of a disease-specific template is needed to help increase the acceptance of structured reporting among radiologists.
Assessment of Radiographic Features in Predicting Complete Quadriceps and Complete Patellar Tendon Ruptures in the Pre-Operative Setting
Mitchell Pfennig, B.S.1, Matthew Astolfi, M.D.2, Christopher Vasileff, M.D.2, Betina Hinckel, M.D./Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Orthopedic Surgery, Beaumont Health System
INTRODUCTION
Acute patellar tendon or quadriceps tendon rupture is a severely debilitating injury resulting in complete loss of the knee extensor mechanism. Early diagnosis and prompt surgical repair are preferred in order to achieve good functional outcomes. In the setting of trauma and knee pain, plain film radiographs of the knee are obtained to rule out fractures. Patellar and quadriceps tendon rupture is primarily a clinical diagnosis but can benefit from additional imaging to differentiate between a partial and complete tear when there is uncertainty. Our goal is to assess the reliability and reproducibility of radiographic features in a population with acute traumatic isolated quadriceps tendon or patellar tendon ruptures.
METHODS
Two residents reviewed the patient charts, calculated values for the Insall-Salvati, Caton-Deschamps, and Blackburne-Peel ratios, and evaluated the soft tissue appearance of the bony insertion and focal intratendinous radiolucency. Diagnostic abilities were evaluated and compared between a control group and those with quadriceps and patellar tendon ruptures.
RESULTS
This study evaluated 23 patients, 15 of whom had quadriceps tendon ruptures and 8 had patellar tendon ruptures. Our results showed that the Insall-Salvati (1.77 ± 0.51 vs. 0.96 ± 0.18, PT vs QT, respectively) and Caton-Deschamps ratios (0.76 ± 0.35 vs 1.00 ± 0.18, PT vs QT, respectively) had high rater reliability. However, the Blackburne-Peel ratio and focal intratendinous radiolucency had fair to moderate reliability. These measures' diagnostic abilities (sensitivity and specificity) were evaluated and showed promising results in diagnosing quadriceps and patellar tendon ruptures.
CONCLUSIONS
We conclude that plain radiographs can be a reliable diagnostic tool for acute traumatic isolated quadriceps and patellar tendon ruptures. Proper use of radiographs can provide a simpler, cheaper, and faster diagnosis than ultrasound or MRI, potentially leading to improved outcomes.
Frailty Among Total Hip and Knee Arthroplasty Recipients: Epidemiology and Effect on In-hospital Postoperative Outcomes
Luu Pham, B.A.1, Abdul K Zalikha, M.D.2, Jacob Keeley, M.S.1, Inaya Hajj Hussein, Ph.D.3, Mouhanad M. El-Othmani, M.D.4
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Orthopaedic Surgery and Sports Medicine, Detroit Medical Center, Detroit, Michigan
3Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, Michigan
4Department of Orthopaedic Surgery, Columbia University Medical Center, New York, New York
INTRODUCTION
Total joint arthroplasty (TJA) is one of the most successful and frequently performed procedures in the United States. The number of these procedures is projected to continue growing rapidly in the coming years, and with it comes demand for more sophisticated perioperative risk and complication assessment. This study examines the impact of one qualifier from such assessments, frailty, on postoperative inpatient complications and hospital resource utilization following TJA.
METHODS
Discharge data from the National Inpatient Sample registry were used to identify all patients aged 50 or older who underwent TJA between 2006 and 2015. Nonelective admissions and hip fractures were excluded. Patients were stratified into two groups: those with and without concomitant diagnostic criteria qualifying them as frail. An analysis comparing the two groups' epidemiology, postoperative outcomes, and hospital economic and disposition results was performed.
RESULTS
A total of 8,434,946 TJAs were included in this analysis: 5,757,628 total knees (96,602 frail, the rest non-frail) and 2,677,318 total hips (61,423 frail, the rest non-frail). Among these patients, the average age was 67.02 years and 61.1% were female. Patients with frailty had an increased risk of any postoperative complication, a longer length of hospital stay, and higher hospital charges. Frailty was associated with an increased risk of central nervous system disorder, hematoma/seroma, wound dehiscence, infection, deep vein thrombosis, and anemia complications compared with non-frail patients. Patients with frailty also had significantly higher rates of all individual comorbidities compared to non-frail patients.
CONCLUSIONS
Patients with frailty undergoing TJA procedures are at a significantly higher risk of developing postoperative complications and worse hospital economic outcomes. As this patient population continues to grow, it is imperative for clinicians to account for these risk factors in optimizing perioperative care and support.
Cannabis Use Disorder in the Setting of Primary Total Hip Arthroplasty: Understanding the Epidemiology, Demographic Characteristics, and Inpatient Postoperative Outcomes
Dalia Rahmon, B.S.1, Inaya Hajj Hussein, Ph.D.2, Abdul Zalikha, M.D.3, Matthew Mazur, M.D.3, Mouhanad El-Othmani, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, Michigan
3Department of Orthopaedic Surgery, DMC, Detroit, Michigan
INTRODUCTION
Cannabis use is expected to increase in the context of its decriminalization and legalization in several states. The purpose of this study was to report on the epidemiologic and demographic characteristics and inpatient postoperative outcomes of patients with cannabis use disorder (CUD) undergoing primary total hip arthroplasty (THA).
METHODS
The National Inpatient Sample registry was used to identify patients undergoing THA between 2006 and 2015. Patients were stratified into groups with and without CUD. Epidemiology, comorbidity, and outcomes data were comparatively analyzed between these two groups.
RESULTS
A total of 2,838,742 THAs were performed during the study period. The prevalence of CUD significantly increased from 0.10% in 2006 to 0.39% in 2015 (P < 0.0001). Patients with CUD were significantly younger, more likely to be male, more likely to have Medicaid insurance, and more likely to be non-Hispanic Black and less likely to be non-Hispanic White compared with the control group. When comparing patients with and without CUD, there was no significant difference in the composite any-complication variable and no significant difference in seven of the eight individual in-hospital complications assessed, the exception being higher genitourinary complications in the CUD group. There were no significant differences in discharge disposition or length of stay.
CONCLUSIONS
Although CUD is significantly associated with various demographic, comorbidity, and hospital characteristics, it is not significantly associated with in-hospital complications, discharge disposition, and length of stay outcomes in the immediate in-hospital, postoperative period. It is critical for clinicians and public health professionals to understand the characteristics and expected inpatient outcomes of this evolving population of patients with CUD undergoing THA, particularly in the context of widespread legalization.
Combining DRIP Score and Rapid Diagnostics For Improved Antibiotic Stewardship
Richard Ramirez, B.S.1, Matthew Sims, M.D./Ph.D.1,2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Infectious Diseases and International Medicine Section, Beaumont Hospital, Royal Oak, Michigan
INTRODUCTION
Analysis of treatment patterns for broad-spectrum antibiotic use in pneumonia revealed that 60% of patients were overtreated, highlighting the need for effective antibiotic stewardship practices. Systems such as the Drug Resistance in Pneumonia (DRIP) score identify patients more likely to require broad-spectrum antibiotics but still lead to overtreatment because they do not target specific pathogens. Rapid diagnostics such as the Unyvero Lower Respiratory Tract Panel (LRTP), combined with the DRIP score, can identify specific pathogens to further narrow antibiotic use.
METHODS
Using an existing patient pool from a clinical trial of the LRTP (NCT01922024), a DRIP score was determined for each patient. When data for the DRIP score were unavailable, a DRIPmax and a DRIPmin were calculated assuming missing data were positive or negative, respectively. The sensitivity and specificity of the DRIP score based on culture and LRTP were determined. An algorithm for antibiotic selection based on the results was applied to each patient.
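The sensitivity and specificity calculations against culture as the reference standard reduce to confusion-matrix arithmetic; the sketch below uses placeholder counts, with the DRIPmax/DRIPmin bounds obtained by re-running the same arithmetic with missing DRIP criteria imputed as positive or negative:

```python
# Sensitivity/specificity from confusion-matrix counts; the counts
# below are hypothetical, not the study's data.
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # true positives / all with resistant pathogen
    specificity = tn / (tn + fp)   # true negatives / all without resistant pathogen
    return sensitivity, specificity

# Placeholder counts: DRIP score positive vs. culture-confirmed
# drug-resistant pathogen.
sens, spec = sens_spec(tp=31, fn=3, tn=56, fp=30)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```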
RESULTS
The sensitivity of the DRIP score vs. culture in this population was 91.2% and the specificity was 65.1%. The DRIP score vs. culture and LRTP combined had a sensitivity of 86.6% and a specificity of 66.3%; the lower sensitivity was mainly due to Stenotrophomonas maltophilia. Applying the algorithm to each patient improved antibiotic choice and provided more efficient use of LRTP resources.
CONCLUSIONS
Using an antibiotic stewardship algorithm combining DRIP score with LRTP data can lead to improved prediction of the presence of drug resistant pathogens and aid in narrowing antibiotics. The LRTP compensates for the DRIP score limitations by identifying the presence of specific antibiotic resistant pathogens. The DRIP score stratifies risk in patients that would benefit from LRTP analysis to narrow antibiotic use. Further study using a prospectively collected cohort with antibiotic adjustment in real time is needed for validation of our results.
Gestational Weight Gain and the Associated Perinatal Outcomes in Middle Eastern Women
Dana Rector, B.S.1,2, Zeynep Alpay Savasan, M.D.2,3
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Maternal and Fetal Medicine, Corewell Health, Royal Oak, Michigan
3Department of OB/GYN, Corewell Health, Royal Oak, Michigan
INTRODUCTION
Gestational Weight Gain (GWG) can impact perinatal outcomes for mothers and infants, especially if the mother gains inadequate or excessive weight during pregnancy. We hypothesized that Middle Eastern women would be more likely to gain inadequate weight, compared to the general population, resulting in adverse perinatal outcomes.
METHODS
A retrospective, IRB-approved chart review was conducted of 255 Middle Eastern women who gave birth at Beaumont Dearborn Hospital from July to December 2019, and their newborns. The mothers’ pre-pregnancy and post-pregnancy weights were used to determine the GWG categories of inadequate (IA), adequate (A), or excessive (E) based on the Institute of Medicine’s (IOM) guidelines. Composite maternal outcomes (gestational hypertension, gestational diabetes, preterm birth, and cesarean birth) and composite neonatal outcomes (hyperbilirubinemia, hypoglycemia, small and large for gestational age) were compared. One-way analysis of variance (ANOVA) and multivariable logistic regression models were used to analyze the data.
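A sketch of the multivariable logistic regression used to produce odds ratios like those reported below, assuming statsmodels; the cohort is randomly generated placeholder data with adequate GWG as the reference category:

```python
# Logistic regression of a composite maternal outcome on GWG category
# and maternal age; data are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "gwg_group": rng.choice(["IA", "A", "E"], size=n),
    "maternal_age": rng.integers(18, 45, size=n),
})
df["composite_maternal"] = rng.integers(0, 2, size=n)  # 0/1 outcome

model = smf.logit(
    "composite_maternal ~ C(gwg_group, Treatment('A')) + maternal_age",
    data=df,
).fit(disp=0)
print(np.exp(model.params))      # odds ratios (adequate GWG as reference)
print(np.exp(model.conf_int()))  # 95% confidence intervals
```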
RESULTS
Among 255 Middle Eastern women, 35% were IA, 34.5% were A, and 30.5% were E. There was no significant difference between groups for composite maternal outcomes (IA vs A and E vs A: OR 0.67, 95% CI: 0.34-1.30; OR 0.76, 95% CI: 0.39-1.51, respectively) or neonatal outcomes (IA vs A and E vs A: OR 0.97, 95% CI: 0.50-1.87; OR 0.82, 95% CI: 0.41-1.63, respectively). Increasing maternal age was significantly associated with maternal outcomes (OR 1.11; P<0.001).
CONCLUSIONS
When compared to other studies, which showed that 21% of the general population gained inadequate weight, Middle Eastern women were more likely to gain inadequate weight (35%). Although risks for adverse pregnancy outcomes vary by GWG, this study did not show an effect on Middle Eastern women, likely due to a small sample size. However, there was an association between maternal age and maternal outcomes.
Evaluation of Phenobarbital in the Treatment of Alcohol Withdrawal in the Intensive Care Unit
Sienna J. Ringgenberg, B.S.1, Vishal K. Patel, M.D.1,2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Internal Medicine, Beaumont Health, Royal Oak, Michigan
INTRODUCTION
Alcohol withdrawal syndrome (AWS) is typically managed using the Clinical Institute Withdrawal Assessment (CIWA) protocol, which includes benzodiazepines, supportive care, and close clinical monitoring. Alternatives to benzodiazepines such as phenobarbital offer several advantages, including less frequent dosing, a longer taper, and additional glutamate inhibition. The purpose of this study is to evaluate the effectiveness and potential benefit of using phenobarbital in addition to the CIWA protocol in the treatment of AWS at Beaumont, Royal Oak.
METHODS
A total of 492 subjects diagnosed with AWS were analyzed. The electronic medical record was queried for adults aged 18-75 years admitted to William Beaumont Hospital, Royal Oak between 2017 and 2021. Subjects were divided into two groups: those who received phenobarbital and those who did not. Intensive care unit (ICU) length of stay (LOS) and hospital LOS were compared using two-sample t-tests. Need for mechanical ventilation and all-cause mortality were analyzed using chi-squared tests. Differences in baseline CIWA scores were controlled for using regression models.
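A minimal sketch of these comparisons follows, assuming hypothetical column names (phenobarb, icu_los, mortality, baseline_ciwa); the abstract does not specify the exact regression form used to adjust for baseline CIWA.

```python
# Sketch of the t-test, chi-squared test, and CIWA-adjusted regression,
# using assumed column names.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("aws_cohort.csv")  # hypothetical file
pheno, ctrl = df[df.phenobarb == 1], df[df.phenobarb == 0]

# Two-sample t-test for ICU length of stay
t, p_los = stats.ttest_ind(pheno.icu_los, ctrl.icu_los)

# Chi-squared test for all-cause mortality
table = pd.crosstab(df.phenobarb, df.mortality)
chi2, p_mort, dof, _ = stats.chi2_contingency(table)

# Logistic regression for mortality, controlling for baseline CIWA score
adj = smf.logit("mortality ~ phenobarb + baseline_ciwa", data=df).fit()
print(adj.summary())
```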
RESULTS
There was no statistically significant difference in ICU LOS, hospital LOS, or need for mechanical ventilation between subjects who received phenobarbital and those who did not. There was a statistically significant difference in all-cause mortality (0.55% in those who received phenobarbital vs 8.68% in those who did not; p=0.000174); however, when controlling for initial CIWA scores, this difference was no longer statistically significant (p=0.085). Subjects who received phenobarbital had a higher average CIWA score on admission (13.296 vs 9.850; p=0.0000139).
CONCLUSIONS
The results do not suggest that the addition of phenobarbital to the standard-of-care CIWA protocol provides a significant decrease in ICU length of stay, hospital length of stay, or need for mechanical ventilation in the treatment of AWS.
Changes in Electrocardiographic and Cardiac Implantable Electronic Device Parameters Following Transcatheter Aortic Valve Replacement
Elizabeth Seeley, B.S.1, Luai Madanat, M.D.2, Nishaki Mehta, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Internal Medicine, Corewell Health, Royal Oak, Michigan
3Department of Cardiology, Corewell Health, Royal Oak, Michigan
INTRODUCTION
Transcatheter aortic valve replacement (TAVR) is known to cause conduction abnormalities leading to the need for permanent pacemaker implantation. However, the impact of TAVR-related conduction abnormalities on cardiac implantable electronic device (CIED) parameters in patients with preexisting devices is not known. We sought to investigate and describe changes in EKG and CIED parameters following TAVR in patients with preexisting CIEDs.
METHODS
We retrospectively reviewed patients with preexisting CIEDs who underwent TAVR at a tertiary care center from 2012 to 2020. EKG and device parameters pre- and post-TAVR were collected. Continuous variables were reported as mean (± SD) and categorical variables as percentages, where appropriate. Paired t-tests were used to compare EKG and device parameters pre- and post-TAVR.
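A minimal sketch of the paired pre/post comparison, shown for QRS duration; the data layout and column names are assumptions, not the authors' actual pipeline.

```python
# Paired t-test on a pre/post parameter, assuming one row per patient
# with hypothetical qrs_pre / qrs_post columns.
import pandas as pd
from scipy import stats

df = pd.read_csv("cied_tavr.csv")  # hypothetical file
t, p = stats.ttest_rel(df["qrs_pre"], df["qrs_post"])  # paired t-test
delta = df["qrs_post"] - df["qrs_pre"]
print(f"mean change {delta.mean():.1f} ms (SD {delta.std():.1f}), p = {p:.4f}")
```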
RESULTS
A total of 113 patients were included. Median time of device interrogation was 50 days pre-TAVR and 1 day post-TAVR. There was an increase in QRS duration (mean 8.9 ms ± 32.2; p = 0.007) and QTc interval (mean 14.9 ms ± 42.5; p = 0.0005). Additionally, there was an increase in right ventricular (RV) pacing (mean 5.9% ± 17.7; p < 0.0001) and RV threshold (mean 0.14 V ± 0.4; p = 0.0048) and a decrease in RV impedance (mean 35.5 Ω ± 72.5; p = 0.0036) post-TAVR. Seven patients (6.2%) experienced an increase in RV sensing burden from <40% pre-TAVR to >40% post-TAVR (mean 51.4% ± 26.9).
CONCLUSIONS
There are significant electrocardiographic and device parameter changes in patients with preexisting CIEDs who undergo TAVR. Incorporating routine post-TAVR device interrogation could enable early detection of clinically meaningful changes.
Dose to the Left Anterior Descending Artery Correlates With Cardiac Events After Irradiation for Breast Cancer
Brittany R. Silverman, B.S.1, Andrew H. Zureick, M.D.2, Vincent P. Grzywacz, M.D./Ph.D.2, Muayad F. Almahariq, M.D./Ph.D.2, Aleksander Vayntraub, M.D.2, Joshua T. Dilworth, M.D./Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Radiation Oncology, Beaumont Health, Royal Oak, Michigan
INTRODUCTION
Although global heart dose has been associated with late cardiac toxic effects in patients who received radiation therapy for breast cancer, data detailing the clinical significance of cardiac substructure dosimetry are limited. We investigated whether dose to the left anterior descending artery (LAD) correlates with adverse cardiac events.
METHODS
We identified 375 consecutively treated female patients from 2012 to 2018 who received left-sided breast or chest wall irradiation (with or without regional nodal irradiation). Medical records were queried to identify cardiac events after radiation therapy. Mean and maximum LAD and heart doses (LAD Dmean, LAD Dmax, heart Dmean, and heart Dmax) were calculated and converted to 2-Gy equivalent doses (EQD2). Univariate and multivariable Cox regression analyses were performed to determine association with cardiac toxic effects. Potential dose thresholds for each of the 4 dose parameters were identified by receiver operating characteristic (ROC) curve analysis, after which Kaplan-Meier analysis was performed to compare cardiac event-free survival based on these constraints.
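A hedged sketch of this pipeline is below. The EQD2 conversion follows the standard linear-quadratic form EQD2 = D(d + α/β)/(2 + α/β); the α/β of 3 Gy (commonly used for late cardiac effects), the column names, and the Youden-index threshold choice are all assumptions not stated in the abstract.

```python
# Sketch of EQD2 conversion, Cox regression, ROC thresholding, and
# Kaplan-Meier stratification, with hypothetical column names.
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter
from sklearn.metrics import roc_curve, roc_auc_score

def eqd2(total_dose, dose_per_fraction, alpha_beta=3.0):
    """2-Gy-per-fraction equivalent dose under the linear-quadratic model."""
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)

df = pd.read_csv("lad_cohort.csv")  # hypothetical file
# approximate the LAD's dose per fraction from its mean dose
df["lad_dmean_eqd2"] = eqd2(df.lad_dmean, df.lad_dmean / df.n_fractions)

# Cox regression: LAD Dmean (EQD2) vs time to first cardiac event
cph = CoxPHFitter().fit(
    df[["months_followup", "cardiac_event", "lad_dmean_eqd2"]],
    duration_col="months_followup", event_col="cardiac_event")

# ROC analysis to choose a dose threshold, then Kaplan-Meier by that cutoff
fpr, tpr, thresholds = roc_curve(df.cardiac_event, df.lad_dmean_eqd2)
cutoff = thresholds[(tpr - fpr).argmax()]  # Youden index
print("AUC:", roc_auc_score(df.cardiac_event, df.lad_dmean_eqd2))

km_high, km_low = KaplanMeierFitter(), KaplanMeierFitter()
high = df.lad_dmean_eqd2 > cutoff
km_high.fit(df.months_followup[high], df.cardiac_event[high], label="above cutoff")
km_low.fit(df.months_followup[~high], df.cardiac_event[~high], label="below cutoff")
```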
RESULTS
Median follow-up time was 48 months. Thirty-six patients experienced a cardiac event, and 23 patients experienced a major cardiac event. On univariate and multivariable analyses, increased LAD Dmean, LAD Dmax, and heart Dmean were associated with increased risk of any cardiac event and a major cardiac event. ROC curve analysis identified a threshold LAD Dmean EQD2 of 2.8 Gy (area under the ROC curve, 0.69), above which the risk for any cardiac event was higher (P = .001). Similar results were seen when stratifying by LAD Dmax EQD2 of 6.7 Gy (P = .005) and heart Dmean EQD2 of 0.8 Gy (P = .01).
CONCLUSIONS
Dose to the LAD correlated with adverse cardiac events in this cohort. Contouring and minimizing dose to the LAD should be considered for patients receiving radiation therapy for left-sided breast cancer.
Comparing overall survival outcomes of patients treated with immunotherapy + chemotherapy or chemotherapy + radiation for non-small cell lung cancer (NSCLC)
Sukhmani Singh, BS1, Ishmael Jaiyesimi, DO1,2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Hematology, Medical Oncology, Beaumont Health, Royal Oak, Michigan
INTRODUCTION
Traditionally, non-small cell lung cancer was treated with chemotherapy or surgery, which remained first-line treatment until around 2015, when PD-L1 immune checkpoint inhibitor immunotherapy was introduced and became the standard first-line treatment for many patients. Studies comparing outcomes between patients receiving immunotherapy and those receiving primarily chemotherapy as first-line treatment indicate a survival benefit; however, because immunotherapy is still relatively new, its longer-term survival benefit and the contribution of other factors are still being delineated.
METHODS
A retrospective chart review was conducted using data from 2010-2018 drawn from Beaumont’s electronic medical record system, Epic. Data include hospital visits as well as visits to outpatient clinics within Beaumont’s health system. Study participants include adults treated for advanced non-small cell lung cancer with immunotherapy (alone or combined with chemotherapy) or with chemotherapy alone.
RESULTS
Of 173 patients analyzed, 86 were in the immunotherapy + chemotherapy [immunotherapy] group and 87 were in the chemotherapy + radiation [chemotherapy] group. The immunotherapy group showed increased overall survival (defined as the interval between treatment start date and date of death), with a mean of 17.42 months (533.23 days), compared with the chemotherapy group, with a mean of 14.73 months (448.53 days; p=0.22).
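The abstract reports group means and a p-value without naming the statistical test; one plausible reading is a two-sample t-test on overall survival in days, sketched here with hypothetical column names.

```python
# Possible reconstruction of the overall-survival comparison; the test
# and all column names are assumptions.
import pandas as pd
from scipy import stats

df = pd.read_csv("nsclc_cohort.csv")  # hypothetical file
df["os_days"] = (pd.to_datetime(df.date_of_death)
                 - pd.to_datetime(df.treatment_start)).dt.days
immuno = df.loc[df.group == "immunotherapy", "os_days"]
chemo = df.loc[df.group == "chemotherapy", "os_days"]
t, p = stats.ttest_ind(immuno, chemo)
print(f"mean OS {immuno.mean():.0f} vs {chemo.mean():.0f} days, p = {p:.2f}")
```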
CONCLUSIONS
The study showed improved overall survival in the immunotherapy group compared with the chemotherapy group for NSCLC; however, the difference was not statistically significant (p=0.22), and the results were limited by sample size. Randomized controlled trials comparing immune checkpoint inhibitors targeting the PD-1 receptor against traditional chemotherapy have shown statistically significant improvements in overall survival and progression-free survival, and fewer adverse effects, in patients receiving immunotherapy for NSCLC.
Influence of Pretreatment Magnetic Resonance Imaging on Local Therapy Decisions for Intermediate-Risk Prostate Cancer Patients
Christian Skowronski, B.S.1, Andrew Shanholtzer, B.S.1, Brent Yelton, B.S.1, Muayad Almahariq, M.D.2, Daniel Krauss, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Radiation Oncology, Beaumont Health, Royal Oak, Michigan
INTRODUCTION
Prostate cancer has the third-highest incidence rate and is the second-leading cause of cancer death among men in the United States. Magnetic resonance imaging (MRI) provides superior soft tissue delineation, serving as a valuable tool for both diagnosis and treatment planning. Given the minimal data regarding its utility in diagnosis and treatment planning for intermediate-risk prostate cancer, the National Comprehensive Cancer Network’s guidelines list MRI as optional in intermediate-risk prostate cancer evaluation. This project aims to elucidate whether MRI affects radiation treatment decisions for intermediate-risk prostate cancer.
METHODS
This retrospective study evaluated 210 patients with intermediate-risk prostate cancer treated with definitive radiotherapy at our institution between 2019 and 2020. NCCN risk stratification criteria were used to define intermediate-risk prostate cancer. Patients were divided into two groups: those with and without a pretreatment prostate MRI. We compared the use of external beam radiotherapy, brachytherapy alone, brachytherapy boost, and androgen deprivation therapy. Inverse probability of treatment weighting was used to match the two groups on confounding variables. Wilcoxon rank sum and chi-squared tests were used to compare continuous and categorical variables, respectively.
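A minimal sketch of inverse probability of treatment weighting as described; the propensity covariates (age, psa, gleason_grade) and column names are illustrative assumptions, since the abstract does not list the confounders used.

```python
# IPTW sketch: estimate propensity of having a pretreatment MRI, then
# weight each group to balance the assumed confounders.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("prostate_cohort.csv")  # hypothetical file
covariates = ["age", "psa", "gleason_grade"]          # assumed confounders
ps_model = LogisticRegression().fit(df[covariates], df["had_mri"])
ps = ps_model.predict_proba(df[covariates])[:, 1]     # propensity scores

# IPTW: treated weighted by 1/ps, untreated by 1/(1 - ps)
df["iptw"] = df.had_mri / ps + (1 - df.had_mri) / (1 - ps)

# Weighted proportion receiving, e.g., brachytherapy in each group
rates = df.groupby("had_mri").apply(
    lambda g: (g.brachytherapy * g.iptw).sum() / g.iptw.sum())
print(rates)
```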
RESULTS
Of the patients who met eligibility criteria, 133 had a prostate MRI and 77 did not. Following propensity weighting, there were no differences in baseline characteristics between the two groups. There were no statistically significant differences in treatments pursued between the two groups, including brachytherapy alone, external beam radiotherapy alone, external beam radiotherapy with a brachytherapy boost, and androgen deprivation therapy.
CONCLUSIONS
This analysis suggests pretreatment MRI does not significantly impact radiation therapy or androgen deprivation therapy decisions in patients with intermediate-risk prostate cancer. Obtaining a pretreatment prostate MRI should be judicious and pursued only to answer a specific question whose answer is likely to impact treatment decisions. Further follow-up is needed to correlate MRI findings with specific oncologic outcomes.
Obesity and Metabolic Syndrome Impact Outcomes After Total Knee and Hip Arthroplasty
Christeena Twal, B.S.1, Mouhanad M. El-Othmani, M.D.2, Jacob Keely1, Abdul Zalikha, M.D.3, Inaya Hajj-Hussein, Ph.D.4
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Columbia University Medical Center, New York, NY
3Detroit Medical Center, Detroit, Michigan
4Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, Michigan
INTRODUCTION
Total joint arthroplasty (TJA) of the hip (THA) and knee (TKA) relieves symptoms, improves function, and restores quality of life in patients with end-stage osteoarthritis, and these procedures are among the most frequently performed in the United States. Although several studies have assessed the independent correlation of obesity or metabolic syndrome (MetS) with TJA outcomes, literature highlighting the interplay between these two comorbidities is sparse. The purpose of this study is to (1) evaluate the combined impact of metabolic syndrome and obesity on immediate in-hospital outcomes and complications after TJA and (2) analyze resource utilization among patients with obesity and metabolic syndrome.
METHODS
A retrospective analysis was conducted using hospital discharge data from 2006 to the third quarter of 2015 from the National Inpatient Sample (NIS). Patients who underwent a primary THA or primary TKA and were at least 40 years old were included in our study. Patients were then further stratified into two groups: obese patients without a concomitant diagnosis of metabolic syndrome and obese patients with a concomitant diagnosis of metabolic syndrome. Patient demographics, hospital length of stay, discharge disposition, and inpatient complication and economic outcomes were compared using weighted cohorts. The analysis of continuous and categorical data was conducted using t-tests and univariate logistic regressions, respectively.
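A hedged sketch of a survey-weighted univariate logistic regression of the kind described, assuming the NIS discharge-weight column (discwt) and illustrative outcome and exposure names.

```python
# Weighted univariate logistic regression on NIS-style discharge data;
# column names are assumptions.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("nis_tja.csv")  # hypothetical extract
X = sm.add_constant(df["mets"])  # MetS yes/no among obese TJA patients

# Univariate logistic regression using the NIS discharge weights
model = sm.GLM(df["any_complication"], X,
               family=sm.families.Binomial(),
               freq_weights=df["discwt"]).fit()
print(model.summary())  # exp(coef) on `mets` is the odds ratio of interest
```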
RESULTS
Patients with obesity and MetS had significantly higher rates of any complication and of postoperative anemia, and were significantly less likely to be discharged home (rather than to a rehabilitation facility) compared with patients with obesity but without MetS.
CONCLUSIONS
The prevalence of MetS is increasing globally, and it has become a major public health concern. This study demonstrates that obese patients with MetS have significantly worse in-hospital outcomes than those without it.
Urodynamic Characteristics of Patients with Urge Incontinence treated with Sacral Neuromodulation and Adjunct Botulinum Toxin Injection
Brent Yelton, M.D.1, Jason Gilleran, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Urology at Beaumont Royal Oak, Royal Oak, Michigan
INTRODUCTION
Sacral neuromodulation (SN) and intradetrusor botulinum toxin (BTX) are effective treatments for refractory overactive bladder (OAB) and urge incontinence (UI).1 The decision of which treatment to use is based on several factors. Urodynamics (UDS) have not been proven to predict who will respond to SN. Detrusor overactivity (DO) suggests that BTX may be more effective, given its direct action on the detrusor muscle. The purpose of this study is to determine whether the presence and severity of DO on UDS are associated with failure of SN and response to BTX.
METHODS
This is a retrospective review of patients with OAB and UI who failed SN and were subsequently treated with BTX, with complete UDS for review. All UDS were performed off OAB medication, and prior to the first BTX injection or greater than 6 months following the most recent injection. Tracings with or without fluoroscopy were reviewed by a single clinician for presence of DO, bladder volume at first DO, presence of urine leak, max DO amplitude, stress incontinence, and maximum cystometric capacity (MCC).
RESULTS
We identified 53 subjects (39 female; mean age 66.2 years) who underwent SN (50 sacral, 9 pudendal) between 2007 and 2021. Mean time from implant to first BTX was 32.1 months (range 3-90). The mean number of BTX injections was 3.4 (range 1-14). DO occurred in 40/53 (75.4%) of UDS tracings, with leak in 32/53 (60.3%). Stress leak occurred in only 3 subjects. Mean volume at first DO was 166.4 mL (range 13-581), and mean MCC was 295.6 mL (range 81-643). Mean amplitude at first DO was 38.3 cm H2O, and maximum DO amplitude was 52.3 cm H2O.
CONCLUSIONS
The presence of DO at high amplitude (>25 cm H2O) and with leak on UDS is associated with worse UI that does not respond to SN monotherapy.
Fatigue and Dietary Habits of Medical Students
Aidan Zubak, B.S.1, Virginia Uhley, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, Michigan
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, Michigan
INTRODUCTION
Fatigue is an often overlooked, detrimental reaction experienced by many medical students during the arduous process of medical school, and it may impact academic performance. The goal of this study was to investigate whether dietary intake habits are associated with the fatigue levels reported by medical students.
METHODS
An online Qualtrics survey was developed to assess the fatigue levels and dietary habits of OUWB medical students in their M3 and M4 years, based on validated questionnaires (the Michielsen et al. Fatigue Assessment Scale (FAS) and the MIND (Mediterranean-DASH) FFQ). Twenty-four students completed the survey. Respondents were separated into two groups, unhealthy and healthy dietary habits, based on their survey responses. Differences between the two groups in various dietary components and habits, as well as in fatigue levels, were assessed using a two-sample t-test. Correlation analysis and evaluation of median FAS scores for each independent dietary habit variable were also performed.
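A minimal sketch of the group comparison, using the FAS bands given in the results section below; the DataFrame, column names, and the healthy/unhealthy labeling rule are assumptions.

```python
# FAS banding and two-group comparison, with hypothetical column names.
import pandas as pd
from scipy import stats

df = pd.read_csv("fas_survey.csv")  # hypothetical file

def fas_band(score: int) -> str:
    """Map a total FAS score to the fatigue bands used in the results."""
    if score < 22:
        return "normal"
    return "moderate" if score <= 34 else "severe"

df["fatigue_band"] = df.fas_score.apply(fas_band)
healthy = df.loc[df.diet == "healthy", "fas_score"]
unhealthy = df.loc[df.diet == "unhealthy", "fas_score"]
t, p = stats.ttest_ind(healthy, unhealthy)      # two-sample t-test per methods
u, p_rank = stats.ranksums(healthy, unhealthy)  # rank-based alternative
print(f"t-test p = {p:.3f}; rank-sum p = {p_rank:.3f}")
```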
RESULTS
The median FAS score was 23. FAS scores below 22 are normal, scores between 22 and 34 indicate moderate fatigue, and scores of 35 or more indicate severe fatigue. The Wilcoxon signed rank p-value of 0.0822 did not indicate a statistically significant difference in FAS score distributions between the healthy and unhealthy groups. Correlation analysis did, however, identify a statistically significant relationship between intake of fish and whole grains and fatigue, with differences in median FAS scores between low and high serving sizes.
CONCLUSIONS
Reported dietary intake levels of fish and whole grains showed a significant association with reported fatigue levels, suggesting that dietary intake is an important variable for medical students experiencing fatigue to consider.