Study Design: Literature review. Objectives: Paraspinal muscle integrity is believed to play a critical role in low back pain (LBP), numerous spinal deformity diseases, and other pain pathologies. The influence of paraspinal muscle atrophy (PMA) on the clinical and radiographic success of spinal surgery has not been established. We aim to survey the literature to evaluate the impact of PMA on low back pain, spine pathologies, and postoperative outcomes of spinal surgery. Methods: A review of the literature was conducted using a total of 267 articles identified from a search of the PubMed database and additional resources. A full-text review was conducted of 180 articles, which were assessed based on criteria that included an objective assessment of PMA in addition to measuring its relationship to LBP, thoracolumbar pathology, or surgical outcomes. Results: A total of 34 studies were included in this review. The literature on PMA illustrates an association between LBP and both decreased cross-sectional area and increased fatty infiltration of the paraspinal musculature. Atrophy of the erector spinae and psoas muscles has been associated with spinal stenosis, isthmic spondylolisthesis, facet arthropathy, and degenerative lumbar kyphosis. A number of studies have also demonstrated an association between PMA and worse postoperative outcomes. Conclusions: PMA is linked to several spinal pathologies, and some studies demonstrate an association with worse postoperative outcomes following spinal surgery. Further research is needed to establish the relationship between preoperative paraspinal muscle integrity and postoperative success, with the potential to guide surgical decision making.
The therapeutic significance of the timing of decompression in acute traumatic central cord syndrome (ATCCS) caused by spinal stenosis remains unsettled. We retrospectively examined a homogeneous cohort of patients with ATCCS and magnetic resonance imaging (MRI) evidence of post-treatment spinal cord decompression to determine whether timing of decompression played a significant role in American Spinal Injury Association (ASIA) motor score (AMS) 6 months following trauma. We used the t test, analysis of variance, Pearson correlation coefficient, and multiple regression for statistical analysis. During a 19-year period, 101 patients with ATCCS, admission ASIA Impairment Scale (AIS) grades C and D, and an admission AMS of ≤95 were surgically decompressed. Twenty-four of 101 patients had an AIS grade C injury. Eighty-two patients were male, the mean age was 57.9 years, and 69 patients had sustained a fall. AMS at admission was 68.3 (standard deviation [SD] 23.4); upper extremities (UE) 28.6 (SD 14.7), and lower extremities (LE) 41.0 (SD 12.7). AMS at the latest follow-up was 93.1 (SD 12.8), UE 45.4 (SD 7.6), and LE 47.9 (SD 6.6). Mean number of stenotic segments was 2.8, mean canal compromise was 38.6% (SD 8.7%), and mean intramedullary lesion length (IMLL) was 23 mm (SD 11). Thirty-six of 101 patients had decompression within 24 h, 38 patients had decompression between 25 and 72 h, and 27 patients had decompression >72 h after injury. Demographics, etiology, AMS, AIS grade, morphometry, lesion length, surgical technique, steroid protocol, and follow-up AMS were not statistically different between groups treated at different times. We analyzed the effect size of timing of decompression both categorically and continuously. There was no significant effect of the timing of decompression on follow-up AMS. Only AMS at admission determined AMS at follow-up (coefficient = 0.31; 95% confidence interval [CI]: 0.21; p = 0.001).
We conclude that timing of decompression in ATCCS caused by spinal stenosis has little bearing on ultimate AMS at follow-up.
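The regression finding above (admission AMS predicts follow-up AMS; timing of decompression does not) can be sketched with a toy ordinary least squares fit. All numbers and variable names below are synthetic illustrations, not the study's data; the outcome is constructed so that only the admission score carries signal.

```python
# Illustrative multiple regression sketch with synthetic data: follow-up motor
# score modeled on admission score and hours to decompression. The timing
# covariate is constructed to have no true effect, mirroring the null finding.
import numpy as np

rng = np.random.default_rng(1)
n = 101                                       # cohort size from the abstract
ams_admit = rng.uniform(20, 95, n)            # toy admission AMS values
hours_to_decompression = rng.uniform(6, 120, n)
ams_followup = 70 + 0.3 * ams_admit + rng.normal(0, 5, n)  # no timing term

# Design matrix with intercept; ordinary least squares via lstsq
X = np.column_stack([np.ones(n), ams_admit, hours_to_decompression])
beta, *_ = np.linalg.lstsq(X, ams_followup, rcond=None)
print(f"admission-AMS coefficient: {beta[1]:.2f}")  # ~0.3 by construction
print(f"timing coefficient: {beta[2]:.3f}")          # ~0 by construction
```

In a real analysis the coefficient on timing would be tested against zero; here it is near zero because the synthetic outcome omits it by design.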
BACKGROUND Patients who survive aneurysmal subarachnoid hemorrhage (aSAH) are at risk for delayed neurological deficits (DND) and cerebral infarction. In this exploratory cohort comparison analysis, we compared in-hospital outcomes of aSAH patients administered a low-dose intravenous heparin (LDIVH) infusion (12 U/kg/h) vs those administered standard subcutaneous heparin (SQH) prophylaxis for deep vein thrombosis (DVT; 5000 U, 3 × daily). OBJECTIVE To assess the safety and efficacy of LDIVH in aSAH patients. METHODS We retrospectively analyzed 556 consecutive cases of aSAH patients whose aneurysm was secured by clipping or coiling at a single institution over a 10-yr period, including 233 administered the LDIVH protocol and 323 administered the SQH protocol. Radiological and outcome data were compared between the 2 cohorts using multivariable logistic regression and propensity score-based inverse probability of treatment weighting (IPTW). RESULTS The unadjusted rate of cerebral infarction in the LDIVH cohort was half that in the SQH cohort (9% vs 18%; P = .004). Multivariable logistic regression showed that patients in the LDIVH cohort were significantly less likely than those in the SQH cohort to have DND (odds ratio [OR] 0.53; 95% CI: 0.33, 0.85) or cerebral infarction (OR 0.40; 95% CI: 0.23, 0.71). Analysis following IPTW showed similar results. Rates of hemorrhagic complications, heparin-induced thrombocytopenia, and DVT were not different between cohorts. CONCLUSION This cohort comparison analysis suggests that LDIVH infusion may favorably influence the outcome of patients after aSAH. Prospective studies are required to further assess the benefit of LDIVH infusion in patients with aSAH.
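The IPTW adjustment named in the abstract above can be sketched in a few steps: fit a propensity model for treatment assignment, invert the fitted probabilities into weights, and compare weighted outcomes. The sketch below is a hedged toy, with synthetic data and covariate names that are assumptions, not the study's variables or model.

```python
# Minimal illustrative sketch of propensity score-based inverse probability of
# treatment weighting (IPTW). All data here are synthetic toys; the propensity
# model is a tiny logistic regression fit by gradient descent (numpy only).
import numpy as np

rng = np.random.default_rng(0)
n = 500
age = rng.normal(55, 12, n)                  # toy covariate
grade = rng.integers(1, 5, n).astype(float)  # toy severity grade
treated = rng.integers(0, 2, n)              # 1 = LDIVH, 0 = SQH (toy labels)
outcome = rng.integers(0, 2, n)              # toy binary outcome (infarction)

# 1) Fit a propensity model P(treated | covariates) on standardized covariates.
X = np.column_stack([age, grade])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
X1 = np.column_stack([np.ones(n), Xs])       # intercept column
w = np.zeros(X1.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X1 @ w))
    w -= 0.1 * X1.T @ (p - treated) / n      # mean log-loss gradient step
ps = 1.0 / (1.0 + np.exp(-X1 @ w))           # estimated propensity scores

# 2) Weights: 1/ps for treated patients, 1/(1 - ps) for controls.
weights = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))

# 3) IPTW-weighted risk difference between the two cohorts.
risk_treated = np.sum(weights * treated * outcome) / np.sum(weights * treated)
risk_control = (np.sum(weights * (1 - treated) * outcome)
                / np.sum(weights * (1 - treated)))
effect = risk_treated - risk_control
print(f"IPTW-adjusted risk difference: {effect:.3f}")
```

The weighting creates a pseudo-population in which measured covariates are balanced across cohorts, which is why the abstract reports the IPTW analysis alongside multivariable regression as a robustness check.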
Hemorrhage in the central nervous system (CNS), including intracerebral hemorrhage (ICH), intraventricular hemorrhage (IVH), and aneurysmal subarachnoid hemorrhage (aSAH), remains highly morbid. Trials of medical management for these conditions over recent decades have been largely unsuccessful in improving outcome and reducing mortality. Beyond its role in creating mass effect, the presence of extravasated blood in patients with CNS hemorrhage is largely overlooked. Because trials of surgical intervention to remove CNS hemorrhage have been mostly unsuccessful, the potent neurotoxicity of blood is often viewed as a basic scientific curiosity rather than a clinically meaningful factor. In this review, we evaluate the direct role of blood as a neurotoxin and its subsequent clinical relevance. We first describe the molecular mechanisms of blood neurotoxicity. We then evaluate the clinical literature that directly relates to the evacuation of CNS hemorrhage. We posit that the efficacy of clot removal is a critical factor in outcome following surgical intervention. Future interventions for CNS hemorrhage should be guided by the principle that blood is exquisitely toxic to the brain.
Expansion duraplasty to reopen effaced subarachnoid space and improve spinal cord perfusion, autoregulation, and spinal pressure reactivity index (sPRX) has been advocated in patients with traumatic cervical spinal cord injury (tCSCI). We designed this study to identify candidates for expansion duraplasty, based on the absence of a cerebrospinal fluid (CSF) interface around the spinal cord on magnetic resonance imaging (MRI), in the setting of otherwise adequate bony decompression. Over a 61-month period, 104 consecutive American Spinal Injury Association Impairment Scale (AIS) grade A–C patients with tCSCI had post-operative MRI to assess the adequacy of surgical decompression. Their mean age was 53.4 years, and 89% were male. Sixty-one patients had falls, 31 motor vehicle collisions, 11 sport injuries, and one an assault. The AIS grade was A in 56, B in 18, and C in 30 patients. Fifty-four patients had fracture dislocations; there was no evidence of skeletal injury in 50 patients. Mean intramedullary lesion length (IMLL) was 46.9 (standard deviation = 19.4) mm. Median time from injury to decompression was 17 h (interquartile range 15.2 h). After surgery, 94 patients had adequate decompression as judged by the presence of CSF anterior and posterior to the spinal cord, whereas 10 patients had effacement of the subarachnoid space at the injury epicenter. In two patients whose decompression was not definitive and post-operative MRI indicated inadequate decompression, expansion duraplasty was performed. Candidates for expansion duraplasty (i.e., those with inadequate decompression) were significantly younger (p < 0.0001), were AIS grade A (p < 0.0016), had either sport injuries (six patients) or motor vehicle collisions (three patients) (p < 0.0001), had fracture dislocation (p = 0.00016), and had longer IMLL (p = 0.0097). In regression models, patients with sport injuries and inadequate decompression were suitable candidates for expansion duraplasty (p = 0.03).
Further, 9.6% of patients failed bony decompression alone and either benefited (two patients) or would have benefited (eight patients) from expansion duraplasty.