Social care nurses practise at the intersection of health and social care, supporting people with complex, long-term and often fluctuating needs. Despite their central role in care coordination, clinical oversight and safeguarding, social care nursing remains under-recognised in policy, education and workforce planning in England. This article examines the case for recognising social care nursing as a form of specialist community nursing practice. The authors situate social care nursing alongside district nursing and general practice nursing, highlighting shared capabilities in risk management, coordination and autonomous decision making outside hospital environments. International comparisons illustrate how clearer credentials and development pathways can strengthen specialist identity. The article concludes that reframing social care nursing as part of the community nursing workforce is essential to improving professional recognition, education pathways, workforce sustainability and the quality and safety of care.
Publications
2026
PURPOSE: To identify structural alterations of Descemet's membrane (DM) in bullous keratopathy (BK) and to explore their association with intracellular dark endothelial spots (IDESs) observed by specular microscopy.
METHODS: This multicenter, retrospective, observational study included 75 eyes that underwent endothelial keratoplasty for corneal endothelial dysfunction. Based on preoperative clinical diagnosis, eyes were classified as having Fuchs endothelial corneal dystrophy (FECD)-related or non-FECD BK. DM specimens were collected during endothelial keratoplasty and analyzed as flat-mounted preparations using phase-contrast microscopy. IDESs were evaluated preoperatively by masked assessment using specular microscopy.
RESULTS: Among the 75 eyes, 25 were clinically diagnosed with FECD. Of the remaining 50 eyes with BK, 15 showed no characteristic histological abnormalities, 11 exhibited guttae-like changes, and 24 demonstrated a distinct and previously unrecognized DM alteration, termed dome-shaped protrusions (DSPs). Specular microscopy images suitable for IDES evaluation were available in a subset of cases. IDESs were detected in 10 of 18 DSP-positive eyes and in 1 of 10 DSP-negative eyes, indicating a significant association between DSPs and IDESs (odds ratio 11.25; P < 0.05).
CONCLUSIONS: DSPs represent a distinct structural alteration of the DM in non-FECD BK and are significantly associated with IDESs observed by specular microscopy. These findings provide insight into the heterogeneity of endothelial failure in BK and suggest a link between clinical imaging findings and underlying DM morphology.
TRANSLATIONAL RELEVANCE: Dome-shaped protrusions provide a histopathological correlate for intracellular dark endothelial spots observed by specular microscopy and may support improved phenotyping of endothelial dysfunction in non-Fuchs endothelial corneal dystrophy bullous keratopathy.
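As a quick arithmetic check (illustrative only, not part of the study's analysis), the reported odds ratio of 11.25 follows directly from the 2×2 counts given in the results: 10 of 18 DSP-positive eyes versus 1 of 10 DSP-negative eyes showed IDESs.

```python
# Odds ratio from the 2x2 counts reported in the abstract:
# DSP-positive: 10 of 18 eyes had IDESs; DSP-negative: 1 of 10 eyes.
def odds_ratio(a, b, c, d):
    """OR = (a/b) / (c/d): a, b = exposed with/without the outcome;
    c, d = unexposed with/without the outcome."""
    return (a / b) / (c / d)

or_dsp = odds_ratio(10, 18 - 10, 1, 10 - 1)
print(or_dsp)  # 11.25, matching the reported value
```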
A single experiment evaluated younger and older observers' ability to judge angular spatial relationships in an ordinary outdoor environment. Previous research from multiple laboratories has found that the visual ability to perceive distance is either well-maintained or improves with advancing age. The present experiment investigated whether this age-related equivalence or superiority also occurs for other spatial abilities, such as the ability to judge angles. Thirty adults judged 12 angles formed from trees, signs, light poles, and stone benches. The observers' overall performance was good: 74% of the variance in the judged angles could be accounted for by variance in the physical stimulus angles (overall Pearson r correlation coefficient was 0.86). The judgments of the older observers were nevertheless more accurate than those made by the younger observers (Cohen's d was 0.72). A detailed analysis of the observers' judgments revealed consistent local distortions of particular stimulus angles such that some angles were perceived to be much larger than they actually were (physically), whereas other stimulus angles were perceived to be much smaller than their physical magnitude. These local distortions in perceived angle magnitude may be related to the presence of environmental features that are associated with (linear) perspective.
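The variance-explained figure is internally consistent with the reported correlation, since the proportion of variance accounted for is the square of Pearson's r. A one-line check (illustrative only):

```python
# Proportion of variance explained equals the square of Pearson's r.
r = 0.86
r_squared = r ** 2
print(round(r_squared, 2))  # 0.74, matching the ~74% reported
```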
PURPOSE: To evaluate the independent and joint associations of the frailty index (FI) and the composite dietary antioxidant index (CDAI) with major ocular diseases and to quantify the contributions of their components.
METHODS: We analyzed 4455 U.S. adults aged ≥40 years from NHANES 2005-2008. Frailty was defined using a 49-item deficit accumulation FI (FI > 0.21 considered frail), and CDAI was derived from six antioxidants (carotenoids, vitamins A, C, E, selenium, zinc). Weighted multivariable logistic regression, variable importance, restricted cubic spline analyses, and sensitivity analyses assessed associations with retinopathy, cataract, diabetic retinopathy (DR), glaucoma, and age-related macular degeneration.
RESULTS: Higher FI was associated with higher odds of all major ocular diseases (any ocular disease: odds ratio [OR] = 1.35; DR: OR = 2.21; glaucoma: OR = 1.34), whereas higher CDAI was associated with lower odds (any ocular disease: OR = 0.97; glaucoma: OR = 0.91). Participants with high FI and low CDAI had the highest odds, based on predefined cutoff categories (any ocular disease: OR = 2.22; DR: OR = 4.92; cataract: OR = 2.50; glaucoma: OR = 3.18; all P < 0.05). Exploratory analyses showed that CDAI contributions varied by disease, whereas chronic disease burden dominated among FI domains.
CONCLUSIONS: Higher FI and lower CDAI were associated with major ocular diseases both independently and jointly, highlighting the relevance of considering frailty status and dietary antioxidant profiles together in ocular health.
TRANSLATIONAL RELEVANCE: Higher FI and lower CDAI show combined associations with major ocular diseases, emphasizing their relevance to vision-related health.
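The deficit-accumulation frailty index described in the methods is straightforward to compute: the FI is the proportion of measured deficits present, and FI > 0.21 classifies a participant as frail. A minimal sketch under those definitions; the 49 NHANES items are not listed in the abstract, so the deficit values below are hypothetical:

```python
# Deficit-accumulation frailty index: proportion of deficits present.
# The 0.21 cutoff comes from the abstract; the deficit scores below
# are made up for illustration (1 = deficit present, 0 = absent;
# fractional scoring is also common for ordinal items).
FRAILTY_CUTOFF = 0.21

def frailty_index(deficits):
    """Return FI = (sum of deficit scores) / (number of items)."""
    return sum(deficits) / len(deficits)

# Example: 12 of 49 hypothetical deficits present.
deficits = [1] * 12 + [0] * 37
fi = frailty_index(deficits)
print(fi > FRAILTY_CUTOFF)  # True: 12/49 is about 0.245
```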
BACKGROUND AND AIMS: Climate change in Mediterranean ecosystems is lengthening summer droughts towards the wet seasons (autumn and/or spring). These seasonal rains are critical to ensure post-fire recovery, especially for obligate-seeder species since they exclusively rely on germination success. While shifts in drought seasonality after fire could constrain regeneration, drought events can filter out more drought-tolerant species in adult communities. Yet the role of drought as a selective filter during the first wet seasons after fire in obligate seeders remains unclear.
METHODS: We conducted an experimental fire at three sites and then experimentally extended the post-fire summer drought, either by delaying its end into autumn or by advancing its onset into the following spring. Four and six years after the fire, we measured key life-history traits (height and fruit production) and leaf traits related to water-use strategy in the most abundant post-fire species, Cistus albidus.
KEY RESULTS: Four years after the fire, plants surviving autumn drought were taller and produced more fruits than those subjected to spring drought. Despite this higher resource investment, six years after the fire these individuals also exhibited leaf traits (e.g., specific leaf area, SLA) that enhance drought tolerance. Advancing the summer drought into spring led to the opposite pattern. Further analysis revealed that the drought treatments played a more relevant role than, for example, plant density in explaining key traits such as SLA.
CONCLUSIONS: Our findings suggest that a drought during the first post-fire autumn can filter individuals with traits directly related to fitness and survival under drought. Moreover, we conclude that more than drought during recovery, the timing of drought in this phase is crucial to generate intrapopulation filtering. This mechanism could have important implications for obligate seeders, which rely on replenishing their seedbank to persist in the face of disturbances.
Musculoskeletal disorders among hairdressing professionals result from a combination of ergonomic and organizational factors and heavy workloads that exceed the original nature of the trade. Sustained postures, prolonged working hours, and the inadequate design of workspaces and unconventional furniture all contribute to this risk. From this perspective, it is essential that hairdressing no longer be conceived as an inevitably injurious activity, but instead be approached as a work environment open to improvement, with attention not only to the physical setting but also to the people who practise the profession.
Uncontrolled hypertension can result from untreated high blood pressure (BP) or the inefficacy of established antihypertensive therapeutic regimens. Renal denervation (RDN) is a nonpharmacologic catheter-based intervention that achieves targeted renal sympathetic nerve ablation to modulate sympathetic activation. RDN is suitable for those with uncontrolled primary hypertension, resistant to therapy or intolerant to drugs, and who have a favorable renal artery anatomy. Long-term data demonstrate RDN's efficacy in significantly reducing elevated BP. RDN procedures have shown a good safety profile, and no significant difference in adverse events has been reported between RDN-treated and control groups in most clinical trials. Thus, RDN offers an effective and safe approach for sustained BP control.
Physical examination is pivotal for identifying clues to the disease and making a provisional diagnosis. The respiratory examination is considered one of the most difficult systemic examinations by undergraduate and postgraduate residents, and little well-defined literature is available on the ideal method and interpretation of respiratory examination findings. Experts often ask questions whose answers are hardly found in the literature. This review presents 30 important questions with the best possible answers, including expert questions from top institutes, that are important for the respiratory examination and will help students at all levels (MBBS/MD/DNB/DM) excel in their practical examinations.
INTRODUCTION: Academic overdose (AO) is a state of mental and emotional saturation from constant academic input, to the point that learning and productivity decline; it may lead to mental exhaustion and burnout, affecting quality of life (QOL). Medical conferences (MCs) are essential for knowledge dissemination, academic recognition, and professional transformation. AO stems from the pressure to present research, to network, and to meet demanding clinical and academic responsibilities. Adding to this are unlimited, exhaustive, and often irritating queries from patients and attendants arising from internet searches.
DISCUSSION: In recent years, the frequency of MCs and continuing medical education (CME) events has increased at local, national, and international levels. While this growth offers educational opportunities, it has also led to content redundancy, extended sessions, and a lack of audience engagement. Healthcare professionals (HCPs) face high academic expectations across multiple domains, such as position, sustainability, promotion, and excellence in clinical practice, while also carrying scholarly, educational, and administrative responsibilities. Balancing these is highly challenging and may lead to emotional exhaustion and burnout, exacerbated by preparation for MC presentations. MCs have various advantages and disadvantages and require structural reform to attract more participants and to be recognized as meeting very high standards. Restructuring of MCs therefore seems logical: they must remain accessible, affordable, and academically oriented.
CONCLUSION: MCs offer learning, innovation, professional networking, and opportunities for sharing knowledge and experience, but they need to become more inclusive, ethical, cost-effective, and image-building. The associated risks of exhaustion, sleep deprivation, burnout, and financial strain necessitate restructuring of MCs.
BACKGROUND: Global efforts to reduce tuberculosis (TB) are severely hampered by stigma. With a high burden of TB infections, India struggles with widespread stigma surrounding the illness, which makes it difficult to diagnose and treat patients promptly. To shed light on an important but often ignored component of TB management, we estimated the prevalence of TB-related stigma and the variability of its manifestation across different groups.
METHODS: After calculating the sample size, we stratified participants into three groups: patients with TB, healthcare workers providing TB services, and family members living with the patients. A validated, predesigned questionnaire was used to assess stigma across various domains. Data were compiled in MS Excel and analyzed with Epi Info 7.
RESULTS: Healthcare workers made up the largest percentage of those who experienced stigma (11.78%), followed by family members (8.91%) and patients (6.05%). The association between stigma and participant group was statistically significant, implying that stigma manifests differently across the groups. Most patients who perceived stigma did so at home (3.50%), whereas family members (5.41%) and healthcare workers (7.96%) faced stigma mainly in the community.
CONCLUSION: TB-related stigma is rooted in varied societal perceptions, as societal norms determine which behaviors are deemed acceptable and which undesirable. Our study reveals major roadblocks on the way to TB eradication in the country and paints a picture that can likely be extrapolated to most communities. Reducing stigma will, in turn, improve treatment-related outcomes in TB and pave the way for smoother management and eradication.