Removing ovaries and fallopian tubes linked to lower risk of early death among certain breast cancer patients
Women with certain variants of the genes BRCA1 and BRCA2 have a high risk of developing ovarian and breast cancer. These women are recommended to have their ovaries and fallopian tubes removed at a relatively early age – between the ages of 35 and 40 for BRCA1 carriers, and between 40 and 45 for BRCA2 carriers.
This procedure, known as bilateral salpingo-oophorectomy (BSO), has previously been shown to lead to an 80% reduction in the risk of developing ovarian cancer among these women, but there is concern that there may be unintended consequences of removing the body’s main source of oestrogen, which brings on early menopause. This can be especially challenging for BRCA1 and BRCA2 carriers with a history of breast cancer, as they may not be able to receive hormone replacement therapy to manage symptoms. The overall impact of BSO in BRCA1 and BRCA2 carriers with a prior history of breast cancer remains uncertain.
Ordinarily, researchers would assess the benefits and risks associated with BSO through randomised controlled trials, the ‘gold standard’ for testing how well treatments work. However, to do so in women who carry the BRCA1 and BRCA2 variants would be unethical as it would put them at substantially greater risk of developing ovarian cancer.
To work around this problem, a team at the University of Cambridge, in collaboration with the National Disease Registration Service (NDRS) in NHS England, turned to electronic health records and data from NHS genetic testing laboratories collected and curated by NDRS to examine the long-term outcomes of BSO among carriers of BRCA1 and BRCA2 pathogenic variants diagnosed with breast cancer. The results of their study, the first large-scale study of its kind, are published today in The Lancet Oncology.
The team identified a total of 3,400 women carrying one of the BRCA1 and BRCA2 cancer-causing variants (around 1,700 women for each variant). Around 850 of the BRCA1 carriers and 1,000 of the BRCA2 carriers had undergone BSO surgery.
Women who underwent BSO were around half as likely to die from cancer or any other cause over the follow-up period (a median follow-up time of 5.5 years). This reduction was more pronounced in BRCA2 carriers than in BRCA1 carriers (a 56% reduction versus a 38% reduction, respectively). These women were also at around a 40% lower risk of developing a second cancer.
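The press release does not describe the statistical model, but associations of this kind in a retrospective cohort are usually summarised as hazard ratios (a 56% reduction corresponds to a hazard ratio of about 0.44, and a 38% reduction to about 0.62). The sketch below is a minimal, purely illustrative example of how such an estimate might be obtained with a Cox proportional hazards model; the column names and the synthetic data are assumptions for the example, not the study’s actual variables or method.

```python
# Illustrative only: a Cox proportional hazards fit on synthetic data,
# sketching how an association between BSO and all-cause mortality
# might be estimated in a linked-records cohort. Column names are assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 2000
bso = rng.integers(0, 2, n)                     # 1 = underwent BSO, 0 = did not
age = rng.normal(50, 8, n)                      # age at breast cancer diagnosis

# Synthetic survival times: BSO halves the hazard in this toy example
baseline_hazard = 0.05 * np.exp(0.02 * (age - 50))
hazard = baseline_hazard * np.where(bso == 1, 0.5, 1.0)
time_to_event = rng.exponential(1.0 / hazard)
follow_up = rng.uniform(1, 10, n)               # administrative censoring
observed = (time_to_event <= follow_up).astype(int)
duration = np.minimum(time_to_event, follow_up)

df = pd.DataFrame({"duration": duration, "event": observed, "bso": bso, "age": age})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
print(cph.hazard_ratios_)   # hazard ratio for 'bso' should come out near 0.5
```

In the real analysis the model would additionally adjust for tumour characteristics, treatment and other confounders; this sketch only shows the general shape of such an analysis.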
Although the team say it is impossible to say with 100% certainty that BSO causes this reduction in risk, they argue that the evidence points strongly towards this conclusion.
Importantly, the researchers found no link between BSO and an increased risk of other long-term outcomes such as heart disease, stroke or depression. This is in contrast to previous studies that found evidence in the general population of an association between BSO and increased risk of these conditions.
First author Hend Hassan, a PhD student at the Centre for Cancer Genetic Epidemiology, Department of Public Health and Primary Care, and Wolfson College, Cambridge, said: “We know that removing the ovaries and fallopian tubes dramatically reduces the risk of ovarian cancer, but there’s been a question mark over the potential unintended consequences that might arise from the sudden onset of menopause that this causes.
“Reassuringly, our research has shown that for women with a personal history of breast cancer, this procedure brings clear benefits in terms of survival and a lower risk of other cancers without the adverse side effects such as heart conditions or depression.”
Most women undergoing BSO were white. Black and Asian women were around half as likely to have BSO compared to white women. Women who lived in less deprived areas were more likely to have BSO compared to those in the most-deprived category.
Hassan added: “Given the clear benefits that this procedure provides for at-risk women, it’s concerning that some groups of women are less likely to undergo it. We need to understand why this is and encourage uptake among these women.”
Professor Antonis Antoniou, from the Department of Public Health and Primary Care, the study’s senior author, said: “Our findings will be crucial for counselling women with cancer linked to one of the BRCA1 and BRCA2 variants, allowing them to make informed decisions about whether or not to opt for this operation.”
Professor Antoniou, who is also Director of the Cancer Data-Driven Detection programme, added: “The study also highlights the power of exceptional NHS datasets in driving impactful, clinically relevant research.”
The research was funded by Cancer Research UK, with additional support from the National Institute for Health and Care Research (NIHR) Cambridge Biomedical Research Centre.
The University of Cambridge is fundraising for a new hospital that will transform how we diagnose and treat cancer. Cambridge Cancer Research Hospital, a partnership with Cambridge University Hospitals NHS Foundation Trust, will treat patients across the East of England, but the research that takes place there promises to change the lives of cancer patients across the UK and beyond. Find out more here.
Reference
Hassan, H et al. Long-term health outcomes of bilateral salpingo-oophorectomy in BRCA1 and BRCA2 pathogenic variant carriers with personal history of breast cancer: a retrospective cohort study using linked electronic health records. Lancet Oncology; 7 May 2025; DOI: 10.1016/S1470-2045(25)00156-1
Women diagnosed with breast cancer who carry particular BRCA1 and BRCA2 genetic variants are offered surgery to remove the ovaries and fallopian tubes as this dramatically reduces their risk of ovarian cancer. Now, Cambridge researchers have shown that this procedure – known as bilateral salpingo-oophorectomy (BSO) – is associated with a substantial reduction in the risk of early death among these women, without any serious side-effects.
Significant gaps in NHS care for patients who are deaf or have hearing loss, study finds
A team of patients, clinicians, researchers and charity representatives, led by the University of Cambridge and the British Society of Audiology, surveyed over 550 people who are deaf or have hearing loss about their experiences with the NHS – making it the largest study of its kind. Their findings, reported in the journal PLOS One, highlight systemic failures and suggest changes and recommendations for improving deaf-aware communication in the NHS.
“The real power of this study lies in the stories people shared,” said lead author Dr Bhavisha Parmar from Cambridge’s Department of Clinical Neurosciences (Sound Lab) and UCL Ear Institute. “Patients weren’t just rating their experiences – they were telling us how these barriers affect every part of their healthcare journey, and in many cases, why they avoid healthcare altogether.”
The study found that, although such provision is a legal requirement under the Accessible Information Standard, NHS patients have inadequate and inconsistent access to British Sign Language (BSL) interpreters and other accessibility accommodations such as hearing loop systems.
Nearly two-thirds (64.4%) of respondents reported missing at least half of the important information during appointments, and only a third (32%) expressed satisfaction with NHS staff communication skills. Respondents said they had to rely on family members or advocates to communicate with healthcare workers, raising privacy and consent concerns.
The research found that communication barriers extend across the entire patient journey – from booking appointments to receiving results. Simple actions, like calling a patient’s name in a waiting room or giving instructions during a scan, become anxiety-inducing when basic accommodations are lacking. Respondents noted that hearing aids often must be removed for X-rays or MRI scans, leaving them struggling or unable to follow verbal instructions.
“We heard over and over that patients fear missing their name being called, or avoid making appointments altogether,” said Parmar. “These aren’t isolated experiences – this is a systemic issue.”
The idea for the study was sparked by real-life experiences shared online by NHS patients, particularly audiology patients – a field Parmar believes should lead by example. “We’re audiologists: we see more patients with hearing loss than anyone else in the NHS,” she said. “If we’re not deaf-aware, then how can we expect other parts of the NHS to be?”
The research team included NHS patients with deafness or hearing loss, who contributed to study design, data analysis, and report writing. As part of the study, they received training in research methods, ensuring the work was grounded in and reflective of lived experiences.
Co-author Zara Musker, current England Deaf Women’s futsal captain and winner of Deaf Sports Personality of the Year 2023, said her disappointing experiences with the NHS in part motivated her to qualify as an audiologist.
“The research is extremely important as I have faced my own experiences of inadequate access, and lack of deaf awareness in NHS healthcare not just in the appointment room but the whole process of booking appointments, being in the waiting room, interacting with clinicians and receiving important healthcare information,” said Musker. “I really hope that the results will really highlight that NHS services are still not meeting the needs of patients. Despite this, the study also highlights ways that the NHS can improve, and recommendations are suggested by those who face these barriers within healthcare.”
The researchers have also released a set of recommendations that could improve accessibility in the NHS, such as:
- Mandatory deaf awareness and communication training for NHS staff
- Consistent provision of interpreters and alert systems across all NHS sites
- Infrastructure improvements, such as text-based appointment systems and visual waiting room alerts
- The creation of walk-through assessments at hospitals to ensure accessibility across the full patient journey
“This is a legal obligation, not a luxury,” said Parmar. “No one should have to write down their symptoms in a GP appointment or worry they’ll miss their name being called in a waiting room. These are simple, solvable issues.”
A practice guidance resource – developed in consultation with patients and driven by this research – is open for feedback until 15 June and will be made publicly available as a free tool to help clinicians and NHS services improve deaf awareness. People can submit feedback at the British Society of Audiology website.
“Ultimately, better communication for deaf patients benefits everyone,” Parmar said. “We’re not just pointing out problems – we’re providing practical solutions.”
Reference:
Bhavisha Parmar et al. ‘“I always feel like I’m the first deaf person they have ever met.” Deaf Awareness, Accessibility and Communication in the United Kingdom's National Health Service (NHS): How can we do better?’ PLOS One (2025). DOI: 10.1371/journal.pone.0322850
A majority of individuals who are deaf or have hearing loss face significant communication barriers when accessing care through the National Health Service (NHS), with nearly two-thirds of patients missing half or more of vital information shared during appointments.
Adolescents with mental health conditions use social media differently than their peers, study suggests
Young people with a diagnosable mental health condition report differences in their experiences of social media compared to those without a condition, including greater dissatisfaction with online friend counts and more time spent on social media sites.
This is according to a new study led by the University of Cambridge, which suggests that adolescents with “internalising” conditions such as anxiety and depression report feeling particularly affected by social media.
Young people with these conditions are more likely to report comparing themselves to others on social media, feeling a lack of self-control over time spent on the platforms, as well as changes in mood due to the likes and comments received.
Researchers found that adolescents with any mental health condition report spending more time on social media than those without a mental health condition, amounting to an average of roughly 50 minutes extra on a typical day.*
The study, led by Cambridge’s Medical Research Council Cognition and Brain Sciences Unit (MRC CBU), analysed data from a survey of 3,340 adolescents in the UK aged between 11 and 19 years old, conducted by NHS Digital in 2017.**
It is one of the first studies on social media use among adolescents to utilise multi-informant clinical assessments of mental health. These were produced by professional clinical raters interviewing young people, along with their parents and teachers in some cases.***
“The link between social media use and youth mental health is hotly debated, but hardly any studies look at young people already struggling with clinical-level mental health symptoms,” said Luisa Fassi, a researcher at Cambridge’s MRC CBU and lead author of the study, published in the journal Nature Human Behaviour.
“Our study doesn’t establish a causal link, but it does show that young people with mental health conditions use social media differently than young people without a condition.
“This could be because mental health conditions shape the way adolescents interact with online platforms, or perhaps social media use contributes to their symptoms. At this stage, we can’t say which comes first – only that these differences exist,” Fassi said.
The researchers set high benchmarks for the study based on existing research into sleep, physical activity and mental health. Only findings with a strength of association comparable to the known differences in sleep and exercise between people with and without mental health conditions were deemed significant.
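One way to picture this benchmarking is with a standardised effect size such as Cohen’s d: a difference in a social media measure is only flagged if it is roughly as large as the corresponding sleep or physical-activity benchmark. The sketch below is a hypothetical illustration on synthetic data; the variables, numbers and the use of Cohen’s d are assumptions for the example, not the study’s actual procedure.

```python
# Hypothetical illustration of effect-size benchmarking with Cohen's d.
# Synthetic data; not the study's measures or thresholds.
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Standardised mean difference between two groups (pooled SD)."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(0)

# Benchmark: difference in nightly sleep (hours) between groups
sleep_condition = rng.normal(7.0, 1.0, 500)
sleep_no_condition = rng.normal(7.4, 1.0, 500)
benchmark_d = abs(cohens_d(sleep_condition, sleep_no_condition))

# Candidate finding: difference in daily social media time (hours)
sm_condition = rng.normal(3.4, 1.5, 500)
sm_no_condition = rng.normal(2.6, 1.5, 500)
finding_d = abs(cohens_d(sm_condition, sm_no_condition))

print(f"benchmark d = {benchmark_d:.2f}, finding d = {finding_d:.2f}")
print("comparable to or larger than benchmark:", finding_d >= benchmark_d)
```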
While mental health was measured with clinical-level assessments, social media use came from questionnaires completed by study participants, who were not asked about specific platforms.****
As well as time spent on social media, all mental health conditions were linked to greater dissatisfaction with the number of online friends. “Friendships are crucial during adolescence as they shape identity development,” said Fassi.
“Social media platforms assign a concrete number to friendships, making social comparisons more conspicuous. For young people struggling with mental health conditions, this may increase existing feelings of rejection or inadequacy.”
Researchers looked at differences in social media use between young people with internalising conditions, such as anxiety, depression and PTSD, and externalising conditions, such as ADHD or conduct disorders.
The majority of differences in social media use were reported by young people with internalising conditions. For example, “social comparison” – comparing themselves to others online – was twice as high in adolescents with internalising conditions (48%, around one in two) as in those without a mental health condition (24%, around one in four).
Adolescents with internalising conditions were also more likely to report mood changes in response to social media feedback (28%, around 1 in 4) compared to those without a mental health condition (13%, around 1 in 8). They also reported lower levels of self-control over time spent on social media and a reduced willingness to be honest about their emotional state when online.*****
“Some of the differences in how young people with anxiety and depression use social media reflect what we already know about their offline experiences. Social comparison is a well-documented part of everyday life for these young people, and our study shows that this pattern extends to their online world as well,” Fassi said.
By contrast, other than time spent on social media, researchers found few differences between young people with externalising conditions and those without a condition.
“Our findings provide important insights for clinical practice, and could help to inform future guidelines for early intervention,” said Cambridge’s Dr Amy Orben, senior author of the study.
“However, this study has only scratched the surface of the complex interplay between social media use and mental health. The fact that this is one of the first large-scale and high-quality studies of its kind shows the lack of systemic investment in this space.”
Added Fassi: “So many factors can be behind why someone develops a mental health condition, and it's very hard to get at whether social media use is one of them.”
“A huge question like this needs lots of research that combines experimental designs with objective social media data on what young people are actually seeing and doing online.”
“We need to understand how different types of social media content and activities affect young people with a range of mental health conditions such as those living with eating disorders, ADHD, or depression. Without including these understudied groups, we risk missing the full picture.”
Notes:
*Study participants were asked to rate their social media use on a typical school day and a typical weekend or holiday day. This was given as a nine-point scale, ranging from less than 30 minutes to over seven hours. Responses from adolescents with any mental health condition approached on average "three to four hours," compared to adolescents without a condition, who averaged between "one to two hours" and "two to three hours."
The category of all mental health conditions in the study includes several conditions that are classed as neither internalising nor externalising, such as sleep disorders and psychosis. However, the numbers of adolescents with these conditions are comparatively small.
**The survey was conducted as part of NHS Digital’s Mental Health of Children and Young People Survey (MHCYP) and is nationally representative of this age group in the UK. The researchers only used data from those who provided answers on social media use (50% male, 50% female).
*** Previous studies have mainly used self-reported questionnaires (e.g. a depression severity questionnaire) to capture mental health symptoms and conditions in participants.
**** The researchers point out that, as responses on social media use were self-reported, those with mental health conditions may be perceiving they spend more time on social media rather than actually doing so. They say that further research with objective data is required to provide definitive answers.
***** For data on social media use, study participants were asked to rate the extent to which they agree with a series of statements on a five-point Likert scale. The statements ranged from “I compare myself to others on social media” to “I am happy with the number of friends I have on social media”.
Researchers divided responses into 'disagree' (responses 1 to 3) and 'agree' (responses 4 and 5) and then calculated the proportion of adolescents agreeing separately for each diagnostic group to aid with public communication of the findings.
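As a rough illustration of this dichotomisation, the sketch below splits a five-point Likert item into ‘disagree’ (responses 1 to 3) and ‘agree’ (responses 4 and 5) and computes the proportion agreeing within each diagnostic group. The column names and the handful of rows are invented for the example; they are not the survey data.

```python
# Illustrative only: dichotomising a five-point Likert item and computing
# the proportion agreeing per diagnostic group. Invented column names and data.
import pandas as pd

df = pd.DataFrame({
    "group": ["internalising", "internalising", "externalising",
              "no_condition", "no_condition", "internalising"],
    "social_comparison": [5, 4, 2, 1, 3, 4],   # 1 = strongly disagree ... 5 = strongly agree
})

df["agrees"] = df["social_comparison"] >= 4     # responses 4 and 5 count as 'agree'
proportions = df.groupby("group")["agrees"].mean()
print(proportions)   # share of each group agreeing with the statement
```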
One of the first studies in this area to use clinical-level diagnoses reveals a range of differences between young people with and without mental health conditions when it comes to social media – from changes in mood to time spent on sites.
Adolescents who sleep longer perform better at cognitive tasks
But the study of adolescents in the US also showed that even those with better sleeping habits were not reaching the amount of sleep recommended for their age group.
Sleep plays an important role in helping our bodies function. It is thought that while we are asleep, toxins that have built up in our brains are cleared out, and brain connections are consolidated and pruned, enhancing memory, learning, and problem-solving skills. Sleep has also been shown to boost our immune systems and improve our mental health.
During adolescence, our sleep patterns change. We tend to start going to bed later and sleeping less, which affects our body clocks. All of this coincides with a crucial period for the development of our brain function and cognition. The American Academy of Sleep Medicine says that the ideal amount of sleep during this period is between eight and ten hours a night.
Professor Barbara Sahakian from the Department of Psychiatry at the University of Cambridge said: “Regularly getting a good night’s sleep is important in helping us function properly, but while we know a lot about sleep in adulthood and later life, we know surprisingly little about sleep in adolescence, even though this is a crucial time in our development. How long do young people sleep for, for example, and what impact does this have on their brain function and cognitive performance?”
Studies looking at how much sleep adolescents get usually rely on self-reporting, which can be inaccurate. To get around this, a team led by researchers at Fudan University, Shanghai, and the University of Cambridge turned to data from the Adolescent Brain Cognitive Development (ABCD) Study, the largest long-term study of brain development and child health in the United States.
As part of the ABCD Study, more than 3,200 adolescents aged 11-12 years old had been given Fitbits, allowing the researchers to look at objective data on their sleep patterns and to compare it against brain scans and results from cognitive tests. The team double-checked their results against two additional groups of 13-14-year-olds, totalling around 1,190 participants. The results are published today in Cell Reports.
The team found that the adolescents could be divided broadly into one of three groups:
Group One, accounting for around 39% of participants, slept an average (mean) of 7 hours 10 mins. They tended to go to bed and fall asleep the latest and wake up the earliest.
Group Two, accounting for 24% of participants, slept an average of 7 hours 21 mins. They had average levels across all sleep characteristics.
Group Three, accounting for 37% of participants, slept an average of 7 hours 25 mins. They tended to go to bed and fall asleep the earliest and had lower heart rates during sleep.
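The press release does not say how these groups were derived. One plausible, purely illustrative approach is to cluster device-based sleep features (duration, bedtime, sleep-time heart rate), for example with k-means, as sketched below on synthetic data. The feature names, the numbers and the choice of k-means are assumptions for the example, not the study’s actual pipeline.

```python
# Hypothetical sketch: grouping adolescents by device-based sleep features
# with k-means. Synthetic data; not the ABCD Study's actual analysis.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 3200
features = np.column_stack([
    rng.normal(7.3 * 60, 25, n),   # sleep duration in minutes
    rng.normal(22.8, 0.7, n),      # bedtime (hour of day, 24h clock)
    rng.normal(62, 6, n),          # mean heart rate during sleep (bpm)
])

X = StandardScaler().fit_transform(features)        # put features on a common scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Summarise each cluster's average sleep duration, bedtime and heart rate
for k in range(3):
    mean_duration, mean_bedtime, mean_hr = features[labels == k].mean(axis=0)
    print(f"cluster {k}: {mean_duration / 60:.2f} h sleep, "
          f"bedtime ~{mean_bedtime:.1f}h, HR {mean_hr:.0f} bpm")
```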
Although the researchers found no significant differences in school achievement between the groups, when it came to cognitive tests looking at aspects such as vocabulary, reading, problem solving and focus, Group Three performed better than Group Two, which in turn performed better than Group One.
Group Three also had the largest brain volume and best brain functions, with Group One the smallest volume and poorest brain functions.
Professor Sahakian said: “Even though the differences in the amount of sleep that each group got was relatively small, at just over a quarter-of-an-hour between the best and worst sleepers, we could still see differences in brain structure and activity and in how well they did at tasks. This drives home to us just how important it is to have a good night’s sleep at this important time in life.”
First author Dr Qing Ma from Fudan University said: “Although our study can’t answer conclusively whether young people have better brain function and perform better at tests because they sleep better, there are a number of studies that would support this idea. For example, research has shown the benefits of sleep on memory, especially on memory consolidation, which is important for learning.”
The researchers also assessed the participants’ heart rates, finding that Group Three had the lowest heart rates across the sleep states and Group One the highest. Lower heart rates are usually a sign of better health, whereas higher rates often accompany poor sleep quality like restless sleep, frequent awakenings and excessive daytime sleepiness.
Because the ABCD Study is a longitudinal study – that is, one that follows its participants over time – the team was able to show that the differences in sleep patterns, brain structure and function, and cognitive performance tended to be present two years before and two years after the snapshot that they looked at.
Senior author Dr Wei Cheng from Fudan University added: “Given the importance of sleep, we now need to look at why some children go to bed later and sleep less than others. Is it because of playing videogames or smartphones, for example, or is it just that their body clocks do not tell them it’s time to sleep until later?”
The research was supported by the National Key R&D Program of China, National Natural Science Foundation of China, National Postdoctoral Foundation of China and Shanghai Postdoctoral Excellence Program. The ABCD Study is supported by the National Institutes of Health.
Reference
Ma, Q et al. Neural correlates of device-based sleep characteristics in adolescents. Cell Reports; 22 Apr 2025; DOI: 10.1016/j.celrep.2025.115565
Adolescents who sleep for longer – and from an earlier bedtime – than their peers tend to have improved brain function and perform better at cognitive tests, researchers from the UK and China have shown.
Charles Darwin Archive recognised by UNESCO
The UNESCO Memory of the World Programme serves as the documentary heritage equivalent of UNESCO World Heritage Sites, protecting invaluable records that tell the story of human civilisation.
A collaboration between Cambridge University Library, the Natural History Museum, the Linnean Society of London, English Heritage’s Down House, the Royal Botanic Gardens, Kew and the National Library of Scotland, the Charles Darwin documentary heritage archive provides a unique window into the life and work of one of the world’s most influential natural scientists.
The complete archive, comprising over 20,000 items across the six major institutions, includes Darwin’s records illustrating the development of his ground-breaking theory of evolution and extensive global travels.
At Cambridge University Library, the Darwin Archive is a significant collection of Darwin’s books, experimental notes, correspondence, and photographs, representing his scientific and personal activities throughout his life.
The collection in Cambridge includes Darwin’s pocket notebooks recording early statements of key ideas contributing to his theory of evolution, notably that species are not stable. These provide important insights into the development of his thought and feature the iconic ‘Tree of Life’ diagram which he drew on his return from the voyage of the HMS Beagle.
The Linnean Society of London holds several of Darwin's letters, manuscripts and books. The Society is also home to John Collier’s original iconic portrait of Charles Darwin, commissioned by the Society and painted in 1883 to commemorate the first reading of the theory of evolution by natural selection at a Linnean Society meeting in 1858.
At the Natural History Museum, a letter written to his wife Emma in 1844 provides insight into the significance Darwin attached to his species theory research and contains instructions on what she should do in the case of his sudden death. It sits alongside other letters to Museum staff and family members that demonstrate the broad scope of his scientific thinking, research and communication, ranging from caterpillars to volcanoes, dahlias to ants, and the taking of photographs for his third publication Expression of the Emotions in Man and Animals.
Correspondence with Darwin’s publisher John Murray, held at the National Library of Scotland, documents the transformation of his research into print, including the ground-breaking On the Origin of Species.
At the Royal Botanic Gardens, Kew, documents include a highly significant collection of 44 letters sent by Darwin to Professor John Stevens Henslow around the time of the HMS Beagle expedition, detailing his travels and the genesis of his theory of evolution as he came into contact with new plants, wildlife and fossils, as well as a rare sketch of the orchid Gavilea patagonica made by Darwin. Other items include a letter from Darwin to his close friend Joseph Hooker, Director of Kew, in which he requests cotton seeds from Kew's collections for his research.
Down House (English Heritage) in Kent was both a family home and a place of work where Darwin pursued his scientific interests, carried out experiments, and researched and wrote his many ground-breaking publications until his death in 1882.
The extensive collection amassed by Darwin during his 40 years at Down paints a picture of his professional and personal life and the intersection of the two. The archive here includes over 200 books from Darwin’s personal collection, account books, diaries, the Journal of the Voyage of the Beagle MSS, and Beagle notebooks and letters. More personal items include scrapbooks, Emma Darwin’s photograph album and Charles Darwin’s will. The collection at Down House has been mainly assembled through the generous donations of Darwin’s descendants.
This inscription marks a significant milestone in recognising Darwin’s legacy, as it brings together materials held by multiple institutions across the UK for the first time, ensuring that his work's scientific, cultural, and historical value is preserved for future generations.
In line with the ideals of the UNESCO Memory of the World Programme, much of the Darwin archive can be viewed by the public at the partner institutions and locations.
The UNESCO International Memory of the World Register includes some of the UK’s most treasured documentary heritage, such as the Domesday Book and the Shakespeare Documents, alongside more contemporary materials including the personal archive of Sir Winston Churchill. The Charles Darwin archive now joins this esteemed list, underscoring its historical, scientific, and cultural significance.
The inscription of the Charles Darwin archive comes as part of UNESCO’s latest recognition of 75 archives worldwide onto the International Memory of the World Register.
These newly inscribed collections include a diverse range of documents, such as the Draft of the International Bill of Human Rights, the papers of Friedrich Nietzsche, and the Steles of Shaolin Temple (566-1990) in China.
Baroness Chapman of Darlington, Minister of State for International Development, Latin America and Caribbean at the Foreign, Commonwealth & Development Office (FCDO), said: "The recognition of the Charles Darwin archive on UNESCO's International Memory of the World Register is a proud moment for British science and heritage.
"Darwin's work fundamentally changed our understanding of the natural world and continues to inspire scientific exploration to this day. By bringing together extraordinary material from our world class British institutions, this archive ensures that Darwin's groundbreaking work remains accessible to researchers, students, and curious minds across the globe."
Ruth Padel, FRSL, FZS, poet, conservationist, great-great-grand-daughter of Charles Darwin and King’s College London Professor of Poetry Emerita, said: "How wonderful to see Darwin’s connections to so many outstanding scientific and cultural institutions in the UK reflected in the recognition of his archive on the UNESCO Memory of the World International Register. All these institutions are open to the public so everyone will have access to his documentary heritage."
Dr Jessica Gardner, University Librarian and Director of Library Services at Cambridge University Libraries (CUL) said: "For all Charles Darwin gave the world, we are delighted by the UNESCO recognition in the Memory of the World of the exceptional scientific and heritage significance of his remarkable archive held within eminent UK institutions.
"Cambridge University Library is home to over 9,000 letters to and from Darwin, as well as his handwritten experimental notebooks, publications, and photographs which have together fostered decades of scholarship and public enjoyment through exhibition, education for schools, and online access.
"We could not be prouder of UNESCO’s recognition of this remarkable documentary heritage at the University of Cambridge, where Darwin was a student at Christ’s College and where his family connections run deep across the city, and are reflected in his namesake, Darwin College."
Read the full, illustrated version of this story on the University Library's site.
Documentary heritage relating to the life and work of Charles Darwin has been recognised on the prestigious UNESCO International Memory of the World Register, highlighting its critical importance to global science and the necessity of its long-term preservation and accessibility.
“We could not be prouder of UNESCO’s recognition of this remarkable documentary heritage” – Jessica Gardner
Image: Two of Charles Darwin’s pocket notebooks in Cambridge University Library (credit: Cambridge University Library)
Throwing a ‘spanner in the works’ of our cells’ machinery could help fight cancer, fatty liver disease… and hair loss
Scientists at the Medical Research Council (MRC) Mitochondrial Biology Unit, University of Cambridge, have worked out the structure of this machine and shown how it operates like the lock on a canal to transport pyruvate – a molecule generated in the body from the breakdown of sugars – into our mitochondria.
Known as the mitochondrial pyruvate carrier, this molecular machine was first proposed to exist in 1971, but it has taken until now for scientists to visualise its structure at the atomic scale using cryo-electron microscopy, a technique used to magnify an image of an object to around 165,000 times its real size. Details are published today in Science Advances.
Dr Sotiria Tavoulari, a Senior Research Associate from the University of Cambridge, who first determined the composition of this molecular machine, said: “Sugars in our diet provide energy for our bodies to function. When they are broken down inside our cells they produce pyruvate, but to get the most out of this molecule it needs to be transferred inside the cell’s powerhouses, the mitochondria. There, it helps increase 15-fold the energy produced in the form of the cellular fuel ATP.”
Maximilian Sichrovsky, a PhD student at Hughes Hall and joint first author of the study, said: “Getting pyruvate into our mitochondria sounds straightforward, but until now we haven’t been able to understand the mechanism of how this process occurs. Using state-of-the-art cryo-electron microscopy, we’ve been able to show not only what this transporter looks like, but exactly how it works. It’s an extremely important process, and understanding it could lead to new treatments for a range of different conditions.”
Mitochondria are surrounded by two membranes. The outer one is porous, and pyruvate can easily pass through, but the inner membrane is impermeable to pyruvate. To transport pyruvate into the mitochondrion, first an outer ‘gate’ of the carrier opens, allowing pyruvate to enter the carrier. This gate then closes, and the inner gate opens, allowing the molecule to pass through into the mitochondrion.
“It works like the locks on a canal but on the molecular scale,” said Professor Edmund Kunji from the MRC Mitochondrial Biology Unit, and a Fellow at Trinity Hall, Cambridge. “There, a gate opens at one end, allowing the boat to enter. It then closes and the gate at the opposite end opens to allow the boat smooth transit through.”
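To make the canal-lock analogy concrete, here is a minimal Python sketch of an alternating-access transporter, assuming just two conformations and the rule that both gates are never open at the same time. The state names and methods are illustrative inventions, not the carrier’s actual conformational cycle as resolved in the new structures.

```python
class PyruvateCarrierSketch:
    """Toy alternating-access ('canal lock') transporter.

    Illustrative only: it encodes the sequence described above
    (outer gate opens, closes, then the inner gate opens), not the
    real protein's conformational states.
    """

    def __init__(self):
        self.state = "outward-open"   # outer gate open, inner gate closed
        self.cargo = None             # pyruvate bound in the central cavity

    def load_pyruvate(self):
        # Pyruvate can only enter while the outer gate is open.
        if self.state == "outward-open" and self.cargo is None:
            self.cargo = "pyruvate"

    def switch_conformation(self):
        # Both gates are never open at once: the carrier flips between
        # outward-open and inward-open, like a lock chamber.
        self.state = "inward-open" if self.state == "outward-open" else "outward-open"

    def release_into_matrix(self):
        # With the inner gate open, bound pyruvate is released into the matrix.
        if self.state == "inward-open" and self.cargo == "pyruvate":
            self.cargo = None
            return "pyruvate delivered to matrix"
        return "nothing released"


carrier = PyruvateCarrierSketch()
carrier.load_pyruvate()        # outer gate open: pyruvate enters
carrier.switch_conformation()  # outer gate closes, inner gate opens
print(carrier.release_into_matrix())
```

In this toy picture, an inhibitor can be thought of as stopping switch_conformation() from ever running, which is one way to read the ‘spanner in the works’ image used later in the article.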
Because of its central role in controlling the way mitochondria operate to produce energy, this carrier is now recognised as a promising drug target for a range of conditions, including diabetes, fatty liver disease, Parkinson’s disease, specific cancers, and even hair loss.
Pyruvate is not the only energy source available to us. Our cells can also take their energy from fats stored in the body or from amino acids in proteins. Blocking the pyruvate carrier would force the body to look elsewhere for its fuel – creating opportunities to treat a number of diseases. In fatty liver disease, for example, blocking pyruvate entry into mitochondria could encourage the body to use potentially dangerous fat that has been stored in liver cells.
Likewise, there are certain tumour cells that rely on pyruvate metabolism, such as in some types of prostate cancer. These cancers tend to be very ‘hungry’, producing excess pyruvate transport carriers to ensure they can feed more. Blocking the carrier could then starve these cancer cells of the energy they need to survive, killing them.
Previous studies have also suggested that inhibiting the mitochondrial pyruvate carrier may reverse hair loss. Activation of human hair follicle cells, which are responsible for hair growth, relies on metabolism and, in particular, the generation of lactate. When the carrier is blocked, pyruvate cannot enter the mitochondria in these cells and is instead converted to lactate.
Professor Kunji said: “Drugs inhibiting the function of the carrier can remodel how mitochondria work, which can be beneficial in certain conditions. Electron microscopy allows us to visualise exactly how these drugs bind inside the carrier to jam it – a spanner in the works, you could say. This creates new opportunities for structure-based drug design in order to develop better, more targeted drugs. This will be a real game changer.”
The research was supported by the Medical Research Council and was a collaboration with the groups of Professors Vanessa Leone at the Medical College of Wisconsin, Lucy Forrest at the National Institutes of Health, and Jan Steyaert at the Free University of Brussels.
Reference
Sichrovsky, M, Lacabanne, D, Ruprecht, JJ & Rana, JJ et al. Molecular basis of pyruvate transport and inhibition of the human mitochondrial pyruvate carrier. Sci Adv; 18 Apr 2025; DOI: 10.1126/sciadv.adw1489
Fifty years since its discovery, scientists have finally worked out how a molecular machine found in mitochondria, the ‘powerhouses’ of our cells, allows us to make the fuel we need from sugars, a process vital to all life on Earth.
“Drugs inhibiting the function of the carrier can remodel how mitochondria work, which can be beneficial in certain conditions” – Edmund Kunji
Image: Bald young man, front view (credit: bob_bosewell, Getty Images)
Mouse study suggests a common diabetes drug may prevent leukaemia
Around 3,100 people are diagnosed with acute myeloid leukaemia (AML) each year in the UK. It is an aggressive form of blood cancer that is very difficult to treat. Thanks to recent advances, individuals at high risk of AML can be identified years in advance using blood tests and blood DNA analysis, but there’s no suitable treatment that can prevent them from developing the disease.
In this study, Professor George Vassiliou and colleagues at the University of Cambridge investigated how to prevent abnormal blood stem cells with genetic changes from progressing to become AML. The work focused on the most common genetic change, which affects a gene called DNMT3A and is responsible for starting 10-15% of AML cases.
Professor Vassiliou, from the Cambridge Stem Cell Institute at the University of Cambridge and Honorary Consultant Haematologist at Cambridge University Hospitals NHS Foundation Trust (CUH), co-led the study. He said: “Blood cancer poses unique challenges compared to solid cancers like breast or prostate, which can be surgically removed if identified early. With blood cancers, we need to identify people at risk and then use medical treatments to stop cancer progression throughout the body.”
The research team examined blood stem cells from mice with the same changes in DNMT3A as seen in the pre-cancerous cells in humans. Using a genome-wide screening technique, they showed that these cells depend more on mitochondrial metabolism than healthy cells, making this a potential weak spot. The researchers went on to confirm that metformin, and other mitochondria-targeting drugs, substantially slowed the growth of mutation-bearing blood cells in mice. Further experiments also showed that metformin could have the same effect on human blood cells with the DNMT3A mutation.
Dr Malgorzata Gozdecka, Senior Research Associate at the Cambridge Stem Cell Institute and first author of the research said: “Metformin is a drug that impacts mitochondrial metabolism, and these pre-cancerous cells need this energy to keep growing. By blocking this process, we stop the cells from expanding and progressing towards AML, whilst also reversing other effects of the mutated DNMT3A gene.”
In addition, the study looked at data from over 412,000 UK Biobank volunteers and found that people taking metformin were less likely to have changes in the DNMT3A gene. This link remained even after accounting for factors that could have confounded the results such as diabetes status and BMI.
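The release does not describe the statistical model used for the UK Biobank analysis. As a hedged illustration of what ‘accounting for confounders’ typically looks like, the sketch below fits a logistic regression on simulated data in which metformin use, diabetes status and BMI predict the presence of a DNMT3A change; every variable name and value here is hypothetical, and this is not the study’s own pipeline.

```python
# Hedged sketch of a confounder-adjusted association analysis, loosely
# modelled on the description above. The data are simulated and the
# variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000

df = pd.DataFrame({
    "metformin": rng.integers(0, 2, n),   # 1 = takes metformin
    "diabetes":  rng.integers(0, 2, n),   # potential confounder
    "bmi":       rng.normal(27, 4, n),    # potential confounder
})
# Simulate an outcome in which metformin lowers the odds of a DNMT3A change.
linpred = -3.0 - 0.5 * df["metformin"] + 0.3 * df["diabetes"] + 0.02 * df["bmi"]
df["dnmt3a_change"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

# The metformin coefficient is the association of interest after
# adjusting for diabetes status and BMI.
model = smf.logit("dnmt3a_change ~ metformin + diabetes + bmi", data=df).fit(disp=False)
print(np.exp(model.params["metformin"]))  # adjusted odds ratio for metformin use
```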
Professor Brian Huntly, Head of the Department of Haematology at the University of Cambridge, Honorary Consultant Haematologist at CUH, and joint lead author of the research, added: “Metformin appears highly specific to this mutation rather than being a generic treatment. That specificity makes it especially compelling as a targeted prevention strategy.
“We’ve done the extensive research all the way from cell-based studies to human data, so we’re now at the point where we have made a strong case for moving ahead with clinical trials. Importantly, metformin’s lack of toxicity will be a major advantage as it is already used by millions of people worldwide with a well-established safety profile.”
The results of the study, funded by Blood Cancer UK with additional support from Cancer Research UK, the Leukemia & Lymphoma Society (USA) and the Wellcome Trust, are published in Nature.
Dr Rubina Ahmed, Director of Research at Blood Cancer UK, said: “Blood cancer is the third biggest cancer killer in the UK, with over 280,000 people currently living with the disease. Our Blood Cancer Action plan shed light on the shockingly low survival for acute myeloid leukaemia, with only around 2 in 10 surviving for 5 years, and we urgently need better strategies to save lives. Repurposing safe, widely available drugs like metformin means we could potentially get new treatments to people faster, without the need for lengthy drug development pipelines.”
The next phase of this research will focus on clinical trials to test metformin’s effectiveness in people with changes in DNMT3A at increased risk of developing AML. With metformin already approved and widely used for diabetes, this repurposing strategy could dramatically reduce the time it takes to bring a new preventive therapy to patients.
Tanya Hollands, Research Information Manager at Cancer Research UK, who contributed funding for the lab-based screening in mice, said: “It's important that we work to find new ways to slow down or prevent AML in people at high risk. Therefore, it’s positive that the findings of this study suggest a possible link between a commonly-used diabetes drug and prevention of AML progression in some people. While this early-stage research is promising, clinical trials are now needed to find out if this drug could benefit people. We look forward to seeing how this work progresses.”
Reference
Gozdecka, M et al. Mitochondrial metabolism sustains DNMT3A-R882-mutant clonal haematopoiesis. Nature; 16 Apr 2025; DOI: 10.1038/s41586-025-08980-6
Adapted from a press release from Blood Cancer UK
Metformin, a widely used and affordable diabetes drug, could prevent a form of acute myeloid leukaemia in people at high risk of the disease, a study in mice has suggested. Further research in clinical trials will be needed to confirm this works for patients.
“We’ve done the extensive research all the way from cell-based studies to human data, so we’re now at the point where we have made a strong case for moving ahead with clinical trials” – Brian Huntly
Image: Brown lab mouse on blue gloved hand (credit: University of Cambridge)
Extreme drought contributed to barbarian invasion of late Roman Britain, tree-ring study reveals
The ‘Barbarian Conspiracy’ of 367 CE was one of the most severe threats to Rome’s hold on Britain since the Boudiccan revolt three centuries earlier. Contemporary sources indicate that components of the garrison on Hadrian’s Wall rebelled and allowed the Picts to attack the Roman province by land and sea. Simultaneously, the Scotti from modern-day Ireland invaded broadly in the west, and Saxons from the continent landed in the south.
Senior Roman commanders were captured or killed, and some soldiers reportedly deserted and joined the invaders. Throughout the spring and summer, small groups roamed and plundered the countryside. Britain’s descent into anarchy was disastrous for Rome and it took two years for generals dispatched by Valentinian I, Emperor of the Western Roman Empire, to restore order. The final remnants of official Roman administration left Britain some 40 years later, around 410 CE.
The University of Cambridge-led study, published today in Climatic Change, used oak tree-ring records to reconstruct temperature and precipitation levels in southern Britain during and after the ‘Barbarian Conspiracy’ in 367 CE. Combining this data with surviving Roman accounts, the researchers argue that severe summer droughts in 364, 365 and 366 CE were a driving force in these pivotal events.
First author Charles Norman, from Cambridge’s Department of Geography, said: “We don’t have much archaeological evidence for the ‘Barbarian Conspiracy’. Written accounts from the period give some background, but our findings provide an explanation for the catalyst of this major event.”
The researchers found that southern Britain experienced an exceptional sequence of remarkably dry summers from 364 to 366 CE. In the period 350–500 CE, average monthly reconstructed rainfall in the main growing season (April–July) was 51mm. But in 364 CE it fell to just 29mm; 365 CE was even worse, with 28mm; and 37mm the following year kept the area in crisis.
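Expressed as simple arithmetic, those reconstructed figures amount to growing-season rainfall deficits of roughly 43%, 45% and 27% below the 350–500 CE average in 364, 365 and 366 CE respectively. The short snippet below simply restates that calculation.

```python
# Growing-season (April-July) rainfall relative to the 350-500 CE
# baseline quoted above (all values in mm per month).
baseline = 51
reconstructed = {364: 29, 365: 28, 366: 37}

for year, rainfall in reconstructed.items():
    deficit = 100 * (baseline - rainfall) / baseline
    print(f"{year} CE: {rainfall}mm, {deficit:.0f}% below the baseline")
```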
Professor Ulf Büntgen, from Cambridge’s Department of Geography, said: “Three consecutive droughts would have had a devastating impact on the productivity of Roman Britain’s most important agricultural region. As Roman writers tell us, this resulted in food shortages with all of the destabilizing societal effects this brings.”
Between 1836 and 2024 CE, southern Britain experienced droughts of a similar magnitude only seven times – mostly in recent decades – and none of these were consecutive, emphasising how exceptional the Roman-era droughts were. The researchers identified no other major droughts in southern Britain in the period 350–500 CE and found that other parts of northwestern Europe escaped these conditions.
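To see why ‘consecutive’ matters, a rainfall series can be scanned for runs of sub-threshold years, as in the sketch below. The 364–366 CE values are those quoted above; the surrounding years and the 40mm threshold are invented placeholders rather than figures from the reconstruction.

```python
# Illustrative run detection: the 364-366 CE values come from the study,
# but the neighbouring years and the 40mm threshold are placeholders.
rainfall = {362: 48, 363: 52, 364: 29, 365: 28, 366: 37, 367: 55}  # mm per month
threshold = 40

runs, current = [], []
for year in sorted(rainfall):
    if rainfall[year] < threshold:
        current.append(year)
    else:
        if current:
            runs.append(current)
        current = []
if current:
    runs.append(current)

longest = max(runs, key=len) if runs else []
print(f"Longest run of consecutive drought years: {longest}")  # [364, 365, 366]
```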
Roman Britain’s main crops were spelt wheat and six-row barley. Because the province had a wet climate, sowing these crops in spring was more viable than in winter, but this made them vulnerable to late spring and early summer moisture deficits, and early summer droughts could lead to total crop failure.
The researchers point to surviving accounts written by Roman chroniclers to corroborate these drought-driven grain deficits. By 367 CE, Ammianus Marcellinus described the population of Britain as in the “utmost conditions of famine”.
“Drought from 364 to 366 CE would have impacted spring-sown crop growth substantially, triggering poor harvests,” Charles Norman said. “This would have reduced the grain supply to Hadrian’s Wall, providing a plausible motive for the rebellion there which allowed the Picts into northern Britain.”
The study suggests that given the crucial role of grain in the contract between soldiers and the army, grain deficits may have contributed to other desertions in this period, and therefore a general weakening of the Roman army in Britain. In addition, the geographic isolation of Roman Britain likely combined with the severity of the prolonged drought to reduce the ability of Rome to alleviate the deficits.
Ultimately the researchers argue that military and societal breakdown in Roman Britain provided an ideal opportunity for peripheral tribes, including the Picts, Scotti and Saxons, to invade the province en masse with the intention of raiding rather than conquest. Their finding that the most severe conditions were restricted to southern Britain undermines the idea that famines in other provinces might have forced these tribes to invade.
Andreas Rzepecki, from the Generaldirektion Kulturelles Erbe Rheinland-Pfalz, said: “Our findings align with the accounts of Roman chroniclers, and the seemingly coordinated nature of the ‘Conspiracy’ suggests an organised movement of strong onto weak, rather than the more chaotic assault that might be expected had the invaders been acting out of desperation.”
“The prolonged and extreme drought seems to have occurred during a particularly poor period for Roman Britain, in which food and military resources were being stripped for the Rhine frontier, while immigratory pressures increased.”
“These factors limited resilience, and meant a drought-induced, partial military rebellion and subsequent external invasion were able to overwhelm the weakened defences.”
The researchers expanded their climate-conflict analysis to the entire Roman Empire for the period 350–476 CE. They reconstructed the climate conditions immediately before and after 106 battles and found that a statistically significant number of battles were fought following dry years.
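The release reports the statistical finding but not the test behind it. As a hedged illustration of how such a signal could be checked, the snippet below runs a one-sided binomial test asking whether more of the 106 battles followed dry years than chance would predict; the count of battles after dry years and the baseline probability of a dry year are placeholders, not values from the study.

```python
# Illustrative only: a one-sided binomial test for an excess of battles
# fought after dry years. The count and baseline probability below are
# placeholder values, not numbers taken from the study.
from scipy.stats import binomtest

n_battles = 106                 # battles analysed in the study
battles_after_dry_year = 45     # hypothetical count
p_dry_year = 0.3                # hypothetical chance of any given year being dry

result = binomtest(battles_after_dry_year, n_battles, p_dry_year, alternative="greater")
print(f"p-value = {result.pvalue:.4f}")  # a small p-value suggests more battles after dry years than chance
```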
Tatiana Bebchuk, from Cambridge’s Department of Geography, said: “The relationship between climate and conflict is becoming increasingly clear in our own time so these findings aren’t just important for historians. Extreme climate conditions lead to hunger, which can lead to societal challenges, which eventually lead to outright conflict.”
Charles Norman, Ulf Büntgen, Paul Krusic and Tatiana Bebchuk are based at the Department of Geography, University of Cambridge; Lothar Schwinden and Andreas Rzepecki are from the Generaldirektion Kulturelles Erbe Rheinland-Pfalz in Trier. Ulf Büntgen is also affiliated with the Global Change Research Institute, Czech Academy of Sciences and the Department of Geography, Masaryk University in Brno.
Reference
C. Norman, L. Schwinden, P. Krusic, A. Rzepecki, T. Bebchuk, U. Büntgen, ‘Droughts and conflicts during the late Roman period’, Climatic Change (2025). DOI: 10.1007/s10584-025-03925-4
Funding
Charles Norman was supported by Wolfson College, University of Cambridge (John Hughes PhD Studentship). Ulf Büntgen received funding from the Czech Science Foundation (# 23-08049S; Hydro8), the ERC Advanced Grant (# 882727; Monostar), and the ERC Synergy Grant (# 101118880; Synergy-Plague).
Three consecutive years of drought contributed to the ‘Barbarian Conspiracy’, a pivotal moment in the history of Roman Britain, a new Cambridge-led study reveals. Researchers argue that Picts, Scotti and Saxons took advantage of famine and societal breakdown caused by an extreme period of drought to inflict crushing blows on weakened Roman defences in 367 CE. While Rome eventually restored order, some historians argue that the province never fully recovered.
“Our findings provide an explanation for the catalyst of this major event” – Charles Norman
Image: Milecastle 39 on Hadrian's Wall (credit: Adam Cuerden)
Growing wildflowers on disused urban land can damage bee health
The metals have previously been shown to damage the health of pollinators, which ingest them in nectar as they feed, leading to reduced population sizes and death. Even low nectar metal levels can have long-term effects, by affecting bees’ learning and memory - which impacts their foraging ability.
Researchers have found that common plants including white clover and bindweed, which are vital forage for pollinators in cities, can accumulate arsenic, cadmium, chromium and lead from contaminated soils.
Metal contamination is an issue in the soils of cities worldwide, with the level of contamination usually increasing with the age of a city. The metals come from a huge range of sources including cement dust and mining.
The researchers say soils in cities should be tested for metals before sowing wildflowers and if necessary, polluted areas should be cleaned up before new wildflower habitats are established.
The study highlights the importance of growing the right species of wildflowers to suit the soil conditions.
Reducing the risk of metal exposure is critical for the success of urban pollinator conservation schemes. The researchers say it is important to manage wildflower species that self-seed on contaminated urban land, for example by frequent mowing to limit flowering - which reduces the transfer of metals from the soil to the bees.
The results are published today in the journal Ecology and Evolution.
Dr Sarah Scott in the University of Cambridge’s Department of Zoology and first author of the report, said: “It’s really important to have wildflowers as a food source for the bees, and our results should not discourage people from planting wildflowers in towns and cities.
“We hope this study will raise awareness that soil health is also important for bee health. Before planting wildflowers in urban areas to attract bees and other pollinators, it’s important to consider the history of the land and what might be in the soil – and if necessary find out whether there’s a local soil testing and cleanup service available first.”
The study was carried out in the post-industrial US city of Cleveland, Ohio, which has over 33,700 vacant lots left as people have moved away from the area. In the past, iron and steel production, oil refining and car manufacturing went on there. But any land that was previously the site of human activity may be contaminated with traces of metals.
To get their results, the researchers extracted nectar from a range of self-seeded flowering plants that commonly attract pollinating insects, found growing on disused land across the city. They tested this for the presence of arsenic, cadmium, chromium and lead. Lead was consistently found at the highest concentrations, reflecting the state of the soils in the city.
The researchers found that different species of plant accumulate different amounts, and types, of the metals. Overall, the bright blue-flowered chicory plant (Cichorium intybus) accumulated the largest total metal concentration, followed by white clover (Trifolium repens), wild carrot (Daucus carota) and bindweed (Convolvulus arvensis). These plants are all vital forage for pollinators in cities - including cities in the UK - providing a consistent supply of nectar across locations and seasons.
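In practical terms, the ranking above comes down to summing the concentration measured for each metal in a species’ nectar and ordering species by that total. As a purely illustrative sketch (written in Python, with invented placeholder values that are not the study’s measurements), the calculation might look like this:

# Minimal sketch: rank forage species by total trace-metal concentration in nectar.
# All values below are hypothetical placeholders, NOT data from the study.

# Hypothetical mean nectar concentrations per species and metal (arbitrary units).
nectar_metals = {
    "Cichorium intybus":    {"Pb": 9.0, "As": 1.2, "Cd": 0.6, "Cr": 0.8},  # chicory
    "Trifolium repens":     {"Pb": 6.5, "As": 0.9, "Cd": 0.4, "Cr": 0.5},  # white clover
    "Daucus carota":        {"Pb": 5.1, "As": 0.7, "Cd": 0.3, "Cr": 0.4},  # wild carrot
    "Convolvulus arvensis": {"Pb": 4.2, "As": 0.5, "Cd": 0.2, "Cr": 0.3},  # bindweed
}

# Sum across the four metals for each species, then sort from highest to lowest total.
totals = {species: sum(levels.values()) for species, levels in nectar_metals.items()}
for species, total in sorted(totals.items(), key=lambda item: item[1], reverse=True):
    print(f"{species}: {total:.1f} total (arbitrary units)")

With these placeholder numbers the ordering matches the one reported above (chicory highest, then white clover, wild carrot and bindweed), but the actual comparison rests on the nectar samples measured in the study.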
There is growing evidence that wild pollinator populations have dropped by over 50% in the last 50 years, caused primarily by changes in land use and management across the globe. Climate change and pesticide use also play a role, but the loss of flower-rich habitat remains the primary cause of decline.
Pollinators play a vital role in food production: many plants, including apple and tomato, require pollination in order to develop fruit. Natural ‘pollination services’ are estimated to add billions of dollars to global crop productivity.
Scott said: “Climate change feels so overwhelming, but simply planting flowers in certain areas can help towards conserving pollinators, which is a realistic way for people to make a positive impact on the environment.”
The research was funded primarily by the USDA National Institute of Food and Agriculture.
Reference
S.B. Scott, M.M. Gardiner, ‘Trace metals in nectar of important urban pollinator forage plants: A direct exposure risk to pollinators and nectar-feeding animals in cities’, Ecology and Evolution (2025). DOI: 10.1002/ece3.71238
Wildflowers growing on land previously used for buildings and factories can accumulate lead, arsenic and other metal contaminants from the soil, which are consumed by pollinators as they feed, a new study has found.
“Our results should not discourage people from planting wildflowers in towns and cities. But... it’s important to consider the history of the land and what might be in the soil.” – Sarah Scott
Image: Chicory growing in a vacant lot (credit: Sarah Scott)
Complete clean sweep for Cambridge at The Boat Race 2025
Thousands of spectators lined the banks of the River Thames today to witness a dramatic afternoon of action, with millions more following live on the BBC.
Cambridge Women secured their eighth consecutive win in the 79th Women’s Boat Race, extending their overall record to 49 victories to Oxford’s 30. The Men’s crew, too, were victorious in defending their title in the 170th edition of the event, notching up their 88th win, with Oxford sitting on 81.
Goldie, the Cambridge Men’s Reserve Crew, won the Men’s Reserve Race, while Blondie, the Cambridge Women’s Reserve Crew, won the Women’s Reserve Race. And the day before, the 2025 Lightweight Boat Race also saw two wins for Cambridge.
Cambridge’s Claire Collins said it was an incredible feeling to win the race.
“This is so cool, it’s really an incredible honour to share this with the whole club,” she said.
The Women’s Race was stopped initially after an oar clash, but Umpire Sir Matthew Pinsent allowed the race to resume. Claire said that the crew had prepared for eventualities such as a restart and so were able to lean on their training when it happened.
“I had total confidence in the crew to regroup. Our focus was to get back on pace and get going as soon as possible and that’s what we did.”
For Cambridge Men’s President Luca Ferraro, it was his final Boat Race campaign, having raced in the Blue Boat for the last three years, winning the last two.
He said: “It was a great race. The guys really stepped up. That’s something that our Coach Rob Baker said to us before we went out there, that each of us had to step up individually and come together and play our part in what we were about to do. I couldn’t be prouder of the guys, they really delivered today.”
Professor Deborah Prentice, Vice-Chancellor of the University of Cambridge, congratulated all the crews following the wins.
“I am in awe of these students and what they have achieved, and what Cambridge University Boat Club has been able to create,” she said.
“These students are out in the early hours of the morning training and then trying to make it to 9am lectures. It’s so inspiring. And a complete clean sweep – this was an incredibly impressive showing by Cambridge, I am so proud of them.”
The Cambridge Blue Boats featured student athletes drawn from Christ’s College, Downing College, Emmanuel College, Gonville & Caius, Hughes Hall, Jesus College, Pembroke College, Peterhouse, St Edmund’s, and St John’s.
Cambridge are celebrating a complete clean sweep at The Boat Race 2025, with victories in all four openweight races and also both lightweight races.
Image credit: Row360