Today's top stories include a job posting by the UK's National Health Service (NHS) for a neonatal nurse supporting families affected by "close-relative marriage" and the FBI's release of images connected to the disappearance of news anchor Savannah Guthrie's mother. In other news, US lawmakers are accusing the Justice Department of improperly redacting files related to Jeffrey Epstein, a trial in California is examining the mental health effects of social media, and a new study highlights the risks of relying on AI chatbots for medical advice.
The NHS advertised a full-time position titled "Neonatal Nurse - Close Relative Marriage" to support families affected by the practice, which commonly involves marriages between first cousins and carries higher genetic risks, according to health officials (Source 1). The posting, which has since closed, said the role aimed to assist families with "informed reproductive decision-making."
Meanwhile, the FBI released images of a masked person in connection with the disappearance of 84-year-old Nancy Guthrie, Savannah Guthrie's mother (Source 2). Authorities believe she was taken from her Tucson, Arizona, home against her will on January 31. Savannah Guthrie said her family believes Nancy is still alive and has appealed to the public for information.
In the US, lawmakers are scrutinizing the Department of Justice's redaction of files related to convicted sex offender Jeffrey Epstein (Source 3). Members of Congress were allowed to review unredacted versions of the files released under the Epstein Files Transparency Act (EFTA). Democratic congressman Ro Khanna stated, "The core issue is that they're not complying with... my law, because these were scrubbed back in March by Donald Trump's FBI."
A landmark trial underway in California is examining the mental health effects of Instagram and YouTube (Source 4). Lawyers for the plaintiff, identified as "K.G.M.," argued that the companies deliberately built "addiction machines" aimed at children. Mark Lanier, the plaintiff's lawyer, stated, "These companies built machines designed to addict the brains of children, and they did it on purpose." Lawyers for Meta and YouTube countered that K.G.M.'s addiction stemmed from other issues.
Finally, a University of Oxford study found that AI chatbots give inaccurate and inconsistent medical advice, potentially putting users at risk (Source 5). Researchers presented 1,300 people with scenarios such as having a headache and found that the advice varied. Dr. Rebecca Payne, the study's lead medical practitioner, said it could be "dangerous" for people to ask chatbots about their symptoms. A November 2025 poll by Mental Health UK found that more than one in three UK residents now use AI to support their mental health or wellbeing.