Daily Guardian Europe
Lifestyle

This AI tool can detect domestic abuse risk years before victims seek help

By staff | Published 15/03/2026 – 8:00 GMT+1 | 3 min read

Scientists have developed an artificial intelligence (AI) tool designed to help doctors identify patients who may be at risk of intimate partner violence (IPV), potentially years before victims seek help.

Researchers in the United States trained a machine learning model using data collected during regular hospital visits. The study was published in the journal Nature.

Intimate partner violence refers to abuse from current or former partners and can lead to severe injuries, chronic pain, and mental health disorders.

According to a report from the European Commission, 18 percent of women who had ever had a partner said in 2021 that they had experienced physical or sexual violence from that partner.

Current screening for domestic abuse in hospitals usually relies on doctors asking patients questions about their safety at home. But many victims do not disclose abuse because of fear, stigma, or safety concerns, so cases often go undetected.

To test how well the technology could identify people at risk, the research team built three different AI systems. The models were trained on several years of records from nearly 850 women who had experienced intimate partner violence and from a control group of more than 5,200 patients of similar age.

The first system analysed structured hospital data such as age, medical history, and other standard patient information. The second examined written medical notes, including doctors’ observations and radiology reports. The third combined both types of information.

All three models performed strongly, but the combined system proved the most accurate. It correctly identified risk in 88 percent of cases.
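As a rough illustration (not the study’s actual pipeline, which the paper describes in Nature), a combined model of this kind can route structured fields and free-text notes through separate encoders into a single classifier. The sketch below uses scikit-learn with invented toy records; every column name, note, and label here is hypothetical:

```python
# Illustrative sketch only: combining structured patient features with
# free-text clinical notes in one model, in the spirit of the study's
# third (combined) system. All data below is synthetic.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Tiny made-up records; real inputs would be electronic health record extracts.
records = pd.DataFrame({
    "age": [34, 51, 28, 45, 39, 62],
    "prior_er_visits": [5, 1, 4, 0, 6, 1],
    "note": [
        "ulnar fracture, inconsistent explanation of injury",
        "routine checkup, no concerns noted",
        "facial bruising, patient anxious, defers questions",
        "seasonal allergies, stable",
        "repeat wrist injury, old fractures on radiology report",
        "hypertension follow-up",
    ],
})
labels = np.array([1, 0, 1, 0, 1, 0])  # 1 = later-confirmed IPV case

# Route numeric columns and the notes column to different encoders,
# then feed the combined feature matrix to a single classifier.
features = ColumnTransformer([
    ("structured", StandardScaler(), ["age", "prior_er_visits"]),
    ("notes", TfidfVectorizer(), "note"),
])
model = Pipeline([
    ("features", features),
    ("clf", LogisticRegression()),
])
model.fit(records, labels)

# Scores are risk signals for clinicians to review, not diagnoses.
risk = model.predict_proba(records)[:, 1]
print(risk.round(2))
```

The point of the design is that neither data type alone carries the full signal: structured fields capture injury frequency and demographics, while the notes capture clinicians’ qualitative observations.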

The tool was also able to flag potential abuse more than three years before many patients later entered hospital-based domestic abuse intervention programmes.

By analysing large amounts of hospital data at once, the new AI system can detect patterns of physical trauma linked to abuse and flag patients whose records resemble those seen in confirmed abuse cases, allowing healthcare professionals to intervene earlier.

“This clinical decision support tool could make a significant impact on prediction and prevention of intimate partner violence,” said Qi Duan, the program director of the division of health informatics technologies at the US National Institutes of Health’s National Institute of Biomedical Imaging and Bioengineering (NIBIB).

“Given the prevalence of cases, the tool could be a game-changing asset to public health,” Duan added.

Researchers say that the technology is designed to support clinicians rather than replace their judgment. It does not diagnose abuse or force patients to disclose information. Instead, it provides a signal that may help doctors approach the topic carefully and offer support if needed.

“Our work represents a shift towards recognising risk earlier using information already present in healthcare data,” said Bharti Khurana, an emergency radiologist at Mass General Brigham and associate professor at Harvard Medical School.

The researchers plan to integrate the technology into electronic medical record systems so that hospitals can receive real-time risk assessments during routine care.

© 2026 Daily Guardian Europe. All Rights Reserved.