    FDA gets mixed feedback on performance monitoring for AI

By Healthradar | 16 December 2025 | 5 Mins Read

    The Food and Drug Administration received more than 100 comments after seeking responses on how to monitor the real-world performance of artificial intelligence in medical devices. 

    The feedback diverged, with patients calling for stronger postmarket reporting and medical groups saying reporting should be the responsibility of manufacturers. Device companies, meanwhile, called for the FDA to use its existing regulatory frameworks instead of introducing new requirements.

    The FDA’s emphasis on real-world performance comes as the agency considers how to regulate increasingly complex technologies, such as generative AI, and how to ensure the performance of AI models doesn’t degrade over time. 

    Industry groups oppose universal postmarket requirements

    Medtech lobbying groups and individual companies called for the FDA to use existing quality metrics and a risk-based approach rather than implementing universal postmarket monitoring requirements.

    AdvaMed, a medical device industry group, recommended that the FDA use existing regulatory requirements, such as those outlined in the Quality Management System Regulations, adding that they provide “robust mechanisms” for design validation and postmarket surveillance.

    “Duplicative or prescriptive new requirements for performance monitoring of AI-enabled devices risks undermining both patient safety and innovation,” the trade group wrote in comments.

    AdvaMed instead called for a risk-based approach built on QMS and international consensus standards, adding “there is no one-size-fits all approach to performance monitoring for AI-enabled devices.” 

The Medical Device Manufacturers Association also called for a risk-based approach, adding that additional monitoring requirements should be reserved for special circumstances. The lobbying group said that locked AI models, which don’t change autonomously over time, may carry lower risk and not require postmarket monitoring.

    “In contrast, continual machine learning models that update autonomously based on new data may introduce additional complexities and risks, which could call for specific monitoring mechanisms beyond standard controls,” wrote MDMA CEO Mark Leahey. 

    Olympus Corporation of the Americas also called for the use of existing quality management structures, and Masimo supported a risk-based approach. 

Healthcare providers say monitoring should be manufacturers’ job

    Hospitals and medical groups see a need for postmarket monitoring of AI devices, but they said that work should be manufacturers’ responsibility. Comments emphasized the growing number of AI tools, but also noted that many hospitals don’t have the resources to evaluate or monitor these technologies.

    The American Hospital Association wrote in comments that hospitals are expanding their use of AI applications. Although the technology is mostly used for administrative tools, facilities are also deploying AI-enabled medical devices.

    “The potential for bias, hallucinations and model drift demonstrates the need for measurement and evaluation after deployment,” wrote Ashley Thompson, the AHA’s senior vice president of public policy analysis and development. 

    Thompson said the FDA should update adverse event reporting metrics to include AI-specific risks. The AHA also recommended that the FDA add monitoring requirements for manufacturers, from periodic revalidation to ongoing surveillance, depending on a device’s risk. The hospital lobbying group suggested that the FDA focus its efforts on higher-risk areas related to the diagnosis of conditions or the treatment or mitigation of disease, and not clinical decision support or administrative tools. 

    “The ‘black box’ nature of many AI systems can make it more challenging for hospitals and health systems to identify flaws in models that may affect the accuracy and validity of an AI tool’s analyses and recommendations,” Thompson wrote. “As such, post-market measurement and evaluation standards should be developed for vendors.”

    Thompson added that some hospitals — particularly rural and critical access facilities — may not have the staff or resources to support AI governance and ongoing monitoring. 

    The American College of Surgeons offered similar comments, supporting postmarket monitoring but delegating that responsibility to vendors instead of surgeons or other clinicians.

    Patients call for transparent monitoring

    Patients wrote to the FDA calling for transparent monitoring and better performance metrics that reflect people’s lived experiences. 

    “When an AI-enabled device misfires, patients experience it as extra visits, extra tests, confusion about why their care plan changed, or mental distress when an automated output contradicts what they know about their own bodies,” wrote Andrea Downing, co-founder and president of the Light Collective, a nonprofit advocating for patient rights in health tech. 


    These types of burden rarely appear in traditional reporting metrics, Downing wrote, adding that device evaluations should include considerations such as additional appointments, delays in care, confusion about the AI’s role, and emotional or mental distress.

    Dan Noyes, a healthcare AI strategist, mentioned his personal experience living with a chronic health condition in comments to the FDA. He called for disclosures to patients about how AI tools are involved in their care decisions, as well as transparency for when models are updated, and testing across diverse populations to ensure equitable performance. 

    “From the patient perspective, approval is not the finish line — it is the starting line,” Noyes wrote. “Patients need confidence that these devices will continue to perform safely and fairly after deployment.”


