Artificial Intelligence (AI) — what does this really mean to you in your day-to-day patient care?

We hear the acronym constantly, and we are told its use is growing. However, many skilled rehab professionals don’t fully understand how AI is being used in the healthcare space.

Better yet, let’s start the discussion with what guidance is in place for payors and beneficiaries on its use.

The American Medical Association (AMA) recently published Principles for Augmented Intelligence Development, Deployment, and Use, which was approved by the AMA Board of Trustees on Nov. 14, 2023.

I would like to take the opportunity with this year’s first Rehab Realities to break down this guidance into three key areas: regulatory oversight, definitions and use.

Let’s begin, as we often do, with the rules. 

What are the rules? Who creates the rules? Are there any regulations my therapy team should be adhering to?

To begin, the Food and Drug Administration (FDA) regulates AI-enabled medical devices. However, many types of AI-enabled technologies fall outside the scope of FDA oversight, including AI that may have clinical applications, such as some clinical decision support functions.

There is currently no national policy or governance structure in place to guide the development and adoption of non-device AI. Therefore, clinical experts are best positioned to determine whether AI applications are high quality, appropriate, and whether the AI tools are valid from a clinical perspective. Clinical experts can best validate the clinical knowledge, clinical pathways and standards of care used in the design of AI-enabled tools and can monitor the technology for clinical validity as it evolves over time.

Given this evolving regulatory landscape, the AMA guidance stresses transparency and disclosure as essential.

Ok, understood … so what are we talking about here? What is AI, and how is it used? Can you direct me to some definitions?

Yes. AI takes many forms and is used in various ways depending on the task at hand.

Take, for example, generative AI. This is a type of AI that can recognize, summarize, translate, predict and generate text and other content based on knowledge gained from large datasets. 

Generative AI tools are finding an increasing number of uses in healthcare, including assistance with administrative functions, such as generating office notes, responding to documentation requests, and generating patient messages. 

Additionally, there has been increasing discussion about clinical applications of generative AI, including use as clinical decision support to provide differential diagnoses, early detection and intervention, and to assist in treatment planning.

There is also AI defined for actual use during care provision.

These are also defined by the AMA and were accepted in September 2021 by the CPT® Editorial Panel with the addition of a new Appendix S.

In this area, there are currently three categories of AI applications.

The classification of AI medical services and procedures as assistive, augmentative or autonomous is based on the clinical procedure or service provided to the patient and the work performed by the machine on behalf of the physician or other qualified healthcare professional (QHP).

Assistive classification

The work performed by the machine for the physician or other QHP is assistive when the machine detects clinically relevant data without analysis or generated conclusions. Requires physician or other QHP interpretation and report.

Augmentative classification

The work performed by the machine for the physician or other QHP is augmentative when the machine analyzes and/or quantifies data to yield clinically meaningful output. Requires physician or other QHP interpretation and report.

Autonomous classification

The work performed by the machine for the physician or other QHP is autonomous when the machine automatically interprets data and independently generates clinically meaningful conclusions without concurrent physician or other QHP involvement. Autonomous medical services and procedures include interrogating and analyzing data. The work of the algorithm may or may not include acquisition, preparation and/or transmission of data. The clinically meaningful conclusion may be a characterization of data (e.g., likelihood of pathophysiology) to be used to establish a diagnosis or to implement a therapeutic intervention. There are three levels of autonomous AI medical services and procedures with varying physician or other QHP professional involvement:

Level I — The autonomous AI draws conclusions and offers diagnosis and/or management options, which are contestable and require physician or other QHP action to implement.

Level II — The autonomous AI draws conclusions and initiates diagnosis and/or management options with alert/opportunity for override, which may require physician or other QHP action to implement.

Level III — The autonomous AI draws conclusions and initiates management, which require physician or other QHP initiative to contest.
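To keep these categories and levels straight, the taxonomy above can be sketched as a small data model. This is purely illustrative — the AMA’s Appendix S defines clinical classifications, not a software interface, and all names below are my own choosing:

```python
from enum import Enum


class AIClassification(Enum):
    """Illustrative model of the CPT(R) Appendix S work classifications."""
    ASSISTIVE = "detects clinically relevant data without analysis or conclusions"
    AUGMENTATIVE = "analyzes/quantifies data to yield clinically meaningful output"
    AUTONOMOUS = "independently generates clinically meaningful conclusions"


class AutonomousLevel(Enum):
    """Illustrative model of the three levels of autonomous AI services."""
    LEVEL_I = "offers options; requires physician/QHP action to implement"
    LEVEL_II = "initiates options with alert/opportunity for override"
    LEVEL_III = "initiates management; physician/QHP must act to contest"


def requires_interpretation_and_report(classification: AIClassification) -> bool:
    """Per Appendix S, assistive and augmentative work both require
    physician or other QHP interpretation and report; autonomous work
    generates conclusions without concurrent physician/QHP involvement."""
    return classification in (
        AIClassification.ASSISTIVE,
        AIClassification.AUGMENTATIVE,
    )
```

The key distinction the model captures: only the first two categories keep the physician or other QHP in the loop for interpretation and report, while autonomous services vary by how much effort it takes to contest or override the machine’s conclusion.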

Finally, where most therapists have likely seen AI in daily practice is in payor use of augmented intelligence and automated decision-making systems.

AKA pre-authorization 

Per recent AMA guidance, payors and health plans are increasingly using AI and algorithm-based decision-making in an automated fashion to determine coverage limits, make claim determinations, and engage in benefit design. 

Payors should leverage automated decision-making systems that improve or enhance efficiencies in coverage and payment automation, facilitate administrative simplification and reduce workflow burdens. 

In conclusion, the integration of AI into healthcare is an intricate journey that demands a nuanced understanding from rehabilitation professionals. By staying abreast of the AMA’s principles and classifications, therapists can navigate the evolving landscape of AI with proficiency, ensuring its responsible and effective integration into patient care.

Renee Kinder, MS, CCC-SLP, RAC-CT, serves as the Executive Vice President of Clinical Services for Broad River Rehab. Additionally, she contributes her expertise as a member of the American Speech-Language-Hearing Association’s (ASHA) Healthcare and Economics Committee, as community faculty at the University of Kentucky College of Medicine, as an advisor to the American Medical Association’s (AMA) Current Procedural Terminology (CPT®) Editorial Panel, and as a member of the AMA Digital Medicine Payment Advisory Group. For further inquiries, she can be contacted at [email protected]

The opinions expressed in McKnight’s Long-Term Care News guest submissions are the author’s and are not necessarily those of McKnight’s Long-Term Care News or its editors.
