Patient access departments are increasing registration accuracy with quality assurance tools, which results in more “clean claims” and reduced accounts receivable days.
- Outcomes may worsen initially due to the increased number of data elements being checked.
- Registrars are coached right after the error is made.
- Training time is reduced because built-in rules and edits catch errors.
Patient access leaders at Marion (IN) General Hospital wrote rules, tested, and trained more than 70 registrars before going live with a new electronic quality assurance (QA) tool in March 2014. The tool (AhiQA, manufactured by Alpharetta, GA-based Relay Health) allows registrars to see errors right after the registration is complete, so they can correct them immediately.
Prior to implementing the tool, “QA was tracked manually as best we could,” says patient access manager Teresa Adams, CHAM.
Some registration areas reported errors through their own clinical departments instead of patient access. “Some non-reporting departments were negatively impacting the overall QA results, with as low as 79% quality outcomes,” says Adams.
Staff members find it challenging to complete registration QA checks, especially if registration is a small percentage of their overall duties, adds Adams. The time commitment needed for training on completing accurate registrations is another challenge. “Clinical areas have a small quantity of cross-trained staff to cover while core registration personnel are attending training,” Adams explains.
Overall QA at 98.94%
Although the hospital’s overall QA score was about 95%, patient access wanted to improve it further and set a target of between 97% and 100%.
“We realized there would be a transition while staff were on a learning curve,” says Adams. During the transition, managers were still tweaking the QA tool, evaluating the rules that were installed and creating new ones.
Staff sometimes misinterpret a flagged rule as “invalid/no action required” and neither fix nor dispute it. “However, if they do not dispute the error, we will not be able to review and accept their dispute,” says Adams, so the error goes unfixed. This lowers the QA score for both the individual employee and the department.
Disputes allow patient access leaders to clear the error from being counted against an individual registrar. “Disputes also help us to rewrite or tweak rules, because of trends identified,” says Adams.
The hospital’s March 2016 overall QA score was 98.94%. “Although that score seems pretty high, there were still 301 accounts or claims tracked through the QA tool that had errors and were not fixed within the 72-hour final bill cycle,” says Adams. These claims had to be stopped and fixed before electronic billing could be completed. “Although that’s a very small percentage of the total claims, the efficiency of the patient accounts team is negatively impacted,” says Adams. Education on the front end is the department’s primary focus for 2016.
“Depending on each provider’s average claim size, those claims can then be translated into actual dollars or A/R [accounts receivable] days,” says Adams. Patient access management teams are coaching front-end users as close to “live time,” when the error was made, as possible, to prevent future errors.
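The arithmetic Adams describes can be sketched as follows. This is a hypothetical illustration only; the average claim size and daily charge figures below are assumptions, not Marion General data, and only the 301 held claims come from the article.

```python
# Hypothetical illustration of translating held claims into dollars and
# A/R-day impact. All dollar figures are assumed for the example.

held_claims = 301          # claims stopped for unfixed errors (from the article)
avg_claim_size = 2500.00   # assumed average claim size, in dollars
daily_charges = 450000.00  # assumed average daily gross charges, in dollars

# Dollars delayed in billing, and the approximate A/R days those dollars add
held_dollars = held_claims * avg_claim_size
ar_day_impact = held_dollars / daily_charges

print(f"Dollars delayed: ${held_dollars:,.2f}")
print(f"Approximate A/R days added: {ar_day_impact:.2f}")
```

With these assumed figures, 301 held claims would delay roughly $750,000 in billing; each provider would substitute its own averages.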
“It is difficult to achieve 100% user attendance at monthly department meetings,” says Adams. “The QA tool allows coaching of all users — in our case, approximately 100 users — live-time and daily.”
Adams says of the QA tool, “There is no question that it improves accountability.” Front-end users receive a daily worklist telling them what errors are counting against them that they need to fix.
“They are trained to fix errors live time when possible and, at a minimum, prior to end of shift,” says Adams. (See related story later in this issue on how the tool’s implementation affected billing processes.) Users receive report cards that tell them their personal QA results, with a letter grade and percentage. The cards tell them how their scores compare to all hospital users. [A sample report card used by the department is included with the online issue. For assistance accessing your online subscription, contact customer service at Customer.Service@AHCMedia.com or (800) 688-2421.]
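A report card like the one described above maps a personal QA percentage to a letter grade. A minimal sketch, assuming illustrative grade cutoffs (the article does not specify the scale the department uses):

```python
# Sketch of turning a personal QA percentage into a report-card letter
# grade. The cutoffs below are assumptions for illustration only.

def letter_grade(qa_percent: float) -> str:
    """Return a letter grade for a registrar's QA percentage."""
    if qa_percent >= 97.0:
        return "A"
    if qa_percent >= 93.0:
        return "B"
    if qa_percent >= 90.0:
        return "C"
    if qa_percent >= 85.0:
        return "D"
    return "F"

# Example: a registrar matching the department's March 2016 overall score
print(letter_grade(98.94))
```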
“They must review the report card with their Team Lead,” says Adams. “We expect them to get a perfect 100% score.”
This expectation of 100% is because the QA tool tells staff members exactly what errors to fix. “If they fix it, the error doesn’t count against them. If they disagree with the error, they dispute it,” says Adams.
If the reviewer agrees with the employee’s reasoning, the error doesn’t count against the employee. “If the reviewer disagrees, they reject the dispute, but tell the user why,” says Adams.
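The fix-or-dispute workflow described above can be sketched in a few lines. The class and function names here are illustrative assumptions, not the AhiQA tool’s actual interface: an error stops counting against a registrar only when it is fixed or when a reviewer accepts the dispute; ignoring it leaves it counted.

```python
# Minimal sketch of the fix-or-dispute workflow; names are illustrative
# assumptions, not the AhiQA API. Account numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class FlaggedError:
    account: str
    rule: str
    fixed: bool = False
    disputed: bool = False
    dispute_accepted: bool = False

def counts_against_registrar(err: FlaggedError) -> bool:
    """An error stops counting only if it is fixed, or if a reviewer
    accepts the registrar's dispute; an ignored error stays counted."""
    if err.fixed:
        return False
    if err.disputed and err.dispute_accepted:
        return False
    return True

# An unaddressed error counts; a fixed or accepted-dispute error does not.
err = FlaggedError(account="12345", rule="Missing subscriber ID")
print(counts_against_registrar(err))  # still counted
err.fixed = True
print(counts_against_registrar(err))  # no longer counted
```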
At Cottage Hospital in Woodsville, NH, registrars have 72 hours to correct errors identified by the department’s QA tool (also AhiQA) before they affect their registration accuracy. Jennifer A. White, director of patient access, says, “It has increased the amount of clean claims and decreased finger pointing.”
Registrars receive monthly report cards that they review, sign, and turn in to their director for their employee files. “The ability to create your own edits is beneficial, as payer requirements are ever-changing,” says White.
The QA tool decreases the amount of time needed for training, because the edits and rules built into the tool catch errors. “By the end of the first week, new hires are reviewing edits and making changes to their process,” says White.
By the end of the second week, new hires are working independently. “They are confident that if they miss something, the tool will remind them,” says White.