I'd like to talk to you today about targeted provider feedback as an approach to quality improvement. In health informatics, we should really be harnessing the power of analytics to change people's behavior. So we as a group thought: let's bring performance data back to the individual clinical providers and back to the units in our hospital, and see whether giving that data back can actually improve care. We also had a secondary question: can a little healthy competition drive improvements in care?

In my clinical practice I'm a trauma surgeon, so I started with me and my trauma surgery colleagues on our trauma surgery service. We began with the trauma attendings, to see if we could give them feedback on how they were doing in ordering VTE prophylaxis for admitted trauma patients. The attendings are these dark bars, and as you can see, all of those bars are in a pretty similar range; there is no statistically significant difference among the attendings. The action is really with the residents. Each resident is a light-colored bar, and as you can see, some residents are in the zero range: over the course of an entire year, they never followed the best-practice care driven by a clinical decision support tool that tells them exactly what to order. They never followed its recommendations, so they were at zero; there were seven of those residents. There were, however, a number of other residents, roughly half of them, who gave the appropriate prophylaxis 100 percent of the time. Every single time they admitted a trauma patient, they used the decision support tool and followed its recommendation. That is where the variation is, and as you would expect, there is statistically significant variation in practice among the residents and none among the attendings.

So, why is that? It turns out that we, the attendings, don't write the orders in the computer. We are never the ones clicking to order the best-practice prophylaxis; that's up to the residents. And since that is where the variation is, that is what we decided to target. This data helped us figure out who the right target should be.

So, what did we do for the residents? We gave them a scorecard. We started with data for September and, literally on October first, sent out an email to every single surgery resident with their data for the previous month. Each monthly scorecard lists the residents, identified only by a code. You would know who you are: you're A123, you're A345, you're A91011. Each person has a unique identifier, so they can tell where they are and how they compare to their peers. Am I the best? Am I in the middle of the pack? Am I the worst? Then, in each subsequent month, we gave them more data, the previous month's data. And as you can see, the residents very rapidly improved in ordering appropriate prophylaxis. September was before any intervention. October was just one month later, after they got their one scorecard from September, and as you can see, there is a lot less red and a lot more green.
A huge difference in October, and then by November it's almost all green: we had moved from the red zone (less than 90 percent) through the yellow zone (90 to 96 percent) and into the green zone (above 96 percent), just over the course of giving back a few months' worth of data.

Then what did we do? We kept it going for an entire year. Each of these points is one month's worth of data, and the big black line in the middle marks before versus after. As you can see, in the before period we were in the high 80s, doing okay, middle of the road. But after the intervention, as you've already seen, we went up to 96, 97, 98 percent; we're doing much, much better. Now, there is that July effect that creeps in, that one month that dips. What was that? That's the first month when new interns show up, or when residents who had been out for lab time come back into the clinical realm. They didn't know the process; we must not have done a very good job onboarding them and getting them up to speed. But very quickly they got back up to where they were supposed to be. And as you can see, this was not just a one- or two-month change; it was sustained over a year-long period. It makes a big difference.

So then, what can you do with that? This type of quality improvement can actually lead to fundable research. The project I just showed you was the preliminary data. It was co-authored by me, a team of residents who did this as their required residency quality improvement project, and my research team. What did that lead to? That preliminary data is what we used to apply for and receive a five-year R01 research grant from AHRQ, the Agency for Healthcare Research and Quality. The grant we're working on is individualized performance feedback on venous thromboembolism prevention practice. It basically takes what I've just shown you on a small set of surgical residents and scales it up: our plan, which we're actively working on, is to scale it to the rest of Johns Hopkins Hospital for all the other types of residents, and then to spread it across the health system to attending physicians at hospitals that don't have residents and to other practitioners, nurse practitioners and physician assistants, who also provide VTE prevention, giving them a scorecard and seeing if we can make ongoing improvements.

You can also imagine that something like this might be beneficial for other types of quality improvement interventions. We've done it for VTE; I would challenge you to do it for other initiatives that you have. Provide the feedback, give the directed information, give them the data, and improve practice.
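To make the scorecard mechanics concrete, here is a minimal sketch in Python of how such a monthly report could be assembled: it tallies each provider's rate of appropriate prophylaxis ordering, assigns the red, yellow, and green zones described in the talk (below 90 percent, 90 to 96 percent, above 96 percent), and sorts providers so each one can see whether they are the best, middle of the pack, or the worst. The record format, field names, and example de-identified codes are assumptions for illustration, not our actual system.

```python
"""Minimal sketch of a monthly VTE-prophylaxis scorecard.

Assumes a simple list of admission records; the record format, field names,
and codes like "A123" are hypothetical, not the actual implementation.
"""
from collections import defaultdict

# Hypothetical input: one record per trauma admission a resident handled,
# with a flag for whether appropriate VTE prophylaxis was ordered via the
# clinical decision support tool.
records = [
    {"resident": "A123", "month": "2013-10", "appropriate": True},
    {"resident": "A123", "month": "2013-10", "appropriate": False},
    {"resident": "A345", "month": "2013-10", "appropriate": True},
    # ... one entry per admission
]

def zone(rate: float) -> str:
    """Color zones from the talk: red < 90%, yellow 90-96%, green > 96%."""
    if rate < 90.0:
        return "red"
    if rate <= 96.0:
        return "yellow"
    return "green"

def monthly_scorecard(records, month):
    """Return (resident, rate, zone) rows for one month, best performers first."""
    totals = defaultdict(lambda: [0, 0])  # resident -> [appropriate, admissions]
    for r in records:
        if r["month"] != month:
            continue
        totals[r["resident"]][1] += 1
        if r["appropriate"]:
            totals[r["resident"]][0] += 1
    rows = []
    for resident, (ok, n) in totals.items():
        rate = 100.0 * ok / n
        rows.append((resident, rate, zone(rate)))
    # Sort so each resident can see where they stand relative to peers.
    rows.sort(key=lambda row: row[1], reverse=True)
    return rows

if __name__ == "__main__":
    for resident, rate, color in monthly_scorecard(records, "2013-10"):
        print(f"{resident}: {rate:.0f}% appropriate prophylaxis ({color})")
```

In a sketch like this, the same report run month after month is what produces the trend the talk describes: providers see their own rate, their zone, and their position relative to peers, which is the feedback loop that drove the improvement.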