One of the reasons experts and government policy-makers pushed for and incented greater use of health information technology in health care was a belief (largely unsupported even by the evidence existing at that time) that it would result in better quality, lower spending and greater efficiency for providers. One class of health information technology is clinical decision support software, which is supposed to help clinicians make better decisions by providing quick access to guidelines and evidence and by issuing alerts when a provider may be missing something or making a less-than-optimal treatment choice. Much clinical decision support software is used in the prescribing and management of medication therapies.

Research in BMJ Quality & Safety examines what happens in response to alerts from clinical decision support systems regarding medication therapies. (BMJ Article) The authors conducted a prospective observational study in six intensive care units in a large Massachusetts health system from July 2016 to April 2017. They examined provider over-rides of clinical decision support alerts for dose, drug allergy, drug-drug interaction, age-related issues and renal issues. The review was done by two independent experts, with the outcomes of interest being the appropriateness of the over-ride, the incidence of adverse events after an over-ride, and the risk of adverse events given the appropriateness of the over-ride.

There were a total of 2448 over-rides. The reviewers judged about 82% of the over-rides to have been appropriate. However, there were 16.5 adverse drug events per 100 inappropriate over-rides versus only 2.7 per 100 appropriate over-rides. While it is hard to assess what is really happening, it is alarming (no pun intended) that so many of the system-generated alerts were clearly unnecessary or wrong, and were appropriately over-ridden. Thank goodness for human intervention.
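To put those reported rates in perspective, a quick back-of-envelope calculation (using only the figures stated above, not the study's underlying data) shows how much riskier an inappropriate over-ride was:

```python
# Illustrative arithmetic based on the reported adverse-drug-event (ADE)
# rates per 100 over-rides; the study's raw counts are not reproduced here.
ade_per_100_inappropriate = 16.5
ade_per_100_appropriate = 2.7

# Ratio of the two rates: how many times more likely an ADE was
# after an inappropriate over-ride than after an appropriate one.
relative_risk = ade_per_100_inappropriate / ade_per_100_appropriate
print(f"ADEs were about {relative_risk:.1f}x more likely after an inappropriate over-ride")
```

In other words, an inappropriate over-ride carried roughly six times the adverse-event risk of an appropriate one.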
And it may be that clinicians are so used to over-riding alerts that they do it by rote, without careful thought, which leads to inappropriate disregarding of the alert. Although not examined in the study, it would be very useful to know what happened when an alert wasn't over-ridden: how many times did that lead to adverse events or other problems? What is clear is that a lot of improvement is needed in clinical decision support software.