Systems Options or by LAS's own Work Based Trainers (WBTs). "…This training was not always comprehensive and was often inconsistent. The problems were exacerbated by the constant changes being made to the system."

Facts are taken from http://catless.ncl.ac.uk/Risks, http://www.scit.wlv.ac.uk and the report of the Inquiry into the London Ambulance Service, February 1993.

2.5 Poor user-interface

The previous case is a good example of how a poor user interface can lead to mayhem. A similar case was reported in a Providence newspaper. The Providence (Rhode Island) police chief, Walter Clark, was grilled over why his officers were taking so long to respond to calls; in one case it took two hours to respond to a burglary in progress. He explained that all incoming calls are entered into a computer and shown on a monitor. However, the monitor could only display twenty reports at a time, because the programmer had not provided any way to scroll the screen. The programmer evidently had some serious misconceptions about the city's crime rate. A paginated alternative is sketched below.

Facts taken from: http://catless.ncl.ac.uk/Risks.

2.6 Over-reliance on the software system

The Exxon Valdez oil disaster was blamed simultaneously on the drunken captain, the severely fatigued third mate, the helmsman and the "system". The system here refers to the ship's auto-pilot and the lack of care the crew took in operating it. According to Neumann, the crew were so tired that they did not realise the auto-pilot had been left on, so the ship ignored their rudder adjustments. This example shows that even though the equipment was working properly, the safety measures had minimal effect because the crew's steering inputs were silently overridden while they were trying to override the auto-pilot. This is a very small mistake that could easily have been prevented; a simple mode-conflict check is sketched below.

The Therac-25, a system designed to deliver the correct dose of radiation to patients during radiotherapy treatment, also fell into the trap of assumed "foolproofedness". The operators did not imagine the "…softwa...
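
Returning to the Providence case in section 2.5, the following is a minimal sketch, in Python, of a paginated call display. It only illustrates the design point that a backlog larger than the screen must still be reachable by the operator; the data, names and page size are hypothetical, as the real dispatch software is not publicly documented.

    # Hypothetical sketch of a paginated call display.
    PAGE_SIZE = 20  # the original screen could show only twenty reports

    def show_page(reports, page):
        """Print one page of reports and say how many remain unseen."""
        start = page * PAGE_SIZE
        visible = reports[start:start + PAGE_SIZE]
        for i, report in enumerate(visible, start=start + 1):
            print(f"{i:4d}. {report}")
        remaining = max(0, len(reports) - (start + PAGE_SIZE))
        if remaining:
            print(f"-- {remaining} more report(s); press N for the next page --")

    if __name__ == "__main__":
        # With more than twenty open calls, everything past report 20 was
        # invisible on the original screen; paging keeps the backlog reachable.
        reports = [f"Call #{n}: burglary in progress" for n in range(1, 48)]
        show_page(reports, page=0)
        show_page(reports, page=1)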
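
The auto-pilot incident in section 2.6 is essentially a mode-confusion problem: manual rudder orders were silently ignored while the auto-pilot remained engaged. The sketch below, again purely hypothetical and not modelled on any real marine autopilot, shows the kind of check that would have surfaced the conflict instead of discarding the crew's input.

    # Hypothetical sketch of a mode-conflict warning in a steering loop.
    class Helm:
        def __init__(self):
            self.autopilot_engaged = True  # left on, as in the Exxon Valdez account
            self.rudder_angle = 0.0

        def manual_rudder(self, angle):
            """Apply a manual rudder order, warning on a mode conflict."""
            if self.autopilot_engaged:
                # Instead of silently ignoring the order, alert the crew
                # (or automatically disengage the auto-pilot).
                print("WARNING: auto-pilot engaged - manual rudder order ignored. "
                      "Disengage auto-pilot to steer by hand.")
                return False
            self.rudder_angle = angle
            print(f"Rudder set to {angle:+.1f} degrees")
            return True

    if __name__ == "__main__":
        helm = Helm()
        helm.manual_rudder(10.0)       # conflict: warns instead of failing silently
        helm.autopilot_engaged = False
        helm.manual_rudder(10.0)       # now the order takes effect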