Another very interesting article by Tony Collins, the editor of Computer Weekly. He reviews forty years of government spending on failed supercomputers.
- the public sector is currently spending £15bn pa on IT, or about £600 for every British household
- some major suppliers have historically done very well out of it: when EDS won the Inland Revenue IT outsourcing contract in 1994, £1bn of contracted work turned into £2bn once "extras" were added; Capgemini and Accenture have also scored heavily with "extras"
- Whitehall has always had a predilection for the "dark side" of computing: the mega-projects that are wildly ambitious, are generally imposed on users from above, and which "trudge on" for years despite diminishing hopes of success - cf. the NHS supercomputer
- benefits in the form of cost savings are rarely properly quantified: twenty years ago the massive Opstrat project to automate welfare payments took a decade, ended up costing four times its original £700m budget and, according to the NAO, produced supposed cost savings that were so poorly measured as to be useless - cf. the NHS supercomputer
- there is a systematic failure to recognise failure: officials and politicians are forever talking up their projects despite widespread evidence that they are in trouble - cf... er, the NHS supercomputer
As Collins concludes, private sector firms that fall victim to mad IT mega-projects tend to "think small" next time. Now why doesn't that apply in the public sector?
PS Collins also reminds us of the grandiose names people like to stick on their big IT projects. I can recall the 80s City penchant for names like "Grand Plan", "Blitzkrieg", and, for a new mega-accounting system at a Jewish (!) merchant bank, the most extraordinarily tasteless "Final Solution".