Category Archives: CSC

What do Ben Bradshaw, Caroline Flint and Andy Burnham have in common?

By Tony Collins

Ben Bradshaw, Caroline Flint and Andy Burnham share something in their political past that they probably wouldn’t care to draw attention to as they battle for roles in the Labour leadership.

Few people will remember that Bradshaw, Flint and Burnham were advocates – indeed staunch defenders – of what’s arguably the biggest IT-related failure of all time: the £10bn National Programme for IT [NPfIT].

Perhaps it’s unfair to mention their support for such a massive failure at the time of the leadership election.

A counter argument is that politicians should be held to account at some point for public statements they have made in Parliament in defence of a major project – in this case the largest non-military IT-related programme in the world – that many inside and outside the NHS recognised was fundamentally flawed from its outset in 2003.

Bradshaw, Flint and Burnham did concede in their NPfIT-related statements to the House of Commons that the national programme for IT had its flaws, but still they gave it their strong support and continued to attack the programme’s critics.

The following are examples of statements made by Bradshaw, Flint and Burnham in the House of Commons in support of the NPfIT, which was later abandoned.

Bradshaw, then health minister in charge of the NPfIT, told the House of Commons in February 2008:

“We accept that there have been delays, not only in the roll-out of summary care records, but in the whole NHS IT programme.

“It is important to put on record that those delays were not because of problems with supply, delivery or systems, but pretty much entirely because we took extra time to consult on and try to address record safety and patient confidentiality, and we were absolutely right to do so…

“The health service is moving from being an organisation with fragmented or incomplete information systems to a position where national systems are integrated, record keeping is digital, patients have unprecedented access to their personal health records and health professionals will have the right information at the right time about the right patient.

“As the Health Committee has recognised in its report, the roll-out of new IT systems will save time and money for the NHS and staff, save lives and improve patient care.”

[Even today, 12 years after the launch of the National Programme for IT, the NHS does not have integrated digital records.]

Caroline Flint, then health minister in charge of the NPfIT, told the House of Commons on 6 June 2007:

“… it is lamentable that a programme that is focused on the delivery of safer and more efficient health care in the NHS in England has been politicised and attacked for short-term partisan gain when, in fact, it is to the benefit of everyone using the NHS in England that the programme is provided with the necessary resources and support to achieve the aims that Conservative Members have acknowledged that they agree with…

“Owing to delays in some areas of the programme, far from it being overspent, there is an underspend, which is perhaps unique for a large IT programme.

“The contracts that were ably put in place in 2003 mean that committed payments are not made to suppliers until delivery has been accepted 45 days after “go live” by end-users.

“We have made advance payments to a number of suppliers to provide efficient financing mechanisms for their work in progress. However, it should be noted that the financing risk has remained with the suppliers and that guarantees for any advance payments have been made by the suppliers to the Government…

“The national programme for IT in the NHS has successfully transferred the financing and completion risk to its suppliers…”

Andy Burnham, then Health Secretary, told the House of Commons on 7 December 2009:

“He [Andrew Lansley] seems to reject the benefits of a national system across the NHS, but we do not. We believe that there are significant benefits from a national health service having a programme of IT that can link up clinicians across the system. We further believe that it is safer for patients if their records can be accessed across the system…” [which hasn’t happened].

Abandoned NHS IT plan has cost £10bn so far

Are passport officials hiding IT problems?

By Tony Collins

Are Passport Office systems crashing regularly – for up to half a day – without anyone outside knowing?

Last month a Home Office spokesman told Government Computing that IT was “not to blame” for delays in issuing passports.

But yesterday a Passport Office insider gave the opposite impression to Eddie Mair’s BBC R4 “PM” programme. She said that passport systems are sometimes out of action for up to half a day.

It also emerges that the Passport Office’s contract with one of its contractors, Steria, anticipates peak periods with about the volume of applications that offices are now receiving – around 150,000 a week.

Home Office minister James Brokenshire told the House of Commons yesterday (7 July 2014) that there have been about 4 million applications so far this year, implying that the number is unexpectedly high.

But 4 million applications is not out of line with Steria’s contractual expectations of up to 150,000 applications a week.

This raises the question of whether the delays are due to a combination of IT problems and high numbers of applications – rather than high numbers alone.
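The arithmetic behind that point can be sketched quickly. This is a rough check only, and the 27-week figure (1 January to 7 July 2014) is an assumption not stated in the article:

```python
# Rough check of the article's claim: "about 4 million applications so far
# this year" (as of 7 July 2014) is not out of line with Steria's
# contractual peak of 150,000 applications a week.
# Assumption: roughly 27 weeks elapsed between 1 January and 7 July 2014.
applications_so_far = 4_000_000
weeks_elapsed = 27

average_per_week = applications_so_far / weeks_elapsed
print(f"Average weekly applications: {average_per_week:,.0f}")  # about 148,000

contractual_peak = 150_000
print("Within contractual expectations:", average_per_week <= contractual_peak)
```

On those assumed figures the average weekly intake sits just under the 150,000-a-week peak the contract anticipated – which is the article’s point: volume alone does not obviously explain the delays.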

On the BBC “PM” programme the comments by the Passport Office insider were spoken by an actor.  She said the backlog of passport applications has increased since the government announced emergency measures last month.

“The numbers have increased significantly and they are just the ones in the system. What you need to take into account – and I don’t think people have realised – is that we have another huge backlog of applications that have not been scanned onto the system and are not in process.

“The backlog of applications – what they call “work in progress” – is not a true figure because you still have another backlog that has not been scanned …”

BBC reporter: So there is another set of passports that don’t appear in official figures?

“That’s correct.”

She said that growing backlogs are indicated by figures on boards around the offices, which show what each office is working on and on which dates.

“Her Majesty’s Passport Office is in total crisis. It’s a total mess …  it is chaos.”

BBC reporter:  A member of my family phoned up to try and make an appointment to sort out a passport last week and while she was on the phone she was told the computer kept crashing. Is that a common problem?

“Yes.  Once the system crashes you cannot issue anything or do anything. You just have to sit there until the technicians or the IT experts reboot the system or put a patch in to get it up and running again.”

BBC: How long does that last? How long are the computers and therefore the people handling the passports out of action for?

“Sometimes I would say a minimum of half an hour up to half a day.”

She said that passports are being stored in meeting and conference rooms which are locked. The windows have been “blacked out or covered with paper so no photos can appear, for instance, in the Guardian”.

She does not believe the Home Secretary Theresa May or her ministers have a grip of the situation. “I don’t think they understand. I am not too sure whether it is because they haven’t been fed the correct information or whether they are just putting their heads in the sand.”

The Home Office said a minister was not available to speak to the BBC.  It provided a statement instead – which gave no response to the insider’s claims of computer problems.

The statement said:

“These allegations are false. We receive thousands of applications every week with their numbers constantly changing. We aim to log applications within 48 hours of receipt at which point they become active work in progress…”

Update 18.00 8 July 2014

Paul Pugh, Chief Executive of the Passport Office, appeared before MPs on the Home Affairs Committee this afternoon and was not asked about IT problems and made no mention of any.

He declined to say how many applications have been received by the Passport Office which have not been scanned into the systems. He said the number varied daily. He conveyed a quiet self-confidence which wasn’t in any way dented by committee members who in general did not put him under pressure.

They thanked him repeatedly for dealing with complaints from their constituents.  Committee chairman Keith Vaz handed Pugh 180 emails from people who are facing delays and need a passport urgently.

He denied the Passport Office was in chaos and said the “vast majority” of people working there would disagree with the comments of the anonymous contributor to the BBC “PM” programme.

Comment

The insider’s comments may be relevant given that the Passport Office had serious IT processing difficulties when it changed its main processing systems in 1989 and 1999. Has it had a third IT-related calamity as a result of an upgrade of passport systems in 2013?

US-based supplier CSC helped with the upgrade last year but no information has been released on the change-over.  The new system was installed as part of a $570m services contract the Passport Office awarded CSC in 2009.

Steria manages the front-end of passport application processing. It receives applications from the public, scans them digitally, verifies the contents, checks the scanned documents for accuracy and makes corrections where necessary, and banks payments received. It then passes applications to Her Majesty’s Passport Office to complete the examination.

Steria expected up to 150,000 passport applications a week at peak times and it appears that actual applications have been around this number for much of this year. So why are ministers and officials blaming delays on a record number of applications?

There may be IT problems that nobody in officialdom is mentioning.

The most worrying thing is that they do not have to mention them. Perhaps the cause or causes are complex and they are unsure who bears most of the responsibility.

In politics nobody seems to expect the truth to be told. So it’s likely that ministers and officials will continue to blame the delays in issuing passports on record numbers of applications, and not mention anything to do with IT, except to deny it has anything to do with the backlog.

BBC PM programme (approx 47 minutes into the programme).

How well is new passport IT coping with high demand?

By Tony Collins

In 1989 when the Passport Agency introduced new systems avoidable chaos ensued. A decade later, in 1999, officials introduced a new passport system and avoidable chaos ensued. Jack Straw, the then Home Secretary, apologised to the House of Commons.

Last year HM Passport Office introduced, after delays, a replacement passport system, the Application Management System. It was built with the help of the Passport Office’s main IT supplier CSC under a 10-year £385m contract awarded in 2009.

The Passport Office said at the time the new system was designed “to be easier to use and enable cases to be examined more efficiently”. So how well is the system coping with unusually high demand, given that an objective was to help passport staff deal with applications more efficiently?

The answer is that we don’t know: open government has yet to reach HM Passport Office. It publishes no regular updates on how well it is performing, how many passports it is processing each month or how long it is taking on average to process them. It has published no information on the performance of the Application Management System or how much it has cost.

All we know is that the system was due to be rolled out in 2012 but concerns about how well it would perform after go-live led to the roll-out being delayed a year. Over the past 18 months it has been fully rolled out.

Comment

Has there been a repeat of the IT problems that seriously delayed the processing of applications in 1989 and 1999? In both years, passport officials had inadequate contingency arrangements to cope with a surge in demand, according to National Audit Office reports.

Clearly the same thing has happened for a third time: there have been inadequate contingency arrangements to cope with an unexpectedly high surge in demand.

How is it that the Passport Office can repeatedly build up excessive backlogs without telling anyone? One answer is that there is a structural secrecy about its internal performance.

Despite attempts by Francis Maude and the Cabinet Office to make departments and agencies more open about their performance, the Passport Office is more secretive than ever.

It appears that even the Home Secretary Theresa May was kept in the dark about the latest backlogs. She gave reassuring statistics to the House of Commons about passport applications being processed on time – and only days later conceded there were backlogs.

It’s a familiar story: administrative problems in a government agency are denied until the truth can be hidden no longer because of the number of constituents who are contacting their MPs.

David Cameron said this week that up to 30,000 passport applications may be delayed.

One man who contacted the BBC said he had applied for a passport 7 weeks before he was due to travel. The passport office website said he should get a new passport in 3 weeks. When it had not arrived after 6 weeks he called the passport office and was told he’d be called back within 48 hours. He wasn’t, so he called again and was told the same thing. In the end he lost his holiday.

In 1989 the IT-related disaster was avoidable because managers continued a roll-out even though tests at the Glasgow office had shown it was taking longer to process passport applications on computers than clerically. Backlogs built up and deteriorating relations with staff culminated in industrial action.

In 1999 electronic scanning of passport applications and added security checks imposed by the new systems caused delays and lowered productivity.  Even so a national roll-out continued. Contingency plans were inadequate, said the National Audit Office.

Does the “new” Application Management System slow down processing of applications? We don’t know. The Passport Office is keeping its 2014 statistics to itself.

Decades of observing failures in government administration have taught me that chaos always seems to take officialdom by surprise.

If departments and agencies had to account publicly for their performance on a monthly and not just an annual basis, the public, MPs, ministers and officials themselves, would know when chaos is looming. But openness won’t happen unless the culture of the Passport Office changes.

For the time being its preoccupation seems to be finding whoever published photos of masses of files of passport applications seemingly awaiting processing.

The taking and publication of the photos seems to be regarded as a greater crime than the backlogs themselves.  To discourage such leaks the Passport Office has sent a threatening letter to staff.

But innocuous leaks are an essential part of the democratic process. They help ministers find out what’s going on in their departments and agencies.  Has government administration really come to this?

 

Top 5 posts on this site in last 12 months

Below are the top 5 most viewed posts of 2013. Among other posts, the most viewed include “What exactly is HMRC paying Capgemini billions for?” and “Somerset County Council settles IBM dispute – who wins?”.

1) Big IT suppliers and their Whitehall “hostages”

Mark Thompson is a senior lecturer in information systems at Cambridge Judge Business School, ICT futures advisor to the Cabinet Office and strategy director at consultancy Methods.

Last month he said in a Guardian comment that central government departments are “increasingly being held hostage by a handful of huge, often overseas, suppliers of customised all-or-nothing IT systems”.

Some senior officials are happy to be held captive.

“Unfortunately, hostage and hostage taker have become closely aligned in Stockholm-syndrome fashion.

“Many people in the public sector now design, procure, manage and evaluate these IT systems and ignore the exploitative nature of the relationship,” said Thompson.

The Stockholm syndrome is a psychological phenomenon in which hostages bond with their captors, sometimes to the point of defending them.

This month the Foreign and Commonwealth Office issued a pre-tender notice for Oracle ERP systems. Worth between £250m and £750m, the framework will be open to all central government departments, arms-length bodies and agencies and will replace the current “Prism” contract with Capgemini.

It’s an old-style centralised framework that, says Chris Chant, former Executive Director at the Cabinet Office who was its head of G-Cloud, will have Oracle popping champagne corks.

2) Natwest/RBS – what went wrong?

Outsourcing to India and losing IBM mainframe skills in the process? The failure of CA-7 batch scheduling software which had a knock-on effect on multiple feeder systems?

As RBS continues to try and clear the backlog from last week’s crash during a software upgrade, many in the IT industry are asking how it could have happened.

3) Another Universal Credit leader stands down

Universal Credit’s Programme Director, Hilary Reynolds, has stood down after only four months in post. The Department for Work and Pensions says she has been replaced by the interim head of Universal Credit David Pitchford.

Last month the DWP said Pitchford was temporarily leading Universal Credit following the death of Philip Langsdale at Christmas. In November 2012 the DWP confirmed that the then Programme Director for UC, Malcolm Whitehouse, was stepping down – to be replaced by Hilary Reynolds. Steve Dover,  the DWP’s Corporate Director, Universal Credit Programme Business, has also been replaced.

4) The “best implementation of Cerner Millennium yet”?

Edward Donald, the chief executive of Reading-based Royal Berkshire NHS Foundation Trust, is reported in the trust’s latest published board papers as saying that a Cerner go-live has been relatively successful.

“The Chief Executive emphasised that, despite these challenges, the ‘go-live’ at the Trust had been more successful than in other Cerner Millennium sites.”

A similar, stronger message appeared in a separate board paper which was released under FOI. Royal Berkshire’s EPR [electronic patient record] Executive Governance Committee minutes said:

“… the Committee noted that the Trust’s launch had been considered to be the best implementation of Cerner Millennium yet and that despite staff misgivings, the project was progressing well. This positive message should also be disseminated…”

Royal Berkshire went live in June 2012 with an implementation of Cerner outside the NPfIT.  In mid-2009, the trust signed with University of Pittsburgh Medical Centre to deliver Millennium.

Not everything has gone well – which raises the question, if this was the best Cerner implementation yet, of what the others were like.

5) Universal Credit – the ace up Duncan Smith’s sleeve?

Some people, including those in the know, suspect Universal Credit will be a failed IT-based project, among them Francis Maude. As Cabinet Office minister, Maude is ultimately responsible for the Major Projects Authority which has the job, among other things, of averting major project failures.

But Iain Duncan Smith, the DWP secretary of state, has an ace up his sleeve: the initial go-live of Universal Credit is so limited in scope that claims could be managed by hand, at least in part.

The DWP’s FAQs suggest that Universal Credit will handle, in its first phase due to start in October 2013, only new claims – and only those from the unemployed. Under such a light load the system is unlikely to fail, as any particularly complicated claims could be managed clerically.

 

MP calls for candour after Cerner NPfIT go-live at Croydon

By Tony Collins

Richard Bacon, a long-standing member of the House of Commons’ Public Accounts Committee, has called on Croydon Health Services NHS Trust to be more open about problems it faces after deploying a Cerner Millennium patient records system at the end of September.

The installation was carried out by BT under the London Programme for IT – a branch of the NPfIT.  The Health and Social Care Information Centre, which has taken on BT and CSC contracts under the NPfIT, was the trust’s partner for the Cerner deployment.

Bacon has closely followed the NPfIT and written a chapter on it in his book, “Conundrum: Why every government gets things wrong and what we can do about it” which he co-wrote with Christopher Hope, the Telegraph’s senior political correspondent.

According to fragments of information in Croydon Health Services’ latest board papers, dated 25 November 2013, the trust has faced a series of problems after the NPfIT Cerner go-live.

They included:

– N3 network downtime and waiting time breaches.

– Excessive waits for patients in A&E.

– Going over budget.

– Significant loss of income.

– A bid to recover Cerner costs.

– A need for HSCIC support for delays.

– A need for extra investment in Cerner to “stabilise the operational position”.

The trust has not published any specific report on the implementation’s problems. Now Bacon says it is “unacceptable for any trust not to disclose the problems it faces – and possibly patients face – after a major IT implementation such as Cerner”.

He adds:

“If these implementations go wrong they can affect the safety of patients. We know this from some NPfIT deployments at other trusts. For Croydon to say that board members have been kept informed of the potential risks of the Cerner implementation through the “Corporate Risk and Board Assurance Framework” is not reassuring.

“This is putting a matter of importance in the small print. Indeed, for officials to brief board members on the potential risks, rather than actual events, is also of concern.

“Patients need to know that Croydon takes a duty of candour seriously. If the Trust cannot be open about its IT-related problems, how can we be sure it will be open about anything else to do with patient safety?”

Patient records go-live “success” – or a new NPfIT failure?

By Tony Collins

John Goulston says the go-live of a new patient records system at his trust is a “success”.

He should know. He’s Chief Executive of Croydon Health Services NHS Trust. He’s also chair of the trust’s Informatics Programme Board which has taken charge of bringing Cerner Millennium to Croydon’s community health services and the local University Hospital, formerly the Mayday.

He was formerly Programme Director of the London Programme for IT at NHS London – a branch of the NPfIT.

In a report two weeks ago Goulston said the trust deployed the “largest number of clinical applications in a single implementation in the NHS”. Croydon went live with Cerner Millennium on 30 September and 1 October 2013.

Said Goulston in his report:

“Administrative functions do not engage clinicians; providing them with a suite of clinical functionality has been justified as each weekday approx. 1,000 staff are logged on and using the system. CHS [Croydon Health Services] has in Phase 1 deployed, in addition to patient administration, the largest number of clinical applications in a single implementation in the NHS England.”

BT helped install Millennium at Croydon under the National Programme for IT.  The trust’s spokesman says the Department of Health provided central funding, and the trust paid for implementation “overheads”.  The Health and Social Care Information Centre was the trust’s partner for the go-live.

The Centre is the successor to Connecting for Health. It has taken on CfH’s officials, who continue to help run the NPfIT contracts with BT and CSC.

Goulston said that Cerner and BT have paid tribute to the trust which installed Millennium in A&E, outpatients, secretarial support and cancer services, and elsewhere.

“Our partners Cerner, BT and Ideal have commented that the Trust has undertaken one of the most efficient roll-outs of the system they have worked on, with more users adopting the system more quickly and efficiently than other trusts … the success we have achieved to date is the result of the efforts of every single system user and all staff members,” said Goulston.

Best Cerner implementation yet?

Optimistic remarks about their launch of Cerner Millennium were also made in 2012 by executives at the Royal Berkshire NHS Foundation Trust.  Their optimism proved ill-judged.

Of the Millennium go-live at Royal Berkshire, trust executives said that it “had been considered to be the best implementation of Cerner Millennium yet and that despite staff misgivings, the project was progressing well”.   This positive message should be disseminated, they said.

Months later they told the Reading Chronicle of patient safety issues and a financial crisis arising from the Millennium implementation.

A Royal Berkshire governor, Rebecca Corre, was quoted as saying: “There is a patient safety issue when staff write down observations and then there is an hour before they can get it onto the computer. If it is an experienced nurse, they may pick up a problem, but others may not.”

Ed Donald, Chief Executive of Royal Berkshire was quoted as saying:

“Unfortunately, implementing the EPR [electronic patient record] system has at times been a difficult process and we acknowledge that we did not fully appreciate the challenges and resources required in a number of areas.”

Are executives and managers at Croydon Health Services NHS Trust now similarly afflicted with an unjustified optimism about the success of their Cerner go-live?

Past consequences of NPfIT go-lives hidden?

The Department of Health has claimed benefits for the NPfIT of £3.7bn to March 2012 but there have been trust-wide failures: thousands of patients have had their appointments, care or treatment delayed by difficulties arising from past implementations of patient record systems under the NPfIT.  For thousands of patients waiting time standards have been exceeded or “breached” because of disruption arising from troubled go-lives.

In nearly every case trusts made it difficult for the facts to come out publicly. Vague or unexplained fragments of information about the consequences of the NPfIT implementation appeared in different board papers over several months. The facts only emerged after a journalistic investigation that required scrutiny of many board papers and follow-up questions to the trust’s press office.

So Campaign4Change investigated Croydon Health’s implementation of Cerner Millennium to see if the Francis report’s call for a “duty of candour” over mistakes and problems in the NHS has made any difference to the traditional fragmentation of facts after NPfIT go-lives of patient record systems.

The Francis report called for “openness, transparency and candour“.  Trusts were told not to hide sub-standard practices under the carpet. The health secretary Jeremy Hunt said it can be “disastrous” when bad news does not emerge quickly and the public are kept in the dark about poor care.

To my questions about the Cerner Millennium implementation Croydon trust’s spokesman always responded promptly and tried to be helpful. But it appears that trust executives have given him limited information about consequences of the go-live, and have preferred to indulge the “good news” NHS culture that Jeremy Hunt warned about.

On being asked what problems the trust has faced since the go-live the spokesman gave various answers that made no mention of the problems.

“All of our staff received training on the system, and we are continuing to offer our teams support as it is embedded.”

What of the problems arising from the implementation, and has the board been fully informed?

“Millennium has featured regularly on the Corporate Risk Register presented to each Part 1 Board meeting.   In addition, implementation has received detailed confidential consideration at Part 2 of Board meetings, (which is why you won’t find it in our public board papers).”

Given Francis’s call for duty of candour,  should the trust be more open about its problems?

“The initial roll out for CRS Millennium was introduced over three days at the Trust, with a phased approach.  We did this to ensure the system was working in each department, before introducing it in another area.

“We are monitoring waiting time performance and records management so we can identify any issues if they emerge. The system is still being introduced in some services and when this is completed we will be able to assess the overall programme,” said the spokesman.

Does Croydon’s unwillingness to give in its statements to me any details of problems indicate that the culture of a lack of transparency in the NHS will be hard to change, no matter how many times Jeremy Hunt talks about the need for candour when things go wrong?

The spokesman:

“I’d like to be clear about the Trust’s approach:

  • The Trust board has been cited on the roll out of CRS Millennium and any potential risks throughout the process.  As I previously noted, the board received an update in September.  The board meeting, which will take place on Monday of next week, will receive a further update from the Chief Executive.  The papers from this meeting will be published on our website and the meeting takes place in public;
  • A meeting chaired by the Chief Operating Officer has reviewed any operational matters arising on a daily basis.  This is an internal meeting for clinicians and managers which has informed the implementation process;
  • Patients and visitors to the hospital have been kept fully appraised of the introduction of the system and were made aware that they may experience some delays to the check-in process while staff became familiar with the new computer system;

“These actions would suggest that the Trust has been transparent in its approach.  You are welcome to review the board papers when they are published.”

Serious problems now emerge

Croydon did indeed publish its board papers on 25 November 2013 – which is to its credit because not all NHS trusts publish timely board papers.

But it’s mostly in the small print of various board papers that details emerge of Millennium-related problems. The shortcomings are mentioned as individual items rather than in a single, detailed Cerner Millennium deployment report.  This leaves one to question whether trust directors have an overview of the seriousness of the difficulties arising from its implementation of a new patient records system.

These are some excerpts from deep inside Croydon’s latest board papers:

Breaches in waiting time standards

– “CRS Millennium (Cerner) Deployment -Network downtime – Week 1.  In particular, the significant network downtime in week 1 (BT N3 problem) led to no electronic access to Pathology and Radiology which resulted in longer waits for patients in the Emergency Department (ED) leading to a large number of breaches. This was a BT N3 problem which has been rectified with BT providing CHS with the required scale of N3 access (>600 concurrent users and >1,600 users on any day – which is the largest network usage of any trust in England).”

– “Hospital Based Pathways: The deployment of CRS Millennium was a particular challenge in the month across the multiple service areas within the Directorate of A&E, Surgery and Maternity.

– “Cancer & Core Functions: With the implementation of CRS Millennium, the open pathways part of RTT [referral to treatment – patient waiting times] may fail the standard – validation will be completed after the narrative for this report…”

Excessive waits in A&E

– “The main drivers adversely affecting the performance in the month [October 2013] for A&E were the deployment of CRS Millennium and the commencement of winter pressures due to the seasonality change. A&E 4-Hour Total Time in Department Target: 95.00%. Actual: 91.57%.”

Over budget

“The Trust position as at October is an adverse variance of £4.1m. This is a significant deterioration on the Month 6 position. The movement is mainly due to a significant reduction in income mainly as a result of operating issues caused by the Cerner deployment (£0.9m)…  Actual £14.8 (£14.8)m; Budget £10.7m; Variance £4.1m.”

“Cerner Millennium: Plan YTD [year-to-date] £245,000; Actual YTD £621,000…”

Significant loss in income

“… A new patient administration system was deployed in the Trust on the 30th September and 1st October (Cerner Millennium). The deployment has resulted in significant loss in income in September and October of £1.1m. Trust performance on Activity Planning Assumptions and Key Performance Indicators is substantially worse than plan …”

Extra costs

“Medical £412k and admin £148k agency levels continue to be high due to cover for vacancies, annual leave, sickness and release of staff for Cerner training. The Trust has also incurred additional costs associated with the Cerner deployment (£600k) including overtime payments to administration staff and training costs.”

Bid to recover Cerner costs?

“… The Trust is currently forecasting a deficit position of £17.8m, which is £3.3m off the plan submitted to the NHS Trust Development Authority. This is a £3m movement from the month 6 forecast and is as a result of operational issues caused by the Cerner deployment. The current projected impact is additional costs of £1.7m and a loss in activity of £1.1m. An application is to be made to recover the additional cost/losses relating to the Cerner deployment [of £2.9m] …”

HSCIC support for delays

“Cerner Millennium – Revised implementation date to Sept 2013 (achieved), with resultant additional costs including additional PC requirements of £146k, specialist support services £300k, procurement costs £91k, data cleansing costs £200k.

“Health & Social Care Information Centre (HSCIC) has confirmed support for the delayed implementation will be provided, accounting treatment of support to be confirmed with Department of Health.”

More money to stabilise operational position?

“As a result of operational issues caused by the Cerner deployment, income is significantly reduced in October. The forecast assumes that the Trust will resume normal operating levels from November and that an element of the income lost will be recovered in the latter part of the year. A business case is being submitted to the Trust Board for additional investment in Cerner to stabilise the operational position.

“If there are further operational issues due to the Cerner deployment then this will significantly impact on the year end forecast…”

Over-optimism?

Principal risk – reporting output from Cerner is not accurate or timely. Officer in charge: CEO. Before go-live risk scores: June 2013 – 16; July – 16; Aug – 10; Sept – 10. After go-live risk score (for Oct): 20 [high risk of likelihood and consequences]

Principal risk – operational readiness following the implementation of Cerner. Officer in charge: COO. Before go-live risk score: 15. Post go-live: 20. Risk rating before go-live – Green. After go-live – Red.

Red risks

Corporate Risk Assurance Framework

Nine risks are reported as Red [two of which relate directly to Millennium]:

“… Reporting output from Cerner is not accurate or timely. Data migration was successful. However reliance on external provider as internal knowledge has not yet been fully gained. A data quality dashboard with exception reporting is in place.

“… Operational readiness following the implementation of Cerner CRS Millennium impact conveyed to Trust Development Authority e.g. ED [Emergency Department] reporting and cost overruns”

Risk scores

– Failure of CRS Millennium to deliver anticipated benefits – 12. Officer in charge: CEO

– Reporting output from Cerner is not accurate or timely – 20. Officer in charge: CEO

– Operational readiness following the implementation of Cerner – 20. Officer in charge: COO

Croydon trust’s response to problems

Said John Goulston, Croydon’s CEO, in his latest [November 2013] report to the board of directors:

“The issues being encountered now with CRS Millennium are not due to any lack of integration testing with legacy applications or testing of workflow. They can be attributed to changing from a 25 year old Patient Administration System (Patient Centre) which did not require working in real time, was simple and intuitive to use, easily configurable and flexible to our needs.

“CRS Millennium’s patient administration functions are almost the complete opposite and the language used is new for our staff i.e. conversations, encounters etc. For our staff it has been a big ask for them to step into and up to such a complex application.”

He added: “The benefits of the new system are that each patient will have a single accurate electronic record that can be viewed and kept up-to-date by hospital and community clinical staff. This will eventually mean less time searching for patient notes, missing documentation and duplicating patient information…

“As with any massive change, there are still some challenges to tackle in making the system work effectively for every single user, in a diverse and complex organisation.

“However the success we have achieved to date is the result of the efforts of every single system user and all staff members. I would like to thank all our staff for their hard work in getting the Trust to this important stage.”

The trust spokesman gave me this statement on the problems:

“The Trust board has been given regular reports on the roll out of CRS Millennium and any potential risks throughout the process, not least through its regular reviews of the Corporate Risk and Board Assurance Frameworks.  As I previously noted, the board received a specific update in September.

“As you already know, November’s board meeting received a further update from the Chief Executive. The papers from this meeting were published and the meeting takes place in public; those attending are invited to put forward questions.

“A meeting chaired by the Chief Operating Officer continues to review operational matters.  This is an internal meeting for clinicians and managers which has informed the implementation process;

“Patients and visitors to the hospital have been kept fully appraised of the introduction of the system and were made aware that they may experience some delays to the check-in process while staff became familiar with the new computer system;

“As you highlight from the board report, Cerner & BT noted that ‘the Trust has undertaken one of the most efficient roll-outs of the system they have worked on’. The papers also note some operational challenges as the system was rolled out. These have been addressed as part of the daily meetings I reference above – these are mainly concerned with users familiarising themselves with the system and have been addressed through the support and training staff received.

“In terms of the costs, the introduction of CRS Millennium has been supported by central funding from the Department of Health with the Trust paying the implementation overheads.   These costs are a matter of public record and the Trust publishes annual Accounts as part of its Annual Report.”

Comment

When you go into hospital it’s reassuring to know the directors will be well informed and open about problems that could affect you.

The approach of Croydon Health Services NHS Trust to openness about its problems is not reassuring. It is no better or worse than other trusts that have implemented Cerner’s Millennium. In fact the timely publication of its board papers means it is more open than some.

But it should not require a time-consuming journalistic investigation to establish the consequences for patients of an NPfIT go-live. It has required just such an investigation after the go-live of Millennium at Croydon.

Board directors will not have the time to dig for, and piece together, information about internal problems that could delay patient appointments, treatment and care. They need the unpalatable facts in one place. Croydon Health Services has failed to make it easy for patients or board directors to see what has gone wrong.

NPfIT deployments at other trusts have led, cumulatively, to thousands of patients having their appointments disrupted, waiting longer to be seen than necessary, finding their records unavailable, or being seen with another patient’s records.

In shying away from telling the whole truth, trusts take their cue from the top: the Department of Health has always made it hard to establish facts about anything to do with the NPfIT. Said the Public Accounts Committee in its report The National Programme for IT in the NHS: an update on the delivery of detailed care records systems in July 2011:

 “It is unacceptable that the Department [of Health] has neglected its duty to provide timely and reliable information to make possible Parliament’s scrutiny of this project.

“Basic information provided by the Department to the National Audit Office was late, inconsistent and contradictory.”

Unanswered questions

Croydon has questions to answer, such as how many breaches of waiting time standards it has had, and may still be having, due to problems arising from the go-live. Other unanswered questions:

– What does “a large number of breaches” in the Emergency Department mean? Has each of the patients affected been told?

– Why are the risks related to the implementation much higher after go-live than before, given that the trust has had years to prepare for the go-live, and the many lessons it could have learned from other trusts?

– Exactly what problems are still affecting patients?

In a post-Francis NHS, Jeremy Hunt has demanded openness about mistakes and problems. There is an agreed need for change – but how can Hunt change an NHS culture – indeed a public sector culture – in which senior executives, in troubled IT implementations, will always emphasise the good news over the bad, perhaps hoping the bad will always remain hidden?

Trust spends £16.6m on consultants for Cerner EPR

By Tony Collins

Reading-based Royal Berkshire NHS Foundation Trust says in an FOI response that its spending on “computer consultants since the inception of the EPR system is £16.6m”.

The Trust’s total spend on the Cerner Millennium system was said to have been £30m by October 2012.

NHS IT suppliers have told me that the typical cost of a Trust-wide EPR [electronic patient record] system, including support for five years, is about £6m-£8m, which suggests that the Royal Berkshire has spent £22m more than necessary on new patient record IT.

Jonathan Isaby, Taxpayers’ Alliance political director, said: “This is an astonishing amount of taxpayers’ money to have squandered on a system which is evidently failing to deliver results.

“Every pound lost to this project is a pound less available for frontline medical care. Those who were responsible for the failure must be held to account for their actions as this kind of waste cannot go unchecked.”

 The £16.6m consultancy figure was uncovered this week through a Freedom of Information request made by The Reading Chronicle. It had asked for the spend on consultants working on the Cerner Millennium EPR [which went live later than expected in June 2012].

The Trust replied: “Further to your request for information the costs spent on computer consultants since the inception of the EPR system is £16.6m.”

The Chronicle says that the system is “meant to retrieve patient details in seconds, linking them to the availability of surgeons, beds or therapies, but has forced staff to spend up to 15 minutes navigating through multiple screens to book one routine appointment, leading to backlogs on wards and outpatient clinics”.

Royal Berkshire’s chief executive Edward Donald had said the Cerner Millennium go-live was successful. A trust board paper said:

 “The Chief Executive emphasised that, despite these challenges, the ‘go-live’ at the Trust had been more successful than in other Cerner Millennium sites.”

A similar, stronger message appeared in a separate board paper released under FOI. Royal Berkshire’s EPR [electronic patient record] Executive Governance Committee minutes said:

“… the Committee noted that the Trust’s launch had been considered to be the best implementation of Cerner Millennium yet and that despite staff misgivings, the project was progressing well. This positive message should also be disseminated…”

Comment

Royal Berkshire went outside the NPfIT. But its costs are even higher than the breathtakingly high costs to the taxpayer of NPfIT Cerner and Lorenzo implementations.

As senior officials at the Department of Health have been so careless with public funds over NHS IT – and have spent millions on the same sets of consultants – they are in no position to admonish Royal Berkshire.

So who can criticise Royal Berkshire and should its chief executive be held accountable?

When it’s official policy to spend tens of millions on EPRs that may or may not make things better for hospitals and patients – and could make things much worse – how can accountability play any part in the purchase of the systems and consultants?

The enormously costly Cerner and Lorenzo EPR implementations go on – in an NHS IT world that is largely without credible supervision, control, accountability or regulation.

Cash squandered on IT help

Trust loses £18m on IT system

The best implementation of Cerner Millennium yet?

Firecontrol disaster and NPfIT – two of a kind?

By Tony Collins

Today’s report of the Public Accounts Committee on the Firecontrol project could, in many ways, be a report on the consequences of the failure of the National Programme for IT in the NHS in a few years’ time.

The Firecontrol project was built along similar lines to the NPfIT but on a smaller scale.

With Firecontrol, Whitehall officials wanted to persuade England’s semi-autonomous 46 local fire authorities to take a centrally-bought  IT system while simplifying and unifying their local working practices to adapt to the new technology.

NPfIT followed the same principle on a bigger scale: Whitehall officials wanted to persuade thousands of semi-autonomous NHS organisations to adopt centrally-bought technologies. But persuasion didn’t work, in either the fire services or the NHS.

More similarities

The Department for Communities and Local Government told the PAC that the Firecontrol system was “over-specified” – that it was unnecessary to have back-up to an incident from a fire authority on the other side of the country.

Many in the NHS said that NPfIT was over-specified. The gold-plated trimmings, and elaborate attempts at standardisation,  made the patient record systems unnecessarily complicated and costly – and too difficult to deliver in practice.

As with the NPfIT, the Firecontrol system was delayed and local staff  had little or no confidence it would ever work, just as the NHS had little or no faith that NPfIT systems would ever work.

Both projects failed. Firecontrol wasted at least £482m. The Department of Communities and Local Government cancelled it in 2010. The Department of Health announced in 2011 that the NPfIT was being dismantled but the contracts with CSC and BT could not be cancelled and the programme is dragging on.

Now the NHS is buying its own local systems that may or may not be interoperable. [Particularly for the long-term sick, especially those who have to go to different specialist centres, it’s important that full and up-to-date medical records go wherever the patients are treated; at the moment they don’t, which increases the risk of mistakes.]

Today’s Firecontrol report expresses concern about a new – local – approach to fire services IT. Will the local fire authorities now end up with a multitude of risky local systems, some of which don’t work properly, and are all incompatible, in other words don’t talk to each other?

This may be exactly the concern of a post-2015 government about NHS IT. With the NPfIT slowly dying NHS trusts are buying their own systems. The coalition wants them to interoperate, but will they?  

Could a post-2015 government introduce a new (and probably disastrous) national NHS IT project – son of NPfIT – and justify it by drawing attention to how very different it is to the original NPfIT, e.g. that this time the programme has the buy-in of clinicians?

The warning signs are there, in the PAC’s report on Firecontrol. The report says there are delays on some local IT projects being implemented in fire authorities, and the systems may not be interoperable. The PAC has

“serious concerns that there are insufficient skills across all fire authorities to ensure that 22 separate local projects can be procured and delivered efficiently in so far as they involve new IT systems”.

National to local – but one extreme to the other?

The PAC report continues:

“There are risks to value for money from multiple local projects. Each of the 22 local projects is now procuring the services and systems they need separately.

“Local teams need to have the right skills to get good deals from suppliers and to monitor contracts effectively. We were sceptical that all the teams had the appropriate procurement and IT skills to secure good value for money.

“National support and coordination can help ensure systems are compatible and fire and rescue authorities learn from each other, but the Department has largely devolved these roles to the individual fire and rescue authorities.

“There is a risk that the Department has swung from an overly prescriptive national approach to one that provides insufficient national oversight and coordination and fails to meet national needs or achieve economies of scale.”

Comment

PAC reports are meant to be critical but perhaps the report on Firecontrol could have been a little more positive about the new local approach that has the overwhelming support of the individual fire and rescue authorities.  

Indeed the PAC quotes fire service officials as saying that the local approach is “producing more capability than was expected from the original FiReControl project”. And at a fraction of the cost of Firecontrol.

But the PAC’s Firecontrol Update Report expresses concern that

– projected savings from the local approach are now less than originally predicted

– seven of the 22 projects are running late and two of these projects have slipped by 12 months

– “We have repeatedly seen failures in project management and are concerned that the skills needed for IT procurement may not be present within the individual fire and rescue authorities, some of which have small management teams,” says the PAC.

On the other hand …

The shortfall in projected savings is small – £124m against £126m – and all the local programmes are expected to be delivered by March 2015, only three months later than originally planned.

And, as the PAC says, the Department for Communities and Local Government has told MPs that a central peer review team is in place to help share good practice – mainly made up of members of fire and rescue authorities themselves.

In addition, part of the £82m of grant funding to local fire services has been used by some authorities to buy in procurement expertise.

Whether it is absolutely necessary – and worth the expense – for IT in fire services to link up is open to question; it is perhaps only necessary in a national emergency.

In the NHS it is absolutely necessary for the medical records of the chronically sick to link up – but that does not justify a son-of-NPfIT programme. Linking can be done cheaply by using existing records and having, say, regional servers pull together records from individual hospitals and other sites.

Perhaps the key lesson from the Firecontrol and the NPfIT projects is that large private companies can force their staff to use unified IT systems whereas Whitehall cannot force semi-autonomous public sector organisations to use whatever IT is bought centrally.

It’s right that the fire services are buying local IT and it’s right that the NHS is now too. If the will is there to do it cheaply, linking up the IT in the NHS can be done without huge central administrative edifices.

Lessons from FireControl (and NPfIT?) 

The National Audit Office identifies these main lessons from the failure of Firecontrol:

– Imposing a single national approach on locally accountable fire and rescue authorities that were reluctant to change how they operated

– Launching the programme too quickly without applying basic project approval checks and balances

– Over-optimism on the deliverability of the IT solution

– Issues with project management, including consultants who made up half of the management team and were not effectively managed

MP Margaret Hodge, chair of the Public Accounts Committee, today sums up the state of Firecontrol:

“The original FiReControl project was one of the worst cases of project failure we have seen and wasted at least £482 million of taxpayers’ money.

“Three years after the project was cancelled, the DCLG still hasn’t decided what it is going to do with many of the specially designed, high-specification facilities and buildings which had been built. Four of the nine regional control centres are still empty and look likely to remain so.

“The Department has now provided fire and rescue authorities with an additional £82 million to implement a new approach based on 22 separate and locally-led projects.

“The new programme has already slipped by three months and projected savings are now less than originally predicted. Seven of the 22 projects are reportedly running late and two have been delayed by 12 months. We are therefore sceptical that projected savings, benefits and timescales will be achieved.

“Relying on multiple local projects risks value for money. We are not confident that local teams have the right IT and procurement skills to get good deals from suppliers and to monitor contracts effectively.

“There is a risk that the DCLG has swung from an overly prescriptive national approach to one that does not provide enough national oversight and coordination and fails to meet national needs or achieve economies of scale.

 “We want the Department to explain to us how individual fire and rescue authorities with varied degrees of local engagement and collaboration can provide the needed level of interoperability and resilience.

“Devolving decision-making and delivery to local bodies does not remove the duty on the Department to account for value for money. It needs to ensure that national objectives, such as the collaboration needed between fire authorities to deal with national disasters and challenges, are achieved.”

Why weren’t NPfIT projects cancelled?

 NPfIT contracts included commitments that the Department of Health and the NHS allegedly did not keep, which weakened their legal position; and some DH officials did not really want to cancel the NPfIT contracts (indeed senior officials at NHS England seem to be trying to keep NPfIT projects alive through the Health and Social Care Information Centre which is responsible for the local service provider contracts with BT and CSC).

PAC report on Firecontrol

What Firecontrol and the NPfIT have in common (2011)

Did officials exaggerate death of the NPfIT?

By Tony Collins

In 2011 the Department of Health made a major announcement that implied the NHS IT programme, the NPfIT, was dead when it wasn’t.

The DH’s press release announced an “acceleration of the dismantling of the National Programme for IT, following the conclusions of a new review by the Cabinet Office’s Major Projects Authority”.

It said the Authority had concluded that the NPfIT was “not fit to provide the modern IT services that the NHS needs…” The national media took the press release to mean that the NPfIT was dead.

What the announcement didn’t mention was that at least £1.1bn had still to be spent, largely with CSC, provided that the company successfully completed all the work set out in its revised contracts, and that the projected end-of-life of some centrally-chosen NHS IT systems was 2024.

Some will say: who cares if the DH issues a press release that is misleading? Others may say that in a democracy one should be able to trust institutions of state. If the DH issues an official notice that has the effect of manipulating public perceptions – that gives a false impression – can citizens trust the Department’s other official notices?

The press release in question did not say the NPfIT was closing but gave that impression. The announcement distanced the government and the Department of Health from an IT scheme, perhaps the world’s largest non-military IT programme, that was failing. This was the press release:

“The government today announced an acceleration of the dismantling of the National Programme for IT, following the conclusions of a new review by the Cabinet Office’s Major Projects Authority (MPA). The programme was created in 2002 under the last government and the MPA has concluded that it is not fit to provide the modern IT services that the NHS needs…”

The press release was given added weight by those quoted in it. They included the Department of Health, Francis Maude, Minister for the Cabinet Office and Sir David Nicholson, Chief Executive of the NHS.

But the truth about the press release emerged this week at a hearing of the Public Accounts Committee.

Margaret Hodge, chair of the Public Accounts Committee, began a hearing on the NPfIT on Wednesday by asking Sir David Nicholson, the NHS chief, a canny question.

Hodge:  “There was a big announcement back in 2011 that you were closing the NPfIT programme.”

“Yes,” replied Sir David.

“That’s not true,” said Hodge. “It was a PR exercise to say you closed it.”

Nicholson: “It certainly was not a PR exercise.”

Hodge: “What changed?”

Nicholson: “The governance arrangements changed.  So there are separate senior responsible officers for each of the individual programmes [within the NPfIT].”

Hodge: “With the greatest respect, changing governance arrangements is not closing the programme… I think the impression you were trying to give was that you were closing the programme. All you were doing was shifting the deckchairs on the Titanic. You were shifting the way you were running it but you were keeping all that expenditure running… The impression given to the public was that you were going to get out of some of these contracts.”

On the basis of the press release the Daily Mail published a front page lead story with this headline:

£12bn NHS computer system is scrapped… and it’s all YOUR money that Labour poured down the drain

On the day of the press release the Daily Telegraph reported that the £11.4bn NHS IT programme was “to be abandoned”.  Similar reports appeared in the trade press.

But this week’s Public Accounts Committee heard that the NPfIT is very much alive:

– the estimated worth of CSC’s contracts under the NPfIT has risen from £3.1bn to £3.8bn at today’s prices.

–  officials expected to pay CSC a further £1.1bn on top of the £1.1bn it has already received, and this payment may include up to £600m for Lorenzo deployments at only 22 trusts. Hodge said: “You are going to spend another half a billion with this rotten company providing a hopeless system” – to which the DH argues that CSC has delivered thousands of (non-Lorenzo) working systems to the NHS which trusts and community health services rely on.

– About £500m of the £1.1bn still set aside for CSC will go on GP systems supplied by CSC’s subcontractor TPP SystmOne.

– Further spending on the NPfIT may come as a result of Fujitsu’s legal action against the DH after it left the NPfIT in 2008, which leaves the taxpayer with a potential pay-out of £700m or more. The outcome of a formal arbitration is expected in about six months. The closing arguments are due at the end of this month.

– £31.5m has so far been spent on the DH’s legal costs in the Fujitsu case, mostly with the law firm DLA Piper.

– DH has agreed a compensation payment to CSC of £100m. In return CSC has released the Department of Health from a contractual commitment for 160 NHS trusts to take the Lorenzo system. The DH has made a further payment to CSC of £10m in recognition of changes to its software which had been requested by the NHS but not formally agreed with CSC.

Comment

It appears there has been no deliberate deception and no deliberate manipulation of public perceptions of the NPfIT. But the fact remains that the DH made a major announcement in 2011 which gave the impression the NPfIT was dead when this was not true.

When a BBC Radio 4 journalist called me this week and we spoke briefly about the NPfIT he said: “I thought it was dead”.

Perhaps the mindset of officials was that the NPfIT was dead because everyone except the suppliers wanted it to be. But because local service provider contracts had to stay in place – the suppliers being much better equipped than the DH to handle any disputes over early termination – large payments to CSC and BT had to continue.

It’s a little like the political row over weapons of mass destruction in Iraq. It’s unlikely Blair lied over the existence of WMD. He probably convinced himself they existed. In a similar act of self-delusion officials appear to have convinced themselves the NPfIT was dead although it wasn’t.

But if we cannot believe a major DH announcement one starts to ask whether any of the department’s major announcements can be believed.

Uncoloured information on the NPfIT has always been hard to come by. So credit is due to the Public Accounts Committee, and particularly to Richard Bacon MP, for finding out so much about the NPfIT. All credit to Margaret Hodge for picking up on Bacon’s concerns. Were it not for the committee, with indispensable support from the National Audit Office, the DH would have been a sieve allowing only the bits of information it wanted to release to pass through.

The fall-out from the NPfIT will continue for years. We still don’t know, for example, what all the trusts with BT and CSC systems will do when the NPfIT contracts expire in the next three years. The hope is for transparency – and not of the sort characterised by the DH’s announcement in 2011 of the NPfIT’s dismantling.

This post also appears on ComputerworldUK

How to cost-justify the NPfIT disaster – forecast benefits a decade away

By Tony Collins

To Jeremy Hunt, the Health Secretary, the NPfIT was a failure. In an interview with the FT, reported on 2 June 2013, Hunt said of the NPfIT

“It was a huge disaster . . . It was a project that was so huge in its conception but it got more and more specified and over-specified and in the end became impossible to deliver … But we mustn’t let that blind us to the opportunities of technology and I think one of my jobs as health secretary is to say, look, we must learn from that and move on but we must not be scared of technology as a result.”

Now Hunt has a different approach.  “I’m not signing any big contracts from behind [my] desk; I am encouraging hospitals and clinical commissioning groups and GP practices to make their own investments in technology at the grassroots level.”

Hunt’s indictment of the NPfIT has never been accepted by some senior officials at the DH, particularly Sir David Nicholson, the outgoing chief executive of the NHS. Indeed the DH is now making strenuous attempts to cost-justify the NPfIT, in part by forecasting benefits for aspects of the programme to 2024.

The DH has not published its statement attempting to cost-justify the NPfIT. But the National Audit Office yesterday published its analysis of that unpublished statement. The NAO’s analysis, “Review of the final benefits statement for programmes previously managed under the National Programme for IT in the NHS”, was written for the Public Accounts Committee, which meets next week to question officials on the NPfIT.

A 22-year programme?

When Tony Blair gave the NPfIT a provisional go-ahead at a meeting in Downing Street in 2002, the programme was due to last less than three years, finishing by the time of the general election of 2005. Now the NPfIT turns out to be a programme lasting up to 22 years.

Yesterday’s NAO report says the end-of-life of the North, Midlands and East of England part of the NPfIT is 2024. Says the NAO:

“There is, however, very considerable uncertainty around whether the forecast benefits will be realised, not least because the end-of-life dates for the various systems extend many years into the future, to 2024 in the case of the North, Midlands and East Programme for IT.”

The DH puts the benefits of the NPfIT at £3.7bn to March 2012 – against costs of £7.3bn to March 2012.

Never mind: the DH has estimated the forecast benefits to the end-of-life of the systems at £10.7bn. This is against forecast costs of £9.8bn to the end-of-life of the systems.

The forecast end-of-life dates are between 2016 and 2024. The estimated costs of the NPfIT do not include any settlement with Fujitsu over its £700m claim against NHS Connecting for Health. The forecast costs (and potential benefits) also exclude the patient administration system Lorenzo because of uncertainties over the CSC contract.

The NAO’s auditors raise their eyebrows at forecasting of benefits so far into the future. Says the NAO report:

“It is clear there is very considerable uncertainty around the benefits figures reported in the benefits statement. This arises largely because most of the benefits relate to future periods and have not yet been realised. Overall £7bn (65 per cent) of the total estimated benefits are forecast to arise after March 2012, and the proportion varies considerably across the individual programmes depending on their maturity.

“For three programmes, nearly all (98 per cent) of the total estimated benefits were still to be realised at March 2012, and for a fourth programme 86 per cent of benefits remained to be realised.

“There are considerable potential risks to the realisation of future benefits, for example systems may not be deployed as planned, meaning that benefits may be realised later than expected or may not be realised at all…”
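The NAO’s 65 per cent figure can be reconciled from the rounded numbers quoted above. A minimal sketch (using the £bn figures as stated, not the NAO’s unrounded workings):

```python
# Rough reconciliation of the rounded figures quoted above (all in £bn).
total_forecast_benefits = 10.7   # DH forecast to end-of-life of the systems
realised_to_march_2012 = 3.7     # DH estimate of benefits realised by March 2012

still_to_come = total_forecast_benefits - realised_to_march_2012
share_unrealised = still_to_come / total_forecast_benefits

print(f"Benefits still to be realised: £{still_to_come:.1f}bn "
      f"({share_unrealised:.0%} of the total)")
# → Benefits still to be realised: £7.0bn (65% of the total)
```

On these rounded figures, £7bn of the £10.7bn forecast benefits – roughly 65 per cent – had yet to materialise at March 2012, matching the proportion the NAO reports.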

NPfIT is not dead

The report also reveals that the DH considers the NPfIT to be far from dead. Says the NAO:

“From April 2013, the Department [of Health] appointed a full-time senior responsible owner accountable for the delivery of the [the NPfIT] local service provider contracts for care records systems in London, the South and the North, Midlands and East, and for planning and managing the major change programme that will result from these contracts ending.

“The senior responsible owner is supported by a local service provider programme director in the Health and Social Care Information Centre.

“In addition, from April 2013, chief executives of NHS trusts and NHS foundation trusts became responsible for the realisation and reporting of benefits on the ground. They will also be responsible for developing local business cases for the procurement of replacement systems ready for when the local service provider contracts end.”

The NAO has allowed the DH to count as benefits of the NPfIT parts of the programme that were not in the original plan, such as PACS X-ray imaging systems.

Officials have also counted as benefits quicker diagnosis from the Summary Care Record, and text appointment reminders sent via NHSmail, which the DH says reduce the number of missed appointments by between 30 and 50 per cent.

Comment

One of the most remarkable things about the NPfIT is the way benefits have always been – and still are – referred to in the future tense. Since the NPfIT was announced in 2002, numerous ministerial statements, DH press releases and conference announcements have all referred to what will happen with the NPfIT.

Back in June 2002, the document that launched the NPfIT, Delivering 21st Century IT for the NHS, said:

“We will quickly develop the infrastructure …”

“In 2002/03 we will seek to accelerate the pace of development …

“Phase 1 – April 2003 to December 2005 …Full National Health Record Service implemented, and accessible nationally for out of hours reference.”

In terms of the language used little has changed. Yesterday’s NAO report is evidence that the DH is still saying that the bulk of the benefits will come in future.

Next week (12 June) NHS chief Sir David Nicholson is due to appear before the Public Accounts Committee to answer questions on the NPfIT. One thing is not in doubt: he will not concede that the programme has been a failure.

Neither will he concede that only a fraction of the £7.3bn spent on the programme up to March 2012 would have been needed to join up existing health records, to the untold benefit of patients, especially those with complex and long-term conditions.

Isn’t it time MPs called the DH to account for living in cloud cuckoo land? Perhaps those at the DH who are still predicting the benefits of the NPfIT into the distant future should be named.

They might just as well have predicted, with no less credibility, that in 2022 the bulk of the NPfIT’s benefits would be delivered by the Flower Fairies.

It is a nonsense that the DH is permitted to waste time on this latest cost justification of the NPfIT. Indeed it is a continued waste of money for chief executives of NHS trusts and NHS foundation trusts to have been made responsible, as of April 2013, for reporting the benefits of the NPfIT.

Jeremy Hunt sums up the NPfIT when he says it has been a huge disaster. It is the UK’s biggest-ever IT disaster. Why does officialdom not accept this?

Instead of wasting more money delving into the haystack for benefits of the NPfIT, it would be more sensible to allocate money and people to spreading the word, within Whitehall and the wider public sector, about the losses of the NPfIT and the lessons that must be learnt, to discourage any future administration from embarking on a multi-billion pound folly.