Category Archives: CIO

Companies nervous over HMRC customs IT deadline?

By Tony Collins

This Computer Weekly article in 1994 was about the much-delayed customs system CHIEF. Will its CDS replacement that’s being built for the post-Brexit customs regime also be delayed by years?

The Financial Times reported this week that UK companies are nervous over a deadline next year for the introduction of a new customs system three months before Brexit.

HMRC’s existing customs system CHIEF (Customs Handling of Import Export Freight) copes well with about 100 million transactions a year. It’s expected that a £157m replacement system using software from IBM and European Dynamics will have to handle about 255 million transactions, with many more complexities and interdependencies than the existing system.

If the new system fails post-Brexit and CHIEF cannot be adapted to cope, it could be disastrous for companies that import and export freight. A post-Brexit failure could also have a serious impact on the UK economy and the collection of billions of pounds in VAT, according to the National Audit Office.

The FT quoted me on Monday as calling for an independent review of the new customs system by an outside body.

I told the FT of my concern that officials will, at times, tell ministers what they want to hear. Only a fully independent review of the new customs system (as opposed to a comfortable internal review conducted by the Infrastructure and Projects Authority) would stand a chance of revealing whether the new customs system was likely to work on time and whether smaller and medium-sized companies handling freight had been adequately consulted and would be able to integrate the new system into their own technology.

The National Audit Office reported last year that HMRC has a well-established forum for engaging with some stakeholders but has

“significant gaps in its knowledge of important groups. In particular it needs to know more about the number and needs of the smaller and less established traders who might be affected by the customs changes for the first time”.

The National Audit Office said that the new system will need to cope with 180,000 new traders who will use the system for the first time after Brexit, in addition to the 141,000 traders who currently make customs declarations for trade outside the EU.

The introduction in 1994 of CHIEF was labelled a disaster at the time by some traders, in part because it was designed and developed without their close involvement. CHIEF was eventually accepted and is now much liked – though it’s 24 years old.

Involve end-users – or risk failure

Lack of involvement of prospective end-users is a common factor in government IT disasters. It happened on the Universal Credit IT programme, which turned out to be a failure in its early years, and on the £10bn National Programme for IT which was dismantled in 2010. Billions of pounds were wasted.

The FT quoted me as saying that the chances of the new customs system CDS (Customs Declaration Service) doing all the things that traders need it to do from day one are almost nil.

The FT quotes one trader as saying,

“HMRC is introducing a massive new programme at what is already a critical time. It would be a complex undertaking at the best of times but proceeding with it at this very moment feels like a high stakes gamble.”

HMRC has been preparing to replace CHIEF with CDS since 2013. Its civil servants say that the use of the SAFe agile methodology, combined with the skills and capabilities of its staff, means that programme risks and issues will be effectively managed.

But, like other government departments, HMRC does not publish its reports on the state of major IT-related projects and programmes. One risk, then, is that ministers may not know the full truth until a disaster is imminent.

In the meantime ministerial confidence is likely to remain high.

Learning from past mistakes?

HMRC has a mixed record on learning from past failures of big government IT-based projects. Taking some of the lessons from “Crash”, these are the best things about the new customs project:

  • It’s designed to be simple to use – a rarity for a government IT system. Last year HMRC reduced the number of system features it plans to implement from 968 to 519. It considered that there were many duplicated and redundant features listed in its programme backlog.
  • The SAFe agile methodology HMRC is using is supposed to help organisations implement large-scale, business-critical systems in the shortest possible time.
  • HMRC is directly managing the technical development and is carrying out this work using its own resources, independent contractors and the resources of its government technology company, RCDTS. Last year it had about 200 people working on the IT programme.

These are the potentially bad things:

  • It’s not HMRC’s fault but it doesn’t know how much work is going to be involved because talks over the post-Brexit customs regime are ongoing.
  • It’s accepted in IT project management that a big bang go-live is not a good idea. The new Customs Declaration Service is due to go live in January 2019, three months before Britain is due to leave the EU. The CHIEF system was commissioned from BT in 1989 and its scheduled go-live was delayed by two years. Could CDS be delayed by two years as well? In pre-live trials CHIEF rejected hundreds of test customs declarations for no obvious reason.
  • The new service will use, at its core, commercially available software (from IBM) to manage customs declarations and software (from European Dynamics) to calculate tariffs. The use of software packages is a good idea – but not if they need large-scale modification. Tampering with proven packages is a much riskier strategy than developing software from scratch. The new system will need to integrate with other HMRC systems and a range of third-party systems. It will need to provide information to 85 systems across 26 other government bodies.
  • If a software package works well in another country it almost certainly won’t work when deployed by the UK government. Core software in the new system uses a customs declaration management component that works well in the Netherlands, but there it is not integrated with other systems, as it would need to be at HMRC, and it handles only 14 million declarations a year.
  • The IBM component has been tested in laboratory conditions to cope with 180 million declarations, but the UK may need to process 255 million declarations each year.
  • Testing software in laboratory conditions will give you little idea of whether it will work in the field. This was one of the costly lessons from the NHS IT programme NPfIT.
  • The National Audit Office said in a report last year that HMRC’s contingency plans were under-developed and that there were “significant gaps in staff resources”.
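
To put the volumes quoted above in perspective: 255 million declarations a year is roughly eight per second on average, and real traffic is far peakier than that. A back-of-the-envelope sketch – the 3x peak factor is purely an illustrative assumption, not an HMRC figure:

```python
# Rough load estimate for the declaration volumes quoted above.
# The 3x peak-to-average factor is an illustrative assumption.
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000

def avg_rate(declarations_per_year: int) -> float:
    """Average declarations per second, assuming uniform arrival."""
    return declarations_per_year / SECONDS_PER_YEAR

tested = avg_rate(180_000_000)   # volume tested in laboratory conditions
needed = avg_rate(255_000_000)   # estimated post-Brexit volume

print(f"tested: {tested:.1f}/s, needed: {needed:.1f}/s")
print(f"assumed peak (3x average): {needed * 3:.0f}/s")
```

Even on averages the required rate exceeds the lab-tested one by about 40 per cent, before any allowance for daily or seasonal peaks.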

Comment

HMRC has an impressive new CIO, Jacky Wright, but whether she will have the freedom to work within Whitehall’s restrictive practices is uncertain. It seems that the more talented the CIO, the more they’re made to feel like an outsider by senior civil servants who haven’t worked in the private sector. It’s a pity that the best CIOs don’t usually last long in Whitehall.

Meanwhile HMRC’s top civil servants and IT specialists seem to be confident that CDS, the new customs system, will work on time. Their confidence is not reassuring. Ministers and civil servants publicly and repeatedly expressed confidence that Universal Credit would be fully rolled out by the end of 2017. Now it’s running five years late. The NHS IT programme NPfIT was to have been rolled out by 2015. By 2010 it was dismantled as hopeless.

With some important exceptions, Whitehall’s track record on IT-related projects is poor – and that’s when what is needed is known. Brexit is still being negotiated. How can anyone build a new bridge when they’re not sure how long it’ll need to be and what the many and varied external stresses will be?

If the new or existing systems cannot cope with customs declarations after Brexit it may not be the fault of HMRC. But that’ll be little comfort for the hundreds of thousands of traders whose businesses rely, in part, on a speedy and efficient customs service.

FT article – UK companies nervous over deadline for new Customs system


NHS “Wachter” digital review is delayed – but does it matter?

By Tony Collins

The Wachter review of NHS technology was due to be published in June but has been delayed. Would it matter if it were delayed indefinitely?

A “Yes Minister” programme about a new hospital in North London said it all, perhaps. An enthusiastic NHS official shows the minister round a hospital staffed with 500 administrators. It has the latest technology on the wards.

“It’s one of the best run hospitals in the country,” the NHS official tells the minister, adding that it’s up for the Florence Nightingale award for the standards of hygiene.

“But it has no patients,” says the minister.

Another health official tells the minister,

“First of all, you have to sort out the smooth running of the hospital. Having patients around would be no help at all.” They would just be in the way, adds Sir Humphrey.

In the Wachter review’s terms of reference (“Making IT work: harnessing the power of health IT to improve care in England”) there is a final bullet point that refers, obliquely, to a need to consider patients. Could the Wachter terms of reference have been written by a satirist who wanted to show how it was possible to have a review of NHS IT for the benefit of suppliers, clinical administrators and officialdom but not patients?

The Wachter team will, according to the government,

• Review and articulate the factors impacting the successful adoption of health information systems in secondary and tertiary care in England, drawing relevant comparisons with the US experience;

• Provide a set of recommendations drawing on the key challenges, priorities and opportunities for the health and social care system in England. These recommendations will cover both the high-level features of implementations and the best ways in which to engage clinicians in the adoption and use of such systems.

In making recommendations, the board will consider the following points:

• The experiences of clinicians and Trust leadership teams in the planning, implementation and adoption of digital systems and standards;

• The current capacity and capability of Trusts in understanding and commissioning of health IT systems and workflow/process changes.

• The current experiences of a number of Trusts using different systems and at different points in the adoption lifecycle;

• The impact and potential of digital systems on clinical workflows and on the relationship between patients and their clinicians and carers.

Yes, there’s the mention of “patients” in the final bullet point.

Existing systems?

Some major IT companies have, for decades, lobbied – often successfully – for much more public investment in NHS technology. Arguably that is not the priority, which is to get existing systems to talk to each other – which would be for the direct benefit of patients whose records do not follow them wherever they are looked at or treated within the NHS.

Unless care and treatment is at a single hospital, the chances of medical records following a patient around different sites, even within the same locality, are slim.

Should a joining up of existing systems be the main single objective for NHS IT? One hospital consultant told me several years ago – and his comment is as relevant today –

“My daughter was under treatment from several consultants and I could never get a joined-up picture. I had to maintain a paper record myself just to get a joined-up picture of what was going on with her treatment.”

Typically one patient will have multiple sets of paper records. Within one hospital, different specialities will keep their own notes. Fall over and break your leg and you have a set of orthopaedic notes; have a baby and you will have a totally different set of notes. Those two sets are rarely joined up.

One clinician told me, “I have never heard a coroner say that a patient died because too much information was shared.”

And a technology specialist who has multiple health problems told me,

“I have different doctors in different places not knowing what each other is doing to me.”

As part of wider research into medical records, I asked a hospital consultant in a large city with three major hospitals whether records were shared at least locally.

“You must be joking. We have three acute hospitals. Three community intermediate teams are in the community. Their records are not joined. There is one private hospital provider. If you get admitted to [one] hospital and then get admitted to [another] the next week your electronic records cannot be seen by the first hospital.  Then if you get admitted to the third hospital the week after, again not under any circumstances will your record be able to be viewed.”

Blood tests have to be repeated, as are x-rays; but despite these sorts of stories of a disjointed NHS, senior health officials, in the countless NHS IT reviews there have been over 30 years, will, it seems, still put the simplest ideas last.

It would not cost much – some estimate less than £100m – to provide secure access to existing medical records from wherever they need to be accessed.

No need for a massive investment in new technology. No need for a central patient database, or a central health record. Information can stay at its present location. Just bring local information together on local servers and provide secure access.
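
The architecture described above – records staying where they are, with a thin layer of secure federated access – can be sketched in a few lines. This is purely illustrative: the site names, stores and lookup function are hypothetical, and a real system would make authenticated calls to servers that remain under each hospital’s control.

```python
# Illustrative sketch of federated record access: each hospital keeps
# its own records, and a thin lookup layer queries every site and
# merges the results. No central database is needed.
from dataclasses import dataclass

@dataclass
class RecordEntry:
    site: str   # which hospital holds the entry
    note: str   # e.g. "orthopaedic notes", "maternity notes"

# Stand-ins for each site's local store; in practice these would be
# secure calls to locally controlled servers, not an in-memory dict.
LOCAL_STORES = {
    "hospital_a": {"nhs-123": [RecordEntry("hospital_a", "orthopaedic notes")]},
    "hospital_b": {"nhs-123": [RecordEntry("hospital_b", "maternity notes")]},
    "hospital_c": {},
}

def federated_lookup(patient_id: str) -> list:
    """Query every site and merge whatever it holds for this patient."""
    merged = []
    for site, store in LOCAL_STORES.items():
        merged.extend(store.get(patient_id, []))
    return merged

print([entry.note for entry in federated_lookup("nhs-123")])
```

The point of the sketch is the shape, not the code: information stays at its present location, and only the lookup is shared.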

A locum GP said on the Pulse website recently,

“If you are a member of the Armed Forces, your MO can get access to your (EMIS-based) medical record from anywhere in the world. There is no technical reason why the NHS cannot do this. If need be, the patient could be given a password to permit a GP to see another Surgery’s record.”

New appointments

To avoid having patients clog up super-efficient hospitals, Sir Humphrey would have the Wachter review respond to concerns about a lack of joined up care in the NHS by announcing a set of committees and suggesting the Department of Health and NHS England appoint a new set of senior technologists.

Which is just what has happened.

Last week NHS England announced  “key appointments to help transform how the NHS uses technology and information”. [One of the NHS appointments is that of a Director of Digital Experience, which is not a fictional title, incidentally. Ironically it seems to be the most patient-facing of the new jobs.]

Said the announcement,

“The creation of these roles reflects recommendations in the forthcoming review on the future of NHS information systems by Dr Bob Wachter.

“Rather than appoint a single chief information and technology officer, consistent with the Wachter review the NHS is appointing a senior medical leader as NHS Chief Clinical Information Officer supported by an experienced health IT professional as NHS Chief Information Officer.

“The first NHS Chief Clinical Information Officer will be Professor Keith McNeil, a former transplant specialist who has also held many senior roles in healthcare management around the world, including Chief Executive Officer at Addenbrooke’s Hospital, Cambridge University Hospitals NHS Foundation Trust and Chief Executive Officer at the Royal Brisbane and Women’s Hospital in Australia.

“The new NHS Chief Information Officer will be Will Smart, currently Chief Information Officer at the Royal Free London NHS Foundation Trust. Mr Smart has had an extensive career in IT across the NHS and in the private sector.

“The NHS CCIO and NHS CIO post-holders will act on behalf of the whole NHS to provide strategic leadership, also chairing the National Information Board, and acting as commissioning ‘client’ for the relevant programmes being delivered by NHS Digital (previously known as the Health and Social Care Information Centre).

“The roles will be based at NHS England and will report to Matthew Swindells, National Director: Operations and Information, but the post-holders will also be accountable to NHS Improvement, with responsibility for its technology work with NHS providers.

“In addition, Juliet Bauer has been appointed as Director of Digital Experience at NHS England. She will oversee the transformation of the NHS Choices website and the development and adoption of digital technology for patient ‘supported self-management’, including for people living with long term conditions such as diabetes or asthma. Ms Bauer has led delivery of similar technology programmes in many sectors, including leading the move to take Times Newspapers online…”

Surely a first step, instead of arranging new appointments and committees, and finding ways of spending money on new technology, would be to put in place data sharing agreements between hospitals?

A former trust chief executive told me,

“In primary care, GPs will say the record is theirs. Hospital teams will say it is our information and patient representative groups will say it is about patients and it is their information. In maternity services there are patient-held records because it is deemed good practice that mums-to-be should be fully knowledgeable and fully participating in what is happening to them.

“Then you get into complications of Data Protection Act. Some people get very sensitive about sharing information across boundaries: social workers and local authority workers. If you are into long-term continuous care you need primary care, hospital care and social care. Without those being connected you may do half a job or even less than that potentially. There are risks you run if you don’t know the full information.”

He added that the Summary Care Record – a central database of every patient’s allergies, medication and any adverse reactions to drugs – was a “waste of time”.

“You need someone selecting information to go into it [the Summary Care Record] so it is liable to omissions and errors. You need an electronic patient record that has everything available but is searchable. You get quickly to what you want to know. That is important for that particular clinical decision.”

Is it the job of civil servants to make the simple sound complicated?

Years ago, a health minister invited me for an informal meeting at the House of Commons to show me, in confidence, a one-page civil service briefing paper on why it was not possible to use the internet for making patient information accessible anywhere.

The minister was incredulous and wanted my view. The civil service paper said that nobody owned the internet so it couldn’t be used for the transfer of patient records.  If something went wrong, nobody could be blamed.

That banks around the world use the internet to provide secure access to individual bank accounts was not mentioned in the paper, nor was the existence of the CHAPS network which, by July 2011, had processed one quadrillion pounds (£1,000,000,000,000,000).

Did the briefing paper show that the civil service was frightened by the apparent simplicity of sharing patient information over a secure internet connection? If nothing else, the paper showed how health service officials will tend, instinctively, to shun the cheapest solutions. Which may help to explain how the (failed) £10bn National Programme for IT came into being in 2002.

Jargon

Nobody will be surprised if the Wachter review team’s report is laden with jargon about “delays between technology being introduced and a corresponding rise in output”. It may talk of how new technology could reduce the length of stay by 0.1528 of a bed day per patient, saving a typical hospital £1.8m annually or 7,648 bed days.

It may refer to visions, envisioning fundamental change, establishing best practice as the norm, and a need for adaptive change.

Would it not be better if the review team spoke plainly of the need for a patient with a fractured leg not to have to carry a CD of his x-ray images to different NHS sites in a carrier bag?

Some may await the Wachter report with a weary apprehension that its delay – even indefinitely – will make not a jot of difference. Perhaps Professor Wachter will surprise them. We live in hope.

Wachter review terms of reference.

Review of IT in the NHS

https://ukcampaign4change.com/2016/02/09/another-npfit-it-scandal-in-the-making/

Hunt announces Wachter review

What can we learn from the US “hospitalist” model?

Some of the strengths and weaknesses in GovIT – Phil Pavitt

By Tony Collins

Phil Pavitt was CIO at HM Revenue & Customs. He left two years ago and arrived at Specsavers via Aviva where he was global director of IT transformation.

At HMRC he was a main board member, responsible for all technology across the estate, delivering the change agenda, and managing a total annual IT budget of more than £1bn.

Now he has given an interview to Government Computing in which he talks about his role at Specsavers but also some of the challenges faced by those who are responsible for IT in central government.

He:

–  applauds the Government Digital Service’s (GDS) role in increasing digital traction, but believes the putting down of CIOs has been unnecessary and counter-productive.

– laments a lack of attention to legacy systems. “Name me the departments that have revolutionised themselves and their legacy engines. There’s not many to name. But the front end looks really really good. But who is going to change that legacy because one day that disconnect will be huge? They [GDS] are playing into the hands of the big SIs [systems integrators] who will turn out and say, ‘You’ll have to swap it out, and only we can do it.’

“So I think there’s an interesting fundamental dichotomy that will eventually appear where the front of government will look really good and rightly so, and the back of government increasingly becomes expensive, archaic and out of date. And that’s going to be a problem.”

Pavitt also talks about the challenges faced by SMEs when trying to do business with departments, and the role of big suppliers, the so-called systems integrators.

Phil Pavitt’s interview in Government Computing.

DWP appoints £180k DG Technology

By Tony Collins

Mayank Prakash is the new Director General for Technology at the Department for Work and Pensions. He’ll be responsible for the DWP’s IT services and ensuring that its technology supports current and new digital services. He is the permanent replacement for DWP Chief Information Officer Andy Nelson, who was in the role for less than a year.

Prakash was previously Managing Director of Wealth and Asset Management Technology delivery centres at Morgan Stanley. Before that he led IT, security and digital business transformation at Sage UK.

He said: “This is a once in a lifetime opportunity to simplify welfare by transforming one of the UK’s largest IT estates to deliver easy to use digital services. It is a professional privilege to improve how the government delivers services to 22 million citizens.”

He’ll start in November.

His early career was in leading technology and eBusiness teams for Lucent and then Avaya where he was the International IT Director. He has an MBA from Manchester Business School.

He will be partly responsible for Universal Credit.

DWP’s advert for a £180k IT head – what it doesn’t say

Whitehall has taken on 100 technology experts over past year

By Tony Collins

The Cabinet Office says that government departments have taken on more than 100 IT experts over the past year.

The Government Digital Service (GDS) led the recruitment as part of a plan to raise technology-related skills in the civil service.

One appointment is of former Credit Suisse CIO Magnus Falk as the Government’s new Deputy Chief Technology Officer, reporting to Government CTO Liam Maxwell. Other recent technology recruits include:

  • MOJ Chief Technology Officer Ian Sayer, who was Global Chief Information Officer at Electrolux; and
  • Government Chief Technical Architect Kevin Humphries, former Chief Technical Architect at Qatarlyst.

Chief Digital Officer appointments include:

  • HMRC Chief Digital and Information Officer Mark Dearnley, formerly CIO of Vodafone;
  • MOJ CDO Paul Shetler, who previously co-founded two start-ups and was CTO for banking at Oracle;
  • ONS’s Laura Dewis, Deputy Director Digital Publishing, who was Head of Online Commissioning at The Open University;
  • Jacqueline Steed, former Managing Director and CIO for BT Wholesale, who starts as CDO at the Student Loan Company next week; and
  • DWP CDO Kevin Cunnington, who was previously Global Head of Online at Vodafone.

Comment

It’s encouraging that the Cabinet Office, through the GDS, is overseeing the recruitment of IT leaders in government departments. It means the recruits will see their roles as cross-governmental. In the past the civil service culture has required that CIOs show an almost filial respect for their departmental seniors.

It’s a good idea that GDS tries to change age-old behaviours from within by recruiting technology experts with a wide range of experience from the private sector. But how long will they last?

Their challenge will be converting the words “transformation”, “innovation” and “fundamental change” from board papers, press releases, strategy documents, and conference speeches, into actions.

New deputy CTO role in central government – Government Computing


CEO and CIO resign after troubled EHR go-live

By Tony Collins

At the foot of the Blue Ridge Mountains in Georgia, in America’s deep south, about 70 miles from Atlanta, is Athens.

It was named at the turn of the 19th century to associate its university with Aristotle and Plato’s academy in Greece. It is home to the Athens Regional Medical Centre, one of the USA’s top hospitals.

There on 4 May 2014 the Centre went live with what it described as the most meaningful and largest scale information technology system in its 95-year history – a Cerner EHR implementation.

Now the Centre’s CEO James Thaw and CIO Gretchen Tegethoff have resigned. The Centre’s implementation of the electronic health record system seems to have been no more or less successful than at UK hospitals.

The main difference is that more than a dozen doctors complained in a letter to Thaw and Tegethoff. A doctor leaked their letter to the local paper.

“Medication errors”

The letter said the timescales to install the Cerner EHR system were too “aggressive” and there was a “lack of readiness” among the intended users. They called the system cumbersome.

The letter referred to “medication errors … orders being lost or overlooked … (emergency department) and patients leaving after long waits”. An inpatient wasn’t seen by a physician for five days.

“The Cerner implementation has driven some physicians to drop their active staff privileges at ARMC [Athens Regional Medical Centre],” said the letter. “This has placed an additional burden on the hospitalists, who are already overwhelmed. Other physicians are directing their patients to St. Mary’s (an entirely separate local hospital) for outpatient studies, (emergency room) care, admissions and surgical procedures. … Efforts to rebuild the relationships with patients and physicians (needs) to begin immediately.”

The boldness of the letter has won praise in parts of the wider American health IT community.

It was signed by the centre’s most senior medical representatives: Carolann Eisenhart, president of the medical staff; Joseph T. Johnson, vice president of the medical staff; David M. Sailers, surgery department chair; and, Robert D. Sinyard, medicine department chair.

A doctor who provided the letter to the Athens Banner-Herald refused a request to openly discuss the issues with the computer system and asked to remain anonymous at the urging of his colleagues.

Swift action

One report said that at a meeting of medical staff 200 doctors were “solid in their vote of no confidence in the present hospital administration.”

Last week Thaw wrote in an email to staff: “From the moment our physician leadership expressed concern about the Cerner I.T. conversion process on May 15, we took swift action and significant progress has been made toward resolving the issues raised … Providing outstanding patient care is first and foremost in our minds at Athens Regional, and we have dedicated staff throughout the hospital to make sure the system is functioning as smoothly as possible through this transition.”

UK implications?

The problems at the Athens centre raise questions about whether problematic Cerner installations in the NHS should have consequences for CEOs. Health IT specialists say that, done well, EHR implementations can improve the chances of a successful recovery. Done badly, an EHR implementation can harm patients and contribute to deaths.

The most recent installations of Cerner in the NHS, at Imperial College Healthcare NHS Trust and Croydon Health Services NHS Trust, follow the pattern of other Cerner EHR go-lives in the NHS where there have been hints of problems but the trusts are refusing to publish a picture of how patients are being affected.

What has gone wrong at Athens Regional?

IT staff, replying to the Banner-Herald’s article, have given informed views on what has gone wrong. It appears that Athens Regional laid off about a third of its IT staff in February 2014, about three months before go-live.

Past project disasters have shown that organisations often need more, not fewer, IT staff, advisers and helpers, at the time of a major go-live.

A further problem is that there appears to have been little understanding or support among doctors for the changes they would need to make in their business practices to accommodate the new system. Had the organisation done enough to persuade doctors and nurses of the benefits to them of changing their ways of working?

If clinicians do not support the need for change, they may focus unduly on what is wrong with the new system. An organisation that is inherently secretive and resentful of constructive criticism will further alienate doctors and nurses.

Doctors who fully support an EHR implementation may find ways around problems, without complaining.

One comment on the Banner-Herald website says:

“While I have moved on from Athens Regional, I still have many friends and colleagues that are trying to work through this mess. Here is some information that has been reported to me…

“Medications, labs and diagnostic exams are not getting done in a timely manner or even missed all together. Some of this could be training issues and some system.

“Already over worked clinical staff are having to work many extra hours to get all the information in the system. This obviously takes away from patient care.

“Senior leadership tried to implement the system in half the amount of time that is usually required to do such things, with half the staff needed to do it. Why?

“Despite an environment of fear and intimidation the clinical staff involved with the project warned senior administration that the system was not ready to implement and posed a safety risk.

“I have ex-colleagues that know staff and directors that are involved with the project. They have made a valiant effort to make things right. Apparently an 80 to even a 100 hour work week has been the norm of late.

“Some questions that I have: where does the community hospital board stand with all this? Were they asking the questions that need to be asked? Why would the software company agree to do such a tight timeline? Shouldn’t they have to answer some questions as well?”

“Hopefully, this newspaper will continue to investigate what has happened here and not cave to an institution that spends a lot of money on frequent giant full page ads.

“Please remember there are still good people (staff, managers and administrators) that work at ARMC and I am sure they care about the community they serve and will make sure they provide great patient care.”

“The last three weeks have been very challenging for our physicians, nurses, and staff,” said Athens Regional Foundation Vice President Tammy Gilland. “Parts of the system are working well while others are not. The medical staff leadership has been active in relaying their concerns to the administration and the administration has taken these concerns very seriously. Maintaining the highest quality of patient care has always been the guiding principle of Athens Regional Health System.”

Keeping quiet

NHS trusts go quiet about the effect on patients of EHR implementations despite calls by Robert Francis QC and health secretary Jeremy Hunt for openness when things go wrong.

Imperial College Healthcare NHS Trust, which comprises St Mary’s Paddington, Hammersmith Hospital, Charing Cross Hospital, Queen Charlotte’s and Chelsea Hospital, and Western Eye hospital in Marylebone Road, went live with Cerner – but its managers and CEO are refusing to say what effect the system is having on patients.

An FOI request by eHealth Insider elicited the fact that Imperial College Healthcare had 55 different consultants working on the Cerner Millennium project and 45 Trust staff. The internal budget for electronic patient record deployment was £14m.

Croydon Health Services NHS Trust, which comprises Croydon University Hospital (formerly Mayday) and the Purley War Memorial Hospital, went live with Cerner last year, also under BT’s direction.

The trust has been a little more forthcoming than Imperial about the administrative disruption, unforeseen extra costs and effects on patients, but Croydon’s officials, like Imperial College Healthcare’s spokespeople, refuse to give any specific answers to Campaign4Change’s questions on the Cerner implementation.

Comment

It was probably unfair of doctors at Athens Regional to judge the Cerner system so soon after go-live but their fierce reaction is a reminder that doctors exist to help patients, not spend time getting to grips with common-good IT systems.

Would an NHS CEO resign after a rebellion by UK doctors over a problematic EHR implementation? It’s highly unlikely – especially if trusts can stop news leaking out of the effects on patients. In the NHS that’s easy to do.

Athens Regional CEO resigns

A tragic outcome for Cerner Millennium implementation?

Athens Regional is addressing computer problems encountered by doctors

Athens Regional is addressing computer problems after patients put at risk

CEO forced out?

 

More IT-based megaprojects derail amid claims all is well

By Tony Collins

If one thing unites all failing IT-based megaprojects in the public sector it is the defensive shield of denial that suppliers and their clients hold up when confronted by bad news.

It has happened in the US and UK this week. On the Universal Credit  project, the minister in charge of the scheme, Lord Freud, accepted none of the criticisms in a National Audit Office report “Universal Credit: early progress”.   In a debate in the House of Lords Lord Freud quoted from two tiny parts of the NAO report that could be interpreted as positive comments.

“Spending so far is a small proportion of the total budget … and it is still entirely feasible that [universal credit] goes on to achieve considerable benefits for society,” said Lord Freud, quoting the NAO report.

But he mentioned none of the criticisms in the 55-page NAO report which concluded:

“At this early stage of the Universal Credit programme the Department has not achieved value for money. The Department has delayed rolling out Universal Credit to claimants, has had weak control of the programme, and has been unable to assess the value of the systems it spent over £300 million to develop.

“These problems represent a significant setback to Universal Credit and raise wider concerns about the Department’s ability to deal with weak programme management, over-optimistic timescales, and a lack of openness about progress.”

And a shield of denial went up in the US this week where newspapers on the east and west coasts published stories on failing public sector IT-based megaprojects. The LA [Los Angeles] Times said:

As many as 300,000 jobless affected by state software snags

“California lawmakers want to know why Deloitte’s unemployment benefits system arrived with major bugs and at almost double the cost estimate. The firm says the system is working.”

The LA Times continued:

“Problems are growing worse for the state’s Employment Development Department after a new computer system backfired, leaving some Californians without much-needed benefit cheques for weeks.”

The Department said the problems affected 80,000 claims but the LA Times obtained internal emails that showed the software glitches stopped payment to as many as 300,000 claimants.

Now lawmakers are setting up a hearing to determine what went wrong with a system that cost taxpayers $110m, almost double the original estimate.

Some blame the Department’s slow response to the problems. Others point the finger at Deloitte Consulting.

The LA Times says that Deloitte has a “history of delivering projects over budget and with problematic results”. Deloitte also has been blamed, in part, for similar troubles with upgrades to unemployment software in Massachusetts, Pennsylvania and Florida, says the paper.

“We keep hiring the same company, and they keep having the same issues,” said Senator Anthony Cannella.  “At some point, it’s on us for hiring the same company. It’s faulty logic, and we’ve got to get better.”

In 2003 California planned to spend $58m upgrading its 30-year-old unemployment benefits system. By the time the state awarded Deloitte the contract in 2010  the cost estimate had grown by more than $30m.

The Department handed out $6.6bn to about 1 million unemployed Californians in 2012. The software was expected to ease the agency’s ability to verify who was eligible to receive benefits.

Problems began when the Department transferred old unemployment data to the new system. The software flagged claims for review — requiring state workers to manually process them.

The LA Times says that officials thought initially the workload would be manageable, but internal emails showed the agency was quickly overwhelmed. Phone lines were jammed. For weeks, the Department’s employees have been working overtime to clear the backlog.

A poor contract?

In a contract amendment signed two months ago California agreed to pay Deloitte $3.5m for five months of maintenance and operations costs. Those costs should have been anticipated in the contract said Michael Krigsman, a software consultant who is an expert on why big IT-based contracts go awry. He told the LA Times:

“It’s a striking oversight that maintenance was not anticipated at the beginning of the contract when the state was at a much stronger negotiation position.”

By the time the middle of a project is reached, the state has no choice but to stick with Deloitte to work out bugs that arise when the system goes live, he said.

System works

Loree Levy, a spokeswoman for the Department, said the system is working, processing 80% of claims on time. As for the troubles, she said, “There is a period of transition or adjustment with any large infrastructure upgrade like this one.”

Deloitte spokeswoman Courtney Flaherty said the new California system is working and that problems are not the result of a “breakdown or flaw in the software Deloitte developed”.

System not working?

While there seems to be no project disaster in the eyes of the Department and Deloitte Consulting, some of the unemployed see things differently. One wrote:

“I am a contract worker who had to fight for my unemployment benefits. I won my case and yet they still cannot pay me… It’s been more than 3 weeks since I won my appeal and as of this moment, I am owed 13 weeks of back payments. To add insult to injury, they cannot send me current weeks to certify and they refuse to even try to help me to get back into the online system.

“I blame Deloitte, but it is California that carries the heaviest burden of fault… We’re nearing November and they still haven’t fixed an issue that began over Labor Day? Nonsense!

“This is untenable for everyone affected …We are owed reparations as well as our money at this point. It’s a funny word, affected. That means families and individuals are going hungry but can’t get food stamps or welfare. It means evictions and repossessed cars. It means destroyed credit, late fees, years of turmoil and shame for people already dealing with unemployment. Shame on you California.”

Another wrote:

“ … Not communicating is NOT an answer. Unemployed individuals caught up in the nightmare were told to be patient.  Rents and other expenses were still accumulating.  But [when you] add on additional fees: late fees, restoral fees, interest fees, etc…….you get the picture.

“Dear Governor Brown,

“Please reimburse me for all additional fees I’ve had to absorb to survive this fiasco. You are going to make me pay back any overpayments, but ignore the cost to the unemployed taxpayer. This appears to be unfair. Perhaps Deloitte should pay us back from their contracted funds before they receive their final payment. I am saving all of my receipts to deduct from my 2013 tax return.

“BTW Gov Brown – I am still waiting on additional payments as of today and DMV registration for my vehicle was due on 10/20/13.  Are you going to waive the penalty for late payment? Am I the only one with this question?”

Scrutiny

California’s state Assembly has set a date of 6 November 2013 for a hearing into the Department’s system upgrade.

“We’re going to look at EDD, the contractors and others to see how the system broke down so we can avoid this in the future,” said Henry Perea, chair of the Assembly’s Insurance Committee, which has oversight over the jobless benefits program.

On its website Deloitte says:

“Deloitte continues to help EDD [Employment Development Department] transform the level of service it provides to unemployed workers and improve the quality of information collected by EDD. The next time unemployment spikes, California should be ready to meet the increased demand for services.”

Massachusetts IT disaster?

On the opposite coast the Boston Globe reported on an entirely separate debacle (which also involved Deloitte):

          None admit fault on troubled jobless benefits system

“… even with the possibility that unemployed workers could face months more of difficulties and delays in getting benefits, officials from the Labor Department and contractor, Deloitte Consulting of New York, testified before the Senate Committee on Post Audit that the rollout of the computer system was largely a success.

“‘I am happy with the launch,’ said Joanne F. Goldstein, secretary of Labor and Workforce Development, noting that she would have liked some aspects to have gone better.

“Mark Price, a Deloitte principal in charge of the firm’s Massachusetts business, acknowledged that software has faced challenges during the rollout, but insisted, ‘We have a successful working system today.’”

NPfIT shield

A shield of denial was up for years at the Department of Health whose CIOs and other spokespeople repeatedly claimed that the NPfIT was a success.

Comment

If you didn’t know that Universal Credit IT wasn’t working, or that thousands of people on the east and west coasts of the US hadn’t been paid unemployment benefits because of IT-related problems, and you had to rely on only the public comments of the IT suppliers and government spokespeople, you would have every reason to believe that Universal Credit and the jobless systems in Massachusetts and California were working well.

Why is it that after every failed IT-based megaproject those in charge can simply blow the truth gently away like soap bubbles?

When confronted by bad news, suppliers and their customers tend to join hands behind their defensive shields. On the other side are politicians, members of the public affected by the megaprojects and the press who have all, according to suppliers and officials, got it wrong.

Is this why lessons from public sector IT-based project disasters are not always learned? Because, in the eyes of suppliers and their clients, the disasters don’t really exist?

None admit fault on troubled jobless benefit system

State fired Deloitte

Complaints continue despite claims system is under control

As many as 300,000 affected by California’s software problems

California’s predictable fiasco?

Has 2 decades of outsourcing cut costs at HMRC?

By Tony Collins

If HMRC’s experience is anything to go by, outsourcing can, in the long-term, at least triple an organisation’s IT costs.

When Inland Revenue contracted out its 2,000-strong IT department to EDS (now HP) in 1994, it was the first major outsourcing deal in central government.

Costing a projected £1.03bn over 10 years, the outsourcing was a success, according to the National Audit Office in a report in March 2000. The deal enabled Inland Revenue to bring about changes in tax policy to a tight timetable, said the NAO’s Inland Revenue/EDS Strategic Partnership – Award of New Work.

But costs soared for vague reasons. Something called “post-contract verification” added £203m to the £1.03bn projected cost over 10 years. A further increase of £533m was because of “workload increases including new work”. Another increase of £248m was put down to inflation.

By now the deal with HP had risen from £1.03bn to about £2bn.

When the contract expired in 2004, HM Revenue and Customs and HP successfully transferred the IT staff to Capgemini. The new 10-year contract from 2004 to 2014 (later extended to 2017) had a winning bid price of £2.83bn over 10 years.

So by 2004 the costs of outsourcing had risen from £1.03bn to £2.83bn.

The new contract in 2004 was called ASPIRE – Acquiring Strategic Partners for Inland Revenue. HMRC then added £900m to the ASPIRE contract for Fujitsu’s running of Customs & Excise systems. By now there were about 3,800 staff working on the contract.

The NAO said in its report in July 2006  – ASPIRE, the re-competition of outsourced IT services – that Gateway reviews had identified the need for a range of improvements in the management of the contract and projects.

Now costing £7.7bn over 10 years

The latest outsourcing costs have been obtained by Computing. It found that annual fees paid to Capgemini under ASPIRE were:

  • 2008/09:  £777.1m
  • 2009/10:  £728.9m
  • 2010/11:  £757.8m
  • 2011/12:  £735.5m
  • 2012/13:  £773.5m

So IT outsourcing costs have soared again. The original 10-year costs of outsourcing in 1994 were put at £1.03bn. Then the figure became about £2bn, then £2.83bn, then £3.7bn when Fujitsu’s contract was added to ASPIRE. Now annual IT outsourcing costs are running at about £770m a year – £7.7bn over 10 years.

So the original IT running costs of Inland Revenue and Customs & Excise have, under outsourcing contracts, more than tripled in about two decades.
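As a rough sanity check on the figures above, the escalation can be tallied in a few lines of Python (all amounts are the rounded ones reported in this article, so the totals are approximate; the five published annual fees actually average closer to £755m than £770m, but the order of magnitude is the same):

```python
# Rough check of the HMRC outsourcing cost figures cited above.
# All amounts in £m, rounded as reported; illustrative only.

original_10yr = 1030                      # 1994 projected 10-year cost

# NAO-reported increases on the original EDS deal
eds_total = original_10yr + 203 + 533 + 248
print(f"EDS deal grew to about £{eds_total / 1000:.1f}bn")            # ~£2.0bn

aspire_bid = 2830                         # 2004 Capgemini winning bid
aspire_with_fujitsu = aspire_bid + 900    # Fujitsu Customs & Excise work added
print(f"ASPIRE plus Fujitsu: about £{aspire_with_fujitsu / 1000:.1f}bn")  # ~£3.7bn

# Annual fees paid to Capgemini, as obtained by Computing
annual_fees = {"2008/09": 777.1, "2009/10": 728.9, "2010/11": 757.8,
               "2011/12": 735.5, "2012/13": 773.5}
avg = sum(annual_fees.values()) / len(annual_fees)
print(f"Average annual fee: £{avg:.1f}m, "
      f"or roughly £{avg * 10 / 1000:.1f}bn over 10 years")
```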

Comment:

What happened to the prevailing notion that IT costs fall over the long-term, and that outsourcing brings down costs even further?

Shouldn’t HMRC’s IT costs be falling anyway because of reduced reliance on costly Fujitsu VME mainframes, reductions in data centres, modernisation of PAYE, and the clearance of time-consuming unreconciled items on more than 10 million tax files?

HMRC knows how much profit Capgemini makes under “open book” accounting. It’s a margin of about 10-15% says the NAO. Lower margins are for value-added service lines and higher margins for riskier projects. If the overall target profit margin of 12.3% is exceeded, HMRC can obtain an equal share of the extra profits.

There were 10 failures costing £3.25m in the first 15 months. Capgemini refunded £2.67m in service credits in the first year of the contract.

It’s also worth mentioning that Capgemini doesn’t get all the ASPIRE fees. It is the lead supplier of around 300 subcontractors – including Fujitsu and BT – and pays 65% of its fees to them.
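The open-book profit-share arrangement can be sketched as follows. Only the 12.3% target margin, the equal split of excess profit and the 65% subcontractor pass-through come from the figures above; the fee and margin values in the example are hypothetical, and the contract’s actual formula and thresholds are not published here.

```python
# Illustrative sketch of the ASPIRE gain-share mechanism described above.
# The fee and margin figures below are hypothetical.

TARGET_MARGIN = 0.123     # overall target profit margin under open-book accounting
HMRC_SHARE = 0.5          # HMRC takes an equal share of profit above the target
SUBCONTRACTOR_CUT = 0.65  # share of fees Capgemini passes to its ~300 subcontractors

def hmrc_gainshare(fees_m: float, achieved_margin: float) -> float:
    """Return HMRC's share (in £m) of any profit above the target margin."""
    excess_margin = max(0.0, achieved_margin - TARGET_MARGIN)
    return fees_m * excess_margin * HMRC_SHARE

def subcontractor_fees(fees_m: float) -> float:
    """Return the portion of fees (in £m) passed on to subcontractors."""
    return fees_m * SUBCONTRACTOR_CUT

# Hypothetical year: £770m of fees at a 15% achieved margin
fees = 770.0
print(f"HMRC gain-share: £{hmrc_gainshare(fees, 0.15):.1f}m")
print(f"Paid to subcontractors: £{subcontractor_fees(fees):.1f}m")
```

At a margin below the target, the gain-share is simply zero – HMRC’s upside only kicks in above 12.3%.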

The outsourcing has helped to enable HMRC to bring in self-assessment online and other changes in tax policy. But HMRC’s quality of service generally (and not exclusively IT) is mixed, to put it politely.

The adjudicator for HMRC who intervenes in particularly difficult complaints identifies as particular problems the giving out of inaccurate information and recording information incorrectly.

She says in her 2013 annual report:

“I am disappointed at the number of complaints HMRC customers feel they need to refer to me in order to get resolution. My role should be to consider the difficult exceptions, not handle routine matters that are well within the capability of departmental staff to resolve successfully. At a time of austerity it is also important to note that the cost of dealing with customer dissatisfaction increases exponentially with every additional level of handling.”

RTI

There are complaints among payroll companies and specialists that real-time information is not working as well as HMRC has claimed. There seems to be growing irritation with, for example, HMRC’s saying that companies owe much more than they actually owe. And HMRC has been sending out thousands of tax codes that are wrong or change frequently – or both.

HMRC says it has made improvements but the helpline is appalling. It’s not unusual for callers to wait 30 minutes or more for an answer – or to hang on through multifarious automated messages only to be cut off.

That said, there are signs HMRC is, in general, improving slowly. Lin Homer, chief executive of HMRC since 2012, is more down-to-earth and slightly more willing to own up to HMRC’s mistakes than her predecessors, and the fact that RTI and the modernisation of PAYE have got as far as they have is creditable.

But is HMRC a shining example of outsourcing at its best, of outsourcing that cuts costs in the long term? No. A decade of HP and a decade of Capgemini has shown that with outsourcing HMRC can cope, just about, with major changes in tax policy to demanding timetables. But the costs of the outsourcing contracts in the two decades since 1994 have more than tripled.

What about G-Cloud? We look forward to a change in direction from the incoming head of IT Mark Dearnley (if he has much say).

**

A Deloitte survey, “The trend of bringing IT back in-house”, dated February 2013, said that 48% of respondents in its Global Outsourcing and Insourcing Survey 2012 reported that they had terminated an outsourcing agreement early, either for cause or for convenience. Those that took IT services back in-house mentioned cost reduction as a factor. Deloitte said factors included:

– the need for additional internal quality control due to poor quality from the outsourcer

– an increase in the price of service delivery through scope creep and excessive change orders.

Ex Govt CIO speaks – having left the public sector

By Tony Collins

In November last year we asked “Where is the Government CIO?”

We said that the then Government CIO Joe Harley – amiable, straight-talking and influential – could be the civil service’s ambassador for change.

Like his predecessor John Suffolk he could have used conferences and public events to talk inspirationally about the dystopian costs of government IT and what to do about them.  Why hasn’t he, we asked.

“If the Government CIO has much to say, it is not for the public ear. While there has been talk in recent weeks of how five corporations control GovIT, and how it can cost up to £50,000 to change a line of code, Harley has been silent.

“Where does the Government CIO stand on the need for major reform of the machinery of government, on the sensible risks that could save billions? Is the top man in Government IT inspiring his colleagues and officials in other departments to do things differently?”

Now it’s good to hear Joe Harley speaking publicly about government IT, and what needs to be done. He has left the public sector, though.

He suggested to Computing that there needs to be less strategising and more action.

“The whole emphasis now needs to be on implementation and delivery. There has been enough strategising and there really needs to be execution… [The government must] deliver on the implementation plan that we created and grow the talent with capability for the future.

“When it starts to deliver, we’ll start to see government ICT getting a [better] reputation,” he said.

Comment

Who will do less strategising and focus more on delivery?

As Harley now says, there needs to be individual accountability for decisions rather than a generalised blaming of committees.

“I think we need to be more light-footed and make people more accountable for their decisions and actions rather than [blaming] committees and programme boards,” he said.

No individual in government is going to make the changes that Harley recommends. Any real changes will be effected by committees and programme boards. Which is probably why material change in government administration and IT will happen in geologic periods. Unless an individual with charisma and leadership abilities – and who doesn’t mind talking in public while still in the public sector – is prepared to make the difference.

Ex Government CIO Joe Harley rejoins private sector

By Tony Collins

Former Government CIO Joe Harley has taken a position as non-executive adviser to Amor Group, an IT and business technology provider to the transport, energy and public sectors.

Amor says it is taking the place of large systems integrators whose “monopolies are ending”.

It is Harley’s first official role since retiring from the civil service earlier this year.

Amor Group says it has “succeeded in recruiting the man credited with reforming the UK Government’s information communication and technology strategy to act as a strategic adviser”.

Harley was UK Government CIO between 2011 and 2012 and CIO at the Department for Work and Pensions from 2004 to 2012.

Amor has a turnover of about £45m and nearly 600 staff at offices in Aberdeen, Glasgow, Manchester, Coventry, London, Dubai and Houston.

Harley said,

“Amor Group is a new breed of companies that is helping organisations to improve their business performance and to manage their ICT budgets to deliver maximum value in the current economic climate and I am delighted to be helping a company which has grown year on year in a tough market, and that has such great ambitions for growth.

“Businesses are looking to more agile, flexible firms who can act quickly and save costs whilst not lowering service levels. I am looking forward to helping Amor continue that trend.”

John Innes, CEO at Amor, said,  “The days of the large systems integrators and monopolies are ending and we are taking their place. We signed a £18.5m contract last year with the Scottish Government to run its eProcurement service and we’ve seen real traction in International markets with our passenger tracking technology being installed at Dubai Airport and a number of wins for our Energy team in the US.

“What sets us apart is our culture as a company. We understand that technology only has a value when it delivers benefits to an organisation and we focus on delivering those benefits rather than selling heavyweight solutions.”

Harley’s background:

1993 – 1996: BP Alaska, IT director
1996 – 1998: BP Exploration and Downstream Europe, CIO
1998 – 2000: BP, global IT vice president
2000 – 2004: ICI Paints, CIO
2004 – 2012 Director General of Corporate IT and CIO, Department for Work and Pensions. Government CIO from 2011-2012.
Harley led the Universal Credit IT scheme which is due to go live from next October.