An NAO report today suggests that some officials are fiddling projected savings figures from a shared services deal involving seven research councils.
It all began so well. A Fujitsu press release in 2008 said:
“UK Research Councils to implement shared services with Fujitsu. £40 million project will generate cost and efficiency savings across the organisations.”
An executive representing Fujitsu Services was quoted in the press release as saying at the time:
“Fujitsu is consistently proving that it can deliver effective shared services infrastructures and is playing a vital role in driving forward the transformational government agenda through shared services.
“Organisations that adopt a shared services approach can experience genuine economies of scale and reduction in costs which can be essential in their drive for continuous improvement.”
Twenty-one months later Fujitsu and Research Councils UK parted company. The 10-year shared services contract began in August 2007. It was terminated by mutual consent in November 2009.
A revealing report, published today by the National Audit Office, shows how, despite the Cabinet Office’s best intentions to improve the management of IT-related projects and programmes, and decades of mistakes to learn from, some officials in departments are still making it up as they go along.
The worrying thing in the NAO report is not only what happened in the past – few will be surprised that the NAO report characterises the shared services deal as lacking professionalism. What’s worrying is officialdom’s more recent disregard for the truth when claiming savings for its shared services arrangements.
The NAO’s report, “Shared Services in the Research Councils”, suggests that officials manipulated – some would say fiddled – projected savings figures.
The NAO also found that officials awarded a £46m shared services contract to Fujitsu, which came second in the bid evaluation. Exactly how the contract came to be awarded will be investigated soon by MPs on the Public Accounts Committee.
Origins of shared services contract
In 2004 a review led by the Government adviser Peter Gershon suggested that the public sector should save money by sharing support services such as IT, HR and finance. In 2006 officials at the Department of Trade and Industry (now the Department for Business, Innovation and Skills) encouraged their colleagues at seven research councils to set up a shared service centre, which they did.
Research Councils UK is an important organisation. In 2009/10 it spent £3.7bn, mostly on research grants to universities, the European Space Agency and other organisations. The biggest recipient of its grants is the Medical Research Council.
Public servants appointed Fujitsu in August 2007 to put in place the ICT systems to underpin the shared service centre in a ten-year contract worth £46m. Fujitsu came second in the initial bid evaluations.
The NAO said that the bidding process produced a shortlist of three companies including Fujitsu. Said the NAO:
“The initial weightings applied by the [bid] panel had placed Fujitsu second: although the bid had scored well on quality, it was 19 per cent more expensive than the cheapest bid.”
An independent review commissioned by the project board backed the evaluations which put Fujitsu second. But the bid panel and the project board had concerns about the evaluation. The supplier chosen in the evaluation – which the NAO refuses to name – did not score well on quality requirements.
It appears that the bid panel and the project board preferred Fujitsu.
Then officials happened to spot a mathematical error in the bid scoring. The corrected scoring left Fujitsu on top, as the new preferred bidder.
Said the NAO:
“… a mathematical error was identified by a member of the project team that changed the order of the preferred suppliers, leaving Fujitsu as the front runner
“The [bid] panel reconvened to discuss this but, rather than re-performing in full the quantitative and qualitative analysis and submitting this to independent review, it decided to appoint Fujitsu on the basis of a vote.
“In September 2007 the gateway review team concluded that the incident had weakened the value of the overall process and had left the project at risk of challenge.”
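How a small arithmetic slip in a weighted bid evaluation can reverse the ranking of suppliers is easy to illustrate. The sketch below is entirely hypothetical – the weights, quality scores and the second supplier are invented, and only the “19 per cent more expensive” figure echoes the NAO report – but it shows the mechanism: with bids this close, a minor change to the weighting sums flips which supplier comes out on top.

```python
# A toy weighted bid evaluation with made-up numbers. The weights,
# quality scores and "Supplier A" are illustrative assumptions, not
# figures from the NAO report.

def price_score(price, cheapest):
    """Score a bid's price relative to the cheapest bid (cheapest = 100)."""
    return 100.0 * cheapest / price

def total_score(quality, price, cheapest, w_quality, w_price):
    """Weighted sum of a quality score (0-100) and the relative price score."""
    return w_quality * quality + w_price * price_score(price, cheapest)

bids = {
    "Supplier A": {"quality": 70, "price": 100},  # the cheapest bid
    "Fujitsu":    {"quality": 85, "price": 119},  # 19% more expensive
}
cheapest = min(b["price"] for b in bids.values())

def ranking(w_quality, w_price):
    """Suppliers ordered from highest to lowest total score."""
    return sorted(
        bids,
        key=lambda name: total_score(
            bids[name]["quality"], bids[name]["price"],
            cheapest, w_quality, w_price),
        reverse=True)

# Equal weights put the cheaper supplier first; shift the weighting
# slightly towards quality and the order reverses.
print(ranking(0.5, 0.5))  # Supplier A first
print(ranking(0.6, 0.4))  # Fujitsu first
```

The point is not that these were the actual weights, but that when two bids score this closely, a single miskeyed or miscalculated term is enough to change the “preferred supplier” – which is why the gateway review team worried the process was open to challenge.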
User requirements unclear
Full delivery was due in September 2008 but the project team and Fujitsu “quickly encountered difficulties, resulting in contract termination by mutual consent in November 2009”.
The NAO said there was “miscommunication between the parties about expectations and deliverables, primarily because design requirements had not been sufficiently defined before the contract started”.
Fujitsu consequently missed agreed milestones. “Fujitsu and the Centre told us that the fixed-rate contract awarded by the project proved to be unsuitable when the customers’ requirements were still unclear.”
Officials paid Fujitsu a total of £31.9 million, of which £546,000 related to termination costs. Despite the payments to Fujitsu, parts of the system were withdrawn and rebuilt in-house.
Overspend on Fujitsu contract
The NAO found there were “significant overspends on design and build activities and the contract with Fujitsu.”
At least £13m wasted on Fujitsu deal
Said the NAO:
“Had the Fujitsu contract worked as planned, we estimate that the additional £13.2m design and build costs … would not have been needed. In addition the project management overspend of £9.1m would have been lower, as, after termination of the Fujitsu contract, a significant overhead in managing contractors was incurred by the project.”
Fujitsu out – Oracle in
The breakdown in relations with Fujitsu led to the appointment of Oracle as supplier of the grants element of the project. “The contract with Oracle suggested that lessons had been learnt by the project following its experience with Fujitsu, with greater effort given to specifying the design upfront,” said the NAO.
Did officials know what they were doing?
In deciding how to share services the research councils came up with six options including setting up a centre run jointly by the councils or joining with another public sector agency such as one supplying the NHS.
But two of the options including the NHS one were dropped without proper analysis, said the NAO. The remaining four options were each given a score of one to three, against seven criteria. “The scores appear to be purely judgemental with no quantified analysis,” said the NAO.
Even if the six options had been properly appraised, the evaluation would have failed because it did not include a “do-minimum” option as recommended by HM Treasury.
“Overall, the quality of options appraisal was poor,” said the NAO.
Fiddling the figures?
The NAO found that:
– Initial estimates were of zero projected procurement savings from shared services. But by the time the first draft of the business case had been written the projected savings had soared to £693.9m.
– When the project board queried this figure the research councils’ internal audit service scaled it down to £403.7m – but even this included £159.3m of savings that internal audit had concluded were not soundly based.
– Since the shared services centre began, officials have recorded procurement savings of £35.2m against the business case, and while some of these are valid savings, others are not. The NAO investigated 19 high-value savings representing 40% of savings recorded to the end of 2010 and found that 35% “should not be claimed against the project investment”.
– The research councils have been “unable to provide paperwork to substantiate the claimed saving”.
– Savings claimed were indistinguishable from normal business practice such as disputing costs claimed by a supplier.
– Clear evidence exists that the budget holder had no intention or need to pay the higher price against which the saving was calculated.
– Last month the research councils claimed that savings were £28m higher than they had reported previously owing to errors in the original numbers. But the NAO found that the councils were unable to reconcile fully the two sets of numbers; had not used a single method for calculating benefits or tracked these effectively; and had not included £7m of spending incurred by the councils. “Overall, this review has highlighted that Councils have not put in place proper processes to track benefits and forecast future operational savings,” said the NAO.
– Further, investments needed to deliver projected savings have not been included in calculations.
– Double counting. A revised target for projected procurement savings “includes elements of double counting …”
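The combined effect of the weaknesses the NAO lists – double counting, excluded spending, uncounted investment – can be shown with simple arithmetic. In the sketch below only the £35.2m headline claim and the £7m of excluded spending echo the report; the individual line items and the investment figure are invented for illustration.

```python
# Hypothetical reconciliation of claimed procurement savings (in £m).
# The breakdown and the investment figure are invented; only the 35.2
# total and the 7.0 of excluded spending echo the NAO report.
claimed = [12.0, 8.0, 8.0, 7.2]  # one 8.0 saving recorded twice (double counting)
excluded_spend = 7.0             # spending the councils left out of the numbers
investment = 3.0                 # assumed investment needed to deliver the savings

gross_claim = sum(claimed)               # the headline figure: 35.2
deduplicated = gross_claim - 8.0         # strip the double-counted item
net_savings = deduplicated - excluded_spend - investment

print(gross_claim, net_savings)  # headline claim vs a much smaller net figure
```

Each individual adjustment looks small, which is precisely why such figures need a single consistent method and an audit trail – exactly what the NAO says the councils did not have.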
Other NAO findings:
– Four Gateway review reports of progress on setting up the shared services centre, including a review which put the project at “red – immediate action needed”, were not fully followed up.
– There was no evidence of intervention by the Department for Business, Innovation and Skills when it became clear the shared services project was likely to overspend.
– The shared services centre has begun to match the research councils’ pre-shared-services payment performance, but a high number of invoices was on hold at the end of July 2011 because of problems with the end-to-end processes: about 5,900 invoices were awaiting payment, 21 per cent of all invoices due to be paid in that month. The reason for the delay was being investigated.
– Despite the shared services arrangements, some research council staff were at times running parallel systems, or managing their businesses without adequate data.
– In July 2011 the shared services centre had 53 key performance targets to meet but was able to measure activity against only 37 of them, and of these met only 13.
– Five of the seven research councils did not file annual accounts on time in 2011 in part because functions in the finance ICT system were not delivered by the project.
Some good news
Said the NAO:
“The grants function and its associated ICT system developed by the project has allowed the Councils to replace older systems that were increasingly at risk of failing. This is of critical importance, given that the processing of research grant applications lies at the heart of what the Councils do. The single grants system has the potential to make it easier for the Councils to collectively modify their processes in the future…”
The commendably thorough NAO investigation has shown once again how badly departments and their satellites are in need of independent Cabinet Office oversight when it comes to major IT-related projects. In that respect thank goodness for the Cabinet Office’s Major Projects Authority. But how much influence can it really have? How much influence is it having?
This NAO report suggests that some officials are fiddling the figures without a care for professional accounting practices. Double counting, not including full costs in projected savings calculations, not having paperwork to support figures and other such administrative misdemeanours indicate that some officials are making up savings figures as they go along.
What is to be done when some departments and their agencies are not to be trusted in managing major projects?