G-Cloud – it’s starting to happen

By Tony Collins

Anti-cloud CIOs should “move on”, says a Cabinet Office official, “before they have caused too much harm to their business”.

For years Chris Chant, who’s programme director for G Cloud at the Cabinet Office, has campaigned earnestly for lower costs of government IT. Now his work is beginning to pay off.

In a blog post he says that nearly 300 suppliers have submitted offers for about 2,000 separate services, and he is “amazed” at the prices. Departments with conventionally good rates from suppliers pay about £700-£1,000 a month per server in the IL3 environment, a standard that operates at the “restricted” security level. Average costs to departments are about £1,500 a month per server, says Chant.

“Cloud prices are coming in 25-50% of that price depending on the capabilities needed.”  He adds:

“IT need no longer be delivered under huge contracts dominated by massive, often foreign-owned, suppliers.  Sure, some of what government does is huge, complicated and unique to government.  But much is available elsewhere, already deployed, already used by thousands of companies and that ought to be the new normal.

“Rather than wait six weeks for a server to be commissioned and ready for use, departments will wait maybe a day – and that’s if they haven’t bought from that supplier before (if they have it will be minutes).  When they’re done using the server, they’ll be done – that’s it.  No more spend, no asset write down, no cost of decommissioning.”
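For readers who want to check the arithmetic behind Chant’s claim, here is a back-of-envelope sketch. The £1,500 monthly baseline and the 25-50% range come from his figures; the variable names are mine:

```python
# Back-of-envelope check of the G-Cloud pricing claim.
# Figures from Chant's blog post; everything else is illustrative.
baseline = 1500                       # average monthly cost per IL3 server, GBP
cloud_low, cloud_high = 0.25, 0.50    # cloud comes in at 25-50% of that

cheapest = baseline * cloud_low       # best case per server per month
dearest = baseline * cloud_high       # worst case per server per month
saving_low = baseline - dearest       # minimum monthly saving per server
saving_high = baseline - cheapest     # maximum monthly saving per server

print(f"Cloud price range: £{cheapest:.0f}-£{dearest:.0f} per server/month")
print(f"Implied saving:    £{saving_low:.0f}-£{saving_high:.0f} per server/month")
```

Even at the top of the range, the implied saving is half the average bill.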

Chant says that some CIOs in post have yet to accept that things need to change; and “even fewer suppliers have got their heads around the magnitude of the change that is starting to unfold”.

“In the first 5 years of this century, we had a massive shift to web-enabled computing; in the next 5 the level of change will be even greater.  CIOs in government need to recognise that, plan for it and make it happen.

“Or move on before they have caused too much harm to their business.”

He adds: “Not long from now, I expect at least one CIO to adopt an entirely cloud-based model.  I expect almost all CIOs to at least try out a cloud service in part of their portfolio.

“Some CIOs across government are already tackling the cloud and figuring out how to harness it to deliver real saves – along with real IT.  Some are yet to start.

“Those that have started need to double their efforts; those that haven’t need to get out of the way.”

Cloud will cut government IT costs by 75% says Chris Chant

Chris Chant’s blog post

Clegg speech renews Coalition mutuals and employee-ownership focus

By David Bicknell

Deputy Prime Minister Nick Clegg has advocated greater employee-ownership.

In a speech yesterday to the Corporation of London, he described employee share ownership as “a touchstone of liberal economic thought for a century and a half and a hugely under-used tool in unlocking growth.”

As this report explains,  he suggested that employee-owned firms could end the ‘standing feud between capital and labour’.

“We don’t believe our problem is too much capitalism: we think it’s that too few people have capital. We need more individuals to have a real stake in their firms.”

It could be the latest kick-start the mutuals and employee-ownership initiative needs. (And John Lewis’s marketing department must be wallowing again in the free publicity.)

Not all coverage of Clegg’s speech has been positive, however, with Nils Pratley in the Guardian calling the employee share-ownership ideas ‘half-baked’. Pratley says the speech raised more questions than it answered. But in fairness, I don’t think Clegg’s intention was to lay out a complete White Paper for action; it was merely to keep employee ownership on the agenda for discussion.

Text of Nick Clegg’s speech

Transition Institute’s weekly round-up of mutuals and spin-out stories

By David Bicknell

Here is a link to the Transition Institute’s weekly round-up of mutuals and public sector spin-out stories.

NB The link  on the Transition Institute site to the Public Service article on procurement change and SMEs on 13th January doesn’t work. The link below does, however.

Pace of procurement change frustrates innovative SMEs

‘Penny wise and pound foolish’ to postpone IT project

By David Bicknell

Sometimes you make decisions over the future of IT systems in the public sector with the best intentions – but still you can’t win. Someone, somewhere, will be unhappy.

Yesterday, I mentioned that a $92m overhaul of a Department of Revenue system in Oregon had been postponed to save money. Now, it seems, some legislators regard the postponement as a bad idea that will hamper their ability to make well-informed decisions.

“I think it is penny wise and pound foolish, if I could use an old saw,” said Vicki Berger, co-chair of the committee that oversees state taxing and revenue policy, according to the Statesman Journal. “We have to bite the bullet. We have to get a better system. We have to know better, more viable information on what impacts our revenue stream.”

Richard Devlin, co-chair of the legislature’s Joint Legislative Audits, Information Management and Technology Committee, has reportedly characterised the announcement as a “nine-month delay” rather than a cancellation of the project.

“I don’t see that as an end to the project, because the need is very real. They need to upgrade their systems, and they will continue to work to that end,” said Devlin. “I can understand the counter-argument, that you do have antiquated systems in the Department of Revenue, but I think citizens in Oregon would want when we invest in this fully that we do it right,” he continued. “I would not want to spend $92 million and then have a project that doesn’t really work.”

Comment

It’s a sign of the times that you can get such polarised views over the future of an IT project, but it’s perhaps not surprising when the project is going to cost $92m. I think the current climate is likely to see cost/benefits for IT projects become an issue for many organisations, both in the public and private sectors, but especially in the public sector.

It doesn’t necessarily mean that IT projects are at risk, simply that those making decisions on new systems/upgrades are going to need hard evidence of the real change benefits to justify any decision they make to proceed.

US state government and defence IT projects face uncertain future

By David Bicknell

Local newspapers in the US are offering some insight into the cloudy future of two significant IT projects.

In Salem, Oregon, a planned $92 million upgrade of the state’s Department of Revenue computer system is reportedly on hold because the state can’t afford $13 million in start-up costs.

The Register-Guard website says local officials chose to put the  project on hold rather than ask legislators to make a choice between paying for the computer system and paying for public safety and human services.

The computer system is said to be responsible for processing $7 billion a year and 94 percent of Oregon’s general fund revenue, but officials are apparently concerned about its future effectiveness.

The agency’s ability to collect taxes rests on a “myriad of disparate, aging software applications and databases,” according to a 96-page business analysis the Department of Revenue produced in 2010.

Meanwhile, in Beavercreek, Ohio, a US Air Force computer modernisation project, which has already cost almost $1 billion, is said to be at risk from Washington defence cuts.

US Air Force officials have acknowledged that the Expeditionary Combat Support System project, on which at least $986.5 million has been spent, won’t be completed in 2016 as had been hoped. Work began in 2007, but the local Springfield News-Sun newspaper reports that the completion date has repeatedly been pushed back.

US report suggests huge FISCal IT project may be ‘running into some hurdles’

By David Bicknell

A report from the US has cast doubt on the progress of a major financial system for the state of California.

CivSource suggests that the IT project,  a business transformation project for the state government in the areas of budgeting, accounting, and procurement, is “running into some hurdles. The Financial Information System for California (FISCal), was supposed to streamline IT costs and staffing but seems to be hitting snags for exactly those reasons.”

So far, CivSource says, “…the project has cost over $60 million with final costs stretching into over a billion over the next 12 years. Supporters of the system say that the state needs to spend this money in order to upgrade legacy systems and modernise processes.

“However, long term cost-estimates of the project are still up in the air. As are claims that systems will be modernised if the proposed build out lasts over a decade. Future funding is also uncertain as the state faces unprecedented rolling budget crises.”

An LA Times article previously suggested that the system was at one point $300 million over budget and three years behind schedule.  It argues that despite its Silicon Valley technology expertise, California has a poor track record of delivering successful IT projects.

In our book on IT projects, ‘Crash’, Tony Collins and I reported on the problems with the Department of Motor Vehicles project  which was cancelled in 1994 at a cost of around $50m.  $50m would be a snip compared to the financial muscle which may be needed to finally deliver FISCal.

FISCal project site

Temenos confirms end of T24 IT project with Queensland Treasury Corporation

By David Bicknell

The Swiss banking software supplier Temenos has responded to overnight coverage of the end of an IT project with the Queensland Treasury Corporation in Australia.

Queensland Treasury Corporation said it had changed tack on the T24 project after experiencing problems developing a core banking system.

Temenos said today, “Temenos can confirm that the T24 project at Queensland Treasury Corporation (QTC) has been terminated. This decision was taken by QTC based on a change of requirements, which had become more specific; as a result it wanted an approach that was more in line with these unique requirements.

“It has therefore decided to run its IT upgrade in house. Temenos is pleased to have had the opportunity to assist QTC with their IT renewal project, and has enjoyed a constructive and mutually amicable relationship with the organisation.”

QTC, which manages public sector debt and investments, told the Brisbane Times it was now developing its own system and insisted it would be able to do so within the existing $27 million project budget. The organisation defended its handling of the IT contract, saying it has now acquired additional in-house capabilities.

The story had been extensively covered in the Australian IT and banking press earlier today.

QLD Treasury terminates failed IT overhaul

QLD Treasury Corp abandons core IT project

Queensland Treasury Corporation tears up Temenos T24 contract

Are officials pressing GPs to switch IT supplier to SystmOne?

By Tony Collins

There’s concern in the NHS that Primary Care Trusts, which are due to be abolished next year, are putting GP practices under pressure to switch their IT systems to TPP SystmOne, a patient record system that is supplied by CSC under the National Programme for IT.

The conversions are being subsidised by taxpayers under unpublished NPfIT local service provider contracts. The concern of at least one aspiring Clinical Commissioning Group – one of the CCGs being formed under Andrew Lansley’s health reforms – is that GP system conversions to TPP SystmOne under local service provider NPfIT contracts could leave CCGs with a legacy of financial commitments that are as yet unknown.

One CCG contacted Campaign4Change to express concern that it may have uncertain financial commitments when it begins to take on SystmOne commitments next year. On 1 April 2013 PCTs and strategic health authorities are due to be abolished and their responsibilities passed to authorised CCGs.

Aspiring CCGs are now taking a close interest in PCT financial commitments because the Groups are due to inherit any of their local PCT deficits incurred from 1 April 2011 to 31 March 2013.

At present, GP practices receive PCT funding whether they take replacement SystmOne patient record technology from CSC  under the NPfIT or acquire new IT under a scheme known as GP Systems of Choice.

But the Group’s spokeswoman said that PCTs are putting pressure on GP practices to replace their systems with SystmOne. She said it’s because it can cost PCTs less – or nothing – for a GP switch to SystmOne under NPfIT-funded local service provider contracts. In comparison PCTs may have to pay costs such as hardware maintenance when GPs acquire systems under GPSoC.

Incentives for GPs to switch IT supplier

Our inquiries show that at least one PCT received what it called “incentives” from its strategic health authority for GP practices to change computer systems. In its response to an FOI inquiry, the PCT said: “The PCT can confirm that the incentives passed to [GP] practices to change computer systems as follows”.

It went on to say that its strategic health authority gave the PCT a £10,000 implementation fee [for each GP practice that changed its systems]. The PCT passed £3,000 of the £10,000 to the GP practice to part fund its implementation costs.

The PCT’s preferred GP system supplier was SystmOne, as supplied by CSC.

What happens when CSC’s NPfIT contract expires in 2015?

At that time Clinical Commissioning Groups may have to pay whatever costs are levied because GP practices with SystmOne could be reluctant to switch systems again, said the CCG spokesperson.

The Department of Health’s Informatics Directorate, which has subsumed NHS Connecting for Health, has confirmed that the prices it pays CSC for TPP installations are confidential.

A DH spokesperson said: “While prices within the LSP [Local Service Provider] contracts are commercially confidential we are in partnership with Intellect, the Technology Trade Association, to develop an open and transparent approach to costs and quality, as part of working to create a vibrant marketplace.”

A spokesperson for CSC said: “Because we are in active negotiations with the government, we are not able to comment in depth on the programme until those negotiations have concluded.”

The spokesperson said the comments applied to TPP as it is “a supplier to us working on the National Programme”.

Department of Health response

When asked if GP practices are taking on non-transparent NPfIT commitments for TPP systems, the DH spokesperson said: “If a GP practice chooses to take a system under an LSP contract they are made fully aware of the product they are taking and the length of the contract.

“We are committed to ensuring transparent and trusting working relationships between suppliers and their NHS customers.”

Asked whether GP practices that choose GPSoC systems cost the PCT more than TPP systems acquired through the LSP contracts, the DH spokesperson said: “It is up to the GP practice as to whether they choose a system through GPSoC or through the LSP contracts.

“The GPSoC PCT/ Practice agreement provides a mechanism for GPs to raise and resolve any concerns they may have.”

Comment

Centrally-funded incentives to PCTs to encourage GPs to switch to SystmOne as supplied by CSC under the NPfIT keep alive one of the original objectives of the national programme, which was to have health IT dominated by a few suppliers that would be under firm central control.

But that strategy creates an imbalance in the health IT market, inhibits open competition and leaves the NHS with unquantifiable future costs given that SystmOne is being supplied under NPfIT contracts that are secret.

Favouring central control, Labour created the NPfIT. In contrast the coalition favours decentralisation so it makes sense for GPs to have a genuine choice of suppliers, with the funding PCTs remaining neutral on the decision.

TPP SystmOne is good enough to compete freely in the open market. It does not need a leg up from PCTs or the Department of Health merely to keep a part of the original NPfIT alive.

 

Lifting the lid on Agile development within a public sector IT project

By David Bicknell

It’s not often that you get an insight into the workings of Agile development within a public sector IT project.

So the Inspector General’s report into the Sentinel IT project at the FBI that I mentioned a couple of days ago offers a rare picture of how the sprints, story points and so on are progressing. This will not be new to Agile practitioners – but the detail below may be of interest to those unfamiliar with Agile’s processes.

Transition to an Agile Development Approach

The report’s discussion of Agile within the Sentinel project says this:

“Agile software development is not a set of tools or a single methodology, but an approach that leverages close collaboration between representatives of system users, system developers, and testers to deliver functionality in a compressed timeframe and on a continuous basis. The delivery of working software is the primary measure of progress, and satisfying customers through the delivery of valuable software is treated as the highest priority during development.

“While an Agile methodology can be implemented in a variety of ways, the FBI is implementing a variation called Scrum, an iterative methodology which breaks the development effort into increments called sprints, each of which the FBI decided would last 2 weeks.

“At the conclusion of each sprint, User Stories – functions that a system user would typically perform – along with Architecture Stories – qualities that define the system software architecture and configuration – are planned and completed, and it is the successful completion of these stories that is measured as progress for the project.”

Development Progress

“As of August 26, 2011, the FBI had completed 22 of 24 planned sprints. Under the Scrum approach, a project’s progress and amount of work remaining is measured using a burndown chart, which depicts how factors such as the rate at which a development team completes work (a team’s velocity) and changes in a project’s scope affect its likelihood of staying on schedule and within budget over time.

“This information can be used by project management and project stakeholders to estimate the duration of the project or the amount of work that can be completed within an identified amount of time.

“During the first 22 sprints (Sprint 0 through Sprint 21), the FBI had completed 1,545 of the 3,093 story points (1,548 remaining) that it identified at the beginning of the project, or about 50 percent. As of December 2, 2011, the FBI reported that it had completed 28 of 33 planned sprints… It had also completed 2,345 story points – 748 remained to be completed.”
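The burndown idea the report describes is straightforward to sketch. This snippet is my own illustration, not the FBI’s tooling: it burns a backlog down at a constant velocity, using the 3,093-point total quoted above and the 80-points-per-sprint average reported below. The 39-sprint result is simply the arithmetic, not a figure from the report:

```python
# Illustrative burndown: remaining story points per sprint at a constant
# velocity. The 3,093-point backlog is from the report; the rest is a sketch.
def burndown(total_points, velocity):
    """Yield (sprint_number, points_remaining) until the backlog is empty."""
    remaining = total_points
    sprint = 0
    while remaining > 0:
        sprint += 1
        remaining = max(0, remaining - velocity)
        yield sprint, remaining

# At 80 points per sprint, a 3,093-point backlog takes 39 sprints to clear.
sprints = list(burndown(3093, 80))
print(len(sprints), sprints[-1])   # 39 (39, 0)
```

Plotting those pairs gives the burndown chart the report refers to.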

Velocity

The Report says this of the Agile team’s velocity:

“According to FBI officials, after five sprints have been completed, the velocity, or rate at which an Agile team completes story points, can be used to project the completion rate of future work. During Sprints 5 through 21, the Sentinel team’s average velocity was 80 story points per sprint.

“During our review, we estimated that if the team’s velocity remained at 80 story points per sprint, the FBI would complete about 55 percent of the intended functionality by the end of the project’s originally planned 24 sprints on September 23, 2011. At that rate of development we estimated that Sentinel will be completed in June 2012.

“On September 6, 2011, the FBI CIO stated that the FBI had added six development sprints to Sentinel’s development schedule and that the FBI then planned to end development on December 16, 2011, after 30 sprints. After development ended, the FBI planned to test Sentinel for about 6 weeks and then deploy the system to all users in January 2012. During the additional development sprints, the FBI planned to finish the functionality work that it previously planned to complete by September 23, 2011.

“Based on the average velocity of 80 story points per sprint, and the number of remaining story points to be completed (1,548) we estimated that the FBI would complete about 71 percent of the intended functionality by the end of the project’s 30 development sprints on December 16, 2011.

“On December 1, 2011, the FBI again extended the schedule for the completion of Sentinel. The CTO stated that the FBI had added four development sprints to Sentinel’s development schedule and that the FBI now plans to end development in February 2012, after 34 sprints. After development, the FBI plans to test Sentinel for about 12 weeks and then deploy the system to all users in May 2012. During this testing period, the FBI plans to test Sentinel’s hardware and execute a test of all major Sentinel functionality that will involve personnel from across the FBI.

“Also in December 2011, after the FBI received a copy of our draft report, the FBI reported to us that during Sprints 5 through 28 it had completed 2,167 story points, an average of 90 story points per sprint – 10 more story points than its average rate as of September 2011.

“Based on this average velocity and the number of remaining story points to be completed (748) during the final 5 sprints under this plan, the Sentinel team must increase its average velocity to approximately 150 story points per sprint.

“However, the six sprints between the end of development and deployment – during which the FBI will test Sentinel – could also have story points assigned to them that the FBI is not accounting for at this time, and as a result the total number of story points to complete the project could increase. Without including such an increase, the FBI would need to average about 68 story points per sprint over the total 11 sprints remaining before the planned May 2012 deployment.”
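The report’s required-velocity figures reduce to one division – remaining story points over remaining sprints. A minimal sketch (the function name is mine) reproduces all three numbers quoted above:

```python
# Reproducing the OIG report's velocity arithmetic.
def required_velocity(points_remaining, sprints_remaining):
    """Average story points per sprint needed to finish on schedule."""
    return points_remaining / sprints_remaining

# 748 points left after Sprint 28, with 5 development sprints to go:
# roughly 150 points per sprint would be needed.
assert round(required_velocity(748, 5)) == 150

# Spread over all 11 sprints before the May 2012 deployment,
# the bar drops to about 68 points per sprint.
assert round(required_velocity(748, 11)) == 68

# Observed average over Sprints 5-28 (24 sprints, 2,167 points): ~90/sprint.
assert round(2167 / 24) == 90
```

The gap between the required 150 and the observed 90 is why the report doubts the development deadline.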

Sentinel Agile Development Approach

The report’s Appendix says this about the FBI’s approach to its Agile development for Sentinel:

“In October 2010 the FBI identified a total of 670 stories for the Sentinel Product Backlog, or the compilation of all of the project’s stories. The FBI has mapped the Product Backlog to each of the requirements in Sentinel’s Systems Requirements Specification (SRS), which serves as an important control to ensure that the backlog, and the stories it contains, cover all of Sentinel’s requirements. The FBI also assigned weighted amounts, or “story points,” to each story in the Product Backlog based on the difficulty of the work associated with each story. The FBI assigned a total of 3,093 story points to its 670 stories in the Sentinel Product Backlog.”

The Report’s Conclusion

Although it appears that the FBI has made good progress with its Agile development, adopting Agile may not be enough to get the project exactly on track, with some testing issues and hardware problems discussed in the report.

“It is too early to judge whether the FBI’s Agile development of Sentinel will meet its newly revised budget and completion goals and the needs of FBI agents and analysts.

“While the Sentinel Advisory Group responded positively to the version of Sentinel it tested, results from wider testing were not as positive. Also, none of the Agile-developed Sentinel has been deployed to all users to give them the ability to enter actual case data and assist FBI agents and analysts in more efficiently performing their jobs.

“Despite the FBI’s self-reported progress in developing Sentinel, we are concerned that the FBI is not documenting that the functionality developed during each sprint has met the FBI’s acceptance criteria. Our concerns about the lack of transparency of Sentinel’s progress are magnified by the apparent lack of comprehensive and timely system testing.

“Our concerns about the lack of transparency also extend to Sentinel’s cooperation with internal and external oversight entities, to which Sentinel did not provide the necessary system documentation for them to perform their critical oversight and reporting functions.

“We believe that this issue could be resolved, at least in part, with a revision to the FBI’s Life Cycle Management Directive to include standards for Agile development methodologies.

“….Sentinel experienced significant performance problems during the Sentinel Functional Exercise. The FBI attributed these performance problems to either the system architecture or the computer hardware.

“According to the FBI, subsequent operational testing confirmed the inadequacy of the legacy hardware and the requirement to significantly expand the infrastructure before the system could be deployed to all users. In November 2011, the FBI requested that Lockheed Martin provide a cost proposal for this additional hardware.”

The FBI’s Response

In its response to the report, the FBI says:

“….we are mindful of the short delay we have recently encountered under our new “Agile” approach. The Sentinel development schedule has recently been extended by two months (from December 2011 to February 2012), and the FBI-wide deployment is now scheduled for May 2012, as described in this Report.

“This modest extension is due primarily to the need to implement a standard five-year “refresh” of computer hardware, so the Sentinel software will provide the required functionality as intended. Indeed, you have determined that, given the pace at which the program has proceeded under the Agile approach over the time period you reviewed, your estimate for completion is essentially the same – June 2012.

“We have one concern with the current draft of the Report. We request that you note that the hardware we are acquiring for the refresh, which is being purchased using fiscal year 2012 operations and maintenance funds, is separate from the development activities being carried out by the Agile team under the development budget.

“The refresh is part of the normal and expected operations and maintenance activities of the FBI, and such a refresh is a common maintenance activity where hardware has reached its expected replacement threshold. We do not agree that the FBI is using operations and maintenance funds for the development of Sentinel…and we ask that you make this revision.”

NPfIT Cerner go-live at Bristol has “more problems than anticipated”

By Tony Collins

The BBC reports that there are “more problems than anticipated” with a patient-booking system at two Bristol hospitals run by North Bristol NHS Trust.

The trust describes the problems as “teething”.  Consultants say the problems are “potentially dangerous”.

Last month North Bristol went live with the Cerner Millennium system under an NPfIT contract with BT. The Trust says problems are due to software being used incorrectly. They have led to some patients missing their operations and the wrong patients being booked for operations, says the BBC.

Emails from executives at Frenchay and Southmead hospitals, seen by the BBC, said staff should be “vigilant” to check lists were “completely accurate”.

BBC Points West’s health correspondent Matthew Hill said emails sent by consultants to hospital bosses claimed operation lists printed by the system were “complete fiction” and “potentially dangerous”.

One consultant told the BBC he had been put down to operate on patients from a completely different speciality.

The trust said there had been “teething problems” and that there had been “more problems than anticipated”.

In an email to staff the trust said the change of system had been “a very big change” so there was “no surprise” there had been difficulties.

A trust spokesman said there were a series of problems around outpatients and the associated clinics and some of the data moved from old systems had not migrated as planned.

“We need to ensure that we rebuild and recreate the clinics to match what people expect them to be on the ground,” he said.

“In theatres we have had some issues but have absolutely ensured from the outset that clinical safety has been at the top and have ensured any risks and issues have been mitigated.”

Conservative MP Richard Bacon, a member of the Public Accounts Committee, has established through a Parliamentary question that the cost of the North Bristol Cerner implementation is much higher than for a non-NPfIT installation in the same city.

Health Minister Simon Burns told Bacon that the costs of a Cerner Millennium deployment at the North Bristol NHS Trust were £15.2m for deployment and an annual service charge of £2m.

This brought the total cost of the Cerner system over seven years to about £29m, which was more than three times the £8.2m price of a similar deployment outside of the NPfIT at University Hospitals Bristol Foundation Trust.
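The totals in the Parliamentary answer check out with simple arithmetic – seven years of the £2m annual service charge on top of the £15.2m deployment fee (the figures are from the answer; the script is illustrative):

```python
# Checking the North Bristol Cerner cost comparison from the
# Parliamentary answer. All figures in GBP millions.
deployment = 15.2        # one-off NPfIT deployment cost
annual_service = 2.0     # annual service charge
years = 7

npfit_total = deployment + annual_service * years   # "about £29m"
non_npfit = 8.2          # comparable non-NPfIT deployment at UH Bristol

print(f"NPfIT total over {years} years: £{npfit_total:.1f}m")
print(f"Ratio vs non-NPfIT deployment: {npfit_total / non_npfit:.1f}x")
```

The ratio comes out at roughly three and a half times the non-NPfIT price.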

Comment

Several Cerner implementations under the NPfIT have gone awry, but the problems have eventually been resolved. The question is whether patient care and treatment are affected in the meantime. The lack of openness over problems with patient care in the NHS means that the answer will probably never be known, which underlines the need for better regulation of hospital IT implementations.

Does hospital IT need airline-style safety certification?