Category Archives: data sharing

NHS “Wachter” digital review is delayed – but does it matter?

By Tony Collins

The Wachter review of NHS technology was due to be published in June but has been delayed. Would it matter if it were delayed indefinitely?

A “Yes Minister” programme about a new hospital in North London said it all, perhaps. An enthusiastic NHS official shows the minister round a hospital staffed with 500 administrators. It has the latest technology on the wards.

“It’s one of the best run hospitals in the country,” the NHS official tells the minister, adding that it’s up for the Florence Nightingale award for the standards of hygiene.

“But it has no patients,” says the minister.

Another health official tells the minister,

“First of all, you have to sort out the smooth running of the hospital. Having patients around would be no help at all.” They would just be in the way, adds Sir Humphrey.

In the Wachter review’s terms of reference (“Making IT work: harnessing the power of health IT to improve care in England”) there is a final bullet point that refers, obliquely, to a need to consider patients. Could the Wachter terms of reference have been written by a satirist who wanted to show how it was possible to have a review of NHS IT for the benefit of suppliers, clinical administrators and officialdom but not patients?

The Wachter team will, according to the government,

• Review and articulate the factors impacting the successful adoption of health information systems in secondary and tertiary care in England, drawing relevant comparisons with the US experience;

• Provide a set of recommendations drawing on the key challenges, priorities and opportunities for the health and social care system in England. These recommendations will cover both the high-level features of implementations and the best ways in which to engage clinicians in the adoption and use of such systems.

In making recommendations, the board will consider the following points:

• The experiences of clinicians and Trust leadership teams in the planning, implementation and adoption of digital systems and standards;

• The current capacity and capability of Trusts in understanding and commissioning of health IT systems and workflow/process changes.

• The current experiences of a number of Trusts using different systems and at different points in the adoption lifecycle;

• The impact and potential of digital systems on clinical workflows and on the relationship between patients and their clinicians and carers.

Yes, there’s the mention of “patients” in the final bullet point.

Existing systems?

Some major IT companies have, for decades, lobbied – often successfully – for much more public investment in NHS technology. Arguably that is not the priority, which is to get existing systems to talk to each other – which would be for the direct benefit of patients whose records do not follow them wherever they are looked at or treated within the NHS.

Unless care and treatment is at a single hospital, the chances of medical records following a patient around different sites, even within the same locality, are slim.

Should joining up existing systems be the single main objective for NHS IT? One hospital consultant told me several years ago – and his comment is as relevant today –

“My daughter was under treatment from several consultants and I could never get a joined-up picture. I had to maintain a paper record myself just to get a joined-up picture of what was going on with her treatment.”

Typically one patient will have multiple sets of paper records. Within one hospital, different specialities will keep their own notes. Fall over and break your leg and you have a set of orthopaedic notes; have a baby and you will have a totally different set of notes. Those two sets are rarely joined up.

One clinician told me, “I have never heard a coroner say that a patient died because too much information was shared.”

And a technology specialist who has multiple health problems told me,

“I have different doctors in different places not knowing what each other is doing to me.”

As part of wider research into medical records, I asked a hospital consultant in a large city with three major hospitals whether records were shared at least locally.

“You must be joking. We have three acute hospitals. Three community intermediate teams are in the community. Their records are not joined. There is one private hospital provider. If you get admitted to [one] hospital and then get admitted to [another] the next week your electronic records cannot be seen by the first hospital.  Then if you get admitted to the third hospital the week after, again not under any circumstances will your record be able to be viewed.”

Blood tests have to be repeated, as do x-rays; but despite these sorts of stories of a disjointed NHS, senior health officials, in the countless NHS IT reviews there have been over the past 30 years, will, it seems, still put the simplest ideas last.

It would not cost much – some estimate less than £100m – to provide secure access to existing medical records from wherever they need to be accessed.

No need for a massive investment in new technology. No need for a central patient database, or a central health record. Information can stay at its present location.  Just bring local information together on local servers and provide secure access.

A locum GP said on the Pulse website recently,

“If you are a member of the Armed Forces, your MO can get access to your (EMIS-based) medical record from anywhere in the world. There is no technical reason why the NHS cannot do this. If need be, the patient could be given a password to permit a GP to see another Surgery’s record.”

New appointments

To avoid having patients clog up super-efficient hospitals, Sir Humphrey would have the Wachter review respond to concerns about a lack of joined up care in the NHS by announcing a set of committees and suggesting the Department of Health and NHS England appoint a new set of senior technologists.

Which is just what has happened.

Last week NHS England announced  “key appointments to help transform how the NHS uses technology and information”. [One of the NHS appointments is that of a Director of Digital Experience, which is not a fictional title, incidentally. Ironically it seems to be the most patient-facing of the new jobs.]

Said the announcement,

“The creation of these roles reflects recommendations in the forthcoming review on the future of NHS information systems by Dr Bob Wachter.

“Rather than appoint a single chief information and technology officer, consistent with the Wachter review the NHS is appointing a senior medical leader as NHS Chief Clinical Information Officer supported by an experienced health IT professional as NHS Chief Information Officer.

“The first NHS Chief Clinical Information Officer will be Professor Keith McNeil, a former transplant specialist who has also held many senior roles in healthcare management around the world, including Chief Executive Officer at Addenbrooke’s Hospital, Cambridge University Hospitals NHS Foundation Trust and Chief Executive Officer at the Royal Brisbane and Women’s Hospital in Australia.

“The new NHS Chief Information Officer will be Will Smart, currently Chief Information Officer at the Royal Free London NHS Foundation Trust. Mr Smart has had an extensive career in IT across the NHS and in the private sector.

“The NHS CCIO and NHS CIO post-holders will act on behalf of the whole NHS to provide strategic leadership, also chairing the National Information Board, and acting as commissioning ‘client’ for the relevant programmes being delivered by NHS Digital (previously known as the Health and Social Care Information Centre).

“The roles will be based at NHS England and will report to Matthew Swindells, National Director: Operations and Information, but the post-holders will also be accountable to NHS Improvement, with responsibility for its technology work with NHS providers.

“In addition, Juliet Bauer has been appointed as Director of Digital Experience at NHS England. She will oversee the transformation of the NHS Choices website and the development and adoption of digital technology for patient ‘supported self-management’, including for people living with long term conditions such as diabetes or asthma. Ms Bauer has led delivery of similar technology programmes in many sectors, including leading the move to take Times Newspapers online…”

Surely a first step, instead of arranging new appointments and committees, and finding ways of spending money on new technology, would be to put in place data sharing agreements between hospitals?

A former trust chief executive told me,

“In primary care, GPs will say the record is theirs. Hospital teams will say it is our information and patient representative groups will say it is about patients and it is their information. In maternity services there are patient-held records because it is deemed good practice that mums-to-be should be fully knowledgeable and fully participating in what is happening to them.

“Then you get into complications of Data Protection Act. Some people get very sensitive about sharing information across boundaries: social workers and local authority workers. If you are into long-term continuous care you need primary care, hospital care and social care. Without those being connected you may do half a job or even less than that potentially. There are risks you run if you don’t know the full information.”

He added that the Summary Care Record – a central database of every patient’s allergies, medication and any adverse reactions to drugs – was a “waste of time”.

“You need someone selecting information to go into it [the Summary Care Record] so it is liable to omissions and errors. You need an electronic patient record that has everything available but is searchable. You get quickly to what you want to know. That is important for that particular clinical decision.”

Is it the job of civil servants to make the simple sound complicated?

Years ago, a health minister invited me for an informal meeting at the House of Commons to show me, in confidence, a one-page civil service briefing paper on why it was not possible to use the internet for making patient information accessible anywhere.

The minister was incredulous and wanted my view. The civil service paper said that nobody owned the internet so it couldn’t be used for the transfer of patient records.  If something went wrong, nobody could be blamed.

That banks around the world use the internet to provide secure access to individual bank accounts was not mentioned in the paper, nor the existence of the CHAPS network which, by July 2011, had processed one quadrillion (£1,000,000,000,000,000) pounds.

Did the briefing paper show that the civil service was frightened by the apparent simplicity of sharing patient information on a secure internet connection? If nothing else, the paper showed how health service officials will tend, instinctively, to shun the cheapest solutions. Which may help to explain how the (failed) £10bn National Programme for IT came into being in 2002.

Jargon

Nobody will be surprised if the Wachter review team’s report is laden with jargon about “delays between technology being introduced and a corresponding rise in output”. It may talk of how new technology could reduce the length of stay by 0.1528 of a bed day per patient, saving a typical hospital £1.8m annually or 7,648 bed days.

It may refer to visions, envisioning fundamental change, establishing best practice as the norm, and a need for adaptive change.

Would it not be better if the review team spoke plainly of the need for a patient with a fractured leg not having to carry a CD of his x-ray images to different NHS sites in a carrier bag?

Some may await the Wachter report with a weary apprehension that its delay – even indefinitely – will make not a jot of difference. Perhaps Professor Wachter will surprise them. We live in hope.

Wachter review terms of reference.

Review of IT in the NHS

https://ukcampaign4change.com/2016/02/09/another-npfit-it-scandal-in-the-making/

Hunt announces Wachter review

What can we learn from the US “hospitalist” model?

After billions spent on NHS IT, a carrier bag to transfer x-ray images

By Tony Collins

After fracturing my ankle (slipping on a slope while mowing the lawn) I’ve been surprised how well parts of the NHS work – but not when it comes to the electronic transfer of records and PACS x-ray images from one trust area to another.

The minor injuries unit at one trust wasn’t able to send its PACS images to another trust’s orthopaedic department because it used a different PACS. [The NHS has spent more than £700m on PACS.]

“Can’t we email the images?” said a senior nurse at the minor injuries unit. In reply the clinician looking at my x-rays gave a look that suggested emailing x-rays was impossible, perhaps for security and cost reasons. [PACS images are sometimes tens of MBs.]

In the end the minor injuries unit (which within its own sections shared data electronically) had to download my x-rays onto CD for me to take to the other trust’s orthopaedic department. The CD went into a carrier bag.

The next day, at a hospital with an orthopaedic department, after 4-5 hours of waiting in a very busy A&E, I gave a doctor the CD. “I don’t think we can read that,” he said. “We don’t have any computers which take CDs.”

After a long search around a large general hospital the tired doctor eventually found a PC with a CD drive. Fortunately the minor injuries unit had downloaded onto the disc a self-executing program to load the x-rays. Success. He gave his view of the fracture.

Even then he didn’t have my notes from the minor injuries unit.

Comment

My care was superb. What was surprising was seeing how things work – or don’t – after the NHS has spent more than £20bn on IT over the past 20 years.

The media is bombarded with press releases about IT innovations in the NHS. From these it’s easy to believe the NHS has the most up-to-date IT in the world. Some trusts do have impressive IT – within that trust.

It’s when records and x-rays need to be transferred outside the trust’s area that the NHS comes unstuck.

As a nurse at my GP’s practice said, “Parts of the NHS are third-world.”

Since 2004 billions have been spent on systems to create shareable electronic patient records. But it’s not happening.

Within those billions spent on IT in the NHS, couldn’t a little bit of money be set aside for transferring x-rays and patient notes by secure email? That’s the real innovation the NHS needs, at least for the sake of patients.

In the meantime the safest way for x-rays and notes to be transferred from one trust to another is within the patient’s carrier bag.

NHS database: is it a top IT priority?

By Tony Collins

It’s called the NHS database but the new “giant” medical records system is to be run by the Health and Social Care Information Centre, largely for the benefit of researchers.

Though it may help patients in the longer term, say by helping to identify which treatments work and which don’t, it is arguably not the NHS’s most immediate IT priority.

I said on BBC R4’s Today programme this morning that a top NHS IT priority is providing secure links to health records so that patients with acute and chronic illnesses can be treated in one part of the NHS one week and another part of the health service the following week – perhaps in a different county – and have their updated records accessible wherever they go.

At present patients with multiple problems can end up being treated in different NHS or non-NHS centres without each organisation knowing what the other is doing.  This is dangerous for patients and gives the impression the NHS is technologically backwards.

Links can be made to existing medical records – there are millions of electronic records already in the NHS – without creating a big central database. The records can reside where they are at the moment, inside and outside the NHS, and be linked to securely by clinicians and nurses, subject to the patient’s specific consent.

Indeed patients should be able to look at their record online and correct any mistakes.

Research database

My comment on BBC R4 Today that a research database is a good idea has brought a mixed response – understandably, because there are risks. We need some facts from the Health and Social Care Information Centre on who is going to run the database, and how information will be made genuinely anonymous.

The HSCIC concedes in its information material that some patient information on the database will be potentially identifiable, but it implies this is acceptable if the organisations using the data can be trusted.

Why must information be potentially identifiable? And to what extent can the HSCIC be trusted to run the database? It is, after all, managing contracts under the National Programme for IT, a scheme which Jeremy Hunt called a “huge disaster”.

How much extra will be paid to BT which runs the SUS database under the “dismantled” NPfIT? It is likely that BT’s Spine and SUS-related work will link into the new “NHS database”. Have any new contracts gone to open competitive tender?

Firecontrol disaster and NPfIT – two of a kind?

By Tony Collins

Today’s report of the Public Accounts Committee on the Firecontrol project could, in many ways, be a report on the consequences of the failure of the National Programme for IT in the NHS in a few years’ time.

The Firecontrol project was built along similar lines to the NPfIT but on a smaller scale.

With Firecontrol, Whitehall officials wanted to persuade England’s semi-autonomous 46 local fire authorities to take a centrally-bought  IT system while simplifying and unifying their local working practices to adapt to the new technology.

NPfIT followed the same principle on a bigger scale: Whitehall officials wanted to persuade thousands of semi-autonomous NHS organisations to adopt centrally-bought technologies. But persuasion didn’t work, in either the fire services or the NHS.

More similarities

The Department for Communities and Local Government told the PAC that the Firecontrol system was “over-specified” – that it was unnecessary to have back-up for an incident from a fire authority on the other side of the country.

Many in the NHS said that NPfIT was over-specified. The gold-plated trimmings, and elaborate attempts at standardisation,  made the patient record systems unnecessarily complicated and costly – and too difficult to deliver in practice.

As with the NPfIT, the Firecontrol system was delayed and local staff  had little or no confidence it would ever work, just as the NHS had little or no faith that NPfIT systems would ever work.

Both projects failed. Firecontrol wasted at least £482m. The Department of Communities and Local Government cancelled it in 2010. The Department of Health announced in 2011 that the NPfIT was being dismantled but the contracts with CSC and BT could not be cancelled and the programme is dragging on.

Now the NHS is buying its own local systems that may or may not be interoperable. [Particularly for the long-term sick, especially those who have to go to different specialist centres, it’s important that full and up-to-date medical records go wherever the patients are treated and don’t at the moment, which increases the risks of mistakes.]

Today’s Firecontrol report expresses concern about a new – local – approach to fire services IT. Will the local fire authorities now end up with a multitude of risky local systems, some of which don’t work properly and all of which are incompatible – in other words, don’t talk to each other?

This may be exactly the concern of a post-2015 government about NHS IT. With the NPfIT slowly dying, NHS trusts are buying their own systems. The coalition wants them to interoperate, but will they?

Could a post-2015 government introduce a new (and probably disastrous) national NHS IT project – son of NPfIT – and justify it by drawing attention to how very different it is to the original NPfIT, e.g. that this time the programme has the buy-in of clinicians?

The warning signs are there, in the PAC’s report on Firecontrol. The report says there are delays on some local IT projects being implemented in fire authorities, and the systems may not be interoperable. The PAC has

“serious concerns that there are insufficient skills across all fire authorities to ensure that 22 separate local projects can be procured and delivered efficiently in so far as they involve new IT systems”.

National to local – but one extreme to the other?

The PAC report continues

“There are risks to value for money from multiple local projects. Each of the 22 local projects is now procuring the services and systems they need separately.

“Local teams need to have the right skills to get good deals from suppliers and to monitor contracts effectively. We were sceptical that all the teams had the appropriate procurement and IT skills to secure good value for money.

“National support and coordination can help ensure systems are compatible and fire and rescue authorities learn from each other, but the Department has largely devolved these roles to the individual fire and rescue authorities.

“There is a risk that the Department has swung from an overly prescriptive national approach to one that provides insufficient national oversight and coordination and fails to meet national needs or achieve economies of scale. 

Comment

PAC reports are meant to be critical but perhaps the report on Firecontrol could have been a little more positive about the new local approach that has the overwhelming support of the individual fire and rescue authorities.  

Indeed the PAC quotes fire service officials as saying that the local approach is “producing more capability than was expected from the original FiReControl project”. And at a fraction of the cost of Firecontrol.

But the PAC’s Firecontrol Update Report expresses concern that

– projected savings from the local approach are now less than originally predicted

– seven of the 22 projects are running late and two of these projects have slipped by 12 months

– “We have repeatedly seen failures in project management and are concerned that the skills needed for IT procurement may not be present within the individual fire and rescue authorities, some of which have small management teams,” says the PAC.

On the other hand …

The shortfall in projected savings is small – £124m against £126m – and all the local programmes are expected to be delivered by March 2015, only three months later than originally planned.

And, as the PAC says, the Department for Communities and Local Government has told MPs that a central peer review team is in place to help share good practice – mainly made up of members of fire and rescue authorities themselves.

In addition, part of the £82m of grant funding to local fire services has been used by some authorities to buy in procurement expertise.

Whether it is absolutely necessary – and worth the expense – for IT in fire services to link up is open to question; it may be needed only in a national emergency.

In the NHS it is absolutely necessary for the medical records of the chronically sick to link up – but that does not justify a son-of-NPfIT programme. Linking can be done cheaply by using existing records and having, say, regional servers pull together records from individual hospitals and other sites.

Perhaps the key lesson from the Firecontrol and the NPfIT projects is that large private companies can force their staff to use unified IT systems whereas Whitehall cannot force semi-autonomous public sector organisations to use whatever IT is bought centrally.

It’s right that the fire services are buying local IT and it’s right that the NHS is now too. If the will is there to do it cheaply, linking up the IT in the NHS can be done without huge central administrative edifices.

Lessons from FireControl (and NPfIT?) 

The National Audit Office identifies these main lessons from the failure of Firecontrol:

– Imposing a single national approach on locally accountable fire and rescue authorities that were reluctant to change how they operated

–  Launching the programme too quickly without applying basic project approval checks and balances

– Over-optimism on the deliverability of the IT solution.

– Issues with project management including consultants who made up half of the management team and were not effectively managed

MP Margaret Hodge, chair of the Public Accounts Committee, today sums up the state of Firecontrol:

“The original FiReControl project was one of the worst cases of project failure we have seen and wasted at least £482 million of taxpayers’ money.

“Three years after the project was cancelled, the DCLG still hasn’t decided what it is going to do with many of the specially designed, high-specification facilities and buildings which had been built. Four of the nine regional control centres are still empty and look likely to remain so.

“The Department has now provided fire and rescue authorities with an additional £82 million to implement a new approach based on 22 separate and locally-led projects.

“The new programme has already slipped by three months and projected savings are now less than originally predicted. Seven of the 22 projects are reportedly running late and two have been delayed by 12 months. We are therefore sceptical that projected savings, benefits and timescales will be achieved.

“Relying on multiple local projects risks value for money. We are not confident that local teams have the right IT and procurement skills to get good deals from suppliers and to monitor contracts effectively.

“There is a risk that the DCLG has swung from an overly prescriptive national approach to one that does not provide enough national oversight and coordination and fails to meet national needs or achieve economies of scale.

 “We want the Department to explain to us how individual fire and rescue authorities with varied degrees of local engagement and collaboration can provide the needed level of interoperability and resilience.

“Devolving decision-making and delivery to local bodies does not remove the duty on the Department to account for value for money. It needs to ensure that national objectives, such as the collaboration needed between fire authorities to deal with national disasters and challenges, are achieved.”

Why weren’t NPfIT projects cancelled?

NPfIT contracts included commitments that the Department of Health and the NHS allegedly did not keep, which weakened their legal position. And some DH officials did not really want to cancel the NPfIT contracts – indeed senior officials at NHS England seem to be trying to keep NPfIT projects alive through the Health and Social Care Information Centre, which is responsible for the local service provider contracts with BT and CSC.

PAC report on Firecontrol

What Firecontrol and the NPfIT have in common (2011)

Summary Care Record “unreliable”

By Tony Collins

The central Summary Care Record database (which is run by BT under its NPfIT Spine contract) is proving unreliable, Pulse reports today.

The SCR is supposed to give clinicians, particularly those working in A&E and for out-of-hours services, a view of the patient’s most recent medicines, allergies and bad reactions to drugs.

But one criticism of the scheme has always been the lack of any guarantee that the data in the SCR could be accurate or complete.

Researchers at University College London, led by Trisha Greenhalgh, found in a confidential draft report that doctors were unable to trust the SCR database as a single source of truth. They found that some information on the database was wrong, and that information which should have been included in the patient’s record was omitted for unknown reasons.

Now Pulse reports that some GP-derived information is going on the patient’s SCR, and some isn’t. One problem is that GPs must use smartcards to update the SCR database and some don’t use them.

The General Practitioners Committee of the British Medical Association has raised the matter with the Department of Health.

Dr Paul Roblin, chief executive of Oxfordshire, Buckinghamshire and Berkshire local medical committee, told Pulse that smartcards were not often used in Buckinghamshire, because they slowed down the practice IT system for normal use, with one practice reporting that smartcard use had interfered with allergy data.

Dr Roblin said that this made the record ‘unreliable’ and said that although most GPs would prefer to take their own history rather than relying on the SCR, and would double check all details with the patient, other health professionals may not realise the record is incomplete, and may not check the data.

He said, “Drug lists might not be complete and recent allergies may not be uploaded. The Summary Care Record is unreliable. Don’t rely on it. It’s an expensive initiative without a lot of benefit.”

Dr Chaand Nagpaul, GPC lead negotiator on IT, said the current arrangements undermine the benefit and usefulness of summary care records.

“The GPC have suggested workaround systems for practices who do not use smartcards, such as a ‘mop-up’ session where all new data is uploaded on to the national spine once a day. However, the DH decided against this option.”

There may be professionals who believe the SCR database represents an up-to-date record, said Nagpaul.

A DH spokesperson said that most practices which have created Summary Care Records use smartcards.

[Whether justified or not, the SCR scheme is believed to have cost about £250m so far.]

In 2010 Professor Ross Anderson at Cambridge University argued that the SCR could do more harm than good.

Richard Veryard also wrote on the unreliability of the SCR in 2010.

The Devil’s in the Detail – UCL report on the Summary Care Record.

Summary Care Record – where does the truth lie?

Medical dictionary should help prevent medication mistakes

By Tony Collins

The Department of Health says that a medicines dictionary, which is approved today, will make medical errors less likely by ensuring all staff who work in the NHS and healthcare use the same terminology when referring to medicines.

The Information Standards Board for Health and Social Care has approved the NHS dictionary of medicines and devices – called “dm+d” – as a standard which, says the Department of Health, “must be used by all staff”.

The DH says that “all doctors, nurses and pharmacists should move towards using the common medicines dictionary so that information exchanged electronically is accurate and safe”.

Using a single drug terminology will “enable information about patients’ medicines to transfer more effectively between different healthcare settings, reducing the risk of medication mistakes caused by human error”.

The NHS dictionary of medicines and devices is already used in the UK for the exchange of clinical information, including the Electronic Prescription Service and for patients’ Summary Care Records.

Dr Charles Gutteridge, National Clinical Director for Informatics at the Department of Health and Medical Director, Barts and the London NHS Trust said

“The adoption of dm+d is an important milestone. It will mean clearer and consistent communication throughout the NHS ensuring health professionals in all care settings …. I encourage all clinicians to accelerate their use of this common medical dictionary for the benefit of the patients we care for.”

Heidi Wright, from the Royal Pharmaceutical Society (RPS) said “The Royal Pharmaceutical Society supports the need for a single terminology to facilitate interoperability and to enable such initiatives as the Electronic Prescription Service (EPS). We believe that the opportunities created for using dm+d are substantial in terms of interoperability, opportunities for comparison and reducing variation, enhancing patient safety i.e. reducing risks associated with system interfaces and providing links to clinical systems such as the British National Formulary.”

The dictionary contains unique identifiers and associated textual descriptions for medicines and medical devices. It was developed and delivered through a partnership between the Department of Health Informatics Directorate and the NHS Business Services Authority.

The DH Information Strategy says that  reducing the number of inconsistent or incompatible terminologies will allow better integration between systems and across health and social care, and better information to support care and improvement of care.

Maude responds to fears over data-sharing between government agencies

By David Bicknell

Cabinet Office minister Francis Maude has responded to a Guardian story which reported that ministers are planning to shake up the law on using confidential personal data to make it easier for public-sector organisations to share confidential information supplied by the public.

The article had argued that “the proposals are similar to ‘database state’ legislation abandoned by the last Labour government in 2009 in the face of fierce opposition. That legislation was intended to reverse the basic data protection principle that sensitive personal information provided to one government agency should not normally be provided to another agency for a different purpose without explicit consent.”

Maude has responded to the Guardian piece, saying, “One of the guiding principles of this Government is the restoration of civil liberties and rolling back the intrusive state; that is why one of our first legislative acts was to scrap ID cards. So it is wrong to say our proposals are similar to the previous government’s abandoned “database state” legislation.

“We want people to be able to interact with government online, for example, in applying for benefits or a disabled parking permit, in a way that is quick, easy and secure. To do this we need to give them a way of proving their identity online, but only if they choose to. This would be done without a national, central scheme.

“This is all about putting the citizen in charge, not the state. But we are still taking great care to carefully consult on our plans. Throughout all our work in this area we have proactively engaged with privacy and consumer groups including NO2ID, Privacy International, Which?, London School of Economics, Oxford Internet Institute and Big Brother Watch.

“In June the Cabinet Office will publish, in a white paper, plans for improving data-linking across government. What will not be published in this white paper are any “fast-track” proposals that would require changes to the existing legislative landscape. Any such proposals will need careful consideration with the involvement of the public and interest groups with whom we will continue to engage.

“This is not a question of increasing the volume of data-sharing that takes place across government, but ensuring an appropriate framework is in place so that government can deliver more effective, joined-up and personalised public services, through effective data-linking.”