A third party audit of a controversial patient data-sharing arrangement between a London NHS Trust and Google DeepMind appears to have skated over the core issues that generated the controversy in the first place.
The audit (full report here) — conducted by law firm Linklaters — of the Royal Free NHS Foundation Trust’s acute kidney injury detection app system, Streams, which was co-developed with Google DeepMind (using an existing NHS algorithm for early detection of the condition), does not examine the problematic 2015 information-sharing agreement inked between the pair, which allowed data to begin flowing.
“This Report contains an assessment of the data protection and confidentiality issues associated with the data protection arrangements between the Royal Free and DeepMind. It is limited to the current use of Streams, and any further development, functional testing or clinical testing, that is either planned or in progress. It is not a historical review,” writes Linklaters, adding that: “It includes consideration as to whether the transparency, fair processing, proportionality and information sharing concerns outlined in the Undertakings are being met.”
Yet it was the original 2015 contract that triggered the controversy, after it was obtained and published by New Scientist, with the wide-ranging document raising questions over the broad scope of the data transfer; the legal bases for patients’ information to be shared; and leading to questions over whether regulatory processes intended to safeguard patients and patient data had been sidelined by the two main parties involved in the project.
In November 2016 the pair scrapped and replaced the original five-year contract with a new one — which put in place additional information governance steps.

They also went on to roll out the Streams app for use on patients in multiple NHS hospitals — despite the UK’s data protection regulator, the ICO, having instigated an investigation into the original data-sharing arrangement.

And just over a year ago the ICO concluded that the Royal Free NHS Foundation Trust had failed to comply with data protection law in its dealings with Google’s DeepMind.

The audit of the Streams project was a requirement of the ICO.

Though, notably, the regulator has not endorsed Linklaters’ report. On the contrary, it warns that it is seeking legal advice and could take further action.
In a statement on its website, the ICO’s deputy commissioner for policy, Steve Wood, writes: “We cannot endorse a report from a third party audit but we have provided feedback to the Royal Free. We also reserve our position in relation to their position on medical confidentiality and the equitable duty of confidence. We are seeking legal advice on this issue and may require further action.”
In a section of the report listing exclusions, Linklaters confirms the audit does not consider: “The data protection and confidentiality issues associated with the processing of personal data about the clinicians at the Royal Free using the Streams App.”
So essentially the core controversy, related to the legal basis for the Royal Free to pass personally identifiable information on 1.6 million patients to DeepMind when the app was being developed, and without people’s knowledge or consent, goes unaddressed here.

And Wood’s statement pointedly reiterates that the ICO’s investigation “found a number of shortcomings in the way patient records were shared for this trial”.

“[P]art of the undertaking committed Royal Free to commission a third party audit. They have now done this and shared the results with the ICO. What’s important now is that they use the findings to address the compliance issues addressed in the audit swiftly and robustly. We’ll be continuing to liaise with them in the coming months to ensure this is happening,” he adds.
“It’s important that other NHS Trusts considering using similar new technologies pay regard to the recommendations we gave to Royal Free, and ensure data protection risks are fully addressed using a Data Protection Impact Assessment before deployment.”
While the report is something of a frustration, given the evident historical omissions, it does raise some points of interest — including suggesting that the Royal Free should probably scrap a Memorandum of Understanding it also inked with DeepMind, in which the pair set out their ambition to apply AI to NHS data.

This is recommended because the pair have apparently abandoned their AI research plans.
On this Linklaters writes: “DeepMind has informed us that they have abandoned their potential research project into the use of AI to develop better algorithms, and their processing is limited to execution of the NHS AKI algorithm… In addition, the majority of the provisions in the Memorandum of Understanding are non-binding. The limited provisions that are binding are superseded by the Services Agreement and the Information Processing Agreement discussed above, hence we think the Memorandum of Understanding has very limited relevance to Streams. We recommend that the Royal Free considers if the Memorandum of Understanding continues to be relevant to its relationship with DeepMind and, if it is not relevant, terminates that agreement.”
In another section, discussing the NHS algorithm that underpins the Streams app, the law firm also points out that DeepMind’s role in the project is little more than helping provide a glorified app wrapper (on the app design front the project also employed UK app studio ustwo, so DeepMind can’t claim app design credit either).
“Without intending any disrespect to DeepMind, we do not think the concepts underpinning Streams are particularly ground-breaking. It does not, by any measure, involve artificial intelligence or machine learning or other advanced technology. The benefits of the Streams App instead come from a very well-designed and user-friendly interface, backed up by solid infrastructure and data management that provides AKI alerts and contextual clinical information in a reliable, timely and secure manner,” Linklaters writes.
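The report’s point about the underlying technology is easy to illustrate: the national AKI algorithm at the heart of Streams is essentially a short set of creatinine-ratio rules. Below is a simplified sketch based on the published NHS England algorithm (the full specification adds baseline-selection and absolute-rise rules); it is an illustration, not the actual Streams implementation.

```python
# Simplified sketch of the NHS England acute kidney injury (AKI) algorithm
# that Streams executes: rules comparing a patient's current serum creatinine
# to a baseline value. No machine learning involved. Ratio thresholds follow
# the published national algorithm; baseline selection is more involved in
# practice, and the absolute-rise (48-hour) rule is omitted here.

def aki_warning_stage(current_umol_l: float, baseline_umol_l: float):
    """Return an AKI warning stage (1-3), or None if no alert is triggered.

    current_umol_l:  latest serum creatinine result, in micromol/L
    baseline_umol_l: reference value, e.g. the lowest result from the
                     previous 7 days (a simplification of the spec)
    """
    ratio = current_umol_l / baseline_umol_l
    if ratio >= 3.0:
        return 3  # AKI stage 3 alert
    if ratio >= 2.0:
        return 2  # AKI stage 2 alert
    if ratio >= 1.5:
        return 1  # AKI stage 1 alert
    return None   # no alert
```

In Streams, an alert of this kind is surfaced to clinicians alongside contextual clinical information — which, as Linklaters notes, is where the app’s actual value lies.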
What DeepMind did bring to the project, and to its other NHS collaborations, is money and resources — providing its development resources free for the NHS at the point of use, and stating (when asked about its business model) that it would decide how much to charge the NHS for these app ‘innovations’ later.

Yet the commercial services the tech giant is providing to what are public sector organizations do not appear to have been put out to open tender.

Also notably excluded from the Linklaters audit: any scrutiny of the project vis-a-vis competition law, compliance with public procurement rules, and any concerns relating to possible anticompetitive behavior.

The report does highlight one potentially problematic data retention issue for the current deployment of Streams, saying there is “currently no retention period for patient information on Streams” — meaning there is no process for deleting a patient’s medical history once it reaches a certain age.

“This means the information on Streams currently dates back eight years,” it notes, suggesting the Royal Free should probably set an upper limit on the age of information contained in the system.

While Linklaters largely glosses over the chequered origins of the Streams project, the law firm does make a point of agreeing with the ICO that the original privacy impact assessment for the project “should have been completed in a more timely manner”.

It also describes it as “relatively thin given the scale of the project”.
Giving its response to the audit, health data privacy advocacy group MedConfidential — an early critic of the DeepMind data-sharing arrangement — is roundly unimpressed, writing: “The biggest question raised by the Information Commissioner and the National Data Guardian appears to be missing — instead, the report excludes a “historical review of issues arising prior to the date of our appointment”.
“The report claims the ‘vital interests’ (i.e. remaining alive) of patients is justification to protect against an “event [that] might only occur in the future or not occur at all”… The only ‘vital interest’ protected here is Google’s, and its desire to hoard medical records it was told were unlawfully collected. The vital interests of a hypothetical patient are not the vital interests of a real data subject (and the GDPR tests are demonstrably unmet).
“The ICO and NDG asked the Royal Free to justify the collection of 1.6 million patient records, and this legal opinion explicitly provides no answer to that question.”