Old tech, new threats or new tech, old threats?
This podcast explores how new technology may introduce new cyber security risks.
The original health and social care system was not built with cyber security in mind. As we continue to innovate and use technology and data to improve patient care, are we introducing cyber security risk?
In this episode, Mike Fell, National Director of Cyber Operations, talks to Charlie Sammut, former Deputy Director of the Cyber, Data, Analytics and Surveillance Group, UK Health Security Agency, about the advantages and disadvantages of having a diverse technology estate, and how we speak to clinicians and scientists about cyber.
We have to see doing cyber well as a public health outcome.
Old tech, new threats or new tech, old threats? podcast transcript
Mike Fell, NHS England:
Hello and welcome to today's edition of The Cyber Sessions podcast. I'm your host, Mike Fell, the National Director of Cyber Operations at NHS England, and I'm really pleased that Charlie Sammut of the UK Health Security Agency is joining me. Today we're going to be talking about old tech, new threats or new tech, old threats.
The original health and social care system really was not built with cyber security in mind. Indeed, cyber security hadn't even really been conceived of, let alone the internet. As we continue to innovate and use technology and data to improve patient care, today's question hangs off whether we are introducing cyber security risk. That's probably a fairly short question to answer: of course we are. So we'll probably go a little bit deeper than just whether we're introducing that risk, and talk a little bit about how we're managing those risks.
So Charlie huge thanks for joining us today. Can I just ask you to briefly introduce yourself and tell us a bit about the amazing role that you have at the UK Health Security Agency?
Charlie Sammut:
Hello everyone. It's a really great pleasure to be here and I hope this podcast proves enjoyable and potentially informative for you. So yeah, I'm Charlie Sammut, Deputy Director for Cyber in the Data, Analytics and Surveillance group in UKHSA. But effectively what I am responsible for is the cyber security of the entirety of UKHSA's estate. So UKHSA is the UK Health Security Agency; as many of you may know, it was born during the pandemic.
Out of three different organisations, which were Public Health England, Test and Trace, and then the Joint Biosecurity Centre. And the mission is, I think for all of us working in healthcare, a really meaningful and enlivening one, which is that effectively we are there to prevent, prepare for and respond to infectious diseases and environmental hazards, and that's really to keep all of us safe. So we have a huge number of roles within that. The one that has the most pertinence, I think, to all of us is preparing for the next pandemic and responding to any current or future pandemics, and really my role within that is to make sure that everything we do within that, and we are a very data-driven, very technology-heavy organisation, is done in as cyber secure a way as possible. Me personally, I am a career civil servant, so I'm coming off 15 years in the civil service. I think, probably like you, Mike, I bounced around numerous different disciplines, and I wouldn't ever describe myself, I don't think, as a cyber expert, but I have now been doing cyber on and off for about the last 10 years.
Really working across numerous organisations, including working overseas in the Middle East with the Foreign Office and helping develop national cyber security capabilities there, then coming back and doing work on international cyber governance in the Foreign Office, and then doing work in central government on incident management and leading some incident management preparations as well. All of which I think prepped me quite well for this role, which is a big role: big team, big budget and huge amounts of risk and important things that we're doing.
Mike Fell, NHS England:
Great. Thank you. And I think, you know, it's a common theme amongst this that I don't think anything can prepare you for the reality of actually delivering cyber security risk mitigations, but that diversity is one of the joys of working around people like yourself who bring those different experiences and insights. I think it's a really important thing, not least because, as you well know, healthcare is a hugely large, complex sector with an incredibly diverse estate, both in the technology it runs and, in many cases, the age of that technology. I know that we talk to probably about 150,000 organisations contributing to what we recognise as that sector, doing everything from, you know, individuals delivering residential care all the way up to some of the most technologically advanced large data sets and most sensitive intellectual property in the world. So I guess one thing that'll be interesting to talk about is what the specific cyber security challenges are that you within the Health Security Agency face, and how do you go about approaching them?
Charlie Sammut:
Yeah, it's a great question, and I think you've touched on the key thing, the key challenge that I found coming into this job and, to be honest, what I found most fascinating about it, which is just the diversity of the estate and the user base that we're trying to protect. You know, we have everything from some incredible cloud environments, where there's really, really cutting edge data science going on with huge amounts of sensitive data, through to our health security laboratories, which were absolutely instrumental in the pandemic and are absolutely instrumental in surveillance of future pandemics.
And we, you know, have incredibly expert staff and, you know, machinery that is effectively unique, or is only used in those kinds of environments, and, touching on one of the points you made right at the start, much of it is old. It's still incredible machinery, but it wasn't built with cyber security in mind. And so there's that diversity, where we have to try and protect everything effectively. Everything is at risk. Everything is a vulnerability. And everything is actually instrumental to the way that the organisation does its business.
The other thing that really comes through in that challenge is that everything we do is also about enabling the organisation, about supporting patient care and supporting ultimately better health outcomes for the country as a whole. So there's that balance as well, constantly, between risk and delivery, always keeping that public health outcome in mind, which feels right at the heart of what UKHSA does, in a way that sometimes missions can feel slightly abstract. And I think particularly when you're a cyber security team, the sense of mission about what the wider organisation does can feel a bit abstract.
Because you're so intent on delivering cyber security. Whereas here we are constantly reminded about the balance between, you know, needing to be as cyber secure as we can, needing to enable the business, needing to really ensure we get the right public health outcomes, and constantly negotiating. And sometimes that does require a bit of balance, and sometimes, across the diversity of that estate, it throws up some really bizarre challenges, like how do you secure machinery from 30 years ago, and equally, you know, how do you ensure that we can really look after whole-nation data sets and make sure that they are as secure as possible.
So for me that's been the challenge that's felt unique, particularly in comparison to experience elsewhere: the diversity and the scale of the technology and the user base here.
Mike Fell, NHS England:
Yeah. And it's interesting, a kind of emerging theme within our sector more generally is almost the unification of things and the consolidation that we're recognising there. And I think, you know, what you've described there, without in any way throwing shade at the size and complexity of it, is almost a microcosm of the wider sector.
I know certainly I talk at length about this not being a cyber issue. It's a patient safety issue.
There's probably different language in your space a little bit with it, but ultimately for me I think that complexity, and taking the business-centric approach, whatever that business is, whether it's a patient, whether it is a society, whether it's an entire civilisation being protected from a pandemic, is there.
One of the other things that it makes me think about here is, I like a bit of an analogy, and Matthew Syed's book 'Black Box Thinking' is a really important one for me in this space. It looks at how the airline industry moved from a world in the 1960s, where planes genuinely fell out of the sky because people had filled them up with litres of fuel thinking they were filling them up with gallons of fuel, to the place now where I think we can generally have more confidence that the plane isn't going to fall out of the sky. And one of the things that strikes me is, you fly with any big airline now.
And they typically, bar a few, have a mixed fleet: they'll have half from one of the big airliner manufacturers and half from another. And I think it's a comparison here. I don't know what the answer is, I ask myself the question, but are there advantages to having a diverse estate, or is there a kind of panacea where just having that single vendor solution is actually the better one? Is that diversity helpful for you, or is it just a fig leaf, a veneer, thinking that because something's really old it won't be targeted and so it makes us safer?
Charlie Sammut:
Yeah. It's a really good question, isn't it? Because there's so many trade-offs in each of those decisions, you know, you can see absolutely the benefit of a single vendor.
And the, you know, benefits of unification equally. But then, resilience: the ability to see vulnerabilities and exploits out in the wild and their targeting. We routinely have this from your team in particular: this vendor has been targeted, this vulnerability has been found. At this point, the fact that we have some resilience, that we have different types of technology provider on site, actually gives us resilience and redundancy. It introduces more complexity, but it does give us resilience. But I think the overall point to me is, I don't think we will ever get away from that kind of diversity, because of the diversity of public health issues that we face.
And you just look at that, I think, in our laboratories, for instance: if we were just dealing with one type of problem, you know, one type of pandemic vector, you would be able to scale one type of solution. But we're not. In UKHSA, and I think this is true for a lot of the NHS as well, you're trying to cater for and respond to all sorts of different things. A laboratory is probably a great example of that, you know, a microcosm of what we face: you need lots of different technology, which comes from lots of different vendors, to be able to respond to those kinds of issues.
And that brings with it both the opportunity to be able to respond and to be flexible and to have expert people delivering expert outcomes on expert pieces of technology. It does also bring in lots of different risk. Because we've got lots of different technology we're trying to weld together, trying to assure it, trying to understand it, trying to bring it onto a network, really trying to secure it, all of that does actually introduce lots of risk. So I'm of the opinion, and I can see it particularly in terms of how we handle large data sets, that yes, there's got to be absolute benefit to unification.
One of the big pieces of work we have on is about creating an enterprise data analytics platform where actually we can really merge big data and really get outcomes out of it. But in terms of being able to respond, the reason we have a diverse estate is because we're responding to a diverse set of issues, and we need to maintain that. And so I don't see a point where we would ever move away from that diversity without significantly undermining our ability to flex and respond to issues. I think even if we were able to shift to a single provider, or several providers, you get the diseconomies of scale and you get the issues of monopoly.
Mike Fell, NHS England:
Yeah, I think that's a really insightful response to what was a slightly cartoonish question, in an attempt to drive you down to a binary answer. But I mean, I couldn't agree more. This is ultimately risk management, and I think you're right in closing off with that point about innovation: the way that we get the most effective, innovative solutions cannot be monopoly situations. We need that innovation for the UK economy, but also for the health outcomes that we're seeking. A slightly different angle on that one: we've talked a bit about all the technology. We've recently had the spring budget announcement of £3.4 billion of additional funding for the NHS to increase productivity, and clearly technology is one of the mainstays of making a more productive workforce. Asking clinicians, asking doctors, asking nurses to just work a little bit harder isn't the name of the game here. Technology has to be part of the answer. So how do you find that balance? When we're talking about the trade-off between securing that older technology versus implementing the new technology, what are the considerations that go through your mind?
Charlie Sammut:
When you look at UKHSA, and I presume when you look at the wider NHS estate, there's years and years of accretion of technology, right? And, you know, again the lab's a good example: there's incredible cutting edge equipment there, and there's some stuff that's been there for years because it's so essential to how the lab does its business, and there's just not a cost-effective way to overhaul all of that. So I think where I'm at the minute, there are two quite separate but interlinked elements. The first is securing new technology.
It's a complete no-brainer. We have to do it. It's the one area where I really don't want to take any risk, and I think where the Cabinet Office is coming from, and where the NHS is coming from, secure by design is absolutely vital. So as a programme develops, security is enmeshed right in its heart, and we've not always done that well. I don't think any organisation has, frankly.
But doing secure by design is the only way that we're ever going to scale security at the outset of any programme, and the reason I want that is, if we get it properly embedded in the business, part of just how the business works, we actually open up the opportunity to start to really get to grips with legacy tech. And legacy tech presents different issues. In some ways it's a risk I can live with a bit more, because it's been there for years and years, right.
But it does present much different risks. Firstly, in terms of just understanding what the risk actually is. I think again this is similar for other organisations, but understanding exactly what we have, and getting really good about making sure all our registers are up to scratch, what's your understanding, you know, that is a scale of challenge in and of itself, right? Because, like I said, this is in some cases up to 30 years of investment and 30 years of, you know, accretive technology put on top of accretive technology.
Mike Fell, NHS England:
Yeah.
Charlie Sammut:
So first is understanding it, which needs a lot of time and space and care. The second is then being able to actually prioritise what the risk is from it. There are, no doubt, organisations running completely out of date operating systems which are unpatchable, but actually, to what extent are they really vital to a medical process or not? And therefore what sort of risk can we take? And that's where that dynamic between patient safety, those sort of public health outcomes, and cyber security and technology I think really comes into play.
And then thirdly, it's really being able to prioritise what the mitigations are and how we can do that at scale without costing the organisation a huge amount of money. We have to do both, but I am really, really focused on getting secure by design in place so that we can actually fix forward, and hopefully buy ourselves the time then to do more of the fixing backwards. But it's going to be a long, hard process to properly understand the risk and then properly prioritise the mitigation.
Mike Fell, NHS England:
Yeah, I think you're absolutely right there, and, you know, secure by design is a great principle. Recently we've had 68 organisations in America sign up to the principles of that, expanding it beyond the public sector. And I think, given the nature of our work and the partnerships that we rely on with technology organisations, it's absolutely there. And I think you're right that it's almost, one of my colleagues uses the analogy of turning the tap off: once you turn the tap off, you can go and mop the stuff up, as well as see what else you need to do.
Charlie Sammut:
That's absolutely spot on, because that's what we're trying to do effectively: turn the tap off and mop up what's left over.
Mike Fell, NHS England:
Yeah, yeah, yeah. So it becomes two different tasks that you can actually get on top of.
I'm going to change direction a little bit here now. So an alien landing in and listening in to this might accuse us of having the easiest jobs in the world, because the Hippocratic Oath with the clinicians that we work with is very clear about doing no harm, and actually that's what we're aiming to do in cyber resilience and in cyber security.
The protection from the fear of harm is at the heart of these things.
But I'm always very conscious that we work in a world of Log4Shells, of bits and bytes, of zero days, and of our own little language. And then clinicians operate in a deeply complex world of medical language. And we've already touched upon this need to be business-centric in our approach, to understand the risks and choose the right ones. So I guess two questions out of that are: how do you go about talking to the clinicians and scientists about cyber risk? And is it working?
Charlie Sammut:
I think if an alien came down, they'd also reflect that sometimes we're two different tribes speaking two different languages, but on the same topic. So, risk-wise,
I think we're both exceptionally good at viewing risk, but we just come at it from different angles often.
And you know, my clinician and scientist colleagues are, like, genuinely incredible at public health risk and can understand it in a way that I would never, ever hope to.
And equally, I would hope they would accept that I'm reasonably good at cyber and technology risk, and the constant challenge, what I've been reflecting on, is: OK, how do we bridge those two worlds to make them mutually comprehensible?
I think for a long time in UKHSA we've suffered from, you know, I've got a really excellent team, but the scale of the challenge has meant that actually sometimes we're quite internally focused, and we haven't done enough across the business, particularly in terms of explaining: why is cyber important? Why does it affect people? Why is cyber actually a public health issue if we get it wrong?
And one of the ways I'm increasingly considering that is, you know, we always go back to your CIA triad, right, and we really care about things like confidentiality and availability. But actually, from a scientific perspective, what's really, really critical alongside those two is integrity, and actually having faith in the data sets that you're operating off. If your data set has been accessed by an outside actor, whether it's a criminal who's got in and just trashed a few data sets or, you know, trashed the server on the way out, it's compromised the integrity of that data, and it's compromised the heart of the scientific method. And that's one of the interesting conversations I've been having with my scientific and clinician colleagues: that public health relies on the integrity of data, relies on the integrity of the insight that the clinicians and scientists have.
And actually, if we don't do our cyber security well, that is something that is compromised and really compromisable. We're going to do more and more of this, because I think bridging that gap, bridging the way that we see risk and bringing those worlds together, is absolutely crucial. I don't think we've always got it right by any means. One of the regular frustrations that I know our scientist colleagues have with us is that, talking about labs, we will frequently need the software, particularly that running the lab machinery, to be assured: to check where it's going, where it's beaconing out to, you know, has it got malware, etcetera.
But that slows down bringing on replacement equipment, and of course, to the scientist, that's saying: you're actually costing a public health outcome here; I can tell you how many tests I've had to run by hand, or how many issues this has generated further down the public health chain. So it is, unfortunately, a lot of the time a balance, I think. And the only way around it, as I said, is we've got to make those two worlds mutually comprehensible.
We have to see doing cyber well as a public health outcome, and fundamentally a lot of it really comes down to the people: building the right relationships, being able to explain things properly, and listening, a lot of the time, to the other side, and being able to make sensible risk decisions off the back of that. Sometimes that does mean that actually we take a substantial cyber risk to enable a public health outcome.
And sometimes it's vice versa.
Mike Fell, NHS England:
I think you're absolutely right, and I reflect on, you know, two recognised professions: one which has been a profession for a very long time in the clinical space, versus what is quite an embryonic one in security. And I look at things like the manner in which, you know, doctors and consultants will speak to patients to ensure informed consent, and the manner in which that's done. And then I play that off against how we will sometimes talk about a critical vulnerability, in a way which is very different from the bedside manner a clinician might use with somebody. And I do think there's definitely things to take away there about the way in which we talk, using the war language and the technical language, when actually what we're saying is, as you say, either some data is not going to be available that they rely on, or some data's going to be changed, or variants of that. So interesting, right?
Charlie Sammut:
There's a whole problem anyway in there, that nobody wants to hear from security. You know, security brings with it a whole bunch of resonances and ramifications for people. And, you know, I think one of the things that my organisation struggles with is even just the discursive nature of moving from being Public Health England to the UK Health Security Agency, right, and the fact that security has been injected into the title.
The ramifications of shifting from being a public health organisation to a health security organisation. Now, in essence, it's exactly the same thing, but it introduces different dimensions, and so we always have to be careful, I think, in terms of understanding that coming from security just brings with it a whole load of baggage anyway, right? And this is where I actually think the people-to-people connections, and being enabling and being able to take sensible risks, matter.
And building that trust is actually the most essential part of our delivery in some ways, because if cyber is a public health outcome, it's a people-to-people issue, really.
Mike Fell, NHS England:
I often think back to the very old days of some of my academic training and to one definition of security: the freedom from fear. And actually, if you take that definition, which is in a more traditional rather than a cyber security kind of context, if you accept that definition of security as freeing people from fear,
Charlie Sammut:
Yeah.
Mike Fell, NHS England:
actually, that then leads you directly back to public trust. Trust is built upon not being scared, not being fearful, and so I think there's quite a compelling case that you make there. And I guess with trust, we've touched on innovation a little bit there.
Trust, typically, you know, you will trust things that you are used to; for me personally, I think it's human nature to trust the familiar. I think it's, in some ways, hard-wired in us as people.
And yet, we've also said we need innovation. We've said that, you know, technology is going to be the key to increasing productivity, and that we need diversity of thought to help with that innovation.
In the NHS we've got virtual wards, we've got people receiving care at home using technology. We're increasingly using video calls, apps, wearables and other medical devices. We saw through the pandemic really rapid innovation in the use of the NHS App and the COVID app, with the NHS App going from one million users to over 22 million.
And I know that you will probably have a similar scale of increasing digitalisation and innovation in your space. So can you tell me a bit about recent developments on the innovation side that you've got, particularly around pandemic preparation, and where the thinking is going about how technology is part of that preparedness, to ensure that the public do have greater trust in future responses?
Charlie Sammut:
Yeah, I think so much of this is, you know, we are a very data-centric organisation, and so much of this is about making that data accessible and impactful and, you know, relevant and useful to the public. So that's where I see some really cool stuff happening. Very visible was the work around the COVID dashboard, right, and the fact that people were able to access information in a really granular way and almost make decisions on their day-to-day life, about how they would respond to COVID, based on the availability and the impactfulness of that information. And that came from, you know, not cyber, but my area of data analytics and surveillance. And that was incredible to see. I think there'll be much more of that behind it in the future. We'll continue to make, or will want to make, data available and data understandable and impactful for the average citizen, and to be able to drive narratives around that as well, and actually meaningful outcomes for the country.
Underneath that are a couple of things. The first is the way that we analyse that data internally. I think we have got some incredible work going on around that: the development of incident management systems, you know, data analytics platforms, the ability basically to merge big, and actually some really small, data sets and properly understand what is happening in any given health outcome. And I think there's incredibly innovative work going on around that which is challenging to assure and secure. I mean, I can tell you, especially when you start looking at big platforms and merging a lot of whole-nation data sets, etcetera, you know, that's an issue.
How do you create that secure data environment? That's really crucial to us. So cyber is involved with all of this. But underpinning all of that, and I know I've spoken about them a lot, are our health security laboratories at places like Colindale. They are pretty incredible, actually; it was really a massive experience for me to go and see the kind of technology that they're using there, the surveillance data that it's generating, and then to have that map through those data analytics platforms and through into the dashboard.
And through all of it, that public health outcome is about, you know, this innovation in terms of how we create data, how we analyse data, and how we utilise and present data. I think they probably call it the data value chain or something, don't they? But cyber security is instrumental to each part of that as well. And it's not easy to secure and assure any part of that.
Particularly, I mean, innovation in and of itself poses challenges to cyber security, right? Because it's not just that you trust what you know; it's much easier to assure and secure what you know as well. And so to go beyond that, when things have been done in new, different ways, requires a different mindset, and often a different set of skills. And I think, for me, when I look at the people on my team, who I think are fantastic, that's part of their skill, and that's, I think, an extremely important skill for cyber in the future: being able to respond to innovations, being able to enable innovation, being able to find creative security solutions effectively and find creative ways to understand and to respond to risk as well.
Mike Fell, NHS England:
Yeah. I did feel that we were going to go a bit Donald Rumsfeld there with the unknown unknowns, but I think there is definitely something there about the pace of innovation, that ever-steepening hockey stick curve of the speed and agility of innovation, and, as security folk, our role in being an enabler by rapidly helping those that choose to deploy it to understand the risks and manage them.
The cynic might say, well, that's us being slopey-shouldered and saying, you know, it's your decision to make. But the hopefully compelling case that we can make is that actually the people who are best placed to identify the right innovation, and also to understand whether those risks are tolerable, like with the out-of-support legacy technologies, are the ones on the hook for doing so.
Charlie Sammut:
Yeah.
Mike Fell, NHS England:
I guess that opens us up. We talk about innovation; a lot of innovation is about risk management. A lot of it is about connecting things that were previously unconnected to give a bigger picture.
Charlie Sammut:
OK. Yeah.
Mike Fell, NHS England:
With that, just thinking about some of those risks, and to pick off a few of those that are relevant to innovation, but also to incidents that we see at the moment: what are you doing about the remote access aspects of risk that's embedded in a lot of innovative products?
Charlie Sammut:
It's a real challenge.
And I'll be, I mean, frankly, I'll be interested in your thoughts on this as well.
I think so much of it is about having good relationships with vendors: first of all, properly understanding products, properly developing assurance processes that enable us to understand the vendor and the product. And that is difficult, because it requires a lot of access, and it requires a lot of engagement from the vendor, but also from people on our side really understanding the products and what remote access really means.
That underpins it all. I think if we don't have knowledge of what the remote access actually is, we haven't done the investment in that. Risk management does rely on information, and, you know, we're never going to have perfect information, but we should be trying to create as much good information as we can. I think it then follows through to: OK, if we are going to assure it, how on earth do we monitor it and assure it through life as well?
And I think that's a real challenge for us, because these developments keep coming on, and they will continue to be a theme for us, as for every organisation, into the future. Understanding it, and being able to understand it not just at a point in time but through life, is a real challenge. We're developing ways around it at the minute, but I think that's going to be a challenge for the next 5 to 10 years.
Mike Fell, NHS England:
Yeah. And it's, I mean, it's a frustration. Certainly my perspective from the incidents that we see is the frustrating, repetitive nature of the foundational bits. No matter how foundational, there are some security basics, the ones some refer to as eating your vegetables, and it's about getting a message out about eating your greens.
For remote access, it's multi-factor authentication. And yes, you can wrap that up in all sorts of language like zero trust and continuous assurance, but I think it's that eating your greens bit, and, as you say, the whole life management of that involves going back and saying, well, actually, yeah.
Charlie Sammut:
There's a lot of just really basic things. Is remote access happening? Where is that person physically, as well? Because that also brings in all sorts of data protection issues that often aren't thought through. And, you know, one of the drives for innovation is cost cutting, right, and that frequently results in offshoring.
Mike Fell, NHS England:
Yeah.
Charlie Sammut:
And really just being able to understand if there is remote access happening.
Where is that remote to? Where's it actually coming from? Where is that person based, and is that something we can actually tolerate as a risk, stomach, and enable?
Mike Fell, NHS England:
Yeah. And again, from my perspective, innovation is part of the problem, but part of the solution here, in that we are seeing loads of conditional access type automated controls that can spot the geographically impossible travel and block it by default, that can spot the riskier connections, and the anomalous by the nature of the time, and those sorts of things. So I think innovation is part of the challenge, but part of the solution as well.
Charlie Sammut:
I completely agree. And, like I said, some of it is just part of eating the vegetables. It's just having a relationship with the vendor and with the products that are being used, trying your very best to understand what they actually are and what's happening with them, and having an open and trusted relationship with vendors and suppliers. Such that, yes, you've got really sort of funky conditional access stuff running, but at the heart of it you've also just got quite a trusted relationship, where people understand the conditionality of what they are and aren't supposed to be doing.
Mike Fell, NHS England:
Yeah. And that supply chain piece is really interesting for me. It's not one that's unique to us, and, for the record, the supply chain are definitely part of the solution, not the problem: any organisation in this space is facing the same challenges that we face as an organisation, and in the other duties that we discharge. I guess my reflection, having come into the health sector relatively recently, is that in some areas, for various reasons, it can be seen to be kind of behind the curve in the medtech space. Partly that's due to the life cycle of some of the technology, which has life cycles of 20 years, 30 years by design. And also, I think, to be frank, security through obscurity in some ways; because things haven't been hit, it's allowed
Charlie Sammut:
Yep.
Mike Fell, NHS England:
a different approach to risk from some of the solutions that we'd see in other sectors that are more directly financially targeted, like the gaming sector or the banking and finance sector.
Charlie Sammut:
Yep.
Mike Fell, NHS England:
They have different control sets, and I don't think it's due to the regulatory environment. I think it's the sensitivity, or the immediately financial nature, of some of those relationships.
Charlie Sammut:
The operational technology side of that I really do worry about, because, when I spoke about monopoly, I do worry sometimes that certain suppliers almost do exert a monopoly over OT, because they are the only ones who can provide that kind of technology, and therefore a lot of our security posture is dependent on how seriously they take it, and how seriously they take their supply chain security as well. And I absolutely agree with that security through obscurity point. The other thing I think we see quite a bit of is that we have some incredibly special, intelligent people working in UKHSA, and a lot of them have a huge amount of drive to deliver a public health outcome. Often that means that they will end up inadvertently creating a piece of shadow IT, for instance, because it's the best way to deliver for the public. It's not a malevolent thing. They've not tried to break any governance. They just know there's a piece of technology out there, they need to bring it in, they need to get it fixed, and they need to start doing this. And I think that was particularly true in the past, when the antecedents of our organisation were really looking at how to invest properly in tech, and I don't think they always did, and so that does drive shadow IT, and shadow OT as well. So we've had, I think, a lot of that.
And, going back to that point, it's about really understanding what we've got on the estate, really understanding how to mitigate it, and really understanding the supply chains that come off that as well, because that piece of technology isn't just a point in time. There's a history to it, and there's an ongoing relationship, ongoing maintenance, and ongoing network connections. Understanding all of that is absolutely critical, and it's a challenge.
Mike Fell, NHS England:
It is, and it reminds me of a conversation I had recently with an academic who also had responsibility as CISO for a university. I sometimes think that my job's kind of challenging with the scale and nature of it, but geez, the challenges for those that are protecting academic institutes, with the sensitivity of the information and the incredibly bright people, young people and others, who are going to get around the controls. And we had a conversation about whether our approach to shadow IT, viewing it as part of the problem, is right, and whether we actually need a different governance model, more similar to that which we're seeing with things like virtual wards, that just recognises that home routers are going to be part of the healthcare ecosystem. Is seeing shadow IT as inside the camp or outside the camp actually the right way of protecting it? You know, zero trust: show me an organisation that has zero trust. It doesn't exist. It's a philosophy, a way of approaching things.
But I do think that that kind of zero trust approach to IT governance might be one of the ways to help us with this, rather than viewing it in that more traditional way, where the IT is either 100% approved, governed and known, or not. Because increasingly we are going to be seeing this, and actually there are benefits to not directly governing it all in the same way.
Charlie Sammut:
I think you're absolutely spot on. When I was talking about the mitigation, what I do want is to know roughly what we've got, or what's been done where, right, so we can actually start to take sensible risks on it. Because there are going to be some elements of any shadow IT estate which are doing some potentially really sensitive things, and anything accessing really sensitive data sets does need a certain level of protection wrapped around it, and does need to be known and monitored. But I think you're absolutely right that in any kind of estate there's going to be a whole bunch of stuff that isn't.
And that we can afford to take risk on. And if we do take that risk, it produces a new paradigm for how we can deliver cyber security, and effectively IT, at scale as well.
And I think one of the things I'm interested in is, we're talking about mitigating risk at scale, but often that means you can think about reducing some risk substantially, or lowering the risk a small amount but across a very wide surface area.
And if you're making a very wide surface area slightly more resilient, because you know about it and you've put in place sensible measures, then, back to your point, is that in some ways as good from a governance perspective as making a substantial difference on a smaller part of the estate?
Mike Fell, NHS England:
Yeah, exactly. And not by design, but that brings you perfectly back to the cyber strategy for health and care, which talks about focusing on the greatest threats and harms.
Yes, we talk about defending as one, not defending once, and that can lead you down a kind of everything must be controlled route. But actually, focusing on the greatest threats and harms maybe does facilitate a more permissive environment.
Look, Charlie, you're getting away with this far too easily. This is like far too collaborative. We seem to be finding consensus on this. So I'm going to flip things around a little bit now.
Charlie Sammut:
Yeah, yeah.
Mike Fell, NHS England:
And take us back to some incident type stuff. So, a bit of almost quick fire on things. The just culture, you know, back to the airline analogy, where they have crew resource management: the concept that it doesn't matter how junior you are in the cockpit, if you see somebody else about to, you know, put the wheels down when they were meant to go up, or, hang on, we've gone beyond my level of knowledge of piloting now, but if they're about to do something that's dangerous, anybody can call it out without any judgement, because it's in everyone's better interest. And it'd be great if we talked about a just culture for cyber in the health sector.
Charlie Sammut:
Yep.
Mike Fell, NHS England:
You know, I'll take the opportunity of commending organisations that have done the right thing here: the British Library for one, the Irish HSE for a second. The British Library recently published an incident review of their pretty catastrophic ransomware incident, sharing the lessons. Obviously we've done a review against those, and they are, again, the frustrating foundational failings that have taken place there, the eating the greens bit. But there are some bits of their estate that they've just said they're never going to get back.
How does that make you feel? Because I think we can all be candid enough here in recognising those issues, and the reality that organisations sometimes just can't get stuff back after that. What do you take away from those kinds of things?
Charlie Sammut:
Well, I mean, I would, firstly, commend them, like you, because that report is an incredible piece of work, actually. The openness in it, and the ability to say: we had systems that we thought were pretty good and that we stress tested, and then actually it turned out, with hindsight, that we missed a whole load of stuff. There's a bunch of uncomfortable reading for me there, probably for you as well. The thing that really jumped out at me was the fact that their cloud estate just rolled on with no problems, and on-prem was basically destroyed.
That is uncomfortable reading when we start talking about legacy tech, because a lot of legacy tech is on-prem, and it's vital to the way that we operate. So that was one thing. Then some of the lessons learned around the criticality of network segmentation: how many of us have networks that we've just developed over years and years and years? When you talk about defending against the greatest threats, is our network actually designed to protect against the greatest threats?
That was uncomfortable reading. I think the other thing that was uncomfortable reading for me was the way that threat actor had operated: they came in and they deliberately destroyed. What I'd seen when I was working previously on incident management was that particular criminal actors had shifted towards smash and grab, basically get in, get some data, get off, and had moved away from the kind of attack that would destroy stuff or lock stuff down. And actually, that felt really uncomfortable to me, because of the idea of losing parts of our estate, and parts of our data that we never, ever get back, going back to that integrity conversation.
And that's really, really fundamental to the way that an organisation like ours, but any kind of healthcare organisation, operates. The idea that we might never get bits of our estate back if we are compromised like that is deeply uncomfortable reading. I think it's an amazing report.
If anyone listening hasn't read it, go and read it. I think it is an incredible piece of work, and I really do commend them for being that open. It's really uncomfortable reading. So those are the major things that dropped out. It's the same things that come up over and over again: it's simple vulnerabilities being exploited.
And if you then don't have the right kind of network design, that's when it starts to get really bad and you're compromised at scale. I think many of us have those kinds of foundational characteristics in our systems. So yeah, a brilliant report, really helpful, and it's sharpened our minds on the things that we're doing and that we need to do.
Mike Fell, NHS England:
Yeah, responsibility on this kind of thing is often something going through people's minds. Whose responsibility is it?
Charlie Sammut:
Ha ha.
There are layers of responsibility, aren't there, in any organisation. Ultimately this all sits with our senior leaders: to properly understand cyber risk and to put the right emphasis on the mitigations in place. And we're lucky that our CEO does talk about cyber being a public health issue. It's then largely my responsibility to try and deliver the changes that we need, and to influence the organisation to develop the kind of cyber security we need. But ultimately, and it turns a lot of people off because it is naff, genuinely, security is everyone's responsibility.
Phishing emails are a great example of this. It's for our organisational leadership to understand that phishing emails pose a substantial threat, right, and then for me to put in place the controls that we can to try and weed out as many as we can. But ultimately, there's also still a responsibility on each user: if they think something looks slightly off in that email, to not open it, or to report it; or, if they have opened it,
and they think that something bad has happened, to report it straight away. Their responsibility in that sense is lesser, because it's around one single thing, but it matters so much if it's not exercised. In that sense, my job's easy. It's clear: I've got to deliver some security for the organisation. It's when you get down to the individual's responsibility to take security seriously that it gets really tricky, because it's one responsibility among many. They've also got a responsibility to do their job, to get to work, to look after their families.
You know, I'm trying to buy a house at the minute, and to do all the admin around that, right, and security is one responsibility among many. So, going back to one of the points we've made, one of the things we are doing much more of, and will need to continue to do, is explain the why: why is it important? And it's not about scaring people. It's about making it a positive responsibility.
Mike Fell, NHS England:
Yeah.
And you've laid the ground really well there, actually, for a bit of an analogy. If you think about a kind of contaminated environment in a hospital setting, there are obviously cleaners who are on the hook for going and actually cleaning up that bit, and potentially additional, more specialist teams that might go in, in a high risk situation, and isolate it. But underpinning all of that is that preventing disease transmission is the responsibility of everybody: to wash their hands before seeing a patient, between patients. It's everybody's responsibility to call that out, and again, it's that point about using the clinical language effectively. Ironically, it always strikes me that one of the first words in cyber security is virus. It's a direct lift and shift, a nick, from a health setting, and yet we don't always make that connection. It's the cleaning, it's the washing your hands, isn't it? It's everybody's responsibility to do it, but there are other people that have different aspects to it.
Charlie Sammut:
Yep. And, weirdly, when I was in the Middle East, the Arabic for hackers is alqursan, which means pirates, and I thought the idea of pirates roaming around the network was much better. But no, I think the medical analogies run really, really strongly in this, because effectively you can see it from the way that COVID-19 operated, right.
Mike Fell, NHS England:
Yes.
Charlie Sammut:
When I look at what an attacker could do to some of our networks, and healthcare networks more widely, the ability to move through, because of increased connectivity, and compromise large sections of our critical national infrastructure really is exactly like a pandemic. And it requires the kind of individual responsibility that people showed day in, day out in response to COVID; that is also the foundational element of how we actually protect ourselves against those threats. It's just not easy. It's really hard. And it does, again, go back to the point: scaring people is useless. I think we really need to develop the why, and the positive, enhancing responsibility to look after security, because, like we were talking about earlier on, it maintains the integrity of data. Maintaining cyber security is a public health issue in and of itself: you can't deliver public health on compromised, untrusted systems.
Mike Fell, NHS England:
Could not agree more. So, Charlie, I'm going to start to wrap things up here now, and I'm going to ask a genie in a bottle type question.
Charlie Sammut:
Yes.
Mike Fell, NHS England:
So if you've got that kind of metaphorical genie in front of you, and you've rubbed the bottle, and they've come out and given you that one wish.
Charlie Sammut:
Yep.
Mike Fell, NHS England:
What would that wish be for cyber security in our space, in our sector, in 5 to 10 years' time? And you're not allowed to wish that we have funkier names for our adversaries, like pirates.
Charlie Sammut:
Yeah. And what about a massive wage rise for me? Is that off the list? What a fascinating question. What would I wish for?
You know, what I really want is for us to have actually bridged that gap in the mutual comprehensibility of risks, so that people who work in cyber on our side are kind of public health professionals as well. They understand the organisation that they're in really well; they understand the public health outcomes, what's happening across the organisation, and how their work plays into that. And on the flip side, our public health experts really understand how cyber security, and maintaining the security and integrity of data, is absolutely vital to delivering those public health outcomes. Which is quite a simple thing, really. We do work very closely together, and there is a lot of comprehensibility, but where I think it falls down is when those two worlds remain incomprehensible to each other. Bridging that gap, and doing it in a really sustainable and probably quite caring way, is really important. I think one of the things I've loved most about this job, and I think many people in my team feel the same, is that they do feel a genuine sense of mission. They feel a genuine attachment to UKHSA. A lot of that came out of the pandemic.
Because, you know, I think a lot of people working in technology and working in cyber, as I said previously, can feel quite removed from a mission. If you work in, for instance, the financial sector, you might sometimes feel your mission is about profit and generating outcomes for shareholders. So being really attached to a public health mission, where you're actually doing things that matter at scale,
for the population, delivering better, healthier people who are happy in their lives and their work, is incredibly important. But actually, I think sometimes we've got to go further and bridge that gap and make it really meaningful, not an abstract thing: something where they're directly engaged on programmes and projects from the start, going back to secure by design, so they really understand how security is driving that programme. But equally, the public health professionals who own it understand why cyber is so important as well, and deeply care about it alongside all the other things that go on. So I think that's a good start, but if, in five or ten years' time, we've got that balance really, really good, I'd be very happy with that genie.
Mike Fell, NHS England:
Yeah, I love that bridging point that you make there. When you look at how society is more healthy now than it was 50 years, 100 years ago, it is because of things like lead being taken out of paint, lead being taken out of petrol, which has had a huge public health benefit. And I think what we're talking about in secure by design is taking the lead out of the products and making them healthy by design. So I think there's so much territory there, so much ammunition to bridge that gap. I think we just need to work out how to do it.
Charlie Sammut:
Well, I agree. And just building on that point, there was a fascinating point made at a meeting I was at yesterday: we're quite good at quantifying how public ill health at scale impacts GDP, but, the speaker was saying, I'm not sure we've done the same for how good public health, increasing health outcomes, has a positive effect on GDP. That in turn could give me the ability to say, well, delivering cyber security actually enhances public health outcomes, and it led to this, and this is the knock-on effect it's had.
You know, that would be fantastic as well, because often one of the problems we struggle with in cyber security is that you're protecting against something that might never come.
And it's hard to track through what the effects are.
Mike Fell, NHS England:
This genie is a good one, but maybe that is even a step too far: to really, genuinely quantify the risk and present it in the way that we need.
Charlie Sammut:
Oh, you need a better genie, Mike. Honestly.
Mike Fell, NHS England:
We need the genie that can help us quantify the cyber risk in a way that people recognise. Look, Charlie, thanks so much for that. It's been a great conversation; I've really enjoyed it. I could keep going for a long time, there are many, many more things we could cover, but a huge thank you for your time, and for the amazing work that you are doing in a really vital part of our sector.
I'm going to wrap things up now just by saying a huge thank you to the audience for joining us for this edition of The Cyber Sessions. I hope you've enjoyed it, and please do look out for the drop of the next edition coming to you.
Last edited: 26 September 2024 10:38 am