2021 saw Prof. Peter Ryan, head of the Applied Security and Information Assurance Group (APSIA) at SnT, chair the European Joint Conferences on Theory and Practice of Software (ETAPS). Prof. Ryan has been at SnT since 2009, and his research interests – cryptography and security – are peripheral to the core themes of ETAPS: the theory and practice of software. Yet, just as security forms a vital part of any good software system, the tools being developed by the ETAPS community are highly relevant to information security. In this interview, we explored Prof. Ryan’s research interests in greater depth, and asked what drew him to the field of security research.
“Actually, doing computer-security research can be a lot like playing chess. You need to take into account that your opponent is also planning a few steps ahead and that they also have a strategy.”
Peter Ryan, SnT
The biggest, most important difference is the presence of an adversary. In computer security, you need to consider what a malicious, motivated human might try to do to get around your security measures. In general software engineering, the end user may well be unskilled – but they aren’t malicious, and errors typically occur at random. That’s a really important distinction. Both disciplines are human-centred, but security is even more so. Actually, doing computer-security research can be a lot like playing chess. You need to take into account that your opponent is also planning a few steps ahead and that they also have a strategy. I think that’s actually what drew me to security research in the first place. I was an avid chess player as a child, and to this day I really enjoy dynamic, adversarial puzzles like chess and Go. Computer-security research is just another arena for this special type of mind sport.
“Security, if anything, gets in people’s way, and it gets in the way of the tasks they want to do. We need to convince them that security measures are worth it.”
Peter Ryan, SnT
Yes, and that trend is only increasing. Nowadays, I am thinking not only about the adversaries but also about the honest human end users of the systems I develop. In the past, when cryptography was first used, only very special information needed to be protected, and everyone who used it was highly motivated. End users accepted working with complicated systems as long as they were effective. But with the internet and digital technologies, the amount and types of information now vulnerable to attack have massively expanded. We’re asking regular consumers to be good stewards of their own digital information, but there’s a limit to the mental space people have available to them; as the amount they need to protect grows, many people struggle to maintain the discipline needed to stay secure. Security, if anything, gets in people’s way, and it gets in the way of the tasks they want to do. We need to convince them that security measures – for which the upside isn’t obvious, because the best-case scenario with security is that nothing bad happens – are worth it. So when I design a secure system now, I think not only about the adversary but also about the end user. The challenge is to beat that adversary without alienating the person I’m ultimately trying to protect.
Secure voting is a really good example of this. I got into secure voting because the human element is really crucial – more so than in most other areas. When designing a secure voting system, we need to prepare for an adversary by ensuring the outcome of the election is not only correct, but demonstrably correct. We have to deal with an especially powerful adversary who may interact with the users (the voters), giving them instructions and demanding that they reveal secrets such as their credentials. Beyond that, we also need to ensure voters can easily use our system. We need to design something that embodies the democracy it enables. A voting system that is so secure it becomes byzantine, to the point of being unusable, isn’t democratic anymore. Neither is a system that isn’t private, since a lack of privacy opens up the possibility of vote buying or coercion. I’ve developed a number of secure voting systems over the years, and it is exactly this complex human element that keeps drawing me back to them. It is the ultimate game of chess.