Daily coverage of Apple’s WWDC 2019 conference, by John Sundell.

A Swift by Sundell spin-off.

Developer interview: Anastasiia Voitova on WWDC expectations from a security and privacy perspective

Welcome to the fourth WWDC by Sundell developer interview, a mini-podcast and article series in which we hear from some of my friends from around the Apple developer community — about their thoughts, hopes and dreams for WWDC.

Today I’m talking to Anastasiia Voitova, who is a security product engineer at Cossack Labs, and a prominent security expert within the iOS developer community. Let’s find out what her WWDC expectations are from a security and privacy perspective.

Listen to the interview

You can also listen in your favorite podcast app, by adding the following RSS:

https://wwdcbysundell.com/podcast

Read the interview

John: Security and privacy have been really big topics during the last few years, and Apple has been taking a really strong stance on this in general — even going so far as to build whole marketing campaigns around those topics. Do you see this trend continuing at this year’s WWDC — do you think that there will be a strong emphasis on security and privacy?

Anastasiia: Yeah, I’m sure about that. Apple have been pushing these topics since the very beginning, saying things like “We don’t collect all of your data”, and so on. So privacy is very important to them. Apple have also had several security-related sessions at each WWDC, so I’m sure that the trend will continue — and I believe that we’ll see even more this year — because one of the latest trends among the large service providers, like Google for example, is to start demanding that companies that use their services go through a security assessment.

So if you’re a company that uses Google’s APIs, and gets data from Google’s users which you store or process on your own servers, then Google will push you to go through a security assessment with a third party company. The assessment isn’t done by Google — you need to hire security assessors, pay them, and Google expects that the price will be somewhere around $15,000–$75,000.

John: Wow!

Anastasiia: Yeah, and you as a company will need to pay that in order to get a report that you can then show to Google — otherwise, they probably won’t allow you to use their services. So this is a new thing — it started appearing at the beginning of this year, and the first companies have already started to receive emails demanding that they go through such an assessment during 2019. And it’s already May.

John: So do you think that Apple will follow Google in doing something like that? Like demand that companies go through some sort of security assessment in order to be on the App Store, use iCloud, or something like that?

Anastasiia: I think so, but not in the exact same way. Because Apple has different policies, and Apple doesn’t provide services that give third parties access to their users’ data. You can’t just grab any data from an Apple user.

John: No, but you could do things like ask for permission to access a user’s contacts — and then upload them to your own server. So there is a possibility that user data will end up on third party servers, so Apple could be in a similar position as Google.

Anastasiia: Yeah, especially when it comes to large applications — those apps that are known to do bad things in terms of privacy.

John: So a few years ago Apple started enforcing that all network calls from third party apps are made using HTTPS, when they introduced App Transport Security. But there are still a lot of other aspects of a more secure networking model that developers often miss — like SSL pinning, properly encoding keys, etc. Do you think that Apple could introduce something similar to App Transport Security to make these things easier and more common?

Anastasiia: I don’t think so, because with App Transport Security, there’s already a nice mechanism to use — HTTPS. But other things, like SSL pinning, are more sophisticated — you can’t just enable it if you don’t have support for it on your backend, because you need to actually pin the SSL certificate, which also means that you need to have access to it.

Many third party services don’t give you access to the certificate, or you can’t update the certificate, so you can’t just enable SSL pinning by default. Apple do try to provide you as a developer with better tools, so that you will make better apps — but at the same time, they try not to be too restrictive or “pushy”. They’ll give you tools, and tutorials — oh, and by the way — I think that the number of tutorials about these things could be better. I mostly see community-made tutorials from people who are working in the security field — about things like SSL pinning or key encryption — but I don’t see many up-to-date tutorials from Apple.
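
For readers who haven’t tried it, here is a minimal sketch of what the most basic form of SSL pinning can look like on iOS, using a URLSession delegate. The `pinnedCertificateData` value is a placeholder for a certificate you would bundle with your app; a production implementation would also evaluate the full trust chain and plan for certificate rotation, which is exactly the operational complexity Anastasiia describes.

```swift
import Foundation
import Security

// A minimal certificate pinning sketch. `pinnedCertificateData` is a
// placeholder for the server certificate (in DER form) bundled with the app.
final class PinnedSessionDelegate: NSObject, URLSessionDelegate {
    private let pinnedCertificateData: Data

    init(pinnedCertificateData: Data) {
        self.pinnedCertificateData = pinnedCertificateData
    }

    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
        // Only handle server trust challenges; defer everything else to the system.
        guard challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust,
              let trust = challenge.protectionSpace.serverTrust,
              let certificate = SecTrustGetCertificateAtIndex(trust, 0) else {
            completionHandler(.performDefaultHandling, nil)
            return
        }

        // Compare the leaf certificate that the server presented to the pinned one.
        // A real implementation would also evaluate the trust chain first.
        let serverCertificateData = SecCertificateCopyData(certificate) as Data
        if serverCertificateData == pinnedCertificateData {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            completionHandler(.cancelAuthenticationChallenge, nil)
        }
    }
}
```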

John: Yeah, because these things can be a bit daunting — it sometimes feels like security is a world of its own, and in order to get a good grasp of the techniques we mentioned, you have to learn so many different things.

Anastasiia: Yeah, that’s right — and what makes it even more complex is that each application is unique. If you get some tips and tricks for — for example — a note taking app, things most likely won’t work the same if your application is instead a secure messaging app, and vice versa. So you need to learn a lot of tricks, and you need to understand which tricks suit which use cases. That’s what makes security so complicated.

John: So speaking of different techniques, one thing that we hear more and more about these days — even in mainstream media — is end-to-end encryption. There’s a lot of services that promote that they are end-to-end encrypted, and how that’s much better for users, etc. But I think that there’s still a lot of confusion around how to actually implement end-to-end encryption in an app — so what do you think, when should an iOS developer really start thinking about this?

Anastasiia: First of all — end-to-end encryption, as a security pattern, is not for every app or use case. Because end-to-end encryption relies on two parties exchanging something, for example messages, where each party can only decrypt the messages that are meant for them. It won’t work if you have many different parties — if you have an exchange between many people or users, you can’t do it end-to-end. OK, technically you can, but it’s very sophisticated. One new idea is “end-to-end encrypted data collaboration”, for when there are a lot of users that collaborate on the same piece of data — and it’s a really sophisticated model from a cryptography point of view, so it won’t suit most iOS applications.

So when it comes to end-to-end encryption, you need to first understand if your app has a good enough use case for it — do you have these two parties that exchange some secret data? And when I’m talking about “two parties”, it can be the same user, but on multiple devices — for example if your iOS app shares some data with your macOS app — that sort of data sharing can be end-to-end encrypted. For example, if you use Apple’s CloudKit and encrypt messages before syncing them, then you’ll get a simple end-to-end encryption scheme.
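
As a rough illustration of that last idea, here is a sketch of encrypting a payload before saving it to the user’s private CloudKit database. The `encrypt(_:with:)` function and the "Note" record type are placeholders; the point is simply that CloudKit only ever sees ciphertext, so the data can only be read on devices that hold the key.

```swift
import CloudKit
import Foundation

// Placeholder: encrypt the payload with a key that only the user's own
// devices hold, using whatever cipher or library you prefer. Because the
// plaintext never leaves the device, the sync becomes end-to-end encrypted.
func encrypt(_ plaintext: Data, with key: Data) throws -> Data {
    // ...your actual encryption implementation goes here...
    fatalError("Replace with a real cipher")
}

func saveEncryptedNote(_ text: String, key: Data, completion: @escaping (Error?) -> Void) {
    do {
        let ciphertext = try encrypt(Data(text.utf8), with: key)

        // Store only the encrypted bytes in the record.
        let record = CKRecord(recordType: "Note")
        record["payload"] = ciphertext as NSData

        // The private database is tied to the current user's iCloud account.
        CKContainer.default().privateCloudDatabase.save(record) { _, error in
            completion(error)
        }
    } catch {
        completion(error)
    }
}
```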

John: Yeah, I think it’s a really good tip to look into CloudKit — especially if you’re only in the Apple ecosystem, because it’ll give you so many security features for free. Especially if you use CloudKit’s private database, which is associated with a single user’s iCloud account.

Anastasiia: Yeah.

John: So, security and encryption APIs tend to be really low-level — they really require you to dive in and learn a lot of different things, like we mentioned. Just look at the built-in Keychain API that Apple provides — it’s very low-level, based on pointers, and things like that. I know that you’ve been working on a library called Themis, to make encryption more accessible to people — do you think that this is something that Apple should do as well, to make it easier to use the built-in security features — and do you think that this is something that they might introduce at this year’s WWDC?

Anastasiia: There’s actually a huge problem here, because all of these cryptographic tools — like the Keychain API, CommonCrypto, and the Security framework — they’re tools for doing a lot of things. So they’re like toolboxes, and since they’re toolboxes, they’re very low-level. You can have a high-level toolbox if you pre-define some of the steps and use cases — for example, in Themis, we have full crypto systems. So we kind of make decisions on behalf of developers, about what kind of cipher to use, what kind of key to generate, and so on. So it’s secure by default — we built these security decisions into the framework.

That makes our API more high-level, but at the same time, our API is limited. You have these full crypto systems, and you can’t do anything except use them. You can combine them, but you can’t build super sophisticated use cases. So it’s always a problem: if you want to cover sophisticated use cases, you need to provide something really low-level — and that’s what Apple does. If you want to provide higher-level use cases, you need to make a lot of decisions on behalf of developers, and I think Apple can’t do that.
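
To make the “toolbox” point concrete, this is roughly what storing a single password looks like when using the raw Keychain API directly. The service and account strings are just example values; the shape of the API, an untyped dictionary of CF constants plus a numeric status code, is what makes it feel so low-level.

```swift
import Foundation
import Security

// Storing one password with the raw Keychain API: build an untyped
// dictionary keyed by CF constants, hand it to SecItemAdd, and then
// interpret a numeric OSStatus result.
let password = Data("correct horse battery staple".utf8)

let query: [String: Any] = [
    kSecClass as String: kSecClassGenericPassword,
    kSecAttrService as String: "com.example.MyApp",
    kSecAttrAccount as String: "user@example.com",
    kSecValueData as String: password
]

let status = SecItemAdd(query as CFDictionary, nil)

if status != errSecSuccess {
    // Every failure is just a status code that you have to look up yourself.
    print("Keychain error: \(status)")
}
```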

John: Right, so what you’re saying is that Apple wants to cover many different use cases, and enable people to build their own security abstractions on top of what they provide — so that people like you can build libraries that are more opinionated, and more limited, but easier to use?

Anastasiia: Yeah, that’s the problem of security usability — if you as a developer know nothing about cryptography, then tools like the Security framework are really complicated to use. Another thing is that Apple still doesn’t provide public APIs for some ciphers, for example AES-GCM, which is kind of an industry standard for authenticated symmetric encryption. Most applications use these ciphers, but if you stick with Apple’s native tools, there’s no way to use them — since those APIs are private. I hope that during this year, or maybe next year, Apple will update their security APIs — and remove old ciphers and hash functions, like MD5 for example. If you don’t know why using MD5 is bad, then you’ll use it, right?

John: Exactly.

Anastasiia: So there are still old APIs, and some new ciphers are still missing from the public API, so I hope that Apple will push this forward. But at the same time, certification and assessment is a problem — Apple needs to be sure that the things that they add to the public API work fine, have been audited, and so on.

John: Yeah, that’s always a problem when they’re moving something from being only internal — which lets them make more assumptions about it — to making it a proper, public API.

Big thanks to Anastasiia for her many insights into the world of security and cryptography, and what Apple might be up to next within this field. I really recommend following her and her security work on Twitter @vixentael.

Make sure to check back tomorrow for the last developer interview before WWDC starts — and, like always, if you have any feedback — just find me on Twitter @johnsundell.

Thanks for reading/listening! 🚀