Taking Security Seriously
By Wojciech Cichon, SETL Software Developer
7/30/2019
The idea for this article came to me a few days ago while I was looking for a solution to a problem on Stack Overflow. The solution I found had been accepted and had received a large number of upvotes, yet it contained a security vulnerability that has been known for over 20 years. As I dug deeper, I found several other accepted solutions with well-documented security flaws. How is it possible that so many answers with such serious flaws get approved? It occurred to me that when we search for solutions, security isn’t always a primary concern. Even though security is a critical part of development, it often gets overlooked in training, design, and implementation. Writing code with security as a primary goal from the start, rather than addressing it once the system is in production, leads to a far more resilient product.
There’s plenty of research investigating this issue, including the fascinating publication “How Reliable is the Crowdsourced Knowledge of Security Implementation?”, in which the authors analysed over 1,400 answers on Stack Overflow, the developer Q&A site I mentioned above. They found that the split between secure and insecure solutions is almost even: 55% secure and 45% insecure. With numbers like that, it’s no wonder that insecure suggestions remain such a pervasive problem in the community.
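The paper’s examples come from Java cryptography and TLS questions, but the pattern is easy to recognise in any language. As a purely illustrative sketch (not a snippet taken from the study), here is the kind of “fix” that circulates in Q&A threads, which silences certificate errors by disabling verification altogether, shown next to the secure default:

```python
import ssl
import urllib.request

# Anti-pattern often pasted from Q&A answers: make certificate errors "go away"
# by switching verification off, which leaves the connection wide open to
# man-in-the-middle attacks.
insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False
insecure_ctx.verify_mode = ssl.CERT_NONE

# Secure default: the context verifies the server's certificate and hostname.
secure_ctx = ssl.create_default_context()

with urllib.request.urlopen("https://example.com", context=secure_ctx) as response:
    print(response.status)
```

The insecure version usually appears because it is the quickest way to make an error message disappear, which is exactly why it keeps getting copied and upvoted.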
Another interesting study looks at how the expectations placed on developers affect the security of what they build. In “If you want, I can store the encrypted password. A Password-Storage Field Study with Freelance Developers,” researchers from the University of Bonn hired freelance developers to build a user registration component for a social media site. The results show that developers who were explicitly asked for a secure solution produced much more secure code than those to whom security was never mentioned. In fact, prompting developers about security appears to matter more than their level of experience, as the figure in the paper illustrates.
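To make the distinction concrete, here is a minimal sketch in Python of the kind of scheme the researchers scored as secure: a unique random salt per user and a deliberately slow key-derivation function, instead of the plaintext or fast unsalted hashes that many participants submitted. The function names and parameter values below are illustrative choices of mine, not taken from the study.

```python
import hashlib
import hmac
import os

# Illustrative parameters; nothing here is prescribed by the study itself.
PBKDF2_ITERATIONS = 310_000   # deliberately slow, to resist brute-force attacks
SALT_BYTES = 16

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, derived_key) for storage; the plaintext is never stored."""
    salt = os.urandom(SALT_BYTES)  # unique random salt per user
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, PBKDF2_ITERATIONS)
    return salt, key

def verify_password(password: str, salt: bytes, stored_key: bytes) -> bool:
    """Re-derive the key from the supplied password and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, PBKDF2_ITERATIONS)
    return hmac.compare_digest(candidate, stored_key)
```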
Developers without a security mindset produce less secure code, and that can have serious consequences for their organisation. A vulnerability in an application can lead to a data breach, which under GDPR can attract a fine of up to 20 million euros or 4% of annual global turnover, whichever is higher, not to mention the loss of reputation and the cost of fixing the error. Given that fixing a defect in production costs far more than catching it at an earlier stage of the lifecycle, it makes sense to spot and fix problems as early as possible, or better still, to avoid introducing them in the first place.
The concept of shift-left security, which applies security thinking across the whole project lifecycle, is growing in popularity. In “3 Need-to-Know Security Terms for 2017: DevOps, ‘Shifting Left’ and Ransomware,” Arden Rubens discusses how organisations are changing the way they secure code. Security, according to Rubens, is no longer an additional layer bolted on top of an application, or a problem to be dealt with once a breach occurs; it is designed into the system from day one. Designing and developing secure systems requires not only security awareness, but also an understanding of the core concepts of security. Developers need to be more proactive, and to do that they must be better trained in security.
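Rubens is writing about process rather than code, but one practical way to shift security left is to express security expectations as automated checks that run on every build, alongside the ordinary tests. Here is a minimal sketch, reusing the hypothetical hash_password and verify_password helpers from the earlier example and assuming they live in a module called password_storage:

```python
# test_password_storage.py - illustrative security checks run in CI on every commit.
from password_storage import hash_password, verify_password  # hypothetical module

def test_plaintext_is_never_stored():
    salt, key = hash_password("correct horse battery staple")
    assert b"correct horse battery staple" not in salt
    assert b"correct horse battery staple" not in key

def test_wrong_password_is_rejected():
    salt, key = hash_password("correct horse battery staple")
    assert not verify_password("wrong guess", salt, key)

def test_salts_are_unique_even_for_identical_passwords():
    salt_a, _ = hash_password("same password")
    salt_b, _ = hash_password("same password")
    assert salt_a != salt_b
```

Checks like these are cheap to write on day one and catch the kind of mistake the Bonn study observed long before the code reaches production.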
Part of the problem is that developers are, traditionally, very goal-oriented people, and they optimise for what can be measured. It’s relatively easy to test that code is functionally correct, and there are various well-established metrics for code quality, yet there is no equally simple, widely accepted metric for how secure a piece of code is. When security can’t be ticked off like a passing test, it easily slips down the list of priorities.
At SETL, we understand these issues and take security extremely seriously. The preservation of confidentiality, integrity, and availability of our systems is paramount. All members of our project teams are trained in Information Security, and we ensure that security is designed and built into our projects from the outset. By consciously evaluating security at every stage, we reduce the risk of a data breach for ourselves and our customers, and we provide solutions that can be relied on to protect both our reputation and theirs.
References
Naiakshina, A., Danilova, A., Gerlitz, E., von Zezschwitz, E., & Smith, M. (2019, May 9). “If you want, I can store the encrypted password.” A Password-Storage Field Study with Freelance Developers. Retrieved July 27, 2019, from https://net.cs.uni-bonn.de/fileadmin/user_upload/naiakshi/Naiakshina_Password_Study.pdf
Rubens, A. (2017, February 15). 3 Need-to-Know Security Terms for 2017: DevOps, “Shifting Left” and Ransomware. Retrieved July 27, 2019, from https://www.checkmarx.com/2017/02/15/3-need-know-security-terms-2017-devops-shifting-left-ransomware/
Chen, M., Fischer, F., Meng, N., Wang, X., & Grossklags, J. (2019, January 4). How Reliable is the Crowdsourced Knowledge of Security Implementation? Retrieved July 27, 2019, from https://arxiv.org/pdf/1901.01327.pdf