In my years of working as an application security (appsec) penetration tester I’ve come to the conclusion that there is much more value to be added than pure technical vulnerabilities. To deliver the most value you have to be willing and able to go the extra mile. Before getting into what can be done to increase that value, let’s dig into the two most common types of vulnerabilities.
Technical Vulnerabilities
Technical vulnerabilities are the most common ones we see. This is where the application is abused to do something it shouldn’t, for example by injecting code or exploiting weak cryptography. Even though the vulnerability is technical, it is important for the reporter to describe how it will impact the business. Otherwise the receiving organisation might not understand the issues well enough to prioritise them and handle them accordingly. Even though a code injection can be used to pivot to other machines, the main impact for the business can often be tied to the confidentiality, integrity and availability of the application. As a tester it can be hard to accept, but a DOM-based XSS might be an accepted risk if the only impact is defacing the site by pasting code into the search box.
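To make that last example concrete, here is a minimal sketch of a DOM-based XSS sink of the kind described above. The page structure and function name are hypothetical; the point is simply that attacker-controlled input is written into the page as HTML.

```typescript
// Hypothetical search page script: the "q" parameter is rendered straight into the DOM.
// This is DOM-based XSS, since the payload never needs to touch the server.
function renderSearchTerm(): void {
  const params = new URLSearchParams(window.location.search);
  const query = params.get("q") ?? "";

  // Vulnerable: attacker-controlled input is written as HTML,
  // e.g. ?q=<img src=x onerror=alert(document.domain)>
  document.getElementById("search-result-title")!.innerHTML = `Results for: ${query}`;

  // Safer: treat the input as text rather than markup.
  // document.getElementById("search-result-title")!.textContent = `Results for: ${query}`;
}
```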
Logical Vulnerabilities
Logical vulnerabilities are more closely connected to business risks. Instead of abusing the technical capabilities of the application, the attacker abuses its logic. The classic example is ordering -1 books from a webshop: will that subtract the price from the total? Since these logical vulnerabilities are coupled to the business logic, it is often easier to explain them to the business and therefore get them fixed.
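As a sketch of the -1 books example (the names and structure are my own, not taken from any real shop), the flaw is simply that the total is computed from a quantity the client controls, with no lower bound:

```typescript
interface OrderLine {
  unitPrice: number; // price in cents
  quantity: number;  // comes straight from the client request
}

// Vulnerable: a quantity of -1 subtracts the unit price from the total,
// letting an attacker offset the cost of other items in the basket.
function orderTotal(lines: OrderLine[]): number {
  return lines.reduce((sum, line) => sum + line.unitPrice * line.quantity, 0);
}

// A minimal fix is to reject non-positive quantities before pricing:
function validatedTotal(lines: OrderLine[]): number {
  if (lines.some((line) => !Number.isInteger(line.quantity) || line.quantity < 1)) {
    throw new Error("Invalid quantity");
  }
  return orderTotal(lines);
}
```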
There are exceptions, however. I would like to split logical vulnerabilities into three categories:
- Abusing unintended behaviour
- Abusing intended behaviour
- Risks introduced by behaviour
The webshop example above is abuse of unintended behaviour. Abusing intended behaviour, however, is harder to pinpoint, and even harder to explain to the business. This is when a feature is used exactly as intended, but has an unintended consequence. An example would be a forgot-password function that sends the password to the user’s email. The feature works as intended, but it is still a security problem, or even multiple problems:
- Sending the password by email is a low-risk vulnerability in itself, since email is an unsafe way to send sensitive information [1].
- The application being able to send the password back to the user means that the password is stored either in clear text or with reversible encryption. This increases the risk that, if the application gets hacked, the passwords will be leaked, and due to password reuse users might be affected on other sites as well.
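To illustrate the second point, here is a rough sketch, using Node’s built-in crypto module (the function names and storage are my own), of the difference between a design that can email the password back and one that cannot:

```typescript
import { randomBytes, scryptSync } from "crypto";

// A design that can "email you your password" has to keep it recoverable:
// clear text or reversible encryption, so a database breach exposes every password.
const recoverablePasswords = new Map<string, string>(); // username -> plaintext password

// A design that cannot email the password back stores only a salted one-way hash.
function hashPassword(password: string): { salt: string; hash: string } {
  const salt = randomBytes(16).toString("hex");
  const hash = scryptSync(password, salt, 64).toString("hex");
  return { salt, hash };
}

// The forgot-password flow then emails a short-lived, single-use reset token
// instead of the password itself.
function createResetToken(): string {
  return randomBytes(32).toString("hex");
}
```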
The third category, risks introduced by behaviour, is even trickier. This is functionality that works as designed but introduces a risk simply by existing. One example is the ability to send a download link via text message to a validated phone number. Spamming oneself might not be a huge risk, but if each text message costs the sending company 1 cent, sending enough texts will have a financial impact. I would also argue that user privacy falls into this category, since it can affect the company’s public relations as well as add value for the customers.
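To put a number on that: at 1 cent per message, an attacker scripting 100,000 sends costs the company 1,000 dollars. A minimal sketch of the kind of throttling that caps that exposure, with an in-memory counter and hypothetical names, could look like this:

```typescript
// Hypothetical throttle for the "text me a download link" feature.
// Capping how many texts a single phone number can trigger per hour
// also caps the cost an abuser can inflict.
const sendsPerNumber = new Map<string, { count: number; windowStart: number }>();

const MAX_TEXTS_PER_HOUR = 3;
const HOUR_MS = 60 * 60 * 1000;

function allowSms(phoneNumber: string, now: number = Date.now()): boolean {
  const entry = sendsPerNumber.get(phoneNumber);
  if (!entry || now - entry.windowStart > HOUR_MS) {
    sendsPerNumber.set(phoneNumber, { count: 1, windowStart: now });
    return true;
  }
  if (entry.count >= MAX_TEXTS_PER_HOUR) {
    return false; // over budget for this hour
  }
  entry.count += 1;
  return true;
}
```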
The Extra Mile
So with this knowledge, how can we as testers go the extra mile to increase the value for our customers? I would argue that in addition to the usual findings that can clearly be exploited, it is our duty to inform customers about their more subtle risks. By understanding their business as well as their application it is possible to find and report risks introduced by their behaviour. We should think outside the CIA (Confidentiality, Integrity, Availability) triad and consider other risks. With our expertise we are well placed to reason about privacy and other business impacts. It requires a bit more work, but in my experience it is often worth it. The tester’s experience with security, privacy, and thinking outside the box will often lead to findings that give the client aha moments, even if they are not traditional security risks.