I've written before about how important responsible disclosure is for security researchers. That responsibility falls on both sides of the conversation. It certainly falls on the researcher: when they find a security vulnerability, they should work with the company to disclose it properly and to make sure it's fixed properly, for free and without extortion. I think most professional security researchers are on the same page; while we may debate whether it's ever prudent to publicly disclose an issue, most of us will try responsible disclosure first.
The other side of this coin is you, dear software vendor. Creating a stress-free mechanism for disclosing vulnerabilities to you is critical to finding yourself on Bugtraq less frequently. Security researchers are giving you their time, effort, and expertise for free; that's time, effort, and expertise you would otherwise pay thousands of dollars for. Sure, they may not bundle an issue up in a nice, perfectly formatted problem report, but it is absolutely worth your effort to listen and remediate the issue as quickly as possible.
One of the great things we're seeing lately is the rise of bug bounties and disclosure programs. A disclosure program gives a security researcher a way to report a security issue they have found in your product. Good disclosure programs have: Respect, Optional Anonymity, Legal Impunity, Security, Responsiveness, and Openness.
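One concrete, low-effort way to advertise such a channel is a `security.txt` file (RFC 9116) served from your site's `/.well-known/` path. A minimal sketch, with placeholder contact details you would replace with your own:

```text
# Served at https://example.com/.well-known/security.txt (RFC 9116)
# Contact and Expires are required; the rest are optional but useful.
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59Z
Encryption: https://example.com/pgp-key.txt
Policy: https://example.com/security-policy
Preferred-Languages: en
```

The `Encryption` line points researchers at a public key so sensitive reports can be sent securely, and `Policy` links to your disclosure terms so researchers know what to expect before they reach out.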
### Respect

Respect is very important. In many cases these are professional researchers who have found your product or service interesting or critical enough to want to examine its security. They may use tools that are costly to build or purchase. They may spend their free time, weekends, or professional development time and budget on your product.
When a researcher comes to you with a security vulnerability, you should give them your full attention.
### Anonymity (at the request of the researcher)
It may be important or desirable for some researchers to disclose their vulnerability anonymously. They may have stumbled across a security issue in a slightly less than legal way, but that makes the vulnerability no less important to you.
At the other end of the spectrum, researchers may want their name to appear in disclosure notes, bug fixes, or other messaging. These researchers may be independent contractors, and the credit may be a great marketing opportunity for them.
### Legal Impunity

The absolute worst response to somebody giving you free work and vulnerability reports is to attempt to sue them. This is a surefire way to lose respect within the security community and to ensure no other researcher ever tells you about a vulnerability again. (Note: they won't stop looking for or finding issues; they'll just stop telling you about them.)
### Security

Security is important because of the sensitivity of the data being transferred to the vendor. It's important that the discovered security issue isn't intercepted by a malicious third party and used against end users or customers.
### Responsiveness

Security researchers want to know that you have received the issue, that you understand the risk, and that you have taken or will take steps to mitigate it. This is often the “payment” they're looking for. The goal is to improve the security of the software, so knowing how the issue is being handled matters.
### Openness

Sunlight is the best disinfectant.
No software is perfectly secure; we balance the risk against the software's utility. Without this security information it's impossible to understand the risk and make an informed decision about the software. Many users (and security professionals perhaps most of all) automatically assume the worst, especially in today's climate of weekly massive data breaches. It's important, therefore, to meet these concerns head on: help your customers understand the vulnerability, how it happened, what you learned from it, and how you'll make sure it never happens again.
One of my deep interests is incentives and motivation. As a manager of many security engineers, and a security engineer myself, I love to think about what drives people to excel, build their skills, and do research. I've found that while money isn't a primary driver, it can show that the company's incentives are aligned with the things each person is excited about.
For example, at Security Innovation we have a research program that allows each engineer to take up to 10% of their time, plus a hefty research budget, to research anything (security related) they'd like. This gets met with “I get paid to hack on Google Glass or connected hardware locks? Awesome!” While the engineer would likely have been doing this research on their own, getting paid for it shows that the company values the research and the work.
Bug bounties are similar. Some security researchers rely on bug bounties to make a living, but many see them as a welcome bonus for research they would be doing anyway. They also realize that if a company is progressive enough to create a bug bounty program, it is likely to follow the principles of a high-quality disclosure program like the one described above. That signals the company takes security seriously and welcomes feedback that improves the security of its product.
If you are a software vendor, I hope you'll start a security disclosure program at your company. It's a great way to get security feedback on your product and to know that people care enough to provide it. Creating a bug bounty program shows that you take this seriously and have a process for responding to security researchers.
Posted By: Joe Basirico