DRAFT UNDER GPC REVIEW UNTIL 7/8/2011

APPROVAL OF MINUTES TO FOLLOW THEREAFTER

GPC June 2011 Working Session Minutes


OWASP Global Projects Committee

Day 1 - Monday, 6-Jun-2011

Present: Jason Li, Chris Schmidt, Justin Searle, and Sarah Baso (Keith Turpin arrived on Sunday but was not feeling well, so he did not attend on Monday).

Since the Global Projects Committee had discussed the working session agenda[1], including objectives and deliverables, prior to arriving in Dublin, the group was able to hit the ground running on the first day. Our plan was to (1) compile a complete list of projects, (2) inventory the projects, (3) discuss and draft the phases of the new lifecycle, (4) categorize the projects, (5) reevaluate the lifecycle model, and (6) adjust the project categorization as needed.

1) Jason identified a number of different project “lists” on the OWASP wiki.

2) We cross-checked the lists and came up with one master list. Then, using a “divide and conquer” approach, we took inventory of each project, noting the following details: project name, type, OWASP mailman mailing list (if one exists), wiki page (if one exists), leader(s) (if any), description (if available), and miscellaneous notes.

3) As a starting point for the discussion, we brainstormed and defined 6 general “buckets” into which to sort the existing OWASP projects. These were our initial thoughts (note that this chart does not reflect the final decisions/definitions of the new lifecycle):

Incubator

Project idea, just out of the gate, no release, but must have a defined deliverable and single point of contact to serve as a project leader

Labs

A functional project that has produced a stable release as defined by the project leader.

Flagship*

OWASP sponsored

If the single point of contact ceases to exist, do we have enough knowledge about the project for someone else to take the reins?

Would you recommend/use?

Maintained/stable/fully functional

Enterprise Edition

A single release version of a project that breaks away from the project lifecycle (the project can go back into development for a new version)

Inactive**

Inactive Incubator Projects - no activity in last year and no stable release

Inactive**

Inactive Labs Projects - at least one stable release, but no activity in last year

*This category was initially called “Mainstream” but was changed once we started talking about how accurately or inaccurately that name described the projects placed in the category, or their purpose (more on that below).

** During our initial categorization phase, in order to keep the projects in the appropriate buckets, we put inactive Incubator projects in the “graveyard” category and inactive Labs projects in the “archive” category.

Additional notes:

Possible projects for OWASP Flagship:

AntiSamy
ASVS
Code Review Guide
Codes of Conduct
Common Numbering Scheme
CSRFGuard
Development Guide
Enterprise Security API
Live CD
ModSecurity Core Rule Set
OpenSAMM
Secure Coding Practices Quick Reference Guide
Testing Guide
Top 10
WebGoat
ZAP

Part of the Enterprise Edition thought exercise was to identify the ideal qualities of a pinnacle “Enterprise” project. A subset of these qualities was subsequently used as criteria for Flagship/Labs projects.

4) Having defined the 6 different buckets/categories for projects, and looking at the basic structure we had put together for the new project lifecycle, we started sorting the projects. This sorting process also gave us a means to find out where possible ambiguity or problems existed in our new structure.

Day 2 - Tuesday, 7-Jun-2011

Present: Jason Li, Chris Schmidt, Keith Turpin, Sarah Baso, and Justin Searle (after completing delivery of his training course at AppSec EU)

4) The fourth step of our process in developing a new project lifecycle (sorting the projects into our new phases/categories) continued through the first part of the day.

5) The first thing we decided to discuss further after the initial categorization was the “graduation” criteria between phases. What are the requirements to move from Incubator to Labs? What are the requirements to move from Labs to Flagship? We realized that as soon as we rolled out the new lifecycle and categorization, project leaders/teams would want to know how to get to the next level, so that process needed to be well defined.

After discussing how the likely distribution of projects would look on a bell curve, we determined that as the criteria currently stood (Image A), the threshold for a project to move from Incubator to Labs was quite low (one stable release). This low threshold would effectively mean that most projects would be classified as “Labs” and would water down the standing that an “OWASP Labs Project” would have in the industry.

Image A

We determined there were two possible solutions that would divide the projects into buckets closer in size to what we had in mind. First, raise the bar (the requirements) for graduating from Incubator to Labs (Image B). This would leave a majority of the projects in the Incubator category. Without firm requirements, however, that bar could gradually slide to the left as the desire/pressure for projects to move from Incubator to Labs grew.

Image B

Second, in order to break up the Incubator category, we discussed adding another category to the beginning of the project lifecycle (Image C). Projects would enter the lifecycle as slightly more than an idea (they would still have to meet the basic entry criteria defined above) and then graduate into the Incubator phase.

Image C

The remainder of the discussion on the second day involved brainstorming a list of all the possible means and metrics for assessing projects that we could think of. We then tried to categorize them as requiring either “automated” or “manual” review. Our idea was to capture as many “automated” review metrics as possible and make them available through a project “dashboard” to project leaders, reviewers, or even people outside the OWASP community considering the use of one of our tools.
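For illustration only, a minimal sketch of how such a dashboard might separate automated from manual metrics; the metric names and the automated/manual split below are assumptions, not decisions recorded in these minutes (the activity signals mirror those cited later: mailing list replies, code commits, wiki changes):

```python
# Hypothetical sketch of a project dashboard's metric inventory.
# Metric names and the automated/manual split are illustrative assumptions.
AUTOMATED_METRICS = [
    "days_since_last_code_commit",
    "mailing_list_replies_last_90_days",
    "wiki_edits_last_90_days",
    "release_count",
]

MANUAL_METRICS = [
    "meets_quality_expectations",  # requires a human reviewer's judgment
    "furthers_owasp_mission",
    "reviewer_sees_value",
]

def dashboard_summary(project_name, automated_values):
    """Render the automated metrics a reviewer would see before a manual review."""
    lines = [f"Dashboard for {project_name}:"]
    for metric in AUTOMATED_METRICS:
        lines.append(f"  {metric}: {automated_values.get(metric, 'n/a')}")
    return "\n".join(lines)

print(dashboard_summary("ExampleProject", {"release_count": 2}))
```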

At the end of the day Tuesday, we also had a GPC-wide conference call, which included all parties participating in the working session in Dublin (Jason, Chris, Keith, Justin, and Sarah) as well as Brad and Paulo, who dialed in. This provided an opportunity to fill Brad and Paulo in on all that had been discussed and preliminarily decided up to this point (over the course of the first two days), as well as to get their input. Both were encouraged by the progress and positive about the changes. Paulo specifically had questions regarding how the new lifecycle mapped to the current model of Alpha Status, Beta Status, and Stable Quality.

Day 3 - Wednesday, 8-Jun-2011

Present: Jason Li, Chris Schmidt, Keith Turpin, Sarah Baso, and Justin Searle

The final day of our working session involved taking all the different assessment metrics and ideas generated the previous afternoon and putting them into a tangible, cohesive review system that worked with the lifecycle framework.

In an effort to avoid an unduly burdensome process with too many stages of manual project review, the graduation from New to Incubator would require a stable release (as defined by the project leader) and involve filling out a simple form.

Continuing with the idea of a 4-phase lifecycle (Image C), we set out to define what we would require of project leaders aspiring to graduate from Incubator to Labs. This review (in contrast to the review to move from New to Incubator) would need to be more automated and would check for a “quality” project. After discussing the general criteria of what we thought made up a “quality” project, we tried to distill them into a set of 5 simple questions that project reviewers could use to determine whether a project was ready for the Labs phase:

1. Is the project actively maintained?

2. Does the project meet quality expectations?

3. Does the project use the OWASP project infrastructure/standards?

4. Does the project further the OWASP mission?

5. Do you see value in this project?

We decided that these questions were general enough to apply to any type of project (documentation, tool, code library, or media) while still providing guidance to the reviewer. Additionally, while the reviewer would only need to answer “yes” to these 5 questions for the project to move into Labs, we could provide sub-questions under each main question to give feedback to the project leader (and, if the reviewer so decided, more guidance in answering the general question). These questions seemed simple enough for a reviewer to complete the review process in a relatively short amount of time, provided that they had the information necessary to answer them.

As noted with previous OWASP project reviews, much of the time spent in the review process can simply be in reading materials or gathering information about the project itself. This information must be obtained before a reviewer can even begin the actual assessment. Our goal was to make the process as simple and straightforward as possible for the reviewer(s). We all agreed that the best way to facilitate the review, or “cater” to the reviewer, was to gather as much of that information as possible before the review process. So, the reviewer(s) would not only have access to the “automated” review metrics on the project dashboard (discussed previously), but we would also require that project leaders, when they determined they were ready for evaluation, submit a “promotion proposal” outlining their project, the process they had gone through, and why they thought they met each of the 5 criteria/questions stated above. The project leader could then present his or her project to the reviewers (either via a recorded medium or Skype/WebEx), and the reviewers could follow up with questions if necessary. We also discussed the possibility of listing projects that are “up for promotion” on the main project webpage, both to show that promotion is not something to be taken lightly and to congratulate project leaders for getting to that point.

While we hoped that the process of preparing for review and submitting a promotion proposal would weed out those project leaders who were not actually ready for review or not serious about it, we felt there should be some “stopper” in the system to prevent a project leader from submitting for re-review immediately after failing. That is, we wanted to build in the notion that when a project fails review, it is not because of a “quick fix” that the project leader/team missed. So, we decided there needed to be a buffer or restriction not only on how frequently a project leader can apply for his/her project to be re-reviewed, but also on the maximum number of reviews that can be granted for that project within a certain time period.
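As a sketch of what such a restriction could look like (the 90-day cooldown and the cap of two reviews per rolling year are placeholder values, not figures the GPC actually set):

```python
from datetime import date, timedelta

# Hypothetical re-review throttle; the 90-day cooldown and the cap of
# two reviews per rolling year are placeholder values, not GPC decisions.
COOLDOWN = timedelta(days=90)
MAX_REVIEWS_PER_YEAR = 2

def may_request_review(past_review_dates, today):
    """Return True if a project may request another promotion review."""
    if past_review_dates and today - max(past_review_dates) < COOLDOWN:
        return False  # too soon after the most recent (failed) review
    within_year = [d for d in past_review_dates if today - d < timedelta(days=365)]
    return len(within_year) < MAX_REVIEWS_PER_YEAR

print(may_request_review([date(2011, 5, 1)], date(2011, 6, 8)))  # False: inside cooldown
```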

Finally, we needed to discuss who would perform the review between Incubator and Labs. We discussed whether just anyone could do the review (as we had determined was acceptable for the earlier review from New to Incubator) or whether the reviewers needed to be OWASP contributors or leaders. What about reviewers from industry? Chris explained the system used by the Java Community Process (JCP) program and noted that we could do something similar.

We decided that the best way to come up with reviewers who at least cleared some (low) bar was to have a review pool that anyone could opt into (apply for). This review pool could be composed of individuals participating in OWASP as well as other people from the appsec community. Ideally, we would tie participation in this review pool to our corporate members/supporters: when they contribute (directly or indirectly), we would invite them to be involved in our review pool. Regardless of the composition of the reviewers (OWASP or not), a project would need 5 reviews that passed muster under our 5-question standard to move into the Labs phase. While the GPC would not need to be involved in every review, it would set up a periodic audit to provide a check on the system.
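The promotion rule itself is mechanical: 5 reviews, each answering “yes” to all 5 questions. A minimal sketch (the question identifiers below are paraphrases of the questions above):

```python
# Sketch of the Incubator-to-Labs promotion rule described above:
# 5 reviews, each answering "yes" to all 5 questions.
QUESTIONS = [
    "actively_maintained",
    "meets_quality_expectations",
    "uses_owasp_infrastructure",
    "furthers_owasp_mission",
    "reviewer_sees_value",
]
REQUIRED_PASSING_REVIEWS = 5

def review_passes(answers):
    """A single review passes only if every question is answered 'yes' (True)."""
    return all(answers.get(q, False) for q in QUESTIONS)

def promote_to_labs(reviews):
    """The project graduates to Labs once it has 5 passing reviews."""
    return sum(review_passes(r) for r in reviews) >= REQUIRED_PASSING_REVIEWS

all_yes = {q: True for q in QUESTIONS}
print(promote_to_labs([all_yes] * 5))  # True
```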

Another concern with the review process was the need for a “failsafe” for projects that applied for graduation from Incubator to Labs but were not able to find reviewers. We decided to set up a system for a paid professional reviewer. There would also be an option for a paid project review open to any project leader, but it would have to be paid for out of the project budget, except in cases of last resort where no volunteer review is found within a reasonable time. In the latter case, the review would be paid for by the GPC.

Having worked out some of the review/graduation kinks, we started to re-examine the model of having 3 vs. 4 phases in the new lifecycle (i.e., Image B vs. Image C). Although there had been concern at the beginning of the conversation about the “bar” between Incubator and Labs and whether it was acceptable to have a large Incubator category (hence adding a 4th stage), we realized an extra stage might add unnecessary confusion to the model. So, after some healthy debate about the merits of a 4-phase model vs. a 3-phase model, we decided to stay with the 3-phase model (Image B).

The graduation criteria that had been defined for the New-to-Incubator move could still be “recommended” for projects as they progressed through the Incubator phase toward Labs, but they would not be required. Instead, they would just be one of the metrics available on the project dashboard and taken into consideration in the 5-question review process for graduation from Incubator to Labs.

So the final model would look like this:

Incubator

Project idea, just out of the gate, no release, but must have a defined deliverable and single point of contact to serve as a project leader.

Inactivity: If there is no stable release, there is no inactive tag; the project is simply deleted after no activity in 1 year. If there is a stable release, the project is tagged as inactive if there has been no activity (including but not limited to mailing list replies, code commits, and wiki changes) within the past year.

Labs

A project that has gained community support as well as shown that it:

1. Is actively maintained,

2. Meets quality expectations,

3. Uses the OWASP project infrastructure/standards,

4. Furthers the OWASP mission,

5. Provides value.

Inactivity: Tagged as inactive if no activity (including but not limited to mailing list replies, code commits, and wiki changes) within the past year.

Flagship

OWASP projects selected for their maturity, established quality, and strategic value to OWASP and application security as a whole. These projects have also demonstrated usefulness and set themselves above other similar OWASP tools in their security space.

Inactivity: No inactive projects in this phase; an inactive Flagship project will drop back to Labs (as an “inactive” Labs project).

Enterprise Edition

A single release version of a project that breaks away from the project lifecycle (the project can go back into development for a new version).

Inactivity: By definition, not inactive.
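For illustration, the Incubator inactivity policy above could be checked mechanically; in this sketch, “last activity” stands in for the most recent mailing list reply, code commit, or wiki change, and the 365-day window is taken from the one-year rule above:

```python
from datetime import date, timedelta

ONE_YEAR = timedelta(days=365)

def classify_incubator(last_activity, has_stable_release, today):
    """Apply the Incubator inactivity policy sketched above (illustrative only)."""
    if today - last_activity <= ONE_YEAR:
        return "active"
    # No activity within the past year:
    return "tagged inactive" if has_stable_release else "deleted"

# A release-less project with no activity since early 2010 is simply deleted.
print(classify_incubator(date(2010, 1, 1), has_stable_release=False, today=date(2011, 6, 8)))
```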

5) Instead of re-sorting the projects to align with our modified lifecycle (raising the bar for entry into Labs), we decided that we could leave them provisionally where they had originally been sorted. If a project was placed in the Labs phase based on having at least one functional and stable release (as well as some demonstrated activity on the mailing list, code commits, etc. within the last year), then we would give it a certain amount of time to apply for and initiate the new 5-question review process in order to stay in Labs, hence the “provisional” categorization. Otherwise, if the project decides not to go through the new review process or simply does not take initiative in that direction, it will drop down to the Incubator phase.

Additionally, we recognized that there will need to be a rebuttal period for project leaders who do not agree with their project's initial classification. However, based on the criteria we used for sorting as well as the plan of action going forward, there will likely not be much room for debate over placement.

The only room for debate may be over the projects selected strategically by the GPC for the Flagship category, but this is likely something that can be appealed to the GPC on a case-by-case basis.

At the end of this part of the working session, we noted the following action items for the GPC:

PROJECT HOSTING

Though we were running a couple of hours over in our working session, we discussed the responses the GPC received in its project hosting RFP process. Three proposals were received:

1. OWASP internal hosting (self-hosted) - Larry Casey and Chris Schmidt would work as paid contractors to set up and maintain a project hosting system on OWASP infrastructure: setup cost of $9,920 plus $19,318.80 annually.

2. CollabNet - addressed only about one-fourth of the requirements stated in the RFP and provided limited language support (a concern given our organization's focus on growing internationally): setup cost of $18,000 plus $22,700 annually.

3. SourceForge - extensive language support, a localized site, and individual responses to each of the requirements listed in the RFP. This was also the cheapest proposal received: setup cost of $9,000 plus $15,000 annually.
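Taking the figures above, a quick first-year comparison (setup plus one year of operation) bears out the cost ranking:

```python
# First-year cost (setup + one year) for each proposal, using the figures above.
proposals = {
    "OWASP internal": (9_920.00, 19_318.80),
    "CollabNet": (18_000.00, 22_700.00),
    "SourceForge": (9_000.00, 15_000.00),
}

for name, (setup, annual) in sorted(proposals.items(), key=lambda kv: sum(kv[1])):
    print(f"{name}: ${setup + annual:,.2f} in year one, then ${annual:,.2f}/yr")
# SourceForge comes out cheapest at $24,000.00 in year one.
```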

After a short discussion, and acknowledgment by all present that they had had sufficient time to review all the proposals, there was a committee vote: Chris, Justin, Keith, and Jason all voted in favor of going forward with SourceForge. (Although Brad and Larry were not present, there were enough committee members to constitute a quorum and a majority vote without them. Also, Brad and Larry had received all three of the proposals, and no specific comments or objections were raised.)



[1] GPC Meeting Minutes (May 13, 2011)