SLCs are responsible for conducting an annual survey of all students within their ERC regarding center Strengths, Weaknesses, Opportunities, and Threats. The information gathered in this SWOT analysis is intended to help SLCs identify center strengths to be capitalized upon, identify areas needing improvement, and plan for future opportunities and threats. The SWOT analysis also aids the NSF in assessing student life and the students' views of the center.
The SWOT analysis is carried out once per year, and the results and planned follow-up are discussed with the NSF in a closed meeting among the NSF, the SLC, and, in some centers, the entire ERC student body.
8.9.1 Generating Questions
Most SLCs generate questions and issues to include on the SWOT survey through brainstorming sessions and discussions. This can be done at meetings of either the SLC leadership or the general student body, over email, or through social media and online forums such as Reddit. Some centers review the previous year's results, or even simply revise the old survey, but this approach may fail to capture new issues affecting students in the center. The recommended best practice is to gather student input before drafting the SWOT survey so that new questions and emerging issues can be incorporated. To maximize survey participation and completion, the questions should not be too numerous: depending on the section, a maximum of 15-20 questions is appropriate, and similar questions should be merged.
Since the goal of the SWOT survey is to get a clear picture of the condition of the student body, questions on student demographics should be included. Demographics include information such as a student's university, core or associated status, thrust/section/testbed affiliation (the term varies by ERC), and years in their graduate/undergraduate career. This information is necessary for analysis of the results, which is discussed in Section 8.9.3 below.
Responses should follow a Likert scale (1 to 5, with 1 indicating strong disagreement and 5 indicating strong agreement), and may also include an option reflecting the student's awareness of an issue (such as "not aware"). Questions should be phrased so that the meaning of agreement or disagreement is unambiguous; it is best to have one or two members of the SLC who played a lesser role in developing the survey complete it first, to check how the questions will be interpreted. Short-answer responses may also be gathered; if so, comments should be solicited at the end of each of the four sections of the SWOT survey so that students do not forget their comments about that section.
8.9.2 Conducting the Survey
Different centers conduct the SWOT survey through a wide variety of methods: SurveyMonkey, paper-and-pencil, email, and verbal responses have all been used. Some centers have an initial discussion and then use follow-up emails or surveys to generate responses from a greater percentage of the ERC student body. Response rates vary from 20% to almost 100%. There is little correlation between the response rate and the center's size, age, or strength. In fact, some of the largest centers are able to achieve the highest participation rates. Centers with a large number of partner institutions, however, seem to have lower response rates for SWOT surveys.
Successful methods for gaining a high response rate have included incentives (such as a drawing for prizes), timing the survey to coincide with an event that draws large student participation, or creating a paper-and-pencil survey that is physically given to all students. With all current ERCs consisting of two or more partner institutions, web-based surveys typically have very low student participation unless the SLC assigns a champion for each university who contacts the ERC students on that campus.
8.9.3 Analyzing Results
Proper analysis of the SWOT survey results can be a powerful tool for getting a snapshot of the condition of students across an ERC. At least a week should be set aside for analysis, as responses should be measured not only in aggregate but also by demographic. For example, a weakness with a 60% overall agreement rate may draw 50 percentage points of that agreement from one demographic and only 10 from another, pointing to issues such as a lack of engagement among new students or among students at a particular university. Since drilling down on questions can be time-consuming, priority should be given to survey responses that merit action items.
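A minimal sketch of this kind of breakdown is shown below. The demographic labels, scores, and the convention that 4 or 5 counts as agreement are illustrative assumptions, not prescribed by any center; "not aware" responses are recorded as None and excluded from the denominator.

```python
from collections import defaultdict

# Hypothetical records: (demographic, Likert score 1-5); None marks "not aware".
# Group labels and scores below are illustrative only.
responses = [
    ("first-year", 5), ("first-year", 4), ("first-year", 4),
    ("first-year", 5), ("first-year", 2),
    ("senior", 2), ("senior", 1), ("senior", 4),
    ("senior", 2), ("senior", None),
]

def agreement_rate(scores):
    """Fraction of aware respondents answering 4 or 5 (agree/strongly agree)."""
    aware = [s for s in scores if s is not None]
    if not aware:
        return 0.0
    return sum(1 for s in aware if s >= 4) / len(aware)

# Aggregate rate across all students.
overall = agreement_rate([score for _, score in responses])

# The same question broken down by demographic.
by_group = defaultdict(list)
for group, score in responses:
    by_group[group].append(score)
rates = {group: agreement_rate(scores) for group, scores in by_group.items()}

print(f"overall: {overall:.0%}")
for group, rate in sorted(rates.items()):
    print(f"{group}: {rate:.0%}")
```

A moderate overall rate can hide a sharp split between groups, which is exactly the signal that points the SLC toward a demographic-specific action item.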
SLCs from mature centers may encounter recurring problems that have not been resolved despite numerous efforts by the SLC. In these cases, studying the demographics of the responses can often point to the source of the issue (such as new students being unaware of resources); beyond this, it may also help to append additional questions to the survey or gather more detailed student feedback. For example, one center found that its inability to resolve a complaint about internships stemmed mainly from newer students being unaware that the center provided internships.
Proper analysis of the SWOT survey can also help generate better questions in future years.
8.9.4 Forming an Action Plan
Surveying students merely to present results to administrators or NSF evaluators is not enough. The SLC should be responsible for following up on the results of the SWOT analysis. Some SLCs assign specific individuals to areas of concern. Others make recommendations to center administrators about how problems might be resolved or student life improved. For instance, when computers, notebooks, and tablets were found to be inefficiently distributed at one center, action was taken the following year to allocate resources appropriately. In another case, when the judging procedure for site visit poster contests was not working well, it was later revised. The Site Visit Team looks not only for action plans for the current SWOT survey, but also for evidence of success with action plans from previous surveys.