Final Survey Design

The design of our survey, which asks experts what they see as the most critical security-related misconceptions held by novices, was one of the most important aspects of this project. One possible approach would have been to collect misconceptions from the literature and ask experts to rank them, but we felt it was important to avoid influencing what experts thought. As a result, we had to be careful not to ask leading questions or include examples of misconceptions that might sway the results.

On the other hand, we felt that a single question, such as “What are common computer security mistakes you have observed?”, would be too vague and might elicit only a fraction of what each participant could tell us. For example, an expert might think of a misconception they encounter daily, while misconceptions in other areas of security might not come to mind at all.

Another issue we faced is that it is often easy for people to identify mistakes they see, but difficult for them to generalize from a particular mistake to a broader misconception or pattern. We didn’t want experts to be put off by that challenge, and since we knew we would have to aggregate feedback from experts anyway, we felt it would be valuable to encourage people to simply list common mistakes if that is what came to mind, expecting that we would do our best to generalize from those mistakes to broader misconceptions.

After considering the aforementioned issues, we decided to ask about six fundamental areas in computer security as a way to “jog” the imagination of our experts. These areas are: “Network Security”, “Application Security”, “Data Security and Encryption”, “Physical Computer Security”, “Information Privacy”, and “Access Control”. We added an “Other” area as a catch-all in case experts saw common mistakes that they felt didn’t fall under any of the categories. The initial six areas were taken from a Wikipedia list of computer security categories; we asked several security experts whether any major categories were missing, but they did not identify any additional areas.

For each category, we ask two open-ended questions, expecting long-form text answers. The first question is:

Can you think of any security misconceptions or mistakes you have observed in $AREA?

… and the second is:

Why do you believe these misconceptions or mistakes occur?
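
To make the structure concrete, here is a minimal sketch in Python of how the six areas plus “Other” pair with the two questions, with $AREA replaced by each area’s name. The code and its names are ours and purely illustrative; the actual survey was authored in a survey tool, not generated from code.

```python
# Illustrative sketch of the survey structure: seven areas, each with
# the same two open-ended questions. Names here are hypothetical.

AREAS = [
    "Network Security",
    "Application Security",
    "Data Security and Encryption",
    "Physical Computer Security",
    "Information Privacy",
    "Access Control",
    "Other",
]

QUESTION_TEMPLATES = [
    "Can you think of any security misconceptions or mistakes you have observed in {area}?",
    "Why do you believe these misconceptions or mistakes occur?",
]

def survey_questions():
    """Yield (area, question) pairs in the order they appear on the survey."""
    for area in AREAS:
        for template in QUESTION_TEMPLATES:
            # str.format ignores the unused {area} key in the second template.
            yield area, template.format(area=area)

for area, question in survey_questions():
    print(f"[{area}] {question}")
```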

Using open-ended questions was risky, because some people find such wide-open prompts difficult to answer. However, we felt it was critical to “prime” the experts as little as possible, so that the raw results we obtained reflected their opinions as directly as possible.

One way we tried to mitigate the difficulty of answering open-ended questions, while also respecting our experts’ time (and potentially their privacy), was to make every question on the survey optional. If experts did not have a misconception in mind for a particular area, they could simply leave it blank. We’ll talk about this more when we discuss the results, but we did not plan to use the different “categories” of security to differentiate the misconceptions; all results would be coded together. This underscored the categories’ purpose as “memory jogs” and allowed experts to contribute as much or as little as they wanted, which we hoped would increase participation.

As a final note, we included a short consent form at the beginning of the survey that explained to participants how their data would be used, how long it would be retained, and so on.

In a follow-up post, we will describe our use of a social science coding process to analyze the results from the first question and identify the most commonly mentioned misconceptions. The second question on the survey (“Why do you believe these misconceptions or mistakes occur?”) exists so that we can compare our results to our experts’ intuitions about the root causes of the misconceptions, and so that we can use their answers as we flesh out our misconception list, test, and interventions.

Finally, there are some optional demographic questions. We ask for education level, split into nine categories: less than high school; some high school; high school diploma or GED; some undergraduate education; undergraduate diploma (2 or 4 year); some graduate school; master’s degree (or equivalent); PhD (or equivalent); and “other”. We also ask “What degrees or certifications do you have (if any)?” and “What was/were your major(s)/minor(s) (if any)?” The next question is “In what sector do you work?”, with five options (Education, Industry, Government, Defense, and Other). Then we ask “What is your primary job responsibility?”, which offers five options (educating others, security research, applied security, consulting, and other). We also ask “How many years of information security experience do you have?”, with six categories (1-5, 6-10, 11-15, 16-20, 21-25, 25+). Lastly, we ask for an email address if the participant is willing to receive follow-up communication about the study (“Please enter your email address if you’re willing to receive follow-up communication about this survey”).
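
For reference, the demographic section can be summarized as the following schema. This is a hypothetical Python representation for illustration only; the field names are ours, and the survey itself simply presented these as multiple-choice and free-text questions.

```python
# Illustrative schema for the optional demographic questions.
# Field names are ours, not the survey tool's.

DEMOGRAPHICS = {
    "education_level": [
        "Less than high school",
        "Some high school",
        "High school diploma or GED",
        "Some undergraduate education",
        "Undergraduate diploma (2 or 4 year)",
        "Some graduate school",
        "Master's degree (or equivalent)",
        "PhD (or equivalent)",
        "Other",
    ],
    "degrees_certifications": "free text",
    "majors_minors": "free text",
    "sector": ["Education", "Industry", "Government", "Defense", "Other"],
    "primary_job_responsibility": [
        "Educating others",
        "Security research",
        "Applied security",
        "Consulting",
        "Other",
    ],
    "years_experience": ["1-5", "6-10", "11-15", "16-20", "21-25", "25+"],
    "email": "free text (optional, for follow-up communication)",
}
```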

In our next post, we'll talk a bit about the count and demographics of our respondents.
