Week 9 Reflection

This week’s articles are both case studies examining how readily users click through malware warnings. Article one is “Alice in Warningland: A Large-Scale Field Study of Browser Security Warning Effectiveness” by Devdatta Akhawe et al., and article two is “Your Attention Please” by Cristian Bravo-Lillo et al. Article one focused on the “clickthrough” rate of each web browser, e.g. Google Chrome or Mozilla Firefox, since each browser warns its users differently. Article two used that as a starting point, with the pop-up warning, and designed experiments to see whether different types of pop-ups would be more effective.

Article one’s goal for the case study was to achieve “a 0% clickthrough rate all SSL warnings: users should heed all valid warnings, and the browser should minimize the number of false positives” (Akhawe, 259). However, as an outsider to computer security and a lay user when it comes to computers, I do not think you can ever have a 0% clickthrough rate, because each person using the internet has their own set of biases and will either ignore the pop-up warning or abide by it. People are not machines; we make our own decisions on the spot, while computers make the decisions they are programmed to make.

Article two’s approach to clickthroughs is, I think, more approachable, at least on the user’s end. Their experiments presented users with different prompts to work through. Because each web browser sets up its prompts differently, each user tends to see different prompts. Should all browsers unify their warnings to make things easier on the user, so that people do not simply pass over a prompt because they are irritated by all the different types?
