Microsoft had its security Waterloo 6-8 years ago, and they have dramatically improved and changed their development practices since then. I once read that they have one whole building of engineers dedicated to security (sorry, I can't find the link anymore).
Reports involving security breaches of Apple products often show Apple as a company that feels wronged and would rather try to kill the messenger than fix the problem. To their credit, they appear to be looking for fixes that cause the least harm to their users. But to me they appear to be operating on very limited manpower/budget for this, so they have to prioritize.
An example: Apple didn't fix a flaw in iTunes for Windows for 3 years because it wasn't a priority, and intelligence software companies sold software based on this hole. Imagine if Microsoft had done this.
2002 was the start of Microsoft's major security rethink, prompted in part by exploits like the Code Red worm. For a short period it wasn't a building of engineers, but the whole engineering staff that stopped development and focused on security code reviews and bug fixes[1]. One of the outcomes was the Security Development Lifecycle[2].
Attempting to revert the PC revolution back to centrally controlled display terminals is "pretty fucking behind". Microsoft attempted to do this a decade ago and was rightfully called out for it, but it seems most of the Apple community is happy not to own the devices they rely on. Apparently Macs really are not meant to be Personal Computers.
Although what I said could be applied there as well, I'm pretty sure I didn't even attempt to reference it. Sorry, arbitrary dichotomies are only fun for ultimately inconsequential things (professional sports, for instance).
Mac OS hasn't reached the point where one needs to jump through hoops to install arbitrary programs, but given that the beginnings of such a system are present in Lion, and that it's the status quo on all of Apple's more recent devices (and one of their main selling points), one can only conclude that's the direction they will be taking consumer-oriented Mac OS in.
> You also didn't answer the part about trusted code.
Because it wasn't there when I replied.
Centrally signed code repositories on their own aren't the worst thing; I rely on them myself (apt-get). The problem arises when it's fixed to one possible repository. If the proverbial "average user" cannot install and administer a friend-approved but Apple-unapproved app with nearly the same ease as an Apple-approved one, we end up with a situation where Apple directly controls what average users are capable of. It also causes users who wish to own their device to hope for new exploits to be publicly discovered, which is utterly backwards. I understand that progress on actual security (isolation, capabilities, proper deputies, etc.) takes significant work, but coarse-grained whitelists aren't the answer.
I agree with your first point, but I disagree that it makes Mac OS into a 'display terminal'. Particularly since, although hoops are being introduced for good reason, there is no evidence that Mac OS X will be closed in the way iOS is.
Your second point, as I've indicated, is just conjecture. I don't see good evidence for it actually happening. Certainly nothing that Apple has publicly disclosed suggests that it will. It seems pretty unlikely to me.
But yes, if they did go that far, they would indeed control what average users were capable of.
If it ever gets to that point, I'd hope that by that time, there would be some obvious killer apps for another OS to demonstrate why it was a problem.
[edit: Coarse-grained whitelists might not be the answer, but I highly doubt that Apple is going to stop there. Every OS release is a step along the way. It's worth noting that iOS has generally developed in the direction of providing more capabilities to programmers over time, rather than fewer.]
I've actually thought about this a bit, and it's not a "killer app" that will be missing from locked-down systems, as the repository can always add anything that becomes popular elsewhere (after a little delay from porting/approval/etc).
The difference starts at the foundation, and manifests in a pervasive lack of respect for the user (whose ultimate control and understanding should be a prime usability concern).
For instance, that whole device-id brouhaha - iOS apps really get a unique device ID, which they are then supposed to partially obscure according to Apple's guidelines? Why in the world is an app allowed to directly query a fixed identifier in the first place?! There should be a specific ID API that the user controls via a system dialog, the same way a user controls how long a browser stores cookies. Sandboxing and auditing then make sure apps aren't using something like the Ethernet address to get around the user's choice.
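To make that concrete, here's a rough sketch of what such a user-mediated identifier API could look like. To be clear, this is entirely hypothetical: none of these names (IdentifierLifetime, IdentifierService, requestIdentifier) exist in any Apple SDK. The point is only that the app has to ask, the user's per-app policy (set in a system dialog, like a browser's cookie settings) decides what it gets back, and the answer is never a fixed hardware identifier:

    import Foundation

    // Hypothetical sketch only: apps never see a fixed hardware ID.
    // They ask the system for a scoped identifier, and the user's
    // per-app policy decides how long that identifier lives.
    enum IdentifierLifetime {
        case denied      // user said no: the app gets nothing stable
        case session     // new random ID every launch
        case untilReset  // stable until the user hits "reset"
    }

    // Stand-in for settings the user edits in a system dialog, not the app.
    struct IdentifierPolicyStore {
        var policies: [String: IdentifierLifetime] = [:]

        func lifetime(forApp bundleID: String) -> IdentifierLifetime {
            return policies[bundleID] ?? .session  // privacy-preserving default
        }
    }

    struct IdentifierService {
        var store = IdentifierPolicyStore()
        var persistent: [String: UUID] = [:]

        // The only call an app gets: a per-app, user-resettable token.
        mutating func requestIdentifier(forApp bundleID: String) -> UUID? {
            switch store.lifetime(forApp: bundleID) {
            case .denied:
                return nil
            case .session:
                return UUID()  // useless for tracking across launches
            case .untilReset:
                if let existing = persistent[bundleID] { return existing }
                let fresh = UUID()
                persistent[bundleID] = fresh
                return fresh
            }
        }
    }

    // The ad SDK inside "com.example.game" sees only what the user's policy allows.
    var service = IdentifierService()
    service.store.policies["com.example.game"] = .untilReset
    print(service.requestIdentifier(forApp: "com.example.game")?.uuidString ?? "no identifier")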
But unfortunately, most of the developers who actually know enough to analyze this are on the take from the ad companies, and think that their stake in the user's device is equivalent to, or even overrides, the owner's! So Apple kowtows to the advertisers and permits uncontrollable tracking, while end-users are stuck with their only choice being to 'use or not use' an app based on how much they perceive it abusing them. Instead of being introduced to a world full of self-determination and limitless possibilities (as early computer adopters were), modern-day users are shown a standard no-free-lunch world where "they either get you coming or going". Developers are still able to seek out freedom, but the goal of empowering an end user to solve their own problems couldn't be farther from sight.
(And yes, Android has most of these same problems in addition to some of its own, which is why I said dichotomies aren't useful.)
> Instead of being introduced to a world full of self-determination and limitless possibilities (as early computer adopters were), modern-day users are shown a standard no-free-lunch world where "they either get you coming or going". Developers are still able to seek out freedom, but the goal of empowering an end user to solve their own problems couldn't be farther from sight.
I couldn't agree more with this. However, that dream seemed to die with the breakup of Alan Kay's original group. Nobody is even approaching this problem except perhaps Kay's own FONC group, and even that seems to be more academic than practical now.
That said, I think that as digital culture matures and more generations grow up with digital creation, programmability will become the primary constraint, and then we might see progress in this area. If Apple doesn't keep up (although I expect they will), this is the domain I expect the killer app to emerge from.
I'm not sure why you bring up the device-id thing. Apple corrected that issue without external pressure. Also, in the real world, I think that expecting end-users to manage a second cookie-like entity with subtly different semantics to cookies is unrealistic.
It's easy to point to Alan Kay's research and then wistfully say that it will hopefully bear fruit one day, but that actually does a disservice to anybody working on today's systems that treat the user as a mature, self-actualizing owner - whether they be creating new software or making existing software more user-friendly.
A system that's built on a philosophy of eliminating capabilities can never progress into a system that allows a user to gradually learn more and empower themselves, as there's nothing "further down" that unifies the whole thing. Software that starts off requiring significant effort to administer can progress into having a user-friendly interface and be incorporated into systems with sensible defaults.
One shouldn't require a user to have to configure everything out of the gate (say, cookie policy), but one shouldn't prevent them from doing things they know they want. Wasn't the Apple device-id thing "fixed" by only allowing tracking on a per-app basis? With cookies, I can have them deleted every time I close the page.
> It's easy to point to Alan Kay's research and then wistfully say that it will hopefully bear fruit one day, but that actually does a disservice to anybody working on today's systems that treat the user as a mature, self-actualizing owner - whether they be creating new software or making existing software more user-friendly.
Setting the straw men aside, which systems did you have in mind?
Every mainstream operating system has unique IDs readily available to applications - it's just that native applications don't traditionally include ads.
Especially the fact that Apple's OS security has been solid all through the past two decades, with very few exploits, while Microsoft's has been wide open during that period, and Microsoft deliberately chose not to close the holes because they were there for "marketing initiatives".
On Windows, a massive malware-detection industry has sprung up, and still there are millions upon millions of zombie PCs out there. Meanwhile, despite no such industry on OS X, there are no reports of infections in the wild (from viruses, as the original commenters claim; this article is about a trojan, which is also a lot more prevalent on Windows than on the Mac).
But don't let these facts get in the way. Remember, the point of this thread specifically, and a big part of the draw of Hacker News, is that you can bash Apple and get upvoted by other Apple haters. Facts are not relevant.
The easier question to ask is: what HASN'T Microsoft done?
A quick example: they paid millions, possibly tens of millions, for the best people from industry and academia to build automatic bug-searching tools. These tools currently define the state of the art for finding bugs in applications.
They also have a community outreach program where they will work with companies and individuals that report flaws. That same outreach program will work internally with the relevant product groups to get flaws fixed.
Oh, and they also invest constantly in improvements to their toolchain and operating system runtime to make bugs harder to exploit (SafeSEH, ASLR, DEP, GS cookies, EMET, encoded pointers, safe-unlinked heaps in both user and kernel mode, etc.).
And yet, you realize, there are still exploitable bugs. In my opinion you should regard this as the fundamental instability of system software written in C. If Microsoft can't get that right (in terms of security and stability), after all that they've invested, who can?
2. I'd agree that the outreach program is something that Apple clearly hasn't done.
3. Apple clearly is doing this too.
HP's work on secret agents in the 90's shows that you can't prove code to be trustworthy. You can only assign trust to the intentions of the originator.
Therefore, the most significant thing you can do to improve security is to verify the provenance of trusted code and the isolation of untrusted code.
Windows clearly has decent technical code-signing infrastructure, but Apple seems far ahead in terms of effectively deploying this model into the field.
They seem more concerned about not letting users jailbreak their devices than anything else.
"Windows clearly has decent technical code-signing infrastructure, but Apple seems far ahead in terms of effectively deploying this model into the field."
"Far ahead"? How do you justify that claim? Most charitably, it seems that both companies work on security.
Of the two, Microsoft seems to have better documentation, better openness, and better tools.
That jailbreaking works by taking advantage of OS exploits to gain root access, and, in fact, represents a glaringly obvious, popular, and easy-to-use vulnerability has nothing to do with Apple fixing it quickly?
By #3 I was referring to the parent's 3rd comment about the focus on improving the toolchain and runtime. I doubt that you're serious about the jailbreak comment.
By far ahead, I was referring to the deployment of code signing technology.
I think it's pretty clear that although Microsoft has solid code signing technology, they are much further behind in promoting effective use of it in the field.
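To put a face on what "deployment" means here: on Mac OS X, any process can ask the Security framework whether a bundle's signature checks out, and that is the same code-signing machinery the OS's own policy checks build on. A minimal sketch follows; SecStaticCodeCreateWithPath and SecStaticCodeCheckValidity are real Security framework calls, while the function name and the path are just illustration, and passing nil as the requirement only checks signature integrity rather than pinning who signed it:

    import Foundation
    import Security

    // Minimal sketch: ask the Security framework whether a bundle on disk
    // carries a valid code signature. A real policy would also supply a
    // SecRequirement (e.g. "anchor apple") instead of nil to pin the signer.
    func hasValidSignature(atPath path: String) -> Bool {
        let url = URL(fileURLWithPath: path) as CFURL
        var code: SecStaticCode?
        guard SecStaticCodeCreateWithPath(url, SecCSFlags(rawValue: 0), &code) == errSecSuccess,
              let staticCode = code else {
            return false
        }
        return SecStaticCodeCheckValidity(staticCode, SecCSFlags(rawValue: 0), nil) == errSecSuccess
    }

    print(hasValidSignature(atPath: "/Applications/Safari.app"))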
The compound word 'PreOrder' exists as a discrete term. ... case it as 'Preorder' or strip the first token entirely if it represents any sort of Hungarian notation.
Aw shit, Apple is gonna get fucked by all those remotely exploitable Hungarian notation bugs.
> They seem more concerned about not letting users jailbreak their devices than anything else.
Ironic that you would use a feature that prevents malware from running, but pisses off Android fans, as an attempt to claim that Apple is not working on security!
> These tools currently define the state of the art for finding bugs in applications.
The stuff in Visual Studio is so far behind the state of the art that I don't even know where to begin. But let's ignore that and recalibrate by asking this: what do you think the not-quite-state-of-the-art, second-best automatic bug-finding tool is?
Releasing security fixes in a timely manner, for one.
This Java exploit had been fixed for a long time before Apple so graciously bestowed a fix upon us. Then there's that SSL certificate SNAFU where I had to fix Safari myself, after almost all companies had already issued updates. Too bad that wasn't possible on iOS, where you just had to sit around and twiddle your thumbs.