The way I see it, there are two separate takeaways from this ArsTechnica piece: "Vendor inaction leads researcher to disclose Safari, IE flaw."
The first is that convenience doesn't come for free. One of my post-college drinking buddies kept himself solvent refurbishing cars in the days before title branding, and if he taught me one lesson (apart from calling him for a diagnosis before taking my car to the mechanic, so I could at least talk a good game), it's that features exist to break. Or, like automatic windows, to freeze up in the -30 lows of Eau Claire, Wisconsin.
In the same sense, any software that makes it easy to sling sensitive data around the internet is also breakable. And, as a matter of fact, it's extra-vulnerable, because a certain percentage of the software development community will be hammering at it harder than at a more innocuous feature. That's the reason I never, ever allow browsers to remember form data, much less passwords, for me. It's the difference between hobbling along under your own power and being pushed about in a Barcalounger on wheels: Even hobbling, you know who's actually in the driver's seat.
The second takeaway is, IMO, the unspoken point of the ArsTechnica piece, namely the difference between open source (FOSS) and proprietary software. Even when the proprietary product is more polished, more extensible (or what have you) than its FOSS counterpart, there's a crucial difference in the way that vulnerabilities are viewed.
In the case of FOSS, any functional imperfection is an unexpected outcome, and thus more or less a comment upon the person or team who released it. An actual security hole is exponentially more so. That can be a good thing (i.e. a strong motivation to fix things Right This Very Nanosecond) or a bad thing (a certain lack of big-picture perspective, which is crucial for allocating limited resources).
Then there's proprietary software. Particularly the web browser, which--sexy HTML5 notwithstanding--is IMTLHO still hugely overworked and underappreciated because of its dot-com baggage. Integrated pest management isn't a bad metaphor for handling proprietary software bugs, implicit pun included. How critical is any particular vulnerability? It depends on who will have the loudest freak-out. Which is precisely the kind of thing guaranteed to get up a hacker's nose--meritocracy carrying a lot more water in that sphere and all.
Even with some grasp of the philosophical differences, I nevertheless fail--meaning, "fail" as shorthand for #epicfail--to grok the attitude of (some) brand-name corporations when it comes to security vulnerabilities or other deal-breaking flaws in their products. Why?
A.) Someone just handed them free testing time, which at a minimum merits the uncommon decency of a "thank you."
A-point-one.) That same someone isn't keeping mum about the flaw to profit from it.
B.) Someone cared enough about a product to dig that deeply into it--someone who could easily be turned into a cheerleader for the company by dint of a pleasant, appreciative response.
C.) Given the afore-mentioned meritocracy, that someone is primarily looking for kudos. If the messenger is ignored--not even shot, just ignored--the I/T community will more than amply fill the kudos-vacuum if the rightful source won't supply them.
D.) By circular-filing or black-holing such warnings, the software company in question is broadcasting a signal to its wiser employees and would-be employees: "We don't have the cojones to face our problems anymore. Sure, we can snooker college grads into working for us b/c they can tell their parents/GFs/BFs/SOs that they work for us. They haven't learned the difference between Lysol and gangrene. And when they do, there are non-compete agreements...and more graduates."
With all the AntennaGate joking about reality distortion fields, what's lost is the fact that it's not an isolated phenomenon. The vendor owns the proverbial lion's share of the blame, naturally. But that still leaves the hyena's share...and even the vulture's share, and those belong to us users. I don't care what the UI books preach: At some level, the user has to take responsibility for thinking. If only because outsourcing thought in the online world is every bit as dangerous as outsourcing it in the real one.