Categories
Apple Security

Path’s Privacy Folly Proves Shift In Privacy Views

Path uploaded address book data from its users in order to provide “social” functionality. After this became public they deleted all address data and apologized.

Everyone is ignoring the worst part of this. While very bad, it’s not that Path actually uploaded its users’ address books (I’d venture most apps store contact data in “the cloud” already, so true privacy is out the window). The worst part is that Path didn’t even think this would be a problem until it became news. Even two years ago, I don’t think anyone other than malware developers would have considered uploading an entire address book of contacts without explicit approval to be an acceptable practice. That is a huge cultural shift.

If Path were a desktop app in 2010, it would be competing with antivirus and spyware blockers that would be racing to provide protection to their users.

In just a short time, a practice that would be reserved for illegal and dubious software was adopted by what seems like a mainstream startup. It’s electronic moral decay.

Apple doesn’t get a free pass either. Why a sandboxed app in iOS 5 can access the address book without alerting the user is beyond me. Addresses, calendar data, geolocation, and the ability to make a call are sacred APIs and should have obvious UI and/or warnings. Geolocation does have an interstitial alert. Phone calls have an obvious UI. Address and calendar data need an alert before the app is granted access.

Categories
In The News Internet Security

Facebook’s IPO Filing

There’s not really much to say about Facebook’s IPO filing other than we all knew the day would eventually come. People love to look at the number of users as the measure of the company, but the truth is users come and go; all empires eventually fall, and they have since the dawn of man. It’s a terrible measure of what a company has done and is capable of doing. Eventually market saturation will be reached. It’s unsustainable to grow more quickly than the world’s population. Everyone but Wall Street and some tech blogs knows that.

What really matters is a company’s DNA. For Facebook, that’s the willingness to be agile, the willingness to push things, and the willingness to change. That may occasionally backfire, but it’s proven to generally work out quite well, especially when Facebook is willing to back down and revise as it has in the past. Mark Zuckerberg’s goal is pretty lofty, especially given that the world and its people are still struggling to figure out privacy in a connected world.

To quote him in 2010: “we’ve made great progress over the last year towards making the world more open and connected”. Balancing this mission without crossing the line will be the challenge Facebook faces for years to come. I’ve criticized them several times in the past for either not doing enough, or not giving enough priority to the right to control privacy. Lately I’ve had less to complain about. I think that’s good for everyone.

Categories
Google Security

Google Open Sesame

Google quietly put up a new login method via QR code. Essentially, the way it works is you view a QR code on a computer or tablet, then scan it with your smartphone and log in via the phone’s browser. That process remotely validates the session, and that computer can then access your account until you log out, essentially eliminating the need to enter a password on that computer.

Presumably the idea is to work around keyloggers that may record passwords. However, if you don’t trust a computer enough to type a password on it, do you really trust that it’s not watching everything else you do? If the computer’s hardware or software is compromised, not even SSL will save you. This might be better, but I’d think only marginally so. I personally just make a rule of not using computers I don’t trust. Given that I have a smartphone in my pocket, this is pretty easy to live by these days. And given that computers are getting smaller and cheaper, I question whether encouraging the use of shady terminals is worthwhile.

Regardless, pretty innovative and clever.

Categories
Apple Security

Smartphone Guest Mode

A very good idea by Greg Kumparak on TechCrunch:

Here’s the dream: one lock-screen, two PINs. One for me, one for anyone else who might use my phone but doesn’t necessarily need to see everything.

Not only is a guest mode a good idea, the proposed implementation is quite nice and simple: Maps, Phone, Clock, Calculator, Safari. Perhaps with the ability to granularly add/remove apps from that default set. Everything is stateless and resets when guest mode ends.

This could potentially even lower the divorce rate in the US.

Categories
Security

GPRS Cracked

I mentioned the work of Karsten Nohl to expose how insecure cell phones really are back in 2009. It’s great work, since many people assume cell phones are secure when they likely aren’t nearly as secure as one would think or hope. He’s done a lot more since then, as The Register reports:

“The interception software to be released tomorrow puts GPRS operators with no encryption at an immediate risk,” he told The Register on Tuesday evening. “All other GPRS networks are affected by the cryptanalysis that will be presented but not released at tomorrow’s conference. Those operators will hopefully implement stronger encryption in the time it takes others to re-implement our attacks.”

As the article goes on to say, most networks use either no encryption or weak encryption.

In 2010, he bundled many of the various tools he helped develop into a comprehensive piece of software that gave amateurs the means to carry out many of the attacks. That same year, other cryptographers cracked the encryption scheme protecting 3G phone calls before the so-called Kasumi cipher had even gone into commercial use.

So your best bet for making a secure call right now is to use Skype on a smartphone. So far it doesn’t seem anyone has cracked Skype’s security. Whether Skype has a backdoor or known vulnerabilities is an open question. If Skype were considered a phone company (it insists it isn’t), it would be subject to CALEA.

Bottom line: Don’t assume a cell phone call is secure.

Categories
Apple Security

On Apple’s Location Tracking

The controversy over Apple’s “Location Tracking” is quite interesting. It’s worth making clear that the nodes stored in the database are approximations of cell phone towers and WiFi hotspots you’re likely to encounter rather than your location(s) at any given point in time. It’s a way to “prime the well” when doing a GPS lookup to improve performance.

Apple notably failed in a few key ways which should serve as a lesson to others:

  1. Always disclose what you’re doing. – Never just assume what you’re doing with someone’s information is cool. Apple could have mitigated a lot of this had they disclosed what the phone was actually doing from day 1. Never transmit anonymous or personal information without letting the user know first.
  2. Never store more than you need – I can’t believe how many companies mess this up. Storing user information is a liability. A good business limits its liabilities to only what’s necessary to conduct business. Storing so much data and never expunging it was a very bad move and amplified the situation. On top of not letting users know what was going on, there was no way to purge the information, which just made things much worse. Apple went as far as backing up what should be an expendable cache.
  3. Always be paranoid with information – Apple states “The local cache is protected with iOS security features, but it is not encrypted. Beginning with the next major release of iOS, the operating system will encrypt any local cache of the hotspot and cell tower location information.” in its response to Edward J. Markey. This should have been encrypted from day one. Tools that could read this data have existed in the surveillance community for a few years, and Apple undoubtedly knew people were sometimes using that data for illicit purposes. No company has gotten in trouble with anyone other than the NSA or FBI for being too secure with customer information.

It’s worth noting that their software update in response to this controversy is actually pretty good and pretty thorough. I’m surprised they couldn’t quickly shim some encryption around the cache; iOS is loaded with enough DRM and crypto.

On another note, I fully expect some court cases to be reopened now that “cell phone records” are known to be not quite as accurate as they were billed to be. Companies that marketed software as capable of showing a user’s location history may also be liable, as that capability wasn’t accurately vetted; good testing would have revealed the actual extent of the “tracking”. It seems inevitable.

Lastly, I wonder how much battery life and bandwidth this was consuming. Some customers are on metered WiFi (especially at hotspots). To geo-tag, one must turn on GPS, meaning battery life was being drained behind the scenes.

Apple’s full response can be found on Congressman Ed Markey’s website (copied here for perpetuity).

Categories
Security Software

Quicken Security Theater

Quicken Password Confirmation

I don’t understand this one. The reason many (most) sites require you to confirm your password is to ensure you typed it correctly when creating your password, otherwise a typo would prevent you from logging back in correctly later. We’ve all “fat fingered” a password before. That simple confirmation step prevents it on creation. How does entering my password twice when logging in provide any additional security? If the password is compromised, the extra field does nothing.
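The asymmetry here is easy to demonstrate. The sketch below uses invented helper names (no real Quicken API is implied): at creation time there is no stored password to compare against, so asking twice is the only typo defense; at login time the stored password is the ground truth, and a second field adds no new information.

```javascript
// Sketch: a confirmation field matters at creation, not at login.
// createAccount/login are hypothetical helpers for illustration only.

// At creation, comparing two entries is the only way to catch a typo.
function createAccount(password, confirmation) {
  if (password !== confirmation) {
    throw new Error('passwords do not match'); // catches the fat-finger
  }
  return { storedPassword: password }; // real code would hash, of course
}

// At login, the stored password already decides the outcome. A second
// field can only repeat the same comparison.
function login(account, password, confirmation) {
  if (confirmation !== undefined && confirmation !== password) {
    return false; // the "extra security" check...
  }
  return password === account.storedPassword; // ...adds nothing here
}
```

Any attacker who knows the password can simply type it twice, so the second login field changes nothing.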

I presume the reason is to make Quicken look/feel more secure than it really is.

I should note that I like Quicken. I like it enough that even though the native Mac version was so disappointing on paper that I never purchased it, I purchased the Windows version and continue to use it there. I think that demonstrates that I don’t hate Quicken. It does, however, have its quirks that just make me wonder what they were thinking.

Categories
Mozilla Security Web Development

Wanted: Native JS Encryption

I’d like to challenge all browser vendors to put together a comprehensive JS API for encryption. I’ll use this blog post to make the case for why it’s necessary and would be a great move.

The Ultimate Security Model

I consider Mozilla Sync (formerly known as “Weave”) to have the ultimate security model. As a brief background, Mozilla Sync is a service that synchronizes your bookmarks, browsing history, etc. between computers using “the cloud”. Obviously this has privacy implications. The solution basically works as follows:

  1. Your data is created on your computer (obviously).
  2. Your data is encrypted on your computer.
  3. Your data is transmitted securely to servers in an encrypted state.
  4. Your data is retrieved and decrypted on your computer.

The only one who can ever decrypt your data is you. It’s the ultimate security model. The data on the server is encrypted and the server has no way to decrypt it. A typical web service works like this:

  1. Your data is created on your computer.
  2. Your data is transmitted securely to servers.
  3. Your data is transmitted securely back to you.

The whole time it’s on the remote servers, it could in theory be retrieved by criminals, nosy sysadmins, governments, etc. There are times when you want a server to read your data to do something useful, but there are times where it shouldn’t.

The Rise Of Cloud Data And HTML5

It’s no secret that more people are moving more of their data into what salespeople call “the cloud” (Gmail, Dropbox, Remember The Milk, etc.). More and more of people’s data is out there in this maze of computers. I don’t need to dwell too much on the issues raised by personal data being stored in places where 4th Amendment rights aren’t exactly clear in the US and may not exist in other locales. It’s been written about enough in the industry.

Additionally, newer features like Web Storage allow for 5-10 MB of storage on the client side, often used for “offline” versions of a site. This is really handy, but it makes any computer or cell phone a potential treasure trove of data if that data isn’t correctly purged or protected. I expect that 5-10 MB ceiling to rise over time, just like disk cache. Even my cell phone can likely afford more than 5-10 MB. My digital camera can hold 16 GB in a card a little larger than my fingernail. Local storage is already pretty cheap these days, and will likely only get cheaper.

Mobile phones are hardly immune from all this as they feature increasingly robust browsers capable of all sorts of HTML5 magic. The rise of mobile “apps” is powered largely by the offline abilities and storage functionality. Web Storage facilitates this in many ways but doesn’t provide any inherent security.

Again, I don’t need to dwell here, but people are leaving increasingly sensitive data on devices they use, and services they use. SSL protects them while data is moving over the wire, but does nothing for them once data gets to either end. The time spent over the wire is measured in milliseconds, the time spent at either end can be measured in years.

Enter JS Crypto

My proposal is that there’s a need for native JS cryptography implementing several popular algorithms like AES, Serpent, Twofish, MD5 (I know it’s broken, but it could still be handy for legacy reasons), and SHA-256, expanding as cryptography matures. By doing so, front-end logic can easily and quickly encrypt data before storing or sending it.

For example, to protect Web Storage before actually saving to globalStorage:

globalStorage['mybank.com'].lastBalance = "0.50";
globalStorage['mybank.com'].lastBalance = Crypto.AES.encrypt("0.50", password);

Using XMLHttpRequest or POST/GET, one could send encrypted payloads directly to the server over HTTP or HTTPS rather than sending raw data. This greatly facilitates the Mozilla Sync model of data security.

This could also be an interesting way to transmit select data securely via XMLHttpRequest while serving the rest of a site over HTTP, by wrapping just that data in crypto (assuming a shared key).

I’m sure there are other uses that I haven’t even thought of.

Performance

JS libraries like Crypto-JS are pretty cool, but they aren’t ideal. We need something as fast and powerful as we can get. Like I said earlier, mobile is a big deal here, and mobile has performance and power constraints. Intel and AMD now have AES native instructions (AES-NI) in their desktop chips, and I suspect mobile chips that lack this will eventually gain it. I don’t think any amount of JS optimization will get that far performance-wise. We’re talking 5-10 MB of client-side data today, and that will only grow. And that’s before we even talk about encrypting data prior to remote storage (which in theory can exceed the 10 MB limit).

Furthermore, most browsers already have a Swiss Army knife of crypto support, just not exposed via JS in a nice, friendly API. I don’t think any are currently using AES-NI when available, though that’s a pretty new feature and I’m sure in time someone will investigate it.

Providing a cryptography API would be a great way to encourage websites to up the security model in an HTML5 world.

Wait a second…

Shouldn’t browsers just encrypt Web Storage, or let OS vendors turn on Full Disk Encryption (FDE)?

Sure, both are great, but web apps should be in control of their own security model regardless of what the terminal is doing. Even if the storage is encrypted, that doesn’t provide a great security model when the browser has one security model in place for Web Storage while the site has its own authentication system.

Don’t JS libraries already exist, and isn’t JS getting to the point of being almost native?

True, libraries do exist, and JS is getting amazingly fast, to the point of threatening native code. However, crypto is now being hardware accelerated. It’s also something that can be grossly simplified by getting rid of libraries. I view JS crypto libraries the way I view ExplorerCanvas: great, but I’d prefer a native implementation for its performance. These libraries still have a place bridging support for browsers without native support, in the form of a shim.

But if data is encrypted before sending to a server, the server can’t do anything with it

That’s the point! This isn’t ideal in all cases (for example, you can’t encrypt photos you intend to share on Facebook or Flickr), but a Dropbox-like service may be an ideal candidate for encryption.

What about export laws?

What about them? Browsers have been shipping cryptography for years. This is just exposing that cryptography so web developers can better take advantage of it and secure user data. If anything, standalone JS crypto implementations likely create a bigger legal question regarding “exporting” cryptography for web developers.

You’re crazy!

Perhaps. To quote Apple’s Think Different campaign:

Here’s to the crazy ones. The misfits. The rebels. The troublemakers. The round pegs in the square holes.

The ones who see things differently. They’re not fond of rules. And they have no respect for the status quo. You can quote them, disagree with them, glorify or vilify them.

About the only thing you can’t do is ignore them. Because they change things. They invent. They imagine. They heal. They explore. They create. They inspire. They push the human race forward.

Maybe they have to be crazy.

How else can you stare at an empty canvas and see a work of art? Or sit in silence and hear a song that’s never been written? Or gaze at a red planet and see a laboratory on wheels?

While some see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world, are the ones who do.

Time to enable the crazy ones to do things in a more secure way.

Updated: Changed key to password to better reflect a likely implementation in the pseudocode.

Categories
Hardware Security

In Search Of Fireproof/Waterproof Backup

Every year or two, I like to audit how I back up and store my data. I’ve got a pretty good routine of backing up the hard drives of my primary and secondary computers; it’s part of my weekly routine. I also remotely back up some files just in case of a site-compromising situation (fire, flood, theft). I’d also like to move my primary backups to something more secure against site-compromising situations. Remote backups require either physical transportation or adequate bandwidth, both of which are limiting and not exactly cost-efficient. I’d like to bypass that.

I’m aware of, but not really fond of, ioSafe’s line of fireproof/waterproof hard drives, because it’s a high investment in a single drive. This doesn’t seem very practical to me in the long run as data storage needs change and drives get bigger and faster. I also don’t need that level of simplicity. I just want someplace safe to store backups.

What I’m really looking for is a lockbox style safe that meets the following requirements:

  • Just large enough to hold 1-2 3.5″ hard drive enclosures.
  • Fireproof and Waterproof
  • UL 125 rated for 1 hr or more.
  • Solid locking mechanism and hinges that can handle many cycles. Combo is preferred since keys either get lost, or leaving them in the lock results in them getting bent.

There doesn’t seem to be anything on the market that meets these seemingly simple requirements. Almost everything in this size range (which isn’t much) is UL 125 rated for 30 minutes at best. Reviews for everything in this class are very mixed regarding the quality of the hinges and locking mechanisms. Truthfully, I’d rather have no lock and reliable opening/closing than a failed lock; unless all computers are physically secured in a safe, it’s false security anyway. USB pass-through isn’t ideal either, since who wants to keep something like this that close to their desk instead of in a closet or someplace more convenient?

Oh yeah, I’d also like to keep this somewhat economical. Truthfully, a safe/lockbox of this size generally is, though those models don’t meet the other requirements. I’d be curious if anyone has found something that meets all my requirements; I can’t be the first to go down this path. Maybe I’m just the first who wants to do it right and doesn’t want a 300 lb walk-in safe.

Categories
Mozilla Security

Firesheep Is Just The Messenger

I must say I’m glad to see there are no plans to pull Firesheep. Add-ons have a lot of power since they run in a privileged space: anything your browser can access, your add-ons can access. The point of being able to kill add-ons was to protect the user in situations where an add-on was either bundling malware or sending information without the user’s consent. Firesheep does none of that. It behaves exactly as advertised, and it causes no harm to the user or their computer.

Firesheep doesn’t do anything that couldn’t be done with a packet sniffer; it just makes it trivial enough that the average person can do it. It makes a flaw in many websites more visible. More technical folks have known about this for years. Firesheep is just the messenger. These insecure bits of traffic have traveled across the wire for a decade or more. On a shared medium such as hubbed Ethernet, traffic is visible to every device on the segment; that’s simply how a shared medium works. It’s just a matter of looking at it. WiFi is a slightly different ballgame, but at the end of the day, if a wireless signal is unencrypted, it’s just a matter of listening.

I am not a lawyer (nor do I play one on TV) but from a legal perspective I suspect Gregg Keizer is correct in suggesting that it’s likely legal under federal wiretapping statutes (ethics is another debate). However a company likely can still fire you for using it, and a school likely can still kick you out for using it on their network. Private networks have their own rules and policies.

That covers the detection of a session. If you were to actually session jack, that would likely be considered fraud, hacking, identity theft, etc. depending on what you do. Generally speaking, unauthorized access to a computer system is illegal. If you are using someone else’s credentials, that’s by definition unauthorized access.

Electronic communications law is hardly considered developed or mature but generally there isn’t an expectation of privacy when no encryption is used and transmission is done over a shared connection. It’s akin to speaking to someone on the street and being overheard. That said, if someone reads their credit card number while on a cell phone call and you use the credit card information you overheard, it’s still fraud regardless of the interception method.

Bottom line: It’s time to start securing connections.