The phone seems like the next logical step in the evolution of transaction technologies. Digital wallets have a lot of advantages, such as ease of use, security, and contextual intelligence. A smartphone knows far more about its user than any magnetic strip could ever hope to. It knows where the user is, what's on his calendar, who his friends are, what his shopping habits are, and, finally, how much money he has and where. This knowledge base, usually known as passive intelligence or contextual awareness, is infinitely expandable. In the ideal limit, your phone could know you as well as you know yourself. That is a very impressive data set to have if you're trying to help a user make purchases with as little friction as possible.
One key technological question is that of identification.
How does your phone know that it's really you?
If it's stolen, will the thief have easy, unfettered access to your finances?
Technologically speaking, there are a few ways to ascertain the identity of the account holder. The simplest is what we use today: username/password combos. It is also the least secure. The second is a token system, whereby the user is handed some sort of token, such as a digital certificate or a physical key fob, that grants him access to certain accounts. Today we mostly use what's called two-factor authentication, which combines the two methods above. But the holy grail, of course, is biometric ID, which uses the user's biological features to ascertain his identity. Implementations could use the following:
Iris scans - a straightforward scan of the iris. This is effective but awkward from a usability standpoint, unless it gets so good as to become truly effortless (think of the iris-scan system from Minority Report).
DNA scans - this would be the ultimate, in a way. DNA is absolutely unique to a person. However, DNA samples are easy to obtain, so I don't know how foolproof this system would be. There is also a technological barrier as to how, and how fast, a system could scan a user's DNA.
Phenotypic recognition - the phone would either use its camera to look at the user's face, use its microphone to listen to the user's voice, or use a fingerprint scanner to scan the user's fingerprints. This is the one that seems doable right now with a reasonable level of accuracy, so I'll discuss it in more detail here (a sketch of what the phone-side check might look like follows this list).
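If Apple ever exposes this kind of check to app developers, the call site might look something like the sketch below. The LAContext API is Apple's real LocalAuthentication framework; the payment framing and the fallback behavior are my own assumptions.

```swift
import LocalAuthentication

// Sketch: gate a sensitive action behind a biometric check.
let context = LAContext()
var error: NSError?

// Can this device do a biometric check at all (sensor present, user enrolled)?
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Confirm your identity to pay") { success, authError in
        if success {
            // The sensor says it's really the owner; proceed with the transaction.
            print("authenticated")
        } else {
            // Fall back to a passcode, or refuse the transaction outright.
            print("failed: \(authError?.localizedDescription ?? "unknown")")
        }
    }
}
```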
The current iPhone already has an HD front camera and two or three microphones, with respectable signal-processing hardware and software in the A6 chip. In a generation or two, these things could get good enough to be used to ID the user. Siri could come into play here to deliver that Apple panache. Imagine saying this to Siri:
"Siri, buy me two tickets to see Django Unchained when I get to the movie theater, and prepay for two sodas and popcorn."
Siri would immediately set up a geofence around the movie theater, purchase those things for you when you get there, and put your tickets in Passbook. Since she knows your voice ID, every command you give her is itself an authentication that she can use to perform transactions. If you choose to open a banking app, the phone might ask to see your face and make sure that it really is you. But the user can't always use his voice (bad cold, quiet room, etc.) or show his face to the phone (bad lighting, movie theater, etc.). This is where fingerprint tech comes in.
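The geofence half of that scenario is already buildable today. Here's a minimal sketch using CoreLocation's real region-monitoring API; the theater coordinates and the purchase call are invented for illustration.

```swift
import CoreLocation

// Sketch: fire a purchase when the user arrives at the theater.
final class TicketAgent: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func armGeofence() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()  // region monitoring needs "always" access

        // Hypothetical theater location and a 150 m radius around it.
        let theater = CLLocationCoordinate2D(latitude: 38.0293, longitude: -78.4767)
        let fence = CLCircularRegion(center: theater, radius: 150, identifier: "movie-theater")
        fence.notifyOnEntry = true
        manager.startMonitoring(for: fence)
    }

    // Called by the OS when the user crosses into the region.
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        guard region.identifier == "movie-theater" else { return }
        purchaseTickets()
    }

    private func purchaseTickets() {
        // Hypothetical: charge the stored account, drop the passes in Passbook.
        print("Buying 2 tickets + concessions…")
    }
}
```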
There are three user-interaction models for fingerprint IDing:
Rollers - the user rolls his finger over a sensor. This is old school.
Swipe - the user swipes his finger past a strip of sensors. This is the most common type; it saves money and space because it dedicates the least possible area to sensing.
Area - this is a big flat surface with an embedded sensor grid that the user can rest his fingers on, and the print is read anywhere on the surface. This is expensive.
That last option, though expensive, is the most Apple-like solution because it is the easiest and most elegant from a user standpoint. Area sensors are based on a grid of capacitive sensing pixels. Sound familiar? It should, because that's how the capacitive touchscreens on iPhones detect multi-touch inputs. For the iPhone 5, Apple moved to in-cell touch sensors, which means that each display pixel also acts as a touch-sensor pixel. So the current-generation iPhone has a capacitive touch sensor with a density of 326 pixels per inch (326 PPI). I had a chat the other day with Professor Barry Johnson here at UVA (bio sketch here), who is also the founder of Privaris (a biometric security firm), and he told me that for area sensors to be accurate enough, they need a density of something like 500 PPI. And lo and behold, we recently heard news that Apple is in talks with (and may even make investments in) Sharp. Sharp, of course, is the pioneer of IGZO technology, which shrinks the backplane circuitry that powers LCD screens, allowing them to create LCD displays with… wait for it… 498 PPI (source). Coincidence? Maybe. But maybe not.
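The 500 PPI figure passes a back-of-the-envelope check. Fingerprint ridges repeat roughly every half millimeter (a commonly cited figure; treat it as approximate), so the question is how many sensor pixels land on each ridge period:

```swift
// Pixel pitch in millimeters for a given sensor density in pixels per inch.
func pitchMM(_ ppi: Double) -> Double { 25.4 / ppi }

print(pitchMM(326))  // ~0.078 mm per pixel: the iPhone 5's in-cell touch grid
print(pitchMM(498))  // ~0.051 mm per pixel: the reported Sharp IGZO density

// At ~500 PPI, a ~0.5 mm ridge period spans roughly ten sensor pixels --
// plausibly enough resolution to pick out ridge endings and bifurcations.
let pixelsPerRidge = 0.5 / pitchMM(498)  // ≈ 9.8
```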
If IGZO is the real deal, Apple could find itself with LCDs that have in-cell sensors at a resolution of ~500 PPI in the next year or two. That would let the iPhone do fingerprint ID on the fly as the user is using the phone. One possible scenario: every time the user taps the icon of a banking app, the patch of screen where the icon resides automatically and seamlessly takes a fingerprint snapshot; the ID is performed, and the app opens only if the user is recognized. Another high-security scenario: the lock screen senses your fingerprint while you're swiping to unlock and authorizes access on the fly. The user could configure security app by app, too, and could even build a fortress of a phone by requiring that fingerprint, facial, and voice recognition all succeed before certain features unlock (see the sketch below). I believe this is also why Apple bought fingerprint-authentication company AuthenTec back in July for $356 million. I wouldn't be surprised to see them add one or two more biometrics companies going forward if they feel the need to accelerate this side of the business.
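To make the per-app "fortress" configuration concrete, here's a hypothetical sketch. None of these types exist in iOS; the bundle IDs and policy store are invented to show the shape of the idea.

```swift
// Hypothetical per-app security policy: the user picks which biometric
// factors an app must pass before it is allowed to open.
struct SecurityPolicy: OptionSet {
    let rawValue: Int
    static let fingerprint = SecurityPolicy(rawValue: 1 << 0)
    static let face        = SecurityPolicy(rawValue: 1 << 1)
    static let voice       = SecurityPolicy(rawValue: 1 << 2)
}

// Example configuration: the banking app demands all three factors,
// while a game demands none.
var policies: [String: SecurityPolicy] = [
    "com.example.bank": [.fingerprint, .face, .voice],
    "com.example.game": [],
]

// An app may launch only if every required factor was passed.
func mayLaunch(app: String, passed: SecurityPolicy) -> Bool {
    let required = policies[app] ?? []
    return required.isSubset(of: passed)
}

print(mayLaunch(app: "com.example.bank", passed: [.fingerprint]))         // false
print(mayLaunch(app: "com.example.bank", passed: [.fingerprint, .face, .voice]))  // true
```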
This whole thing will only work if banks grant the iPhone (and Siri) unprecedented access to users' accounts, authorized by the user. That raises the intriguing possibility of Apple doing something serious in the banking area. They could simply sign deals with the banks and transaction processors to make this all work. But from what we know about Apple, they're just as likely to set up their own finance division to process transactions, or even a bank. It's anyone's guess what that might entail, but the probability is significantly greater than nil.
For governments and other critical-security users, another level of protection can be added via encrypted processors. There's a lot of literature on that, so I won't go into details here. But Apple could add encrypted chips to iPhones, or even design the whole SoC as an encrypted system (now that they're designing their own cores). Read more on encrypted chips here: http://en.wikipedia.org/wiki/Secure_cryptoprocessor
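For a flavor of what the app-facing side of such a chip can look like, here's a sketch using the Keychain's hardware-backed key support. SecKeyCreateRandomKey and kSecAttrTokenIDSecureEnclave are real Security-framework names; the application tag is invented, and this only runs on hardware that actually has the secure processor.

```swift
import Foundation
import Security

// Sketch: generate a private key that lives inside the secure processor.
let attributes: [String: Any] = [
    kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
    kSecAttrKeySizeInBits as String: 256,
    kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave,  // key is created in hardware
    kSecPrivateKeyAttrs as String: [
        kSecAttrIsPermanent as String: true,
        kSecAttrApplicationTag as String: "com.example.payments.key".data(using: .utf8)!,
    ],
]

var error: Unmanaged<CFError>?
guard let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error) else {
    fatalError("key generation failed: \(error!.takeRetainedValue())")
}
// The private key never leaves the secure chip; the OS only gets a handle
// it can ask to sign or decrypt with, which is the whole point.
print(privateKey)
```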
This is apple’s revenue split for the last quarter. Mobile is the future and apple has its feet firmly planted in the mobile world. Even the mac sector of this pie chart is mostly made of macbook airs and macbook pros.
This is also why Microsoft will be going the way of IBM pretty soon. That's how they got started anyway - what goes around comes around.
So, I bought the iPhone/iPad version of Keynote yesterday, expecting to be editing my Keynote slides on my phone (seeing as I'm using Keynote pretty much every day these days). After downloading, I fully expected all my Keynote files from my Mac to automagically appear in the mobile app, because I've had "Documents in the Cloud" turned on in the iCloud preferences on both my Mac and my iPhone. WOMP motherfucking WOMP.
When Apple says, "it just works," they don't actually mean it works for everything. Because it doesn't work for documents, and that fucking sucks. Apparently, you have to use the web interface at iWork.com to get your docs from the Mac to iCloud, and even then I don't know how you'd work with them on your mobile device.
Apparently, we have to wait for the Mountain Lion update, which does have a separate dialog box for saving things straight to iCloud. But that won't be until this summer…
AT&T is getting spanked.
Simply amazing. iPhone apps are amazing.
Most, if not all, of the apps in the iOS App Store will become Mac apps as well once the Mac App Store opens. We'll see soon enough…
This was so heartbreaking. One of the saddest days of my life…
MATLAB Mobile. I don't know how I didn't hear about this before… It's an iPhone app that lets you connect remotely to a desktop MATLAB installation, and you get a fairly decent console. You can type commands, which means you can run scripts too. This is an interesting idea…