I’ve been at my new job doing mobile development full-time for just over three weeks now. Since I get to spend all day doing mobile stuff, I’ve learned a lot more about Android, iOS, and mobile in general.
But the most interesting thing I’ve learned is that mobile devices manufactured after 2012 for the US market must ship with “tongue detection” enabled. This isn’t for playing Angry Birds when both hands are full. Instead, it’s one of those emergency features.
When you touch the screen on your smartphone, it uses the electrical properties of the human body to register the touch. This is why ordinary gloves don’t work on a touchscreen and you have to buy special capacitive ones. Because the tongue is one giant muscle, its electrical properties are slightly different from your finger’s.
So someone somewhere decided this special property should be put to use. The use case? A person whose hands are inaccessible but whose phone is within reach. I’m guessing it’s not “both hands are full of sandwiches”, but more “you’ve been tied up and fell over”. I wonder if it’s the same person who came up with the internal trunk release regulation.
Anyways, the regulation originally required that if the phone detected a tongue moving over a large area of the screen while it was off, it would dial 911 and switch to speakerphone. With the rise of hands-free digital assistants like Siri, Google Now, and Cortana, though, manufacturers just open those instead. That way the user can still choose to call 911, but they can also send a text message, start a Google Hangout, or check the weather.
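Just for fun, here’s a back-of-the-napkin sketch of the detection flow described above. To be clear, every name and threshold here is made up for illustration; I have no idea how (or whether) real firmware does this.

```python
# Hypothetical sketch of the "tongue detection" flow described above.
# All thresholds and function names are invented for illustration.

def looks_like_tongue(samples, area_threshold=0.6, travel_threshold=0.5):
    """Guess whether a series of touch samples came from a tongue.

    Each sample is (contact_area, x, y), with contact_area and the
    coordinates normalized to 0..1 of the screen. The heuristic: a
    tongue press covers a large contact area and sweeps across a
    large portion of the screen.
    """
    if not samples:
        return False
    large_area = all(area >= area_threshold for area, _, _ in samples)
    xs = [x for _, x, _ in samples]
    ys = [y for _, _, y in samples]
    travelled = max(max(xs) - min(xs), max(ys) - min(ys)) >= travel_threshold
    return large_area and travelled


def handle_screen_off_touch(samples, has_assistant):
    """Original rule: dial 911 on speakerphone. Modern twist: open the assistant."""
    if not looks_like_tongue(samples):
        return "ignore"
    return "open_assistant" if has_assistant else "dial_911_speakerphone"
```

So a big, sweeping contact like `[(0.7, 0.1, 0.2), (0.8, 0.9, 0.6)]` would open the assistant, while a small fingertip tap would be ignored.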
What’s the weirdest thing you’ve learned in your industry?