The Future of Touch Screens
Touch screens are everywhere today: on our phones and tablets, in stores, and in countless other devices and places. The problem is that they are somewhat limited. Yes, some can track two or three fingers at once, but that is about it. It seems this may be about to change.
Researchers at Carnegie Mellon University have come up with new technology that can tell exactly what has touched a device’s screen. This technology, called TapSense, was developed by Chris Harrison and Julia Schwartz, both PhD students at Carnegie Mellon’s Human-Computer Interaction Institute. TapSense uses a microphone attached to the screen to determine what exactly has interacted with it. Instead of treating every touch as just a tap, the system can distinguish between taps with the tip of a finger, the pad (the part of your finger with the fingerprint), the fingernail, and even the knuckle. Basically, instead of only detecting gestures, the software can “listen” to which part of the finger was used and act accordingly.
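TapSense’s actual classifier isn’t described here, but the core idea — extract acoustic features from a tap’s sound and match them against known touch types — can be sketched roughly like this. The single feature, the thresholds, and the labels below are illustrative assumptions, not TapSense’s real implementation:

```python
import numpy as np

def spectral_centroid(signal, sample_rate=44100):
    """Frequency 'center of mass' of a tap sound.
    Intuition: hard objects (nail, knuckle) tend to produce brighter,
    higher-frequency impacts than a soft finger pad."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return np.sum(freqs * spectrum) / np.sum(spectrum)

def classify_tap(signal, sample_rate=44100):
    """Toy nearest-threshold classifier over one feature.
    A real system would use many features and a trained model."""
    centroid = spectral_centroid(signal, sample_rate)
    if centroid < 1000:      # dull, low-frequency thud: soft tissue
        return "pad"
    elif centroid < 3000:    # mid-range impact: fingertip
        return "tip"
    else:                    # bright, high-frequency click: hard surface
        return "nail"
```

Feeding it synthetic tones at different frequencies shows the thresholds at work; a real tap waveform would be noisier, so a production system would combine several features rather than rely on one.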
This adds a whole new dimension to touch screens. For example, imagine using it in a drawing application, or in any application with many menus: touching with your knuckle or your fingernail could open different menus.
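In code, an app like that could simply dispatch on the detected touch type. The mapping below is a made-up illustration of the idea, not anything from TapSense itself:

```python
# Hypothetical dispatch table for a drawing app: touch type -> action
ACTIONS = {
    "pad": "draw",        # soft pad touch paints a stroke
    "tip": "select",      # precise fingertip selects objects
    "nail": "open_menu",  # fingernail tap opens a context menu
    "knuckle": "erase",   # knuckle rub erases strokes
}

def handle_touch(touch_type):
    """Fall back to plain drawing for unrecognized touch types."""
    return ACTIONS.get(touch_type, "draw")
```

The fallback matters: a classifier working from tap sounds will sometimes be unsure, and defaulting to the most common action keeps the app usable when that happens.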
Right now the system can’t run on existing smartphones because it requires that extra microphone. The microphones in current smartphones are optimized for picking up voices, not the sound of a finger tap. It could still be implemented, though, just with the addition of the extra mic.
Here is a video demoing TapSense:
To read more, see the article this blog post is based on by clicking HERE. Also, remember to take the BlackBoard quiz!