Skin as Network

Photo via Carnegie Mellon

Two recent developments that use human skin as an interface bring us closer to a world where we no longer need devices, as our bodies join the ubiquitous information networks imagined by Philip K. Dick. Researchers at Carnegie Mellon University and Microsoft Research recently introduced Skinput, a skin-based interface that lets hands and arms serve as touchscreens. Skinput works by detecting the distinct ultralow-frequency sounds produced when different parts of the skin are tapped, allowing users to control audio devices, play games, make phone calls, and navigate hierarchical browsing systems.

Co-developed by Chris Harrison, a Ph.D. student at Carnegie Mellon and developer of a touchscreen tabletop, together with Microsoft researchers Dan Morris and Desney Tan, the software matches sound frequencies to specific skin locations, allowing the system to determine which “skin button” the user pressed. A keyboard, menu, or other graphics are beamed onto the user’s palm and forearm by a pico projector embedded in an armband; an acoustic detector in the armband then determines which part of the display the user touched. Variations in bone density, size, and mass, as well as filtering effects from soft tissues and joints, make different skin locations acoustically distinct.
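The matching idea can be sketched in code. This is not Skinput's actual implementation (the names, feature vectors, and nearest-centroid approach below are illustrative assumptions): each tap is reduced to a small vector of amplitudes in a few frequency bands, training taps for each "skin button" are averaged into a centroid, and a new tap is assigned to whichever location's acoustic profile it most resembles.

```python
# Hypothetical sketch of acoustic tap classification, NOT Skinput's real code.
# A tap is a feature vector of per-band amplitudes; each skin location gets a
# centroid from training taps, and new taps go to the nearest centroid.

from math import dist  # Euclidean distance, Python 3.8+


def train(samples):
    """samples: {location: [feature_vector, ...]} -> {location: centroid}"""
    centroids = {}
    for location, vectors in samples.items():
        n = len(vectors)
        # Average each frequency band across this location's training taps.
        centroids[location] = [sum(band) / n for band in zip(*vectors)]
    return centroids


def classify(centroids, features):
    """Return the skin location whose acoustic profile is closest to the tap."""
    return min(centroids, key=lambda loc: dist(centroids[loc], features))


# Invented training data: band amplitudes for taps on two "skin buttons".
training = {
    "forearm": [[0.9, 0.2, 0.1], [0.8, 0.3, 0.1]],
    "palm":    [[0.2, 0.7, 0.4], [0.3, 0.8, 0.5]],
}
model = train(training)
print(classify(model, [0.85, 0.25, 0.1]))  # prints "forearm"
```

In the real system the features would come from the armband's acoustic sensors and the classifier would be trained per user, since bone and tissue differences change each person's frequency response.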

Nippon Telegraph and Telephone Corporation (NTT) in Japan has developed RedTacton, a ‘Human Area Networking’ technology that uses the surface of the human body as a safe, high-speed network transmission path. RedTacton employs a proprietary electric-field/photonics method, which the company claims surpasses other methods in communication distance, transfer speed, and interactivity. It works via receivers that sense the minute electric field emitted by the body, making communication possible through any body surface, such as the hands, fingers, arms, feet, face, legs, or torso. NTT claims RedTacton works through shoes and clothing as well.

via Kurzweil AI and RedTacton