Archive for the ‘B is for Biosense’ Category

Skin as Network

Thursday, March 11th, 2010

Photo via Carnegie Mellon

Two recent developments that use the human skin as an interface bring us closer to a reality in which we no longer need devices, as our bodies themselves network in the ubiquitous information world imagined by Philip K. Dick. Researchers at Carnegie Mellon University and Microsoft Research recently introduced a new skin-based interface called Skinput that allows hands and arms to be used as touchscreens. Skinput works by detecting the various ultralow-frequency sounds produced when tapping different parts of the skin, allowing users to control audio devices, play games, make phone calls, and navigate hierarchical browsing systems.

Co-developed by Chris Harrison, a Ph.D. student at Carnegie Mellon and developer of the touchscreen tabletop, and Microsoft researchers Dan Morris and Desney Tan, the software matches sound frequencies to specific skin locations, allowing the system to determine which “skin button” the user pressed. A keyboard, menu, or other graphics are beamed onto a user’s palm and forearm from a pico projector embedded in an armband. An acoustic detector in the armband then determines which part of the display is activated by the user’s touch. Variations in bone density, size, and mass, as well as filtering effects from soft tissues and joints, mean different skin locations are acoustically distinct.
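The published system uses machine learning over many acoustic features; as a loose illustration of the matching idea described above (all locations, feature vectors, and the nearest-centroid approach here are invented for the sketch, not Skinput's actual classifier):

```python
import math

# Hypothetical "training" data: an average acoustic feature vector
# (e.g., energies in a few frequency bands) recorded while tapping
# each skin location. Because bone density, mass, and soft tissue
# differ, each location produces a distinct acoustic signature.
CENTROIDS = {
    "palm":    [0.9, 0.2, 0.1],
    "wrist":   [0.4, 0.7, 0.2],
    "forearm": [0.1, 0.3, 0.8],
}

def classify_tap(features):
    """Return the skin location whose stored signature is closest
    (by Euclidean distance) to the observed feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda loc: dist(CENTROIDS[loc], features))

print(classify_tap([0.85, 0.25, 0.15]))  # closest to the "palm" signature
```

A tap whose measured features fall nearest a stored signature is reported as that “skin button,” which is the essence of the matching step, however the real system extracts and weighs its features.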

Nippon Telegraph and Telephone Corporation in Japan has developed RedTacton, a new ‘Human Area Networking’ technology that uses the surface of the human body as a safe, high-speed network transmission path. RedTacton employs a proprietary electric field/photonics method, which the company claims surpasses other methods in communication distance, transfer speed, and interactivity. RedTacton works via receivers that interact with the minute electric field emitted from the body, making communication possible using any body surface, such as the hands, fingers, arms, feet, face, legs, or torso. RedTacton is claimed to work through shoes and clothing as well.

via Kurzweil AI and RedTacton

B is for Biosense

Saturday, May 9th, 2009

Biosense describes the condition that every organism is in permanent exchange with its environment, and it represents a new model of ecological design that is aware and alive. Think, for example, of a plant’s stomata, located on the underside of its leaves: they are sensitive to environmental cues and act as gateways that take in water and nutrients or, in conditions such as drought, close tightly to prevent dehydration. Moving forward, objects will biosense: they will detect and monitor invisible substances and otherwise imperceptible phenomena to give people on-demand control of their health and environments. For instance, one of the new features in the iPhone 3.0 update is LifeScan, from Johnson & Johnson Co., which monitors glucose levels through a Bluetooth-enabled blood-testing device, allowing users to make insulin adjustments. Or imagine a mobile phone that uses technology developed by Gentag, Inc. to detect pollen in the air and, since it knows you’re allergic, suggests an alternative route for safe passage home. Or a car with a carbon meter that indicates the level of toxins spewing from its tailpipe.

And then there’s emotion-recognition technology. Toyota, working with Stanford University and Affective Media, has designed a car that reads your feelings: when you’re stressed, it responds by cooling the cabin and playing your favorite music so you can chill out. And to prevent accidents due to road rage, its headlights change color to indicate the driver’s mood. While sound is currently made visible in the form of captions or voiced descriptions for the hearing impaired, in the future there will be advanced transduction mechanisms that transform one sense into another.
For instance, we all know the euphoria of exercising, a side effect of certain endorphins and hormones. We can detect the feeling, but imagine detectors that pay attention to these kinds of signals and chemicals, substances that matter for your health and your state of mind. These feedback loops, through which people become informed about their state of well-being, constitute a sense we don’t have right now, but one that will be artificially created.
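In the simplest terms, such a feedback loop compares what a sensor reports against a healthy range and tells you where you stand. A minimal sketch of that idea (the sensor names, units, and thresholds below are invented for illustration, not taken from any real device):

```python
# Illustrative biosense feedback loop: compare readings from
# hypothetical body sensors against target ranges and report status.
HEALTHY_RANGE = {
    "glucose_mg_dl":  (70, 140),
    "heart_rate_bpm": (50, 100),
}

def check_reading(sensor, value):
    """Return a human-readable status for one sensor reading."""
    low, high = HEALTHY_RANGE[sensor]
    if value < low:
        return f"{sensor}: {value} LOW (below {low})"
    if value > high:
        return f"{sensor}: {value} HIGH (above {high})"
    return f"{sensor}: {value} ok"

# One pass of the loop over a batch of (invented) readings.
readings = {"glucose_mg_dl": 165, "heart_rate_bpm": 72}
for sensor, value in readings.items():
    print(check_reading(sensor, value))
```

Everything beyond this, the chemistry of the detectors, the wireless link, the on-demand alerts, is engineering layered on top of this basic compare-and-inform cycle.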