Applied Science and Engineering Laboratories

The Laboratories


The Applied Science and Engineering Laboratories endeavor to help individuals with disabilities gain the full benefit of new technologies. This page describes many of the projects currently under way in our laboratories.

Robotics Lab

Signal Processing Techniques for Tremor Suppression: A system that suppresses the effects of hand/arm tremor for persons with this type of disability. The system is based on adaptive equalizers designed to attenuate the frequency components of the tremor signal while keeping the intentional movement as intact as possible.
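To illustrate the filtering idea, the sketch below attenuates a simulated tremor with a fixed second-order IIR notch filter. The 5 Hz tremor frequency, the sample rate, and the signals are all invented for the example, and the real system uses adaptive rather than fixed filters; this is only a minimal sketch of frequency-selective attenuation.

```python
import math

def notch_coeffs(f0, fs, q=2.0):
    # Second-order IIR notch centered at f0 (Hz) for sample rate fs (Hz).
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [1.0, -2 * math.cos(w0), 1.0]
    a = [1 + alpha, -2 * math.cos(w0), 1 - alpha]
    # Normalize so the leading denominator coefficient is 1.
    return [bi / a[0] for bi in b], [ai / a[0] for ai in a]

def filter_signal(x, b, a):
    # Direct-form difference equation for a biquad.
    y = []
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(3) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, 3) if n - k >= 0)
        y.append(acc)
    return y

# Simulated movement: slow intentional drift plus a 5 Hz tremor (made up).
fs = 100.0
t = [n / fs for n in range(300)]
intent = [0.5 * ti for ti in t]
tremor = [0.3 * math.sin(2 * math.pi * 5.0 * ti) for ti in t]
x = [i + tr for i, tr in zip(intent, tremor)]

b, a = notch_coeffs(5.0, fs)
y = filter_signal(x, b, a)
```

After the filter's transient settles, the output tracks the slow intentional component while the tremor component is largely removed; an adaptive version would instead estimate the tremor frequency online.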

Enhanced Sensory Feedback Control of an Assistive Robot: For individuals with upper-extremity motor disabilities, a head-stick is a simple and intuitive means of performing manipulations because it provides direct proprioceptive information to the user. An interface to a robot system which emulates the proprioceptive qualities of a traditional head-stick while also allowing for augmented end-effector ranges of force and motion is being developed and tested.

Powered Robotic Orthoses for People with Muscular Dystrophy: The powered orthosis is designed to allow people with muscular dystrophy, muscular atrophy or ALS (Lou Gehrig's disease) to move their arms. In this project, an anti-gravity device designed for use in an orthosis has been developed.

Vocational Robotic Workcell: The Vocational Robotics project focuses on evaluating the role that robotics can play in helping to expand employment opportunities for individuals with manipulation disabilities. The three components of the project are assessment and training, system integration, and job identification and analysis. The assessment and training component of the project is conducted in conjunction with the Easter Seals Society of DelMar.

Multimodal User Supervised Interface and Intelligent Control (MUSIIC): A robot arm is controlled by voice commands integrated with gestures (pointing with a laser) to pick up and move objects in the robot's work space. Such a system enables the user to give commands like "Put that [points to object] there [points to destination]" to move objects in the environment. Stereo vision is used to locate objects and places that are indicated by the spot of laser light. An intelligent planning system interprets the voice commands and directs the robot's actions.
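The deictic resolution step can be sketched as a toy interpreter that substitutes laser-pointed locations for words like "that" and "there". The coordinates below stand in for output of a hypothetical stereo-vision module; the actual MUSIIC planner is far more sophisticated than this sketch.

```python
def interpret(command, pointed):
    # Resolve deictic words against the queue of laser-pointed
    # locations, consumed in the order the user pointed.
    pointed = list(pointed)
    resolved = []
    for word in command.lower().split():
        if word in ("that", "this", "there", "here"):
            resolved.append(pointed.pop(0))  # substitute next pointed spot
        else:
            resolved.append(word)
    return resolved

# "Put that there" with two pointing gestures (invented 2-D coordinates).
plan = interpret("put that there", [(0.3, 0.1), (0.8, 0.4)])
```

The resulting plan, ["put", (0.3, 0.1), (0.8, 0.4)], is the kind of grounded command an intelligent planner could then translate into robot motions.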

Rapid Prototyping for Rehabilitation Devices: An application will be demonstrated which allows virtual prototyping of linkages in JACK by generating fully articulated geometric models of manipulators from user-input DH parameters. The user can interactively alter the DH parameters and instantly see the results of the alteration in JACK. In addition to the automatic generation of DH models, the application allows both joint-based and Cartesian-based control of the models via a Spaceball interface device.
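A minimal sketch of how a kinematic model follows from DH parameters: each row (theta, d, a, alpha) yields one link transform, and chaining the transforms gives the end-effector pose. The two-link planar arm at the bottom is a hypothetical example, not a JACK model.

```python
import math

def dh_matrix(theta, d, a, alpha):
    # Standard Denavit-Hartenberg homogeneous transform for one link.
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_rows):
    # Chain per-link transforms; returns the base-to-end-effector matrix.
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for row in dh_rows:
        T = mat_mul(T, dh_matrix(*row))
    return T

# Hypothetical two-link planar arm: unit-length links, joints at 90 and 0 deg.
T = forward_kinematics([(math.pi / 2, 0.0, 1.0, 0.0),
                        (0.0,         0.0, 1.0, 0.0)])
x, y = T[0][3], T[1][3]
```

Changing a DH parameter and recomputing the chain is exactly the interactive edit-and-see loop the JACK application provides, with rendering in place of the printout.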

Consumer Innovation - AutoVac, a robotic vacuum cleaner: The AutoVac project, as part of ASEL's consumer innovation laboratory, was inspired by and is directed by consumers who are full members of design teams. The goal of the laboratory is to involve consumers in the design of new products and devices with the expectation that items produced will be better suited to the needs and abilities of people who may have some physical limitations. The prototype vacuum cleaner unit is directly controlled by the user via a wireless remote; radio equipment is used to control direction, speed, and independent switching of the vacuum motor and rotating brush motor. The unit is battery operated, and can clean for approximately 1 hour before recharging is necessary. A "docking station" has been developed to provide accessible recharging. The unit is intended for home use, and its cleaning performance is comparable with conventional consumer-level units.

A Body Powered Rehabilitation Robot: A gravity-balanced, mechanical arm is currently under development. Such an arm, whose end-point (gripper) is controlled and powered by a functional body part of the user, offers an intuitive means for a person with no arm function to interact with the environment.

Haptic Visualization for Science, Engineering, and Math: The PHANToM Haptic Interface is a novel three-degree-of-freedom force feedback device that allows the user to feel virtual environments with a fingertip. A method of feeling two- and three-dimensional data sets is being developed.
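One simple way to let a fingertip "feel" a data set is penalty-based rendering: treat the 2-D data as a height field and push back with a spring force when the probe dips below the surface. The data function and stiffness below are invented for illustration and are not the PHANToM's actual rendering algorithm.

```python
def data_height(x, y):
    # Hypothetical 2-D data set rendered as a height field: a smooth bump.
    return 1.0 / (1.0 + x * x + y * y)

def contact_force(px, py, pz, k=200.0):
    # Penalty-based haptic rendering: when the probe tip penetrates below
    # the data surface, push back with a spring force along +z.
    depth = data_height(px, py) - pz
    return k * depth if depth > 0 else 0.0

# Probe above the surface feels nothing; pressing into it feels resistance.
free = contact_force(0.0, 0.0, 1.5)     # surface height at origin is 1.0
pressed = contact_force(0.0, 0.0, 0.9)  # 0.1 units of penetration
```

Run at the device's servo rate (commonly around 1 kHz for haptics), this force profile lets the user trace peaks and valleys in the data by touch.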


Speech Research Laboratory

Computerized Speech Assessment: Software that assists clinicians in evaluating the speech of children, presenting assessment results along with complete information about the patient.

Hidden Markov Modeling for Augmentative Communication: Hidden Markov Modeling (HMM) is being used to develop recognition techniques for use in an augmentative communication device.

Nemours Dysarthric Speech Database: To aid basic research into speech from dysarthric talkers, a database of recorded speech materials has been developed, along with measures of the intelligibility of the speech.

Text-to-Speech Synthesis: The goal of this work is to produce natural-sounding synthetic speech. A phonetic "script" describing how the text would be pronounced by an English speaker is generated, and speech is then synthesized from that script.

Speaker Modeling through Automatic Extraction of Diphones: Because it is relatively easy to record and "capture" a voice, this project will make it possible to preserve the voice of someone who is in the process of losing their voice due to illness.


Gesture Recognition Lab

Neural Network Recognition of Gesture: CyberGloves and Ascension Bird sensors are used to input the biomechanical data. CyberGloves transmit joint-angle data; they represent handshapes as a set of eighteen numbers. Each Bird sensor reports where it is with respect to a transmitter; this information is used to determine where the Birds are with respect to each other. A user-dependent fingerspelling recognizer has been developed which achieves over 96% accuracy. The neural networks were tested on instances of fingerspelling distinct from those on which they were trained. The nets achieve about 60% accuracy when tested on other people's data.
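The flavor of such a recognizer can be sketched with the smallest possible network: a single logistic unit trained on invented 18-angle handshape vectors for two letters. The data and the tiny model are purely illustrative; the real system uses larger networks and actual CyberGlove readings.

```python
import math
import random

random.seed(0)

def sample(shape):
    # Invented handshape data: 18 joint angles normalized to [0, 1].
    # "A" (fist) has large flexion values, "B" (flat hand) small ones.
    base = 0.9 if shape == "A" else 0.1
    return [min(1.0, max(0.0, base + random.uniform(-0.05, 0.05)))
            for _ in range(18)]

train = [(sample("A"), 1.0) for _ in range(20)] + \
        [(sample("B"), 0.0) for _ in range(20)]

# One logistic unit trained by stochastic gradient descent.
w = [0.0] * 18
b = 0.0
lr = 0.5
for _ in range(200):
    for x, target in train:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        p = 1.0 / (1.0 + math.exp(-z))
        g = p - target                      # gradient of log loss w.r.t. z
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

def classify(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "A" if z > 0 else "B"
```

A full fingerspelling recognizer replaces the single unit with a multi-layer network and one output per letter, but the train-then-generalize loop is the same.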

Sign Language Synthesis: Work on sign synthesis for American Sign Language (ASL) is currently in progress. A 3-D virtual signer has been developed. She can be "driven" by CyberGlove and Bird data, by recordings of such data, or by phonetic descriptions. A system of phonetic descriptions of ASL signs, loosely based on Liddell and Johnson's sign language decomposition analysis, has been developed. The virtual signer now has a "vocabulary" of over 100 signs. She has her own distinct "accent," but she is intelligible.


Systems Development and Evaluation Lab

Virtual Interaction: Virtual Interaction is a low-immersion virtual reality system intended to provide recreation and exercise for individuals with disabilities. The user's live video image becomes part of a computer-generated display. The user's image can manipulate the graphical objects around it, so the user essentially becomes part of a computer game. The system is engaging but, unlike video games, requires more physical activity than a few finger movements. Applications for this technology can be adapted to emphasize or compensate for the abilities of a particular individual. Several programs have been written and will be used to demonstrate the technology.


Vestibular Stimulation Lab

Vestibular Stimulation: This research project investigates the effect of vertical motion on children with cerebral palsy. In a pilot study, 10 subjects were tested before and after riding on a platform that moves up and down in a motion similar to that of a trotting horse. Results indicated that a majority of the subjects may have experienced a decrease in lower-extremity spasticity following 15 minutes of the motion. Present research continues to investigate the effects of vertical motion: at least 30 subjects will be tested, and spasticity, balance, and respiratory effects are being measured.




URL of this document: http://www.asel.udel.edu/about/labtour.html
Last updated: September 16, 1996
Copyright © Applied Science and Engineering Laboratories, 1996.
