For people who have suffered neurotrauma such as a stroke, diminished strength in one or both upper limbs can make everyday tasks extremely challenging. These difficulties have spurred the development of robotic devices that help people regain lost capability. However, the rigidity of many assistive devices makes them poorly suited to more delicate tasks such as playing a musical instrument.
A first-of-its-kind robotic glove is lending a "hand" and providing hope to piano players who have suffered a disabling stroke. Researchers from Florida Atlantic University's College of Engineering and Computer Science developed the soft robotic hand exoskeleton, which uses artificial intelligence to improve hand dexterity.
It is the first robotic glove to combine artificial intelligence, flexible tactile sensors, and soft actuators in a single hand exoskeleton that can "feel" the difference between correct and incorrect renditions of the same song.
According to senior author Erik Engeberg, Ph.D., who is also a member of the FAU Center for Complex Systems and Brain Sciences and the FAU Stiles-Nicholson Brain Institute, "Playing the piano requires complex and highly skilled movements, and relearning involves the restoration and retraining of specific movements or skills." The robotic glove is made of sensors and soft, flexible materials that provide gentle support to people who want to relearn and regain their motor skills.
The researchers embedded special sensor arrays in each fingertip of the robotic glove. Unlike previous exoskeletons, this new technology applies precise force and guidance to restore the delicate finger movements required for piano playing. By monitoring and responding to the user's movements, the glove provides real-time feedback and adjustments, making it easier to learn correct movement techniques.
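The article does not describe the glove's control software, but the idea of monitoring fingertip readings and gently correcting toward a target can be sketched in a few lines. The following Python snippet is purely illustrative: `read_fingertip_forces`, `set_actuator_pressures`, the target force, and the loop rate are all assumptions, not details from the study.

```python
# Hypothetical sketch of a real-time feedback loop: compare fingertip forces
# against an assumed target key-press force and nudge the five soft actuators.
import time

TARGET_FORCE_N = 1.5   # assumed target fingertip force for a key press
GAIN = 0.2             # proportional correction gain (illustrative)

def read_fingertip_forces():
    """Stub: return one force reading (newtons) per fingertip sensor array."""
    return [1.2, 1.6, 1.4, 1.5, 1.3]

def set_actuator_pressures(corrections):
    """Stub: adjust the five soft actuators by the given correction terms."""
    pass

def feedback_step():
    forces = read_fingertip_forces()
    # Proportional adjustment toward the target press force for each finger.
    corrections = [GAIN * (TARGET_FORCE_N - f) for f in forces]
    set_actuator_pressures(corrections)
    return corrections

if __name__ == "__main__":
    for _ in range(3):       # a few illustrative control cycles
        print(feedback_step())
        time.sleep(0.01)     # ~100 Hz loop rate, also an assumption
```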
To demonstrate the glove's capabilities, the researchers programmed it to distinguish between correct and incorrect piano renditions of the well-known song "Mary Had a Little Lamb." They created a pool of 12 distinct types of errors that could occur at the beginning or end of a note, or that stemmed from timing that was either premature or delayed by 0.1, 0.2, or 0.3 seconds. In all, there were 10 song variations: three groups of three error variations each, plus the correct song played without errors.
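To make the variation scheme concrete, the short Python sketch below generates labeled song variations by shifting note onsets early or late by 0.1 to 0.3 seconds. The note list and the way errors are sampled are invented for illustration; they are not the study's actual stimuli.

```python
# Illustrative generation of labeled song variations with timing errors.
import random

CORRECT_SONG = [(0.0, "E"), (0.5, "D"), (1.0, "C"), (1.5, "D"),
                (2.0, "E"), (2.5, "E"), (3.0, "E")]  # opening notes (onset in seconds, pitch)
OFFSETS = [0.1, 0.2, 0.3]  # timing error magnitudes described in the text

def make_variation(song, n_errors=2, seed=None):
    """Return (variation, label): shift a few note onsets early or late."""
    rng = random.Random(seed)
    variation = list(song)
    for idx in rng.sample(range(len(song)), n_errors):
        onset, pitch = variation[idx]
        shift = rng.choice(OFFSETS) * rng.choice([-1, 1])  # premature or delayed
        variation[idx] = (round(onset + shift, 2), pitch)
    return variation, "incorrect"

# One correct rendition plus nine error variants, mirroring the 10 variations above.
dataset = [(CORRECT_SONG, "correct")]
dataset += [make_variation(CORRECT_SONG, seed=s) for s in range(9)]
```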
Data from the fingertips' tactile sensors were used to train Random Forest (RF), K-Nearest Neighbor (KNN), and Artificial Neural Network (ANN) algorithms to classify the song variations. The robotic glove was used both on its own and while worn by a person to "feel" the differences between the correct and incorrect versions of the song, and the classification accuracy of the algorithms was compared for both conditions.
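A minimal sketch of this kind of three-way classifier comparison is shown below using scikit-learn. The synthetic features stand in for the fingertip tactile recordings, which are not reproduced here, and the model settings are assumptions rather than the study's configuration.

```python
# Compare RF, KNN, and ANN classifiers on synthetic stand-in tactile features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Stand-in data: 500 trials x 5 fingertip-sensor channels, 10 variation labels.
y = np.repeat(np.arange(10), 50)
X = rng.normal(size=(500, 5)) + y[:, None] * 0.5

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "ANN": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))
```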
The study's findings, published in the journal Frontiers in Robotics and AI, showed that the ANN algorithm had the highest classification accuracy: 97.13 percent with a human subject and 94.60 percent without one. The algorithm successfully determined the percentage of errors in a particular song and identified key presses that were out of time. These results highlight the smart robotic glove's potential to help people with disabilities relearn dexterous skills such as playing musical instruments.
The robotic glove was fabricated using hydrogel casting and 3D-printed polyvinyl acid stents, integrating five actuators into a single wearable device that conforms to the user's hand. The fabrication process is novel because the form factor can be tailored to each patient's individual anatomy using 3D scanning technology or CT scans.
According to Engeberg, "Our design is significantly simpler than most designs because all of the actuators and sensors are combined into a single molding process." Importantly, although this study focused on playing a song, the approach could be applied to a wide range of everyday tasks, and the device could facilitate intricate, patient-specific rehabilitation programs.
Doctors could use the data to develop individualized plans that pinpoint a patient's weaknesses, which may appear as sections of a song that are consistently played incorrectly, and to determine which motor functions need improvement. As patients improve, the rehabilitation team could prescribe progressively more challenging songs in a game-like progression, providing a customizable path to recovery.
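A tiny, hypothetical example of the kind of analysis this implies: count which sections of a song a patient repeatedly plays incorrectly across practice sessions so a clinician can target them. The session records and the threshold below are invented for illustration.

```python
# Find song sections that are persistently played incorrectly across sessions.
from collections import Counter

# Each session lists the song sections (by index) classified as incorrect.
sessions = [
    {"incorrect_sections": [2, 5]},
    {"incorrect_sections": [2, 7]},
    {"incorrect_sections": [2, 5, 7]},
]

error_counts = Counter(sec for s in sessions for sec in s["incorrect_sections"])
persistent = [sec for sec, n in error_counts.items() if n >= 2]  # assumed threshold
print("Sections to target in the next rehabilitation plan:", sorted(persistent))
```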
Stella Batalama, Ph.D., dean of the FAU College of Engineering and Computer Science, said, "The technology developed by professor Engeberg and the research team is truly a gamechanger for individuals with neuromuscular disorders and reduced limb functionality. Although other soft robotic actuators have been used to play the piano, our robotic glove is the only one that has demonstrated the capability to 'feel' the difference between correct and incorrect versions of the same song."
Co-authors include first author Maohua Lin, a Ph.D. student; Rudy Paul, a graduate student; and Moaed Abd, Ph.D., a recent graduate, all of the FAU College of Engineering and Computer Science; James Jones, a student at Boise State University; Darryl Dieujuste, a former researcher at the FAU College of Engineering and Computer Science; and Harvey Chim, M.D., a professor in the Division of Plastic and Reconstructive Surgery at the University of Florida.
This research was supported by the National Science Foundation, the National Institute on Aging, and the National Institute of Biomedical Imaging and Bioengineering (NIBIB) of the National Institutes of Health (NIH). The study was also funded in part by a seed award from the FAU Institute for Sensing and Embedded Network Systems Engineering (I-SENSE) and the FAU College of Engineering and Computer Science.