YaBeSH Engineering and Technology Library


    Design and Implementation of a Behavioral Sequence Framework for Human–Robot Interaction Utilizing Brain-Computer Interface and Haptic Feedback

    Source: Journal of Engineering and Science in Medical Diagnostics and Therapy, 2023, Volume 6, Issue 4, page 41003-1
    Author: Hazra, Sudip; Whitaker, Shane; Shiakolas, Panos S.
    DOI: 10.1115/1.4062341
    Publisher: The American Society of Mechanical Engineers (ASME)
    Abstract: In assistive robotics, research in Brain-Computer Interfaces aims to understand human intent to enhance Human–Robot interaction and augment human performance. In this research, a framework to enable a person with an upper limb disability to use an assistive system toward maintaining self-reliance is introduced, and its implementation and evaluation are discussed. The framework interlinks functional components and establishes a behavioral sequence to operate the assistive system in three stages: action classification, verification, and execution. An action is classified based on identified human intent and verified through haptic and/or visual feedback before execution. The human intent is conveyed through facial expressions and verified through head movements. The interlinked functional components are an electroencephalogram (EEG) sensing device, a head movement recorder, a dual-purpose glove, a visual feedback environment, and a robotic arm. Five volunteers were recruited to evaluate the ability of the system to recognize a facial expression, the time required to respond using head movements, the ability to convey information through vibrotactile feedback effects, and the ability to follow the established behavioral sequence. Based on the evaluation, a personalized training dataset should be used to calibrate facial expression recognition and to define the time required to respond during verification. Custom vibrotactile effects were effective in conveying system information to the user. The volunteers were able to follow the behavioral sequence and control the system with a success rate of 80.00%, providing confidence to recruit more volunteers to identify and address improvements and to expand the operational capability of the framework.
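    The abstract describes a staged architecture: intent is classified from EEG-detected facial expressions, the classified action is verified through head movements after haptic and/or visual feedback, and only then is it executed on the robotic arm. The Python sketch below is not taken from the paper; it is a minimal illustration of how such a classification-verification-execution sequence could be structured. The names classify_intent, verify_with_head_movement, and execute_on_robot are hypothetical stand-ins for the framework's actual components, and the 0.8 acceptance probability only echoes the 80.00% success rate reported in the abstract.

    from enum import Enum, auto
    import random

    class Stage(Enum):
        """Three stages of the behavioral sequence described in the abstract."""
        CLASSIFICATION = auto()
        VERIFICATION = auto()
        EXECUTION = auto()

    def classify_intent(eeg_sample):
        """Hypothetical stand-in: infer the intended action from EEG-detected facial expressions."""
        return "grasp_object"

    def verify_with_head_movement(action):
        """Hypothetical stand-in: present a haptic/visual cue and wait for a confirming head movement."""
        print(f"Cue issued: confirm '{action}' with a head movement")
        return random.random() < 0.8  # placeholder echoing the reported 80.00% success rate

    def execute_on_robot(action):
        """Hypothetical stand-in: send the verified action to the robotic arm."""
        print(f"Robotic arm executing: {action}")

    def run_behavioral_sequence(eeg_sample):
        """Run one pass of the classification -> verification -> execution sequence."""
        stage = Stage.CLASSIFICATION
        action = None
        while True:
            if stage is Stage.CLASSIFICATION:
                action = classify_intent(eeg_sample)
                stage = Stage.VERIFICATION
            elif stage is Stage.VERIFICATION:
                if verify_with_head_movement(action):
                    stage = Stage.EXECUTION
                else:
                    print("Verification failed; action discarded")
                    return False
            elif stage is Stage.EXECUTION:
                execute_on_robot(action)
                return True

    if __name__ == "__main__":
        run_behavioral_sequence(eeg_sample=None)

    Keeping verification as an explicit stage before execution mirrors the safety requirement implied by the abstract: no action reaches the robotic arm unless the user confirms it.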
    • Download: (2.875 MB)
    • Get RIS
    • Item Order
    • Go To Publisher
    • Price: 5000 Rial
    • Statistics


    URI
    http://yetl.yabesh.ir/yetl1/handle/yetl/4294615
    Collections
    • Journal of Engineering and Science in Medical Diagnostics and Therapy

    Show full item record

    contributor author: Hazra, Sudip
    contributor author: Whitaker, Shane
    contributor author: Shiakolas, Panos S.
    date accessioned: 2023-11-29T19:09:11Z
    date available: 2023-11-29T19:09:11Z
    date copyright: 2023-05-15
    date issued: 2023-05-15
    identifier issn: 2572-7958
    identifier other: jesmdt_006_04_041003.pdf
    identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4294615
    publisher: The American Society of Mechanical Engineers (ASME)
    title: Design and Implementation of a Behavioral Sequence Framework for Human–Robot Interaction Utilizing Brain-Computer Interface and Haptic Feedback
    type: Journal Paper
    journal volume: 6
    journal issue: 4
    journal title: Journal of Engineering and Science in Medical Diagnostics and Therapy
    identifier doi: 10.1115/1.4062341
    journal first page: 41003-1
    journal last page: 41003-11
    pages: 11
    tree: Journal of Engineering and Science in Medical Diagnostics and Therapy; 2023; Volume 6; Issue 4
    content type: Fulltext