We create fit-for-purpose prosthesis control solutions. Acquiring a new skill, such as learning to use chopsticks, requires accurate motor commands to be sent from the brain to the hand and reliable sensory feedback from the hand to the brain. Over time and with training, the brain learns to handle this two-way communication flexibly and efficiently. Inspired by this sensorimotor interplay, our research is guided by the conviction that progress in prosthetic limb control is best achieved through a strong synergy of motor learning and sensory feedback. We therefore study the interaction of neural and behavioural processes that control hand movements, with the ultimate aim of creating digitally enabled prosthetic control solutions that users find fit for purpose. Specifically, we are developing:

a data-driven care model that enhances the experience of receiving a prosthesis;
novel methods and technologies that exploit the brain's flexibility in learning new skills for closed-loop prosthesis control;
efficient artificial intelligence algorithms for processing multi-modal data collected with hybrid sensors;
effective systems and stimulation paradigms to restore sensory feedback in prosthetic control.

Current research

Machine Learning for Prosthetic Hand Control

We have worked on several classes of upper-limb prosthesis controllers to design the most intuitive control interfaces. We have also developed new ways of combining multi-modal bio-signals to improve intuitive control of prosthetics. The main focus has been improving prosthesis control through machine learning methods.

Temporal convolutional networks for myoelectric control (submitted) 2024. PDF
AT-Myo: Arm translation in electromyography (submitted) 2024. PDF
Digital sensing systems for electromyography (submitted) 2024. PDF
Explainable AI-powered graph neural networks for HD EMG-based gesture intention recognition, IEEE Trans Cons Elec (in press), 2024. PDF
One-shot random forest model calibration for hand gesture decoding, J Neural Eng, 21(1):016006, 2024. PDF
Explainable and robust deep forests for EMG-force modeling, IEEE J Biomed Health Informatics, 27(6):2841-2852, 2023. PDF
Recalibration of myoelectric control with active learning, Front Neurorob, 16:277, 2022. PDF
Discrete action control for prosthetic digits, IEEE Trans Neural Sys Rehab Eng, 30:610-620, 2022. PDF
Multi-grip classification-based prosthesis control with two EMG-IMU sensors, IEEE Trans Neural Sys Rehab Eng, 28(2):508-518, 2020. PDF
Improved prosthetic hand control with concurrent use of myoelectric and inertial measurements, J NeuroEng Rehab, 14:71, 2017. PDF

Human Learning for Prosthetic Hand Control

When controlling a prosthesis, the patterns of neural and/or muscular activity can differ from those used to control the biological limb. We explore the extent to which this activity can deviate from the natural patterns employed in moving the biological arm and hand. We therefore examine whether prosthesis users can learn to synthesise new functional maps between muscles and prosthetic digits; for instance, whether, after a partial hand amputation, users can grasp an object by contracting a small group of muscles that do not naturally control the grasp. We have named this approach Abstract Decoding: the user learns to generate functional muscle activity patterns. This stands in contrast to pattern recognition or regression approaches, in which the prosthesis learns to identify movement intent by decoding EMG patterns without considering the user's capacity to learn.
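The abstract-decoding idea can be sketched in a few lines: a fixed, arbitrary map from muscle activity to prosthetic digit commands that the user, not the decoder, learns to drive. The sketch below is a minimal illustration under our own assumptions; the channel count, the random linear map, and the mean-absolute-value feature are illustrative choices, not the group's published decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 4  # illustrative: four surface-EMG channels
N_DIGITS = 2    # illustrative: two prosthetic digit commands

# In abstract decoding the muscle-to-digit map is fixed and arbitrary;
# it is the user who adapts, learning through practice to produce
# muscle activity patterns that drive the digits as intended.
W = rng.uniform(-1.0, 1.0, size=(N_DIGITS, N_CHANNELS))

def emg_envelope(raw_window: np.ndarray) -> np.ndarray:
    """Mean absolute value per channel over a window of raw EMG,
    a common amplitude feature in myoelectric control."""
    return np.mean(np.abs(raw_window), axis=0)

def decode(raw_window: np.ndarray) -> np.ndarray:
    """Map the muscle activity pattern to digit commands, clipped to
    an illustrative actuator range [0, 1] (0 = open, 1 = closed)."""
    return np.clip(W @ emg_envelope(raw_window), 0.0, 1.0)

# Simulated 200-sample window of 4-channel raw EMG
window = rng.normal(0.0, 0.2, size=(200, N_CHANNELS))
commands = decode(window)
print(commands.shape)  # one command per digit: (2,)
```

Because the map is fixed rather than fitted to natural EMG patterns, improvement over sessions reflects the user's motor learning, which is exactly what the abstract-decoding studies listed below measure.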
DistaNet: Grasp-specific distance biofeedback promotes the retention of myoelectric skills (submitted) 2024. PDF
Reducing motor variability enhances myoelectric control robustness across limb positions, IEEE Trans Neural Sys Rehab Eng, 32:23-32, 2024. PDF
Delaying feedback during pre-device training facilitates the retention of novel myoelectric skills, J Neural Eng, 20(3):036008, 2023. PDF
Learning, generalisation, scalability of abstract myoelectric control, IEEE Trans Neural Sys Rehab Eng, 28(7):1539-1547, 2020. PDF
Myoelectric control with abstract decoders, J Neural Eng, 15(5):056003, 2018. PDF
Artificial proprioceptive feedback for myoelectric control, IEEE Trans Neural Sys Rehab Eng, 23(3):498-507, 2014. PDF
Abstract and proportional myoelectric control for multi-fingered hand prostheses, Ann Biomed Eng, 41(12):2687-2698, 2013. PDF
Flexible cortical control of task-specific muscle synergies, J Neurosci, 32(36):12349-12360, 2012. PDF

Prosthetic Control Beyond the Laboratory

In current clinical practice, a prosthesis is fitted in the clinic. Once the user is home, the clinician cannot tell whether the prosthesis is being used or how well it is functioning. Statistics show that up to 44% of users abandon their prosthesis [Salminger et al., Disability & Rehab, 2020]. We aim to co-create the world's first Internet-enabled prosthetic hand, connecting the user and the clinic seamlessly. Secure data flow and artificial intelligence (AI) sit at the heart of this bidirectional communication link. The figure opposite illustrates our vision.

Internet of Things for beyond-the-laboratory prosthetics research, Phil Trans R Soc A, 380:20210005, 2022. PDF
Arduino-based myoelectric control: Towards longitudinal study of prosthesis use, Sensors, 21(3):763, 2021. PDF

Research Funding

Selected Projects

Bionics+: User-Centred Design and Usability of Bionic Devices, EPSRC (2021-2025)
A smart electrode housing to improve the control of upper limb myoelectric prostheses, NIHR (2021-2024)
Sensorimotor learning for control of prosthetic limbs, EPSRC (2018-2024)

Facilities

Laboratory Equipment

Electrophysiology
Blackrock Microsystems: Neuroport (FDA), Cerebus, CereStim, Utah Array, Pneumatic Inserter
A-M Systems: Differential AC Amplifier (Model 1700), Isolated Pulse Stimulator (Model 2100)
Delsys: Trigno Avanti Mobile (CE)
Digitimer: D360 Amplifier (CE), DS7A stimulators (CE, FDA)

Prosthetic Hands
COVVI Ltd: NEXUS (CE)
Össur: RoboLimb (CE)

Other
Turntable: Crayfish 60
Cyberglove: Cyberglove II
Setups for Outcome Measurement: SHAP, Box and Blocks

This article was published on 2024-10-15