
Robotic Manipulation

"Intent-Aware Shared Control for Dexterous Grasp–Release Manipulation" 

Role: Robotics Research Assistant

I developed an intent-aware shared-control framework for a simulated 7-DOF robotic manipulator that integrates real-time neural intent decoding with task-phase–specific control policies for dexterous grasp and release. The system was designed to address a core challenge in human-in-the-loop manipulation: how a robotic arm can infer both user intent and manipulation phase from noisy biosignals, and use that information to adapt control authority during contact-rich tasks.


The control architecture explicitly models task-phase segmentation, distinguishing reach, grasp initiation, force modulation, and release. Motor intent was decoded online from non-invasive EEG using a linear SVM optimized for robustness and interpretability under low-SNR neural signals. Decoded intent was combined with proprioceptive and interaction force feedback to modulate torque commands during grasp and release, enabling phase-specific assistance rather than continuous automation.
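The decoding stage described above can be sketched as a minimal pipeline: band-power features from mu/beta-filtered EEG fed to a linear SVM. This is an illustrative reconstruction, not the project's actual code; the sampling rate, frequency band, filter order, and the synthetic stand-in data are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

FS = 250  # Hz, assumed EEG sampling rate

def bandpower_features(epochs, low=8.0, high=30.0, fs=FS):
    """Log-variance of mu/beta band-filtered EEG, one feature per channel.

    epochs: array of shape (n_trials, n_channels, n_samples)
    """
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)
    return np.log(np.var(filtered, axis=-1))

# Synthetic stand-in data: "grasp intent" trials carry extra 11 Hz (mu-band)
# power, so the two classes differ in band power as motor-imagery EEG would.
rng = np.random.default_rng(0)
n_trials, n_ch, n_samp = 80, 8, 2 * FS
t = np.arange(n_samp) / FS
rest = rng.standard_normal((n_trials, n_ch, n_samp))
grasp = rng.standard_normal((n_trials, n_ch, n_samp)) + 2.0 * np.sin(2 * np.pi * 11 * t)

X = bandpower_features(np.concatenate([rest, grasp]))
y = np.concatenate([np.zeros(n_trials), np.ones(n_trials)])

# Linear SVM: chosen (as in the project) for robustness and
# interpretability under low-SNR neural signals.
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```

In a closed-loop system the same `bandpower_features` transform would run on a sliding window of streamed EEG, with `clf.decision_function` supplying a graded confidence for the shared controller rather than a hard label.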

This shared-control strategy dynamically reallocates control authority between the user and the robot based on inferred intent and task context, improving grasp stability while preserving user agency. In closed-loop simulation, the system achieved 75% real-time intent decoding accuracy with approximately 1.2 s end-to-end latency, sufficient for responsive grasp–release manipulation. Compared to intent-only control, phase-aware shared control improved grasp success and release reliability by approximately 30%, demonstrating the value of integrating neural intent with task-level structure and proprioceptive feedback.
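The reallocation of control authority can be summarized with a standard linear blending law, tau = (1 - alpha) * tau_user + alpha * tau_assist, where alpha depends on the inferred task phase and the decoder's confidence. A minimal sketch follows; the phase names mirror the segmentation above, but the gain values and function names are illustrative assumptions, not the system's tuned parameters.

```python
import numpy as np

# Phase-specific assist gains (illustrative values, not from the project):
# more robot authority during contact-rich phases, less during free reaching.
ASSIST_GAIN = {"reach": 0.2, "grasp_init": 0.5, "force_mod": 0.7, "release": 0.6}

def blend_torque(tau_user, tau_assist, phase, intent_conf):
    """Shared-control torque for a 7-DOF arm.

    tau = (1 - alpha) * tau_user + alpha * tau_assist, where
    alpha = phase gain * decoder confidence. The robot only takes
    authority when intent is clear AND the phase warrants assistance,
    preserving user agency in ambiguous moments.
    """
    alpha = ASSIST_GAIN[phase] * float(np.clip(intent_conf, 0.0, 1.0))
    return (1.0 - alpha) * np.asarray(tau_user) + alpha * np.asarray(tau_assist)

# During force modulation with fully confident intent, 70% of each joint
# torque comes from the assist policy.
tau = blend_torque([1.0] * 7, [0.0] * 7, "force_mod", intent_conf=1.0)
print(tau)
```

Because alpha scales with decoder confidence, a low-confidence prediction degrades gracefully toward pure user control instead of triggering unwanted automation.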

Overall, this project reframes neural interfaces not as command generators, but as contextual signals within a manipulation-centric shared-control framework, aligning neural decoding with classical problems in dexterous manipulation, force modulation, and human-robot interaction.

Research Outputs:

Conference Presentations:

1. Aydogan, O. E.*, Ding, C., Dossa, R. F. J., Arulkumaran, K., Yanagisawa, T., "Closed-Loop EEG-Based BMI for Real-Time Assistive Robotic Arm Control via Motor Intent Decoding," 12th IEEE EMBS Conference on Neural Engineering (NER 2025), San Diego, US, November 2025


2. Aydogan, O. E.*, Ding, C., Yanagisawa, T., "Real-Time EEG-Based Brain–Machine Interface for Robotic Arm Control Using Motor Imagery," poster presentation at the 1st International Symposium on Decoded Neurofeedback (DecNef 2025), Nara, Japan, July 2025


3. Aydogan, O. E., "Advancing Brain–Computer Interfaces (BCI): Overcoming Challenges in Transfer Learning," IEEE EMBS SAC Summer Camp, September 2024

Research Support:

Funding:

This research is supported by the JST Moonshot Research and Development Program.


© 2023 by Ozgur Ege Aydogan
