Abstract In recent years, there has been considerable progress in the development of brain-controlled mobile robots and robotic arms. Advances in electroencephalography (EEG) technology have made it possible to operate external equipment directly from the brain. A brain-computer interface (BCI) enables communication between the brain and an external device, and BCIs are used in many applications such as security, education, neuro-marketing, entertainment, and medicine. In this thesis, a mobile robot is controlled by identifying commands decoded from EEG signals. The robot's direction is changed according to the measured attention level, and eye blinking provides an alternative way to steer the robot to the left or right. The full pipeline is implemented to achieve this goal and to verify the correctness of the desired behavior: the signals are recorded from the scalp, then processed and classified before the corresponding action is executed. For the classification step, a deep learning one-dimensional convolutional neural network (1D-CNN) is applied, which outperformed several classical machine learning (ML) models. The results show that the 1D-CNN is the most suitable deep learning (DL) method for detecting the attention level, achieving a testing accuracy of 95% for controlling the robot's motor direction. This thesis evaluates several ML and DL algorithms for classifying brain signals, namely Multilayer Perceptron (MLP), Logistic Regression, Support Vector Machine (SVM), Naive Bayes, and 1D-CNN, with accuracies of 93.2%, 62.9%, 70.6%, 44.9%, and 95%, respectively.
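The abstract's classification step can be illustrated with a minimal sketch of a 1D-CNN forward pass. Note that this is only an illustrative toy, written in plain NumPy with randomly initialised weights; the filter count (8), kernel width (5), window length (256 samples), and the conv → ReLU → global-average-pool → sigmoid layout are assumptions for illustration, not the thesis's actual trained architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels, bias):
    """Valid 1D convolution: x (T,), kernels (F, K), bias (F,) -> (F, T-K+1)."""
    F, K = kernels.shape
    T = x.shape[0]
    out = np.empty((F, T - K + 1))
    for f in range(F):
        for t in range(T - K + 1):
            out[f, t] = x[t:t + K] @ kernels[f] + bias[f]
    return out

def forward(x, params):
    """Conv -> ReLU -> global average pooling -> dense -> sigmoid."""
    h = np.maximum(conv1d(x, params["W_conv"], params["b_conv"]), 0.0)  # ReLU
    pooled = h.mean(axis=1)                       # global average pooling
    logit = pooled @ params["w_out"] + params["b_out"]
    return 1.0 / (1.0 + np.exp(-logit))           # sigmoid: P(high attention)

# Toy, randomly initialised parameters: 8 filters of width 5.
params = {
    "W_conv": rng.normal(scale=0.1, size=(8, 5)),
    "b_conv": np.zeros(8),
    "w_out": rng.normal(scale=0.1, size=8),
    "b_out": 0.0,
}

# One synthetic 256-sample EEG window standing in for a real scalp recording.
x = rng.normal(size=256)
p_attention = forward(x, params)
print(f"P(high attention) = {p_attention:.3f}")
```

In practice the weights would be learned from labelled EEG windows, and the sigmoid output thresholded to decide whether the attention level is high enough to drive the robot's motors forward.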