User Guide AAL-Band 2.0
The AAL-Band is intended to provide a machine learning interface between human motion and a variety of applications or machines.
The technology, originally patented by Aalborg University, Denmark, was developed on the biomechanical principles of human motion. Voluntary limb movement is governed by a set of muscles, and with each type of movement the muscles' shape changes. The sensor detects human motion intention by reading this shape change at different arm positions. It does so using single or combined sensor bands composed of Force Sensing Resistors (FSRs). A machine learning technique is used to classify and recognize motions. The sensor outputs are classified motion types and rated muscle efforts, information useful for robot control, computer interfaces, and Virtual Reality systems (Islam and Bai, 2019).
Through this guide, you will first learn how to place the AAL-Band on your arm and how to interact with the band. Then a step-by-step guide shows how to get started with the AAL-Band and the gesture recognition application. Further details are covered in Chapter 4.
To download the software, go to this URL:
2.2 Interactions with the AAL-Band
The AAL-Band has two buttons, on/reset and off, and a light that indicates the status of the band, as shown in Figure 2.2.
Step 1: Turn on the AAL-Band by pressing the on button, as illustrated in Figure 2.2.
Step 2: Turn on Bluetooth on your computer and search for the AAL-Band by pressing "Add Bluetooth or another device", as seen in Figure 3.1. If you need more information on how to connect to Bluetooth, go to https://support.microsoft.com/en-us/help/15290/windows-connect-bluetooth-device.
Figure 3.1: Connecting to Bluetooth.
Step 3: Open the application; the start screen can be seen in Figure 3.2.
Figure 3.2: Start screen of the application.
Step 4: Press the "Connect" button on the start screen. Under the button, a light will indicate whether the connection is successful. In Figure 3.3a the light is yellow, meaning the band is trying to connect. When the light turns green, as seen in Figure 3.3b, the connection is complete and the next step of the setup, calibration, can begin. If it is not possible to connect, the light will turn red to report that an error has occurred. If red, try to reset the AAL-Band by pressing the on button or reset your Bluetooth.
Figure 3.3: Connection of the AAL-Band; (b) shows the AAL-Band connected.
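The application handles all Bluetooth communication for you. If you instead want to read from the band in your own scripts, and assuming the paired AAL-Band exposes a standard serial-over-Bluetooth (SPP) COM port (an assumption; this guide does not document the wire protocol), a minimal Python sketch using the pyserial package could look like this, where the port name and baud rate are placeholders:

    import serial  # pip install pyserial

    # ASSUMPTIONS: the paired band appears as a serial-over-Bluetooth COM
    # port; "COM5" and 115200 baud are placeholders, not documented values.
    with serial.Serial("COM5", baudrate=115200, timeout=1.0) as band:
        for _ in range(100):                 # read a short burst, then stop
            line = band.readline().decode(errors="replace").strip()
            if line:
                print(line)                  # raw sensor line; format undocumented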
Now that the AAL-Band is connected to the application, it is possible to calibrate the band. A calibration is needed to test out the muscle activities and arm motions.
Step 1: Make a fist with the hand on which you have placed the AAL-Band.
Step 2: Press the "Calibrate" button, see Figure 3.4. The calibration takes three seconds, and during the calibration the light under the button will be yellow. When the light turns green, the calibration is done and you can relax your hand.
Step 3: If the calibration was faulty, you can press the "Reset calibration" button, see Figure 3.5, and do a new calibration by pressing the "Calibrate" button again.
Figure 3.5: Calibrated AAL-Band that can be reset.
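What the calibration computes internally is not documented in this guide. Conceptually, holding a fist during the three-second window gives a per-channel reference level that later FSR readings can be normalized against; the sketch below illustrates that general idea only and is not the application's actual procedure:

    import numpy as np

    def calibrate(fsr_samples: np.ndarray) -> np.ndarray:
        """Per-channel reference level from the ~3 s calibration window.

        fsr_samples: (n_samples, n_channels) FSR data recorded while the
        user holds a fist. Returns one reference value per channel.
        """
        return fsr_samples.mean(axis=0)

    def normalize(sample: np.ndarray, reference: np.ndarray) -> np.ndarray:
        # Scale each channel by its calibration reference; guard against
        # division by zero on idle channels.
        return sample / np.maximum(reference, 1e-9)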
3.2 Data Recording
The setup is now finished, and you can begin to record data for the different gestures and train the model.
Step 1: Go to the "Record" tab; the data recording menu can be seen in Figure 3.6.
Figure 3.6: Data Recording menu.
Step 2: If you want to save your data, press the "File" tab and choose "Save session". This will save all data. The output files are described in detail in Section 4.2.
Step 3: Choose for how many seconds you want to record each gesture by setting the "Rec time", which in Figure 3.6 is set to 5 seconds.
Step 4: Choose a gesture to record. The first recording in the session has to include at least two of the gestures. After this, gestures can be chosen individually to provide more data for one or more gestures. To add or remove gestures, see Section 4.3.
Step 5: Press the "Start recording" button.
Step 6: Perform the gestures shown in the image to the right, see Figure 3.7, where the gesture "Open" is shown. The loading bar tells you how long you have to hold the gesture before going to the next gesture. The gestures will be gone through in order from top to bottom.
Figure 3.7: The gestures have been chosen and recording is started.
Step 7: If a mistake has occurred during the recording of the gestures, it is possible to reset the data by pressing the "Reset data" button and start a new recording. This will clear out all recorded data and erase models if they are trained.
Step 8: When you are satisfied with your recorded data, you can train the model.
Step 9: Choose what type of data you want to train your model with: FMG, IMU, or both.
Step 10: Press the "Train model" button. The light under the button will change from red to green when the model has been trained, as can be seen in Figure 3.8. A conceptual sketch of what this step might do is shown after this list.
Step 11: Record gestures more than once to get better performance in the testing.
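The guide does not document the training internals, but the dump list in Section 4.2 mentions root-mean-square (RMS) values of data windows and an input pair X and Y for the support vector machine, so pressing "Train model" plausibly does something like the Python sketch below. The file names, window length, and the use of scikit-learn's SVC are assumptions for illustration only:

    import numpy as np
    from sklearn.svm import SVC

    def rms_windows(signal: np.ndarray, window: int = 100) -> np.ndarray:
        """Root mean square over non-overlapping windows.

        signal: (n_samples, n_channels) raw FMG/IMU data for one gesture.
        Returns an (n_windows, n_channels) feature matrix.
        """
        n = (len(signal) // window) * window
        chunks = signal[:n].reshape(-1, window, signal.shape[1])
        return np.sqrt((chunks ** 2).mean(axis=1))

    # Build X (features) and Y (labels) from per-gesture recordings.
    # "open_raw.csv" and "fist_raw.csv" are hypothetical file names.
    X, Y = [], []
    for label, path in enumerate(["open_raw.csv", "fist_raw.csv"]):
        data = np.loadtxt(path, delimiter=",")
        feats = rms_windows(data)
        X.append(feats)
        Y.append(np.full(len(feats), label))
    X, Y = np.vstack(X), np.concatenate(Y)

    model = SVC()      # Support Vector Machine classifier, as named in the guide
    model.fit(X, Y)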
By training the model on the recorded data, it can now recognise the gestures, and a test of the model's performance can be conducted.
Step 1: Go to the "Testing" tab, see Figure 3.9.
Step 2: The Support Vector Machine is the chosen classifier.
Step 3: Again, if you want to save your data, press the "File" tab and choose "Save session".
Step 4: Press the "Start" button.
Step 5: Perform different gestures. An image to the right will display the prediction, see Figure 3.10, where the gesture "Flexion" is displayed.
Figure 3.10: Recognition of the gesture "Flexion" during training.
Step 6: Press the "Stop" button when you have collected the wanted data.
Step 7: If the classifier has difficulties with predicting a certain gesture, record more data for it and retrain the model. A sketch of how such a prediction might look in code is shown below.
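Continuing the hypothetical training sketch from Section 3.2, testing can be pictured as running each new data window through the trained model; `model` and `rms_windows` below come from that sketch, and the file name is made up:

    import numpy as np

    # Classify a freshly recorded window of sensor data.
    # `model` and `rms_windows` come from the training sketch above;
    # "live_window.csv" is a hypothetical file name.
    new_data = np.loadtxt("live_window.csv", delimiter=",")
    features = rms_windows(new_data)         # same feature pipeline as training
    predictions = model.predict(features)    # one predicted gesture per window
    print("Predicted gesture labels:", predictions)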
An extra option in relation to testing is to apply the gestures to a Lego Mindstorms EV3, if you own one; however, this is not yet implemented. The intention is that you can control the device with the AAL-Band and in that way see how well it performs.
Miscellaneous
4.1 Function under development
In the application there is the option to press "View" in the upper right corner. Then you can press "Output graph"; however, the graph that will appear is simply randomly generated data, as seen in Figure 4.1.
Figure 4.1: How the output graph currently looks.
The intention is that it should be possible to get a graphical representation of the data during recording and training. This will be implemented soon, but for now it does not have any function.
4.2 Output files
The output files for raw and RMS data are saved as .csv files, one for each gesture. The output files consist of 21 columns; an overview of what type of measurement these columns contain is given in Table 4.1. The format of the other dumped files depends on the stage of the recording and training procedure.
Table 4.1: Overview of the types of measurement found in the output file.
Column number Measurement
9-11 Gravity (x, y, z)
12-14 Angular velocity (x, y, z)
15-17 Linear acceleration (x, y, z)
18-20 Euler angle (x, y, z)
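As an illustration, the documented columns of Table 4.1 can be pulled out of a per-gesture .csv file with pandas. The file name below is hypothetical, and since the table does not list what columns 0-8 contain, the sketch only touches columns 9-20:

    import pandas as pd

    # One output file per gesture; "open.csv" is a hypothetical name.
    # The files have 21 columns (0-20) and are assumed to have no header row.
    df = pd.read_csv("open.csv", header=None)

    gravity = df.iloc[:, 9:12]        # columns 9-11:  gravity (x, y, z)
    angular_vel = df.iloc[:, 12:15]   # columns 12-14: angular velocity (x, y, z)
    linear_acc = df.iloc[:, 15:18]    # columns 15-17: linear acceleration (x, y, z)
    euler = df.iloc[:, 18:21]         # columns 18-20: Euler angle (x, y, z)
    print(gravity.head())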
Depending on the progress of the pipeline, 5 different dumps are executed:
• Raw data.
• Calculated root mean square of data windows.
• Input for the support vector machine, X and Y.
• Classifier model (saved with pickle as .sav).
• Testing data.
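Because the classifier model is dumped with pickle as a .sav file, it can be loaded back into your own Python scripts; the file name below is a placeholder, so use the actual file from your session folder:

    import pickle

    # Load a classifier model dumped by the application.
    # "model.sav" is a hypothetical name.
    with open("model.sav", "rb") as f:
        model = pickle.load(f)

    print(type(model))  # inspect what kind of classifier was saved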
The name of the saved files can be explained like this:

4.3 Adding or removing gestures
Step-by-step guide on how to add or remove gestures in the application:
Step 1: Go to the folder on your computer where you have installed the application.
Step 2: Choose the folder "hand_pic". Here are the images of the saved gestures.
Step 3: Adding: If you want to add a gesture, you must copy an image into the folder. It is important that you name the image according to the gesture it represents. Removing: If you want to remove a gesture, simply delete the image of the gesture that you no longer want included.
References
Islam, M. R. and Bai, S. (2019). "Payload estimation using forcemyography sensors for control of upper-body exoskeleton in load carrying assistance". In: Modeling, Identification and Control 40.4, pp. 189–198. doi: 10.4173/mic.2019.4.1.