A hand with a CHARGE power pack attached to the wrist is waving in front of a screenshot of the micro:bit machine learning interface.

Machine learning might seem like a complex topic, but with the micro:bit and its Machine Learning Tool, it’s easier than you might think to bring it into your classroom! If you’re new to the concepts of machine learning and AI, don’t worry! This blog post will walk you through how to introduce the topic in the classroom and run a fun, engaging activity with your students using micro:bits!

In this post, we’ll guide you through the basics of using the micro:bit Machine Learning Tool, where students can train their micro:bit to recognize different movements and gestures, like a wave or a shake. By adding samples of physical actions, students can teach their micro:bit to understand those motions through a process called training. Once the model is trained, students can test it in real time to see how well it identifies the different actions.

This introduction to machine learning is a great way to engage students in hands-on learning and show them how technology can recognize patterns—no coding required! 

Before we jump into the activity, let’s cover a few basic definitions so you understand the difference between two prevalent buzzwords: machine learning and artificial intelligence.

When we talk about machine learning (ML) and artificial intelligence (AI), it can sometimes feel like they’re the same thing—but they’re not! Think of AI as the big idea: it’s the broad field where computers and machines are designed to do tasks that would normally require human intelligence, like recognizing faces or understanding language. Machine learning is a part of AI, but it’s more specific.

In simple terms, machine learning is how computers learn to do certain tasks by finding patterns in data, rather than being told what to do step by step. For example, instead of programming a robot to recognize different hand gestures, students can train their micro:bit to learn and recognize those gestures based on examples they give. With more data, the micro:bit gets better at understanding the actions, just like how humans improve by practising. So, while AI is the overall goal of creating machines that can “think,” machine learning is the process that helps computers learn to be smart in specific ways. It’s like teaching them through experience, not instructions!
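To make “learning from examples” concrete, here is a toy sketch in Python: a tiny classifier that labels a new movement by finding the most similar training example. Every number and label below is invented for illustration, and the real Machine Learning Tool uses a far more capable model than this.

```python
# Toy illustration: "learning" a gesture from examples instead of rules.
# Each training example is a made-up feature: roughly how much the
# movement varies during the gesture, paired with its label.

def nearest_label(sample, examples):
    """Return the label of the training example closest to `sample`."""
    return min(examples, key=lambda ex: abs(ex[0] - sample))[1]

# Examples the "student" provides: (movement amount, gesture name)
examples = [
    (0.9, "wave"), (1.1, "wave"), (1.0, "wave"),        # waving moves a lot
    (0.1, "salute"), (0.2, "salute"), (0.15, "salute"), # saluting barely moves
]

print(nearest_label(1.05, examples))  # a big movement -> "wave"
print(nearest_label(0.12, examples))  # a small movement -> "salute"
```

Notice there is no “if movement is big, it’s a wave” rule written anywhere: the answer comes entirely from the examples we supplied, which is the core idea behind machine learning.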

In this activity, students will:

  • Define machine learning and how it relates to AI 
  • Use the micro:bit Machine Learning Tool and two micro:bits to add data samples for at least two unique gestures or actions 
  • Train the model on the data sets to recognize patterns or differences for various gestures or actions 
  • Test the model to see the accuracy of the Machine Learning Tool for the gestures that it has been trained on 
  • Add more or different data samples to increase the accuracy of the model
  • Identify ways machine learning could be used in real-life situations

To use the micro:bit Machine Learning Tool, there are a few things we’ll need to get started:

Two micro:bits, a USB cable, a CHARGE power pack and a laptop with the micro:bit machine learning screen sit neatly arranged on a white tabletop.
The tool itself works in three main steps:

  1. Add Data: Students will add samples of the gestures or actions they want the model to be able to recognize 
  2. Train Model: Students will then ask the computer to use their samples to train the machine learning model to recognize different actions
  3. Test: Once trained, students will be able to ‘test’ their model to see if it accurately recognizes the gestures or actions
Collage of the machine learning interface with the Add Data, Train Model, and Test Model pages.

First, head over to the Machine Learning Tool in a new window. We suggest putting this guide on one side of the screen and the tool on the other so you can follow along.

After making sure you have all of the required materials, you’ll want to connect and pair your first micro:bit to the Machine Learning Tool. You can do this by plugging it into the computer via USB cable and following the instructions on screen. 

Once paired and the Machine Learning program is downloaded onto the first micro:bit, you can disconnect it and connect it to the CHARGE or micro:bit battery pack. Then, connect and pair your second micro:bit.

Sitting on a desk is a plugged-in micro:bit displaying a diamond and a CHARGE power pack with a micro:bit displaying a smiling face. Behind them is a computer monitor displaying the micro:bit Machine Learning interface and a live feed of data from the pair of micro:bits.

Once you’ve successfully set up both micro:bits, you’re ready to add data. All this means is we’re going to input the various gestures or actions we want the model to be able to recognize and then add some sample data of what this looks like. 

The micro:bit Machine Learning Tool uses the accelerometer in the micro:bit to detect movement, which is represented on the X, Y, and Z axes. If you pick up your first micro:bit, which we’ll call the ‘gesture micro:bit’, and start moving it around, you’ll see the X, Y, and Z values change on the line graph at the bottom of the Machine Learning Tool. 

This happens because our ‘gesture micro:bit’ is sending the data via radio signal to the second micro:bit, which we’ll call the ‘receiving micro:bit’, and the receiving micro:bit relays the data to the Machine Learning Tool.

Try moving the ‘gesture micro:bit’ around a bit and notice what the X, Y, Z graph looks like for different actions. 

A micro:bit in a CHARGE power pack is shaken and moved around to change the XYZ data graph on the monitor behind.
Screen capture of a graph from the micro:bit machine learning interface. 3 lines representing the X, Y, and Z axes zig-zag in various patterns, with time along the graph's X axis and acceleration on the Y axis.
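If you’re curious what that streamed data looks like under the hood, here’s a rough Python sketch: readings as (x, y, z) tuples over time. The values below are invented for illustration; the micro:bit reports acceleration in milli-g, with gravity showing up as roughly -1000 on one axis when it lies flat.

```python
# Hedged sketch of streamed accelerometer data as (x, y, z) readings.
# A still micro:bit gives flat lines; a shaken one gives a zig-zag.

still  = [(0, 0, -1000)] * 6                       # resting flat: gravity on Z
shaken = [(800, -600, -1000), (-700, 500, -900),
          (900, -800, -1100), (-600, 700, -950),
          (750, -650, -1000), (-800, 600, -1050)]  # invented shaking values

def wiggle(readings, axis):
    """How much one axis varies across the recording window."""
    values = [r[axis] for r in readings]
    return max(values) - min(values)

print(wiggle(still, 0))   # X barely changes -> 0
print(wiggle(shaken, 0))  # X swings widely -> 1700
```

Differences like this, across all three axes, are exactly the patterns the model will look for when it is trained.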

Now we need to decide what specific gestures or actions we want to be a part of our model. Here are some examples you may want to consider:

  • Pronation/supination (or rotation) of the wrist 
  • Flexion/extension of the elbow
  • Saluting 
  • Waving
A left arm in three sections showing example gestures. Left is saluting, middle is waving, right is rotating wrist.

Tips for gestures: You’ll want to choose gestures or actions that are obvious enough for the micro:bit to detect. Small muscle twitches or hand movements may be tricky for the accelerometer in the micro:bit to pick up. Additionally, when we go to record data, you’ll notice you only have a few seconds to capture the movement, so the gesture or action needs to be relatively quick. 

Once you’ve decided on your gestures, you’re ready to add data for each one. Start by entering the name of the gesture in the box.

micro:bit machine learning interface with a prompt to add an action you want recognized.

Now you need to record your action to give the model data samples to work with. Make sure the ‘gesture micro:bit’ is securely attached to your wrist. You don’t want it shifting around on your wrist, because that will create inaccurate data samples. When you’re ready, hit record, and after the countdown, perform your action. 

Record 3-4 examples of the gesture, performing it the same way each time (i.e., the same starting and ending position). 

Click “+ Add Action” to add another data set for another type of gesture or action and repeat the same steps. 

micro:bit machine learning interface with two actions and 4 sets each of line graph data.

Once you’ve added a few gestures or actions, and recorded 3-4 data samples for each, you’re ready to train the model. Click the ‘Train model’ button in the bottom right corner.

The computer program will now look at the data samples you’ve provided and analyze them for patterns and differences. It will use this data to build a mathematical model that allows the micro:bit Machine Learning Tool to recognize different actions when you move your micro:bit.

After the model has been trained, you can click the ‘Test model’ button to start testing your program! On this screen you’ll see a list of the actions we trained our model on, as well as a scale from 0 to 100%. This scale represents how confident, or certain, the model is about each action being performed. The higher the percentage, the more certain the model is that that particular action is happening on the ‘gesture micro:bit’. 

You may also notice that at the top of this screen there is an ‘estimated action’ line. The computer program is going to indicate its best guess for which action is happening in real life based on the certainty percentages indicated below. 
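One way to think about those percentages is sketched in Python below. The tool’s exact scoring method isn’t documented here, so this simply turns made-up similarity scores into percentages that sum to 100 and takes the highest one as the estimated action.

```python
# A sketch of how per-action confidences and an 'estimated action' could
# be derived from raw similarity scores (all scores invented).

def confidences(scores):
    """Scale raw similarity scores so they read as 0-100% confidences."""
    total = sum(scores.values())
    return {action: round(100 * s / total) for action, s in scores.items()}

scores = {"wave": 8.5, "salute": 1.0, "rotate": 0.5}  # invented similarities
pct = confidences(scores)
estimated = max(pct, key=pct.get)  # the 'estimated action' line: best guess

print(pct)        # {'wave': 85, 'salute': 10, 'rotate': 5}
print(estimated)  # wave
```

Note that the estimated action is just the model’s best guess: even a wrong guess always “wins” some percentage, which is a good discussion point with students.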

Start performing your gestures or actions again – what happens to the percentage or certainty for each action? Is the model able to identify your actions accurately?

If yes, that’s great! What other gestures or actions could you add to your model to train it on?

micro:bit machine learning interface in the Test Model section, trying to identify which action is currently happening to the micro:bit with 0 to 100% prediction gauges.
micro:bit machine learning interface with a video overlay of an arm doing a number of gestures, like waving and saluting. The machine learning model does a reasonable job of guessing which movements are being performed.

If your model is not as accurate as you’d like, you may need to add more data samples. Go back to ‘Add data’ and try recording samples of your gesture or action again.

When discussing Machine Learning, it’s important to talk about how the data we use to teach a machine learning model can affect its accuracy and fairness. When we only have one or two people providing gesture data, the model may learn to recognize those specific ways of moving but struggle to recognize similar gestures made by others. This is because people move in different ways based on their size, strength, flexibility, or even habits.

Let’s experiment with this by having multiple students record the same gesture (e.g., waving). When you have more diverse data from different people, the model will be better at recognizing a wider range of gestures.

As you add more data, pay attention to how the graphs for the same gesture look slightly different person-to-person.
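Here’s a toy Python illustration of that bias effect, assuming each wave can be summarized by a single “movement size” number (all values invented): a model trained on one energetic waver misreads a gentler wave, and adding samples from more people fixes it.

```python
# Toy demonstration of biased vs. diverse training data.

def classify(sample, examples):
    """Label a sample by its closest training example."""
    return min(examples, key=lambda ex: abs(ex[0] - sample))[1]

# Data from a single person who waves with big movements
one_person = [(1.0, "wave"), (1.1, "wave"), (0.2, "salute"), (0.1, "salute")]

gentle_wave = 0.45  # a different person's smaller wave

print(classify(gentle_wave, one_person))  # -> "salute" (wrong!)

# Add samples of other people waving more gently
diverse = one_person + [(0.5, "wave"), (0.6, "wave")]

print(classify(gentle_wave, diverse))     # -> "wave" (correct)
```

The model didn’t get “smarter” between the two runs; it simply saw a wider range of examples, which is exactly why diverse training data matters.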

  • How did the accuracy of the model change as you added more data samples? 
  • What types of gestures performed best, and why do you think they worked better than others?
  • What other gestures could you do if you attached the micro:bit to another part of your body, like your ankle?
  • What was the most surprising or interesting thing you learned about machine learning through this activity?
  • How did you feel when the Machine Learning Tool didn’t work as expected? What did you do to stay motivated and keep trying?
  • Can you think of other real-life examples where machine learning technology like this could be used? Why do you think machine learning would be valuable in these situations?
  • In this activity, we used physical gestures or actions using the accelerometer in the micro:bit to train our model. What other types of inputs could we use to train a model?
    • Sound: Noises, voices, words
    • Visuals: Colours, Shapes, Light
  • Think about the real-world applications of machine learning we’ve discussed. What would be some consequences of using limited or biased data samples in these situations?