This page collects the resources taught across various international Creative ML workshops.
These examples explore classification using Google's Teachable Machine resource.
Remember to sign in, then duplicate any of the P5JS examples in order to edit and save your changes.
Try making a change to an output in your Teachable Machine template. Look for the key variables to identify your classification label. Perhaps change a shape, text, image or sound as an output.
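As a minimal sketch of what that change looks like, assuming ml5.js is loaded alongside p5.js (the model URL and the class names "ClassA"/"ClassB" are placeholders for your own Teachable Machine model's values):

```javascript
// Minimal p5.js/ml5.js sketch: swap the drawn output based on the
// classification label. The model URL and class names are placeholders.
let classifier;
let video;
let label = "waiting...";

function preload() {
  const modelURL = "https://teachablemachine.withgoogle.com/models/XXXX/";
  classifier = ml5.imageClassifier(modelURL + "model.json");
}

function setup() {
  createCanvas(320, 260);
  video = createCapture(VIDEO);
  video.hide();
  classifier.classify(video, gotResult);
}

function gotResult(error, results) {
  if (error) return console.error(error);
  label = results[0].label;              // the key variable: the top label
  classifier.classify(video, gotResult); // classify the next frame
}

// Pure helper: decide which shape a label maps to
function shapeForLabel(label) {
  if (label === "ClassA") return "circle";
  if (label === "ClassB") return "square";
  return "none";
}

function draw() {
  image(video, 0, 0, 320, 240);
  const shape = shapeForLabel(label);
  if (shape === "circle") circle(width / 2, 120, 80);
  if (shape === "square") square(width / 2 - 40, 80, 80);
  text(label, 10, 255);
}
```

Changing the output is then just a matter of editing `shapeForLabel` or the drawing code in `draw`.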
Extra Resources:
Sometimes it's quicker to use the ML5 webcam classifier than to go through Google's Teachable Machine every time.
These examples explore what is happening behind the scenes of the Teachable Machine examples.
Using your mouse, explore how to train a neural net using classification.
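A sketch of that training loop, assuming ml5.js (the key bindings and canvas size are arbitrary choices, not taken from the linked example):

```javascript
// Collect (mouseX, mouseY) -> label pairs, then train and classify.
// Keys 'a'/'b' add labelled examples, 't' trains, clicking classifies.
let nn;

function setup() {
  createCanvas(400, 400);
  nn = ml5.neuralNetwork({ inputs: 2, outputs: 2, task: "classification" });
}

// Pure helper: normalise a pixel position to the 0-1 range
function toInputs(x, y, w, h) {
  return [x / w, y / h];
}

function keyPressed() {
  if (key === "a" || key === "b") {
    nn.addData(toInputs(mouseX, mouseY, width, height), [key]);
  }
  if (key === "t") {
    nn.normalizeData();
    nn.train({ epochs: 50 }, () => console.log("training finished"));
  }
}

function mousePressed() {
  nn.classify(toInputs(mouseX, mouseY, width, height), (error, results) => {
    if (error) return console.error(error);
    console.log(results[0].label, results[0].confidence);
  });
}
```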
A really nice example by Charlotte 000
We might not always want distinct label outputs. What if we want a more continuous output or prediction? This is where regression comes in.
Using your mouse, explore how to train using regression.
Use your webcam to train using regression.
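The same flow with `task: "regression"` returns a continuous value instead of a label. A sketch, assuming ml5.js (the key bindings and the diameter range are arbitrary choices):

```javascript
// Train mouse position -> a continuous value (here, a circle diameter),
// then predict it for new positions. UP/DOWN change the recorded target,
// 'd' adds an example, 't' trains, clicking predicts and draws.
let nn;
let target = 50; // output value to record with the next example

function setup() {
  createCanvas(400, 400);
  nn = ml5.neuralNetwork({ inputs: 2, outputs: 1, task: "regression" });
}

function keyPressed() {
  if (keyCode === UP_ARROW) target += 5;
  if (keyCode === DOWN_ARROW) target -= 5;
  if (key === "d") nn.addData([mouseX, mouseY], [target]);
  if (key === "t") {
    nn.normalizeData();
    nn.train({ epochs: 50 }, () => console.log("training finished"));
  }
}

// Pure helper: clamp a predicted diameter to a drawable range
function clampDiameter(v) {
  return Math.min(200, Math.max(5, v));
}

function mousePressed() {
  nn.predict([mouseX, mouseY], (error, results) => {
    if (error) return console.error(error);
    circle(mouseX, mouseY, clampDiameter(results[0].value));
  });
}
```

Note the only structural differences from the classification version: `task: "regression"`, a single numeric output, and `nn.predict` in place of `nn.classify`.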
Many ML models encountered in RunwayML utilise pre-trained models. These models already have dataset labels or keypoints we can reference in training.
MobileNet uses pretrained labels found in the ImageNet dataset.
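A sketch of using those pretrained ImageNet labels directly, assuming ml5.js (the image filename is a placeholder):

```javascript
// Classify a still image with MobileNet; the labels returned come from
// the ImageNet class list. "bird.jpg" is a placeholder filename.
let classifier;
let img;

function preload() {
  classifier = ml5.imageClassifier("MobileNet");
  img = loadImage("bird.jpg");
}

function setup() {
  createCanvas(400, 400);
  image(img, 0, 0, width, height);
  classifier.classify(img, gotResult);
}

// Pure helper: pull the top label out of an ml5 results array
function topLabel(results) {
  return results && results.length ? results[0].label : "unknown";
}

function gotResult(error, results) {
  if (error) return console.error(error);
  text(topLabel(results), 10, height - 10);
}
```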
YOLO is similar to MobileNet but uses slightly different methods with its pre-trained model; more information here.
Classify poses using the ML5 PoseNet model
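A sketch of pulling keypoints out of PoseNet, assuming ml5.js (the 0.2 confidence threshold is an arbitrary choice):

```javascript
// Draw PoseNet keypoints over the webcam feed.
let video;
let poseNet;
let poses = [];

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video, () => console.log("PoseNet ready"));
  poseNet.on("pose", (results) => { poses = results; });
}

// Pure helper: keep only keypoints above a confidence threshold
function confidentKeypoints(keypoints, threshold) {
  return keypoints.filter((k) => k.score > threshold);
}

function draw() {
  image(video, 0, 0, width, height);
  for (const p of poses) {
    for (const k of confidentKeypoints(p.pose.keypoints, 0.2)) {
      circle(k.position.x, k.position.y, 10);
    }
  }
}
```

Each keypoint's `position.x`/`position.y` can also be fed into the neural-net training flow above, rather than drawn.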
Using ML5 Face API we can track facial expressions and use them as inputs for training.
These examples explore how to create neural nets of our own from scratch using the ML5 Neural Network function.
Here is a simple bare-bones structure we will build from.
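That skeleton looks roughly like this with `ml5.neuralNetwork` (the input/output counts and task are placeholders to swap for your own):

```javascript
// Bare-bones ml5 neural net: create -> addData -> normalizeData -> train -> use.
let nn;

function setup() {
  const options = {
    inputs: 2,              // how many numbers go in
    outputs: 2,             // how many labels (or values) come out
    task: "classification", // or "regression"
    debug: true,            // show the live training graph
  };
  nn = ml5.neuralNetwork(options);
}

function trainModel() {
  nn.normalizeData();
  nn.train({ epochs: 50 }, () => console.log("done training"));
}
```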
You can use the DIY neural net function to train on any input. This is what is used with the Face API classifier from part 1.
It's really useful if we want to add classification or regression training capabilities to physical computing components. Previously this was only possible using the Wekinator application; now we can bring it to P5JS and the browser.
This example works with the Adafruit BNO055 but can be adapted to any accelerometer sensor and webUSB-capable Arduino board (see notes on the GitHub page).
This example works with the Bare Conductive Touch Board. An adapted DataStream sketch needs to be uploaded onto the board first (see Arduino code on the GitHub page).
Try considering what other inputs could be trained.
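Whatever the input, the ml5 training flow stays the same as with the mouse examples above. A sketch of the shape of it, where `readSensor()` is a hypothetical stand-in for whatever delivers your sensor values (webUSB, serial, touch electrodes, etc.):

```javascript
// Feed sensor readings into the same ml5 training flow used for the mouse.
// readSensor() is hypothetical: replace it with your own data source.
let nn;

function setup() {
  nn = ml5.neuralNetwork({ inputs: 3, outputs: 2, task: "classification" });
}

function keyPressed() {
  if (key === "a" || key === "b") {
    const [x, y, z] = readSensor(); // hypothetical: three sensor axes
    nn.addData([x, y, z], [key]);
  }
  if (key === "t") {
    nn.normalizeData();
    nn.train({ epochs: 60 }, classifyLoop);
  }
}

function classifyLoop() {
  const [x, y, z] = readSensor();
  nn.classify([x, y, z], (error, results) => {
    if (error) return console.error(error);
    console.log(results[0].label);
    classifyLoop(); // keep classifying new readings
  });
}
```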
We will explore going beyond the RunwayML interface: how we can communicate in and out of it using P5JS, Arduino and Processing.
RunwayML can communicate with Processing, P5JS and any other tool that can utilise HTTP sockets.
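From P5JS, a query to a locally running RunwayML model can be sketched as a plain HTTP request. The port, route and field names below are assumptions; check the Network tab of your model in Runway for the real values:

```javascript
// Query a locally running RunwayML model over HTTP. The port (8000),
// route (/query) and input field name are model-specific assumptions.
async function queryRunway(inputs) {
  const response = await fetch("http://localhost:8000/query", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(inputs),
  });
  return response.json();
}

function mousePressed() {
  queryRunway({ caption: "a drawing of a bird" }) // model-specific input field
    .then((output) => console.log(output))
    .catch((err) => console.error(err));
}
```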
For the additional Processing examples, please refer to the repo here on GitHub.
Make sure you also have the latest RunwayML Processing library installed, via Processing or the GitHub repo here.
Additional P5JS examples can be found in my editor sketchbook. Some highlights are below.
Many of the coded examples have been adapted and expanded upon from original template resources provided by...
The original source of inspiration for accessible ML for artists and musicians.