Summary:
Wekinator inspired a series of accessible machine learning tools for the web, such as Teachable Machine and Teachable Machine 2.
Those tools primarily focus on images as input. Below is a series of teaching examples created for physical
computing inputs such as touch, light and movement sensors.
The code is open source and uses the ml5.js library and its NeuralNetwork function.
The repo for the examples lives here and the p5.js collection is contained here.
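As a rough sketch of how one of these classifiers is wired up with ml5.js's NeuralNetwork: the sensor readings and labels below ('still', 'shake') are illustrative placeholders, not taken from the repo's examples.

```javascript
// Sketch of training an ml5.js neural network to classify a sensor reading.
// Assumes ml5.js is loaded in the browser (e.g. via a <script> tag).

// Pure helper: package raw sensor readings into the input format
// ml5's NeuralNetwork expects (a flat array of numbers).
function buildSample(x, y, z) {
  return [x, y, z];
}

// Browser-only: call once the page (and ml5) have loaded.
function trainClassifier() {
  const nn = ml5.neuralNetwork({
    inputs: 3,                      // e.g. accelerometer x, y, z
    outputs: ['still', 'shake'],    // illustrative class labels
    task: 'classification',
    debug: true,
  });

  // Toy training data; a real example streams many readings from the board.
  nn.addData(buildSample(0.0, 0.0, 1.0), { label: 'still' });
  nn.addData(buildSample(0.9, -0.7, 0.2), { label: 'shake' });

  nn.normalizeData();
  nn.train({ epochs: 32 }, () => {
    // Classify a fresh reading once training finishes.
    nn.classify(buildSample(0.1, 0.0, 0.95), (error, results) => {
      if (error) return console.error(error);
      console.log(results[0].label, results[0].confidence);
    });
  });
}
```

The same shape applies to each example below: only the number of inputs and the label set change between touch, gesture and colour classification.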
---------------------------------------------------------------------------->
Examples :
- IMU input classification
- Capacitive touch input classification
- Gesture classification
- Colour Sensor classification
- Speech to transducer speaker
---------------------------------------------------------------------------->
Materials :
- ml5.js library
- Bare Conductive Touch Board
- Conductive Ink
- Arduino Nano 33 BLE Sense
- Accelerometer
- Colour Sensor
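Getting readings from the board into the browser sketch means parsing whatever the Arduino prints over serial. A minimal sketch of that step, assuming the board prints one comma-separated line per reading (e.g. "0.02,-0.10,0.98") — the exact format depends on the board-side sketch:

```javascript
// Turn one serial line from the board into an array of neural-net inputs.
// Assumes comma-separated numeric values, one reading per line.
function parseSensorLine(line) {
  const values = line.trim().split(',').map(Number);
  // Reject incomplete or corrupted lines rather than feeding NaNs to the model.
  return values.every(Number.isFinite) ? values : null;
}

// parseSensorLine("0.02,-0.10,0.98") → [0.02, -0.1, 0.98]
// parseSensorLine("garbage") → null
```

Returning null on a bad line lets the sketch simply skip corrupted readings, which are common when a serial connection is opened mid-transmission.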
---------------------------------------------------------------------------->