www.embarcados.com.br
linkedin.com/embarcados
@portalembarcados
youtube/Embarcados TV
THANK YOU!
A Leading Provider of Smart, Connected and Secure Embedded Control Solutions
November 2022
Rodrigo Britto
Starting with Microchip
Machine Learning
2
MPLAB® X IDE
• Free download, simple installation
• Rich debug and programming environment for
all PIC® and dsPIC® devices
• Runs on Windows®, Linux® and macOS®
• Software development kit (SDK) for custom
automated testing, manufacturing frameworks
• Strong ecosystem of plugins for many custom
features; visualization, code analysis, etc.
• Supports all Microchip, many third party
compilers
• Supports code generation for software libraries
• Complete simulation engine for most devices
3
MPLAB® XC C Compilers
• Three compiler types to choose from
• MPLAB XC8 = 8-bit PIC® devices
• MPLAB XC16 = 16-bit PIC and dsPIC® devices
• MPLAB XC32 = 32-bit PIC devices
• Compatible everywhere
• Windows®, Linux® and macOS®
• Optimization levels
• Free and PRO
• 70% of available optimizations in Free version
• Continuing improvements to Free and PRO
• Most flexible licensing in the industry
• Workstation, network server, site, subscription and dongle
licenses
4
MPLAB® Code Configurator (MCC)
FREE easy-to-use graphical
programming tool:
 Easily configure and use peripherals
 Generates efficient C code for your project
 Supports 8- and 16-bit devices
 Minimizes reliance on datasheet
www.microchip.com/mcc
5
MPLAB® Harmony
• Modular Software Framework
• A Graphical User Interface (GUI) tool
that takes the guesswork out of
configuring drivers and middleware.
• Direct resale by Microchip for
third-party libraries
• Microchip provides first line of
support
• MPLAB Harmony components
• Third-party solutions
• Comprehensive web portal
• Compatible with 32-bit PIC® MCUs
6
Microchip Debuggers
microchip.com/devtools
* MPLAB ICD 4 requires a hardware revision to support AVR devices that use
Debug Wire protocol
7
Machine Learning Demos
 TensorFlow Lite for Microcontroller Demos with Harmony 3.0 Plugin
(Training with your dataset in Google Colab, Converting & Deploying a
Model, Using the TF-Lite Runtime Engine for Inference)
 Hello World – a simple TF model predicting the value of a sine function
 Digit Recognition – identification of digits 0 to 9 written on a touchscreen
 Edge Impulse Speech Recognition using the SAM E54 Curiosity Ultra
 Keyword Spotting (also called “Micro Speech” in TensorFlow Lite Demos)
 SensiML Tools Workflow with the SAMD21 ML Evaluation Kit
 Gesture Recognition with an IMU (gyro/accelerometer)
*** Note: Click on any of the hyperlinks above to quickly navigate to the corresponding lab exercise section
8
Eval Kits for Machine Learning Development
• SAMD21 ML Evaluation Kit with TDK 6-axis MEMS (Part # EV18H79A)
• https://www.microchip.com/en-us/development-tool/EV18H79A
• SAM E54 Curiosity Ultra Development Board (Part Number: DM320210)
• https://www.microchip.com/en-us/development-tool/DM320210
• SAM E70 Xplained Ultra Development Board (Part Number: DM320113)
• https://www.microchip.com/en-us/development-tool/DM320113
• PIC32 Audio Codec Daughter Card (Part # AC328904)
• https://www.microchip.com/en-us/development-tool/AC328904
• Stereo 3.5 mm microphone (for speech input)
• SAM E51 Integrated Graphics and Touch Development Board (Part # EV14C17A)
• https://www.microchip.com/en-us/development-tool/EV14C17A
9
H3 TF-Lite “Hello World”
Using TensorFlow Lite for Microcontrollers and Harmony 3
A Simple Model for Predicting the Sine of a Number
Return to Lab Exercises List
10
What is TensorFlow Lite for Microcontrollers?
Training Workflow
• TensorFlow Lite is an open-source, production-ready, cross-platform framework for deploying ML on mobile devices and embedded systems
• Compatible with the TensorFlow training environment.
• Designed to run Google ML models on microcontrollers with only
a few KB of memory
• Built to fit on embedded systems
• Very small binary footprint. Optimized for Arm Cortex-M; the interpreter/runtime engine takes approx. 16 KB on a Cortex-M3/M4.
• No dynamic memory allocation
• No dependencies on complex parts of the standard C/C++
libraries
• No operating system dependencies, can run on bare metal
• Designed to be portable across a wide variety of systems
• Google Colab (https://colab.research.google.com/)
• Google offers a free cloud service with free GPU access, providing a hosted Jupyter notebook environment called Colaboratory, or “Colab”
• In Colab we can write and execute code in Python to train and
deploy a model for TensorFlow Lite
11
Harmony 3 Component for TensorFlow Lite Micro.
 TFLM component
• Adds the required source files for the TensorFlow Lite for Microcontrollers runtime engine
• Provides option to use the optimized CMSIS-NN kernel for Cortex-M MCUs
• Provides option to use the example audio front end required for micro speech application
 Data Log
 Debug Log function to print out error messages from TF-Lite for uC runtime engine
 Tools and Packages Required from H3 Repo
 MCC plugin version 5.1.2, MPLAB X IDE v6.0, csp v3.10.0, core v3.10.0, bsp v3.10.0, dev_packs v3.10.0, CMSIS-
FreeRTOS v10.3.1, audio v3.5.1, gfx v3.9.5, touch v3.1, tflite-micro-apps
 Public Harmony 3 Repository for TFLM - https://github.com/Microchip-MPLAB-Harmony/tflite-micro-apps
12
Hello World using TensorFlow Lite and Harmony 3
▪ This tutorial shows how to create and train a 2.5 kB model that predicts the value of a sine function. The model takes an input value x between 0 and 2π and outputs a single value y between -1 and 1. That is, y = sin(x), where x is the input and y is the output of the model.
▪ Hardware Required
▪ SAM E51 IGAT Board; or SAM E70 Xplained Ultra Eval Kit with SSD1963 LCD Controller Graphics Card and High-Performance WQVGA Display Module
▪ Micro USB Cable to connect the Debug USB Port to the computer
[Figure: model input/output diagram of the 3-layer, fully connected neural network]
13
Building and Running the Application - 1
▪ Downloading and building the application project
▪ Path of the application within the repository is apps/hello_world/firmware
▪ To create and train a new model using Google Colaboratory and integrate it into your MPLAB X project, click HERE to access and clone the Colab “Hello World” script. Select Copy To Drive to duplicate and run the script from your own Google Drive.
▪ Select Runtime from the top menu, then Run all to execute the entire script in one step, or execute each cell individually.
14
Building and Running the Application - 2
▪ After Run all completes, you can download the models directory from Colab as shown on the right
▪ The models directory includes the 3 model files: model.pb, model.tflite and model.cc
▪ Copy the contents of the model.cc file from the models directory into model.cpp in your MPLAB X project. Specifically, copy and paste the g_model[] array and g_model_len variable declarations into the model.cpp file defined in your MPLAB project to update the model.
▪ Rebuild and run the project. The sine wave will be displayed on the screen.
15
Creating a Sine Model – Step By Step In Colab - 1
▪ STEP 1 – Install TensorFlow 2.4.0
▪ STEP 2 – Import the necessary dependencies
16
Creating a Sine Model – Step By Step In Colab - 2
▪ STEP 3 – Generate Dataset of random Sine values and plot the results
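A minimal numpy sketch of this step; variable names are illustrative and may differ from the actual Colab cell:

```python
import numpy as np

SAMPLES = 1000
rng = np.random.default_rng(seed=1337)

# Uniformly sample x in [0, 2*pi] and compute the target y = sin(x)
x_values = rng.uniform(low=0.0, high=2 * np.pi, size=SAMPLES)
y_values = np.sin(x_values)
# (In Colab, plt.plot(x_values, y_values, 'b.') would plot the result.)
```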
17
Creating a Sine Model – Step By Step In Colab - 3
▪ STEP 4 – Add Gaussian noise to the output to generate a more realistic (real-world) model
18
Creating a Sine Model – Step By Step In Colab - 4
▪ STEP 5 – Split apart the data for training and plot the results
▪ Training: 60%
▪ Validation: 20%
▪ Testing: 20%
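Steps 4 and 5 together can be sketched in numpy as follows (the variable names and noise amplitude are illustrative, not the Colab script's exact values):

```python
import numpy as np

rng = np.random.default_rng(seed=1337)
SAMPLES = 1000
x_values = rng.uniform(0.0, 2 * np.pi, SAMPLES)
y_values = np.sin(x_values)

# STEP 4: add small Gaussian noise so the data resembles real-world measurements
y_values += 0.1 * rng.standard_normal(SAMPLES)

# STEP 5: split 60% / 20% / 20% into train / validation / test sets
TRAIN_SPLIT = int(0.6 * SAMPLES)
VALIDATE_SPLIT = TRAIN_SPLIT + int(0.2 * SAMPLES)
x_train, x_validate, x_test = np.split(x_values, [TRAIN_SPLIT, VALIDATE_SPLIT])
y_train, y_validate, y_test = np.split(y_values, [TRAIN_SPLIT, VALIDATE_SPLIT])
```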
19
Creating a Sine Model – Step By Step In Colab - 5
▪ STEP 6 – Designing A Larger Model (adding a layer of 16 neurons) to improve the
performance of the model
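To make the architecture concrete, here is the forward pass of that network (input → 16 ReLU units → 16 ReLU units → 1 linear output) in plain numpy; the weights below are random placeholders, not the values Keras learns in STEP 7:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

# Random placeholder weights; in the Colab script these are learned by Keras.
W1, b1 = rng.normal(size=(1, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 16)), np.zeros(16)
W3, b3 = rng.normal(size=(16, 1)), np.zeros(1)

def predict(x):
    """Forward pass of the 1 -> 16 -> 16 -> 1 network."""
    h1 = relu(x @ W1 + b1)
    h2 = relu(h1 @ W2 + b2)
    return h2 @ W3 + b3

y = predict(np.array([[0.5]]))   # one scalar input, output shape (1, 1)
```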
20
Creating a Sine Model – Step By Step In Colab - 6
▪ STEP 7 – Train and Validate the Model, with 500 Epochs
21
Creating a Sine Model – Step By Step In Colab - 7
▪ STEP 8 – Generate a TensorFlow Lite Model with or without 8-bit Quantization
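Full-integer quantization represents each float as an int8 value via a scale and zero point, q = round(v / scale) + zero_point. A numpy illustration of the arithmetic (the TFLite converter chooses scale and zero point automatically; the values here are hand-picked for sine outputs in [-1, 1]):

```python
import numpy as np

def quantize_int8(values, scale, zero_point):
    """Map float values to int8: q = round(v / scale) + zero_point."""
    q = np.round(values / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize_int8(q, scale, zero_point):
    """Recover approximate floats: v ~ (q - zero_point) * scale."""
    return (q.astype(np.float32) - zero_point) * scale

# Example: represent sine outputs in [-1, 1] with scale 1/127, zero point 0.
scale, zero_point = 1.0 / 127, 0
v = np.array([-1.0, -0.5, 0.0, 0.5, 1.0], dtype=np.float32)
q = quantize_int8(v, scale, zero_point)
v_back = dequantize_int8(q, scale, zero_point)
```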
22
Creating a Sine Model – Step By Step In Colab - 8
▪ STEP 9 – Generate a TensorFlow Lite for Microcontrollers C++ Model Source File
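STEP 9 is essentially `xxd -i` applied to model.tflite: the raw flatbuffer bytes become a C array (the g_model[] later pasted into model.cpp). A minimal Python equivalent, for illustration only:

```python
def tflite_to_c_array(data: bytes, name: str = "g_model") -> str:
    """Render raw .tflite bytes as a C source snippet (like `xxd -i`)."""
    hex_bytes = ", ".join(f"0x{b:02x}" for b in data)
    return (
        f"alignas(8) const unsigned char {name}[] = {{{hex_bytes}}};\n"
        f"const int {name}_len = {len(data)};\n"
    )

# Usage: open("model.tflite", "rb").read() would supply the real bytes.
# The 8 bytes below mimic a flatbuffer header carrying the "TFL3" magic.
snippet = tflite_to_c_array(b"\x1c\x00\x00\x00TFL3", "g_model")
```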
23
Deploy the Model – Copy to MPLAB-X Project
▪ STEP 10 – DEPLOY THE MODEL. Copy the contents of the model.cc file from the models directory into model.cpp in your MPLAB X project. Specifically, copy and paste the g_model[] array and g_model_len variable declarations into the model.cpp file defined in your MPLAB project to update the model.
24
Deploy the Model – Build Project and Run
▪ STEP 11 - Rebuild and Run the project. The sine wave will be displayed on the
screen.
25
SensiML ML Workflow
Gesture Recognition Using SAMD21 ML Evaluation Kit
Return to Lab Exercises List
26
The SensiML Workflow: Data-Driven Rapid Model Creation
No Data Science or AI Expertise Required, Prototype Model Testing Without Coding
Data Capture Lab – Capture and annotate data
• Time: Hours to weeks (depending on application data collection complexity)
• Skill: Domain expertise (as required to collect and label events of interest)
Analytics Studio – Build, train and validate models
• Time: Minutes to hours (depending on degree of model control exerted)
• Skill: None (full AutoML); basic ML concepts (advanced UI tuning); Python programming (full pipeline control)
SensiML Knowledge Pack – Embedded inference engine & test validation application
• Time: Minutes to weeks (depending on app code integration needs)
• Skill: None (binary firmware with auto-generated I/O wrapper code); embedded programming (integration of the SensiML library or C source with user code)
27
SensiML Edge AI Tools Workflow for Microchip Platforms
▪ Workflow: Map Sensors → Label Data (Data Capture Lab on the SAMD21) → Build Model (Analytics Studio) → Generate Code and Test (Knowledge Pack)
▪ Knowledge Pack deliverables: ready-to-run binary, linkable library, or full source
▪ The SensiML Workflow: Data-Driven Rapid Model Creation
▪ No Data Science or AI Expertise Required, Prototype Model Testing Without Coding
28
SAMD21 ML Kit – Anatomy of the on-board IMU Sensor
• An inertial measurement unit (IMU) is a system of sensors, here an integrated accelerometer and gyroscope, that relays information about a device’s movement.
• Accelerometer. Measures linear acceleration along the X, Y and Z axes, from which changes in velocity and orientation relative to gravity can be derived. The accelerometer is the device in tablets and smartphones that keeps the on-screen image upright regardless of how the device is held; it captures the three linear axes of motion.
• Gyroscope. Measures rotational velocity and, from it, changes in orientation. Microelectromechanical gyroscopes, often called gyrometers, are present in many consumer electronics such as gaming controllers. A gyroscope provides information about rotation about the X (roll), Y (pitch) and Z (yaw) axes.
29
This exercise requires the following hardware and software
▪ Windows 10 PC
▪ Microchip Technology
• Hardware
• SAMD21 Machine Learning Evaluation Kit with TDK ICM42688 IMU (EV18H79A)
• SAMD21 Machine Learning Evaluation Kit with Bosch BMI160 IMU (EV45Y33A)
• micro-USB cable (more than 1m length is recommended)
• Software
• MPLAB® X IDE (https://microchip.com/mplab/mplab-x-ide)
• MPLAB® XC32 compiler (https://microchip.com/mplab/compilers)
• MPLAB® Harmony 3 (https://www.microchip.com/harmony)
• Design Asset
• https://github.com/MicrochipTech/ml-samd21-iot-sensiml-gestures-demo/releases/tag/v0.2 (ml-samd21-iot-sensiml-gestures-demo.zip)
▪ SensiML
• Create Your Free Account of SensiML Analytics Toolkit
“Community Edition” (https://sensiml.com/plans/community-edition/)
• SensiML Data Capture Lab for Windows 10 (https://sensiml.com/download/)
• SensiML Open-Gateway (https://github.com/sensiml/open-gateway)
• Note: On Windows, use Python 3.7 or 3.8.
• Data Asset (SAMD21 + TDK ICM42688 IMU) (MCHP_HO)
30
SensiML Licensing Options
1. ROI benefit can be calculated at https://sensiml.com/plans/#
2. Evaluation code is limited to 1000 inference results per embedded device power cycle
3. Risk-free Source Code Option: Purchase source code only after model is validated.
4. Additional users can be enabled at $99/user-month
5. MVGO = Motion, Vibration, Gesture, and Other sub-10kHz sensors
6. Audio = 10kHz-20kHz sensors; Ultra-High Rate (UHR) = >20 kHz sensors
7. Premier support is offered in multiple tiers for direct toolkit support and consultation
Plan Editions: Compare Options SensiML | Plans
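The sensor-rate tiers from notes 5 and 6 can be captured in a small helper; the function name is illustrative, not part of any SensiML API:

```python
def sensor_tier(sample_rate_hz: float) -> str:
    """Map a sensor sample rate to the plan tiers from notes 5-6."""
    if sample_rate_hz < 10_000:
        return "MVGO"        # Motion, Vibration, Gesture, and Other
    elif sample_rate_hz <= 20_000:
        return "Audio"
    return "UHR"             # Ultra-High Rate
```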
31
Create Your Free Account of SensiML Analytics Toolkit
▪ Go to https://sensiml.com/plans/community-edition/
▪ Fill in your information and check the “Terms & Conditions” box (after reading the Terms & Conditions, of course)
▪ Then click “Create My Account”
▪ You will receive a confirmation email from SensiML.
32
SensiML Data Capture Lab for Windows 10
▪ Download site:
• https://sensiml.com/download/
▪ SensiML Data Capture Lab can be downloaded from
• https://sensiml.cloud/downloads/SensiML_DataCaptureLab_Setup.exe
▪ Install and launch Data Capture Lab, then log in with your account.
33
SensiML Open-Gateway
Cloud-Based Tool via Web Browser – Forwards Sensor Data Capture or Inference
Recognition Results via UART to SensiML Data Capture Lab
▪ You need to install Python 3.7 or 3.8.
• https://www.python.org/downloads/windows/
▪ The SensiML Open-Gateway installation instructions are described on SensiML’s GitHub
• https://github.com/sensiml/open-gateway
▪ An installer for the Windows application is available (recommended) – SensiML_OpenGateway_Setup.exe
• https://github.com/sensiml/open-gateway/releases/tag/v2022.3.3.0
▪ Once you confirm that SensiML Open-Gateway runs in your environment, the preparation is finished.
34
Exercise: Introducing SensiML Toolkit Endpoint AI Workflow
1. Write the data-collection firmware to the SAMD21 with MPLAB X and connect it to the DCL.
2. Import the example data set into the DCL project.
3. Collect and label additional data.
4. Generate the algorithm with Analytics Studio.
5. Test the model using the Data Capture Lab.
6. Compile the downloaded Knowledge Pack library with MPLAB X and write it to the SAMD21.
7. Verify gesture recognition on the device using Open-Gateway.
35
SAMD21 Setup – 1 (Option #1)
1. Download the ZIP file from https://github.com/MicrochipTech/ml-samd21-iot-sensiml-gestures-demo/releases/download/v0.2/ml-samd21-iot-sensiml-gestures-demo.zip
2. Plug your SAMD21 evaluation kit into your PC via USB. The SAMD21 should
automatically come up as a USB Flash drive.
3. Open the ml-samd21-iot-sensiml-gestures-demo.zip archive
downloaded previously and locate the gesture classifier demo HEX file
corresponding to your sensor make:
1. Bosch IMU: binaries/samd21-iot-sensiml-gestures-demo_bmi160.hex
2. TDK IMU: binaries/samd21-iot-sensiml-gestures-demo_icm42688.hex
4. Drag and drop the HEX file onto the SAMD21 USB drive to program the device.
37
SAMD21 Setup – 1 (Option 2)
1. Download ZIP file from https://github.com/MicrochipTech/ml-samd21-iot-imu-data-logger
2. Plug your SAMD21 evaluation kit into your PC via USB.
3. Extract ZIP file and open the firmware/samd21_iot_imu.X project folder in MPLAB X.
4. Select the SAMD21_IOT_WG_ICM42688 Project Configuration in MPLAB X below.
5. Select SensiML Simple Stream by setting the DATA_STREAMER_FORMAT macro in firmware/src/app_config.h to
#define DATA_STREAMER_FORMAT DATA_STREAMER_FORMAT_SMLSS (line 63).
6. Once you're satisfied with your configuration, click the Make and Program Device button in the toolbar (see image below for reference). (Note: the first time, you will be asked to select a debugger.)
38
7. Using terminal software such as Tera Term or PuTTY, confirm whether the firmware was programmed successfully.
• Baud Rate 115200
• Data bits 8
• Stop bits 1
• Parity None
SAMD21 Setup – 2
39
Data Capture Lab (DCL) Setup – 1
1. Open up DCL and create a new project from “New project” button.
2. Specify “Location” and “Name”.
3. Switch to “Capture” mode by clicking “Switch Modes” button.
40
Data Capture Lab (DCL) Setup – 2
4. In capture mode, click “Connect” under “Sensor” at the bottom; the “Sensor Configuration” dialog pops up. -> Click “Next”
5. In the “Select a Device Plugin” window select the SAMD21 ML Eval Kit item. -> Click “Next”
6. After selecting the device plugin, “Plugin Details” will appear; skip this by clicking “Next” to move forward to
“Sensor Properties”. On the properties page, select “Motion (ICM-42688-P)”, then click “Next”.
41
Data Capture Lab (DCL) Setup – 3
7. Give a name to the sensor configuration in “Sensor Configuration”. Click “Save” to save the sensor configuration to
your project.
• If you plan on trying different configurations for your application, it's a good idea to include a summary of the sensor
configuration in the name, for example, bmi160-100hz-16g-2000dps.
8. In “SAMD21 ML Eval Kit”, click the three-dot menu, click “Connection Setting”, click “Scan”, select the SAMD21’s COM port, click “Done” and then click “Connect”.
9. The SAMD21 ML Eval Kit should now be streaming to the DCL. (See next page)
• Check out SensiML's documentation to learn more about how to use the DCL for capturing and annotating data.
42
Data Capture Lab (DCL) Setup – 4
43
Exercise: Introducing SensiML Toolkit Endpoint AI Workflow
1. Write the data-collection firmware to the SAMD21 with MPLAB X and connect it to the DCL.
2. Import the example data set into the DCL project.
3. Collect and label additional data.
4. Generate the algorithm with Analytics Studio.
5. Test the model using the Data Capture Lab.
6. Compile the downloaded Knowledge Pack library with MPLAB X and write it to the SAMD21.
7. Verify gesture recognition on the device using Open-Gateway.
44
Import the example data set into the DCL project – 1
1. Extract the MCHP_HO_SAMD21_TDK.zip*1 archive containing the gestures data set into a working directory.
MCHP_HO_SAMD21_TDK has captured and labeled dataset in csv format.
2. With your created project open, navigate to the File menu and click the “Import from DCLI…” item as shown below
3. Import “MCHP_HO_SAMD21_TDK.dcli”. Select the dcli file and click “Open”.
4. Click “Next” on both the Import and Sensor Columns screens
*1: Download ZIP file from the embedded link.
45
Import the example data set into the DCL project – 2
5. Click “Done” on Rename Sensor Columns and “Next” on Import Settings
6. Select “SAMD21_TDK” and click “Select”.
7. When the file import completes, a green “Import complete” indicator appears at the upper right.
8. Click “Project Explorer” to see the imported files below.
9. The dataset is uploaded to your SensiML cloud server automatically.
46
Import the example data set into the DCL project – 3
9. The imported data sets were taken from the SAMD21 with the TDK IMU:
• On_Desk: SAMD21 kit placed on the desk
• Idle_Position: held vertically by hand, not moving
• Back_and_Forth: held vertically by hand, moved back and forth
• Left_and_Right: held vertically by hand, moved left and right
• Up_and_Down: held vertically by hand, moved up and down
• Circle: held vertically by hand, moved in a circle, clockwise
10. Double click to open an imported file; the captured waveform, label and metadata information can then be confirmed and modified. -> See next page.
47
Import the example data set into the DCL project – 4
48
Exercise: Introducing SensiML Toolkit Endpoint AI Workflow
1. Write the data-collection firmware to the SAMD21 with MPLAB X and connect it to the DCL.
2. Import the example data set into the DCL project.
3. Collect and label additional data. [Optional - not required for Steps 4-7]
4. Generate the algorithm with Analytics Studio.
5. Test the model using the Data Capture Lab.
6. Compile the downloaded Knowledge Pack library with MPLAB X and write it to the SAMD21.
7. Verify gesture recognition on the device using Open-Gateway.
49
1. Click “Switch Modes”, click “Capture”, then click “Connect” to reconnect the DCL and the SAMD21
2. Modify your “File Metadata” from “+ Add Metadata”
3. Double click “Date” in “Project Properties”, click the icon, input your new value (220311), click “Done”.
4. The new value now appears in the “Date” dropdown menu in “File Metadata”.
Additional data collection and labeling – 1
50
5. Click the icon below, set up your file name in “File Name Template”, then click “Save”.
For example, <Label> <Date> <Subject> are checked, and the separator can be “Space”, “Underscore” or “Dash”.
6. Click “Capture Settings”, check and specify “Max Record Time”, then click “Save”.
For example, Max Record Time is 15 seconds below.
7. Click “Start Recording”, record your gesture, then confirm and save in “Save Confirmation”.
Additional data collection and labeling – 2
51
8. Open new recorded file “Circle_2022-02-07_ichiro” from “Project Explorer”
Additional data collection and labeling – 3
52
9. Open new recorded file “Circle_2022-02-07_ichiro” from “Project Explorer”
10. Attach “Label” in “Segments” tab of “Data Capture Properties”.
a. Right click on the waveform window; blue (start) and red (end) lines appear.
b. Adjust the positions of the blue and red lines by left-clicking and dragging to specify the “Start and Length”.
c. Right click and select “Edit” at “Segments” tab of “File Properties”.
Additional data collection and labeling – 4
53
d. Select “Label” from “Select Labels” -> Click “Done” and confirm the updated “Label”.
e. Click “Save Changes” at the upper-left of DCL.
f. The saved dataset is updated automatically and synchronized with the SensiML cloud server.
g. Repeat from “Begin Recording”, updating “Label”, until you have collected the expected amount of data.
11. For more details about the DCL, see “Capturing Sensor Data”, “Labeling Your Data” and “Other Useful Features”:
• https://sensiml.com/documentation/guides/getting-started/capturing-sensor-data.html
• https://sensiml.com/documentation/guides/getting-started/labeling-your-data.html
• https://sensiml.com/documentation/guides/getting-started/other-useful-features.html
Additional data collection and labeling – 5
54
Exercise: Introducing SensiML Toolkit Endpoint AI Workflow
1. Write the data-collection firmware to the SAMD21 with MPLAB X and connect it to the DCL.
2. Import the example data set into the DCL project.
3. Collect and label additional data.
4. Generate the algorithm with Analytics Studio.
5. Test the model using the Data Capture Lab.
6. Compile the downloaded Knowledge Pack library with MPLAB X and write it to the SAMD21.
7. Verify gesture recognition on the device using Open-Gateway.
55
Generate the algorithm by Analytics Studio – Log-in
1. Log-in to your SensiML account via web browser (https://app.sensiml.cloud/auth/login/)
56
Generate the algorithm by Analytics Studio –
Home -> Open Project
2. Open your project from “Open Project” by clicking the icon at the left of the project name.
57
Generate the algorithm by Analytics Studio – Project Summary
3. The project summary page gives you an overview of your project. Each tab also provides more information about
PROJECT DESCRIPTION, CAPTURES, QUERIES, PIPELINES, and KNOWLEDGE PACKS of your project.
4. Open “Prepare Data” screen to create a query.
58
Generate the algorithm by Analytics Studio – Prepare Data
5. Querying Data
The query is used to select your sensor data from your project. If you need to filter out certain parts of your
sensor data based on metadata or labels, you can specify that here.
a) Query: MCHP_HO, (specify your unique name)
b) Session: Session_1, (session name was specified in DCL)
c) Label: Label, (you could select labels if you made other label(s) in DCL on same data)
d) Metadata: segment_uuid, For
(Differentiate the subset of captures that you want to work with for modeling)
e) Source: GyroscopeX, GyroscopeY, GyroscopeZ, AccelerometerX, AccelerometerY, AccelerometerZ
(You can select which sensor data to use for modeling)
f) Query Filter: [For] in [train]
(File Metadata “train” will be used for modeling)
g) Plot: Segment
(Segment or Samples could be indicated)
6. Click “SAVE”
7. Click “Build Model” from the left menu
59
Generate the algorithm by Analytics Studio – Build Model
6. Building Model
a) Click “BUILD MODEL”, add “Pipeline name” in “Create New Pipeline”, click “BUILD”.
b) Confirm “Query *” in “Input Query” and click “SAVE”, set “200” in “Window Size” and click “SAVE”.
c) Once you've entered the pipeline settings, click the “OPTIMIZE” button. This step will use AutoML techniques to
automatically select the best features and machine learning algorithm for the gesture classification task given your input
data. This process will usually take several minutes.
d) Once the Build Model optimization step is completed, confirm “AutoML Results” and navigate to “Explore Model”.
-> Next Page
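The “Window Size” of 200 entered above means the pipeline operates on segments of 200 samples. A minimal numpy sketch of non-overlapping windowing over the 6 IMU channels (SensiML’s actual segmenter may use overlap or different parameters):

```python
import numpy as np

def sliding_windows(samples: np.ndarray, window_size: int = 200) -> np.ndarray:
    """Split a (N, channels) sample stream into non-overlapping windows.

    Window size 200 matches the value entered in the pipeline settings;
    this non-overlapping version is only an illustration.
    """
    n_windows = len(samples) // window_size
    return samples[: n_windows * window_size].reshape(
        n_windows, window_size, samples.shape[1]
    )

stream = np.zeros((1050, 6))          # 6 IMU channels (3 gyro + 3 accel)
windows = sliding_windows(stream)     # 5 complete windows of 200 samples
```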
60
Generate the algorithm by Analytics Studio – Explore Model
7. In “Explore Model”, you can get more information about the models that were generated from “Build Model”.
https://sensiml.com/documentation/guides/getting-started/exploring-model-details.html
8. Once “Explore Model” is checked, navigate to “Test Model”.
61
Generate the algorithm by Analytics Studio – Test Model
9. Select the pipeline that we created in the previous step. Select one of the models generated in the previous step.
10. This exercise uses “MCHP_HO_rank_4”.
11. Select the upside-down triangle icon in “For” column and select “Test” to filter the data so only the test samples
are selected.
12. Select the check box to select the test samples you want to test and click “COMPUTE SUMMARY”.
13. Click “RESULTS” to confirm the details of simulation result. -> Next page
62
Generate the algorithm by Analytics Studio – Test Model
14. Once completed, you will be presented with a table like the one shown below summarizing the classification results.
63
Generate the algorithm by Analytics Studio – Download Model 1
15. Finally, navigate to “Download Model” tab to download your model.
16. Select “Microchip SAMD21 ML Eval Kit” in “HW Platform” and “Library” in “Format”. Click “DOWNLOAD”.
17. The right side of this window shows Knowledge Pack information, including “Device Profile Information” with “Estimated Memory Size” and “Estimated Latency”.
64
Generate the algorithm by Analytics Studio – Download Model 2
18. Once “Downloading Knowledge Pack, please wait ...” completes, click “OK” to save the file.
65
Generate the algorithm by Analytics Studio – Memory Usage
19. Device Profile Information
Estimated Memory Usage
SRAM Used:
The number of bytes of RAM taken up by the model on a device
Stack Size:
The estimated worst-case stack usage in bytes of the entire model on a device
Flash Used:
The number of bytes taken up by the model on a device
Estimated Latency
Feature Extraction Latency:
The estimated number of clock cycles (and corresponding time) the model will spend in feature generation.
Classifier Latency:
The estimated number of clock cycles (and corresponding time) the model will spend in the classifier algorithm.
Total Latency:
The estimated total number of clock cycles (and corresponding time) the model will take to operate on a segment of data.
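These cycle estimates convert to wall-clock time by dividing by the core clock. A sketch assuming the SAMD21’s 48 MHz Cortex-M0+ core clock (adjust the constant for your device; the example cycle count is hypothetical):

```python
def cycles_to_ms(cycles: int, clock_hz: float = 48_000_000) -> float:
    """Convert a cycle count to milliseconds at the given core clock."""
    return cycles / clock_hz * 1000.0

# e.g. a hypothetical 240,000-cycle feature-extraction estimate:
latency_ms = cycles_to_ms(240_000)   # 5.0 ms at 48 MHz
```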
66
Exercise: Introducing SensiML Toolkit Endpoint AI Workflow
1. Write the data-collection firmware to the SAMD21 with MPLAB X and connect it to the DCL.
2. Import the example data set into the DCL project.
3. Collect and label additional data.
4. Generate the algorithm with Analytics Studio.
5. Test the model using the Data Capture Lab.
6. Compile the downloaded Knowledge Pack library with MPLAB X and write it to the SAMD21.
7. Verify gesture recognition on the device using Open-Gateway.
67
Testing a Model Using the Data Capture Lab – 1
1. The Data Capture Lab has two ways to test a model on your dataset:
Running a Model During Data Collection: Connect to a model during data collection and get the model results in real-time
Running a Model in the Project Explorer: Run a model on any previously collected CSV or WAV files in your project
2. This lets you see how your model will perform on real data before flashing it to a device.
3. Running a Model During Data Collection -> Next Page
a) Switch to Capture mode
b) Connect to your device
c) Open the Test Model panel and click Connect
d) Select a Knowledge Pack
e) Select a Session. This is where the Knowledge Pack results will be saved
f) Connect to the Knowledge Pack
g) You will now see your model results in real time, overlaid on the captured waveform
4. (Optional) You can click Start Recording and the Data Capture Lab will save the Knowledge Pack results to your
project. This lets you quickly add additional training data to your project.
5. (Optional) In the Save Confirmation screen you can edit or delete the Knowledge Pack results before saving the
results to your project.
68
Testing a Model Using the Data Capture Lab – 2
69
Testing a Model Using the Data Capture Lab – 3
70
Exercise: Introducing SensiML Toolkit Endpoint AI Workflow
1. Write the data-collection firmware to the SAMD21 with MPLAB X and connect it to the DCL.
2. Import the example data set into the DCL project.
3. Collect and label additional data.
4. Generate the algorithm with Analytics Studio.
5. Test the model using the Data Capture Lab.
6. Compile the downloaded Knowledge Pack library with MPLAB X and write it to the SAMD21.
7. Verify gesture recognition on the device using Open-Gateway.
71
Knowledge Pack Integration – 1
1. If you deployed your knowledge pack in the Library format, the archive should contain a complete, ready-to-compile MPLAB X project. Follow the steps below to compile your project:
a) Unzip the downloaded zip file into your working folder.
b) In MPLAB X, open “samd21-iot-sensiml-template.X” project folder under the firmware folder of the knowledge pack.
c) Select the Project Configuration option in the MPLAB X toolbar according to which sensor you're using. -> ICM42688
d) Select SensiML Simple Stream by setting the DATA_STREAMER_FORMAT macro in
Header Files/knowledgepack/knowledgepack_project/app_config.h to
#define DATA_STREAMER_FORMAT DATA_STREAMER_FORMAT_SMLSS (line 63).
e) Your project should now be ready to “Make and Program Device (samd21-iot-sensiml-template)”.
f) Connect your SAMD21 Eval Kit via USB cable and click “Make and Program Device”
72
Knowledge Pack Integration – 2
2. Firmware Operation
The firmware behavior can be summarized as operating in one of three distinct states as reflected by the onboard
LEDs and described in the table below:
3. When operating normally, the firmware prints the classification prediction (classification ID number) and the
generated feature vector for each sample window over the UART port. To read the UART port use a terminal
emulator of your choice (e.g., MPLAB Data Visualizer's integrated terminal tool) with the following settings:
• Baud Rate 115200
• Data bits 8
• Stop bits 1
• Parity None
Status / LED behavior / Description:
• Error: Red (ERROR) LED lit. Fatal error. (Do you have the correct sensor plugged in?)
• Buffer Overflow: Yellow (DATA) and Red (ERROR) LEDs lit for 5 seconds. Processing cannot keep up with real time; the data buffer has been reset.
• Running: Yellow (DATA) LED flashing slowly. Firmware is running normally.
1. Back_and_Forth
2. Circle
3. Idle_Position
4. Left_and_Right
5. On_Desk
6. Up_and_Down
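When parsing the UART output on the host side, the classification IDs above can be mapped back to gesture names; verify the IDs against your own generated knowledge pack, since SensiML assigns them per model:

```python
# Class IDs as listed on the slide; confirm against your knowledge pack,
# since SensiML assigns IDs per generated model.
GESTURE_LABELS = {
    1: "Back_and_Forth",
    2: "Circle",
    3: "Idle_Position",
    4: "Left_and_Right",
    5: "On_Desk",
    6: "Up_and_Down",
}

def label_for(class_id: int) -> str:
    """Return the gesture name, or 'Unknown' for unmapped IDs."""
    return GESTURE_LABELS.get(class_id, "Unknown")
```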
73
Additional Information
• www.microchip.com/ml
• https://github.com/orgs/MicrochipTech/repositories?q=ML&type=all&language=&sort=
74
Thank you!
Webinar: Começando seus trabalhos com Machine Learning utilizando ferramentas e demoboards da Microchip

  • 2. A Leading Provider of Smart, Connected and Secure Embedded Control Solutions November 2022 Rodrigo Britto Starting with Microchip Machine Learning
  • 3. 2 MPLAB® X IDE • Free download, simple installation • Rich debug and programming environment for all PIC® and dsPIC® devices • Runs on Windows®, Linux® and MAC® OS • Software development kit (SDK) for custom automated testing, manufacturing frameworks • Strong ecosystem of plugins for many custom features; visualization, code analysis, etc. • Supports all Microchip, many third party compilers • Supports code generation for software libraries • Complete simulation engine for most devices
  • 4. 3 MPLAB® XC C Compilers • Three compiler types to chose from • MPLAB XC8 = 8-bit PIC® devices • MPLAB XC16 = 16-bit PIC and dsPIC® devices • MPLAB XC32 = 32-bit PIC devices • Compatible everywhere • Windows®, Linux® and macOS® • Optimization levels • Free and PRO • 70% of available optimizations in Free version • Continuing improvements to Free and PRO • Most flexible licensing in the industry • Workstation, network server, site, subscription and dongle licenses
  • 5. 4 MPLAB® Code Configurator (MCC) FREE easy-to-use graphical programming tool:  Easily configure and use peripherals  Generates efficient C code for your project  Supports 8- and 16-bit devices  Minimizes reliance on datasheet www.microchip.com/mcc
  • 6. 5 MPLAB® Harmony • Modular Software Framework • A Graphical User Interface (GUI) tool that takes the guesswork out of configuring drivers and middleware. • Direct resale by Microchip for third-party libraries • Microchip provides first line of support • MPLAB Harmony components • Third-party solutions • Comprehensive web portal • Compatible with 32-bit PIC® MCUs
  • 7. 6 Microchip Debuggers microchip.com/devtools * MPLAB ICD 4 requires a hardware revision to support AVR devices that use Debug Wire protocol
  • 8. 7 Machine Learning Demos  TensorFlow Lite for Microcontroller Demos with Harmony 3.0 Plugin (Training with your dataset in Google Colab, Converting & Deploying a Model, Using the TF-Lite Runtime Engine for Inference)  Hello World – a simple TF model predicting the value of a sine function  Digit Recognition – identification of digits 0 to 9 written on a touchscreen  Edge Impulse Speech Recognition using with SAM E54 Curiosity Ultra  Keyword Spotting (also called “Micro Speech” in TensorFlow Lite Demos)  SensiML Tools Workflow with the SAMD21 ML Evaluation Kit  Gesture Recognition with an IMU (gyro/accelerometer) *** Note: Click on any of the hyperlinks above to quickly navigate to the corresponding lab exercise section
  • 9. 8 Eval Kits for Machine Learning Development • SAMD21 ML Evaluation Kit with TDK 6-axis MEMS (Part # EV18H79A) • https://www.microchip.com/en-us/development-tool/EV18H79A • SAM E54 Curiosity Ultra Development Board (Part Number: DM320210) • https://www.microchip.com/en-us/development-tool/DM320210 • SAM E70 Xplained Ultra Development Board (Part Number: DM320113) • https://www.microchip.com/en-us/development-tool/DM320113 • PIC32 Audio Codec Daughter Card (Part # AC328904) • https://www.microchip.com/en-us/development-tool/AC328904 • Stereo 3.5 mm microphone (for speech input) • SAM E51 Integrated Graphics and Touch Development Board (Part # EV14C17A) • https://www.microchip.com/en-us/development-tool/EV14C17A
  • 10. 9 H3 TF-Lite “Hello World” Using TensorFlow Lite for Microcontrollers and Harmony 3 A Simple Model for Prediction of a Sine of a Number Return to Lab Exercises List
  • 11. 10 What is TensorFlow Lite for Microcontrollers? Training Workflow • TensorFlow Lite is an open-source, production-ready, cross-platform framework for deploying ML on mobile devices and embedded systems • Compatible with the TensorFlow training environment • Designed to run Google ML models on microcontrollers with only a few KB of memory • Built to fit on embedded systems • Very small binary footprint: optimized for Arm Cortex-M, the interpreter/runtime engine takes approx. 16 KB on the M3/M4 • No dynamic memory allocation • No dependencies on complex parts of the standard C/C++ libraries • No operating system dependencies; can run on bare metal • Designed to be portable across a wide variety of systems • Google Colab (https://colab.research.google.com/) • Google offers a free cloud service with free GPU access, providing a free Jupyter notebook environment called Colaboratory, or “Colab” • In Colab we can write and execute Python code to train and deploy a model for TensorFlow Lite Micro
  • 12. 11 Harmony 3 Component for TensorFlow Lite Micro.  TFLM component • Adds the required source files for the TensorFlow Lite for Microcontrollers runtime engine • Provides an option to use the optimized CMSIS-NN kernels for Cortex-M MCUs • Provides an option to use the example audio front end required for the micro speech application  Data Log  Debug Log function to print out error messages from the TF-Lite for uC runtime engine  Tools and Packages Required from H3 Repo  MCC plugin version 5.1.2, MPLAB X IDE v6.0, csp v3.10.0, core v3.10.0, bsp v3.10.0, dev_packs v3.10.0, CMSIS-FreeRTOS v10.3.1, audio v3.5.1, gfx v3.9.5, touch v3.1, tflite-micro-apps  Public Harmony 3 Repository for TFLM - https://github.com/Microchip-MPLAB-Harmony/tflite-micro-apps
  • 13. 12 Hello World using TensorFlow Lite and Harmony 3 ▪ This tutorial shows how to create and train a 2.5 kB model that predicts the value of a sine function. The model accepts input values between 0 and 2π and outputs a single value between -1 and 1. That is, the model takes an input value, x, and predicts its sine, y = sin(x), where x is our input and y is the output of the model. ▪ Hardware Required ▪ SAM E51 IGAT Board; or SAM E70 Xplained Ultra Eval Kit with SSD1963 LCD Controller Graphics Card and High-Performance WQVGA Display Module ▪ Micro USB Cable to connect the Debug USB Port to the computer (Figure: model input feeding a 3-layer, fully connected neural network.)
  • 14. 13 Building and Running the Application - 1 ▪ Downloading and building the application project ▪ The path of the application within the repository is apps/hello_world/firmware ▪ To create and train a new model using Google Colaboratory and integrate it into your MPLAB X project, click HERE to access and clone the Colab “Hello World” script. Select Copy To Drive to duplicate and run the script from your own Google Drive. ▪ Select Runtime > Run all from the top menu to execute the entire script in one step, or execute each cell individually.
  • 15. 14 Building and Running the Application - 2 ▪ After completing Run all, we can download the models directory in Colab as shown on the right ▪ The models directory includes the 3 model files: model.pb, model.tflite and model.cc ▪ Copy the contents of the model.cc file from the models directory into model.cpp in your MPLAB X project. Specifically, copy and paste the g_model[] array and g_model_len variable declarations into the model.cpp file defined in your MPLAB X project to update the model. ▪ Rebuild and Run the project. The sine wave will be displayed on the screen.
  • 16. 15 Creating a Sine Model – Step By Step In Colab - 1 ▪ STEP 1 – Install TensorFlow 2.4.0 ▪ STEP 2 – Import necessary Dependencies
  • 17. 16 Creating a Sine Model – Step By Step In Colab - 2 ▪ STEP 3 – Generate Dataset of random Sine values and plot the results
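The dataset-generation step can be sketched in a few lines of NumPy. This mirrors the public TFLite Micro “Hello World” Colab; the sample count, seed, and variable names are assumptions, not copied from the script:

```python
import numpy as np

SAMPLES = 1000
np.random.seed(1337)  # fixed seed so results are reproducible

# Uniformly distributed x values in [0, 2*pi), shuffled so that the
# later train/validation/test splits are not ordered by x.
x_values = np.random.uniform(low=0, high=2 * np.pi, size=SAMPLES)
np.random.shuffle(x_values)

# The "ideal" labels: y = sin(x)
y_values = np.sin(x_values)
```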
  • 18. 17 Creating a Sine Model – Step By Step In Colab - 3 ▪ STEP 4 – Add Gaussian noise to the output to generate a more realistic (real-world) model
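The noise step looks roughly like this (the data is re-created so the snippet stands alone; the 0.1 standard deviation is an assumption):

```python
import numpy as np

np.random.seed(1337)
x_values = np.random.uniform(low=0, high=2 * np.pi, size=1000)
y_values = np.sin(x_values)

# Add a small amount of Gaussian noise to each label so the data
# resembles a real-world (imperfect) measurement.
y_values += 0.1 * np.random.randn(*y_values.shape)
```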
  • 19. 18 Creating a Sine Model – Step By Step In Colab - 4 ▪ STEP 5 – Split apart the data for training and plot the results ▪ Training: 60% ▪ Validation: 20% ▪ Testing: 20%
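The 60/20/20 split can be done with np.split at the 60% and 80% boundaries (a self-contained sketch; variable names are illustrative):

```python
import numpy as np

SAMPLES = 1000
x_values = np.random.uniform(0, 2 * np.pi, SAMPLES)
y_values = np.sin(x_values)

# 60% training / 20% validation / 20% test.
TRAIN_SPLIT = int(0.6 * SAMPLES)
TEST_SPLIT = int(0.2 * SAMPLES + TRAIN_SPLIT)

x_train, x_validate, x_test = np.split(x_values, [TRAIN_SPLIT, TEST_SPLIT])
y_train, y_validate, y_test = np.split(y_values, [TRAIN_SPLIT, TEST_SPLIT])

# Sanity check: no samples were lost in the split.
assert len(x_train) + len(x_validate) + len(x_test) == SAMPLES
```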
  • 20. 19 Creating a Sine Model – Step By Step In Colab - 5 ▪ STEP 6 – Designing A Larger Model (adding a layer of 16 neurons) to improve the performance of the model
  • 21. 20 Creating a Sine Model – Step By Step In Colab - 6 ▪ STEP 7 – Train and Validate the Model, with 500 Epochs
  • 22. 21 Creating a Sine Model – Step By Step In Colab - 7 ▪ STEP 8 – Generate a TensorFlow Lite Model with or without 8-bit Quantization
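Steps 6 and 8 together can be sketched as below. This is a hedged approximation, not the Colab's exact cells: the layer sizes follow the slides (two hidden layers of 16 neurons), training (step 7) is elided for brevity, and the representative dataset is the standard way to calibrate 8-bit quantization with the tf.lite converter:

```python
import numpy as np
import tensorflow as tf

# A small fully connected regression model: 1 input -> 16 -> 16 -> 1 output.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
# (Training with model.fit(..., epochs=500) would happen here.)

# A representative dataset lets the converter calibrate 8-bit quantization.
def representative_dataset():
    for x in np.random.uniform(0, 2 * np.pi, 100).astype(np.float32):
        yield [np.array([[x]], dtype=np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
tflite_model = converter.convert()  # bytes of the .tflite flatbuffer
```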
  • 23. 22 Creating a Sine Model – Step By Step In Colab - 8 ▪ STEP 9 – Generate a TensorFlow Lite for Microcontrollers C++ Model Source File
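Step 9 is essentially `xxd -i` applied to the .tflite flatbuffer. A minimal pure-Python equivalent (the function name and the `alignas(8)` decoration are illustrative, not taken from the Colab):

```python
def bytes_to_c_source(data: bytes, name: str = "g_model") -> str:
    """Render a byte string as a C array, similar to `xxd -i`."""
    lines = []
    for i in range(0, len(data), 12):  # 12 bytes per output line
        chunk = data[i:i + 12]
        lines.append("  " + ", ".join(f"0x{b:02x}" for b in chunk) + ",")
    body = "\n".join(lines)
    return (
        f"alignas(8) const unsigned char {name}[] = {{\n"
        f"{body}\n}};\n"
        f"const int {name}_len = {len(data)};\n"
    )

# Example on a dummy 8-byte payload (a real call would pass the
# contents of model.tflite).
source = bytes_to_c_source(b"\x1c\x00\x00\x00TFL3")
```

Running this on a real model.tflite yields the g_model[] array and g_model_len declarations referenced in the deployment steps.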
  • 24. 23 Deploy the Model – Copy to MPLAB-X Project ▪ STEP 10 – DEPLOY THE MODEL. Copy the contents of the model.cc file from the models directory into model.cpp in your MPLAB X project. Specifically, copy and paste the g_model[] array and g_model_len variable declarations into the model.cpp file defined in your MPLAB X project to update the model.
  • 25. 24 Deploy the Model – Build Project and Run ▪ STEP 11 - Rebuild and Run the project. The sine wave will be displayed on the screen.
  • 26. 25 SensiML ML Workflow Gesture Recognition Using SAMD21 ML Evaluation Kit Return to Lab Exercises List
  • 27. 26 The SensiML Workflow: Data-Driven Rapid Model Creation No Data Science or AI Expertise Required, Prototype Model Testing Without Coding ▪ Data Capture Lab – capture and annotate data. Time: hours to weeks (depending on application data collection complexity). Skill: domain expertise (as required to collect and label events of interest). ▪ Analytics Studio – build, train and validate models. Time: minutes to hours (depending on degree of model control exerted). Skill: none (full AutoML), basic ML concepts (advanced UI tuning), or Python programming (full pipeline control). ▪ SensiML Knowledge Pack – embedded inference engine and test validation application. Time: minutes to weeks (depending on app code integration needs). Skill: none (binary firmware with auto-generated I/O wrapper code) or embedded programming (integration of SensiML library or C source with user code).
  • 28. 27 SensiML Edge AI Tools Workflow for Microchip Platforms ▪ The SensiML Workflow: Data-Driven Rapid Model Creation ▪ No Data Science or AI Expertise Required, Prototype Model Testing Without Coding (Workflow diagram: Sensors on the SAMD21 → Label Data in the Data Capture Lab → Build Model in Analytics Studio → Generate Code and Test via the Knowledge Pack: ready-to-run binary, linkable library, or full source map.)
  • 29. 28 SAMD21 ML Kit – Anatomy of the on-board IMU Sensor • An inertial measurement unit (IMU) is a system composed of sensors that relay information about a device's movement, with an integrated accelerometer and gyroscope. • Accelerometer: measures changes in velocity (acceleration) and position (velocity), as well as absolute orientation. The accelerometer is the device in tablets and smartphones that ensures the on-screen image remains upright regardless of orientation. By itself, the accelerometer provides information about the linear X-, Y-, and Z- directions, allowing the 3 axes of motion to be captured. • Gyroscope: measures changes in orientation (rotation) and rotational velocity. Microelectromechanical gyroscopes, often called gyrometers, are present in many consumer electronics such as gaming controllers. A gyroscope provides information about the rotational X- (roll), Y- (pitch), and Z- (yaw) directions.
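As a small illustration (not from the slides) of what the accelerometer alone can and cannot recover: static roll and pitch follow from the gravity vector, while yaw needs the gyroscope:

```python
import math

def roll_pitch_from_accel(ax: float, ay: float, az: float):
    """Estimate static roll and pitch (degrees) from a 3-axis
    accelerometer reading, using gravity as the reference. Yaw cannot
    be recovered from the accelerometer alone -- that is what the
    gyroscope (and/or a magnetometer) is for."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return math.degrees(roll), math.degrees(pitch)

# Board lying flat: gravity entirely on the Z axis -> roll = pitch = 0.
roll, pitch = roll_pitch_from_accel(0.0, 0.0, 1.0)
```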
  • 30. 29 This exercise requires the following hardware and software ▪ Windows 10 PC ▪ Microchip Technology • Hardware • SAMD21 Machine Learning Evaluation Kit with TDK ICM42688 IMU (EV18H79A) • SAMD21 Machine Learning Evaluation Kit with Bosch BMI160 IMU (EV45Y33A) • micro-USB cable (more than 1 m length is recommended) • Software • MPLAB® X IDE (https://microchip.com/mplab/mplab-x-ide) • MPLAB® XC32 compiler (https://microchip.com/mplab/compilers) • MPLAB® Harmony 3 (https://www.microchip.com/harmony) • Design Asset • https://github.com/MicrochipTech/ml-samd21-iot-sensiml-gestures-demo/releases/tag/v0.2 (ml-samd21-iot-sensiml-gestures-demo.zip) ▪ SensiML • Create your free account for the SensiML Analytics Toolkit “Community Edition” (https://sensiml.com/plans/community-edition/) • SensiML Data Capture Lab for Windows 10 (https://sensiml.com/download/) • SensiML Open-Gateway (https://github.com/sensiml/open-gateway) • Note: On Windows, use Python 3.7 or 3.8. • Data Asset (SAMD21 + TDK ICM42688 IMU) (MCHP_HO)
  • 31. 30 SensiML Licensing Options 1. ROI benefit can be calculated at https://sensiml.com/plans/# 2. Evaluation code is limited to 1000 inference results per embedded device power cycle 3. Risk-free Source Code Option: purchase source code only after the model is validated. 4. Additional users can be enabled at $99/user-month 5. MVGO = Motion, Vibration, Gesture, and Other sub-10kHz sensors 6. Audio = 10kHz-20kHz sensors; Ultra-High Rate (UHR) = >20 kHz sensors 7. Premier support is offered in multiple tiers for direct toolkit support and consultation Plan Editions: Compare Options at SensiML | Plans
  • 32. 31 Create Your Free Account of SensiML Analytics Toolkit ▪ Go to https://sensiml.com/plans/community-edition/ ▪ Fill in your information and check the “Terms & Conditions” box (yes, after reading the Terms & Conditions) ▪ Then click “Create My Account” ▪ You will receive a confirmation email from SensiML.
  • 33. 32 SensiML Data Capture Lab for Windows 10 ▪ Download site: • https://sensiml.com/download/ ▪ SensiML Data Capture Lab can be downloaded from • https://sensiml.cloud/downloads/SensiML_DataCaptureLab_Setup.exe ▪ Install and execute the Data Capture Lab, then log in with your account.
  • 34. 33 SensiML Open-Gateway Cloud-Based Tool via Web Browser – Forwards Sensor Data Capture or Inference Recognition Results via UART to SensiML Data Capture Lab ▪ You need to install Python 3.7 or 3.8. • https://www.python.org/downloads/windows/ ▪ SensiML Open-Gateway installation instructions are described on SensiML's GitHub • https://github.com/sensiml/open-gateway ▪ A Windows application installer is also available – SensiML_OpenGateway_Setup.exe • https://github.com/sensiml/open-gateway/releases/tag/v2022.3.3.0 ▪ Once you confirm that SensiML Open-Gateway runs in your environment, the preparation is finished. Recommended
  • 35. 34 Exercise: Introducing SensiML Toolkit Endpoint AI Workflow 1. Write FW for data collection in SAMD21 by MPLAB X and connect it to DCL. 2. Import the example data set into the DCL project. 3. Additional data collection and labeling. 4. Generate the algorithm by Analytics Studio. 5. Testing a Model Using the Data Capture Lab 6. Compile the downloaded library of Knowledge Pack with MPLAB X and write it to SAMD21 7. Verify the actual operation of gesture recognition using Open-Gateway
  • 37. 36 SAMD21 Setup – 1 (Option #1) 1. Download ZIP file from https://github.com/MicrochipTech/ml-samd21-iot- sensiml-gestures-demo/releases/download/v0.2/ml-samd21-iot-sensiml- gestures-demo.zip 2. Plug your SAMD21 evaluation kit into your PC via USB. The SAMD21 should automatically come up as a USB Flash drive. 3. Open the ml-samd21-iot-sensiml-gestures-demo.zip archive downloaded previously and locate the gesture classifier demo HEX file corresponding to your sensor make: 1. Bosch IMU: binaries/samd21-iot-sensiml-gestures-demo_bmi160.hex 2. TDK IMU: binaries/samd21-iot-sensiml-gestures-demo_icm42688.hex 4. Drag and drop the HEX file onto the SAMD21 USB drive to program the device.
  • 38. 37 SAMD21 Setup – 1 (Option 2) 1. Download ZIP file from https://github.com/MicrochipTech/ml-samd21-iot-imu-data-logger 2. Plug your SAMD21 evaluation kit into your PC via USB. 3. Extract the ZIP file and open the firmware/samd21_iot_imu.X project folder in MPLAB X. 4. Select the SAMD21_IOT_WG_ICM42688 Project Configuration in MPLAB X below. 5. Select SensiML Simple Stream by setting the DATA_STREAMER_FORMAT macro in firmware/src/app_config.h to #define DATA_STREAMER_FORMAT DATA_STREAMER_FORMAT_SMLSS (line 63). 6. Once you're satisfied with your configuration, click the Make and Program Device button in the toolbar (see image below for reference). (Note: the first time, you will be asked to select a debugger.)
  • 39. 38 SAMD21 Setup – 2 7. Use terminal software such as Tera Term or PuTTY to confirm whether the firmware was programmed successfully. • Baud Rate 115200 • Data bits 8 • Stop bits 1 • Parity None
  • 40. 39 Data Capture Lab (DCL) Setup – 1 1. Open the DCL and create a new project from the “New project” button. 2. Specify “Location” and “Name”. 3. Switch to “Capture” mode by clicking the “Switch Modes” button.
  • 41. 40 Data Capture Lab (DCL) Setup – 2 4. In capture mode, click “Connect” under “Sensor” at the bottom; the “Sensor Configuration” dialog pops up. -> Click “Next” 5. In the “Select a Device Plugin” window, select the SAMD21 ML Eval Kit item. -> Click “Next” 6. After selecting the device plugin, “Plugin Details” will appear; skip this by clicking “Next” to move forward to “Sensor Properties”. On the properties page, select “Motion (ICM-42688-P)”, then click “Next”.
  • 42. 41 Data Capture Lab (DCL) Setup – 3 7. Give a name to the sensor configuration in “Sensor Configuration”. Click “Save” to save the sensor configuration to your project. • If you plan on trying different configurations for your application, it's a good idea to include a summary of the sensor configuration in the name, for example, bmi160-100hz-16g-2000dps. 8. In “SAMD21 ML Eval Kit”, click the three-dot menu, click “Connection Settings”, click “Scan”, select the SAMD21's “COM” port, click “Done” and click “Connect”. 9. The SAMD21 ML Eval Kit should now be streaming to the DCL. (See next page) • Check out SensiML's documentation to learn more about how to use the DCL for capturing and annotating data.
  • 43. 42 Data Capture Lab (DCL) Setup – 4 (Screenshot: the SAMD21 ML Eval Kit streaming live sensor data to the DCL.)
  • 44. 43 Exercise: Introducing SensiML Toolkit Endpoint AI Workflow 1. Write FW for data collection in SAMD21 by MPLAB X and connect it to DCL. 2. Import the example data set into the DCL project. 3. Additional data collection and labeling. 4. Generate the algorithm by Analytics Studio. 5. Testing a Model Using the Data Capture Lab 6. Compile the downloaded library of Knowledge Pack with MPLAB X and write it to SAMD21 7. Verify the actual operation of gesture recognition using Open-Gateway
  • 45. 44 Import the example data set into the DCL project – 1 1. Extract the MCHP_HO_SAMD21_TDK.zip*1 archive containing the gestures data set into a working directory. MCHP_HO_SAMD21_TDK contains a captured and labeled dataset in CSV format. 2. With your created project opened, navigate to the File menu and click the “Import from DCLI…” item as shown below 3. Import “MCHP_HO_SAMD21_TDK.dcli”: select the dcli file and click “Open”. 4. Click “Next” on both the Import and Sensor Columns screens *1: Download ZIP file from the embedded link.
  • 46. 45 Import the example data set into the DCL project – 2 5. Click “Done” on Rename Sensor Columns and “Next” on Import Settings 6. Select “SAMD21_TDK” and click “Select”. 7. When the file import is completed, a green “Import complete” indicator appears at the upper-right. 8. Click “Project Explorer” and you can see the imported files below. 9. The dataset is uploaded to your SensiML cloud server automatically.
  • 47. 46 Import the example data set into the DCL project – 3 9. The imported data sets were taken from the SAMD21 with the TDK IMU. • On_Desk: SAMD21 kit placed on the desk. • Idle_Position: held vertically by hand, not moving. • Back_and_Forth: held vertically by hand and moved back and forth. • Left_and_Right: held vertically by hand and moved left and right. • Up_and_Down: held vertically by hand and moved up and down. • Circle: held vertically by hand and moved in a circle, clockwise. 10. Double-click to open an imported file; the captured waveform, label and metadata information can then be confirmed and modified. -> See next page.
  • 48. 47 Import the example data set into the DCL project – 4 (Screenshot: an imported capture with its waveform, label and metadata.)
  • 49. 48 Exercise: Introducing SensiML Toolkit Endpoint AI Workflow 1. Write FW for data collection in SAMD21 by MPLAB X and connect it to DCL. 2. Import the example data set into the DCL project. 3. Additional data collection and labeling. [Optional - NOT Required for Steps 4-7] 4. Generate the algorithm by Analytics Studio. 5. Testing a Model Using the Data Capture Lab 6. Compile the downloaded library of Knowledge Pack with MPLAB X and write it to SAMD21 7. Verify the actual operation of gesture recognition using Open-Gateway
  • 50. 49 1. Click “Switch Mode”, click “Capture”, and click “Connect” to reconnect the DCL and the SAMD21 2. Modify your “File Metadata” from “+ Add Metadata” 3. Double-click “Date” in “Project Properties”, click the icon, input your new value (220311), click “Done”. 4. You can see the newly added value in the dropdown menu of “Date” in “File Metadata”. Additional data collection and labeling – 1
  • 51. 50 5. Click the icon below and set up your file name in “File Name Template”, then click “Save”. For example, <Label> <Date> <Subject> are checked, and you can select a separator from “Space”, “Underscore”, and “Dash” 6. Click “Capture Settings”, check and specify “Max Record Time”, and “Save”. For example, Max Record Time is 15 seconds below. 7. Click “Start Recording”, record your gesture, then confirm and save in “Save Confirmation”. Additional data collection and labeling – 2
  • 52. 51 8. Open the newly recorded file “Circle_2022-02-07_ichiro” from “Project Explorer” Additional data collection and labeling – 3
  • 53. 52 9. Open the newly recorded file “Circle_2022-02-07_ichiro” from “Project Explorer” 10. Attach a “Label” in the “Segments” tab of “Data Capture Properties”. a. Right-click on the waveform window; “blue (start) and red (end)” lines are indicated. b. Adjust the position of the “blue and red” lines by left-clicking and dragging to specify the “Start and Length”. c. Right-click and select “Edit” in the “Segments” tab of “File Properties”. Additional data collection and labeling – 4
  • 54. 53 d. Select a “Label” from “Select Labels” -> Click “Done” and confirm the updated “Label”. e. Click “Save Changes” at the upper-left of the DCL. f. The saved dataset is updated automatically and synchronized with the SensiML cloud server. g. Repeat from “Begin Recording”, updating the “Label”, until you have collected the expected amount of data. 11. For more details about the DCL, see “Capturing Sensor Data”, “Labeling Your Data” and “Other Useful Features” • https://sensiml.com/documentation/guides/getting-started/capturing-sensor-data.html • https://sensiml.com/documentation/guides/getting-started/labeling-your-data.html • https://sensiml.com/documentation/guides/getting-started/other-useful-features.html Additional data collection and labeling – 5
  • 55. 54 Exercise: Introducing SensiML Toolkit Endpoint AI Workflow 1. Write FW for data collection in SAMD21 by MPLAB X and connect it to DCL. 2. Import the example data set into the DCL project. 3. Additional data collection and labeling. 4. Generate the algorithm by Analytics Studio. 5. Testing a Model Using the Data Capture Lab 6. Compile the downloaded library of Knowledge Pack with MPLAB X and write it to SAMD21 7. Verify the actual operation of gesture recognition using Open-Gateway
  • 56. 55 Generate the algorithm by Analytics Studio – Log-in 1. Log-in to your SensiML account via web browser (https://app.sensiml.cloud/auth/login/)
  • 57. 56 Generate the algorithm by Analytics Studio – Home -> Open Project 2. Open your project from “Open Project” by clicking the icon at the left of the project name.
  • 58. 57 Generate the algorithm by Analytics Studio – Project Summary 3. The project summary page gives you an overview of your project. Each tab also provides more information about PROJECT DESCRIPTION, CAPTURES, QUERIES, PIPELINES, and KNOWLEDGE PACKS of your project. 4. Open “Prepare Data” screen to create a query.
  • 59. 58 Generate the algorithm by Analytics Studio – Prepare Data 5. Querying Data The query is used to select your sensor data from your project. If you need to filter out certain parts of your sensor data based on metadata or labels, you can specify that here. a) Query: MCHP_HO (specify your unique name) b) Session: Session_1 (the session name was specified in DCL) c) Label: Label (you could select other labels if you made them in DCL on the same data) d) Metadata: segment_uuid, For (differentiates the subset of captures that you want to work with for modeling) e) Source: GyroscopeX, GyroscopeY, GyroscopeZ, AccelerometerX, AccelerometerY, AccelerometerZ (you can select which sensor data to use for modeling) f) Query Filter: [For] in [train] (File Metadata “train” will be used for modeling) g) Plot: Segment (Segment or Samples can be indicated) 6. Click “SAVE” 7. Click “Build Model” from the left menu
  • 60. 59 Generate the algorithm by Analytics Studio – Build Model 6. Building Model a) Click “BUILD MODEL”, add “Pipeline name” in “Create New Pipeline”, click “BUILD”. b) Confirm “Query *” in “Input Query” and click “SAVE”, set “200” in “Window Size” and click “SAVE”. c) Once you've entered the pipeline settings, click the “OPTIMIZE” button. This step will use AutoML techniques to automatically select the best features and machine learning algorithm for the gesture classification task given your input data. This process will usually take several minutes. d) Once the Build Model optimization step is completed, confirm “AutoML Results” and navigate to “Explore Model”. -> Next Page
  • 61. 60 Generate the algorithm by Analytics Studio – Explore Model 7. In “Explore Model”, you can get more information about the models that were generated from “Build Model”. https://sensiml.com/documentation/guides/getting-started/exploring-model-details.html 8. Once “Explore Model” is checked, navigate to “Test Model”.
  • 62. 61 Generate the algorithm by Analytics Studio – Test Model 9. Select the pipeline that we created in the previous step, then select one of the models generated in the previous step. 10. This exercise uses “MCHP_HO_rank_4”. 11. Select the upside-down triangle icon in the “For” column and select “Test” to filter the data so only the test samples are selected. 12. Select the check box to select the test samples you want to test and click “COMPUTE SUMMARY”. 13. Click “RESULTS” to confirm the details of the simulation result. -> Next page
  • 63. 62 Generate the algorithm by Analytics Studio – Test Model 14. Once completed, you will be presented with a table like the one shown below summarizing the classification results.
  • 64. 63 Generate the algorithm by Analytics Studio – Download Model 1 15. Finally, navigate to the “Download Model” tab to download your model. 16. Select “Microchip SAMD21 ML Eval Kit” in “HW Platform” and “Library” in “Format”. Click “DOWNLOAD”. 17. The right side of this window shows Knowledge Pack information, including “Device Profile Information” with “Estimated Memory Size” and “Estimated Latency”.
  • 65. 64 Generate the algorithm by Analytics Studio – Download Model 2 18. Once “Downloading Knowledge Pack, please wait ...” completes, click “OK” to save the file.
  • 66. 65 Generate the algorithm by Analytics Studio – Memory Usage 19. Device Profile Information • Estimated Memory Usage • SRAM Used: the number of bytes of RAM taken up by the model on a device • Stack Size: the estimated worst-case stack usage in bytes of the entire model on a device • Flash Used: the number of bytes of flash taken up by the model on a device • Estimated Latency • Feature Extraction Latency: the estimated number of clock cycles and time the model will spend in feature generation • Classifier Latency: the estimated number of clock cycles and time the model will spend in the classifier algorithm • Total Latency: the estimated total number of clock cycles and time the model will take to operate on a segment of data
  • 67. 66 Exercise: Introducing SensiML Toolkit Endpoint AI Workflow 1. Write FW for data collection in SAMD21 by MPLAB X and connect it to DCL. 2. Import the example data set into the DCL project. 3. Additional data collection and labeling. 4. Generate the algorithm by Analytics Studio. 5. Testing a Model Using the Data Capture Lab 6. Compile the downloaded library of Knowledge Pack with MPLAB X and write it to SAMD21 7. Verify the actual operation of gesture recognition using Open-Gateway
  • 68. 67 Testing a Model Using the Data Capture Lab – 1 1. The Data Capture Lab has two ways to test a model on your dataset: Running a Model During Data Collection: connect to a model during data collection and get the model results in real time Running a Model in the Project Explorer: run a model on any previously collected CSV or WAV files in your project 2. This lets you see how your model will perform on real data before flashing it to a device. 3. Running a Model During Data Collection -> Next Page a) Switch to Capture mode b) Connect to your device c) Open the Test Model panel and click Connect d) Select a Knowledge Pack e) Select a Session. This is where the Knowledge Pack results will be saved f) Connect to the Knowledge Pack g) You will now see your model results in real time, overlaid on the captured waveform 4. (Optional) You can click Start Recording and the Data Capture Lab will save the Knowledge Pack results to your project. This lets you quickly add additional training data to your project. 5. (Optional) In the Save Confirmation screen you can edit or delete the Knowledge Pack results before saving the results to your project.
  • 69. 68 Testing a Model Using the Data Capture Lab – 2
  • 70. 69 Testing a Model Using the Data Capture Lab – 3
  • 71. 70 Exercise: Introducing SensiML Toolkit Endpoint AI Workflow 1. Write FW for data collection in SAMD21 by MPLAB X and connect it to DCL. 2. Import the example data set into the DCL project. 3. Additional data collection and labeling. 4. Generate the algorithm by Analytics Studio. 5. Testing a Model Using the Data Capture Lab 6. Compile the downloaded library of Knowledge Pack with MPLAB X and write it to SAMD21 7. Verify the actual operation of gesture recognition using Open-Gateway
  • 72. 71 Knowledge Pack Integration – 1 1. Having deployed your knowledge pack in the Library format, the archive should contain a complete, ready-to-compile MPLAB X project. Follow the steps below to compile your project: a) Unzip the downloaded zip file to your working folder. b) In MPLAB X, open the “samd21-iot-sensiml-template.X” project folder under the firmware folder of the knowledge pack. c) Select the Project Configuration option in the MPLAB X toolbar according to which sensor you're using. -> ICM42688 d) Select SensiML Simple Stream by setting the DATA_STREAMER_FORMAT macro in Header Files/knowledgepack/knowledgepack_project/app_config.h to #define DATA_STREAMER_FORMAT DATA_STREAMER_FORMAT_SMLSS (line 63). e) Your project should now be ready to “Make and Program Device (samd21-iot-sensiml-template)”. f) Connect your SAMD21 Eval Kit via USB cable and click “Make and Program Device”
  • 73. 72 Knowledge Pack Integration – 2 2. Firmware Operation The firmware behavior can be summarized as operating in one of three distinct states, as reflected by the onboard LEDs and described below: • Error – Red (ERROR) LED lit: fatal error. (Do you have the correct sensor plugged in?) • Buffer Overflow – Yellow (DATA) and Red (ERROR) LEDs lit for 5 seconds: processing is not able to keep up with real time; the data buffer has been reset. • Running – Yellow (DATA) LED flashing slowly: firmware is running normally. 3. When operating normally, the firmware prints the classification prediction (classification ID number) and the generated feature vector for each sample window over the UART port. To read the UART port, use a terminal emulator of your choice (e.g., MPLAB Data Visualizer's integrated terminal tool) with the following settings: • Baud Rate 115200 • Data bits 8 • Stop bits 1 • Parity None Classification IDs: 1. Back_and_Forth 2. Circle 3. Idle_Position 4. Left_and_Right 5. On_Desk 6. Up_and_Down