πŸ“· Hand recognition

In this walkthrough, you will build a hand and finger tracking model from scratch in a very short time, without gathering or processing any data!

Step 1: Create a new Project

First, on the left sidebar, click on "Project".

This will bring you to the Project screen, which has two buttons: one for creating a new Project and one for selecting an existing Project.

Click on the left button "+ Create new". A pop-up modal with the title "Create new project" will appear.

Now you need to enter all the necessary information for your Coretex Project.

In this example, we are training a neural network that detects objects, so for our Project Type we will select "Computer Vision".

Step 2: Create a new Dataset

Now let's go to the Datasets screen and import a Dataset for our new training.

On the left sidebar click on "Datasets" which will bring you to the Datasets screen.

In the Dataset Controls, click on "+ New Dataset". A pop-up modal with the title "Create a new dataset" will appear.

Now we need to enter all the necessary information for our new Coretex Dataset.

If you want to know more about Coretex Datasets, please visit the Coretex Dataset page.

In this example, we want to use a Coretex Dataset without any additional processing.

To import a Dataset select the option "Load sample dataset".

This gives us a list of curated Coretex Datasets that are ready to use.

We will use the "hand-tracking-train" Dataset. Select it from the list of sample datasets and click on the "Use sample" button.

After clicking the button, we will see the Coretex logo and the message "Loading…" in the center of the screen. This means Coretex is loading our new dataset from the existing sample.

This dataset contains 1208 samples. A larger number of samples means the dataset will take longer to load, so please be patient.

You can preview a sample by selecting it.

After around 30 minutes, our dataset is imported into Coretex.

Step 3: Create a new Workflow

After creating a Coretex Project and importing a Dataset, we are now ready to create our first Workflow.

To create our first Workflow, click on "Workflows" in the left sidebar.

In the Workflow Controls, click on "+ New Workflow". A pop-up modal with the title "Create a New Workflow" will appear.

Now you need to write all the necessary information for your Coretex Workflow.

We will opt for the "empty" type in this tutorial. Provide a name in the required field and we are ready to create the Workflow.

Clicking the "Insert Task" button will display a modal listing existing Tasks and an option to create new ones.

In this example, we will use a ready-made Coretex Task that requires no modifications.

Set "Create from" to "From template".

If you want to know more about Coretex Tasks, please visit the Coretex Task page.

This gives us a list of Coretex Tasks of the same Type.

We will select the "Object Detection (YoloV8)" Task because it is the most suitable for our problem. Select it and click on the "Confirm" button.

After a few moments, our Task is added to the Workflow.

Now we have everything we need to start training our model. Let's do it!

Step 4: Run the Workflow

Switch to the "Advanced" view and let's play with the parameters.

In order to start the Run, we first have to fill in all the required fields.

All required fields are marked with a red border.

The Run Name is not required, so we will skip this field (Coretex will generate one for us).

In the "Environment" field we will select "Existing"; this matters if you have previously run this Task on your machine. To learn more about Environment Caching, click here.

In the top right corner, click on the "Select Node" field and select the name of the machine you will be using for this Run.

Every Coretex Task has a task.yaml file, which defines the parameters that are important for the Task. You can tweak those parameters on the Coretex platform and see what gives you the best results.
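To make that concrete, the parameters a Task exposes are essentially name/value pairs that the platform form lets you edit. Here is a minimal Python sketch of the kind of parameters an object detection Task might declare; the names and values are illustrative assumptions, not the actual contents of any task.yaml:

```python
# Hypothetical sketch of the kinds of parameters an object detection
# Task might expose via its task.yaml (names are illustrative only).
parameters = {
    "dataset": "hand-tracking-train",  # which Coretex Dataset to train on
    "epochs": 100,                     # number of passes over the data
    "batchSize": 16,                   # images processed per training step
    "imageSize": 640,                  # input resolution the model expects
}

# Filling in the form on the platform amounts to editing these values.
for name, value in parameters.items():
    print(f"{name} = {value}")
```

Changing a value in the form before a Run is equivalent to overriding the default declared in the file.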

The first parameter is "Dataset". Click on "Select Dataset" and select the Dataset we just imported.

The "Epochs" parameter represents the number of epochs (full passes over the training data) that training will run for.

More epochs often bring better results, but not always: training for too long can cause the model to overfit.
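To illustrate what an epoch is, here is a self-contained Python sketch (independent of Coretex and YoloV8) that fits a single-parameter model with gradient descent; each epoch is one full pass over the training data:

```python
# Toy illustration of an "epoch": one full pass over the training data.
# We fit y = 2x with plain per-sample gradient descent.
data = [(x, 2.0 * x) for x in range(1, 6)]  # tiny training set
w = 0.0    # model parameter, starts far from the true slope 2.0
lr = 0.01  # learning rate

for epoch in range(50):                # 50 epochs
    for x, y in data:                  # one pass over every sample
        w -= lr * 2 * (w * x - y) * x  # gradient step for this sample

print(round(w, 3))  # converges to the true slope 2.0
```

Here more epochs only help, because the toy model cannot overfit; real networks with millions of parameters can start memorizing the training set instead.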

Click on the "Run Task" button. A new pop-up will appear with some information about the Run we are starting; press the "Run Task" button again to confirm.

After the loading screen disappears, we are returned to the Tasks screen, where we wait for the training to finish.

If you want to follow how your script is executing and see other details about the running Run, continue to the next step.

Step 5: Track Run progress

In this step, we demonstrate how you can track your Runs and their progress.

On the Runs screen, you can see a list of all Runs that are running or have been recently run on Coretex.

If you want to see details about a Run that is in progress, click on the eye button at the end of its row.

If you want to know more about the Run details, please visit the Run Details page.

After training is finished, you can see a live preview of your hand and finger tracking model. In the next step, you will learn how to get live predictions from your model on Coretex.

Step 6: Test your model

Let's see how you can test your model on Coretex using the Live Preview feature.

First, on the left sidebar, select "Models", which will bring you to the Models screen. There you can see a list of all recently trained models on Coretex.

If you can't find your model by Project or Task name, you can always use the filters.

Our models are at the top of the list, and we can see their accuracy in the "Accuracy" column.

Let's get to the Model Details screen first. Click on the three dots at the end of the row of the model you want to inspect and click on "View Details".

This will bring you to the Model Details screen of your desired model. Here you can see all the useful information about the model, run a Live Preview, and more.

We will click on the green "Live Test" button because we want to see how our model performs on data that wasn't in the training set.

After clicking the "Live Test" button, a new pop-up will appear and the model will start loading on Coretex. In the "Select Data Source" field, select "From Sample".

Click on the "Next" button and a list of Coretex Dataset samples will appear on your screen.

From the list of Coretex Dataset samples, select "hand-tracking-test" and click "Use Sample".

After a few seconds of loading, the Live Preview will show up and you can see how your model performs.

If you are satisfied with your model performance, you can download or deploy your model.

Other applications

1. Production line quality control (visual inspection)

Production line visual quality control is the process of detecting packaging anomalies on a production line.

By taking photos of the packaging and labelling them as "Good" or "Bad", you can train a model to recognize packaging anomalies as they appear. A package is considered bad if it has an improperly placed label, if it has the wrong label on the packaging, if the liquid content doesn't fill the container, or if it has a slightly different colour than expected.
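The labelling idea can be sketched in a few lines of Python; the filenames and labels below are invented for illustration, and checking the label balance before training is a useful habit:

```python
# Hypothetical sketch of "Good"/"Bad" labelling: each photo of a
# package gets exactly one of two labels (filenames are invented).
from collections import Counter

labelled_photos = [
    ("pkg_001.jpg", "Good"),
    ("pkg_002.jpg", "Bad"),   # label misplaced
    ("pkg_003.jpg", "Good"),
    ("pkg_004.jpg", "Bad"),   # container under-filled
]

# Count how many photos carry each label to spot class imbalance.
counts = Counter(label for _, label in labelled_photos)
print(dict(counts))
```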

Our dataset contained a total of 200 images with 124 "good" and 176 "bad" annotations.

Here's a result after 10 minutes of training on a consumer-level machine:

2. Workplace safety

Workplace safety monitoring is the process of tracking the status of active construction sites to increase worker safety. Once trained, this model can flag possible threats and events that affect the safety of workers, enabling you to prevent and minimize the chances of injuries at work.

In the process of creating this model, 1490 photos from various construction sites were included, containing forklifts (338 samples), workers (591 samples), and trucks (561 samples). The labels used for annotating are:

  • worker

  • truck

  • forklift

Here's a result after training on a consumer-level machine:
