Overview

Thymio2

class thymio.DistributedThymio2(name, controller, index, **kwargs)

Superclass: pyenki.Thymio2. The world update step automatically calls the Thymio controlStep.

Note

Thymio2 can measure distances in two ways using the available sensors:

  • prox_values, an array of 7 values holding the readings of the 7 (horizontal) distance sensors around its periphery: fll (front left left), fl (front left), fc (front center), fr (front right), frr (front right right), bl (back left), br (back right). The values range from 0, when the robot does not see anything, to several thousand, when the robot is very close to an obstacle. Thymio updates this array at a frequency of 10 Hz and generates the prox event after every update. The maximum range, in this case, is 14 cm.

  • prox_comm_{enable, events, tx} use the horizontal infrared distance sensors to communicate a value to peer robots within a range of 48 cm. Thymio sends an 11-bit value. To use the communication, call the prox_comm_enable(state) function, with state set to 1 to enable communication or 0 to turn it off. If communication is enabled, the value in the prox_comm_tx variable is transmitted every 100 ms. When Thymio receives a value, the prox_comm_events event fires and the received value is stored in the prox_comm.rx variable. A minimal sketch of both mechanisms follows this note.
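A minimal sensing sketch, assuming a pyenki.Thymio2 subclass and the member names used above (prox_values, prox_comm_enable, prox_comm_tx); the exact form of these members may differ in the installed pyenki version:

    import pyenki

    class SensingThymio(pyenki.Thymio2):

        def controlStep(self, dt: float) -> None:
            # prox_values order: fll, fl, fc, fr, frr, bl, br.
            # 0 = nothing detected, several thousand = obstacle very close (max range 14 cm).
            front = self.prox_values[:5]
            obstacle_ahead = max(front) > 2000  # 2000 is an arbitrary example threshold

            # Enable infrared communication and broadcast an 11-bit value;
            # while enabled, prox_comm_tx is transmitted every 100 ms.
            self.prox_comm_enable(1)
            self.prox_comm_tx = 42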

Parameters
  • name – name of the agent

  • controller – controller to use, one of OmniscientController, ManualController or LearnedController

  • index – index of the agent in the row

  • kwargs – other arguments

Variables
  • initial_position – the initial position of the agent is set to None

  • goal_position – the goal position of the agent is set to None

  • goal_angle – the goal angle of the agent is set to None

  • dictionary – the dictionary containing all the agent attributes is set to None

  • colour – the colour of the agent is set to None

  • goal_colour – the goal colour of the agent is set to ‘red’ or 0 if the agent is in the second half of the row; otherwise it is set to ‘blue’ or 1

colour_thymios(dt: float)

Enable communication and, at each time step, send a message decided by the controller. Set the top LED colour based on the controller's decision.

Parameters

dt – control step duration
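A minimal sketch of this behaviour, assuming a hypothetical controller method perform_control(agent, dt) that returns the message to transmit and the colour to display, and the pyenki set_led_top call (exact names may differ):

    def colour_thymios(self, dt: float) -> None:
        # Enable infrared communication so the transmitted value is broadcast every 100 ms.
        self.prox_comm_enable(1)

        # perform_control is a hypothetical controller API returning (message, colour).
        message, colour = self.controller.perform_control(self, dt)
        self.prox_comm_tx = message

        # Show the decision on the top LED (0 = 'red', 1 = 'blue', as in goal_colour).
        if colour == 0:
            self.set_led_top(red=1.0, green=0.0, blue=0.0)
        else:
            self.set_led_top(red=0.0, green=0.0, blue=1.0)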

controlStep(dt: float) → None
Perform one control step:

If the task is distribute, move the robots so that they stand at equal distances from each other. If the task is colour, colour the first half of the robots differently from the second half.

Parameters

dt – control step duration
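A sketch of the dispatch described above, assuming the current task is stored in a hypothetical self.task attribute:

    def controlStep(self, dt: float) -> None:
        # 'task' is a hypothetical attribute selecting the behaviour for this run.
        if self.task == 'distribute':
            self.distribute_thymios(dt)
        elif self.task == 'colour':
            self.colour_thymios(dt)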

distribute_thymios(dt: float)

Enable communication and, at each time step, send a message decided by the controller. Set the velocity of the wheels based on the controller's decision.

Parameters

dt – control step duration
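A minimal sketch of this behaviour, assuming the same hypothetical perform_control(agent, dt) controller method and the pyenki wheel-target attributes motor_left_target and motor_right_target (exact names may differ):

    def distribute_thymios(self, dt: float) -> None:
        # Enable infrared communication so the transmitted value is broadcast every 100 ms.
        self.prox_comm_enable(1)

        # perform_control is a hypothetical controller API returning (message, speed).
        message, speed = self.controller.perform_control(self, dt)
        self.prox_comm_tx = message

        # Apply the same target speed to both wheels so the robot moves towards the
        # point equidistant from its two neighbours.
        self.motor_left_target = speed
        self.motor_right_target = speed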

Tasks

Task 1

task1.parse_args()

Imitation Learning - Distributed Controller + Communication

usage: task1.py [--args]

Return args
--help

Show this help message and exit

--gui

Run simulation using the gui (default: False)

--myt-quantity N

Number of thymios for the simulation (default: 5)

--n-simulations N

Number of runs for each simulation (default: 1000)

--task TASK

Choose the task to perform in the current execution between task1 and task2 (default: task1)

--avg-gap N

Average gap distance between thymios (default: 8)

--generate-dataset

Generate the dataset containing the simulations (default: False)

--generate-split

Generate the indices for the split of the dataset (default: False)

--plots-dataset

Generate the plots regarding the dataset (default: False)

--check-dataset

Generate the plots that check the dataset conformity (default: False)

--compare-all

Generate plots that compare all the experiments in terms of distance from goal (default: False)

--controller CONTROLLER

Choose the controller for the current execution. Usually between all, learned, manual and omniscient (default: all)

--dataset-folder DATASET_FOLDER

Name of the directory containing the datasets (default: datasets)

--dataset DATASET

Choose the datasets to use in the current execution (default: all)

--models-folder MODELS_FOLDER

Name of the directory containing the models (default: models)

--model-type MODEL_TYPE

Name of the sub-directory containing the models (default: distributed)

--model MODEL

Name of the model (default: net1)

--train-net

Train the model (default: False)

--save-net

Save the model in onnx format (default: False)

--net-input SENSING

Choose the input of the net between prox_values, prox_comm or all_sensors (default: prox_values)

--plots-net

Generate the plots regarding the model (default: False)
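For example (the exact invocation, e.g. through the Python interpreter, is an assumption; the flags are those listed above), to generate a new dataset with the omniscient controller and produce the dataset plots:

    python task1.py --task task1 --myt-quantity 5 --avg-gap 8 --controller omniscient --generate-dataset --plots-dataset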

Task 1 Extension

task1_extension.parse_args()

Imitation Learning - Extension to an arbitrary number of agents

usage: task1_extensions.py [--args]

Return args
--help

Show this help message and exit

--gui

Run simulation using the gui (default: False)

--n-simulations N

Number of runs for each simulation (default: 1000)

--task TASK

Choose the task to perform in the current execution between task1 and task2 (default: task1)

--myt-quantity MYT_QUANTITY

Number of thymios for the simulation (default: variable)

--avg-gap AVG_GAP

Average gap distance between thymios (default: variable)

--generate-dataset

Generate the dataset containing the simulations (default: False)

--generate-split

Generate the indices for the split of the dataset (default: False)

--plots-dataset

Generate the plots regarding the dataset (default: False)

--check-dataset

Generate the plots that check the dataset conformity (default: False)

--compare-all

Generate plots that compare all the experiments in terms of distance from goal (default: False)

--generate-animations

Generate animations that compare the controllers (default: False)

--controller CONTROLLER

Choose the controller for the current execution. Usually between all, learned, manual and omniscient (default: all)

--dataset-folder DATASET_FOLDER

Name of the directory containing the datasets (default: datasets)

--dataset DATASET

Choose the datasets to use in the current execution (default: all)

--models-folder MODELS_FOLDER

Name of the directory containing the models (default: models)

--model-type MODEL_TYPE

Name of the sub-directory containing the models (default: distributed)

--model MODEL

Name of the model (default: net1)

--train-net

Train the model (default: False)

--save-net

Save the model in onnx format (default: False)

--net-input SENSING

Choose the input of the net between prox_values, prox_comm or all_sensors (default: all_sensors)

--plots-net

Generate the plots regarding the model (default: False)
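For example (invocation form assumed; flags as listed above), to train the network on the generated datasets using all the available sensing as input and export it in onnx format:

    python task1_extensions.py --model net1 --net-input all_sensors --train-net --save-net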

Task 2

task2.parse_args()

Imitation Learning - Task 2 with Communication

usage: task2.py [--args]

Return args
--help

Show this help message and exit

--gui

Run simulation using the gui (default: False)

--n-simulations N

Number of runs for each simulation (default: 1000)

--task TASK

Choose the task to perform in the current execution between task1 and task2 (default: task2)

--myt-quantity MYT_QUANTITY

Number of thymios for the simulation (default: variable)

--avg-gap AVG_GAP

Average gap distance between thymios (default: variable)

--generate-dataset

Generate the dataset containing the simulations (default: False)

--generate-split

Generate the indices for the split of the dataset (default: False)

--plots-dataset

Generate the plots regarding the dataset (default: False)

--check-dataset

Generate the plots that check the dataset conformity (default: False)

--compare-all

Generate plots that compare all the experiments in terms of distance from goal (default: False)

--generate-animations

Generate animations that compare the controllers (default: False)

--controller CONTROLLER

Choose the controller for the current execution. Usually between all, learned, manual and omniscient (default: all)

--dataset-folder DATASET_FOLDER

Name of the directory containing the datasets (default: datasets)

--dataset DATASET

Choose the datasets to use in the current execution (default: all)

--models-folder MODELS_FOLDER

Name of the directory containing the models (default: models)

--model-type MODEL_TYPE

Name of the sub-directory containing the models (default: distributed)

--model MODEL

Name of the model (default: net1)

--train-net

Train the model (default: False)

--save-net

Save the model in onnx format (default: False)

--net-input SENSING

Choose the input of the net between prox_values, prox_comm or all_sensors (default: all_sensors)

--plots-net

Generate the plots regarding the model (default: False)
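For example (invocation form assumed; flags as listed above), to run the learned controller in the GUI and generate the animations comparing the controllers:

    python task2.py --task task2 --gui --controller learned --generate-animations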