Experiment examples¶
Here we present several examples of using the netunicorn platform for network-oriented data collection, including experiment descriptions and links to the code.
Deployment of netunicorn¶
For all these experiments, we assume that you have access to a netunicorn deployment operating within some infrastructure. If so, skip this section and proceed to the next one.
If this is not the case and you want to explore these experiments on your own, we suggest deploying your own test copy of netunicorn from the Docker Compose file. For further details, see the Deployment guide, section “Simplified deployment”.
Preparation¶
You should have the following information to interact with the netunicorn platform:

- NETUNICORN_ENDPOINT: URL of the netunicorn API endpoint. Provided by the installation administrator. If you use a local installation deployed from the `docker-compose.yml` file, it is the endpoint of the `mediator` service in `docker-compose.yml` (by default: `http://localhost:26611`).
- NETUNICORN_LOGIN: login of your user account. Provided by the installation administrator. If you use a local installation deployed from the `docker-compose.yml` file, it is `test` (you can change it in the `development/users.sql` file before running `docker compose`).
- NETUNICORN_PASSWORD: password of your user account. Provided by the installation administrator. If you use a local installation deployed from the `docker-compose.yml` file, it is `test` (provided as a `bcrypt2`-hashed value in the `development/users.sql` file).
We recommend storing these values in environment variables for convenience. For example, for a local installation, you can run the following commands in the same terminal session where you will use the netunicorn client or run jupyter-notebook:
export NETUNICORN_ENDPOINT=http://localhost:26611
export NETUNICORN_LOGIN=test
export NETUNICORN_PASSWORD=test
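In a Python script or notebook, these variables can then be read back with the standard library; a minimal sketch, where the fallback defaults mirror the local-installation values above:

```python
import os

def read_netunicorn_settings(env=os.environ):
    """Read netunicorn connection settings from environment variables,
    falling back to the local-installation defaults described above."""
    return {
        "endpoint": env.get("NETUNICORN_ENDPOINT", "http://localhost:26611"),
        "login": env.get("NETUNICORN_LOGIN", "test"),
        "password": env.get("NETUNICORN_PASSWORD", "test"),
    }

settings = read_netunicorn_settings()
```

These values are then passed to the netunicorn client when connecting to the platform.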
All experiments are implemented as Jupyter notebooks. You can find them in the examples folder of the netunicorn repository.
Basic Sleep Experiment¶
Goal¶
Verify the basic functionality of the netunicorn platform and check the correctness of the data collection process.
Experiment Design¶
This experiment verifies that the netunicorn installation is working correctly and is accessible to the user. Specifically, it:

- Verifies the installation of the needed Python packages, the connection to the netunicorn API endpoint, and user authentication.
- Describes how to create a simple pipeline consisting of several `sleep` tasks.
- Describes how to use the `nodes` objects to get information about the nodes in the infrastructure.
- Leads through the process of running the experiment and obtaining the results.
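The chained structure of such a pipeline can be illustrated with a minimal pure-Python sketch (the class and function names here are illustrative only, not the netunicorn API):

```python
import time

class Pipeline:
    """A toy pipeline that runs tasks in order, mimicking the chained
    structure of the notebook's sleep pipeline (illustrative only)."""
    def __init__(self):
        self.tasks = []

    def then(self, task):
        self.tasks.append(task)
        return self  # allow chaining: pipeline.then(a).then(b)

    def run(self):
        return [task() for task in self.tasks]

def sleep_task(seconds):
    """Build a task that sleeps and reports how long it slept."""
    def task():
        time.sleep(seconds)
        return f"slept {seconds}s"
    return task

pipeline = Pipeline().then(sleep_task(0.1)).then(sleep_task(0.2))
results = pipeline.run()  # one result per task, in order
```

In the real experiment, netunicorn serializes such a pipeline, deploys it to the selected nodes, and collects the per-task results for you.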
Result¶
As a result, you should learn the basics of the netunicorn platform and run your first experiment.
Jupyter Notebook¶
https://github.com/netunicorn/netunicorn/blob/main/examples/basic_example.ipynb
Speed Test Experiment¶
Goal¶
Collect network performance metrics from multiple nodes in the infrastructure using a speed-test utility, and store the resulting PCAP files for future analysis.
Experiment Flow¶
This experiment shows an example of real-world data collection from a complex infrastructure. In particular, it:

- Starts network traffic capture on the experiment nodes.
- Executes the Ookla Speed Test on these nodes.
- Saves the resulting traffic and uploads it to cloud storage.
- Demonstrates the results of the execution and how to parse and read them.
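The per-node steps (start a capture, run the test, stop the capture) follow a common pattern that can be sketched as below; in the real experiment the background command would be a `tcpdump` invocation writing a PCAP file and the foreground command the Ookla speed-test CLI (both are assumptions here, not the notebook's exact tasks):

```python
import subprocess

def run_with_capture(capture_cmd, test_cmd):
    """Run `test_cmd` while `capture_cmd` records in the background.
    In the real experiment, capture_cmd would be something like
    ["tcpdump", "-i", "any", "-w", "out.pcap"] and test_cmd the
    speed-test CLI (illustrative assumptions)."""
    capture = subprocess.Popen(capture_cmd)
    try:
        result = subprocess.run(
            test_cmd, capture_output=True, text=True, check=True
        )
    finally:
        # stop the background capture even if the test fails
        capture.terminate()
        capture.wait()
    return result.stdout

# example with harmless placeholder commands:
output = run_with_capture(["sleep", "30"], ["echo", "speedtest-result"])
```

The `finally` block ensures the capture process is always stopped, so the recorded file is flushed to disk regardless of whether the measurement succeeds.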
Result¶
As a result, you should have the results of the speed-test measurements and the corresponding PCAP files for future analysis using your favorite tools and methods.
Jupyter Notebook¶
https://github.com/netunicorn/netunicorn/blob/main/examples/speed_test_example.ipynb
Video Data Collection Experiment¶
Goal¶
In this experiment, we will watch videos from various platforms and record the corresponding network traffic for future analysis.
Experiment Flow¶
This example shows how nodes can interact with various video streaming platforms, in particular YouTube, Vimeo, and Twitch. We will watch videos (or streams) from them and record the network traffic. Specifically, this experiment:

- Creates a pipeline that watches YouTube, Vimeo, and Twitch videos while recording network traffic.
- Demonstrates the `environment_definition` object: how environment preparation commands are stored inside it, and how to use your own Docker image for the experiment.
- Executes the pipeline on nodes from an infrastructure and uploads the resulting data to a cloud storage platform.
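Conceptually, an environment definition bundles the base image and the preparation commands that a node executes before the pipeline runs. The toy class below illustrates the idea only; it is not netunicorn's `environment_definition` implementation, and the image name and command are made up:

```python
from dataclasses import dataclass, field

@dataclass
class ToyEnvironmentDefinition:
    """Illustrative stand-in for an environment definition:
    a base Docker image plus preparation commands."""
    image: str = "ubuntu:22.04"              # hypothetical default image
    commands: list = field(default_factory=list)  # preparation commands

env = ToyEnvironmentDefinition(image="my-registry/video-watcher:latest")
env.commands.append("apt-get update && apt-get install -y ffmpeg")
```

With your own Docker image, the preparation commands can be baked into the image in advance, which shortens the deployment stage of the experiment.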
Result¶
As a result, you should have network traffic recordings of YouTube, Vimeo, and Twitch video streaming that you can later analyse and explore.
Jupyter Notebook¶
https://github.com/netunicorn/netunicorn/blob/main/examples/video_watchers_example.ipynb