Using ML to guess when a packet will be lost

There is some regularity in when a congestion-control mechanism such as TCP's will lose a packet. Can we use Machine Learning (ML) to predict this packet loss and proactively add redundancy?

There have been multiple proposals for using some form of Forward Error Correction (FEC) coding with protocols like TCP. However, these protocols typically do not lose enough packets for always-on FEC to be worthwhile. A better solution might be to add FEC dynamically, only when a packet loss appears imminent, judging from currently measurable conditions such as the congestion window and the round-trip time.
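As a first intuition for the dynamic approach, the decision could be framed as a binary classifier over connection state. The sketch below is purely illustrative: the features (congestion window relative to the slow-start threshold, RTT inflation over the minimum RTT) and all weights are invented assumptions, not values from any real TCP stack; in practice the weights would be learned from traces.

```python
# Hypothetical sketch: predict imminent loss from connection state and
# decide whether to add FEC. Feature choices and weights are illustrative
# assumptions only, not measurements from any real TCP implementation.
import math

def loss_probability(cwnd, ssthresh, rtt, min_rtt):
    """Toy logistic model: loss is assumed more likely when cwnd grows
    past ssthresh and when RTT inflates above its minimum (queue buildup)."""
    cwnd_ratio = cwnd / max(ssthresh, 1)      # growth past slow-start threshold
    rtt_inflation = rtt / max(min_rtt, 1e-9)  # queuing-delay indicator
    # Illustrative weights; a real model would learn these offline.
    z = 2.0 * (cwnd_ratio - 1.0) + 3.0 * (rtt_inflation - 1.0) - 1.0
    return 1.0 / (1.0 + math.exp(-z))

def should_add_fec(cwnd, ssthresh, rtt, min_rtt, threshold=0.5):
    """Add redundancy only when predicted loss probability is high."""
    return loss_probability(cwnd, ssthresh, rtt, min_rtt) > threshold
```

For example, a connection with an inflated RTT and a congestion window well past its slow-start threshold would trigger FEC, while a lightly loaded connection would not.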

Reinforcement Learning (RL) is probably a good candidate methodology for approaching this problem. The work will likely be done in simulation, but real-life tests in a testbed, or even across the Internet, can also be considered.
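To make the RL framing concrete, here is a minimal tabular Q-learning sketch in a toy simulated link. Everything here is an assumption for illustration: the discretized congestion-level state, the loss model, and the reward that trades FEC bandwidth overhead against the penalty of an unrecovered loss. A real study would use a network simulator such as ns-3 or a testbed instead of this toy environment.

```python
# Hedged sketch: tabular Q-learning for the "add FEC or not" decision in a
# toy simulated link. State, dynamics, and reward are invented assumptions.
import random

random.seed(0)

ACTIONS = [0, 1]      # 0 = send without FEC, 1 = add a redundant FEC packet
FEC_COST = 0.2        # assumed bandwidth overhead of the redundancy
LOSS_PENALTY = 1.0    # assumed cost of an unrecovered packet loss

def step(state, action):
    """Toy dynamics: the state is a congestion level 0..4, and a higher
    level means a higher loss probability; FEC masks a loss if one occurs."""
    p_loss = 0.05 + 0.2 * state
    lost = random.random() < p_loss
    reward = -FEC_COST * action - (LOSS_PENALTY if (lost and not action) else 0.0)
    next_state = min(4, max(0, state + random.choice([-1, 0, 1])))
    return next_state, reward

def train(steps=5000, alpha=0.1, gamma=0.9, eps=0.1):
    """Standard one-step Q-learning with epsilon-greedy exploration."""
    q = {(s, a): 0.0 for s in range(5) for a in ACTIONS}
    state = 0
    for _ in range(steps):
        if random.random() < eps:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        next_state, reward = step(state, action)
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = next_state
    return q

q = train()
# Greedy policy: which action each congestion level prefers after training.
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(5)}
```

Under this reward, the learned policy should skip FEC at low congestion levels (the expected loss penalty is below the FEC overhead) and add FEC at high levels, which is exactly the dynamic behaviour the project aims for.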

Published 18 Oct. 2021 12:46 - Last modified 18 Oct. 2021 12:47
