The problems of the slow-start algorithm
We have already mentioned some of the problems that can arise from the use of slow-start, an algorithm invented in the early days of the Internet to avoid network congestion. The core assumption of slow-start, when it determines how much data can safely be in flight, is that unacknowledged segments were lost because of congestion on the network.
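To make the mechanism concrete, here is a minimal sketch of how the congestion window grows during slow-start: it doubles every round trip until it reaches a threshold. The initial window, threshold, and round count below are illustrative values, not taken from any particular TCP implementation.

```python
def slow_start(cwnd=1, ssthresh=16, rounds=6):
    """Return the congestion window (in segments) at each round trip.

    Hypothetical parameters: start at 1 segment, double each RTT,
    and stop growing exponentially once ssthresh is reached.
    """
    history = []
    for _ in range(rounds):
        history.append(cwnd)
        cwnd = min(cwnd * 2, ssthresh)
    return history

print(slow_start())  # → [1, 2, 4, 8, 16, 16]
```

Note that the sender can transfer only one segment in the first round trip, two in the second, and so on; the cost of those early, tiny windows is what the rest of this article is about.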
There are situations, however, in which this assumption does not hold. One example is a network with poor reception, such as a wireless network, where segments are often lost to interference rather than congestion. In other cases, even when the connection is reliable, the slow-start algorithm itself causes bad performance.
While modern web browsers reuse one connection to download all files from a web server, connections cannot always be reused. For example, browsers must create many consecutive short-lived connections to the servers of web advertising, social networking services or web analytics scripts embedded in a page.
You can imagine how much longer it takes to download a single webpage that contains all these elements (and many pages today do) with slow-start than without it. This is why, as we mentioned the other day, some major websites cheat the algorithm, and some of them (such as the servers of Microsoft) skip it altogether.
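A rough back-of-the-envelope sketch shows the cost for short-lived connections. Assuming an initial window of one segment, doubling per round trip, and no losses (all simplifying assumptions, not a model of any real TCP stack), a small transfer that would fit in a single round trip with a large window needs several round trips under slow-start:

```python
def round_trips(segments, cwnd=1):
    """Round trips needed to deliver `segments` when the window
    starts at `cwnd` and doubles every RTT (no losses assumed)."""
    trips, sent = 0, 0
    while sent < segments:
        sent += cwnd
        cwnd *= 2
        trips += 1
    return trips

# A hypothetical 30-segment page resource:
print(round_trips(30))            # → 5 round trips under slow-start
print(round_trips(30, cwnd=30))   # → 1 round trip with a large initial window
```

Multiply that difference by a dozen short-lived connections per page, and the penalty of restarting slow-start from scratch each time becomes obvious.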