The performance, power, energy consumption and useful life of routers are all measurable factors in product testing. The data, however, depend on the conditions under which manufacturers carry out the tests. Transparency is vital!
After 10 years and more than 400,000 km, it is finally time to say goodbye to my little Citroen Xsara. Who said anything about planned obsolescence? It is with a degree of nostalgia that I say goodbye because, to be honest, the car has given me more joy than trouble.

One aspect I personally value when looking for a replacement is low fuel consumption, and at 40,000 km per year it certainly needs to be kept in mind. The first step, of course, is to get hold of the official manufacturer data. In doing so, one appreciates the advances in data publishing of recent years: we quickly find consumption figures of less than 5 liters per 100 km in powerful mid-size cars, and even 4 liters per 100 km or less if we are willing to make do with engines of barely 100 horsepower. And that is without even considering electric or hybrid vehicles; technology is most welcome!

The second step is the Internet. Specialized websites with comparisons, trends, experiences, tests, user forums… Sometimes too much information can lead to misinformation. But there is a certain unanimity when it comes to consumption: the official manufacturer figures are usually lower (sometimes significantly) than what users report. We haven't discovered penicillin here; I think this is common knowledge. So… are manufacturers deceiving us? Here I believe that my impression as a user is shared by most: manufacturers are probably not deceiving us, they are only varying "the conditions".
The same can be extrapolated to numerous cases where products are rated by quantitative data (a household appliance's consumption, a mobile phone's battery life, a device's expected life span…). It may be self-evident, but the conditions under which the data are obtained are almost as important as the data themselves, and this is not always given the attention it deserves.
The function of product testing
In telecommunications, two factors determine the value of an access router, especially with regard to remote access to a corporate network. One factor is qualitative: the available interfaces and functionality. The other is quantitative: the speed at which the device is capable of exploiting them. The first has often been a determining factor of the second; consider, for example, the transition from networks based on serial lines (X25, FR, PPP…) to ISDN, subsequently to ADSL and VDSL, and finally to Ethernet and fiber connections (Gigabit).
In virtually all cases, the change of access method determined the available speed and thus the power required of the device. However, typical connections in central offices (Ethernet and fiber) currently exploit only a small fraction of their capacity (usually 100 Mbps, or around 10%), leaving a long way to go. The second factor (the power of the device, be it speed, performance, capacity, throughput…) therefore takes on a key role in determining whether a product can be expected to have a reasonable useful life. At the same time, it is clear that an appropriate level of power ensures adequate performance throughout that lifetime; both considerations impact business development and the income statement.
Unfortunately, Internet resources collecting user information on professional routers are much more limited than those for vehicles, mobile phones, and household appliances. Often, the only option is to rely on the data published by manufacturers. And this is what I was leading up to because, as a product manager, I am very familiar with the problem…
How do you measure a router’s performance?
The conditions under which an access router's maximum performance is determined are absolutely crucial, and they have a far greater impact on the outcome than in the case of the fuel consumption I spoke about at the beginning of this article. Hence, beyond stating that a router supports XXX Mbps, it is important to specify, among other things, whether the figure is unidirectional or bidirectional (whether XXX Mbps are supported in only one direction or in both), since this alone can double or halve the published value.
Another important factor is the packet size used in the test to obtain those XXX Mbps. This is because a packet's switching cost is largely independent of its size; to put it another way, the power of a device is determined by the number of packets per second (PPS) it is capable of processing. Thus, a test with 100-byte packets will give one result, while the same test with 1500-byte packets will produce a figure that is 15 times higher.
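The arithmetic behind this is easy to check. Here is a minimal sketch, assuming a hypothetical device limited to 100,000 packets per second (the PPS budget is an invented number purely for illustration):

```python
# Sketch: why packet size dominates the Mbps figure when the
# forwarding engine is limited by packets per second (PPS),
# not by bits per second.

def throughput_mbps(pps: float, packet_size_bytes: int) -> float:
    """Throughput in Mbps for a given PPS rate and packet size."""
    return pps * packet_size_bytes * 8 / 1_000_000

PPS_BUDGET = 100_000  # hypothetical device limit: 100 kpps

small = throughput_mbps(PPS_BUDGET, 100)   # 100-byte packets
large = throughput_mbps(PPS_BUDGET, 1500)  # 1500-byte packets

print(f"100-byte packets:  {small:.0f} Mbps")  # 80 Mbps
print(f"1500-byte packets: {large:.0f} Mbps")  # 1200 Mbps
print(f"ratio: {large / small:.0f}x")          # 15x
```

At the same PPS rate, the 1500-byte test reports exactly 15 times the Mbps of the 100-byte test, which is why a published figure means little without the packet size next to it.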
Finally, another important circumstance is the configuration loaded in the router, which can also have an effect that is just as important as the others.
In order to avoid these problems, a standardized test methodology under well-defined conditions exists, specified in RFC 2544 (with an applicability statement in RFC 6815). In an ideal world, all manufacturers would use these standards and the published data could be compared directly, without uncertainty. A slight downside to these tests is that they don't provide a single result, but rather a set of results obtained under a set of conditions. But that's another story for another day.
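To give a flavor of what such a standardized test looks like, the RFC 2544 throughput test searches for the highest offered rate at which the device forwards every frame without loss. The sketch below shows the idea of that search; `send_trial` is a hypothetical hook into a traffic generator, not part of any real API:

```python
# Sketch of the RFC 2544 throughput idea: binary-search for the
# highest offered rate (as % of line rate) with zero frame loss.
# `send_trial` is a hypothetical callback: it runs one fixed-duration
# trial at the given rate and returns the number of frames lost.

def rfc2544_throughput(send_trial, lo=0.0, hi=100.0, resolution=0.1):
    """Return the highest loss-free rate found, as % of line rate."""
    best = 0.0
    while hi - lo > resolution:
        rate = (lo + hi) / 2
        frames_lost = send_trial(rate)
        if frames_lost == 0:
            best, lo = rate, rate  # no loss: try a higher rate
        else:
            hi = rate              # loss: back off
    return best

# Usage with a simulated device that starts dropping above 62.5%:
result = rfc2544_throughput(lambda rate: 0 if rate <= 62.5 else 1)
print(f"throughput: {result:.1f}% of line rate")
```

The real methodology repeats this search once per frame size, which is precisely why the standard yields a set of results rather than a single number.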
Energy efficiency: powerful, high-performance routers
The performance of Teldat routers is usually far superior to that of similarly-priced competitors' routers; sometimes it is four or five times higher.
Furthermore, we try to be as objective as possible when providing performance data: we always quote bidirectional figures (the stated rate on each side simultaneously), use the IMIX packet size (a statistical average of Internet traffic) and a configuration of average complexity (ACLs + QoS). In other words, conditions similar to the real world, so that there are no surprises as with fuel consumption…
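For readers unfamiliar with IMIX, it is a weighted blend of packet sizes meant to mimic real Internet traffic. The sketch below computes the average packet size of one commonly cited "simple IMIX" (a 7:4:1 mix of 64-, 576- and 1500-byte packets); the exact sizes and ratios vary between vendors, so treat these numbers as illustrative rather than as Teldat's specific profile:

```python
# Sketch: average packet size of a "simple IMIX" profile.
# The 7:4:1 mix of 64/576/1500-byte packets is one common variant;
# exact sizes and weights differ between vendors and test tools.

IMIX = [(64, 7), (576, 4), (1500, 1)]  # (packet size in bytes, weight)

total_weight = sum(w for _, w in IMIX)
avg_size = sum(size * w for size, w in IMIX) / total_weight
print(f"average IMIX packet size: {avg_size:.1f} bytes")  # ~354 bytes
```

An IMIX figure therefore sits between the pessimistic small-packet result and the optimistic large-packet one, which is what makes it a fairer basis for published numbers.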