Corporations can generate large volumes of data through the products and services they offer to their customers. Analyzing these data appropriately can provide very useful information for optimizing and improving the services offered and even generating new products and services. In short, good data analysis can give a corporation clear competitive advantages.
As I have done in previous years around this time, I would like to take this opportunity to summarize some of the most important issues we have encountered in 2017, which is proving to be a very interesting year for the telecommunications sector.
A couple of weeks ago, Madrid was the venue for what has been heralded as Europe’s largest digital transformation trade show. Leading multinationals and the smallest of startup companies took the opportunity to showcase their products and services related to this new technological revolution that aims to change the way we live and work… Or maybe not?
Firstly, the move of IT infrastructure to the cloud means our current understanding of Layer 3 (IP) network traffic is insufficient to characterize the applications transmitting over the network: application servers in traditional data centers had fixed, known IP addresses, whereas IP addressing in the cloud is no longer controlled by the organization using those services.
Secondly, far more applications (both corporate and personal) are in circulation today than a few years ago. Said applications have not, in general, been designed with bandwidth optimization in mind and all have different needs and behaviors. This means some applications can (and do) adversely affect others if the network is incapable of applying different policies to prevent this.
The vast majority of applications use HTTP and HTTPS for communication, mainly to evade or minimize possible negative effects arising from security policies or IP addressing (NAT) on the network. As a result, the transport layer (TCP or UDP port) can no longer adequately identify network applications, because most of them use the same ports (80 for HTTP and 443 for HTTPS).
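The limitation can be seen in a minimal sketch (the application labels and flows below are illustrative, not captured traffic): classifying purely by destination port collapses several distinct applications into a single "https" label.

```python
# Sketch of why port-based classification fails: a classic port-to-protocol
# map cannot tell apart applications that all ride on HTTPS.
PORT_MAP = {80: "http", 443: "https", 53: "dns"}

# Hypothetical flow records (application name, destination port).
flows = [
    {"app": "video-conference", "dst_port": 443},
    {"app": "file-sync", "dst_port": 443},
    {"app": "crm-saas", "dst_port": 443},
    {"app": "intranet", "dst_port": 80},
]

labels = sorted({PORT_MAP.get(f["dst_port"], "unknown") for f in flows})
print(labels)  # ['http', 'https'] -- four applications, only two labels
```

Distinguishing the three HTTPS applications above requires inspecting the payload or its metadata (DPI), which is exactly the capability discussed below.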
To further aggravate the problem, companies must provide connectivity to an enormous array of ‘authorized’ local devices. Remote local networks today, unlike the traditional single terminal of yesterday, are far more varied and far less controlled: wireless offices, guest access, home access, BYOD, IoT, etc. Consequently, the difficulties in analyzing traffic, caching systems and CDNs also escalate.
Finally, this greater diversity increases security risks: viruses, malware, bots, etc. These, in turn, tend to generate “uncontrolled” network traffic that needs to be detected and characterized. At this point, the close link between visibility and security at the network level raises its head (with all its repercussions), a subject that we’ll tackle another day.
The above points make it very clear that analyzing network traffic has become increasingly intricate over the last few years, boosting the need for new tools with greater capacity. Otherwise, we simply won’t know what is going through our network, not only placing it at risk but also unnecessarily increasing its upkeep. Given the tremendous amount of information handled, it is absolutely essential to use tools that can intelligently filter the information received and provide a high level of granularity in analysis and reports. It’s here that big data analysis technologies bring huge advantages over traditional tools.
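At its simplest, "intelligent filtering" means aggregating raw records and reporting only what matters. A minimal sketch, using hypothetical per-flow records rather than real export data:

```python
from collections import Counter

# Hypothetical flow records: (application label, bytes transferred).
flows = [
    ("video", 900_000), ("backup", 750_000), ("mail", 40_000),
    ("video", 600_000), ("crm", 15_000), ("backup", 300_000),
]

# Aggregate per application, then report only the heaviest consumers
# instead of forwarding every raw record upstream.
totals = Counter()
for app, nbytes in flows:
    totals[app] += nbytes

for app, nbytes in totals.most_common(2):
    print(f"{app}: {nbytes} bytes")
# video: 1500000 bytes
# backup: 1050000 bytes
```

Real big data pipelines apply the same reduce-then-rank pattern, just distributed across far larger volumes and many more dimensions.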
Well aware of these difficulties, users need application visibility and control solutions that meet the following needs:
- Said solutions must be able to scale down to small and medium corporate offices, offering a sound compromise between the CPU requirements (cost) of DPI (Deep Packet Inspection) and the number of applications detected (customer service and quality of application detection).
- Integrating intelligent detection into remote routers and using a centralized management tool, as opposed to current market solutions based on polling proprietary remote points and hardware appliances (also proprietary), allows for excellent detection granularity and affordable operation, scalable to any network size.
- Instead of opting for proprietary solutions, it is crucial to use suppliers that adopt standard protocols to communicate visibility information (NetFlow/IPFIX, for example). This allows customers to use their own information collection methods if they so wish.
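The value of a standard export format is that any collector can decode it. As a rough illustration only, the sketch below reads one flow record in a fixed, simplified layout (source IP, destination IP, application ID, byte count); real NetFlow v9 and IPFIX exports are template-driven, so this hypothetical layout stands in for the general idea of consuming a documented binary format.

```python
import struct

# Simplified, hypothetical flow-record layout: two 32-bit IPs, a 32-bit
# application ID, and a 64-bit byte counter, all in network byte order.
RECORD_FMT = "!IIIQ"
record = struct.pack(RECORD_FMT, 0x0A000001, 0x0A000002, 13, 48_000)

src, dst, app_id, nbytes = struct.unpack(RECORD_FMT, record)

def ip_str(addr):
    """Render a 32-bit integer as dotted-quad notation."""
    return ".".join(str((addr >> s) & 0xFF) for s in (24, 16, 8, 0))

print(f"{ip_str(src)} -> {ip_str(dst)} app={app_id} bytes={nbytes}")
# 10.0.0.1 -> 10.0.0.2 app=13 bytes=48000
```

Because the wire format is published, a customer can write (or buy) a collector like this independently of the router vendor, which is precisely the point of demanding standard protocols.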
As part of its access routers and management tool, Colibri Netmanager, Teldat offers visibility and control solutions for network applications capable of meeting the aforementioned market needs.
Three years ago, Gartner predicted that by 2017 CMOs would spend more money on IT than CIOs. So far, that prediction appears to be on track to come true. The reason is that, although the opposite may seem true, (successful) marketing is all about analyzing the largest possible amount of data and variables, which allows you to identify behaviors, reactions, preferences and trends and to develop the right products and messages that lead consumers or customers to purchase those products. Today, technology has given us the ability to capture vast amounts of data, which are held by the CIO. These data are unstructured and, as such, impossible to manage.
These data are useless to the CMO, who must process them to turn them into understandable information. And even then, this enormous quantity of information would be impossible to process, because humans lack the capacity to effectively use the amount of information that can be received in a single day. The information has to be placed in the right context and related to everything else that is available, turning it into knowledge that can support decision making and thus reduce response or action time.
Organizations need knowledge, not data
This situation is particularly noticeable in Wi-Fi networks. Each access point can gather data on the IP addresses of connected devices, when they connect and for how long, report the location (probably as coordinates) of the access point accessed by the device, provide browsing lists and a whole host of other technical data. However, all this data does not constitute a valid form of information on which to base decisions and take action. And if that is true for a single device, imagine what happens in large spaces where the flow of people runs into the thousands or hundreds of thousands.
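One small step from raw data toward information: turning an access point's association log into per-device dwell times. The log entries and MAC addresses below are invented for illustration; a real deployment would read them from the management platform.

```python
from datetime import datetime

# Hypothetical association log: (device MAC, connect time, disconnect time).
log = [
    ("aa:bb:cc:00:00:01", "2017-10-01 09:00", "2017-10-01 09:45"),
    ("aa:bb:cc:00:00:02", "2017-10-01 09:10", "2017-10-01 10:40"),
    ("aa:bb:cc:00:00:01", "2017-10-01 12:00", "2017-10-01 12:30"),
]

FMT = "%Y-%m-%d %H:%M"
dwell = {}  # total minutes on site per device, across all visits
for mac, start, end in log:
    minutes = int((datetime.strptime(end, FMT)
                   - datetime.strptime(start, FMT)).total_seconds() // 60)
    dwell[mac] = dwell.get(mac, 0) + minutes

print(dwell)
# {'aa:bb:cc:00:00:01': 75, 'aa:bb:cc:00:00:02': 90}
```

Even this tiny summary (who stays, and for how long) is already closer to something a marketer can act on than the raw connect/disconnect events are.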
The interesting thing about technology is the way in which it can help to increase business. In this case, how it can be used to understand customer behavior and encourage buying. The wealth of information gathered by the access points, managed by the CIO, represents an incredible opportunity for many sectors when big data analysis techniques are correctly applied. It then becomes possible to transform terabytes of unstructured data into accessible, organized information (knowledge) that the CMO can use to significantly improve their strategies, even building one-to-one campaigns that aim to reach users and consumers wherever they might be. If you combine geolocation techniques with all the information pouring into social media networks, the real-time knowledge you possess is so powerful that it allows you to know pretty much anything and act accordingly. And all this using only public, authorized information, since the users themselves are responsible for uploading it onto social networks (with their own user-defined privacy criteria). So, as long as the information is used properly, no legal issues should arise.
Like it or not, we live in a connected world and technology offers enormous possibilities that weren’t even conceivable some years ago. What we need to do now is to change the paradigm and start seeing technology as an important driver of business growth, when aligned with business needs, rather than viewing it as mere infrastructure. This is exactly what Gartner predicted three years ago.
Teldat possesses the infrastructure, tools and knowledge to offer organizations of all sizes (including SMEs and SMBs) the necessary solutions to meet this change of vision.
And so it seems that, at least in this respect, Gartner was right again.