O’Leary (1992) was among the first researchers to point to the trend of
developing intrusion-detection systems. These systems could detect potential and actual intruders
in computer systems. The systems were based on user behavior: typically, a user’s
preferences, network usage, and course of action in a given situation are analyzed. Accordingly, the
system builds profiles of expected user behavior and uses these stored patterns as a basis to judge
the deviation. If a user departs from the norm, the deviation may be treated as an
intrusion. By matching observed behavior against the stored patterns, these systems
can detect anomalies. This also makes detection inexpensive, since little
computation is involved in the matching process. Statistical analysis of the collected data
sets is employed to improve the accuracy of such detection systems.
In addition, the frequency of attacks can be used to infer the vulnerability of a particular section of a
system, and these methods aid in applying detection and prevention
remedies. The use of mathematical approaches and models further improves the
accuracy of detection and the ease with which an attack is mitigated.
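The profile-and-deviation idea described above can be sketched minimally as follows. The activity metric, the sample data, and the three-standard-deviation threshold are illustrative assumptions, not details from O’Leary:

```python
# Sketch: profile-based anomaly detection via standard-deviation scoring.
# The metric (logins per hour) and threshold are assumed for illustration.
from statistics import mean, stdev

def build_profile(history):
    """Summarise a user's past activity as (mean, standard deviation)."""
    return mean(history), stdev(history)

def is_anomalous(observation, profile, threshold=3.0):
    """Flag an observation deviating more than `threshold` std devs from the norm."""
    mu, sigma = profile
    if sigma == 0:
        return observation != mu
    return abs(observation - mu) / sigma > threshold

logins_per_hour = [4, 5, 3, 6, 4, 5, 4, 5]   # assumed historical data
profile = build_profile(logins_per_hour)
print(is_anomalous(40, profile))   # a burst of 40 logins -> True (flagged)
print(is_anomalous(5, profile))    # normal behaviour -> False
```

The stored profile plays the role of the "expected behavior" pattern, and the threshold controls how large a deviation is treated as an intrusion.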
According to Lunt (1993), systems could be secured by checking for network
irregularities. Lunt surveyed the different intrusion detection systems that were available at the
time, with the research focusing on audit trail analysis and an emphasis on anomaly-based
intrusion detection capabilities. The approach was based on the principle of deviation, or anomaly.
The system was designed to analyze data that goes through a network. The data was expected to
meet some set standards against which it could be compared to check for compliance. If the data in
the network did not meet these standards, there was a high chance that the transmitted
data resulted from an intrusion. Detected deviations could then be checked
to confirm the intrusion. In the end, Lunt proposed an analysis of the different data that
goes through the network by comparing it to a precompiled signature database. Lunt further
argued that the application of data-set analysis was crucial in determining the accuracy of a
detection. Evaluating the deviation of either individual users or the data flowing in the network
is a valuable approach to countering system attacks. Moreover, detection may be
based on collecting and evaluating various kinds of data in the network. The analysis is achieved
through mathematical models aimed at presenting the data in a form that is easy to analyze and interpret.
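The signature-database comparison Lunt proposed can be sketched roughly as follows. The signature names and byte patterns here are invented for illustration and are not from the survey:

```python
# Sketch: matching network payloads against a precompiled signature database.
# The signatures below are hypothetical examples, not a real rule set.
SIGNATURES = {
    "sql_injection": b"' OR 1=1",
    "path_traversal": b"../../etc/passwd",
}

def match_signatures(payload: bytes):
    """Return the names of all known attack signatures found in the payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern in payload]

print(match_signatures(b"GET /index.html?id=' OR 1=1 --"))  # ['sql_injection']
print(match_signatures(b"GET /index.html"))                 # []
```

Any payload that matches an entry in the database is a candidate intrusion; deviations can then be examined for confirmation, as the text describes.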
Sommer and Feldmann (2002) examined the various
methods of analyzing network traffic, pitting SNMP against Cisco’s NetFlow and packet-level
analysis. Their research argues for reducing storage-based detection, since it adds
complexity to the already existing security problem. Therefore, the analysis of data as it
flows in the network would be much easier, more convenient, and more efficient than
storage-based detection. From their research, they concluded that while NetFlow
had significant benefits over the other methods of analyzing network data, it resulted in almost
unmanageable data stores. Although storage-based detection was easier to implement, it
required a great deal of complex configuration to avoid unmanageable volumes of data. This limitation
resulted in the need for a computational detection system based on mathematical concepts rather
than on storage. The researchers further argue that networks should be operated with accurate
traffic statistics so as to enhance security. The frequently used information sources are SNMP,
flow-level data such as Cisco’s NetFlow, and packet-level data. The first delivers
low-volume, non-application-specific data; the last offers high-volume,
application-specific data. NetFlow sits between the two in both level of detail and volume,
and the approach is particularly concerned with accurately aggregated TCP packet and connection summaries.
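A hedged sketch of the kind of flow-level aggregation NetFlow performs, grouping packets into summaries keyed by the conventional 5-tuple. The packet records are assumed data, not from the paper:

```python
# Sketch: aggregating individual packets into NetFlow-like flow summaries.
from collections import defaultdict

def aggregate_flows(packets):
    """Group packets by the 5-tuple and sum packet/byte counts per flow."""
    flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
    for pkt in packets:
        key = (pkt["src"], pkt["dst"], pkt["sport"], pkt["dport"], pkt["proto"])
        flows[key]["packets"] += 1
        flows[key]["bytes"] += pkt["size"]
    return dict(flows)

# Two packets of the same TCP connection collapse into one flow record.
packets = [
    {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 40000, "dport": 80,
     "proto": "tcp", "size": 1500},
    {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 40000, "dport": 80,
     "proto": "tcp", "size": 600},
]
summary = aggregate_flows(packets)
print(summary)  # one flow: 2 packets, 2100 bytes
```

This illustrates the trade-off the authors describe: the flow summary is far smaller than the raw packets, yet it retains connection-level detail that plain SNMP counters lack.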
According to Jacobs and Rudis (2014), the use of data analysis to monitor and analyze
network traffic would benefit IT security. Relying on data patterns and signatures will ensure
that the network data can be watched for any deviations from the norm as a frontline defense
against intrusions. Furthermore, they argue that information can be used to come up with an
alarm system that could notify the analyst about potential and actual threats. The
system could also be configured to measure the degree of deviation, which was intended to improve the
precision of detection. The data patterns and signatures serve as the standard for this
system, which makes the detection process more precise and convenient. The approach
relies on data patterns analyzed from the collected network data from which the detection can be
inferred. The data is scrutinized to check for any signatures that can provide evidence for a
security breach. Further, the network traffic is evaluated to check for any statistical deviations in
the systems’ operation. The statistical analysis provides reliable information on all the data
transferred in the network, so the network can be secured irrespective of its size.
Furthermore, the approach gives a detailed, timely analysis of data transmission in the network,
which helps not only in detection but also in the timely countering of an attack.
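One way the "degree of deviation" idea might look in code. The thresholds and severity names below are assumptions for illustration, not Jacobs and Rudis's actual design:

```python
# Sketch: an alarm that grades the degree of deviation from a baseline.
# Thresholds (2 and 4 standard deviations) are assumed, not prescribed.
def alert_level(observed, baseline, sigma):
    """Map the size of a deviation to a severity level for the analyst."""
    z = abs(observed - baseline) / sigma
    if z < 2:
        return "normal"
    if z < 4:
        return "warning"     # potential threat
    return "critical"        # probable intrusion

print(alert_level(observed=120, baseline=100, sigma=10))  # z = 2.0 -> "warning"
print(alert_level(observed=300, baseline=100, sigma=10))  # z = 20  -> "critical"
```

Grading the deviation rather than raising a binary alarm is what lets the analyst distinguish potential from actual threats, as the paragraph above describes.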
Marty (2009), after recognizing the inherent challenges that arise from
trying to secure increasingly complex networks, proposed data analysis and visualization
as a possible solution for understanding the internals of the network structure so as to make sense
of any hidden patterns in the network results that may signify an imminent attack or one that was
in progress. Through the use of data visualizations, he proposed a way to uncover
hidden patterns in the data that would reveal emerging susceptibilities and
attacks, and to respond decisively and proactively with countermeasures. Marty cited the relatively
higher success rate that comes with knowing the internals of the network,
compared with the conventional method. The system’s success rate is attributed to its ability to decisively
counter attacks that may be launched against the system. First, the data from the network is
collected and stored; it is then analyzed to produce a presentation of the results. The
visualization technique makes it easy to detect network irregularities, especially since the resulting
graphics can be studied and inferences drawn. Visualization tools may be used to detect a
probable attack that could cause damage or one that is already in progress. These two abilities
allow the technique to forecast an attack so that recommended remedies can be applied to stop it
or make it less destructive.
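As an illustrative sketch of this visualization idea (not Marty's actual tooling), even a minimal text histogram of connections per destination port can make irregularities stand out at a glance. The captured port list is invented:

```python
# Sketch: a minimal text "visualization" of traffic per destination port.
# An unusual spike in one port's bar would suggest an emerging attack.
from collections import Counter

def port_histogram(ports):
    """Render connection counts per port as a bar chart so outliers stand out."""
    counts = Counter(ports)
    lines = []
    for port, n in sorted(counts.items()):
        lines.append(f"{port:>5} | {'#' * n} ({n})")
    return "\n".join(lines)

observed_ports = [80, 80, 443, 443, 443, 22, 22, 22, 22, 22, 22, 22]  # assumed capture
print(port_histogram(observed_ports))
```

Here the disproportionate bar for port 22 is the kind of hidden pattern a graphical view surfaces immediately, whereas the same signal is easy to miss in raw logs.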
Keim et al. (2006), after researching the challenges of data analysis,
especially for information technology architecture, pointed out that the use of visuals could
significantly improve the network review process by making it possible to make sense of the
massive amounts of data such analysis typically produces. The authors contend that the use of
visualization technologies would make it easier to monitor and analyze data since visual data is
easier to interpret. The technique favors mathematical analysis, since such precision is key to its
detection methods, and it also aims to help analyze the large amounts of
data transmitted through the network. This improved approach helps solve the problem of busy
traffic, as all the data can be analyzed with little effort. Since the visualization technique may
be used to analyze large bodies of data, detection systems based on this approach are suitable
for handling networks with heavy traffic. Furthermore, the analysis of visual data can be
beneficial in the review of network data sets, making it possible to check
for possible network attack trends. Moreover, this technique is applied to aid in the
design of the information technology architecture inclusive of the network segments of the
system. The tools used in visualizing overall network characteristics may further be used to
analyze a section of the network.
Conti (2007) expressed a similar view on the value of visualization for handling the
information overload that resulted from collecting network data. Conti makes note of the fact
that network data is voluminous and that traditional analysis methods like the use of logs, packet
captures, alerts, and even binary files cannot keep up. Since those methods
take time and effort to analyze the generated data, visual representations of that
data improve its quality and value. Conti proposes the use of visual representation
methods and big data to ensure that effort better spent on other aspects
of the security perimeter is not wasted. Moreover, visualization saves on system storage, since
logs and other binary files are not employed, while still providing enough information about the
current system status. Therefore, instead of regenerating data from a storage resource the
approach analyzes data as it flows through a network and gives instant feedback. The system
then stores the analyzed data to allow for more scrutiny by the security teams. Visualization
techniques are not restricted to large networks but can be used on small networks as well.
Shiravi, Shiravi and Ghorbani (2007) carried out an analysis of the
different information security visualization tools available, noting that at
present, those wishing to use the tools face some challenges. Some of the problems the
authors identified included fine tuning the tools for better support of security-related data. The
authors then proceed to give a taxonomy of the existing methods while giving an analysis of their
individual strength and weaknesses. Among the frameworks they look at are Anubis, Cuckoo,
Joes Sandbox and ApiMon. Noting the unprecedented levels of sophistication displayed by
recent network intruders, the authors analyzed the frameworks for compatibility with current
networking tools. They also raise the question of whether the frameworks could withstand the
attacks launched against them and recover afterward. The authors’ analysis shows that the
tools used to secure a network should be flexible, robust, and efficient, and be able to recover after an
attack. If this standard cannot be met, then the network is left vulnerable to potential attacks.
Moreover, the tools should have strategic positioning and purposes so as to gain advantage over
possible attacks. The authors also argue that the sophistication of attackers cannot be
countered simply by using security tools, but rather by properly configuring them. Tools may not
be effective on their own but rather need well trained personnel to interpret the collected data and
infer a vulnerable or safe state of a system.
Collins (2014) pointed to the need for using data analysis to manage network traffic
since traditional methods of intrusion detection like log file analysis had failed. Pointing to the
vast stores of data that network traffic analysis produces, Collins contends that traditional means
of storing data like logs will not work. Thus, he proposes the use of big data systems such as Hadoop
and Hive as well as MapReduce. He notes the place of visualization and mathematical techniques to
make sense of the enormous volumes of data, but without getting into the specifics on the best
methods for data analysis. The visual analysis tools also make anomalies easier to identify and
correct. Collins divides the detection process into three distinct phases. The first is the
capture stage: since computers communicate only when they are connected, sensors
must be installed to reveal the data. The sensors can be network-based or host-based, and
Collins contends that sensor installation involves tools such as TCPdump. The second phase is
storage of the captured data, which involves collecting the data from the various sensors and
keeping it for analysis; Collins proposes SiLK and R as the tools for this task. The third phase is devoted to
investigative statistical analysis to reveal the information concealed in the stored data.
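The three phases can be sketched as a pipeline. Real deployments would use TCPdump for capture and SiLK for storage as Collins proposes; this sketch simulates each phase in memory with invented connection records:

```python
# Sketch of Collins's three phases: capture, store, analyze (all simulated).
from collections import Counter

def capture():
    """Phase 1: a sensor yields raw (destination, port) records (simulated)."""
    return [("10.0.0.5", 80), ("10.0.0.5", 80), ("10.0.0.9", 22)]

def store(records, repository):
    """Phase 2: records from the sensors are collected and kept for analysis."""
    repository.extend(records)

def analyze(repository):
    """Phase 3: exploratory statistics over the stored data."""
    return Counter(dst for dst, _ in repository)

repo = []
store(capture(), repo)
print(analyze(repo))  # Counter({'10.0.0.5': 2, '10.0.0.9': 1})
```

Separating the phases this way mirrors Collins's point: capture and storage are plumbing, while the investigative value lies in the statistical analysis over the accumulated records.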
According to Adams and Heard (2014), applying statistical analysis to network
data is not a new idea in network security. They assert that the rise of complex and
large networks presents both challenges and opportunities for statistical science in network
security. They further confirm that tests made on network structure help to a large extent in
understanding the sophisticated inferential procedures required to detect an intrusion. This
statistical approach moreover, equips security analysts with the needed skills for statistical
analysis. The researchers argue that, for precise and comprehensive data-set analysis of
large networks, classical inferential frameworks should be engaged, although
setting up such frameworks may prove remarkably difficult. They also note
that network attacks have increased the pressure to secure networks against unauthorized
intrusion. Therefore, the primary goal is to monitor and analyze network traffic data with the
intent of preventing and identifying malevolent activity. The fundamental approach involves
representing the links between network nodes, which makes graph analysis approaches appropriate.
Nevertheless, such approaches do not scale well to the difficulties of real networks,
since the timing aspect is not accounted for in the analysis.
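A minimal sketch of the graph representation described above; the connections are invented, and, as the authors caution, the plain edge set keeps only who-talked-to-whom and discards timing:

```python
# Sketch: representing network traffic as a graph of node-to-node links.
# Note: this representation deliberately drops timing, which is the
# limitation the text identifies for graph-based approaches.
from collections import defaultdict

def build_graph(connections):
    """Build an adjacency map from observed (source, destination) pairs."""
    graph = defaultdict(set)
    for src, dst in connections:
        graph[src].add(dst)
    return dict(graph)

conns = [("A", "B"), ("A", "C"), ("B", "C")]  # assumed observed connections
g = build_graph(conns)
print(sorted(g["A"]))  # ['B', 'C']
```

Once traffic is in this form, standard graph analyses (degree counts, unusual fan-out, new edges) can flag anomalous nodes, but two bursts of connections hours apart look identical to this structure.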
Bass, T., 2000. Intrusion detection systems and multisensor data fusion. Communications of the
ACM, 43(4), pp.99-105.
Collins, M., 2014. Network security through data analysis: building situational awareness.
O’Reilly Media, Inc.
Jacobs, J. and Rudis, B., 2014. Data-driven Security: Analysis, Visualization and Dashboards.
John Wiley & Sons.
Lee, C.P., Trost, J., Gibbs, N., Beyah, R. and Copeland, J.A., 2005, October. Visual firewall: real-
time network security monitor. In IEEE Workshop on Visualization for Computer
Security (VizSEC 05) (pp. 129-136). IEEE.
Lunt, T.F., 1993. A survey of intrusion detection techniques. Computers & Security, 12(4).
Marty, R., 2009. Applied security visualization (pp. 21-65). Upper Saddle River: Addison-Wesley.
Newman, M.E. and Leicht, E.A., 2007. Mixture models and exploratory analysis in networks.
Proceedings of the National Academy of Sciences, 104(23), pp.9564-9569.
Sommer, R. and Feldmann, A., 2002. NetFlow: Information loss or win?. In Proceedings of
the 2nd ACM SIGCOMM Workshop on Internet Measurement (pp. 173-174). ACM.
Adams, N. and Heard, N. eds., 2014. Data Analysis for Network Cyber-Security. Imperial College Press.