Next Gen Intrusion Prevention System

If the new deployment involves next-generation firewalls or Intrusion Prevention Systems (IPS), this decision can have added challenges. The sophisticated high-performance network and security devices within these infrastructures require a more comprehensive approach to testing and validation than traditional testing tools can provide. Today’s devices use deep packet inspection (DPI) to examine traffic in ways that legacy testing tools were never designed to validate.


Rethink testing around repeatable, quantitative principles. Create a plan for stressing each device under test (DUT) with real-world application, attack, and malformed traffic at heavy load. This is not as simple as taking the older, ad hoc approach to testing and then injecting authentic traffic. Instead, the entire plan should embrace a standardized methodology and scientific approach that eliminates guesswork: repeatable experiments that yield clear, quantitative results, so the capabilities of DPI-enabled devices can be validated accurately. In the past, IT professionals lacked the precision equipment necessary to enforce consistent standards across testing processes. Today, they have access to testing products that generate authentic network traffic and capture precise measurements of its effects, even in complex environments.
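To make the "repeatable experiment with quantitative results" idea concrete, here is a minimal sketch in Python. Everything in it is an illustrative assumption, not a real test-tool API: the `run_experiment` function, the simulated latency measurement, and the metric names are all invented for the example. The point is the structure: a fixed seed drives the run, and the output is a set of numbers that can be compared across devices and across repeated runs.

```python
import random
import statistics

def run_experiment(seed: int, trials: int = 5) -> dict:
    """Run one repeatable experiment and return quantitative results.

    The rng stands in for a seeded traffic generator; the latency
    values stand in for measurements taken against a real DUT.
    """
    rng = random.Random(seed)  # fixed seed -> the run is repeatable
    latencies_ms = []
    for _ in range(trials):
        # Placeholder for sending one seeded traffic mix through the
        # DUT and measuring its response latency.
        latencies_ms.append(10 + rng.random() * 2)
    return {
        "seed": seed,
        "mean_latency_ms": statistics.mean(latencies_ms),
        "max_latency_ms": max(latencies_ms),
    }

# Re-running with the same seed reproduces the same numbers, so any
# difference between two devices' results reflects the devices, not
# run-to-run variation in the test itself.
first = run_experiment(seed=42)
second = run_experiment(seed=42)
assert first == second
```

Because the run is fully determined by the seed, results from different vendors' devices can be compared on equal footing, which is the core of the scientific approach described above.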
Use standardized scores to separate pretenders from contenders. It is relatively straightforward to use standardized scoring methods to pare down a long list of candidate devices without performing comprehensive validation of each product. These scores quickly eliminate devices that clearly do not meet an organization's needs. The resulting score is presented as a numeric grade from 1 to 100; devices may receive no score at all if they fail to pass traffic at any point or if they degrade to an unacceptable performance level. The Resiliency Score takes the guesswork and subjectivity out of validation and allows administrators to quickly understand how system security will be affected under load, attack, and real-world application traffic.
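A hypothetical sketch of how score-based shortlisting might work in practice. The 1-to-100 scale and the "no score if the device stops passing traffic" rule come from the description above; the device names, scores, and the threshold value are invented for illustration and do not reflect any real product's Resiliency Score methodology.

```python
from typing import Dict, List, Optional

def shortlist(devices: Dict[str, Optional[int]], minimum: int = 80) -> List[str]:
    """Keep only devices whose score meets the bar.

    A score of None means the device failed to pass traffic at some
    point, or degraded to an unacceptable level, and so received no
    score at all.
    """
    return sorted(
        name
        for name, score in devices.items()
        if score is not None and score >= minimum
    )

candidates = {
    "vendor-a": 91,    # held up under load, attack, and app traffic
    "vendor-b": 64,    # passed traffic but degraded under load
    "vendor-c": None,  # stopped passing traffic entirely: no score
}
print(shortlist(candidates))  # -> ['vendor-a']
```

The value of this kind of gate is that it is cheap: weak candidates are eliminated numerically, and only the survivors proceed to the expensive, environment-specific testing described next.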
Test final contenders with individual test scenarios that mirror the production environment. True validation requires an accurate understanding of the application, network, and security landscape in which devices will be operating. Review the infrastructure's traffic mix, and the traffic mixes of its service providers, before designing individual tests; this ensures the testing equipment reflects the latest versions and types of application traffic that traverse the network. Generating realistic traffic is not enough, however: the traffic mix must also be repeatable yet random. Randomization makes test traffic behave like real-world traffic, creating unexpected patterns that force DUTs to work harder. Creating repeatable, random traffic requires testing equipment that drives a pseudorandom number generator (PRNG) from a fixed seed value, so the same standardized "random" traffic can be reproduced on demand.


IT departments should look for information that goes far beyond the performance and security features that can be read off a data sheet. They should measure the security and stability of their IPSs under real-world conditions, not generic lab conditions. Another common mistake is relying on test lab reports to make purchasing decisions. Labs often test devices in isolation, without regard to the unique environments of purchasers, and test lab reports are often funded by device manufacturers, which inevitably raises questions of objectivity. Ultimately, IT departments choose a firewall vendor without ever feeling that they truly understand how well the device will work. Will it actually distinguish between applications at a granular level, such as the difference between Facebook traffic and Facebook messaging traffic? Putting a next-gen firewall/IPS through proper context-aware testing is the only way to be confident it will perform as advertised.
If IT leaders follow these technical recommendations and avoid the common mistakes above, they can select the right products to meet their business objectives, improve infrastructure planning and resiliency by understanding device capabilities, and save up to 50 percent on IT investments. This approach also eliminates hundreds of man-hours of post-purchase configuration and tuning, and gives purchasers advance insight into device capabilities, enabling them to configure devices appropriately and avoid surprises and delays.