May 19, 2020

Deep protection and the intelligent network

Data Centres
Guy Matthews
4 min

High performance networks need security to match. Guy Matthews of Zonic Group considers some recent advances in the field.

Today’s high-performance networks, and the hyperscale data centres they serve, are the backbone of the global economy. The security that protects these assets plays a crucial role in minimising the risks posed to essential data by hackers and malware, and safeguards the cloud-based applications and services we ultimately all rely on.

But the level at which these networks operate can make it hard to find security solutions that match. “As network speeds gather pace, particularly in the data centre where you have 100G Ethernet at some connection points and 400G Ethernet at others, the challenge is to scale the performance of the security to keep up, particularly at the edge of the network where it is connecting to the public Internet,” explains Bob Wheeler, principal analyst with independent consulting firm the Linley Group.
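To put those line rates in perspective, a back-of-the-envelope calculation (an illustrative sketch, not from the article) shows how little time a security function has per packet when keeping up with small frames at full line rate:

```python
def packet_rate_mpps(link_gbps: float, frame_bytes: int = 64) -> float:
    """Worst-case packet rate in millions of packets per second.

    Each Ethernet frame carries 20 bytes of on-wire overhead:
    7B preamble + 1B start-of-frame delimiter + 12B inter-frame gap.
    """
    wire_bytes = frame_bytes + 20
    return link_gbps * 1e9 / (wire_bytes * 8) / 1e6

for gbps in (100, 400):
    mpps = packet_rate_mpps(gbps)
    ns_per_packet = 1e3 / mpps
    print(f"{gbps}GbE: {mpps:.1f} Mpps, ~{ns_per_packet:.1f} ns per packet")
```

At 100GbE with minimum-size frames this works out to roughly 148.8 million packets per second, leaving under 7 nanoseconds to inspect each packet, which is the scaling problem Wheeler describes.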

While relatively straightforward software-based measures are enough for most enterprise security needs, a different order of protection is required when it comes to looking after the assets of network operators, says Wheeler: “High performance networks with very high speed network connections need protection in real time, and so depend on innovations at the very high end of the network security appliance market,” he adds. “They need firewalls that look deep inside packets to inspect the content, as well as appropriate intrusion prevention and detection systems and anti-virus scanning.”

A number of vendors have been at work developing solutions that support this kind of deep packet inspection, based around advances in regular expression (RegEx) technology. While basic RegEx solutions have been used for decades to search high volumes of text, high-performance RegEx processing engines are the backbone of a new generation of intrusion detection systems and next-generation firewalls designed to scan packets of data for patterns that indicate an attack or the appearance of malware. Security appliances most commonly implement RegEx searches in software. A leading example of development in this area is Intel, which in 2013 acquired Sensory Networks, developer of the Hyperscan pattern-matching library. Hyperscan has since been open sourced and has proved successful.
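In essence, these engines apply a database of RegEx signatures to every packet payload. A minimal sketch of the idea, using Python's standard `re` module and a hypothetical signature set (real IDS rule bases such as Snort's run to tens of thousands of patterns, which is where libraries like Hyperscan come in):

```python
import re

# Hypothetical signatures of the kind an IDS rule base compiles into
# a RegEx engine. Names and patterns here are illustrative only.
SIGNATURES = {
    "shell-download": re.compile(rb"wget\s+https?://\S+\.(?:sh|exe)"),
    "sql-injection":  re.compile(rb"(?i)union\s+select"),
    "path-traversal": re.compile(rb"\.\./\.\./"),
}

def scan_payload(payload: bytes) -> list:
    """Return the names of all signatures that match the payload."""
    return [name for name, pat in SIGNATURES.items() if pat.search(payload)]

print(scan_payload(b"GET /index.php?id=1 UNION SELECT password FROM users"))
# -> ['sql-injection']
```

The sketch tests each pattern sequentially; production engines instead compile the whole signature set into a single automaton so every pattern is evaluated in one pass over the data.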

However, at faster speeds such as 100GbE, software-based screening cannot keep up, so others have taken a hardware-led approach: Broadcom, for example, integrated a RegEx processor into some of its chips, and Cavium, now owned by Marvell, also developed products in this area. Perhaps the current leader in the hardware-based RegEx engine space is Belfast-based Titan IC. Its network intelligence technology and analytics accelerators are designed to search and generate meaningful insights from data within cloud, storage and network environments. The company's RXP acceleration engine is seen by many as the industry benchmark for high-speed complex pattern matching, real-time Internet traffic inspection and the detection of strings, keywords and malware using RegEx.

“What is unique about Titan IC is that they have developed a hardware-based RegEx engine which is highly parallel and able to scale to the performance needed,” says Wheeler. “Similar designs have had difficulty scaling and tend to break down when facing large rule sets or a lot of matches in the packet data. The other thing Titan has focussed on with some success is the ability to deliver more predictable performance with large databases.”

Wheeler says the performance of other hardware-based RegEx engines has tended to fall off with the bigger databases, generally due to memory limitations: “Makers of firewalls and intrusion detection systems, the potential customers for these engines, sometimes find it difficult to map their proprietary rule sets to the peculiarities of other hardware designs,” he explains.
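The scaling difficulty Wheeler describes can be seen even in software. A common trick is to fold many rules into one alternation with named groups, so one pass reports which rule fired; but a backtracking engine still tries the branches sequentially, which is why software throughput degrades as rule sets grow, while parallel hardware designs like Titan IC's aim to hold performance steady. A simplified sketch, with illustrative rule names and patterns:

```python
import re

# Hypothetical rules; real rule sets contain thousands of entries.
RULES = {
    "rule_a": rb"evil\.example\.com",
    "rule_b": rb"(?:%2e){2}%2f",   # URL-encoded "../"
    "rule_c": rb"cmd\.exe",
}

# Compile all rules into one alternation; each rule becomes a named group.
COMBINED = re.compile(
    b"|".join(b"(?P<" + name.encode() + b">" + pat + b")"
              for name, pat in RULES.items())
)

def first_match(payload: bytes):
    """Return the name of the first rule that matches, or None."""
    m = COMBINED.search(payload)
    return m.lastgroup if m else None

print(first_match(b"GET /cmd.exe HTTP/1.1"))  # -> rule_c
```

Hardware engines effectively evaluate all the groups of such an alternation at once, which is also why memory for the compiled rule database, rather than match logic, tends to become the limiting factor.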

Titan IC has just been acquired by Mellanox, another vendor of networking products, and the industry awaits next steps with interest. The fact that Mellanox itself is in the process of being acquired by NVIDIA makes prospects even more intriguing.

“Mellanox has acquired a unique piece of technology and a team that is very knowledgeable in this space, as well as a base in Belfast,” suggests Wheeler. “It has some heritage in the network security space as a result of past acquisitions. Titan is complementary. What Mellanox does with it in the context of the SmartNIC, their main business, remains to be seen. Will customers want to offload functions onto the Titan RegEx engine? I don’t know. The forthcoming NVIDIA deal opens up some new possibilities when that closes. NVIDIA might be able to apply this technology in other areas such as artificial intelligence. There is also a possible synergy with Mellanox’s activity in high performance computing, where they have long been offloading some of the data processing from the server and into the network, at the adapter or switch level.”

Kevin Deierling, Mellanox's senior vice president of marketing, says the company remains focussed on growth and innovation, regardless of the NVIDIA deal. He is excited about what he now thinks can be achieved in the network intelligence field with Titan IC on board: “By mining data in novel ways, and bringing artificial intelligence and machine learning into the mix, you will be able to discern information that would otherwise not have been obvious,” he says. “This has implications across a number of industries – including healthcare, pharma, finance, trading, government. We’re just scratching the surface at the moment. I see a tremendous opportunity to innovate by looking inside data packets even more intelligently.”

From Zonic Group


Jun 18, 2021

Intelliwave SiteSense boosts APTIM material tracking

3 min
Intelliwave Technologies outlines how it provides data and visibility benefits for APTIM

“We’ve been engaged with the APTIM team since early 2019 providing SiteSense, our mobile construction SaaS solution, for their maintenance and construction projects, allowing them to track materials and equipment, and manage inventory.

We have been working with the APTIM team to standardize material tracking processes and procedures, ultimately with the goal of reducing the amount of time spent looking for materials. Industry studies show that better management of materials can lead to a 16% increase in craft labour productivity.

Construction is one of the oldest industries, yet comparatively one of the least technology-driven. About 95% of engineering and construction data captured goes unused, 13% of working hours are spent looking for data, and around 30% of companies have applications that don’t integrate.

With APTIM, we’re looking at early risk detection, through predictive analysis and forecasting of material constraints, integrating with the ecosystem of software platforms and reporting on real-time data with a ‘field-first’ focus – through initiatives like the Digital Foreman. The APTIM team has seen great wins in the field, utilising barcode technology to check in thousands of material items quickly compared to manual methods.

There are three key areas when it comes to successful Materials Management in the software sector – culture, technology, and vendor engagement.

Given the state of world affairs, access to data needs to be off site via the cloud to support remote working conditions, providing a ‘single source of truth’ accessible by many parties. As the tech sector keeps growing, companies need faster and more reliable access to this cloud data. Digital supply chain initiatives engage vendors much earlier in the process to drive collaboration with their clients, which gives more assurance as there is greater emphasis on automating data capture.

It’s been a challenging period with the pandemic, particularly for the supply chain. Look what happened in the Suez Canal – things can suddenly impact material costs and availability, and you really have to be more efficient to survive and succeed. Virtual system access can solve some issues, and you need to cast a wider net when it comes to data access.

Solving problems comes down to better visibility, and proactively solving issues with vendors and enabling construction teams to execute their work. The biggest cause of delays is not being able to provide teams with what they need.

On average, 2% of materials are lost or re-ordered, and that figure only captures the material cost. What is not captured is the duplicated procurement effort, plus vendor and shipping costs, all of which also carry an environmental impact.

As things start to stabilise, APTIM continues to utilize SiteSense to boost efficiencies and solve productivity issues proactively. Integrating with 3D/4D modelling is just the start of what we can do. Access to data can help you firm up bids to win work and make better cost estimates, and AI and ML are the next phase, providing an ecosystem of tools.

A key focus for Intelliwave and APTIM is to increase the availability of data, whether it’s creating a data warehouse for visualisations or increasing integrations to provide additional value. We want to move to more of an enterprise usage phase – up to now it’s been project based – so more people can access data in real time.
