Shaping AI-optimised networks and enhancing security

As AI applications evolve, they place greater demands on network infrastructure, particularly in terms of latency and connectivity.

Supporting large-scale AI deployments introduces new issues, and analysts predict that AI-related traffic will soon account for a major portion of total network traffic. The industry must be prepared to handle this surge effectively. F5 is adapting its solutions to manage the complexity of AI workloads, and its technology now includes real-time processing of multimodal data.

Kunal Anand, Chief Technology and AI Officer at F5 (Source – F5)

AI presents both opportunities and risks in security, as it has the capability to enhance protection while also enabling AI-driven cyber threats. Collaboration among hyperscalers, telcos, and technology companies is critical for establishing AI-optimised networks. Collaboration and innovation continue to change the AI networking landscape, and F5 is dedicated to driving progress in this area.

Ahead of AI & Big Data Expo Europe, Kunal Anand, Chief Technology and AI Officer at F5, discusses the company’s role and initiatives to stay at the forefront of AI-enabled networking solutions.

AI News: As AI applications evolve, the demands on network infrastructure are becoming more complex. What key challenges does the industry face regarding latency and connectivity in supporting large-scale AI deployments?

Anand: F5 has found that AI is drastically transforming application architectures. Some companies are investing billions of dollars in AI factories – massive GPU clusters – while others prefer cloud-based solutions or small language models (SLMs) as less expensive alternatives.

Network architectures are evolving to address these challenges. AI factories operate on distinct networking stacks, such as InfiniBand paired with specific GPUs like NVIDIA’s H100 or its upcoming Blackwell series. At the same time, cloud-based technologies and GPU clouds are advancing.

A major trend is data gravity, where organisations’ data is locked in specific environments. This has driven the evolution of multi-cloud architectures, allowing workloads to link with data across environments for retrieval-augmented generation (RAG).

As RAG demands rise, organisations face higher latency because of constrained resources, whether heavily used data stores or limited pools of GPU servers.
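To make the RAG flow Anand describes more concrete, here is a minimal, illustrative sketch: a workload pulls context from wherever the relevant documents live, then passes it to a model. The corpus, source names, and the generate() stub are hypothetical placeholders for illustration only, not F5 technology or any specific vendor API.

```python
# Minimal sketch of retrieval-augmented generation (RAG): fetch context from
# data stores in different environments, then hand it to a model.
# All data sources and the model call below are hypothetical stand-ins.

from dataclasses import dataclass


@dataclass
class Document:
    source: str   # e.g. an on-prem data store or a cloud object store
    text: str


# "Data gravity": documents live in different environments.
CORPUS = [
    Document("on-prem-datastore", "Q3 revenue grew 12% year over year."),
    Document("cloud-object-store", "The support SLA is 99.95% uptime."),
]


def retrieve(query: str, corpus: list[Document], k: int = 1) -> list[Document]:
    """Naive keyword scoring stands in for a real vector search."""
    words = query.lower().split()
    scored = sorted(corpus, key=lambda d: -sum(w in d.text.lower() for w in words))
    return scored[:k]


def generate(prompt: str) -> str:
    """Stub for a call to an LLM/SLM endpoint."""
    return f"[model answer based on a prompt of {len(prompt)} characters]"


def answer(query: str) -> str:
    context = "\n".join(d.text for d in retrieve(query, CORPUS))
    return generate(f"Context:\n{context}\n\nQuestion: {query}")


print(answer("What is the support SLA?"))
```

Every retrieval step in this chain adds a network round trip, which is why heavily loaded data stores or scarce GPU capacity translate directly into higher end-to-end latency.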

AI News: Analysts predict AI-related traffic will soon make up a significant portion of network traffic. What unique challenges does this influx of AI-generated traffic pose for existing network infrastructure, and how do you see the industry preparing for it?

Anand: F5 believes that by the end of the decade, most applications will be AI-powered or AI-driven, necessitating augmentation across the network services chain. These applications will use APIs to communicate with AI factories and third-party services, access data for RAG, and potentially expose their own APIs. Essentially, APIs will be the glue holding this ecosystem together, as analysts have suggested.

Looking ahead, AI-related traffic is expected to dominate network traffic as AI becomes increasingly integrated into applications and APIs; as AI becomes central to practically every application, that traffic will only grow.

AI News: With AI applications becoming more complex and processing multimodal data in real time, how is F5 adapting its solutions to ensure networks can efficiently manage these dynamic workloads?

Anand: F5 looks at this from many angles. In the case of RAG, when data – whether images, binary streams, or text – must be retrieved from a data store, the method is the same regardless of format. Customers often want fast Layer 4 load balancing, traffic management, and steering capabilities, all of which F5 excels at. The company provides organisations with load balancing, traffic management, and security services, ensuring RAG has efficient access to data. F5 has also enabled load balancing among AI factories.
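For readers unfamiliar with the term, Layer 4 load balancing steers traffic at the connection level without inspecting payloads. The toy sketch below shows the idea with round-robin steering of TCP connections across two back ends; the listening address and back-end addresses are made up, and this is an illustration of the general technique, not F5’s implementation.

```python
# Toy Layer 4 (connection-level) load balancer: accept TCP connections and
# steer each one, round-robin, to a back-end endpoint without reading the
# payload. Addresses below are hypothetical examples.

import itertools
import socket
import threading

BACKENDS = [("10.0.0.11", 9000), ("10.0.0.12", 9000)]   # hypothetical back ends
_next_backend = itertools.cycle(BACKENDS)


def pipe(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes in one direction until the connection closes."""
    try:
        while chunk := src.recv(4096):
            dst.sendall(chunk)
    finally:
        dst.close()


def handle(client: socket.socket) -> None:
    backend_addr = next(_next_backend)                    # round-robin steering decision
    backend = socket.create_connection(backend_addr)
    threading.Thread(target=pipe, args=(client, backend), daemon=True).start()
    threading.Thread(target=pipe, args=(backend, client), daemon=True).start()


def serve(listen_addr=("0.0.0.0", 8080)) -> None:
    with socket.create_server(listen_addr) as srv:
        while True:
            client, _ = srv.accept()
            handle(client)


if __name__ == "__main__":
    serve()
```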

In some cases, large organisations manage massive GPU clusters with tens of thousands of GPUs. Since AI workloads are unpredictable, these GPUs may be available or unavailable depending on the workload. F5 ensures efficient traffic routing, mitigating the unpredictability of AI workloads.
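One simplified way to picture availability-aware routing: probe each GPU back end and choose only among those that respond. The endpoints and the TCP-connect probe below are illustrative assumptions, not F5’s health-checking logic.

```python
# Sketch of availability-aware steering: only route to GPU back ends that
# currently answer a probe. Endpoints and the probe method are hypothetical.

import random
import socket

GPU_BACKENDS = [("10.1.0.21", 8000), ("10.1.0.22", 8000), ("10.1.0.23", 8000)]


def is_up(addr: tuple[str, int], timeout: float = 0.5) -> bool:
    """Treat a successful TCP connect as 'available'."""
    try:
        with socket.create_connection(addr, timeout=timeout):
            return True
    except OSError:
        return False


def pick_backend() -> tuple[str, int]:
    healthy = [b for b in GPU_BACKENDS if is_up(b)]
    if not healthy:
        raise RuntimeError("no GPU back ends currently available")
    return random.choice(healthy)   # route only to reachable capacity
```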

F5 improves performance, increases throughput, and adds security capabilities for organisations building AI factories and clusters.

AI News: As AI enhances security while also posing AI-driven cyber threats, what approaches is F5 taking to strengthen network security and resilience against these evolving challenges?

Anand: There are many AI-related challenges on the way. Attackers are already employing AI to generate new payloads, find loopholes, and launch novel attacks. For example, ChatGPT and vision transformers can break CAPTCHAs, especially interactive ones, and recent demonstrations have shown how sophisticated these attacks have become.

As seen in past security patterns, every time attackers gain an advantage with new technology, defenders must rise to the challenge. This often necessitates reconsidering security models, like shifting from “allow everything, deny some” to “allow some, deny everything.” Many organisations are exploring solutions to combat AI-driven threats.
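The two postures Anand contrasts boil down to a blocklist versus an allowlist. A toy comparison, with made-up client identifiers:

```python
# "Allow everything, deny some" (blocklist) versus
# "allow some, deny everything" (allowlist / default-deny).
# Client identifiers are hypothetical.

BLOCKLIST = {"known-bad-bot"}
ALLOWLIST = {"billing-service", "mobile-app"}


def permit_default_allow(client_id: str) -> bool:
    # Legacy posture: anything not explicitly blocked gets in.
    return client_id not in BLOCKLIST


def permit_default_deny(client_id: str) -> bool:
    # Posture described above: only explicitly trusted clients get in.
    return client_id in ALLOWLIST


for client in ("mobile-app", "freshly-generated-ai-attacker"):
    print(client, permit_default_allow(client), permit_default_deny(client))
```

Under the default-allow posture, a never-before-seen AI-generated client slips through; under default-deny, it is rejected unless it has been explicitly trusted.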

F5 is making significant investments to stay ahead of AI-driven threats. As part of its F5 intelligence programme, the company is developing, training, and deploying models, supported by its AI Center of Excellence.

Earlier this year, F5 launched its AI Data Fabric, with a team dedicated to developing models that serve the entire business, from policy creation to insight delivery. F5 feels it is well placed to meet these emerging challenges.

AI News: What role do partnerships play in developing the next generation of AI-optimised networks, especially between hyperscalers, telcos, and tech companies?

Anand: Partnerships are important for AI development. The AI stack is complex and involves several components, including electricity, data centres, hardware, servers, GPUs, memory, computational power, and a networking stack, all of which must function together. It is unusual for a single organisation to oversee everything from start to finish.

F5 focuses on establishing and maintaining the necessary partnerships in computation, networking, and storage to support AI.

AI News: How does F5 view its role in advancing AI networking, and what initiatives are you focusing on to stay at the forefront of AI-enabled networking solutions?

Anand: F5 is committed to developing its technology platform. The AI Data Fabric, launched earlier this year, will work with the AI Center of Excellence to prepare the organisation for the future.

F5 is also forming strong partnerships, with announcements to come. The company is excited about its work and the rapid pace of global change. F5’s unique vantage point – processing worldwide traffic – enables it to correlate data insights with industry trends. F5 also intends to be more forthcoming about its research and models, with some open-source contributions coming soon.

Overall, F5 is incredibly optimistic about the future. The transformative impact of AI is remarkable, and it is an exciting time to be part of this shift.

(Image by Lucent_Designs_dinoson20)

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Intelligent Automation Conference, BlockX, Digital Transformation Week, and Cyber Security & Cloud Expo.

Explore other upcoming enterprise technology events and webinars powered by TechForge here.

Tags: artificial intelligence, cybersecurity, network
