Accelerating AI Innovation at the Edge

IT modernization is unlocking data at the edge, powering analytics, artificial intelligence (AI), and generative AI (GenAI). It is connecting data directly to use cases and business outcomes, driving profitability and accelerating competitive advantage, experts noted in a recent discussion about how IT leaders are bringing AI to their data at the edge.

“Edge might be one of the best places to start with AI because that’s where most of the data is originating,” Nicholas Brackney, senior consultant, product marketing in the Dell Technologies generative AI team, noted in the webinar discussion. “It’s where customers and employees are interacting with your business. It’s where there are going to be a ton of use cases … edge is one of the best investments you can make.”

The edge involves many locations, little on-site IT support, and challenges with bandwidth and temperature control. Edge workloads support a wide range of use cases, from manufacturing quality control to smart grid management to distributed retail operations.

Organizations have to consider how to manage, update, and secure the wide diversity of hardware, software, and devices at the edge, said Chhandomay Mandal, director, solutions marketing, Dell Technologies.

“All of these things lead to a challenging environment at the edge that is vastly different from a data center or a multi-cloud environment. Now you bring in AI, and it adds to the mix,” he said.

A MeriTalk survey of IT decision-makers, conducted in partnership with Dell Technologies, Microsoft, and NVIDIA, found that challenges of implementing AI at the edge include cybersecurity, data management, legacy integration, and the workforce skills gap. It also revealed that AI leaders are nearly twice as likely as other organizations to say they’ve implemented or expanded edge computing infrastructure.

To effectively implement AI at the edge, organizations need AI-enabled infrastructure – and a solution that supports virtualization, automates AI workflow tasks, and manages infrastructure effectively at scale.

The Dell AI Factory with NVIDIA provides a full-stack, virtualized edge solution, optimized for AI, that helps organizations create and deploy models securely and consistently at scale, Mandal said. The goal is to bring AI as close as possible to where data resides to minimize latency, lower costs, and maintain data security by keeping sensitive information within a controlled environment.

According to Brackney, when organizations started on their GenAI journeys within the last few years, many IT leaders thought they would need to build their own models. Then came an explosion of open-source models, followed by techniques for augmenting existing models, such as retrieval-augmented generation (RAG).
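For readers new to the pattern, the sketch below shows the general shape of RAG: relevant documents are retrieved from an organization's own data and added to the prompt before a model generates an answer. It is a minimal illustration only; the sample documents, the keyword-overlap retriever, and the generate() placeholder are hypothetical stand-ins rather than any specific product's API, and a production edge deployment would use embeddings, a vector index, and a real model endpoint.

```python
# Minimal RAG sketch: retrieve relevant context from local data,
# prepend it to the prompt, then ask a model to answer.
from typing import List

# Tiny in-memory "knowledge base" standing in for an edge-local data store (hypothetical).
DOCUMENTS: List[str] = [
    "Line 3 welding robots report a defect rate of 0.4% this week.",
    "The smart grid substation firmware was last updated in March.",
    "Store 112 carries seasonal inventory for the northeast region.",
]

def retrieve(query: str, docs: List[str], top_k: int = 2) -> List[str]:
    """Rank documents by naive keyword overlap with the query.
    A real system would use embeddings and a vector index instead."""
    query_terms = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(query_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def generate(prompt: str) -> str:
    """Placeholder for a call to a local or hosted language model."""
    return f"[model response to a prompt of {len(prompt)} characters]"

def answer(query: str) -> str:
    # Augment the prompt with retrieved context before generation.
    context = "\n".join(retrieve(query, DOCUMENTS))
    prompt = f"Use the context below to answer.\nContext:\n{context}\n\nQuestion: {query}"
    return generate(prompt)

if __name__ == "__main__":
    print(answer("What is the defect rate on line 3?"))
```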

AI factories support a wide variety of AI use cases, with end-to-end validation across the entire AI lifecycle, from inferencing and RAG to model tuning, development, and training.

“The good news is that everyone can participate in this. If you have developers, you can do AI, and you can do AI with your data,” he said.

For more insight, view the webinar on demand.