The Future Of AI: Verticalized Working Models

Steven Carlini, Vice President of Innovation and Data Center, Energy Management Business Unit, Schneider Electric.

AI has the power and potential to revolutionize how work gets done across industries and businesses. To deliver on that potential, AI will need to spread enterprise-wide in most organizations, supported by more IT infrastructure and data center capacity.

Historically, technology teams pushed AI. Today, business units within the enterprise are starting to pull AI aggressively for specific tasks and operations through verticalized AI models. General-purpose AI models work across a broad set of applications and can take in and produce a variety of data types, but they are often impractical for smaller, specialized use cases. Verticalized AI models, by contrast, are built around domain-specific data.

Many companies testing AI in pilot projects are starting to realize that integrating an entire model such as a large language model (LLM) into their core systems to deliver fast and accurate decisions or output is very "IT heavy" and most likely overkill. These models are designed to ingest enormous amounts of data; generally, the more data, the more accurate the output.

However, if your application depends on a finite amount of domain-specific data from one source or a very limited number of sources, you can deploy a smaller, more focused vertical model. Vertical, in this case, means the model is designed to solve a specific problem, address a particular use case or even drive a single process improvement within a given industry.

Take, for example, a restaurant located next to a sports venue that wants to fine-tune its menu choices and inventory for a specific night. Its AI model can be fed not only historical sales data but also the local and visiting teams' fan demographics (age, income level, cuisine preferences), the local weather, any competing events in the area and the restaurant's marketing efforts or spend.
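For illustration, here is a minimal sketch of such a vertical model, assuming the restaurant's history fits in a simple table. The column names, numbers and the choice of a gradient-boosted regressor are hypothetical, not a prescribed stack; the point is how small the data and the model can be.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical history of past game nights: one row per event.
history = pd.DataFrame({
    "expected_attendance": [18000, 22000, 15000, 21000, 19500, 23000],
    "visiting_fan_share":  [0.35, 0.20, 0.45, 0.30, 0.25, 0.40],
    "median_fan_age":      [34, 41, 29, 38, 36, 31],
    "temp_f":              [72, 55, 88, 63, 70, 58],
    "competing_events":    [0, 1, 0, 2, 0, 1],
    "marketing_spend":     [500, 200, 800, 300, 450, 600],
    "covers_served":       [310, 260, 340, 240, 300, 330],  # target to predict
})

X = history.drop(columns="covers_served")
y = history["covers_served"]

# A small gradient-boosted model is typically enough for this kind of
# low-volume, domain-specific tabular problem; no GPU cluster required.
model = GradientBoostingRegressor(random_state=0)
model.fit(X, y)

# Forecast covers for the upcoming game night so the kitchen can size
# its menu and inventory.
upcoming = pd.DataFrame([{
    "expected_attendance": 20000, "visiting_fan_share": 0.30,
    "median_fan_age": 35, "temp_f": 68,
    "competing_events": 1, "marketing_spend": 400,
}])
print(f"Forecast covers: {model.predict(upcoming)[0]:.0f}")
```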

Diagnose And Resolve Issues Faster
By leveraging verticalized AI, organizations can move faster than ever before, offer more automated options, optimize resource allocation and, in the most popular use case today, run predictive maintenance analytics to minimize downtime.

These dedicated, smaller models process less data and therefore need a smaller, less powerful IT stack, along with less backup power and cooling. This IT may use far fewer GPU accelerators, for example, or possibly need only high-powered CPUs. The IT stack configuration will depend not only on the quantity of data but also on the speed and accuracy the application requires.
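As a back-of-envelope illustration of that sizing logic, the figures below are purely hypothetical assumptions about load, latency and per-core inference time. They simply show how throughput and latency targets, not data volume alone, determine whether CPUs suffice or GPU accelerators are needed.

```python
import math

# Hypothetical sizing assumptions for a small vertical model.
REQUESTS_PER_SECOND = 20      # assumed peak inference load
LATENCY_BUDGET_MS = 200       # assumed response-time target
CPU_INFERENCE_MS = 35         # assumed per-request time on one CPU core
CORES_PER_SERVER = 32

# Requests one CPU-only server can sustain if each core handles them serially.
requests_per_server = CORES_PER_SERVER * (1000 / CPU_INFERENCE_MS)
servers_needed = math.ceil(REQUESTS_PER_SECOND / requests_per_server)

print(f"One CPU-only server sustains ~{requests_per_server:.0f} requests/s")
print(f"Servers needed at peak load: {servers_needed}")
print(f"Latency budget met without GPUs: {CPU_INFERENCE_MS <= LATENCY_BUDGET_MS}")
```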

Even if you want to, it's not practical to implement AI across all of your critical systems and process workflows at once. A more pragmatic approach is to take stock of the applications in your business value streams, evaluate their current capabilities and look for opportunities to improve efficiency or effectiveness, as well as opportunities to automate down the road.

It may make sense to start with the simplest process, or the one that can most easily demonstrate improvements such as cost savings, top-line growth or a better customer experience.

Where To Host A Vertical AI Application
The leading options for hosting are the cloud, in-house deployment and, most likely, the edge. Before coming to a decision, companies must weigh many factors: accuracy, cost, control, availability, scalability and growth, ease of adoption, security, cyber safety, time to value and others.

Many verticalized AI applications will use high-definition video as an input. Think of a high-definition camera analyzing wear on a piece of manufacturing equipment or a mesh of cameras monitoring traffic flow in an urban area. These applications will need to be at the edge. Many cloud providers know this and have a strategy for migration to the edge.
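As a rough illustration of the equipment-wear example, the sketch below assumes a single local camera attached to an edge node; the camera index, sampling interval, wear-scoring function and `report_upstream` stub are all hypothetical placeholders for a trained vision model and the plant's own telemetry pipeline. The point is that the high-definition frames are analyzed locally and only a small score travels upstream.

```python
import time
import cv2

CAMERA_INDEX = 0          # assumed local camera attached to the edge node
FRAME_INTERVAL_S = 1.0    # analyze one frame per second, not the full stream

def estimate_wear(frame) -> float:
    """Placeholder for a trained wear-detection model running locally.
    Image sharpness (Laplacian variance) stands in for a real score."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def report_upstream(score: float) -> None:
    """Only a small numeric score leaves the site; the HD video stays at the edge."""
    print(f"wear score: {score:.1f}")

def main() -> None:
    cap = cv2.VideoCapture(CAMERA_INDEX)
    if not cap.isOpened():
        raise RuntimeError("camera not available on this edge node")
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            small = cv2.resize(frame, (640, 360))  # shrink before inference
            report_upstream(estimate_wear(small))
            time.sleep(FRAME_INTERVAL_S)
    finally:
        cap.release()

if __name__ == "__main__":
    main()
```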

If the choice is between building your own edge infrastructure and outsourcing to a cloud provider's edge offering, the advantages of outsourcing are mostly convenience and speed. For companies that want to deploy their own edge IT, the main advantages center on control: over data privacy and sovereignty, cost, performance, and ethical and responsible AI guardrails.

A Race Or A Marathon?
We are at the very beginning of this massive AI wave. Edge AI is most likely less than 5% of the still-nascent data center edge buildout, and cloud edge buildout is only beginning. With the need to greatly improve the efficiency and automation of business processes across every industry, many companies will take the vertical AI path.

The accompanying data center and network buildout to enable this technology wave has taken a back seat to the incredible AI training buildout in hyperscale data centers, for now. The reality is that vertical AI will permeate all industries and processes in time. The big question is: Will it be analogous to the incredible sprint we are seeing in AI training, or will it be more of a massive marathon?