
Why You Shouldn’t Ignore Orchestration in Your Generative AI Approach


By Alex Saric

Discussions regarding Generative AI typically focus on Large Language Models (LLMs) or the use cases they are applied to. This is understandable, as it is the rapid advances in LLMs that have made Generative AI capable of automating so many tasks and delivering meaningful insights.

They are also what most of us first experimented with in our personal lives, typing prompts into ChatGPT or other LLM-based tools to create content or research a topic. LLMs are the engine behind the promise of Generative AI.

But when it comes to enterprise usage, LLMs alone are not sufficient, just as an automotive engine won’t get you anywhere without the other components of a car. In an enterprise environment, you need applications that users access to apply Generative AI to their business needs.

You also need data, specific to the function, that can be used together with information on the internet to provide relevant results from LLMs. In procurement, this includes internal data such as suppliers or purchases, and documents such as contracts or questionnaires. 

And you need something to connect it all and make sure the parts work together optimally. That is precisely what orchestration does. It is the glue in an enterprise Generative AI architecture. Yet it remains rarely discussed and poorly understood.

Key Functions of Orchestration Are:

Processing prompts

Some prompts will be concise and focused, but others, especially for more complex use cases, will be lengthy and contain multiple elements. Sometimes these are too long or convoluted for an LLM to process properly. Other times you would get a better response by asking several questions and piecing the LLM responses together. Effective orchestration assesses prompts and decides whether they should be broken up, and which LLM(s) to route them to for the best output. As use cases become more advanced and LLMs become more specialized, with relative strengths and weaknesses, this will rapidly become even more important.
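To make this concrete, here is a minimal sketch of what prompt processing in an orchestration layer might look like. The length threshold, the routing rules, and the `call_llm` callable are illustrative assumptions, not a description of any particular product.

```python
# Minimal sketch of prompt processing in an orchestration layer.
# `call_llm` is a hypothetical stand-in for whichever LLM API is used.
from typing import Callable, List

MAX_PROMPT_CHARS = 2000  # assumed threshold; real limits are model-specific

def split_prompt(prompt: str) -> List[str]:
    """Naively break a long, multi-part prompt into separate questions."""
    if len(prompt) <= MAX_PROMPT_CHARS:
        return [prompt]
    # Split on question marks as a crude decomposition of a convoluted prompt.
    parts = [p.strip() + "?" for p in prompt.split("?") if p.strip()]
    return parts or [prompt]

def route_model(sub_prompt: str) -> str:
    """Pick a model based on the type of request (illustrative rules only)."""
    if "contract" in sub_prompt.lower():
        return "legal-tuned-model"
    return "general-purpose-model"

def orchestrate(prompt: str, call_llm: Callable[[str, str], str]) -> str:
    """Break up the prompt, send each piece to a suited model, stitch the answers."""
    answers = [call_llm(route_model(p), p) for p in split_prompt(prompt)]
    return "\n\n".join(answers)
```

In practice, the decomposition and routing logic would be far more sophisticated, but the division of labor is the same: the orchestration layer, not the user, decides how a request reaches the LLM(s).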

Pulling relevant information

Depending on the use case, you may need to feed the LLM information from internal systems to combine with that available on the web. For example, a use case that discovers new suppliers that meet certain minimum criteria to invite to a sourcing event should consider the suppliers you are already working with before researching new options online. Orchestration layers can access relevant information and include it in the final queries sent to LLMs.
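As an illustration, the sketch below shows how an orchestration layer might pull internal supplier records into the query it builds for an LLM. The record structure and the `call_llm` callable are hypothetical placeholders, not a specific vendor API.

```python
# Minimal sketch of enriching a query with internal data before calling an LLM.
from typing import Callable, Dict, List

def fetch_current_suppliers(category: str, records: List[Dict]) -> List[Dict]:
    """Filter internal supplier records for the category in question."""
    return [r for r in records if r.get("category") == category]

def build_query(user_question: str, suppliers: List[Dict]) -> str:
    """Prepend internal context so the LLM considers existing suppliers first."""
    context = "\n".join(f"- {s['name']} (status: {s['status']})" for s in suppliers)
    return (
        "You are assisting a procurement team.\n"
        f"Current suppliers in this category:\n{context}\n\n"
        f"Question: {user_question}"
    )

def answer_with_context(question: str, category: str,
                        records: List[Dict],
                        call_llm: Callable[[str], str]) -> str:
    suppliers = fetch_current_suppliers(category, records)
    return call_llm(build_query(question, suppliers))
```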

Controlling quality

Errors and hallucinations (realistic-sounding but fabricated responses) are a legitimate concern when using Generative AI. Advanced orchestration can mitigate this risk by checking responses before they are returned to users.
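One way to picture such a check: ask the same question several ways and compare the answers before returning one, as in the sketch below. The rephrasings and the agreement threshold are assumptions for illustration only.

```python
# Minimal sketch of a consistency check run before returning a response.
from difflib import SequenceMatcher
from typing import Callable, List

AGREEMENT_THRESHOLD = 0.8  # assumed; real systems use use-case-specific confidence levels

def rephrasings(question: str) -> List[str]:
    return [
        question,
        f"Answer concisely and factually: {question}",
        f"Double-check your reasoning before answering: {question}",
    ]

def consistent_answer(question: str, call_llm: Callable[[str], str]) -> str:
    """Ask the same question several ways; only return the answer if responses agree."""
    answers = [call_llm(p) for p in rephrasings(question)]
    baseline = answers[0]
    scores = [SequenceMatcher(None, baseline, a).ratio() for a in answers[1:]]
    if all(s >= AGREEMENT_THRESHOLD for s in scores):
        return baseline
    return "Low confidence: responses were inconsistent; please review manually."
```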

Given these and other functions, orchestration capabilities play a key role in determining output quality as well as the range of use cases that can be deployed. Since most applications leverage broadly available third-party LLMs such as those behind ChatGPT, the biggest differences in potential value are often driven by the capabilities of the orchestration layer. As such, it is a critical element to evaluate when selecting Generative AI solutions for an enterprise.

Key Capabilities to Assess Include:

Data. Can the orchestration layer access and process all relevant information when constructing its queries to LLMs?

Prompt processing. Are prompts optimally crafted and broken up before being sent to the LLM(s)? Are prompts routed to the optimal LLM for a specific type of request? Can more than one LLM be leveraged? Can customers specify the LLM(s) to leverage based on internal policies?

Quality control. What capabilities ensure accurate responses? Can customers define confidence levels for specific use cases based on sensitivity? Are responses checked for accuracy, for example by prompting LLMs in different ways to check response consistency? Are external sources listed with links so users can verify responses if desired?
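Several of the questions above come down to policies a customer should be able to set per use case. As a thought experiment, such policies could be captured in configuration the orchestration layer enforces; the field names and values below are illustrative assumptions, not any vendor's schema.

```python
# Hypothetical per-use-case policies an orchestration layer might enforce.
USE_CASE_POLICIES = {
    "supplier_discovery": {
        "allowed_models": ["general-purpose-model"],
        "min_confidence": 0.7,          # lower sensitivity, broader research
        "require_source_links": True,   # users can verify external sources
    },
    "contract_analysis": {
        "allowed_models": ["legal-tuned-model"],
        "min_confidence": 0.9,          # high sensitivity, stricter checks
        "require_source_links": True,
    },
}

def policy_for(use_case: str) -> dict:
    """Look up the policy to enforce for a given use case (strict defaults otherwise)."""
    return USE_CASE_POLICIES.get(
        use_case,
        {"allowed_models": [], "min_confidence": 1.0, "require_source_links": True},
    )
```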

Conclusion

The orchestration layer is a crucial element of any enterprise Generative AI architecture, improving quality and expanding the range of possible use cases. As such, it should not be overlooked when selecting technology solutions. By assessing the above criteria, leaders will greatly increase the odds of realizing the promise of Generative AI for their organizations.

For more detail on this and the other key success factors for Generative AI in procurement, download our guide.

Further Reading

Accelerating Adoption of Generative AI in Procurement 

Transforming Procurement with Generative AI: A Practical Approach

Generative AI and the Future of Procurement: A Recap

AI in Procurement: The Ultimate Guide for Procurement Professionals

Alex Saric

Chief Marketing Officer

Alex has spent over 15 years of his career evangelizing Spend Management, shaping its evolution and working closely with hundreds of customers to support their Digital Transformation journeys. As CMO at Ivalua, Alex leads overall marketing strategy and thought leadership programs. Alex also spent 12 years at Ariba, first building and running the spend analytics business as General Manager. He then built and led Ariba’s international marketing team until its successful acquisition by SAP, transitioning to lead business network marketing globally. Earlier, Alex was a founding member of Zeborg (acquired by Emptoris), where he developed vertical Procurement applications. He began his career in the U.S. Cavalry, leading tank and scout platoons through two combat deployments. Alex holds a B.S. in Economics from the U.S. Military Academy at West Point and an international M.B.A. from INSEAD. You can connect with Alex on LinkedIn.
