Forecasting with LLMs and GenAI in Microsoft Fabric
Today, we're tackling a topic you probably haven't seen before: how to forecast and make predictions in Microsoft Fabric using Generative AI and Large Language Models (LLMs).
While most videos focus on LLMs as assistants or copilots, I find that use case uninspiring and soon to be commoditized. Instead, I'm focused on logic intelligence: applying GenAI to core application, business, and data-processing logic.
In this video, we'll cover:
Setting up the environment and data for predictions.
Using LangChain and Azure OpenAI for data analysis.
Predicting restaurant tips based on customer and waiter demographics.
Comparing LLM-based predictions with traditional, math-based methods (see the sketches after this list).
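
To make the LangChain and Azure OpenAI step concrete, here is a minimal sketch of what a tip-prediction prompt could look like in a Fabric notebook. The deployment name, API version, prompt wording, and example rows are illustrative assumptions, not the exact code or data from the video.

```python
# Sketch only: ask an Azure OpenAI chat model (via LangChain) to predict a tip.
# Assumes the langchain-openai package and that AZURE_OPENAI_ENDPOINT /
# AZURE_OPENAI_API_KEY are set in the environment.
from langchain_openai import AzureChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

llm = AzureChatOpenAI(
    azure_deployment="gpt-4o",   # hypothetical deployment name
    api_version="2024-02-01",    # hypothetical API version
    temperature=0,               # deterministic output for predictions
)

prompt = ChatPromptTemplate.from_template(
    "You are a data analyst. Based on these historical restaurant visits:\n"
    "{history}\n\n"
    "Predict the tip (in dollars) for a new visit with these attributes:\n"
    "{new_visit}\n\n"
    "Answer with a single number only."
)

chain = prompt | llm | StrOutputParser()

# Illustrative rows in the style of the classic restaurant "tips" dataset.
history = (
    "total_bill=16.99, party_size=2, day=Sun, time=Dinner -> tip=1.01\n"
    "total_bill=23.68, party_size=2, day=Sun, time=Dinner -> tip=3.31\n"
    "total_bill=24.59, party_size=4, day=Sun, time=Dinner -> tip=3.61"
)
new_visit = "total_bill=21.50, party_size=3, day=Sun, time=Dinner"

predicted_tip = chain.invoke({"history": history, "new_visit": new_visit})
print(predicted_tip)  # model returns a string; parse it to a float as needed
```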
We'll start with the setup, move to prompt creation, and show you how to leverage LLMs for accurate predictions.
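
For the traditional, math-based side of the comparison, even an ordinary least-squares baseline gives the LLM something to be measured against. This sketch assumes the classic seaborn "tips" dataset and a simple two-feature regression; the baseline used in the video may differ.

```python
# Sketch only: a traditional regression baseline for the same tip-prediction task.
import pandas as pd
import seaborn as sns
from sklearn.linear_model import LinearRegression

tips = sns.load_dataset("tips")        # columns: total_bill, tip, size, day, time, ...
X = tips[["total_bill", "size"]]       # numeric features only, for simplicity
y = tips["tip"]

model = LinearRegression().fit(X, y)

# Predict the tip for the same hypothetical new visit used in the LLM sketch.
new_visit = pd.DataFrame([[21.50, 3]], columns=["total_bill", "size"])
print(model.predict(new_visit))
```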
If you're as tired as I am of hearing about assistants and copilots, and you're ready to see how GenAI can actually drive business outcomes, you're in the right place.
Remember to like, subscribe, and leave a comment if you have questions.