Things to know before attending Microsoft AI Day, Thailand (1 May 2024)

Thongchan Thananate
2 min read · May 1, 2024

Be prepared.

Microsoft Fabric

Microsoft Fabric is an all-in-one analytics solution for enterprises. It covers everything from data movement to data science, real-time analytics, and business intelligence.

Components:

  • Data Lake: A storage repository that holds vast amounts of raw data in its native format.
  • Data Engineering: Provides a Spark platform for large-scale data transformation (see the sketch after this list).
  • Data Integration: Facilitates seamless data movement and integration.
  • Data Science: Supports building and deploying machine learning models.
  • Real-Time Analytics: Enables real-time insights from streaming data.
  • Power BI Integration: Integrates with Power BI for interactive visualizations.
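
The Data Engineering workload is where most hands-on work happens, so here is a minimal sketch of the kind of PySpark transformation you might run in a Fabric notebook. The lakehouse table names and columns are hypothetical; Fabric notebooks already provide a Spark session, and `getOrCreate()` simply reuses it.

```python
# Rough sketch of a Spark transformation in a Fabric-style notebook.
# The table names ("sales_raw", "sales_by_region") and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.table("sales_raw")  # hypothetical raw lakehouse table
summary = (
    raw.dropna(subset=["order_id"])                      # drop incomplete orders
       .withColumn("order_date", F.to_date("order_date"))
       .groupBy("region")
       .agg(F.sum("amount").alias("total_amount"))       # total sales per region
)
summary.write.mode("overwrite").saveAsTable("sales_by_region")
```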

Advantages:

  • Simplifies analytics by offering a single integrated environment.
  • Centralized administration and governance.
  • A unified data lake that works with your preferred analytics tools.

Vector Search

Vector search uses mathematical vectors to represent and efficiently search through complex, unstructured data. It focuses on discovering linked concepts rather than just keywords.

Working Principle:

  • Represents data points as vectors in a high-dimensional space.
  • Calculates similarity between the query vector and the stored data vectors.
  • Uses measures such as cosine similarity or Euclidean distance (see the sketch after this list).
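
A minimal sketch of that working principle, using toy 4-dimensional vectors and cosine similarity. Real systems use embeddings with hundreds of dimensions produced by an embedding model; the vectors and document names here are made up.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy document embeddings (hypothetical values).
documents = {
    "doc_a": np.array([0.9, 0.10, 0.00, 0.30]),
    "doc_b": np.array([0.1, 0.80, 0.40, 0.00]),
    "doc_c": np.array([0.7, 0.20, 0.10, 0.40]),
}
query = np.array([0.8, 0.15, 0.05, 0.35])

# Rank documents by similarity to the query vector.
ranked = sorted(documents.items(),
                key=lambda item: cosine_similarity(query, item[1]),
                reverse=True)
for name, vector in ranked:
    print(name, round(cosine_similarity(query, vector), 3))
```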

Benefits:

  • Captures semantic relationships and contextual meaning rather than exact keyword matches.
  • Efficiently finds similar items in large datasets.

RAG (Retrieval-Augmented Generation)

RAG optimizes large language models (LLMs) by referencing authoritative knowledge bases outside their training data. It supplements prompts with external information.

Use Cases:

  • Improves factual reliability without extensive model fine-tuning.
  • Enhances LLM output for specific domains.

Components:

  • Prompt Engineering: Skillful creation of text prompts.
  • Fine-Tuning: Adapting pre-trained LLMs for specific use cases.
  • RAG (Knowledge Augmentation): Incorporating external sources into prompts (see the sketch after this list).
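
A minimal end-to-end sketch of knowledge augmentation: embed the documents and the question, retrieve the most similar documents, and splice them into the prompt. The `embed` function below is a toy placeholder standing in for a real embedding model, the knowledge base is made up, and the actual LLM call is omitted.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy placeholder embedding so the sketch runs end to end.
    In practice this would call a real embedding model."""
    vec = np.zeros(8)
    for i, ch in enumerate(text.lower()):
        vec[i % 8] += ord(ch)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

knowledge_base = [
    "Microsoft Fabric unifies data engineering, data science and BI.",
    "Vector search ranks items by embedding similarity.",
    "RAG adds retrieved documents to the prompt before generation.",
]
doc_vectors = [embed(doc) for doc in knowledge_base]

question = "How does RAG improve an LLM's answers?"
query_vector = embed(question)

# Retrieve the two documents most similar to the question.
scores = [cosine(query_vector, vec) for vec in doc_vectors]
top_indices = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:2]
retrieved = [knowledge_base[i] for i in top_indices]

# Augment the prompt with the retrieved context; the LLM call itself
# would go through your provider's API and is omitted here.
prompt = (
    "Answer using only the context below.\n\n"
    "Context:\n- " + "\n- ".join(retrieved) +
    f"\n\nQuestion: {question}\nAnswer:"
)
print(prompt)
```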

LLMOps (Large Language Model Operations)

LLMOps manages the lifecycle of LLMs and LLM-powered applications.

Key Concepts:

  • Collaboration Environment: Provides tools for AI developers and IT administrators.
  • Prompt Engineering, Fine-Tuning, and RAG are part of LLMOps.
  • Central Setup and Management: Security configuration, cost control, and governance.

Azure AI Studio

Azure AI Studio is a generative AI development hub. It provides tools for building, deploying, and managing AI applications.

Capabilities:

  • Collaboration Environment: Allows teams to build and manage AI applications.
  • Prompt Engineering, Fine-Tuning, and RAG can be used within Azure AI Studio.
  • Central Management: Security settings, project organization, and governance controls.

Prompt Flow

Prompt Flow involves skillfully creating text prompts to guide LLMs toward producing desired outputs. It’s crucial for achieving accurate and contextually relevant responses.

  • Techniques: Few-shot and chain-of-thought (CoT) prompting (see the sketch after this list).
  • Use Cases: Interacting with LLMs via API calls or user-friendly platforms.
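
A small sketch of a few-shot, chain-of-thought prompt. The prompt text and example question are illustrative, and `call_llm` is a hypothetical stand-in for whichever chat-completion API you actually use.

```python
# Few-shot + chain-of-thought (CoT) prompting sketch.
FEW_SHOT_COT_PROMPT = """You are a careful assistant. Think step by step.

Q: A meetup starts at 9:00 and each talk lasts 45 minutes with a
15-minute break after it. When does the second talk end?
A: First talk: 9:00-9:45. Break: 9:45-10:00. Second talk: 10:00-10:45.
So the second talk ends at 10:45.

Q: {question}
A:"""

def call_llm(prompt: str) -> str:
    # Hypothetical: replace with a real call to your model provider
    # (for example, an Azure OpenAI chat-completions deployment).
    raise NotImplementedError

question = ("A workshop has three 50-minute sessions with 10-minute breaks "
            "between them, starting at 13:00. When does it finish?")
print(FEW_SHOT_COT_PROMPT.format(question=question))
```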
