
Postdoctoral Seminar Announcement


Enhancing Statistical Modeling Efficiency with the Power of Large Language Models

  • 2025-07-23 (Wed.), 14:00
  • Auditorium, B1F, Institute of Statistical Science; the tea reception will be held at 13:40.
  • Online live streaming through Microsoft Teams will be available.
  • Mr. Chih-Yu Chang (張致語, Research Assistant)
  • Institute of Statistical Science, Academia Sinica

Abstract

Large Language Models (LLMs) such as ChatGPT have demonstrated strong capabilities in low-data regimes, making them attractive tools for guiding statistical models in sequential decision-making. However, their lack of calibrated uncertainty and their opaque internal mechanisms limit their reliability when used in isolation. In this work, we explore how LLMs can be strategically integrated with statistical models to improve sample efficiency and decision quality. Building on our recent framework, LLINBO, we demonstrate that LLMs can facilitate early exploration by leveraging contextual reasoning, while statistical surrogates, such as Gaussian Processes, ensure principled exploitation and convergence. We present this hybrid approach with applications in black-box optimization, including a real-world 3D printing case study.
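As a rough illustration of the hybrid scheme sketched in the abstract, the snippet below (a minimal sketch assuming a Python environment with NumPy, SciPy, and scikit-learn) uses a few LLM-style proposals for early exploration and then switches to a Gaussian-Process surrogate with an expected-improvement rule for exploitation. The functions `objective` and `llm_suggest` are hypothetical placeholders for this sketch, not components of the LLINBO framework or of the talk itself.

```python
# Illustrative hybrid loop: LLM-style proposals for early exploration,
# then a Gaussian-Process surrogate with expected improvement.
# `objective` and `llm_suggest` are hypothetical stand-ins.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
bounds = (0.0, 1.0)

def objective(x):
    # Stand-in for an expensive black-box response (e.g., a 3D-printing outcome).
    return np.sin(3 * np.pi * x) + 0.5 * (x - 0.6) ** 2

def llm_suggest(history):
    # Placeholder for an LLM proposal reasoning over the evaluation history;
    # here it simply samples a point uniformly within the bounds.
    return float(rng.uniform(*bounds))

def expected_improvement(gp, candidates, y_best):
    # Expected improvement for minimization under the GP posterior.
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

X, y = [], []
n_llm, n_total = 3, 12  # early LLM-guided steps, then surrogate-guided steps
for t in range(n_total):
    if t < n_llm:
        x_next = llm_suggest(list(zip(X, y)))
    else:
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(np.array(X).reshape(-1, 1), np.array(y))
        cand = np.linspace(*bounds, 200).reshape(-1, 1)
        ei = expected_improvement(gp, cand, min(y))
        x_next = float(cand[int(np.argmax(ei)), 0])
    X.append(x_next)
    y.append(objective(x_next))

best = int(np.argmin(y))
print(f"best x = {X[best]:.3f}, best value = {y[best]:.3f}")
```

In practice, the placeholder proposal step would be replaced by an actual LLM queried with the evaluation history and problem context, with the surrogate taking over as data accumulate.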
Please click the link to join the online live stream.

Last updated: 2025-07-17 09:37