阿升

2023/04/11

The "Understanding Large Language Models" Course (2022)

Danqi Chen of Princeton has released her latest course, "Understanding Large Language Models" (2022), a comprehensive treatment of BERT, GPT, T5, and other large models, with slides included. The course covers their technical foundations (the BERT, GPT, and T5 models, mixture-of-experts models, and retrieval-based models), emerging capabilities (knowledge, reasoning, few-shot learning, and in-context learning), fine-tuning and adaptation, system design, and safety and ethics.

The lecture slides are listed below:
[1]lec01-COS 597G:Understanding Large Language Models.pdf
https://url39.ctfile.com/f/2501739-833024300-818536?p=2096 (access password: 2096)

[2]lec02-BERT (encoder-only models).pdf
https://url39.ctfile.com/f/2501739-833024301-7273d1?p=2096 (access password: 2096)

[3]lec03-Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer:The T5 Model.pdf
https://url39.ctfile.com/f/2501739-833024302-d96bad?p=2096 (access password: 2096)

[4]lec04-Language Models are Few-Shot Learners (GPT-3).pdf
https://url39.ctfile.com/f/2501739-833024306-2f8383?p=2096 (access password: 2096)

[5]lec05-Prompting for Few-shot Learning.pdf
https://url39.ctfile.com/f/2501739-833024307-f204c9?p=2096 (access password: 2096)

[6]lec06-Prompt as Parameter-Efficient Fine-Tuning.pdf
https://url39.ctfile.com/f/2501739-833024308-61567e?p=2096 (access password: 2096)

[7]lec07-Towards Understanding In-context Learning.pdf
https://url39.ctfile.com/f/2501739-833024315-4e6ab1?p=2096 (access password: 2096)

[8]lec08-Calibration of prompting LLMs.pdf
https://url39.ctfile.com/f/2501739-833024319-bfd849?p=2096 (access password: 2096)

[9]lec09-Chain of Thought Prompting for Large Language Model Reasoning.pdf
https://url39.ctfile.com/f/2501739-833024323-a33b08?p=2096 (access password: 2096)

[10]lec10-Language Models and Knowledge.pdf
https://url39.ctfile.com/f/2501739-833024324-d7fdff?p=2096 (access password: 2096)

[11]lec11-Minority Voices Filtered Out of Google Natural Language Processing Models.pdf
https://url39.ctfile.com/f/2501739-833024325-d8b071?p=2096 (access password: 2096)

[12]On the Opportunities and Risks of Foundation Models.pdf
https://url39.ctfile.com/f/2501739-833024335-6886f7?p=2096 (access password: 2096)

References:
[1]COS 597G (Fall 2022): Understanding Large Language Models:https://www.cs.princeton.edu/courses/archive/fall22/cos597G/
[2]On the Opportunities and Risks of Foundation Models:https://arxiv.org/pdf/2108.07258.pdf

Category:

Artificial Intelligence

Tags:

Natural Language Processing

About the Author

阿升

吾爱DotNet (WeChat official account)