Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Tomasz Korbak | - |
| dc.contributor.author | Samuel R. Bowman | - |
| dc.contributor.author | Ethan Perez | - |
| dc.date.accessioned | 2023-10-10T15:26:46Z | - |
| dc.date.available | 2023-10-10T15:26:46Z | - |
| dc.date.issued | 2023 | - |
| dc.identifier.uri | http://hdl.handle.net/2451/69847 | - |
| dc.title | Pretraining Language Models with Human Preferences | en |
Appears in Collections: Machine Learning for Language Lab
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| 2302.08582.pdf | Preprint | 1.37 MB | Adobe PDF |
| pretraining-with-human-feedback-master.zip | Code | 142.74 kB | Unknown |
Items in FDA are protected by copyright, with all rights reserved, unless otherwise indicated.