- From: Dave Raggett <dsr@w3.org>
- Date: Mon, 11 Mar 2024 18:08:03 +0000
- To: Timothy Holborn <timothy.holborn@gmail.com>
- Cc: public-cogai <public-cogai@w3.org>
> On 11 Mar 2024, at 16:56, Timothy Holborn <timothy.holborn@gmail.com> wrote:
>
> https://g.co/gemini/share/f5e773916b42

A post on LLM model size: LLMs these days have hundreds of billions of parameters. There are techniques for reducing the computational cost of running the models (see the rough sizing sketch after this message). Pre-training involves truly vast amounts of data; fine-tuning for applications is far less expensive, typically involving tens of thousands of examples.

LLMs have little in common with human cognition, despite being trained to mimic our language and art. The research challenge is how to close that gap, enabling smaller systems that can be widely deployed. I am not sure we're quite ready for the AGI toaster* in Red Dwarf, but there will be lots of valuable applications, just as there are lots of humans but few geniuses.

Dave Raggett <dsr@w3.org>

* See: https://www.quotes.net/show-quote/67028 and https://www.youtube.com/watch?v=LRq_SAuQDec
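
As a rough back-of-the-envelope sketch of why techniques such as reduced-precision (quantised) weights matter for deployment cost: the 175-billion-parameter figure and the bytes-per-parameter values below are illustrative assumptions, not numbers taken from the message above.

    # Approximate memory footprint of LLM weights at various precisions.
    # PARAMS is an assumed example size, not a figure from the message.
    PARAMS = 175e9

    BYTES_PER_PARAM = {
        "float32": 4.0,   # full precision
        "float16": 2.0,   # common inference precision
        "int8":    1.0,   # 8-bit quantisation
        "int4":    0.5,   # 4-bit quantisation
    }

    for fmt, nbytes in BYTES_PER_PARAM.items():
        gib = PARAMS * nbytes / 2**30
        print(f"{fmt:>8}: ~{gib:,.0f} GiB of weights")

Under these assumptions, 4-bit weights need roughly an eighth of the memory of full-precision weights (about 80 GiB versus about 650 GiB), which is one reason such techniques make wider deployment of large models more plausible.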
Received on Monday, 11 March 2024 18:08:16 UTC