- cross-posted to:
- [email protected]
Avram Piltch is the editor in chief of Tom’s Hardware, and he’s written a thoroughly researched article breaking down the promises and failures of LLM AIs.
No. Repeated extrapolation eventually produces everything that ever could be made; constant interpolation would produce the same “average” work over and over.
The difference is infinite vs zero variety.
Fun fact: an open interval is topologically isomorphic (homeomorphic) to the entire number line. In practice they often behave differently, but you started talking about limits (“eventually”), where that distinction will definitely come up.
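To make the fun fact concrete, here is a minimal sketch of such a homeomorphism: the tangent function maps the open interval (−1, 1) onto the whole real line continuously and bijectively, with a continuous inverse (the function names here are just illustrative).

```python
import math

def f(x):
    # Map the open interval (-1, 1) onto the entire real line.
    # Continuous, bijective, with a continuous inverse: a homeomorphism.
    return math.tan(math.pi * x / 2)

def f_inv(y):
    # Inverse map: the real line back into (-1, 1).
    return 2 * math.atan(y) / math.pi

# Round trips recover the original points (up to float precision),
# and points near the endpoints are sent arbitrarily far out.
for x in (-0.999, -0.5, 0.0, 0.5, 0.999):
    assert abs(f_inv(f(x)) - x) < 1e-9
```

So in the topological sense the “small” interval already contains as much room as the whole line, which is why limit arguments about the two can coincide even when they look different in practice.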