![](https://sopuli.xyz/pictrs/image/156e13a6-4667-4631-920a-ee5f4feb737b.webp)
![](https://fry.gs/pictrs/image/c6832070-8625-4688-b9e5-5d519541e092.png)
“former”
Do people outside of tech care?
True. How could it be a free market if corporations are not allowed to form a cartel and agree on a price for a product that is literally vital for many people?
Does the GDPR apply to Stack Overflow, since my data there probably does not identify me as a person?
Some other countries have the decency to add something like “of the USA” to the name. But this being the USA, of course he is just “the” president.
Imagine if Twitter some day opens its gates and starts federating with everyone. Then the Musk takeover would have actually improved the world. It hurt his purse a lot too, but that is also an improvement, I believe.
Ah, I see. But it would be great; the web interface has it as well.
Yeah, this is really scary. Maybe the golden thing in the top right is supposed to be the marking? But that is clearly not “clearly marked”. Should be easy to challenge in court.
This may be on purpose, though. They communicate to their future shareholders that they will do everything, legal and illegal, to make profit and pay dividends.
Brain melting
Unfortunate.
How old is he again? Will he die soon?
Good. Now make it be uninstalled by default.
Ah, the snake oil turned out to be poisonous.
Also, reading the article, the immediate practical implications of this improvement are almost nonexistent. This is a theoretical breakthrough that may or may not lead to further theoretical breakthroughs, which in turn may or may not be more practically relevant.
Certainly important research, but nothing that AI people (or any other scientists) must celebrate. It feeds the AI hype, though.
Maybe this fits better into a car community.
If you want fast file sync between computers, use Syncthing.
So they are poisoning GitHub copilot?
When I first saw this, I thought it was by [email protected]
I didn’t read more than the abstract. It sounds like they are arguing that hallucinations are inevitable because the LLM cannot know everything. But wouldn’t it be enough for the LLM to know what it knows, and therefore know what it does not know?
This feels unreal.