So the LLM synthesizes a result from various sources (that may be made up or misquoted) and then you "double-check" the sources?
Sounds like the LLM is using you as research assistant.
@codinghorror You perhaps do. But 99.99% of people using these tools don’t.
@codinghorror You can’t hack your way into magically changing how they work—they will always be inaccurate and make things up. ¯\_(ツ)_/¯
@codinghorror Just like with other generative outputs, in summaries LLMs often omit important things and also often confabulate things that aren’t there.
LLMs don’t (and can’t) understand what they’re summarizing. They’re purely statistical and just happen to sometimes make passable summaries but often they don’t (this is also true for e.g. when they’re used for translations).
Bragging about how many Substack newsletters you’re subscribed to is definitely a choice
"How can you get more accurate answers from an LLM?” is like asking "how can you get more love from a prostitute?"
That's just not the service provided.
who the fuck needs all these summaries
Just fwiw the best way to exclude your open source code from being ingested for "AI" training is to load it up to the motherfucking brim with fucking 4-letter word comments #fuck /* fuck */
Anyone from Microsoft following me?
I’m going to leave GitHub (I’ve been a user since the beta days) and will incessantly ask everyone I know to leave it as well if you don’t roll back the “can’t opt out” force-feeding of “AI” nonsense.
You should also have no integration or collaboration with xAI at all, ever.
Make your management have some sense, this is hostile and disgusting and wrong.
“This discourse has two sides; one is objectively true and the other one verifiably false. Therefore we must listen to both sides equally.”
—The Centrist Credo
IMO the reason people yearn for tools that generate code is that programming is broken—everything now is a giant layer cake of huge, complex and opaque frameworks designed by and for large teams in giant tech companies.
The same tech companies that flooded programming with overly complex tools, endless toolchains, new programming languages du jour every few years, mandatory backwards-compatibility-breaking updates and forced design overhauls are now selling you “AI” to generate code for the mess they made.
The most widespread and arguably most damaging logical fallacy is:
"Many people use [thing], therefore it must be good and useful"
a thing of beauty
The more I think about it the clearer it is that the introduction of Google Chrome in 2008 was when the Web started to turn sour.