Friday, 27 October 2023

Sunday, 22 October 2023

Friday, 20 October 2023

Tuesday, 17 October 2023

New top story on Hacker News: Ask HN: When LLMs make stuff up, call it 'confabulating', not 'hallucinating'

10 by irdc | 8 comments on Hacker News.
In humans, a hallucination is formally defined as a sensory experience without an external stimulus. LLMs have no sensors with which to experience the world (other than their text input) and (probably) don’t even have subjective experience in the way humans do. A more suitable term would be confabulation, which is what humans do when, due to a memory error (e.g. from Korsakoff syndrome), we produce distorted memories. These can sometimes sound very believable to outsiders; the comparison with LLMs making stuff up is rather apt. So please call it confabulating instead of hallucinating when LLMs make stuff up.

Saturday, 14 October 2023

Thursday, 12 October 2023

New top story on Hacker News: Ask HN: How do you tell if something has a keylogger implemented

9 by laserstrahl | 5 comments on Hacker News.
I know a game that people say has a keylogger implemented. Is there any way to tell when the program is not open source? Thanks

Thursday, 5 October 2023

A night inside a strip club near Ukraine’s front lines

from Yahoo News - Latest News & Headlines https://ift.tt/nDNOMIu

Sunday, 1 October 2023