I did my own experiment with https://chat.openai.com/ recently.

I asked it to tell me about myself, based on my GitHub profile. Its response was detailed, well written, and wrong. It told me that I had developed several tools that I could very plausibly have developed -- but I didn't. In particular, it told me that I had written something called "wgrep", a version of grep for Windows that works with Windows file formats and binary files. That's just the kind of thing I might have done, but it doesn't exist. (GNU grep works well on Windows.)

When I asked it when I had worked at one of my previous employers, it said it had consulted my LinkedIn profile, but it got the dates completely wrong. It said that I had worked on several projects -- all of which are things that interest me, but none of which I actually worked on.

If a human came up with this, I'd say they were lying, but ChatGPT doesn't have the awareness necessary to lie. The closest analogy I can think of is a reckless disregard for the truth.
