Hacker News

I noticed that all text looked garbled when I had lucid dreams. When diffusion models started to gain attention, I made the connection that text in generated images looked garbled in the same way.

Maybe all of those are clues that parts of the human subconscious mind operate on principles pretty close to those behind diffusion models.
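For anyone unfamiliar, the "principles behind diffusion models" here are roughly iterative denoising: start from noise and repeatedly refine it toward a plausible image. A toy sketch (my own illustration, not real model code; the "denoiser" below just nudges toward a fixed target instead of using a trained network):

```python
import numpy as np

# Stand-in for fine detail (like legible text) in an image.
target = np.array([1.0, -1.0, 1.0, -1.0])

rng = np.random.default_rng(0)
x = rng.normal(size=4)  # start from pure noise

# Iterative refinement: each step removes a bit of the noise.
# A real diffusion model would predict the noise with a neural
# net; half-finished samples still look garbled, which is one
# intuition for the unreadable text in generated images.
for t in range(50):
    x = x + 0.1 * (target - x)

print(np.round(x, 2))
```

The point of the toy is just that structure emerges gradually over many steps, so a partially denoised sample has the right overall shape but scrambled fine detail.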



I also lucid dream occasionally. Very rarely are things detailed; most often the colors and details are as bleak and blurry, and keep changing, as in these videos. If I walk down a street and take a turn (or not), it's almost guaranteed I can't go back to where I came from. I usually appreciate it when I can retrace the same path.


I've watched entire movies while lucid dreaming. I've listened to entire concerts play out, and eaten entire meals with all five senses.

One of the more annoying parts of growing older was that I stopped lucid dreaming.


I don't think lucid dreaming is a requirement for this. Whenever I dream, my environment morphs into another one, scene by scene. Things I try to get details from, like the content of a text, refuse to show clearly enough to extract any meaningful information, no matter what I try.


"I try" sounds like lucid dreaming, though. In my dreams there is no try; things just happen (including my actions). I think I sometimes get meaningful information, but it's selective.


There are also the AI-generated images that can't get the fingers right. Have you ever tried looking at your hands while lucid dreaming and counting your fingers? There are some really interesting parallels between dreams and diffusion models.


Of course, due to the very nature of dreams, your awareness of diffusion models and their output colors how you perceive even past dreams.

Our brains love retroactively altering fuzzy memories.


On the other hand, psychedelics give you perceptions similar even to early DeepDream-style generative images.

On LSD, I was swimming in my friend’s pool (for six hours…) amazed at all the patterns on his pool tiles underwater. I couldn’t get enough. Every tile had a different sophisticated pattern.

The next day I went back to his place (sober) and commented on how cool his pool tiles were. He had no idea what I was talking about.

I walked out to the pool and, sure enough, it's just a grid of small featureless white tiles. Upon closer inspection they have a slight grain to them. I guess my brain was connecting the dots on the grain and creating patterns.

It was quite a trip to be so wrong about reality.

Not really related to your claim I guess but I haven’t thought of this story in 10 years and don’t want to delete it.


This may be a joke, but counting your fingers to lucid dream has been a thing for a lot longer than diffusion models.

That being said, your reality will influence your dreams if you're exposed to some things enough. I used to play Minecraft on a really bad PC back in the day, and in my lucid dreams I'd encounter the same slow chunk loading I saw in the game.


Playing Population One in VR did this to me. Whenever I hopped into a new game, I'd ask the other participants if they'd had particularly vivid dreams since getting VR, and more than half of folks said they had.



