Hacker News

Thinking about it, using CR alone in protocols actually makes infinitely more sense, as that would allow the use of LF within records. That would make many use cases much simpler.

Just think about text protocols like HTTP: how much easier something like cookies would be to parse if you had CR as the terminating character, with each record then separated by LF.
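A minimal sketch of the scheme proposed above (hypothetical, not real HTTP or the actual Cookie syntax): each record ends with LF, and each field within a record is terminated by CR.

```python
def parse_records(data: str) -> list[list[str]]:
    """Parse LF-separated records whose fields are CR-terminated."""
    records = []
    for record in data.split("\n"):
        if record:
            # Fields are *terminated* (not separated) by CR, so drop the
            # empty trailing element that split() produces.
            fields = [f for f in record.split("\r") if f]
            records.append(fields)
    return records

wire = "name=a\rpath=/\r\nname=b\rsecure\r\n"
print(parse_records(wire))  # [['name=a', 'path=/'], ['name=b', 'secure']]
```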



ASCII already has designated bytes for unit, group, and record separators. That aside, a big drawback of using unprintable bytes like these is that they're more difficult for humans to read in dumps or to type on a keyboard than a newline (provided newline has a strict definition: CRLF, LF, etc.).
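For reference, the dedicated ASCII delimiter bytes are FS (0x1C, file), GS (0x1D, group), RS (0x1E, record), and US (0x1F, unit). A minimal sketch of encoding a table with RS between records and US between fields:

```python
RS, US = "\x1e", "\x1f"  # ASCII record separator and unit separator

def encode(rows: list[list[str]]) -> str:
    """Join fields with US and records with RS."""
    return RS.join(US.join(fields) for fields in rows)

def decode(blob: str) -> list[list[str]]:
    """Split a blob back into records and fields."""
    return [record.split(US) for record in blob.split(RS)]

rows = [["host", "example.com"], ["port", "80"]]
assert decode(encode(rows)) == rows
```

Note that fields containing a literal RS or US byte would still need escaping, which is exactly the problem newlines have today.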


There is no reason those ASCII characters need to stay unprintable. You could render them with other glyphs, like an interpunct, silcrow, or down caret.


There is in fact a reason those ASCII 'characters' should stay unprintable: the 0x00-0x1f range (except Tab, CR, LF) is explicitly excluded as invalid in a whole bunch of standards, e.g. XML.
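This exclusion is easy to demonstrate: XML 1.0's Char production forbids raw control characters below 0x20 other than Tab, LF, and CR, so a standard parser rejects a document containing, say, a record separator (0x1E).

```python
import xml.etree.ElementTree as ET

# A raw RS byte (0x1E) inside an XML document is not a legal character,
# so parsing fails with a well-formedness error.
try:
    ET.fromstring("<a>\x1e</a>")
    print("parsed")
except ET.ParseError:
    print("rejected")  # this branch runs
```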


That is so backwards incompatible that it is never, ever going to fly.




