Sat, 11 Dec 2004 16:57:50 -0800
> By convention, a Char.char is an "interpreted" 8-bit value. I would
> argue that what one really wants is a converter from a Word8.word
> reader to a WideChar.char reader. A Word8.word is an "uninterpreted"
> 8-bit value.
> When one wants to recover a 1.2 style decoding converter from a
> Char.char reader, it should first be sent through the Byte
> structure, which explicitly relinquishes the bit interpretation.
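A minimal sketch of that last step, in terms of the Basis StringCvt.reader type (`'b -> ('a * 'b) option`): Byte.charToByte reinterprets a Char.char as an uninterpreted Word8.word, so lifting it over a reader "relinquishes the bit interpretation" as described above. The function name `toByteReader` is mine, not from the Basis.

```sml
(* Sketch: turn a Char.char reader into a Word8.word reader by
   discarding the character interpretation of each 8-bit value. *)
fun toByteReader (rdr : (char, 'a) StringCvt.reader)
                 : (Word8.word, 'a) StringCvt.reader =
  fn strm =>
    case rdr strm of
      NONE => NONE
    | SOME (c, strm') => SOME (Byte.charToByte c, strm')
```

The resulting Word8.word reader can then be fed to a decoding converter that produces a WideChar.char reader.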
I like this way of thinking about it.
> The problem with both NONE and exceptions for 1.2 style converters is that
> the invalidity of the input stream is not discovered until sufficient
> input is read; i.e., not at the point where the conversion is applied.
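To make the point concrete, here is a hypothetical UTF-8-style decoder over a Word8.word reader (the name `decode` and the two-byte-only handling are illustrative, not from any proposal). Applying `decode` to a reader always succeeds; an ill-formed byte sequence is only discovered later, when the resulting WideChar.char reader is pulled past the bad bytes, which is when a NONE (or an exception, in the other design) would have to be produced:

```sml
(* Sketch: decode ASCII and two-byte UTF-8 sequences lazily.
   Validity of the input cannot be judged when `decode` is applied;
   it surfaces only when the returned reader consumes the stream. *)
fun decode (getByte : (Word8.word, 'a) StringCvt.reader)
           : (WideChar.char, 'a) StringCvt.reader =
  fn strm =>
    case getByte strm of
      NONE => NONE
    | SOME (b, strm') =>
        if b < 0wx80 then
          SOME (WideChar.chr (Word8.toInt b), strm')
        else
          (* Only now, after reading further input, can we discover
             that the stream is ill-formed; the failure happens at
             this later read, not where the conversion was applied. *)
          case getByte strm' of
            NONE => NONE  (* truncated multi-byte sequence *)
          | SOME (b', strm'') =>
              if Word8.andb (b', 0wxC0) = 0wx80 then
                SOME (WideChar.chr
                        (Word8.toInt (Word8.andb (b, 0wx1F)) * 64 +
                         Word8.toInt (Word8.andb (b', 0wx3F))),
                      strm'')
              else NONE   (* invalid continuation byte *)
```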
Why does this matter, since these are streams? I.e., who can tell
that we've looked a few characters ahead?
Can someone explain the difference between LargeChar and WideChar?