This would seem to be an elementary problem, but I have not been able to find a conclusive discussion about it…
Now that characters are always more than one byte in AppleScript 2.x, how does one read binary files byte-wise?
In the past, I would read as text and then use “ASCII number” to get a byte value; surely many scripters did the same.
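That is, the pre-Leopard pattern was roughly this (the path is hypothetical; “ASCII number” treated each byte as a MacRoman character):

    set f to open for access (POSIX file "/Users/me/test.bin") -- hypothetical path
    set theChar to read f for 1 -- one byte, read as a one-character string
    set theByte to ASCII number theChar -- 0 through 255
    close access f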
Using the new “id” of a Unicode character works, of course, but reading a binary file as text yields a problematic character stream. I suppose that after reading as UTF-16 the bytes of each two-byte character could be teased apart, but from my minimal and foggy recollection of Unicode encoding, I believe certain byte sequences would be mangled in decoding and so would not reflect the true contents of the binary file.
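For reference, the 2.x mechanism I mean is this (assuming AppleScript 2.0 or later):

    id of "A" --> 65
    character id 65 --> "A"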
The Leopard AppleScript release notes describe the change in terms of how it affects text files, but seem to ignore the fact that some of us process binary files by taking “ASCII number” of each byte (i.e. each MacRoman single-byte character) read from a file.
Writing binary files raises the same issue, but I expect the solution for reading will work in reverse for writing too (see the packing sketch below).
My current workaround is to read “as unsigned integer” (which packs four bytes into each value) and then tease the bytes apart.
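Concretely, here is a minimal sketch of what I am doing (the path is hypothetical, and it assumes the file length is a multiple of 4 and the big-endian byte order that “read … as integer” has traditionally used):

    set f to open for access (POSIX file "/Users/me/test.bin") -- hypothetical path
    try
        set theInts to read f as unsigned integer -- a list of 4-byte values (a bare number if the file holds only one)
        close access f
    on error m
        close access f
        error m
    end try

    set theBytes to {}
    repeat with anInt in theInts
        set n to contents of anInt
        set end of theBytes to n div 16777216 -- high byte (256^3)
        set end of theBytes to (n mod 16777216) div 65536
        set end of theBytes to (n mod 65536) div 256
        set end of theBytes to n mod 256 -- low byte
    end repeat

And going the other way would presumably be the converse: pack each group of four bytes back into one value and write it (again with hypothetical values and path):

    set theBytes to {77, 97, 99, 33} -- hypothetical sample; length must be a multiple of 4
    set f to open for access (POSIX file "/Users/me/out.bin") with write permission
    try
        set eof f to 0 -- truncate any previous contents
        repeat with i from 1 to (count theBytes) by 4
            set n to (item i of theBytes) * 16777216 + (item (i + 1) of theBytes) * 65536 + (item (i + 2) of theBytes) * 256 + (item (i + 3) of theBytes)
            write n to f as unsigned integer
        end repeat
        close access f
    on error m
        close access f
        error m
    end try

One wrinkle I have not verified: a leading byte of 128 or more pushes the packed value past AppleScript’s signed 32-bit integer range, so it comes back as a real, and I do not know how “write … as unsigned integer” treats that.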
Ideally, there would be an “unsigned byte” class, but I can find no such thing… perhaps I should be looking for something with a different name?
I realize that one can read “as data”, but the result seems impervious to manipulation.
I see that the feature list for Satimage’s Smile includes “read binary” and “write binary”, and no doubt there is an OSAX that can help (which one?), but I really want to build an applet that can stand alone.
I suspect I may be missing something obvious… clue-by-fours appreciated.