Trouble with binary files?

I’m trying to write a program that will read a binary
file into a buffer, do stuff with it, and then write
the result back into another file. However, I’m
running into a problem.

I haven’t been able to find a way that will read in
more than the first 160 bytes (of a 910 byte file).
I’ve tried using each_byte and looping with getc, as
well as storing the results in a string or an array.

I’ve never had this sort of problem working with text
files. Is there something else I have to do to be able
to work with binary data?

-Morgan.


agemoagemo@yahoo.com wrote:

I haven’t been able to find a way that will read in
more than the first 160 bytes (of a 910 byte file).

I’ve never had this sort of problem working with text
files. Is there something else I have to do to be able
to work with binary data?

Did you try IO#binmode ? Unix guys (like me) typically miss that
when working in a Windows environment.
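
For example, something along these lines should do it (untested sketch;
the file names are just placeholders):

  # Switch the streams into binary mode before touching the data.
  # binmode only matters on DOS/Windows; on Unix it is a no-op.
  File.open("input.bin", "r") do |f|
    f.binmode                      # stop ^Z (0x1A) being treated as EOF
    data = f.read                  # now the whole file comes back
    File.open("output.bin", "w") do |out|
      out.binmode
      out.write(data)              # bytes go out untranslated, too
    end
  end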

Cheers,

-- Heinz

Are you on a Windows box and are you opening the file in binary mode? I had this same problem recently with the same results. The problem was a Ctrl-Z some way into the file that was being interpreted as EOF. The fix was to add a ‘b’ to the open flags:

File.open("foo.db", "rb") do ...

and then all was right with the world.
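
In case it's useful, the whole read-modify-write round trip looks
roughly like this (sketch only; file names are placeholders):

  data = File.open("foo.db", "rb") { |f| f.read }    # every byte, ^Z included
  # ... do stuff with data ...
  File.open("out.db", "wb") { |f| f.write(data) }    # written back untranslated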

···

On 9/19/2003 3:00 PM, agemoagemo@yahoo.com wrote:

I’ve never had this sort of problem working with text
files. Is there something else I have to do to be able
to work with binary data?


Dean saor, dean saor an spiorad. Is seinn d’orain beo.

In article 20030919190050.36969.qmail@web14006.mail.yahoo.com,

I’ve never had this sort of problem working with text
files. Is there something else I have to do to be able
to work with binary data?

If you’re on a Windows platform this can happen if you have a ^Z in your
file.

If this is your problem then binmode may help:

[mike@ratdog mike]$ ri binmode
This is a test ‘ri’. Please report errors and omissions
on http://www.rubygarden.org/ruby?RIOnePointEight

------------------------------------------------------------- IO#binmode
ios.binmode → ios

  Puts ios into binary mode. This is useful only in
  MS-DOS/Windows environments. Once a stream is in binary mode, it cannot
  be reset to nonbinary mode.
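
A quick way to see the effect on a Windows box (untested sketch, using a
throwaway test file with a ^Z in the middle):

  # 160 bytes, then a ^Z (0x1A), then 100 more bytes.
  File.open("test.bin", "wb") { |f| f.write("A" * 160 + "\x1A" + "B" * 100) }

  text_len = File.open("test.bin", "r")  { |f| f.read.length }  # stops at the ^Z on Windows
  bin_len  = File.open("test.bin", "rb") { |f| f.read.length }  # all 261 bytes, everywhere

  puts "text mode: #{text_len}, binary mode: #{bin_len}"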

Hope this helps,

Mike


mike@stok.co.uk | The “`Stok’ disclaimers” apply.
http://www.stok.co.uk/~mike/ | GPG PGP Key 1024D/059913DA
mike@exegenix.com | Fingerprint 0570 71CD 6790 7C28 3D60
http://www.exegenix.com/ | 75D2 9EC4 C1C0 0599 13DA

— Heinz Werntges

files. Is there something else I have to do to be able
to work with binary data?

Did you try IO#binmode ? Unix guys (like me) typically miss that
when working in a Windows environment.

Nope, hadn’t seen that. And that fixes it.

I don’t particularly understand why though. Those
first 160 bytes were being read properly, what stops
it from getting the rest? Why 160? Is it going to
suddenly require something else to be done when I go
over to the 1.73mb file that the program is designed
to process? (I wouldn’t think so, but then I didn’t
expect this 160 thing either…)

Anyway, thanks for the help.

-Morgan.


*checks the file* The next byte was hex 1A, which… is
Ctrl-Z. Well, that answers the other questions…
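
For reference, a minimal way to check that, assuming the text-mode read
really did stop after 160 bytes (the file name is a placeholder):

  data = File.open("foo.db", "rb") { |f| f.read }   # binary mode, nothing skipped
  byte = data[160, 1].unpack("C").first             # the byte right after the 160 that were read
  printf("byte at offset 160: 0x%02X\n", byte)      # prints 0x1A here, i.e. Ctrl-Z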

-Morgan.

···

— Joey Gibson joey@joeygibson.com wrote:

On 9/19/2003 3:00 PM, agemoagemo@yahoo.com wrote:

I’ve never had this sort of problem working with text
files. Is there something else I have to do to be able
to work with binary data?

Are you on a Windows box and are you opening the
file in binary mode? I had this same problem
recently with the same results. The problem was a
Ctrl-Z some way into the file that was being
interpreted as EOF. The fix was to add a ‘b’ to the
open flags:



agemoagemo@yahoo.com graced us by uttering:

files. Is there something else I have to do to be able to
work with binary data?

Did you try IO#binmode ? Unix guys (like me) typically miss
that when working in a Windows environment.

Nope, hadn’t seen that. And that fixes it.

I don’t particularly understand why though. Those first 160
bytes were being read properly, what stops it from getting the
rest? Why 160? Is it going to suddenly require something else
to be done when I go over to the 1.73mb file that the program
is designed to process? (I wouldn’t think so, but then I
didn’t expect this 160 thing either…)

You misunderstand the 160-byte barrier as being related to Ruby.
It’s a Win32/DOS issue.

Way back in PC-/MS-DOS days, it was decided that the
non-printable ASCII-26 (^Z) character would mark the end of a
textmode file. The difference between textmode and binmode of a
DOS file is important, though it need not ever have become an
issue.

The requirement for ^Z to terminate a textfile has since been
changed. However, (for backward compatibility?) when the ^Z is
encountered in a textmode file, DOS (and subsequently, Windows)
still set the EOF flag and stop reading.

This becomes more of an issue when the default file open mode for
DOS/Win is in text mode, creating the need for a completely new
function call almost exclusively for DOS/Win platforms; in this
case, binmode(), which explicitly sets the file read mode to
binary, preventing the OS from stopping at the first ^Z (and from
changing line endings, blah, blah…).

As someone else mentioned above, this isn’t an issue on Unix or
many other systems, since EOF on these OSes isn’t determined by
file contents. I’m not sure if it’s an issue for Macs, as they also
historically use different line endings. This also may have
changed with OS X; anyone know?

The moral of the story is:

Always call fh.binmode() before reading
any non-text file on non-Unix platforms.
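
If you want to bake that into code, a tiny helper along these lines
(hypothetical name, sketch only) keeps it in one place:

  # Open a file and force binary mode, so DOS/Windows never stops at a
  # stray ^Z or rewrites line endings. Harmless on Unix.
  def open_binary(path, mode = "r")
    File.open(path, mode) do |f|
      f.binmode
      yield f
    end
  end

  open_binary("foo.db") { |f| p f.read.length }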

HTH,
Tim Hammerquist


scanf() is evil.

Tim Hammerquist wrote:

You misunderstand the 160-byte barrier as being related to Ruby.
It’s a Win32/DOS issue.

The moral of the story is:

Always call fh.binmode() before reading
any non-text file on non-Unix platforms.

True, but let’s be fair.

MSDOS stole many things from Unix, such as the notion of a
hierarchical directory structure and the use of < > | at the
shell level. (Many things were incompletely stolen, unfortunately.)

The binmode/textmode distinction came from Unix. At that time
Unix had an EOF character of control-D (which explains the ^D we
still type occasionally at the terminal).

So historically Unix’s behavior with respect to ^D was the same as
DOS’s with respect to ^Z. But Unix/Linux moved beyond that, and
DOS/Windows never did.

Hal

Tim Hammerquist tim@vegeta.ath.cx wrote in message news:slrnbmn365.dmt.tim@vegeta.ath.cx

However, (for backward compatibility?) when the ^Z is
encountered in a textmode file,

…or, better yet, when any text character whose encoding happens to
include an 0x1a is encountered! My, that was annoying.

As someone else mentioned above, this isn’t an issue on Unix or
many other systems,

Or indeed on Windows, provided you avoid ruby :I

The moral of the story is:

Always call fh.binmode() before reading
any non-text file on non-Unix platforms.

You have to call it before reading any file, unless you just know
that only ASCII was used, for the reason above. I think ruby is the
only software I’ve ever used that has this issue. I suppose ruby must
check for the 0x1a before allowing for the encoding system.

Hal Fulton wrote:

The binmode/textmode distinction came from Unix. At that time
Unix had an EOF character of control-D (which explains the ^D we
still type occasionally at the terminal).

No. Unix never distinguished between text and binary files. Unix did
(and does) interpret ASCII EOT (ctrl-d) as an end-of-input indicator for
terminal devices, but it never used any in-band character to mark the
end of a file. The EOT never got past the terminal driver, and was never
delivered to an application.

Steve

As someone else mentioned above, this isn’t an issue on Unix or
many other systems,

Or indeed on Windows, provided you avoid ruby :I

Or C, or perl, or a host of other languages. It’s not ruby specific.
(Has perl “magiced” around this? I haven’t used it much since 4.0x)


Benjamin Peterson graced us by uttering:

However, (for backward compatibility?) when the ^Z is
encountered in a textmode file,

…or, better yet, when any text character whose encoding
happens to include an 0x1a is encountered! My, that was
annoying.

Multi-byte encodings weren’t recognized as “text files” by
DOS/Win until fairly recently, if currently (I don’t use them),
so I probably wouldn’t have tried to get away with it.

As someone else mentioned above, this isn’t an issue on Unix
or many other systems,

Or indeed on Windows, provided you avoid ruby :I

No, Perl has a binmode() function as well, for exactly this
issue. This question’s asked on c.l.p.m almost weekly.

The moral of the story is:

Always call fh.binmode() before reading
any non-text file on non-Unix platforms.

You have to call it before reading any file, unless you just
know that only ASCII was used, for the reason above. I think
ruby is the only software I’ve ever used that has this issue.
I suppose ruby must check for the 0x1a before allowing for
the encoding system.

AFAIK, Ruby just calls the system read calls, as do all the other
scripting languages. If there’s any “magic” in the Ruby
implementation, that would be interesting to see.

How about:

Always call fh.binmode() before reading any non-7-bit-clean
file on non-Unix platforms.

Cheers,
Tim Hammerquist


It’s there as a sop to former Ada programmers. :-)
– Larry Wall regarding 10_000_000 in 11556@jpl-devvax.JPL.NASA.GOV

I thought that the Control-Z usage was borrowed from CP/M in order to
make it easier to port programs from that to MS-DOS.

http://www.finseth.com/~fin/craft/Chapter-5.html

···

On Fri, 2003-09-19 at 19:41, Hal Fulton wrote:

So historically Unix’s behavior with respect to ^D was the same as
DOS’s with respect to ^Z. But Unix/Linux moved beyond that, and
DOS/Windows never did.

http://thispaceavailable.uxb.net/blog/index.html

The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense. – E. W. Dijkstra

Steven Jenkins wrote:

No. Unix never distinguished between text and binary files. Unix did
(and does) interpret ASCII EOT (ctrl-d) as an end-of-input indicator for
terminal devices, but it never used any in-band character to mark the
end of a file. The EOT never got past the terminal driver, and was never
delivered to an application.

If Unix never distinguished between text and binary files, what
was the binary mode flag for?

Hal

“Hal Fulton” hal9000@hypermetrics.com wrote in message

If Unix never distinguished between text and binary files, what
was the binary mode flag for?

For CR-LF may be … just a wild guess.
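
That guess points at the other half of what text mode does: on Windows, a
text-mode stream translates CRLF to LF on read (and back on write), while
binary mode leaves the bytes alone. A rough illustration, which only shows
a difference on a Windows box:

  File.open("crlf.txt", "wb") { |f| f.write("one\r\ntwo\r\n") }   # raw CRLF endings

  text = File.open("crlf.txt", "r")  { |f| f.read }   # "one\ntwo\n" on Windows
  raw  = File.open("crlf.txt", "rb") { |f| f.read }   # "one\r\ntwo\r\n" everywhere

  p text.length, raw.length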

Hal Fulton wrote:

> If Unix never distinguished between text and binary files,
> what was the binary mode flag for?

I recall that the whole ^Z terminator came from CP/M, where file
sizes were always multiples of 128 bytes (saving 7 bits in a size
field being important at the time), so a text file needed a special
character to mark the end of the text. MSDOS carried that “tradition”
on, to ease porting of CP/M applications to DOS, and, well, saving bits
was important at that time, or at least it seemed to be!

d.k.

···


Daniel Kelley - San Jose, CA
For email, replace the first dot in the domain with an at.

If Unix never distinguished between text and binary files, what
was the binary mode flag for?

Don’t equate Unix and C, for the two are not the same thing.

My old K&R C book says of this, “…SOME systems distinguish between
text and binary files; for the latter, a “b” must be appended to the
mode string.” (emphasis mine).

Or were you referring to something else?


In Message-Id: 3F6BB6FB.3020204@hypermetrics.com
Hal Fulton hal9000@hypermetrics.com writes:

If Unix never distinguished between text and binary files, what
was the binary mode flag for?

For ANSI-C compliance. From fopen(3) of FreeBSD 4.8-RELEASE:

 The mode string can also include the letter ``b'' either as a third char-
 acter or as a character between the characters in any of the two-charac-
 ter strings described above.  This is strictly for compatibility with
 ISO/IEC 9899:1990 (``ISO C89'') and has no effect; the ``b'' is ignored.

I believe most Unix-like platforms take a similar position.

kjana@dm4lab.to September 20, 2003
A man is known by the company he keeps.

Unix originally didn’t have one. Only has it now for compatibility.

···

On Fri, 2003-09-19 at 22:10, Hal Fulton wrote:

If Unix never distinguished between text and binary files, what
was the binary mode flag for?


– Jim Weirich jweirich@one.net http://onestepback.org

“Beware of bugs in the above code; I have only proved it correct,
not tried it.” – Donald Knuth (in a memo to Peter van Emde Boas)

Hal Fulton wrote:

If Unix never distinguished between text and binary files, what
was the binary mode flag for?

The ‘b’ modifier was added to ANSI C to support non-Unix execution
environments that distinguish between text and binary files. It didn’t
exist in Unix until ANSI C required it; since then, it’s been a no-op.

http://www.lysator.liu.se/c/rat/d9.html#4-9-2

Steve