GNU Bc Base Conversion


Hello CentOS List Members,

Any thoughts as to why my bc functions aren’t properly converting between bases?

Decimal to binary or hex works fine, but binary or hex to decimal does not, and so forth. No doubt my syntax is wrong somewhere, but when I test the same statements from the CLI the right values are returned, so I have to wonder.

Aside from a few other define statements, the only option I have set is scale=5 in my bcrc.

In reference to the order of (o|i)base parameters, I have specified obase before ibase [0].
[0] http://docstore.mik.ua/orelly/unix/upt/ch49_03.htm

See below for my examples. Thanks!

]$ echo "bin_to_dec(1001)" | bc
1001
# should be decimal 9

]$ echo "obase; ibase=2; 1001" | bc
9

]$ grep bin_to_dec ~/.bcrc
define bin_to_dec(b) { obase; ibase=2; return b; }

9 thoughts on - GNU Bc Base Conversion

  • I’m not an expert in bc, so I might be wrong, but it looks like setting the ibase inside a function is simply too late. ibase affects how bc interprets input. So echo "bin_to_dec(1001)" | bc is going to interpret 1001 while reading it from input, not after passing it to a function where ibase is reset.

    Supporting that theory:
    $ bc
    … define bin_to_dec(b) { obase; ibase=2; return b; }
    bin_to_dec(1001)
    1001

    Decimal to binary and hex work correctly because decimal input is the default. Since ibase is already 10, those values are interpreted the way you want, but not because you’re setting ibase in your function.

  • Yes, and it’s a serious design mistake in bc, IMHO. No other programmable system I’ve ever used changes how numbers in program text are interpreted based on prior commands to the system.

    I wrote a long answer explaining this on the Unix & Linux Stack Exchange here:

    http://unix.stackexchange.com/a/199620

  • No, there’s a fairly common hack around this problem: ibase=A and obase=A *always* means “base 10” regardless of the current base, due to a quirk in the way values for these settings are interpreted. Thus you can always force your way back to sanity.

    My objection is that this is even necessary in the first place.

  • Not sure how this helps me with my most recent example of bin_to_hex where the ibase within the define clause wasn’t honored.

    Working with bc interactively or by piping produces the desired/correct values. Testing indicates that ibase defaults to, or is overridden as, base 10 despite what is specified in the define clause. :-(

    Agreed.


    —~~.~~—
    Mike
    // SilverTip257 //

  • I suppose that depends on what you’re trying to accomplish. Most conversions you can do entirely within bash, if that’s your goal.

    function dec_to_hex() { printf '%x\n' $1; }
    function hex_to_dec() { printf '%d\n' $(( 16#$1 )); }
    function hex_to_bin() { echo 'obase=2;' $(( 16#$1 )) | bc; }

    $ dec_to_hex 10
    a
    $ hex_to_dec a
    10
    $ hex_to_bin a
    1010
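    The same base#value arithmetic-expansion trick covers binary input too; these helper names are my own, not from the thread:

```shell
# bash reads N#value inside $(( )) as a literal in base N.
function bin_to_dec() { printf '%d\n' $(( 2#$1 )); }
function bin_to_hex() { printf '%x\n' $(( 2#$1 )); }

bin_to_dec 1001      # 9
bin_to_hex 10101011  # ab
```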

  • That’s because your bin_to_hex function is erroneously assuming that its input is just a string of digits that has no base interpretation, so that it can set ibase *after* bc has already seen the value, and that this will change the meaning of the prior input.

    That’s just plain wrong: the prior input has already been interpreted. You can’t wind back the clock like that. That sort of thinking only works in a string-based language like shell or Tcl, where a numeric string doesn’t take a meaning until you assign one.

    The correct form is:

    ibase=A /* needed only if ibase might not be 10 at this point */
    obase=A /* ditto */
    obase=16 /* means 16-base-10 here */
    ibase=2 /* no possibility of confusion */
    10101011

    Result: AB, as expected.

    The tricky bit is that you can’t swap the second pair of ibase and obase settings, since that would cause bc(1) to interpret obase=16 in base 2: bc(1) will clamp “16” to 11, which in base 2 is 3-base-10.

  • That explains the behavior.

    Thank you. We’re in agreement: your solution _does_ work when piped or run interactively. No ordering works from within a definition stanza, since base 10 is
    (apparently) not capable of being overridden from within definitions.

    tmp]$ grep bin_to_hex ~/.bcrc
    #define bin_to_hex(b) { obase=16; ibase=2; return b; }
    define bin_to_hex(b) { ibase=A; obase=A; obase=16; ibase=2; return b; }

    tmp]$ echo "bin_to_hex(10101011)" | bc
    9A2113
    tmp]$ echo "ibase=A; obase=A; obase=16; ibase=2; 10101011" | bc
    AB
    tmp]$ echo "obase=16; ibase=2; 10101011" | bc
    AB

    Agreed. Thank you, Warren.

    I don’t believe bc is the tool for this job, since define stanzas don’t replicate the behavior seen when bc is used via piping or interactively. :-/


    —~~.~~—
    Mike
    // SilverTip257 //

  • Again, it comes down to order of evaluation. You can look at the following in three stages:

    define bin_to_hex(b) { obase=16; ibase=2; return b; }
    bin_to_hex(10101011)

    1. Here is a number, 10101011, which you shall interpret in base 10, since that’s the default and no one has said otherwise yet.

    2. Pass that value to bin_to_hex()

    3. Change the obase and ibase, then return the parameter with those new settings.

    You are expecting that because the define line appears first, it affects what follows; but it doesn’t affect anything until you call it for the first time.

    That’s clear.

    What I *don’t* understand is why this doesn’t work:

    define bin_to_hex(b) { obase=16; ibase=2; return b; }
    bin_to_hex(0) /* bogus call to force obase and ibase */
    bin_to_hex(10101011)

    For some strange reason, I get 20100 out of this with both GNU and BSD bc, even though ibase=2 and obase=10 by the time of the second bin_to_hex() call.

    (If you’re confused about obase=10, realize that 10-base-16 is 16-base-10. :) )

    Bottom line, don’t use bc for this kind of thing. It’s not the right tool for the job.