BigDecimal might be much slower than float (though not that much in practice - I worked on a large banking application where one of the (many) things we did was convert everything from floats to BigDecimals, and it didn't slow things down considerably, even with all the new heavy calculations). For me the worst part is that you have to explicitly specify rounding mode and scale every time you call divide(), and I would also be wary of memory requirements (more frequent garbage collections, at the very least).
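To illustrate the divide() annoyance, here's a minimal sketch (class name is just for the example): an exact division works without extra arguments, but a non-terminating quotient like 1/3 throws unless every call site spells out a scale and rounding mode.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DivideDemo {
    public static void main(String[] args) {
        BigDecimal one = new BigDecimal("1");
        BigDecimal three = new BigDecimal("3");

        // Exact quotients are fine without extra arguments:
        System.out.println(new BigDecimal("1").divide(new BigDecimal("4"))); // 0.25

        // But 1/3 has no terminating decimal expansion, so this throws:
        try {
            one.divide(three);
        } catch (ArithmeticException e) {
            System.out.println("threw: " + e.getMessage());
        }

        // Every such call site has to decide scale and rounding itself:
        BigDecimal q = one.divide(three, 10, RoundingMode.HALF_UP);
        System.out.println(q); // 0.3333333333
    }
}
```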
Everything else is handwaveable or an advantage, imo.
Methods instead of operators: that's a change only inside mafia itself; ASH can keep using operators. Not envious of the person coding the change to mafia, though. Unless the only thing changed is ASH's numbers, and mafia keeps using floats everywhere else.
BigDecimal will keep source-specified precision (as long as you initialize it with a String and don't go through an intermediate float, which has confounded me a few times already; right: new BigDecimal("0.1"), wrong: new BigDecimal(0.1)).
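A quick sketch of that trap (class name is mine): the String constructor keeps exactly what was written, while the double constructor captures the nearest binary double to 0.1, in all its glory.

```java
import java.math.BigDecimal;

public class ConstructorDemo {
    public static void main(String[] args) {
        // Right: the String constructor keeps exactly the digits you wrote.
        System.out.println(new BigDecimal("0.1")); // 0.1

        // Wrong: the double constructor records the exact value of the
        // binary double nearest to 0.1, which is not 0.1.
        System.out.println(new BigDecimal(0.1));
        // 0.1000000000000000055511151231257827021181583404541015625

        // BigDecimal.valueOf(0.1) goes through Double.toString and prints "0.1",
        // but any precision the double had already lost is still gone.
        System.out.println(BigDecimal.valueOf(0.1)); // 0.1
    }
}
```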
I suspect ASH (and Java itself) has enough overhead that the difference will be significantly smaller than what measuring two tight loops doing nothing else would suggest.
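For what it's worth, here is the kind of naive tight-loop comparison I mean - a rough sketch, not a proper benchmark (no JMH, no warmup control; absolute numbers will vary wildly by JVM). It sums the first million terms of the harmonic series both ways; the raw ratio it prints overstates the gap you'd see once interpreter dispatch overhead is layered on top of each operation.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class LoopTiming {
    public static void main(String[] args) {
        final int N = 1_000_000;

        // Tight loop with doubles.
        long t0 = System.nanoTime();
        double d = 0.0;
        for (int i = 1; i <= N; i++) {
            d += 1.0 / i;
        }
        long doubleNs = System.nanoTime() - t0;

        // Same loop with BigDecimal (scale 20, HALF_UP, chosen arbitrarily).
        t0 = System.nanoTime();
        BigDecimal b = BigDecimal.ZERO;
        for (int i = 1; i <= N; i++) {
            b = b.add(BigDecimal.ONE.divide(BigDecimal.valueOf(i), 20, RoundingMode.HALF_UP));
        }
        long bigNs = System.nanoTime() - t0;

        System.out.printf("double: %d ms, BigDecimal: %d ms, sums: %f vs %s%n",
                doubleNs / 1_000_000, bigNs / 1_000_000, d, b.toPlainString());
    }
}
```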
Worst case, we can use BigDecimal just to output doubles at specified precision:
Code:
(new BigDecimal(1234860.43248754324)).setScale(50, RoundingMode.HALF_UP).stripTrailingZeros().toString();
(that 50 is how many decimals (after the decimal point) I want at most; stripTrailingZeros is there to prevent extra zeroes at the end)
But I would prefer having numbers that exactly represent what I wrote in the code (or in the input box) over having unknowably-faster calculations.
I mean, unless we are talking about a change from "almost instant" to "minutes", but I am not convinced that would happen.