I understand that efficiency isn't necessarily the deciding factor, since ASH is merely an interpreter at heart, but that doesn't mean that using built-in Java math operators is *as slow* as writing script code to do the same thing. Interpreting may be slow, but your statement implies that interpreting the same work a second time over wouldn't be any slower.
Java natively supplies bitwise operators, so it would merely be a matter of exposing them in the ASH interpreter. That would execute quite a bit faster than a script function that grinds through division and if-checks just to pull one bit out of four bytes. Shift-left and shift-right ("<<" and ">>" in Java) would probably be the only additions that might seem to require checking big- versus little-endianness, but again, Java's shift semantics are platform-independent, so Java makes that check for you.
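To make the comparison concrete, here is a minimal sketch (class and method names are my own, purely illustrative) contrasting the one-operation shift-and-mask approach Java provides against the divide-and-check loop a script without bitwise operators is forced into:

```java
public class BitDemo {
    // Extract bit n (0 = least significant) using Java's native
    // shift and mask operators: effectively two machine operations.
    static int getBitShift(int value, int n) {
        return (value >> n) & 1;
    }

    // The same result using only division and modulus, the way an
    // interpreted script without bitwise operators has to do it.
    // (Assumes a non-negative value; Java's / and % round toward
    // zero, so negative inputs would need extra handling.)
    static int getBitDivide(int value, int n) {
        for (int i = 0; i < n; i++) {
            value = value / 2;   // one division per bit position
        }
        return value % 2;        // plus a final modulus
    }

    public static void main(String[] args) {
        int flags = 166;         // binary 10100110
        // Print bits from least to most significant.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 8; i++) {
            sb.append(getBitShift(flags, i));
        }
        System.out.println(sb); // prints "01100101"
    }
}
```

Both methods return the same bit, but the division version costs n divisions plus a modulus per call, and each of those is itself interpreted when written in script code, which is the layered-interpretation cost described above.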
When I said "every language" I may have been oversimplifying my beliefs. I can think of a few instances where inclusion in the language itself would be excessive or redundant. In procedural and functional languages, math functions [and bit manipulation] are generally aspects of the implementation. Most of my work has been in compiled code, not interpreted code, so if that has a bearing here, then I apologize for my misunderstanding. And, I must emphasize this more strongly than I did before: the necessity of this ability is entirely a personal belief.
If you would care to illustrate the "state of the art of computer language design," that would be wonderful, since as written I don't understand at all what you are trying to say, other than the implied stab at my knowledge of the subject.