Hex literals for signed integers
⚓ Rust 📅 2025-07-29 👤 surdeus

I am implementing a protocol that defines special values for certain integer types.
The specification defines those as hexadecimal representations of the signed integers, as returned by the `"{:#x}"` format string, e.g. `0x80` for an `i8`.
However, I cannot initialize a signed integer with such a hexadecimal value:
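The original code block did not survive; a minimal sketch of what this likely looked like, assuming the special value `0x80` from the example above (the compiler rejects the literal because `0x80` is 128, which exceeds `i8::MAX`, and suggests writing it through `u8` instead):

```rust
fn main() {
    // This does not compile: the `overflowing_literals` lint (deny-by-default)
    // rejects it with "literal out of range for `i8`".
    // let special: i8 = 0x80;

    // The compiler's suggested workaround, an `as` cast from `u8`:
    let special: i8 = 0x80u8 as i8;
    assert_eq!(special, -128); // same bit pattern, reinterpreted as signed
}
```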
The compiler's suggestion works as intended, but I have three issues with it:
1. It looks ugly and unnecessarily complicated.
2. I don't trust `as` casting, as it may break my code somewhere due to e.g. truncation.
3. I will need to add a clippy exemption above the definition in order to satisfy my strict linter config (see 2.).
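For what it's worth, an `as` cast between integers of the same width is always lossless, so no truncation can occur here; but if the goal is to avoid `as` entirely, a sketch of two alternatives (assuming the `0x80` example value; neither triggers clippy's cast lints):

```rust
fn main() {
    // Reinterpret the byte directly; same-width, no truncation possible,
    // and usable in const context.
    const SPECIAL: i8 = i8::from_ne_bytes([0x80]);
    assert_eq!(SPECIAL, -128);

    // On newer toolchains, `u8::cast_signed` states the intent explicitly:
    // let special: i8 = 0x80u8.cast_signed();
}
```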
I also find this error very surprising, since the conversion in the other direction produces the exact same hex literal:
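The code illustrating this direction was also lost; presumably it was something like the following, showing that hex formatting of a signed integer prints its two's-complement bit pattern:

```rust
fn main() {
    // Formatting i8::MIN (-128) in hex yields exactly the literal
    // that the compiler refuses to accept as an i8.
    assert_eq!(format!("{:#x}", -128i8), "0x80");
    assert_eq!(format!("{:#x}", i8::MIN), "0x80");
}
```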
Why does Rust have this inconsistency?
3 posts - 2 participants