Hex literals for signed integers

⚓ Rust    📅 2025-07-29    👤 surdeus

I am implementing a protocol that defines special values for certain integer types.
The specification defines those values as the hexadecimal representation of the signed integer, i.e. what the `"{:#x}"` format string produces, e.g. `0x80` for `i8`.

However, I cannot initialize a signed integer with such a hexadecimal value:
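A minimal example (a sketch, assuming the `i8` case from above):

```rust
fn main() {
    // error: literal out of range for `i8`
    // 0x80 is 128, which does not fit into `i8` (range -128..=127).
    // rustc suggests writing the literal as `u8` and casting: `0x80u8 as i8`.
    let special: i8 = 0x80;
}
```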

The compiler's suggestion works as intended, but I have three issues with it:

  1. It looks ugly and unnecessarily complicated.
  2. I don't trust `as` casting, since it may break my code somewhere due to, e.g., truncation.
  3. I will need to add a clippy exemption above the definition in order to satisfy my strict linter config (see 2.); a sketch of that workaround follows this list.
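Roughly what that would look like (the exact lint that fires depends on the config; `cast_possible_wrap` is my guess, not from the original post):

```rust
// The cast itself is sound here: 0x80u8 as i8 wraps to -128.
// A strict clippy setup (e.g. with pedantic lints enabled) may still
// flag the `as` cast, hence the exemption.
#[allow(clippy::cast_possible_wrap)]
const SPECIAL: i8 = 0x80u8 as i8; // == i8::MIN == -128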

I also find this error very surprising, since the conversion in the other direction yields exactly the same hex literal:
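```rust
fn main() {
    // Round-tripping: formatting i8::MIN with {:#x} prints the very
    // hex literal that is rejected as an i8 initializer above.
    println!("{:#x}", i8::MIN); // prints: 0x80
}
```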

Why does Rust have this inconsistency?
