Request for feedback / evaluation: `json-escape` — no-std, zero-copy JSON escaping & unescaping library
⚓ Rust 📅 2025-10-01 👤 surdeus 👁️ 22

Hello all,
I’ve been working on a Rust crate called `json-escape` — a `no_std`, zero-copy, allocation-free library for streaming JSON string escaping and unescaping.
I’m posting here to get a broad set of eyes from the community: users, library authors, embedded folks, performance geeks, etc. – any thoughts, critiques, or suggestions are very welcome.
What it offers
Here’s a quick rundown of the features and design goals:
- Works in `no_std` contexts (with an optional `alloc`/`std` feature for owned types)
- Zero-copy slicing: when no modifications are needed, the iterators yield slices borrowed from the input
- Streaming support: the `UnescapeStream` API handles escape sequences that might cross chunk boundaries — useful when reading JSON over I/O or in fragments
- Fully compliant with RFC 8259, including correct handling of surrogate pairs (e.g. `\uD83D\uDE00`)
- Multi-layered APIs:
  - High-level iterator functions (e.g. `escape_str`, `unescape`)
  - Streaming for I/O
  - Token-level APIs (`token`, `explicit`) for low-level control
- Performance: in benchmarks, the streaming approach outperforms the common “collect a buffer, then unescape” model (up to ~5× for dense escapes, ~2.2× for sparse escapes, ~1.7× with no escapes)
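To make the zero-copy idea concrete, here is a minimal, self-contained sketch of the technique (my own illustration, not the crate's actual API, and it uses `alloc` for brevity where the crate stays allocation-free): borrow the input unchanged when nothing needs escaping, and allocate only on the slow path.

```rust
use std::borrow::Cow;

// Returns true for characters that RFC 8259 requires to be escaped
// inside a JSON string: the quote, the backslash, and all controls
// below U+0020.
fn needs_escape(c: char) -> bool {
    matches!(c, '"' | '\\') || (c as u32) < 0x20
}

/// Zero-copy sketch: if no character needs escaping, return a slice
/// borrowed from the input; otherwise build an escaped copy.
fn escape_json(input: &str) -> Cow<'_, str> {
    // Fast path: nothing to do, so no allocation and no copy.
    if !input.chars().any(needs_escape) {
        return Cow::Borrowed(input);
    }
    let mut out = String::with_capacity(input.len() + 2);
    for c in input.chars() {
        match c {
            '"' => out.push_str("\\\""),
            '\\' => out.push_str("\\\\"),
            '\n' => out.push_str("\\n"),
            '\r' => out.push_str("\\r"),
            '\t' => out.push_str("\\t"),
            // Remaining control characters get the \u00XX form.
            c if (c as u32) < 0x20 => out.push_str(&format!("\\u{:04x}", c as u32)),
            c => out.push(c),
        }
    }
    Cow::Owned(out)
}

fn main() {
    // No escapes: the result borrows from the input (zero copy).
    assert!(matches!(escape_json("plain text"), Cow::Borrowed(_)));
    // Escapes present: an owned, escaped string is produced.
    assert_eq!(escape_json("line\nbreak"), "line\\nbreak");
}
```

The crate's iterator-based design generalizes this: instead of one `Cow`, it yields a sequence of borrowed slices interleaved with escape fragments, which is what lets it stay allocation-free.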
Areas I’m especially seeking feedback on
These are the questions / concerns where I’d love community input:
- Soundness and correctness
  - Are there tricky edge cases I might have missed (e.g. partial escape sequences at buffer boundaries, malformed inputs, invalid surrogate handling)?
  - Does the API correctly propagate errors and invalid-input cases in a way that’s ergonomic for callers?
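On the buffer-boundary question specifically, here is a tiny self-contained sketch (names and logic are my own illustration, not the crate's `UnescapeStream`) of the carry-over state a streaming unescaper needs when an escape such as `\uD83D\uDE00` is split mid-sequence across chunks:

```rust
/// Holds back a trailing partial escape sequence from one chunk and
/// prepends it to the next, so escapes split across chunk boundaries
/// are reassembled before any decoding happens.
struct Carry(String);

impl Carry {
    fn new() -> Self {
        Carry(String::new())
    }

    /// Prepend any held-back text, then split off a new trailing
    /// partial escape (a `\` not yet followed by its full sequence).
    fn feed(&mut self, chunk: &str) -> String {
        let mut buf = std::mem::take(&mut self.0);
        buf.push_str(chunk);
        // Inspect the last backslash; if the escape it starts is
        // incomplete, hold it (and everything after) for the next chunk.
        if let Some(i) = buf.rfind('\\') {
            let tail = &buf[i..];
            let complete = match tail.as_bytes().get(1) {
                None => false,                 // lone `\` at end of chunk
                Some(b'u') => tail.len() >= 6, // `\uXXXX` needs 6 bytes
                Some(_) => true,               // two-char escape like `\n`
            };
            if !complete {
                self.0 = buf.split_off(i);
            }
        }
        buf
    }
}

fn main() {
    let mut carry = Carry::new();
    // `\uD83D\uDE00` arrives split mid-escape across two chunks.
    let a = carry.feed(r"emoji: \uD8");
    let b = carry.feed(r"3D\uDE00 done");
    // The partial `\uD8` was held back from the first chunk...
    assert_eq!(a, "emoji: ");
    // ...and rejoined with the second, yielding complete escapes only.
    assert_eq!(b, r"\uD83D\uDE00 done");
}
```

Note this sketch stops one step short of the hardest case: a syntactically complete high surrogate (`\uD83D`) at a chunk boundary must also be held back until its low half arrives, or unpaired-surrogate errors get reported spuriously. That is exactly the class of edge case I would test against.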
- API design / ergonomics
  - Are the layering choices (iterator + streaming + token) intuitive and useful?
  - Are there missing conveniences or ergonomics you'd expect in a crate like this?
- Performance trade-offs / benchmarks
  - Any real-world workloads I should test (e.g. very large JSON strings, network streaming, embedding in serde contexts)?
  - Are there micro-optimizations or algorithmic tweaks you might suggest?
- Use cases / adoption potential
  - For folks building JSON parsers, custom serializers, embedded systems, or I/O streaming pipelines: would this library fit your needs? If not, why not?
  - What alternative approaches (if any) would you use instead?
- Versioning, stability, ecosystem fit
  - What kinds of guarantees or deprecation paths would you expect from such a library?
  - How would you prefer it integrate with existing JSON/serde/streaming ecosystems?