Date: Tue, 1 Jul 2003 19:18:58 +0200
From: Jan Hudec <>
Subject: [bug?] How to stuff 21 bits in __u16
Hello All,
I have a question about a definition in nls.h (both 2.4.21 and 2.5.56):
The utf-8 decoding stuff seems to handle all characters up to 0x7fff_ffff, but then it is supposed to store them in wchar_t, which is defined as __u16. To me it seems like a bug, which should moreover be trivial to fix with something like the patch below.
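For illustration, here is a minimal userspace sketch (my own, not from the kernel sources; the code point U+1D11E is just an example value above the 16-bit range) of what the truncation does:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
	/* a valid code point above 0xffff, as the utf-8 decoder can produce */
	uint32_t decoded = 0x1d11e;
	/* what a __u16 wchar_t actually keeps: only the low 16 bits */
	uint16_t stored = decoded;

	printf("decoded: 0x%x, stored: 0x%x\n", decoded, stored);
	/* prints "decoded: 0x1d11e, stored: 0xd11e", the high bits are lost */
	return 0;
}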
--- linux-2.4.21/include/linux/nls.h.orig	2003-06-30 10:12:37.000000000 +0200
+++ linux-2.4.21/include/linux/nls.h	2003-07-01 19:07:17.000000000 +0200
@@ -4,7 +4,7 @@
 #include <linux/init.h>
 
 /* unicode character */
-typedef __u16 wchar_t;
+typedef __u32 wchar_t;
 
 struct nls_table {
 	char *charset;

-------------------------------------------------------------------------------
Jan 'Bulb' Hudec <bulb@ucw.cz>