Date: 1 Jul 2003
From: Jan 'Bulb' Hudec <bulb@ucw.cz>
Subject: [bug?] How to stuff 21 bits in __u16
Hello All,

I have a question about a definition in nls.h (both 2.4.21 and 2.5.56):

The UTF-8 decoding code seems to handle all characters up to
0x7fffffff, but then it is supposed to store them in wchar_t, which is
defined as __u16. To me this looks like a bug (which should moreover be
trivial to fix with something like:)

--- linux-2.4.21/include/linux/nls.h.orig 2003-06-30 10:12:37.000000000 +0200
+++ linux-2.4.21/include/linux/nls.h 2003-07-01 19:07:17.000000000 +0200
@@ -4,7 +4,7 @@
 #include <linux/init.h>
 
 /* unicode character */
-typedef __u16 wchar_t;
+typedef __u32 wchar_t;
 
 struct nls_table {
 	char *charset;
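
For illustration, here is a minimal userspace sketch of the problem
(decode_utf8_4byte is a simplified stand-in for the kernel's UTF-8
decoder, written just for this example, with no validation): a 4-byte
UTF-8 sequence decodes to a 21-bit code point, and storing the result
in a 16-bit type silently drops the high bits:

#include <stdio.h>
#include <stdint.h>

/* Decode a 4-byte UTF-8 sequence: 11110xxx 10xxxxxx 10xxxxxx 10xxxxxx.
 * No validation; this only illustrates the width of the result. */
static uint32_t decode_utf8_4byte(const unsigned char *s)
{
	return ((uint32_t)(s[0] & 0x07) << 18) |
	       ((uint32_t)(s[1] & 0x3f) << 12) |
	       ((uint32_t)(s[2] & 0x3f) <<  6) |
	        (uint32_t)(s[3] & 0x3f);
}

int main(void)
{
	/* U+1D11E MUSICAL SYMBOL G CLEF, a code point above 0xffff. */
	const unsigned char seq[4] = { 0xf0, 0x9d, 0x84, 0x9e };
	uint32_t full = decode_utf8_4byte(seq);
	uint16_t truncated = (uint16_t)full; /* what a __u16 wchar_t keeps */

	printf("decoded:   0x%06x\n", (unsigned)full);      /* 0x01d11e */
	printf("truncated: 0x%04x\n", (unsigned)truncated); /* 0xd11e: high bits lost */
	return 0;
}

With __u16, everything above the Basic Multilingual Plane collapses
like this; with __u32 the full decoded value fits.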
-------------------------------------------------------------------------------
Jan 'Bulb' Hudec <bulb@ucw.cz>