Question about String::WriteAscii


r...@tinyclouds.org

unread,
Sep 21, 2010, 6:28:21 PM9/21/10
to v8-users

fuzzy spoon

unread,
Sep 22, 2010, 3:58:46 AM9/22/10
to v8-u...@googlegroups.com
It seems like a safeguard for buffer* having '\0' in it (obviously, I know you knew that).

To me it seems like an issue, because a char* uses '\0' to denote the end of the string. But perhaps writing multiple strings into the same buffer was causing problems, with reads stopping at the first string, so this was added because the supplied length is explicit.

I.e.: it _allows_ you to write strings contiguously in memory, provided you know their combined length (including the zero-terminator for each).

Perhaps it's a neat trick for performance reasons?
I'm curious: does it impact performance at all?
Or does it break a normal char* by changing its terminator to a space?

Alan Gutierrez

unread,
Sep 22, 2010, 4:31:25 AM9/22/10
to v8-u...@googlegroups.com

Camilo Aguilar

unread,
Sep 22, 2010, 7:57:31 AM9/22/10
to v8-u...@googlegroups.com
No, it isn't. NUL is a valid ASCII character: NUL, 000, 0, and 0x00 in Char, Oct, Dec, and Hex respectively.

Alan Gutierrez

unread,
Sep 22, 2010, 12:30:19 PM9/22/10
to v8-u...@googlegroups.com
So, what does '\0' map to in ASCII? Its character code value is 0.

Are you correcting semantics or are you adding something to the discussion?

Alan Gutierrez

unread,
Sep 27, 2010, 10:43:07 PM9/27/10
to v8-u...@googlegroups.com
Seems like this conversation stalled. I'm still curious to learn why
the ASCII encoding arbitrarily converts 0 to 32 when it encounters one,
and if it is such great shakes, why not do it for UTF-8?

In the meantime, I'm zeroing the most significant bit and using the
UTF-8 decoder.

Alan Gutierrez
