An alternative proposal for Glk, stage 1

vaporware

Feb 3, 2010, 4:51:33 PM
Hi all,

I'm assembling my own proposal for extending Glk, and I'd like to
solicit your feedback.

=== Introduction ===

The primary goal is to make Glk capable of handling all the features
of the Z-machine: that is, to make it possible to write a full-
featured Z-machine interpreter using Glk as its output system without
giving up features like color. The secondary goal is to extrapolate
those features in a way that's sensible for today's more capable
hardware (e.g. 24-bit color rather than a palette of 12 colors).
Ideally, extensions like Basic Screen Effects will be more capable on
Glulx than on the Z-machine, not less.

For this, I'm drawing on my experience with Demona, which is a Z-
machine interpreter using Gargoyle's Glk library for output. When I
was writing Demona, I identified three features that Glk needed to
support the Z-machine's display: terminating characters, color changes
independent of style, and the ability to "unwrite" text from the end
of the buffer to support a particularly annoying case in the Z-
machine's "read" instruction. This proposal will include added Glk
functions, gestalt selectors, and other minor changes.

=== Added Functions ===

void glk_set_line_terminators(winid_t win, const glui32 *keycodes,
glui32 numkeycodes);

This registers a set of key codes which will terminate line input
(without printing a new line). The key code will be returned in the
val2 field of the event record.
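
As a concrete illustration (not part of the proposal text), here is a
self-contained C model of the intended semantics; the term_set struct
and feed_keycode() are invented stand-ins for the library's internals:

```c
#include <stddef.h>

typedef unsigned int glui32;

/* Toy model of the proposed terminator semantics. A terp would call
   glk_set_line_terminators() once; here the registered set is just a
   struct. feed_keycode() plays the role of the library's input loop:
   it returns 1 when line input finishes, storing in *val2 the value
   the proposal says goes in the event's val2 field (0 for a plain
   Return, otherwise the terminating keycode). Illustrative only. */
typedef struct {
    const glui32 *keycodes;
    glui32 numkeycodes;
} term_set;

int feed_keycode(const term_set *ts, glui32 key, glui32 *val2)
{
    glui32 i;
    if (key == '\n') {          /* ordinary Return: terminate, val2 = 0 */
        *val2 = 0;
        return 1;
    }
    for (i = 0; i < ts->numkeycodes; i++) {
        if (ts->keycodes[i] == key) {   /* registered terminator */
            *val2 = key;                /* no newline is printed */
            return 1;
        }
    }
    return 0;   /* ordinary character: would be appended to the buffer */
}
```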

void glk_unput_string(char *s);
void glk_unput_string_uni(glui32 *s);
void glk_unput_string_stream(strid_t str, char *s);
void glk_unput_string_stream_uni(strid_t str, glui32 *s);

These remove a string from the end of the specified stream's buffer,
or the currently selected stream's buffer, if indeed it is there. The
Z-machine's "read" instruction expects to be able to continue line
input using a partial line that has already been printed by the game;
the interpreter knows what the partial text is, but is not expected to
print it. With this function, the interpreter can unput the partial
text and then use the regular Glk line input feature (which *does*
print the partial text).
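
The buffer manipulation involved is simple; here is a minimal C sketch
(an invented helper, operating on a plain char buffer rather than a
real Glk stream) of the "remove from the end, if it is there" rule:

```c
#include <string.h>

/* Toy model of glk_unput_string(): remove s from the END of buf, but
   only if buf actually ends with s. Returns 1 on success, 0 if the
   buffer does not end with s (in which case nothing changes). A real
   implementation would also have to undo the text on screen. */
int unput_from_end(char *buf, const char *s)
{
    size_t blen = strlen(buf);
    size_t slen = strlen(s);
    if (slen > blen || memcmp(buf + blen - slen, s, slen) != 0)
        return 0;
    buf[blen - slen] = '\0';
    return 1;
}
```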

void glk_stylehint_set_temp(winid_t win, glui32 styl, glui32 hint,
glsi32 val);

This sets a temporary style hint for the specified window. Unlike a
regular style hint, this *does* take effect immediately. Temporary
style hints take precedence over the attributes of the currently
selected style, so selecting style_Alert and then hinting its
foreground color to red will produce a red alert. All temporary style
hints will be cleared upon calling glk_set_style or
glk_set_style_stream (without changing the appearance of any text that
was printed in the temporary style).
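
The proposed lifetime rule (temporary hints layered over the current
style, cleared by the next glk_set_style) can be modeled in a few
lines of C; the struct and function names here are invented for
illustration:

```c
typedef unsigned int glui32;
typedef int glsi32;

#define NUM_HINTS 10   /* matches stylehint_NUMHINTS in glk.h */

/* Invented model of a window's style state: the current style plus an
   overlay of temporary hint values. */
typedef struct {
    glui32 cur_style;
    int    temp_set[NUM_HINTS];   /* 1 if a temporary override exists */
    glsi32 temp_val[NUM_HINTS];
} win_style_state;

/* glk_stylehint_set_temp(): takes effect immediately, layered on top
   of whatever style is current. */
void set_temp_hint(win_style_state *w, glui32 hint, glsi32 val)
{
    w->temp_set[hint] = 1;
    w->temp_val[hint] = val;
}

/* glk_set_style(): selects a style AND clears every temporary hint,
   per the proposal (text already printed keeps its appearance). */
void set_style(win_style_state *w, glui32 styl)
{
    glui32 i;
    w->cur_style = styl;
    for (i = 0; i < NUM_HINTS; i++)
        w->temp_set[i] = 0;
}

int has_temp(const win_style_state *w, glui32 hint)
{
    return w->temp_set[hint];
}
```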

=== Gestalt Selectors ===

New gestalt selectors are added:

* gestalt_HasTerminators, gestalt_HasUnput, gestalt_HasTempStyleHints:
check whether these new functions are present

* gestalt_SupportTempStyleHint: check which hints are supported as
temporary style hints (interpreters may choose not to support
temporary changes in justification, etc.)

=== Other Changes ===

New user style constants are added, style_User3 through style_User16.

=== Your Feedback ===

Are the goals and scope of this proposal worthy? Are these changes
sufficient to achieve them?

Interpreter/Glk library authors: do these changes seem practical to
implement?

Since this is not an official part of the Glk spec, I expect to have
to write a few patches myself (though volunteers would be appreciated,
especially on MacOS). What is the core set of interpreters that would
need to support these changes in order for them to be useful?

vw

Evin

Feb 3, 2010, 6:28:52 PM
>   void glk_set_line_terminators(winid_t win, const glui32 *keycodes,
> glui32 numkeycodes);

Yes, Beyond Zork needs this to scroll the upper window. But it wants
the up/down arrows, which the interpreter hopefully wants to use for
command-history, and keycodes can't specify control-up-arrow as an
alternative... Currently, you can request character input from the
upper window and require the user to tab over there. Clunky, but it's
just one game.

>   void glk_unput_string(char *s);
>   void glk_unput_string_uni(glui32 *s);
>   void glk_unput_string_stream(strid_t str, char *s);
>   void glk_unput_string_stream_uni(strid_t str, glui32 *s);

These functions are unnecessary to implement a Z-machine interpreter.
Your interpreter can buffer input until a newline is received, so that
when it comes time to perform preloaded line input, you can unput from
your buffer and never print the text to the screen. It makes more
sense to put this code inside all Glk-based Z-machine interpreters
than inside all Glk implementations.

>   void glk_stylehint_set_temp(winid_t win, glui32 styl, glui32 hint,
> glsi32 val);

Something like this would be very handy. Glk's existing style system
can display the colors for existing Z-Machine IF games (but not all
abuses or terpetude), but the trick is knowing what colors are going
to be used ahead of time (the upper window isn't a problem; we can
reopen it without the user noticing). Nitfol takes the approach of
"allocate green in case we're playing Varicella, etc.," which could be
carried to its conclusion of having a data table listing all games
that use colors and the appropriate CSS for each game. But allowing
run-time color changes would make all of that complication
unnecessary.

> New user style constants are added, style_User3 through style_User16.

If we had glk_stylehint_set_temp, would anyone use user styles any
more?

Eliuk Blau

Feb 3, 2010, 8:42:00 PM
On 3 Feb, 18:51, vaporware <jmcg...@gmail.com> wrote:
> Hi all,
>
> I'm assembling my own proposal for extending Glk, and I'd like to
> solicit your feedback.
>
> === Your Feedback ===
>
> Are the goals and scope of this proposal worthy? Are these changes
> sufficient to achieve them?
>
> Interpreter/Glk library authors: do these changes seem practical to
> implement?
>
> Since this is not an official part of the Glk spec, I expect to have
> to write a few patches myself (though volunteers would be appreciated,
> especially on MacOS). What is the core set of interpreters that would
> need to support these changes in order for them to be useful?
>
> vw

I like this! It is complete enough for me and very simple (I imagine)
to implement. You have my vote! =3

I prefer glk_stylehint_redefine() instead of glk_stylehint_set_temp().

And I think that styles should not be limited to only 16. I think it
would be a good idea to allow as many styles as the programmer would
like, simply by increasing the style "number", for example:

style_User3 to style_UserNNN (where "NNN" is an arbitrary number, i.e.,
limited only by the maximum value of numeric variables in Inform/
Glulx).

I congratulate you for your contribution to development! =D

Saludos!
Eliuk Blau.

vaporware

Feb 3, 2010, 8:48:18 PM
On Feb 3, 3:28 pm, Evin <nit...@gmail.com> wrote:
> >   void glk_set_line_terminators(winid_t win, const glui32 *keycodes,
> > glui32 numkeycodes);
>
> Yes, Beyond Zork needs this to scroll the upper window.  But it wants
> the up/down arrows, which the interpreter hopefully wants to use for
> command-history, and keycodes can't specify control-up-arrow as an
> alternative...  Currently, you can request character input from the
> upper window and require the user to tab over there.  Clunky, but it's
> just one game.

I see. What would be a better solution here? Adding alternative key
codes for the game to use? Letting the game set the up/down arrows as
terminators and forcing the interpreter to use something else in that
case?

> >   void glk_unput_string(char *s);
> >   void glk_unput_string_uni(glui32 *s);
> >   void glk_unput_string_stream(strid_t str, char *s);
> >   void glk_unput_string_stream_uni(strid_t str, glui32 *s);
>
> These functions are unnecessary to implement a Z-machine interpreter.
> Your interpreter can buffer input until a newline is received, so that
> when it comes time to perform preloaded line input, you can unput from
> your buffer and never print the text to the screen.  It makes more
> sense to put this code inside all Glk-based Z-machine interpreters
> than inside all Glk implementations.

Good point, I hadn't thought of that. But on the other hand, what
happens if the game explicitly turns off buffering and expects printed
text to appear immediately?

> > New user style constants are added, style_User3 through style_User16.
>
> If we had glk_stylehint_set_temp, would anyone use user styles any
> more?

Perhaps not, but if we're going to have any user styles at all, we may
as well have a more useful number than two, right? For example, these
could store every combination of bold+italic+underline+fixed without
having to reuse the standard styles; that would save up to four
glk_stylehint_set_temp calls when the style changes.
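
For example (a hypothetical mapping, not part of the proposal), a terp
could index sixteen user styles directly by the four attribute flags:

```c
/* Hypothetical mapping from the Z-machine's four text attributes to
   one of 16 user styles, treating the flags as a 4-bit index. With
   style_User3..style_User16 available this covers every combination
   without touching the standard styles. */
unsigned int style_index_for_attrs(int bold, int italic,
                                   int underline, int fixed)
{
    return (unsigned int)((bold      ? 1 : 0)
                        | (italic    ? 2 : 0)
                        | (underline ? 4 : 0)
                        | (fixed     ? 8 : 0));   /* 0..15 */
}
```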

vw

vaporware

Feb 3, 2010, 10:33:56 PM
On Feb 3, 5:42 pm, Eliuk Blau <eliukb...@gmail.com> wrote:
> On 3 feb, 18:51, vaporware <jmcg...@gmail.com> wrote:
>
>
>
> > Hi all,
>
> > I'm assembling my own proposal for extending Glk, and I'd like to
> > solicit your feedback.
>
> > === Your Feedback ===
>
> > Are the goals and scope of this proposal worthy? Are these changes
> > sufficient to achieve them?
>
> > Interpreter/Glk library authors: do these changes seem practical to
> > implement?
>
> > Since this is not an official part of the Glk spec, I expect to have
> > to write a few patches myself (though volunteers would be appreciated,
> > especially on MacOS). What is the core set of interpreters that would
> > need to support these changes in order for them to be useful?
>
> > vw
>
> I like this! It is complete enough for me and very simple (I imagine)
> to implement. You have my vote! =3
>
> I prefer glk_stylehint_redefine() instead of glk_stylehint_set_temp().

Hmm. Would that change the definition of a style? My idea was that
glk_stylehint_set_temp() would not affect the existing style: if you
select style_Alert, set a temporary style hint to make the text red,
then select style_Alert again, you get the original alert color back.

> And I think that styles should be not limited to 16 only. I think it
> would be a good idea to allow as many styles as the programmer would
> like ... simply increasing the "number" of style, example:
>
> style_User3 to style_UserNNN (where "NNN" an undetermined number, ie,
> limited only by the maximum value of numeric variables in Inform-
> Glulx).

Yes, lifting the limit on the number of styles would be nice. On the
other hand, as Evin pointed out, user styles might not get much use
anyway with glk_stylehint_set_temp() available.

Another concern is avoiding conflicts with future style numbers: I
can't claim every unused style number for a user style unless I can be
sure they won't be used for something else in a future Glk version. On
the other hand, it looks like mainline Glk development is heading in
another direction, so maybe that won't be a problem.

A third concern is the amount of space interpreters need in order to
store style numbers. The only implementation I'm familiar with
so far is Gargoyle, which (before I added Z-machine colors) used one
byte per character to store the style number -- so a buffer with 1000
characters in it took up at least 2000 bytes total. If the number of
user styles were unbounded, that would be 5000 bytes instead (4 bytes
per character for style numbers), plus whatever extra space is needed
to track temporary style hints. We could fit up to 247 user styles in
a byte, though.

> I congratulate you for your contribution to development! =D

Thanks... I just hope this isn't too ambitious to make happen. I
suspect it'll be harder to get cooperation from interpreter and
library authors for an unofficial change like this.

vw

Andrew Plotkin

Feb 4, 2010, 12:34:43 AM
Here, vaporware <jmc...@gmail.com> wrote:
>
> I'm assembling my own proposal for extending Glk, and I'd like to
> solicit your feedback.

Without getting into the larger issue of extending or forking Glk (in
fact I haven't read the following thread messages yet)...



> void glk_set_line_terminators(winid_t win, const glui32 *keycodes,
> glui32 numkeycodes);
>
> This registers a set of key codes which will terminate line input
> (without printing a new line). The key code will be returned in the
> val2 field of the event record.

I was already planning to include this feature, and this syntax is
fine. (Just what I would have written, but I'm sure you knew that...)

The "without printing a newline" bit doesn't belong to this call,
however. There will be a separate call, glk_set_echo_line_event(),
which flips between the old behavior (print input buffer and newline)
and the opposite (don't print input buffer or newline).

For the record, the feature list for the next spec update currently
looks like:

- new style system
- glk_set_line_terminators
- glk_set_echo_line_event
- glk_buffer_normalize_uni

(That last takes a Unicode character array and converts it to
Normalization Form C. The spec says the interpreter is supposed to do
that on line input, but I'd like to rip that rule out -- it should be
done by the library, not by the terp. I don't think any terp ever
implemented it anyhow. If they did, games still aren't relying on it,
because I only released the patch to make I6 grok Unicode dictionaries
a couple of months ago.)
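
For readers unfamiliar with Normalization Form C: it replaces
base-character-plus-combining-mark sequences with precomposed
codepoints where one exists. A toy C illustration handling just one
such pair (real NFC is driven by the full Unicode composition tables):

```c
#include <stddef.h>

typedef unsigned int glui32;

/* Toy illustration of NFC composition: U+0065 'e' followed by U+0301
   (combining acute accent) becomes the single codepoint U+00E9 'é'.
   Compacts the buffer in place and returns the new length. Only this
   one pair is handled; a real normalizer consults the Unicode
   composition tables. */
size_t toy_nfc(glui32 *buf, size_t len)
{
    size_t r = 0, w = 0;
    while (r < len) {
        if (r + 1 < len && buf[r] == 0x0065 && buf[r + 1] == 0x0301) {
            buf[w++] = 0x00E9;  /* compose the pair */
            r += 2;
        } else {
            buf[w++] = buf[r++];
        }
    }
    return w;
}
```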

--Z

--
"And Aholibamah bare Jeush, and Jaalam, and Korah: these were the borogoves..."
*

Matthew Wightman

Feb 4, 2010, 3:40:43 AM

Are you intending this to be restricted to the 'special' keycodes?
If not, what is the behaviour intended to be if one of the keycodes is
a printable character? (Does that character get printed?)

>
>   void glk_unput_string(char *s);
>   void glk_unput_string_uni(glui32 *s);
>   void glk_unput_string_stream(strid_t str, char *s);
>   void glk_unput_string_stream_uni(strid_t str, glui32 *s);
>
> These remove a string from the end of the specified stream's buffer,
> or the currently selected stream's buffer, if indeed it is there. The
> Z-machine's "read" instruction expects to be able to continue line
> input using a partial line that has already been printed by the game;
> the interpreter knows what the partial text is, but is not expected to
> print it. With this function, the interpreter can unput the partial
> text and then use the regular Glk line input feature (which *does*
> print the partial text).

This sounds like it could be difficult to implement for some
implementations (e.g. cheapglk); would it be worth helping simpler glk
libraries implement this by restricting its use to (as an example)
removing strings that contain no new-lines that have been output since
the last glk_select call (thus allowing implementation by buffering
output prior to writing it to the screen), or would this be too
limiting?

>
>   void glk_stylehint_set_temp(winid_t win, glui32 styl, glui32 hint,
> glsi32 val);
>
> This sets a temporary style hint for the specified window. Unlike a
> regular style hint, this *does* take effect immediately. Temporary
> style hints take precedence over the attributes of the currently
> selected style, so selecting style_Alert and then hinting its
> foreground color to red will produce a red alert. All temporary style
> hints will be cleared upon calling glk_set_style or
> glk_set_style_stream (without changing the appearance of any text that
> was printed in the temporary style).
>
> === Gestalt Selectors ===
>
> New gestalt selectors are added:
>
> * gestalt_HasTerminators, gestalt_HasUnput, gestalt_HasTempStyleHints:
> check whether these new functions are present

Suggestion: remove the 'Has', to increase consistency with the
existing names of glk gestalt selectors.

Would it be worth defining gestalt_(Has)?Terminators(0) as returning
1 iff terminators are supported at all, while gestalt_(Has)?Terminators(x)
returns 1 iff the keycode x is supported as a line terminator?
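
A mock of that convention (the function name, the policy, and the
supported key set are all invented here) makes the two-level check
concrete:

```c
typedef unsigned int glui32;

/* Mock of the suggested convention for a Terminators gestalt:
   sel(0) answers "are terminators supported at all?", and sel(key)
   answers "is this particular keycode usable as a terminator?".
   The supported set here (the up/down arrow keycode values from
   glk.h, 0xfffffffc and 0xfffffffb) is purely illustrative. */
glui32 mock_gestalt_terminators(glui32 val)
{
    if (val == 0)
        return 1;                       /* feature present at all */
    if (val == 0xfffffffcu || val == 0xfffffffbu)
        return 1;                       /* this keycode may terminate */
    return 0;
}
```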

>
> * gestalt_SupportTempStyleHint: check which hints are supported as
> temporary style hints (interpreters may choose not to support
> temporary changes in justification, etc.)

Again, I think this would be more consistent with the naming of
existing gestalt selectors without the "Support".

>
> === Other Changes ===
>
> New user style constants are added, style_User3 through style_User16.

Would it be worth adding a new gestalt selector so that games can
easily check if these styles exist? (e.g. one returning the number of
user-styles?)

>
> === Your Feedback ===
>
> Are the goals and scope of this proposal worthy? Are these changes
> sufficient to achieve them?
>
> Interpreter/Glk library authors: do these changes seem practical to
> implement?

You will need to pin down a few more details (like gestalt selector
numbers and dispatch ids for functions, which should be allocated via
Zarf IIRC) before these could be added to a Glk implementation.

(Given that as far as I'm aware the entire set of users of my
implementation can be counted on the thumbs of one hand, please don't
take any of my feedback as necessarily being representative of any of
the common implementations; however, I wouldn't expect much of a
problem implementing your suggestions for RiscGlulxe)

>
> Since this is not an official part of the Glk spec, I expect to have
> to write a few patches myself (though volunteers would be appreciated,
> especially on MacOS). What is the core set of interpreters that would
> need to support these changes in order for them to be useful?
>
> vw

--
Matthew

vaporware

Feb 4, 2010, 6:24:30 AM
On Feb 4, 12:40 am, Matthew Wightman <matthew.wight...@gmail.com>
wrote:

> On 3 Feb, 21:51, vaporware <jmcg...@gmail.com> wrote:
> >   void glk_set_line_terminators(winid_t win, const glui32 *keycodes,
> > glui32 numkeycodes);
>
> > This registers a set of key codes which will terminate line input
> > (without printing a new line). The key code will be returned in the
> > val2 field of the event record.
>
> Are you intending this to be restricted to the 'special' keycodes?
> If not, what is the behaviour intended to be if one of the keycodes is
> a printable character? (Does that character get printed?)

I'll defer to Andrew on this, in light of the news that this is going
to be part of the official Glk spec. I imagine it will be restricted
to special key codes, which is the case on the Z-machine.

> >   void glk_unput_string(char *s);
> >   void glk_unput_string_uni(glui32 *s);
> >   void glk_unput_string_stream(strid_t str, char *s);
> >   void glk_unput_string_stream_uni(strid_t str, glui32 *s);
>
> > These remove a string from the end of the specified stream's buffer,
> > or the currently selected stream's buffer, if indeed it is there. The
> > Z-machine's "read" instruction expects to be able to continue line
> > input using a partial line that has already been printed by the game;
> > the interpreter knows what the partial text is, but is not expected to
> > print it. With this function, the interpreter can unput the partial
> > text and then use the regular Glk line input feature (which *does*
> > print the partial text).
>
> This sounds like it could be difficult to implement for some
> implementations (e.g. cheapglk); would it be worth helping simpler glk
> libraries implement this by restricting its use to (as an example)
> removing strings that contain no new-lines that have been output since
> the last glk_select call (thus allowing implementation by buffering
> output prior to writing it to the screen), or would this be too
> limiting?

That restriction would be acceptable - I only see this being useful
for the Z-machine's preloaded line input, and I wouldn't expect any
interpreters to know what to do with preloaded input containing
newlines anyway.

On the other hand, Evin pointed out that the buffering could be done
by the interpreter rather than the Glk library. Leaving this feature
out is tempting, since it only really has one use (yet feels
inconsistent with any less than four functions). Now that I think some
more about it, I think the buffering mode issue I raised earlier is a
non-issue. The other thing that I was a little worried about was the
implementation overhead of making the interpreter buffer styles, but
the Z-machine only has about 12 bits of style information per
character anyway.

> > * gestalt_HasTerminators, gestalt_HasUnput, gestalt_HasTempStyleHints:
> > check whether these new functions are present
>
> Suggestion: remove the 'Has', to increase consistency with the
> existing names of glk gestalt selectors.

Fair enough.

> Would it be worth defining gestalt_(Has)?Terminators(0) as returning 1
> iff terminators are supported at all, while gestalt_(Has)?Terminators(x)
> returns 1 iff the keycode x is supported as a line terminator?

Yes, this sounds useful, although again that's also a question for
zarf.

> > * gestalt_SupportTempStyleHint: check which hints are supported as
> > temporary style hints (interpreters may choose not to support
> > temporary changes in justification, etc.)
>
> Again, I think this would be more consistent with the naming of
> existing gestalt selectors without the "Support".

Agreed, and in that case it could be merged with the one to check
whether temporary style hints are supported at all... although 0 is a
valid style hint selector. Would it be too weird to use
gestalt_TempStyleHints(-1) to check for the feature, and
gestalt_TempStyleHints(stylehint_*) to check whether a particular
style hint is supported as a temp?

> > === Other Changes ===
>
> > New user style constants are added, style_User3 through style_User16.
>
> Would it be worth adding a new gestalt selector so that games can
> easily check if these styles exist? (e.g. one returning the number of
> user-styles?)

I suppose so. How about gestalt_UserStyles returning the highest user
style constant (so, style_User16)?

> You will need to pin down a few more details (like gestalt selector
> numbers and dispatch ids for functions, which should be allocated via
> Zarf IIRC) before these could be added to a Glk implementation.

Indeed. I will await zarf's word on which numbers are safe to use.

vw

Andrew Plotkin

Feb 4, 2010, 12:09:15 PM
Following up: You are correct that your other suggestions are not
going into the official Glk spec. I am willing to reserve call numbers
and gestalt numbers in the spec. (And there will definitely be a
HasTerminator gestalt of some sort. I hope you don't want exact specs
very soon; I'm obviously not moving very quickly...)

I am predictably concerned about the idea of forking the spec; a
format war inside Glulx would be very bad for Inform writers and
players. But I think that's an unlikely outcome. Possibly I think that
because I have an inflated view of how wonderful my own plan is. If
so, my delusions are keeping me calm for the moment. :)

Basically, I see the value in putting a Z-machine interpreter into a
Glk framework, and I don't think that work in that direction is going
to disrupt the evolution of Glulx.

Ben Cressey

Feb 4, 2010, 5:16:21 PM
> I am predictably concerned about the idea of forking the spec; a
> format war inside Glulx would be very bad for Inform writers and
> players. But I think that's an unlikely outcome. Possibly I think that
> because I have an inflated view of how wonderful my own plan is. If
> so, my delusions are keeping me calm for the moment. :)

What about adding glk_stylehint_override / glk_stylehint_set_temp to
the old Glk style system? (And possibly retracting interpolation from
Glk CSS in the bargain.) That would give authors a choice between a
"style as you go" system and a "style in advance" system, though not
all libraries would support both.

The catch here is that the development environment would somehow need
to allow authors to choose between those systems explicitly. Is
something like this planned, or is the thought that Inform will be
"Glk style hints" in one release and "Glk style sheets" in the next?

Even if the functionality were never exposed directly to Inform 7
authors, it would still be tremendously useful for libraries trying to
support styles in non-Glulx interpreters. I understand that this is
not a priority, but it would be great if all modifications at the
interpreter level could work portably between Glk implementations.

Dannii

Feb 6, 2010, 9:18:13 AM

Would just like to give my support to this proposal. A little bit of
competition can be a good thing.

David Fletcher

Feb 6, 2010, 5:50:33 PM
vaporware <jmc...@gmail.com> writes:

If I was in the Glk-extending business, one thing I would like to try
would be to make "undo" vanish the text from the turn that was undone.
This would need some sort of unput, but it would be fiddly if it had
to match the text which had been printed. So I'd say make it a
general stream truncation call which just takes a stream position. Or
alternatively, a call to insert "bookmarks" into the stream, plus an
unput-back-to-bookmark call.
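
The bookmark variant is easy to model: a mark is just a saved write
position, and unput-to-mark truncates back to it. A self-contained
sketch over a plain buffer (all names invented; a real call would act
on a Glk window stream):

```c
#include <string.h>

/* Toy model of stream bookmarks: toy_mark() records the current end
   of the stream; toy_unput_to_mark() truncates back to a recorded
   mark, discarding everything written since. */
typedef struct {
    char text[512];
    size_t len;
} toy_stream;

void toy_puts(toy_stream *s, const char *str)
{
    size_t n = strlen(str);
    memcpy(s->text + s->len, str, n);
    s->len += n;
    s->text[s->len] = '\0';
}

size_t toy_mark(const toy_stream *s)
{
    return s->len;   /* a "bookmark" is just a position */
}

void toy_unput_to_mark(toy_stream *s, size_t pos)
{
    if (pos <= s->len) {
        s->len = pos;
        s->text[pos] = '\0';
    }
}
```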

> void glk_stylehint_set_temp(winid_t win, glui32 styl, glui32 hint,
> glsi32 val);
>
> This sets a temporary style hint for the specified window. Unlike a
> regular style hint, this *does* take effect immediately. Temporary
> style hints take precedence over the attributes of the currently
> selected style, so selecting style_Alert and then hinting its
> foreground color to red will produce a red alert. All temporary style
> hints will be cleared upon calling glk_set_style or
> glk_set_style_stream (without changing the appearance of any text that
> was printed in the temporary style).
>

Does the call need to take a style number? From your description, it
seems like there wouldn't be any point setting a temporary hint on a
style other than the current style.

> === Gestalt Selectors ===
>
> New gestalt selectors are added:
>
> * gestalt_HasTerminators, gestalt_HasUnput, gestalt_HasTempStyleHints:
> check whether these new functions are present
>
> * gestalt_SupportTempStyleHint: check which hints are supported as
> temporary style hints (interpreters may choose not to support
> temporary changes in justification, etc.)
>
> === Other Changes ===
>
> New user style constants are added, style_User3 through style_User16.
>
> === Your Feedback ===
>
> Are the goals and scope of this proposal worthy? Are these changes
> sufficient to achieve them?
>

I like the idea of having a complete Z-machine based on Glk plus
minimal extensions. I don't much like having this style extension
available in Glulx as well as a new CSS system.

If you only wanted to support the Z-machine, you could add the C
functions but leave them out of the dispatch layer. But you've said
your secondary goal is to have the style extension in Glulx, so never
mind.

> Interpreter/Glk library authors: do these changes seem practical to
> implement?
>

Yeah, not too bad. Maybe I'd put them into QGlk, eventually, despite
being vaguely disapproving. Not that you should care about QGlk at
this point, since it doesn't have many users yet.

> Since this is not an official part of the Glk spec, I expect to have
> to write a few patches myself (though volunteers would be appreciated,
> especially on MacOS). What is the core set of interpreters that would
> need to support these changes in order for them to be useful?
>
> vw

--
David Fletcher.

vaporware

Feb 6, 2010, 6:25:23 PM
On Feb 6, 2:50 pm, David Fletcher <david-n...@bubblycloud.com> wrote:

> vaporware <jmcg...@gmail.com> writes:
> >   void glk_unput_string(char *s);
> >   void glk_unput_string_uni(glui32 *s);
> >   void glk_unput_string_stream(strid_t str, char *s);
> >   void glk_unput_string_stream_uni(strid_t str, glui32 *s);
>
> > These remove a string from the end of the specified stream's buffer,
> > or the currently selected stream's buffer, if indeed it is there. The
> > Z-machine's "read" instruction expects to be able to continue line
> > input using a partial line that has already been printed by the game;
> > the interpreter knows what the partial text is, but is not expected to
> > print it. With this function, the interpreter can unput the partial
> > text and then use the regular Glk line input feature (which *does*
> > print the partial text).
>
> If I was in the Glk-extending business, one thing I would like to try
> would be to make "undo" vanish the text from the turn that was undone.
> This would need some sort of unput, but it would be fiddly if it had
> to match the text which had been printed.  So I'd say make it a
> general stream truncation call which just takes a stream position.  Or
> alternatively, a call to insert "bookmarks" into the stream, plus an
> unput-back-to-bookmark call.

This is an interesting idea, but I don't think it would work for the
same purpose: the interpreter doesn't know when the game is about to
print the text it's going to use for preloaded input, so I think the
only way to set the bookmark at the right place would be to buffer the
whole line before sending it off to Glk -- and if the terp is doing
that, it doesn't need unput anyway.

It would work for other situations, as you say. But perhaps that could
be accomplished without defining a new function, by extending
glk_stream_set_position to work with window streams?

> >   void glk_stylehint_set_temp(winid_t win, glui32 styl, glui32 hint,
> > glsi32 val);
>
> > This sets a temporary style hint for the specified window. Unlike a
> > regular style hint, this *does* take effect immediately. Temporary
> > style hints take precedence over the attributes of the currently
> > selected style, so selecting style_Alert and then hinting its
> > foreground color to red will produce a red alert. All temporary style
> > hints will be cleared upon calling glk_set_style or
> > glk_set_style_stream (without changing the appearance of any text that
> > was printed in the temporary style).
>
> Does the call need to take a style number?  From your description, it
> seems like there wouldn't be any point setting a temporary hint on a
> style other than the current style.

Good catch! No, it doesn't need the "styl" parameter; temporary hints
are always applied on top of the current style.

> I like the idea of having a complete Z-machine based on Glk plus
> minimal extensions.  I don't much like having this style extension
> available in Glulx as well as a new CSS system.
>
> If you only wanted to support the Z-machine, you could add the C
> functions but leave them out of the dispatch layer.  But you've said
> your secondary goal is to have the style extension in Glulx, so never
> mind.

Indeed. I would like this to be available to games, so the Z-machine
styling extensions can be ported to Glulx, and new Glulx styling
extensions can have full access to the available styles (and do the
things I've griped about not being able to do with the CSS system,
like compute colors at runtime).

vw

namekuseijin

Feb 6, 2010, 9:25:00 PM
On 6 Feb, 12:18, Dannii <curiousdan...@gmail.com> wrote:
> Would just like to give my support to this proposal. A little bit of
> competition can be a good thing.

Competition is good. Cordiality too.

Eliuk Blau

Feb 6, 2010, 10:11:21 PM
Hello. =) This message basically sums up what I think is missing from
Glk styles. I quote it here hoping that it may be a small
contribution, because it is difficult for me to express myself
properly in English. The message is not much, but it seems appropriate
to appear here as well. =D

http://groups.google.com/group/rec.arts.int-fiction/msg/dfc8b3717ae356bc

Saludos a todos! =)

Eliuk Blau
Feb 6, 2010, 10:19:25 PM

Oh, and I would add the possibility to make borders between windows
visible or invisible programmatically, as mentioned in the original
proposal. :-)

Saludos!

Andrew Plotkin
Feb 7, 2010, 1:36:18 AM

Here, Ben Cressey <bcre...@gmail.com> wrote:
> > I am predictably concerned about the idea of forking the spec; a
> > format war inside Glulx would be very bad for Inform writers and
> > players. But I think that's an unlikely outcome. Possibly I think that
> > because I have an inflated view of how wonderful my own plan is. If
> > so, my delusions are keeping me calm for the moment. :)
>
> What about adding glk_stylehint_override / glk_stylehint_set_temp to
> the old Glk style system? (And possibly retracting interpolation from
> Glk CSS in the bargain.) That would give authors a choice between a
> "style as you go" system and a "style in advance" system, though not
> all libraries would support both.

As I said, I'm pretty sure stylehints are useless the way they are,
and not just because of the static-ness of them. You can't represent
sizes or distances in any meaningful way. You can't represent fonts.
Color is about the only attribute they *can* represent fully.



> The catch here is that the development environment would somehow need
> to allow authors to choose between those systems explicitly. Is
> something like this planned, or is the thought that Inform will be
> "Glk style hints" in one release and "Glk style sheets" in the next?

There will be a Glk call to request the new system or the old system.
I imagine that some future version of Inform will request the new
system by default.

(If I can't convince Graham to make the stylesheet model the default,
that's a big vote against it...)



> Even if the functionality were never exposed directly to Inform 7
> authors, it would still be tremendously useful for libraries trying to
> support styles in non-Glulx interpreters. I understand that this is
> not a priority, but it would be great if all modifications at the
> interpreter level could work portably between Glk implementations.

The more different the models, the more work is involved in making a
given implementation support both. It would indeed be ideal if all
libraries supported both models -- but I'm not sanguine about that
happening.

Andrew Plotkin
Feb 7, 2010, 1:42:21 AM

Here, Eliuk Blau <eliu...@gmail.com> wrote:
>
> Oh, and I would add the possibility to make borders between windows
> visible or invisible programmatically, as mentioned in the original
> proposal. :-)

Did I leave that out of my most recent feature list? Yes, that's still
planned.

vaporware
Feb 7, 2010, 2:44:09 PM

On Feb 6, 10:36 pm, Andrew Plotkin <erkyr...@eblong.com> wrote:

> Here, Ben Cressey <bcres...@gmail.com> wrote:
>
> > > I am predictably concerned about the idea of forking the spec; a
> > > format war inside Glulx would be very bad for Inform writers and
> > > players. But I think that's an unlikely outcome. Possibly I think that
> > > because I have an inflated view of how wonderful my own plan is. If
> > > so, my delusions are keeping me calm for the moment. :)
>
> > What about adding glk_stylehint_override / glk_stylehint_set_temp to
> > the old Glk style system? (And possibly retracting interpolation from
> > Glk CSS in the bargain.) That would give authors a choice between a
> > "style as you go" system and a "style in advance" system, though not
> > all libraries would support both.
>
> As I said, I'm pretty sure stylehints are useless the way they are,
> and not just because of the static-ness of them. You can't represent
> sizes or distances in any meaningful way. You can't represent fonts.
> Color is about the only attribute they *can* represent fully.

stylehint_Size uses a meaningless scale, but you could add a style
hint that uses points or ems instead: the current set of style hints,
not the style hint API, is what prevents the meaningful specification
of sizes and distances. Selecting a font by name would require passing
a string out of the game with a separate call, but sticking with
integers, you could at least suggest a font type (serif, sans serif,
script).

> There will be a Glk call to request the new system or the old system.
> I imagine that some future version of Inform will request the new
> system by default.
>
> (If I can't convince Graham to make the stylesheet model the default,
> that's a big vote against it...)

For what it's worth, I would be happy to lend my support to a
stylesheet model that included temporary off-sheet changes (a la
glk_stylehint_set_temp).

vw

Carlos Sánchez
Feb 7, 2010, 7:06:23 PM

Hi,

I would like to point out something not related to styles that I
think Glk lacks, and for me it is the only really important thing Glk
lacks: full screen.

Text games are probably the only games nowadays that open in windowed
mode. When the game is just text based that doesn't seem to be a big
problem, but when it is a rich multimedia game, it makes it look less
professional (imo).

My proposal is to allow the game to request full screen at a given
resolution; of course the specification may say that the request can
be ignored if the interpreter can't honor it (and a gestalt selector
could let the game check beforehand). Interpreters could also offer a
setup option to ignore the request even when it is possible.
Interpreters like the JavaScript one you are writing will probably
have to ignore it, but glulxe and others can benefit from it and make
games look more professional.

Ben Cressey
Feb 8, 2010, 6:42:35 PM

> stylehint_Size uses a meaningless scale, but you could add a style
> hint that uses points or ems instead: the current set of style hints,
> not the style hint API, is what prevents the meaningful specification
> of sizes and distances. Selecting a font by name would require passing
> a string out of the game with a separate call, but sticking with
> integers, you could at least suggest a font type (serif, sans serif,
> script).

stylehint_Size seems fine the way it is. Gargoyle doesn't implement it
at all; a quick glance at Windows Glk suggests that it defines the
value in terms of points, and I'd probably follow that convention if I
were to add the functionality. I really dislike the notion of games
specifying font sizes directly; it assumes a common, desktop-centric
presentation that can't help but break down on mobile devices with
limited screen space, or in cases where users want larger default
fonts for readability purposes.

I like the idea of a stylehint_FontFamily for selecting between serif,
sans serif, and script.

If we're serious about additional user styles, I would like to see an
independent mechanism for storing and modifying these new styles,
entirely distinct from the current system:

void glk_set_global_style(glui32 val);
void glk_set_global_style_stream(strid_t str, glui32 val);

void glk_global_stylehint_set(glui32 styl, glui32 hint, glsi32 val);
void glk_global_stylehint_clear(glui32 styl, glui32 hint);

glui32 glk_global_style_distinguish(glui32 styl1, glui32 styl2);
glui32 glk_global_style_measure(glui32 styl, glui32 hint, glui32 *result);

These global styles would be shared between all windows (existing and
newly opened) of all types, and hints would take effect immediately in
all windows, probably triggering a redraw event.

This would allow more efficient storage of dynamic styles - one set
for all windows, rather than one set per window - and buffered
characters would only need one additional bit of attribute storage, to
indicate whether the style value referenced a local or global style.
It also avoids the issues that come from extending and potentially
redefining the existing range of styles.

I dislike the notion of "temporary" styles because it presumes that
styles are somehow transient. But the library has to persist style
information for the lifespan of a window, so that it is prepared to
redraw everything in response to user-driven events like a window
refresh. If games are allowed to set a style, print some text, then
set a temporary style derived from the first, the library will have to
allocate storage for the temporary style exactly as if it were a
regularly defined one. And each successive temporary style adds
another set of attributes to store and uniquely identify for later
retrieval.

Given that the styles must be persistently stored and uniquely
identified in any case, I'd rather turn that bookkeeping over to the
author. At least that way, the "temporary" style is available for
subsequent reuse. If we need some sort of progressive modification or
inheritance mechanism, I would propose:

glui32 glk_global_style_copy(glui32 styl1, glui32 styl2);

The first would copy the attributes of source styl1 to destination
styl2. The desired changes could then be applied to styl2 with
successive calls to glk_global_stylehint_set().

Ben Cressey
Feb 8, 2010, 7:51:38 PM

> glui32 glk_global_style_copy(glui32 styl1, glui32 styl2);
>
> The first would copy the attributes of source styl1 to destination
> styl2. The desired changes could then be applied to styl2 with
> successive calls to glk_global_stylehint_set().

I meant to suggest a second function:

void glk_global_stylehint_temp(winid_t win, glui32 hint, glsi32 val);

This would copy the current style attributes (whether local or global)
to the next unused global style, apply the supplied hint, then make
the new global style the current style in the specified window.

To avoid clashing with author-defined global styles, we would need
another function to specify how global styles should be allocated:

void glk_set_global_style_mode(glui32 val);

This would have to be called before any windows have been opened, and
the behavior of global styles would be immutable once set:
- 0 disables global styles.
- 1 turns on author-managed global styles, disabling calls to
glk_global_stylehint_temp().
- 2 turns on library-managed global styles, disabling calls to the
other functions.

This is similar in principle to the proposed selector between the Glk
CSS and stylehint models, and should imitate its syntax and behavior
with respect to the supplied value.

vaporware
Feb 9, 2010, 9:23:22 PM

That's one possible implementation, in which each character has an
associated style number but no other information, but it's not the
only one. For instance, each character could have an associated color
as well (which is basically what I did for Demona), and then the
library could support temporary changes of color simply by changing
the color attribute for future characters. A library that stored the
buffer as a linked list of styled spans rather than an array of styled
characters would also have an easy time with temporary styles, I
think.

> Given that the styles must be persistently stored and uniquely
> identified in any case, I'd rather turn that bookkeeping over to the
> author. At least that way, the "temporary" style is available for
> subsequent reuse.

I'm concerned that this will be impractical when "the author" really
means "an Inform extension running in the VM". A game that uses a lot
of foreground/background/font combinations could easily require
thousands of styles; reusing them would mean storing that huge list in
VM memory and slowly searching through it. If the library is
responsible for implementing temporary styles, then that bookkeeping
may not be necessary at all (depending on the library design), and
even if it is, at least it can be done in native code.

vw

namekuseijin
Feb 9, 2010, 10:45:26 PM

On Feb 7, 22:06, Carlos Sánchez <csanche...@gmail.com> wrote:
> I would like to point out something not related to styles that I
> think Glk lacks, and for me it is the only really important thing
> Glk lacks: full screen.
>
> Text games are probably the only games nowadays that open in
> windowed mode. When the game is just text based that doesn't seem to
> be a big problem, but when it is a rich multimedia game, it makes it
> look less professional (imo).

heh, I'm sure my full-screen pure Linux console running frotz doesn't
look "professional" either. ;)

> Interpreters like the JavaScript one you are writing will probably
> have to ignore the request

I'm sure you know that in most browsers you can just hit F11 to
switch to fullscreen mode.

Andrew Plotkin
Feb 9, 2010, 11:26:11 PM

Here, namekuseijin <nameku...@gmail.com> wrote:

This seems like something that should be interpreter-wide in any
event, and not have to be requested (or even cared about) by any
particular game.

Carlos Sánchez
Feb 10, 2010, 4:06:30 AM

> This seems like something that should be interpreter-wide in any
> event, and not have to be requested (or even cared about) by any
> particular game.

Full screen is a common request from Superglus users that I cannot
satisfy myself, as Glk is unable to provide it. For me it is
something requested often, probably the most requested feature.

Andrew Plotkin
Feb 10, 2010, 11:44:57 AM

Then interpreters should definitely do it. File your feature requests
now.

Putting this feature into the Glk spec would make sense if:

- Some games should run full-screen and others shouldn't
- This should be the author's decision rather than the player's
- All games published prior to 2010 should *not* run full-screen
- Only Glulx games should run full-screen

All of those are false.

Ben Cressey
Feb 10, 2010, 2:35:23 PM

> That's one possible implementation, in which each character has an
> associated style number but no other information, but it's not the
> only one.

True, but implementing a global pool of styles shared commonly between
windows will take the library one step closer to supporting Glk CSS,
which would appear to require something similar. Storing style
information for each character moves it further away; I just don't see
that as an efficient approach in the CSS world.

I'd like the new temporary styles to fall somewhere along the
continuum between static, local styles and static, global styles.
Static, ad-hoc global styles seem like the best conceptual fit. By
implementing temporary styles on top of that, you automatically
support an unlimited number of user styles, which is something that's
been requested. Best of all, you do it without any incompatible
modifications to the original style hint system; there's no need to
add more user styles or to treat the original Glk styles as mutable
structures.

> I'm concerned that this will be impractical when "the author" really
> means "an Inform extension running in the VM". A game that uses a lot
> of foreground/background/font combinations could easily require
> thousands of styles; reusing them would mean storing that huge list in
> VM memory and slowly searching through it.

That is where the library managed mode for global styles would come
in; calls to glk_global_stylehint_temp() would create the styles
dynamically without the VM needing to refer to specific style numbers.
It presents some challenges for the library in terms of efficiently
allocating and cleaning up temporary styles, but those are not
insurmountable.

Carlos Sánchez
Feb 10, 2010, 2:55:50 PM

> Putting this feature into the Glk spec would make sense if:
>
> - Some games should run full-screen and others shouldn't
> - This should be the author's decision rather than the player's
> - All games published prior to 2010 should *not* run full-screen
> - Only Glulx games should run full-screen
>
> All of those are false.

I think the decision to request full screen is an author's right,
like requesting styles for the text or requesting to play a sound:
the author should decide how he wants the game to look, though the
player may also be able to override the author's opinion and impose
their own, and obviously hardware limitations will also constrain the
author's design (as they do right now, since some interpreters won't
play sounds, or accept style changes, etc.). So for me the first two
sentences in the list are not false. In fact, if you applied those
sentences to all Glk features, there would be no sound, images,
styles, hyperlinks, etc. (some games should play music and some
shouldn't, some games should change styles and some shouldn't, etc.).
In summary, I can't see any difference between asking Glk to set full
screen and, for instance, asking Glk to play a sound file, if both
can be overridden by hardware limitations and by the player's
interpreter setup.

I don't understand what you mean with the third sentence, nor the
fourth. Or maybe I just don't understand why a reason for not adding
a capability may be "that capability has not been there before" :)


Carlos.

mwigdahl
Feb 10, 2010, 3:33:00 PM

I would rephrase Andrew's first statement as "Some games should _be
able to_ run full screen and others shouldn't". What he's getting at
is that if it's a Glk feature, games compiled against older versions
of Glk can't benefit from the feature. Whereas if it's done at the
interpreter level, appropriate behavior can be retrofitted for older
games.

And I'm surprised you beat Jim to the punch on asserting author's
prerogative. I would have put money on him being first in on that
one... :)

Matt

Ben Cressey
Feb 10, 2010, 3:39:50 PM

> I don't understand what you mean with the third sentence, nor the
> fourth. Or maybe I just don't understand why a reason for not adding
> a capability may be "that capability has not been there before" :)

Interpreter packages like Gargoyle, Zoom, Spatterlight and so forth
could conceivably offer a full-screen mode; players would not want
that mode artificially restricted to a certain type of game (Glulx)
made after a certain point in time (2010) when the Glk API method
became available. They would want to set full-screen mode whenever
they felt like it, regardless of the game's format. So it's properly a
feature request for the application that opens the game file, as
distinct from the Glk library or the Glulx virtual machine
interpreter.

If you open a bug report on the Gargoyle project site, the issue will
at least be on my radar.

Andrew Plotkin
Feb 10, 2010, 3:43:43 PM

Here, Ben Cressey <bcre...@gmail.com> wrote:
> > I don't understand what you mean with the third sentence, nor the
> > fourth. Or maybe I just don't understand why a reason for not adding
> > a capability may be "that capability has not been there before" :)
>
> Interpreter packages like Gargoyle, Zoom, Spatterlight and so forth
> could conceivably offer a full-screen mode; players would not want
> that mode artificially restricted to a certain type of game (Glulx)
> made after a certain point in time (2010) when the Glk API method
> became available. They would want to set full-screen mode whenever
> they felt like it, regardless of the game's format. So it's properly a
> feature request for the application that opens the game file, as
> distinct from the Glk library or the Glulx virtual machine
> interpreter.

Yes, thank you. That's what I was trying to say.

It's reasonable for authors to want this full-screen feature for their
games. But presumably you want it for *all* games, not just your own!

> If you open a bug report on the Gargoyle project site, the issue will
> at least be on my radar.

--Z

Erik Temple
Feb 10, 2010, 4:08:40 PM

On Wed, 10 Feb 2010 14:39:50 -0600, Ben Cressey <bcre...@gmail.com> wrote:

>> I don't understand what you mean with the third sentence, nor the
>> fourth. Or maybe I just don't understand why a reason for not adding
>> a capability may be "that capability has not been there before" :)
>
> Interpreter packages like Gargoyle, Zoom, Spatterlight and so forth
> could conceivably offer a full-screen mode; players would not want
> that mode artificially restricted to a certain type of game (Glulx)
> made after a certain point in time (2010) when the Glk API method
> became available. They would want to set full-screen mode whenever
> they felt like it, regardless of the game's format. So it's properly a
> feature request for the application that opens the game file, as
> distinct from the Glk library or the Glulx virtual machine
> interpreter.

The author could get some control over fullscreen if interpreter authors
agreed to support a new fullscreen value(s) for the Treaty of Babel's
<presentationprofile> meta tag. (Officially, this tag's values are "plain
text" or "multimedia", but custom values are allowed.)

As a player, I'd be cool with fullscreen *if* the actual game window(s)
don't go full width/height as well; in other words, the whole screen goes
black, but only a portion of the screen is devoted to the game windows. (I
can't think of anything worse than text spanning the entire width of the
screen, but it would be nice to have the rest of the screen blacked out.)
Authors could suggest a size for the rendered portion of the screen using
the Treaty of Babel's <height> and <width> metadata tags, but players also
ought to have some control over it.

(Speaking of, are there any interpreters that actually respect the height
and width tags? The expected behavior sounds great--particularly the
ability for the author to constrain the aspect ratio--but the few test
attempts I made a few months ago didn't indicate much compliance...)

--Erik

namekuseijin
Feb 10, 2010, 4:32:18 PM

On Feb 10, 17:55, Carlos Sánchez <csanche...@gmail.com> wrote:
> I think the decision to request full screen is an author's right,
> like requesting styles for the text or requesting to play a sound:
> the author should decide how he wants the game to look,

Whatever. All text games "look" like a stream of characters to me.
And personally, I'd rather read paragraphs with lots of lines than
lots of sentences in a single outstretched long line...

Carlos Sánchez
Feb 10, 2010, 4:42:35 PM

> It's reasonable for authors to want this full-screen feature for their
> games. But presumably you want it for *all* games, not just your own!

No no, I think I've been misunderstood. My proposal is something
like this:

glui32 glk_fullscreen(glui32 *width, glui32 *height);

Request the interpreter to set full-screen mode at a given resolution.
It returns TRUE if it was possible to set the resolution, and FALSE
otherwise.

So I'm not asking all games to set full screen; if an author never
calls glk_fullscreen, the game will never request it. Even in
Superglus, I would add a command calling glk_fullscreen so the author
can decide whether to use it or not. Obviously, no game from before
glk_fullscreen is added to the spec can use it, just as no game from
before Ogg support was added to Blorb could use Ogg files. Still, I
can't accept that a reason for not adding a capability is "that would
be unfair to old games" :)

>
> > If you open a bug report on the Gargoyle project site, the issue will
> > at least be on my radar.

I will, but those solutions look like patches to me, to be honest. An
author wanting his/her game shown at a given resolution would then
have to provide a garglk.ini file together with the blorb file, plus
a gamename.ini file for Windows Glulxe, plus a whatever.ini for Zoom,
plus another ini file for Zag, etc. And still, he cannot be sure that
all interpreters supporting full screen will try to set it, as he/she
may not know all existing interpreters. Also, five years later there
may be a bunch of new interpreters, and none of them will open the
game in full screen, as they were unknown to the author the day the
game was released. None of this is needed if glk_fullscreen exists: a
game requesting full screen will be set to full screen by every
interpreter that decides to implement the feature.

vaporware
Feb 10, 2010, 4:48:16 PM

On Feb 10, 11:35 am, Ben Cressey <bcres...@gmail.com> wrote:
> > That's one possible implementation, in which each character has an
> > associated style number but no other information, but it's not the
> > only one.
>
> True, but implementing a global pool of styles shared commonly between
> windows will take the library one step closer to supporting Glk CSS,
> which would appear to require something similar. Storing style
> information for each character moves it further away; I just don't see
> that as an efficient approach in the CSS world.
>
> I'd like the new temporary styles to fall somewhere along the
> continuum between static, local styles and static, global styles.
> Static, ad-hoc global styles seem like the best conceptual fit. By
> implementing temporary styles on top of that, you automatically
> support an unlimited number of user styles, which is something that's
> been requested. Best of all, you do it without any incompatible
> modifications to the original style hint system; there's no need to
> add more user styles or to treat the original Glk styles as mutable
> structures.

The advantage I see to this approach is that it addresses the request
for more user styles without the possibility of conflicting with the
style_* namespace. On the other hand, it seems unlikely that any more
Glk styles are going to be defined, now that Glk is going in another
direction.

A serious disadvantage is that it locks the library into a certain set
of implementations. For instance, since global style hints take effect
immediately and may trigger a redraw, changing a global style may
cause previously printed text to change appearance... which means you
need to keep a Glk style number associated with all text in the
buffer... which means you can't implement text buffer windows using
standard rich text controls that only store style information.

I would rather treat the Glk style number as shorthand for a set of
style attributes. If the library author decides the best way to keep
track of those attributes is to retain the style number in the buffer,
and the best way to implement temporary style changes is to allocate
new style numbers, he can do that, but I'd rather not expose those
implementation details to the game.

A lesser disadvantage is simply that it's a bigger change. Maybe this
is pessimistic, but I still expect to have to write a few patches
myself: I'd rather add one function than six, and I'd like to have the
flexibility to implement it in whichever way is quickest for the
particular library.

> > I'm concerned that this will be impractical when "the author" really
> > means "an Inform extension running in the VM". A game that uses a lot
> > of foreground/background/font combinations could easily require
> > thousands of styles; reusing them would mean storing that huge list in
> > VM memory and slowly searching through it.
>
> That is where the library managed mode for global styles would come
> in; calls to glk_global_stylehint_temp() would create the styles
> dynamically without the VM needing to refer to specific style numbers.
> It presents some challenges for the library in terms of efficiently
> allocating and cleaning up temporary styles, but those are not
> insurmountable.

Yes, that would work. But it has the same effect as
glk_stylehint_set_temp(), and I'm not convinced that anything is
gained from specifying how that effect is to be implemented.

vw

Andrew Plotkin
Feb 10, 2010, 5:34:46 PM

Here, Carlos Sánchez <csanc...@gmail.com> wrote:
> > It's reasonable for authors to want this full-screen feature for their
> > games. But presumably you want it for *all* games, not just your own!
>
> No no, I think I've been misunderstood. My proposal is something
> like this:
>
> glui32 glk_fullscreen(glui32 *width, glui32 *height);
>
> Request the interpreter to set full-screen mode at a given resolution.
> It returns TRUE if it was possible to set the resolution, and FALSE
> otherwise.

I understand the request, yes. I have rejected it, on the grounds that
this behavior should not be game-specific.

(It fits badly anyhow. Game runtime is way too late to be making
decisions about the display mode. If the game *did* have a way to
convey this request, it would be in the Blorb display metadata, as
suggested in an earlier post.)

> And still, he cannot be sure that all interpreters supporting full
> screen will try to set it, as he/she may not know all existing
> interpreters.

That will be true no matter what the spec says.

> Also, five years later there may be a bunch of new interpreters, and
> none of them will open the game in full screen, as they were unknown
> to the author the day the game was released.

That will be true no matter what the spec says.

> None of this is needed if glk_fullscreen exists: a game requesting
> full screen will be set to full screen by every interpreter that
> decides to implement the feature.

That's circular reasoning.

If you think IF should be played full-screen, convince interpreter
authors to make their interpreters work that way. I support that. I
think it's a good idea. The Glk spec is not a tool in that argument.
I am not the arbiter of all IF UI decisions!

(Think about command history -- the up and down arrow keys. You don't
use it because the spec has a way to switch command history on and
off. It exists because interpreters support it. Interpreters support
it because it's a good idea. Having a glk_use_command_history() call
would make IF-playing *worse*, because then some games wouldn't have
it, and interpreters would be that much harder to implement
"correctly".)

Stuart Allen
Feb 10, 2010, 5:36:09 PM

This is probably a highly ill-conceived idea, but I thought I'd just
put it out there... A while ago I made a web front-end for my IF
games that gave a very basic ability to play over the net. For example:
http://jacl.game-host.org:8080/fastcgi-bin/grail.jacl This is done in
an entirely server-side way as opposed to the modern trend of lots of
Javascript giving desktop-like functionality. There are obvious
disadvantages to this, but it does leave the author free to use all
the power that HTML offers. For example, this is made using the same
system: http://jacl.game-host.org:8080/fastcgi-bin/blackjacl.jacl

The point to all this is that I would be interested in implementing
this as a Glk library that other interpreters could be compiled
against. As far as I can see, the only facility I am missing is the
ability for the library to tell the interpreter to save and restore
the game state. My current model is to generate a unique user ID for
each player (or grab the value of REMOTE_USER) and restore their
personal save file, process the command, save the state again then
return the HTML page, i.e. one instance of the running game serves all
requests (well, FastCGI creates more processes under heavy load but
that's transparent.)

So my initial thought is that the interpreter should have some way of
registering the function calls to save and restore state with the Glk
library. Any ideas on a better way to implement this?

Stuart

Ben Cressey
Feb 10, 2010, 6:33:35 PM

> A serious disadvantage is that it locks the library into a certain set
> of implementations. For instance, since global style hints take effect
> immediately and may trigger a redraw, changing a global style may
> cause previously printed text to change appearance... which means you
> need to keep a Glk style number associated with all text in the
> buffer... which means you can't implement text buffer windows using
> standard rich text controls that only store style information.

Standard rich text controls?! What luxury!

I see your point, though. Suppose global style hints became immutable
on first use; that is, the first call to glk_set_global_style would
prevent any future calls to glk_global_stylehint_set with that value
from having any effect. Possibly such calls could even be illegal at
that point.

That seems like it would sidestep the implementation difficulties; the
library would then be free to store the style information either
locally or globally, as it sees fit. Authors lose the ability to
adjust that style, but with an unlimited number of other styles to
manipulate, that shouldn't be a tragic loss.


> A lesser disadvantage is simply that it's a bigger change. Maybe this
> is pessimistic, but I still expect to have to write a few patches
> myself: I'd rather add one function than six, and I'd like to have the
> flexibility to implement it in whichever way is quickest for the
> particular library.

Granted, six is more than one, but the global stylehint functions would
be fairly similar to the existing stylehint ones, and could share most
of the same code. glk_global_stylehint_temp() would be straightforward
to implement with those functions in place.

Carlos Sánchez

Feb 10, 2010, 7:03:18 PM
Hi

> I understand the request, yes. I have rejected it, on the grounds that
> this behavior should not be game-specific.

I can't think of anything more game-specific, from my point of view,
than this, but anyway, the best thing about points of view is that
everybody has one.

> (It fits badly anyhow. Game runtime is way too late to be making
> decisions about the display mode. If the game *did* have a way to
> convey this request, it would be in the Blorb display metadata, as
> suggested in an earlier post.)

Ahem...yeah, changing from windowed mode to full-screen is a huge
challenge nowadays :P

> > And still, he can not be sure that all interpreters supporting full
> > screen will try to set it, as he/she maybe doesn't know all existing
> > interpreters.
>
> That will be true no matter what the spec says.

Incorrect. If the spec says it is possible, it is more likely that
interpreters will provide the option to support it where the platform
allows, so the author doesn't have to worry about which interpreters
he has to write an ini or cfg file for; authors can just request
full-screen, and every interpreter able to do it will do it, like when
the game requests playing a sound file. Can't you see how ridiculous
it is to have to zip the blorb file together with several config
files, one for each interpreter? Maybe Glk is not the way, maybe there
is another, but the solution cannot be having a folder like this:

game.blb
game.ini
zoom.ini
...
garglk.ini


> > Also, 5 years later there may be a bunch of new interpreters and
> > none of them will open the game in full screen as they were unknown
> > to the author the day the game was issued.
>
> That will be true no matter what the spec says.

I seriously doubt it; so far, additions to the spec have been adopted
fairly quickly by the implementations that were able to adopt them.

> > Nothing of all this is needed if glk_fullscreen exists, and a game
> > requesting full screen will be set to full screen by every
> > interpreter who decides to implement full screen feature.
>
> That's circular reasoning.

Circular reasoning is saying a feature is not welcome because it was
not there before. No one asked to remove the full-screen capabilities
that existing interpreters have (maybe just WinGlulxe, and that is not
real full screen, just a screen at a given size with no border); leave
those as they are, so old games can still be played full screen on
WinGlulxe, but make it easier for future games. Old games couldn't
play Ogg and there is no drama about it.

> If you think IF should be played full-screen, convince interpreter
> authors to make their interpreters work that way. I support that. I
> think it's a good idea. The Glk spec is not a tool in that argument.
> I am not the arbiter of all IF UI decisions!

The question is why this makes you an arbiter of anything if it is
optional. I still don't get it, honestly. Anyway, my original idea is
to make it possible for the author to request full screen; whether
that should be done by Glk or by some other standardized display
request method, I don't mind, but having one thousand ini files to
make sure your game requests full screen is just nonsense. It can't be
done that way.

> (Think about command history -- the up and down arrow keys. You don't
> use it because the spec has a way to switch command history on and
> off. It exists because interpreters support it. Interpreters support
> it because it's a good idea. Having a glk_use_command_history() call
> would make IF-playing *worse*, because then some games wouldn't have
> it, and interpreters would be that much harder to implement
> "correctly".)

I can't see the relationship with command history, but anyway. What
would a Blorb-based implementation (or any other standardized method)
of a feature like this look like?

/C

Carlos Sánchez

Feb 10, 2010, 7:09:31 PM

> If you open a bug report on the Gargoyle project site, the issue will
> at least be on my radar.

Thanks for the offer, but I'm looking for a standardized method. I
don't want to be chasing all current and future interpreter developers
to make them do something in their config files like WinGlulxe does.
And I don't like the idea of having to bundle one thousand different
ini files with every game just to be sure that, if the player
eventually opens the game with interpreter X, there will be an X.ini
file there to request some resolution.

If we can talk about a standardized method, whether within Glk or not,
that is something I would be very interested in, but doing it
interpreter by interpreter isn't worth the effort, to be honest :)

Carlos Sánchez

Feb 10, 2010, 7:21:13 PM

> The author could get some control over fullscreen if interpreter authors  
> agreed to support a new fullscreen value(s) for the Treaty of Babel's  
> <presentationprofile> meta tag. (Officially, this tag's values are "plain  
> text" or "multimedia", but custom values are allowed.)

This is an interesting approach, though again, if it is not explicitly
set out in point 5.10.2.1 of the Treaty of Babel, no interpreter
developer will ever consider it. I'm not saying, of course, that they
will adopt it immediately if it appears there, but there is a chance
if it is kind of official.
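For illustration, under the approach quoted above a game's iFiction record might carry a custom value like this ("fullscreen" is hypothetical; the Treaty only defines "plain text" and "multimedia", and the exact placement within the record would follow whatever 5.10.2.1 specifies):

```xml
<!-- hypothetical custom value; only "plain text" and "multimedia"
     are defined by the Treaty of Babel -->
<presentationprofile>fullscreen</presentationprofile>
```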

> As a player, I'd be cool with fullscreen *if* the actual game window(s)  
> don't go full width/height as well; in other words, the whole screen goes  
> black, but only a portion of the screen is devoted to the game windows. (I  
> can't think of anything worse than text spanning the entire width of the  
> screen, but it would be nice to have the rest of the screen blacked out.)  

Well, that may be an option (fullscreen | windowed | mixed); yours is
the last. But you are thinking of the usual small lettering, and some
authors decide they like bigger letters. Check this:

http://www.caad.es/modulos.php?modulo=descarga&id=1704

Sorry, Spanish game, but it gives you an idea of why this game may
perfectly well be full screen even at 1280x1024. Obviously, to see the
proper letter size you have to have the garglk.ini file (for Gargoyle)
or the cfg file (for WinGlulxe); that is another problem imo, but
maybe it can be solved with all the style discussion in this thread.

Rikard Peterson

Feb 10, 2010, 8:09:31 PM
In article
<686cc45f-f81f-4c72...@a32g2000yqm.googlegroups.com>,

Carlos Sánchez <csanc...@gmail.com> wrote:

> Old games couldn't play ogg and there is no drama about it.

There's a big difference. You can add as much sound support as you want
to an interpreter, but unless there's any sound to play, the game will
still be silent. Any sound in the game has to be specific to the game.
But any game can be made to play full screen. The game doesn't have to
be written in any special way if it's a feature of the interpreter.

The downside with your proposal, where it'd be a feature of the game,
is that every game would have to add that feature or it wouldn't
display full screen. If it's simply an interpreter setting, any game
can be played full screen regardless of whether the author remembered
to put the feature in, or whether the game was made before the feature
was available.

> > (Think about command history -- the up and down arrow keys. You don't
> > use it because the spec has a way to switch command history on and
> > off. It exists because interpreters support it. Interpreters support
> > it because it's a good idea. Having a glk_use_command_history() call
> > would make IF-playing *worse*, because then some games wouldn't have
> > it, and interpreters would be that much harder to implement
> > "correctly".)
>
> I can't see the relationship with command history, but anyway.

Imagine if games had to explicitly support the use of command history
for players to take advantage of it. Wouldn't that be a nuisance?

I can understand your wish for a game to be able to say "I'd prefer to
run full screen, please", and it would of course be possible to have
both the interpreter feature and the game request that you want. One
doesn't rule the other out, but I think the best place for such a thing
would be (as someone suggested earlier) as a meta tag in the blorb file
rather than a function call.

/ Rikard

Carlos Sánchez

Feb 10, 2010, 8:36:28 PM
> The downside with your proposal where it'd be a feature of the game is
> that every game would have to add that feature or they wouldn't display
> full screen. If it's simply an interpreter setting, any game can be
> played full screen regardless if the author remembered to put the
> feature in or if the game was made before the feature was available.

I've not said that interpreters supporting full screen at the moment
(as far as I know, just WinGlulxe, in a limited way) have to stop
supporting client-side fullscreen. So if I'm not asking to replace a
capability that anyway only exists in a very limited way, what is the
problem?

You want your game full screen? Say it.
You don't want it? Do nothing.
You have an old game that doesn't support this feature? Well, your
only option till now was Windows Glulxe, and so it remains; nothing
changes.
The author didn't ask for full screen but you as a player want it? Use
Windows Glulxe, as if the game was made before the added feature.

Transparent, simple, clean, easy.

Anyway, as said before, as long as it is standardized, whether it is
in Glk, in Blorb, in the Babel info, or even if I have to make a new
spec for "common interpreter config file formats", I don't mind, but
it needs to be a standard or the alternative is useless.

Pete Chown

Feb 11, 2010, 8:30:09 AM
Stuart Allen wrote:

> A while ago I made a web front-end for my IF games
> that gave a very basic ability to play over the net. For example:
> http://jacl.game-host.org:8080/fastcgi-bin/grail.jacl This is done in
> an entirely server-side way as opposed to the modern trend of lots of
> Javascript giving desktop-like functionality.

I've wondered about this too. There are several terminal emulators that
use Javascript to provide shell access through a web page. I
experimented with some of them, and shellinabox worked best for me.
There is a demo at http://code.google.com/p/shellinabox/ . Anyterm is
similar and has a demo running Colossal Cave at
http://demos.anyterm.org/adventure/anyterm.html .

It would be very straightforward to set up IF on the web, using these
programs. I wondered about doing it, but I was put off by the fact that
it's something of a dead end. I made character-mode glulx work, but
then what? Most glulx games don't play properly in character mode (or
display warning messages that detract from the game). If I set this up,
I would risk creating expectations that are difficult to fulfil with
that technology.

There are various options for supporting the multimedia aspects of glulx
games. One is to write a glk to HTML converter, but that would be a
huge amount of work. Another is to implement a new style system based
around HTML, but that would only work with new games, and for authors
who didn't mind developing solely for the web. Finally it would be
possible to use something like NoMachine ( http://nomachine.com/ ) to
provide a graphical login. That last option would require a download
before the user could play the game, which seems to defeat the point
somewhat.

Another thing that puts me off is that I've had some bad experiences
running glulx games on my Linux system. I'm not sure if that's just
because I downloaded the wrong software, or because glulx actually only
has good quality support on Windows (and perhaps Macs, I wouldn't know
about that). If glulx games don't play well on Linux, I wouldn't want
to build a system for online play on my hosted Linux server. That would
be asking for trouble.

Pete

Dannii

Feb 11, 2010, 9:55:15 AM
On Feb 11, 11:30 pm, Pete Chown <1...@234.cx> wrote:
>
> There are various options for supporting the multimedia aspects of glulx
> games.  One is to write a glk to HTML converter, but that would be a
> huge amount of work.  Another is to implement a new style system based
> around HTML, but that would only work with new games, and for authors
> who didn't mind developing solely for the web.

Well, Zarf is working on a web Glulx and Glk now. It's coming along
quite well, actually... I'm always excited to see the progress he's
making.

As to a new web-based style system (an IO system, really), that is
what I plan to develop, though I have other priorities at the moment.
It won't be restricted to web play. Although theoretically it would be
possible to write your own implementation of it, I would think that
embedding WebKit would be the best path for desktop interpreters to
take. Then they can get all the benefits of their favourite
C/C++/C#/Java/Python interpreters with an IO experience identical to
that on the slower web terps.

namekuseijin

Feb 11, 2010, 2:02:07 PM
On Feb 11, 11:30, Pete Chown <1...@234.cx> wrote:
> Stuart Allen wrote:
> > A while ago I made a web front-end for my IF games
> > that gave a very basic ability to play over the net. For example:
> > http://jacl.game-host.org:8080/fastcgi-bin/grail.jacl This is done in
> > an entirely server-side way as opposed to the modern trend of lots of
> > Javascript giving desktop-like functionality.
>
> I've wondered about this too.  There are several terminal emulators that
> use Javascript to provide shell access through a web page.  I
> experimented with some of them, and shellinabox worked best for me.
> There is a demo at http://code.google.com/p/shellinabox/ . Anyterm is
> similar and has a demo running Colossal Cave at
> http://demos.anyterm.org/adventure/anyterm.html .
>
> It would be very straightforward to set up IF on the web, using these
> programs.

should work decently enough for TADS 2 games, I guess. :)

Andrew Plotkin

Feb 11, 2010, 4:50:29 PM
Here, Carlos Sánchez <csanc...@gmail.com> wrote:

> but it needs to be a standard or the alternative is useless.

You misunderstand the purpose of a specification, and you
misunderstand my job as the Glk spec maintainer. I can't tell
interpreter authors what features to support. I can make requests,
same as you.

If you believe you are powerless to make this happen, then that's a
self-fulfilling prophecy.
