
How can I stuff characters into the keyboard buffer?


Alex Russell

Mar 23, 2003, 2:35:08 AM
"Marek Majchrowski" <majh...@unrealleague.org> wrote in message
news:slrnb7n4h5....@marek.majcom...
> hi, is there any other way to put characters into the keyboard buffer than
> INT 16 AH 5?
>
> I want to put a character into the same buffer from which I read scan codes
> by inportb(0x60)...
>
> Will it work?
>
> outportb(0x64,0xd2);
> while((int)(inportb(0x64)&0x02));
> outportb(0x60,0x01); // for ESC
>
> because it doesn't work for me... If I read a scan code after that (by
> inportb(0x60)), I always get code 254 and it doesn't matter if I put 0x01 or
> something else...
>
> What am I doing wrong?
>
> Please help... and sorry for my English...
>
>
> --
> Marek Maj(c)herek Majchrowski
> E-mail: majh...@unrealleague.org
> UIN: 18207917 GG: 207055

The keyboard hardware does not have a built-in buffer. When you read the
keyboard with inportb(0x60) you are talking to the keyboard hardware
directly. Normally, when you press or release a key, an int9 is generated,
which the BIOS handles: it processes the key and places it in a buffer in
RAM. The int16 functions operate on this RAM keyboard buffer. As most DOS
programs let the BIOS handle all keypresses, in most cases the int16
functions are the best way to "stuff keys".
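
For example, a minimal sketch (assuming a Borland-style compiler, where
int86() and union REGS come from dos.h; the name stuff_key is just
illustrative):

#include <dos.h>

/* returns 0 if the key was stored, 1 if the BIOS buffer was full */
int stuff_key(unsigned char scan, unsigned char ascii)
{
    union REGS r;

    r.h.ah = 0x05;          /* int16 service 05h - keyboard write */
    r.h.ch = scan;          /* scan code  */
    r.h.cl = ascii;         /* ASCII code */
    int86(0x16, &r, &r);
    return r.h.al;
}

/* e.g. stuff_key(0x01, 0x1b); for ESC */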

Some programs, mainly games, install their own int9 handler and do not use
the BIOS keyboard buffer. Because they install their own int9 handler, it is
VERY difficult (or impossible) to write a TSR that will stuff keys for them.

So, I would say that you cannot stuff keys that can be read by a program
using inportb(0x60).

It is possible to send data to the keyboard (that is how the LEDs are turned
on and off, and how the auto-repeat rate is set), but I'm not aware of a
command to set up a keypress.

/*

led.c

Internet: ale...@uniserve.com
Copyright 1995, January 15 by Alex Russell, NO rights reserved

Created - 1995/1/15

History:
New file

code fragments to turn on the LED lights.
Your program would have to track the status of all lights
and send the correct bits instead of just CAPLOCK

ONLY use this code in an int 9 replacement.
If the BIOS keyboard handler is running,
just set the bit mask in BIOS data area.

eg

unsigned char far *bios_key_state;

// turn off caps lock and num lock
bios_key_state=MK_FP(0x040, 0x017);
*bios_key_state&=(~(32 | 64)); // clear the caps-lock and num-lock bits in the BIOS variable

*/


#include <dos.h>   /* inportb(), outportb(), MK_FP() - Borland C */

typedef unsigned char BYTE;

/* keyboard controller and LED lights stuff */
#define KEYSTATUS 0x64
#define KEYDATA   0x60
#define LEDUPDATE 0xed
#define OB_FULL   1
#define IB_FULL   2
#define KEY_ACK   0xfa

/* bit masks to be sent */
#define SCROLLOCK 1
#define NUMLOCK 2
#define CAPLOCK 4


/* ---------------------- send_keycontrol() ------------ January 15,1995 */
short send_keycontrol(BYTE v)
{
    short count, err=1;
    BYTE c;

    for ( count=0; count < 3; count++ )
    {
        /* wait until the controller's input buffer is empty */
        do
        {
            c=inportb(KEYSTATUS);
        }
        while ( c & IB_FULL );

        outportb(KEYDATA, v);

        /* wait until the keyboard's reply is in the output buffer */
        do
        {
            c=inportb(KEYSTATUS);
        }
        while ( !(c & OB_FULL) );

        c=inportb(KEYDATA);
        if ( c == KEY_ACK )
        {
            err=0;
            break;
        }
    }

    return err;
}

/* ---------------------- cap_light_on() --------------- January 15,1995 */
void cap_light_on(void)
{
    if ( !send_keycontrol(LEDUPDATE) )  /* tell keyboard next byte is led bitmask */
        send_keycontrol(CAPLOCK);       /* the led bitmask */
}
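
For example (illustrative only; 3Ah is the Caps Lock make code, and the
scancode variable stands for whatever your handler read from port 60h),
inside your replacement int9 handler you might do:

    if ( scancode == 0x3a )     /* Caps Lock pressed       */
        cap_light_on();         /* light the LED ourselves */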

--
Alex Russell
alexande...@telus.net


Ben Peddell

Mar 24, 2003, 4:14:02 AM

Marek Majchrowski <majh...@unrealleague.org> wrote in message
news:slrnb7n4h5....@marek.majcom...
> hi, is there any other way to put characters into the keyboard buffer than
> INT 16 AH 5?
>
> I want to put a character into the same buffer from which I read scan codes
> by inportb(0x60)...
>
> Will it work?
>
> outportb(0x64,0xd2);
> while((int)(inportb(0x64)&0x02));
> outportb(0x60,0x01); // for ESC
>
> because it doesn't work for me... If I read a scan code after that (by
> inportb(0x60)), I always get code 254 and it doesn't matter if I put 0x01 or
> something else...

The FEh (254 decimal) is a resend request. For some reason, your computer's
keyboard controller is not accepting the command, or it was interrupted
between the command byte and the data byte.

However, it does work on my computer. It even works without the wait.
If it works, a good idea would be to follow it with a fake key release. For
the ESC key, that would be 81h.
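
Something like this minimal sketch (D2h is a PS/2-style controller command
and, as noted, not every controller accepts it; kbc_wait and fake_scancode
are just illustrative names):

#include <dos.h>

/* wait until the controller's input buffer is empty (bit 1 of port 64h) */
static void kbc_wait(void)
{
    while ( inportb(0x64) & 0x02 )
        ;
}

/* D2h asks the controller to place the next data byte in its output
   buffer as if the keyboard itself had sent it */
void fake_scancode(unsigned char code)
{
    kbc_wait();
    outportb(0x64, 0xd2);
    kbc_wait();
    outportb(0x60, code);
}

/* fake_scancode(0x01);   ESC make (press)    */
/* fake_scancode(0x81);   ESC break (release) */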

>
> What am I doing wrong?
>
> Please help... and sorry for my English...
>

Are you running under Windows NT, Windows 2000, or Windows XP? This could be
your problem, since these OSes virtualize basically everything.
Or, if you're running under Windows 9x or plain DOS, your computer's
keyboard controller might not understand the command.

Peter Shaggy Haywood

Mar 27, 2003, 10:05:15 PM
Please forgive the lateness of this followup.

Groovy hepcat Marek Majchrowski was jivin' on 21 Mar 2003 22:29:13 GMT
in comp.os.msdos.programmer.
How can I stuff characters into the keyboard buffer?'s a cool scene!
Dig it!

>hi, is there any other way to put characters into the keyboard buffer than
>INT 16 AH 5?

No. That's the best way to do it. That's what it's for.

>I want to put a character into the same buffer from which I read scan codes
>by inportb(0x60)...

That's not a buffer, it's an I/O port.

>Will it work?
>
> outportb(0x64,0xd2);
> while((int)(inportb(0x64)&0x02));
> outportb(0x60,0x01); // for ESC

No. I don't know what that will do, if anything. It may simply do
nothing. Port 64h may be read-only. And even if it were read-write, it
may not retain the value written to it.

>because it doesn't work for me... If I read a scan code after that (by
>inportb(0x60)), I always get code 254 and it doesn't matter if I put 0x01 or
>something else...
>

>What am I doing wrong?

You think an I/O port is a buffer. That's wrong.

--

Dig the even newer still, yet more improved, sig!

http://alphalink.com.au/~phaywood/
"Ain't I'm a dog?" - Ronny Self, Ain't I'm a Dog, written by G. Sherry & W. Walker.
I know it's not "technically correct" English; but since when was rock & roll "technically correct"?
