
Re: Mainframe Capabilities was: Re: improving EDT


Kerry Main

unread,
Dec 4, 2016, 10:20:05 AM12/4/16
to comp.os.vms to email gateway
> -----Original Message-----
> From: Info-vax [mailto:info-vax...@rbnsn.com] On Behalf
> Of Bill Gunshannon via Info-vax
> Sent: 29-Nov-16 2:12 PM
> To: info...@rbnsn.com
> Cc: Bill Gunshannon <bill.gu...@gmail.com>
> Subject: Re: [Info-vax] Variable declarations, was: Re: improving EDT
>
> > On 11/28/16 11:14 PM, Arne Vajhøj wrote:
> >> On 11/23/2016 11:49 AM, Bill Gunshannon wrote:
> >> Don't do .Net or J2EE. Web integration doesn't require it anyway.
> >> I have written web programs in COBOL. I converted a broken PHP
> >> program that was used for the Department's High School Programming
> >> Contest into COBOL just as a "proof of concept". (Remember the
> >> comment earlier about the maintainability, or lack thereof, of PHP. A
> >> change in PHP with a new version broke the running script. The
> >> student who wrote it couldn't figure out what it actually did in
> >> order to fix it.
> >> 2 students and a professor spent days trying to fix it. I wrote my
> >> COBOL version in about a half hour. I wrote a version in Bourne
> >> Shell using awk, which is still running today, in about 15 minutes.
> >
> > You can write CGI scripts in COBOL.
>
> I know, I did it.
>
> >
> > I wrote CGI in Fortran 20 years ago.
>
> You can write CGI in any language. But fast and dirty is the
> mantra and thus we have all these security disasters running on
> the web.
>
> >
> > But it does not cut it in today's web world.
>
> See comment above. It is up to the programmers to fix this.
> But the big question is "Why doesn't it cut it in today's web world?"
> My guess is for the same reason COBOL is seen to be in decline.
>
> >
> >> The only place COBOL is dead is academia. They are already feeling
> >> the pinch from tech/trade schools. I can see a future (not too
> >> distant) where they will start teaching things like COBOL and
> >> academia will feel the bite even more. No one comes out of a trade
> >> school with $100,000 in debt and no prospects for a real job.
> >
> > There has been a lot of talk about the problem of COBOL programmers
> > retiring resulting in a shortage.
> >
> > But it seems like it has not materialized.
>
> Really? I thought I mentioned it here, but maybe it was
> somewhere else. The place in GA I went to do COBOL for a few
> months just went thru their fourth attempt to find a
> replacement. Not one qualified applicant. Is the COBOL going
> away? I asked about that because I have a scheme that would
> make it possible with minimal impact on the users. Their answer:
> "No, it is not going away. It will just sit there and run like it has
> since I left 4 years ago." This is problematic in itself but does
> show that people are not rushing to get rid of their COBOL.
>
> > Apparently COBOL
> > development is declining at a rate similar to number of COBOL
> > programmers.
>
> Not that I have seen. I can find ads for COBOL Programmers
> anytime I look. I know of at least two major COBOL users who are
> constantly advertising and hiring COBOL programmers. Neither of
> those applications are going to be rewritten any time soon. One
> investigated that possibility a number of years ago and the
> determination was that it was likely not even possible.
>
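For anyone who has never written raw CGI, the interface behind that quoted
exchange is genuinely minimal: the web server passes the request in
environment variables (plus stdin for POST bodies), and whatever the program
writes to stdout becomes the HTTP response, which is why it can be done in
COBOL, Fortran or a shell script. A bare-bones, purely illustrative sketch
in C:

    #include <stdio.h>
    #include <stdlib.h>

    /* Minimal CGI program: emit an HTTP header, a blank line, then the
       body. The server sets QUERY_STRING (and REQUEST_METHOD, etc.). */
    int main(void)
    {
        const char *qs = getenv("QUERY_STRING");

        printf("Content-Type: text/plain\r\n\r\n");
        printf("Hello from a CGI program.\n");
        printf("QUERY_STRING was: %s\n", qs && *qs ? qs : "(empty)");
        return 0;
    }

The same few lines translate almost directly into COBOL DISPLAY statements or
Fortran WRITEs; the hard part of modern web work is everything around this,
not the CGI plumbing itself.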

A few more examples of current mainframe opportunities
(typically COBOL is one of the requirements):

"Mainframe Engineer"
http://jobs.oodle.com/detail/mainframe-engineer/4396169907-frankl
in-wi/
"position5+ years Development background in Mainframe technology.
Working background with PL/1 or COBOL, JCL, CICS and HostScope
Background writing and debugging on-line and batch applications"

"Senior Mainframe Developer"
http://bit.ly/2gVn2Gc
"Candidate should have experience in supporting Mainframe
applications in COBOL, CICS, EGL and working experience with DB2,
and SQL Server"

"Mainframe Developer"
http://bit.ly/2goWr3l
Successful candidate must have Cobol and JCL experience

Others -
http://us.jobrapido.com/?w=mainframe

[snip]

There is a good reason why mainframes are still playing key
corporate roles - even after Gartner famously (infamously)
declared in 1990 that by 1995 there would be no mainframes left.


As part of a major DC consolidation effort, some customers are
seriously looking at Oracle DB consolidation via a DB shared
services offering on the mainframe. Apparently, there could be
serious license cost savings - not to mention gains in security,
DR and power (far fewer high-energy blades, etc.)
http://ibm.co/2gV589b (mainframe positioning)

And in case anyone thinks the mainframe is old HW technology or
not being marketed, check out some of their recent features:
http://www-03.ibm.com/systems/z/announcement.html

z13 features (note the emphasis on workload management,
I/O features and virtualization - 8,000 VMs per mainframe):
https://www.youtube.com/watch?v=ErZRSQoTFXw

This is not to say there are no issues and challenges with
mainframes, but never underestimate the competition ... learn,
adapt and build on their capabilities and marketing.

Regards,

Kerry Main
Kerry dot main at starkgaming dot com





VAXman-

unread,
Dec 4, 2016, 11:14:42 AM12/4/16
to
In article <mailman.11.1480864507....@rbnsn.com>, "Kerry Main" <kemain...@gmail.com> writes:
>[snip]
>
>There is a good reason why mainframes are still playing key
>corporate roles - even after Gartner famously (infamously)
>declared in 1990 that by 1995 there would be no mainframes left.

Didn't those Gartner prognosticators move on to famously predict the outcome
of the recent US POTUS election too? :P

--
VAXman- A Bored Certified VMS Kernel Mode Hacker VAXman(at)TMESIS(dot)ORG

I speak to machines with the voice of humanity.

Arne Vajhøj

unread,
Dec 4, 2016, 10:32:32 PM12/4/16
to
On 12/4/2016 10:14 AM, Kerry Main wrote:
> There is a good reason why mainframes are still playing key
> corporate roles - even since Gartner famously (infamously)
> declared in 1990 that by 1995, there would be no mainframes left.
>
> As part of a major DC Consolidation effort, some Cust are
> seriously looking at Oracle DB consolidation via a DB shared
> services offering on the mainframe. Apparently, there could be
> serious license cost savings - not to mention security, DR, power
> (much fewer high energy blades etc.)
> http://ibm.co/2gV589b (mainframe positioning)
>
> And in case anyone thinks the mainframe is old HW technology or
> not being marketed, check out some of their recent features:
> http://www-03.ibm.com/systems/z/announcement.html
>
> z13 features: (pay attention to emphasis on workload management,
> IO features and virtualization - 8,000 VM's per mainframe)
> https://www.youtube.com/watch?v=ErZRSQoTFXw
>
> This is not to say there are not issues and challenges with
> mainframes, but - Never under estimate the competition .. learn,
> adapt and build on their capabilities, marketing.

Mainframes will be around for many decades to come.

But the overall trend is down.

It seems like sales are halved every 10 years.

So 4 B$ now, 2 B$ in 10 years, 1 B$ in 20 years, 500 M$ in 30 years.

And then at some point in time IBM will reduce effort to pure maintenance.

And at some later point in time IBM will kill it.

Arne


seasoned_geek

unread,
Dec 7, 2016, 7:42:59 AM12/7/16
to

> > >> The only place COBOL is dead is academia. They are already feeling
> > >> the pinch from tech/trade schools. I can see a future (not too
> > >> distant) where they will start teaching things like COBOL and
> > >> academia will feel the bite even more. No one comes out of a trade
> > >> school with $100,000 in debt and no prospects for a real job.
> > >
> > > There has been a lot of talk about the problem of COBOL programmers
> > > retiring resulting in a shortage.
> > >
> > > But it seems like it has not materialized.
> >

Once I got into this tiny edit window and trimmed things away to the snippet I wanted, I couldn't trace back to who wrote this. Doesn't really matter. I just wanted to point out that many years ago I made the mistake of attending the scam of DeVry after having attended a valid Junior College getting my feet wet in software development. A good many students graduating around the same time were in that boat. Massive student loan debt and no job prospects, but you would never see this in the numbers they published. Hell, they counted me as being employed in my field thanks to them because I had gotten a job as a midnight computer operator all on my own to work my way through school. Several others were still working at Pizza Hut and Jewel foods months after graduation but I believe they were still counted as "having been placed."

Actually I think one of the guys I knew never got a job. I have to reach out to a few former classmates now that I'm thinking about it. About a year after graduation one of the guys I knew said that guy had enrolled in another school trying to get an accounting minor to go with his B.S. C.I.S. (supposedly we only needed a few extra classes if you could find the one school in the universe where DeVry credits would transfer)

Now that I think of it, it was way more than one guy. Mike ended up going back to Wisco and working at some forestry-type outdoor thing a few other family members were involved in after not getting a programming job. Another guy who was actually decent at coding became an electrician after not getting a job.

Granted, I haven't followed up on these people in decades. I believe we can all agree that once you take a gig outside of software that close to graduation, you aren't coming back to the field. I mean, this was in the 1980s.

seasoned_geek

unread,
Dec 7, 2016, 8:06:37 AM12/7/16
to
On Sunday, December 4, 2016 at 9:32:32 PM UTC-6, Arne Vajhøj wrote:
>
> Mainframes will be around for many decades to come.
>
> But the overall trend is down.
>
> It seems like sales is halfed every 10 year.
>
> So 4 B$ now, 2 B$ in 10 years, 1 B$ in 20 years, 500 M$ in 30 years.
>
> And then at some point in time IBM will reduce effort to pure maintenance.
>
> And at some later point in time IBM will kill it.
>

The only real problem with that argument is that I have been hearing it since the mid 1980s. I really heard it when IBM's stock price tanked, they had massive layoffs and there was talk about them going out of business. They still make big iron.

Despite what the "x86 should run the world" crowd wishes to believe, it will never compete with the throughput and stability of proprietary systems designed for exactly that. The x86 world had its fate sealed when the initial OSs designed for it opted for stream instead of record/block-based I/O. This caused a variety of hardware/firmware choices to be made, reinforcing the bad decision.

I've been programming in C/C++ for decades. My C++ days started with the Zortech compiler on DOS. Moved to Watcom under OS/2, DOS and GUI DOS (Windows). I will stand here and say, while you may be able to find isolated cases where a C/C++ program can churn through big data sets with the same speed as COBOL, they are outliers. COBOL was designed for the "big data" of its time and it scaled well. It was tailor-made for generating monthly billing, payroll and a host of other applications which by their very nature are batch. C was designed for driver, OS and other low-level development. As such it lacks the under-the-hood machinery COBOL has for massive throughput.

DEC's major downfall was that it always sucked at I/O throughput. It rocked at various scientific calculations and being a front end for Cray for a while, but I believe the IRS experiment with that massive cluster to process taxes back in the day of DEC Professional magazine really showed the throughput problem.

Without specialized hardware and a proprietary OS you simply can't push the throughput envelope far enough.
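
To make the record-versus-stream distinction concrete, here is a small sketch
in C; the file name and the 80-byte record length are made up for
illustration. Record-oriented access hands the program one fixed-length
record per call (roughly what COBOL's READ, or RMS on VMS, gives you), while
stream access hands it bytes that the program must split into records itself.

    #include <stdio.h>

    #define RECLEN 80   /* hypothetical fixed record length, card-image style */

    int main(void)
    {
        FILE *f = fopen("timecards.dat", "rb");   /* hypothetical input file */
        if (!f) { perror("timecards.dat"); return 1; }

        /* Record-style access: one fread() per fixed-length record. */
        char rec[RECLEN];
        long nrec = 0;
        while (fread(rec, RECLEN, 1, f) == 1)
            nrec++;

        /* Stream-style access: the same bytes, but the program has to find
           the record boundaries itself (here, newline-delimited lines). */
        rewind(f);
        char line[256];
        long nline = 0;
        while (fgets(line, sizeof line, f) != NULL)
            nline++;

        printf("fixed-length records: %ld, newline-delimited lines: %ld\n",
               nrec, nline);
        fclose(f);
        return 0;
    }

On a record-oriented file system the OS can do blocking, prefetch and locking
at the record level; on a pure byte stream that bookkeeping falls to the
application or a library above it, which is one way of reading the complaint
above.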

seasoned_geek

unread,
Dec 7, 2016, 8:23:21 AM12/7/16
to
Btw. I haven't talked to him in a few years, but I worked with a contractor who still did COBOL on a Prime computer.

http://www.chilton-computing.org.uk/acd/icf/mums/prime/p009.htm

https://en.wikipedia.org/wiki/Prime_Computer

People are still hiring TAL developers. This was posted 5 days ago.

https://jobs.tsys.com/job/Columbus-Mainframe-Programmer-Analyst-I-or-Above-GA-31829/378659500/?feedId=79200&utm_source=Indeed&utm_campaign=TSYS_Indeed&sponsored=ppc

As to why we don't have a shortage of COBOL programmers right now, I think the answer to that is twofold.

1) Visa workers are simply adding COBOL to their resume, getting imported en masse and hoping they can hide in a room where at least one of the people actually knows COBOL. I've seen this with just about every other programming language, so I cannot believe COBOL is any different.

2) Enough of the old COBOL programmers haven't kicked off. They've retired officially, but I still hear of some "working for a few months" here and there because their retirement account took a hit from whatever the latest Wall Street-manufactured tragedy was.

Let's face it. Y2K happened because systems designed in the 70s and 80s were so well created they never got replaced. In 2027 we will have another mini-Y2K at all of those shops who hacked a date routine to assume years before 27 were 2000 while all two digit years higher than that were 1900 and never went back to sweep it up.

While we are on the topic of things which supposedly went away, every 4 months or so I get contacted about doing some DIBOL work.

I remember hearing the story of how IBM came out to Joliet Junior College a year or so ahead of the big Y2K rush, installed a mainframe and provided a COBOL instructor to crash-course kids in COBOL programming without all of the programming background one needs to design software correctly. I never actually met anyone involved with that, but the story was soooo often repeated in the Chicago area . . . The same thing will happen in another 30-40 years. Too bad DEC didn't follow the same approach.

Scott Dorsey

unread,
Dec 7, 2016, 9:26:25 AM12/7/16
to
seasoned_geek <rol...@logikalsolutions.com> wrote:
>
>Without specialized hardware and a proprietary OS you simply can't push
>the throughput envelope far enough.

Right. The problem is that in the 1970s, the kind of work that people wanted
to do required far more throughput than the hardware was capable of. But today
cheap commodity hardware has more than enough I/O bandwidth for what people
want to do. So high performance I/O systems which used to be sold to every
local bank and insurance broker now have a far more limited customer base.

Of course, Oracle is doing their best to make the hardware I/O bound again...
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."

Rich Alderson

unread,
Dec 7, 2016, 4:49:53 PM12/7/16
to
seasoned_geek <rol...@logikalsolutions.com> writes:

> Let's face it. Y2K happened because systems designed in the 70s and 80s were
> so well created they never got replaced.

OK, I will dispute this, since I was a financial applications programmer in a
big IBM shop in the 1970s. Those 2-digit year fields were legacy even then,
from 1401-based programs written in 1959 and 1960 (by the assistant manager of
the group for whom I worked!). I was pushing for 4-digit years way back then,
because (1) the century was going to change in about the same number of years
as we had been running all that AP, PR, and GL code under emulation, and (2)
disk space was cheap "these days". I wrote all my code for 4 digit years. I
was not entirely atypical.

--
Rich Alderson ne...@alderson.users.panix.com
Audendum est, et veritas investiganda; quam etiamsi non assequamur,
omnino tamen proprius, quam nunc sumus, ad eam perveniemus.
--Galen

erga...@gmail.com

unread,
Dec 8, 2016, 3:51:48 AM12/8/16
to
On Wednesday, 7 December 2016 13:23:21 UTC, seasoned_geek wrote:

> Let's face it. Y2K happened because systems designed in the 70s and 80s were so well created they never got replaced. In 2027 we will have another mini-Y2K at all of those shops who hacked a date routine to assume years before 27 were 2000 while all two digit years higher than that were 1900 and never went back to sweep it up.

Why would you choose '27 as the pivot year?

Bob Koehler

unread,
Dec 8, 2016, 9:18:18 AM12/8/16
to
In article <216a5c23-8716-4ab5...@googlegroups.com>, erga...@gmail.com writes:
> On Wednesday, 7 December 2016 13:23:21 UTC, seasoned_geek wrote:
>
> >> Let's face it. Y2K happened because systems designed in the 70s and 80s
> >> were so well created they never got replaced. In 2027 we will have
> >> another mini-Y2K at all of those shops who hacked a date routine to
> >> assume years before 27 were 2000 while all two digit years higher than
> >> that were 1900 and never went back to sweep it up.
>
> Why would you choose '27 as the pivot year?

I think he's mis-remembering the C/UNIX time rollover date.

Jim

unread,
Dec 8, 2016, 10:54:46 AM12/8/16
to
On Thursday, December 8, 2016 at 9:18:18 AM UTC-5, Bob Koehler wrote:
Some old OSes used 7 bit year fields (0-127)... for example:

http://www.apple2.org.za/gswv/USA2WUG/Glen.Bredon.In.Memoriam/A2.Software/A2.Y2K.Patcher/Y2K.explanation.txt

Bill Gunshannon

unread,
Dec 8, 2016, 11:18:16 AM12/8/16
to
Ya know, I always thought most of the people here were either my age or
very close but when I read threads like this, I have to wonder.

The two digit date problem stems from the days of punched cards when
those two digits for the never changing "19" were just too valuable
to waste. I personally worked with 80 column punched cards right up
thru the mid 1980's. Code that was modified to use files instead of
cards seldom, if ever, made the date modification until it was really
necessary. Of course, anyone who was around then knows that the world
didn't end when Y2K hit because everything that really mattered was
fixed in time. A lot of stuff wasn't and silliness continued into
the era of the Web. But nothing that really mattered.

And then we have the comment that the IRS ever considered using DEC
for their hardware. The IRS has been UNISYS since the days when it
was still Univac. Late 70's they were already on the Univac 1100
before the first 32bit (VAX) DEC machine came off the factory floor.
When I was working on 1100's for the government we lost our DBA to
the IRS because the IRS DBA was the highest paying GS job in the
government. They ran on 11oo's unitl it was replaced by the fully
compatible 2200 series and continue on that to this date. There was
once an attempt to re-write the whole IS (not necessarily moving it
off of UniSYS but wanting to move it off COBOL) but that was abandoned
after the best contractors in the business told them, based on the RFP,
that it could not be done.

As for mainframes in general, the three biggest ISes I have ever been
aware of are all on mainframes. Two on IBM and one on Unisys.

bill

Howard S Shubs

unread,
Dec 8, 2016, 2:56:57 PM12/8/16
to
In article <eatfe6...@mid.individual.net>,
Bill Gunshannon <bill.gu...@gmail.com> wrote:

> Ya know, I always thought most of the people here were either my age or
> very close but when I read threads like this, I have to wonder.
>
> The two digit date problem stems from the days of punched cards when
> those two digits for the never changing "19" were just too valuable
> to waste. I personally worked with 80 column punched cards right up

I've heard of systems which used ONE digit years. They figured the
systems wouldn't be used longer than that... Memory used to be
EXPENSIVE, remember.

Bill Gunshannon

unread,
Dec 8, 2016, 3:07:32 PM12/8/16
to
Not just expensive, small, too. My first mainframe had 4K 8-bit
characters (yes, that's how it was done) of memory.

My first home computer had 16K, My second had 64K. And my first
Unix machine (at home) had 768K which was more than the IBM 360/40
I worked on that had 512K.

bill

Howard S Shubs

unread,
Dec 8, 2016, 5:49:25 PM12/8/16
to
In article <eatss2...@mid.individual.net>,
I tend to think more in terms of the MicroVAX IIs my first job used.
They maxed out, of course, at well under the 4GiB limit. 16MiB, I
think? And that handled half the company (we had two) for 16 people
each.

And the VAX 11/780 I used at Northeastern U, which handled 50 people
just fine with, I think, 8MiB. If you put 60 people on, it started to
bog down.

And the IBM 1130 I used in high school had been upgraded from the 4k
16-bit words it shipped with to 8k. Whoa.

I'm typing this on a MacBook Pro with 16GiB of memory, dedicated to one
user. The way things have changed!

Steven Schweda

unread,
Dec 8, 2016, 9:41:06 PM12/8/16
to
> And the IBM 1130 I used in high school had been upgraded from the 4k
> 16-bit words it shipped with to 8k. Whoa.

As I recall from my college days, for another zillion dollars you
could add on another 8K (words) in an add-on box about the size of a
two-drawer filing cabinet.

My first computer contact (in high school) was with a UNIVAC 422 with
its 512 15-bit words. It served its educational purpose, but didn't
have much of a chance for real work. On the bright side, you might not
need an NSF grant to buy one. (That, as I recall, is how Macalester got
its IBM 1130 (with 8K words). Something around $100,000, as I recall,
when tuition cost about $2000. The lease payments on the 1132 printer
probably totaled more in the long run, however.)

Bill Gunshannon

unread,
Dec 8, 2016, 10:07:44 PM12/8/16
to
At the same time I worked on the 1401 we also had a Univac 1005.
While the 1401 was busy we used to try teaching the 1005 to play
tic-tac-toe. It was a losing battle. The 1005 was just too dumb.
I'll take Autocoder over Sarge any day.

bill

Howard S Shubs

unread,
Dec 9, 2016, 4:15:36 AM12/9/16
to
In article <3b47e028-0813-4fab...@googlegroups.com>,
Steven Schweda <sms.an...@gmail.com> wrote:

> > And the IBM 1130 I used in high school had been upgraded from the 4k
> > 16-bit words it shipped with to 8k. Whoa.
>
> As I recall from my college days, for another zillion dollars you
> could add on another 8K (words) in an add-on box about the size of a
> two-drawer filing cabinet.

It would max out at 32k words, actually. Expensive? Hell yes.
Magnetic core memory doesn't weave itself!

> My first computer contact (in high school) was with a UNIVAC 422 with
> its 512 15-bit words. It served its educational purpose, but didn't
> have much of a chance for real work. On the bright side, you might not
> need an NSF grant to buy one. (That, as I recall, is how Macalester got
> its IBM 1130 (with 8K words). Something around $100,000, as I recall,
> when tuition cost about $2000. The lease payments on the 1132 printer
> probably totaled more in the long run, however.)

512 words? Now I feel spoiled. OTOH, I never did understand the
CompuCom(?) II I had as a child.

Howard S Shubs

unread,
Dec 9, 2016, 4:17:21 AM12/9/16
to
In article <eaulft...@mid.individual.net>,
Ever try an IBM 1620 (CADET)? It was a base-10 machine which did
arithmetic using tables. Its name stood for Can't Add, Doesn't Even Try.

Stephen Hoffman

unread,
Dec 9, 2016, 8:33:43 AM12/9/16
to
On 2016-12-08 08:51:47 +0000, erga...@gmail.com said:

> Why would you choose '27 as the pivot year?

Pivot year selection varies. In OpenVMS, 57 was the pivot year,
within the OS. Being the founding year of DEC factored slightly into
that particular selection, but I digress. Apps might select some
other pivot.


--
Pure Personal Opinion | HoffmanLabs LLC

David Froble

unread,
Dec 9, 2016, 11:08:56 AM12/9/16
to
Stephen Hoffman wrote:
> On 2016-12-08 08:51:47 +0000, erga...@gmail.com said:
>
>> Why would you choose '27 as the pivot year?
>
> Pivot year selection varies. In OpenVMS, 57 was the pivot year,
> within the OS. Being the founding year of DEC factored slightly into
> that particular selection, but I digress. Apps might select some other
> pivot.
>
>

This topic caused me to realize that I forgot what pivot year I'd used. So a
search into the past, to find where such hackery might reside. Found it, and
noticed some other "stuff".

Short walk down memory lane ....

Pivot year was 50. Also found code that allowed "YESTERDAY", "TODAY", and
"TOMORROW". I'd pretty much forgotten about that. Allowable dates are 1970 to
2035. Talk about short sighted hackery ....

:-)
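
For anyone who has never had to write one of these, a two-digit "pivot"
(windowing) routine is just a comparison against the cutoff. A minimal
sketch in C, using the pivot values mentioned above (57 for OpenVMS, 50 in
David's code); the function name is made up, and real implementations add
their own range checks:

    #include <stdio.h>

    /* Expand a two-digit year with a pivot ("windowing") rule:
       values below the pivot are taken as 20xx, values at or
       above it as 19xx. */
    static int expand_year(int yy, int pivot)
    {
        return (yy < pivot) ? 2000 + yy : 1900 + yy;
    }

    int main(void)
    {
        int samples[] = { 16, 26, 27, 49, 50, 56, 57, 99 };
        int n = sizeof samples / sizeof samples[0];

        for (int i = 0; i < n; i++)
            printf("%02d -> pivot 50: %d  pivot 57: %d  pivot 27: %d\n",
                   samples[i],
                   expand_year(samples[i], 50),
                   expand_year(samples[i], 57),
                   expand_year(samples[i], 27));
        return 0;
    }

With a pivot of 27, the two-digit year 27 itself comes back as 1927, which is
the 2027 "mini-Y2K" mentioned earlier in the thread.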

Paul Sture

unread,
Dec 9, 2016, 1:19:09 PM12/9/16
to
On 2016-12-09, Stephen Hoffman <seao...@hoffmanlabs.invalid> wrote:
> On 2016-12-08 08:51:47 +0000, erga...@gmail.com said:
>
>> Why would you choose '27 as the pivot year?
>
> Pivot year selection varies. In OpenVMS, 57 was the pivot year,
> within the OS. Being the founding year of DEC factored slightly into
> that particular selection, but I digress. Apps might select some
> other pivot.

The application can and does influence the choice. A hospital system
will have to cater for patients over 100 years old (date of birth in the
past), but a life assurance / pension system will be looking at a
timespan of similar magnitude into the future.

And if the choice of the OS doesn't match the requirements of your
application you get to roll your own.

FWIW 05-Aug-1948 is one date burned into my memory, being the zero date
for one system I used (think variables initialised to zero when displayed
on a screen).

17-Nov-1858 + 32768 -> 05-Aug-1948

and of course 05-Aug-1948 + 32767 -> 22-Apr-2038, and the hospital
system the above choice was devised for is good for another 20+ years in
that respect.

--
tar: Cowardly refusing to create an empty archive
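
Those two sums check out, and a few lines of C can verify them;
days_from_civil below is a standard proleptic-Gregorian day-count helper
(our own helper, not anything from a VMS library), counting days relative
to 1970-01-01:

    #include <stdio.h>

    /* Days since 1970-01-01 in the proleptic Gregorian calendar. */
    static long days_from_civil(int y, int m, int d)
    {
        y -= m <= 2;
        long era = (y >= 0 ? y : y - 399) / 400;
        long yoe = y - era * 400;                           /* [0, 399] */
        long doy = (153 * (m + (m > 2 ? -3 : 9)) + 2) / 5 + d - 1;
        long doe = yoe * 365 + yoe / 4 - yoe / 100 + doy;   /* [0, 146096] */
        return era * 146097 + doe - 719468;
    }

    int main(void)
    {
        long base = days_from_civil(1858, 11, 17);  /* VMS/Smithsonian base date */
        long zero = days_from_civil(1948,  8,  5);  /* the "zero date" above */
        long last = days_from_civil(2038,  4, 22);

        printf("1858-11-17 + %ld days = 1948-08-05\n", zero - base);  /* 32768 */
        printf("1948-08-05 + %ld days = 2038-04-22\n", last - zero);  /* 32767 */
        return 0;
    }

Which fits a signed 16-bit day offset with 05-Aug-1948 as zero: the
representable range runs from 17-Nov-1858 to 22-Apr-2038, exactly as
described.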

Paul Sture

unread,
Dec 10, 2016, 5:52:40 AM12/10/16
to
On 2016-12-07, Rich Alderson <ne...@alderson.users.panix.com> wrote:
> seasoned_geek <rol...@logikalsolutions.com> writes:
>
>> Let's face it. Y2K happened because systems designed in the 70s and 80s were
>> so well created they never got replaced.
>
> OK, I will dispute this, since I was a financial applications programmer in a
> big IBM shop in the 1970s. Those 2-digit year fields were legacy even then,
> from 1401-based programs written in 1959 and 1960 (by the assistant manager of
> the group for whom I worked!). I was pushing for 4-digit years way back then,
> because (1) the century was going to change in about the same number of years
> as we had been running all that AP, PR, and GL code under emulation, and (2)
> disk space was cheap "these days". I wrote all my code for 4 digit years. I
> was not entirely atypical.

VMS itself adopted 4 digit years from the start.

Unfortunately those of us who had started out in the mainframe world had
been *trained* that:

a) disk space was still expensive and with thousands or even millions of
records, those byte savings added up.
b) the overhead of a 4 digit year for data entry was seen as a waste of
time. Some applications had quite a lot of dates, and although it was
a hangover from traditional data entry onto punched cards, keystrokes
did get measured back then.

The combination of both meant that pressing for 4 digit years could be
quite a hard sell. Disks were still a major expenditure, especially for
that class of clients who wanted to save as much history as possible.

A smart thing to do was ensure that the dates in records wouldn't hit
the Y2K limit, but input/output screens and printed reports were a
different matter.

We really did think that the systems we were building in the late 70s
and early 80s would be replaced well before Y2K. This was reinforced by
the fact that as you moved up through a given manufacturer's product
line you were faced with a rewrite anyway (hello IBM, ICL).

VMS was the first system I worked on where you could keep the same
applications (mostly) unaltered right to the top of the range, though
you might want to do that rewrite to take advantage of new technology
as it arrived (e.g. clustering and networking).

seasoned_geek

unread,
Dec 10, 2016, 9:07:00 AM12/10/16
to
I must vehemently dispute this, Scott. While the completely worthless $300 PCs sold as $8K blades and rack mounts have gotten better, they totally lack the same throughput. That's like comparing a 2-lane gravel road to an 8-lane Interstate.

For years the x86 world has been trying to "compensate for a certain shortcoming" by pushing more and more logic out to Oracle and other database products. There are still massive batch/transactional processes which cannot be accomplished humanely without the gigantic I/O pipe of mainframes. Think about all of the timecard files coming into a payroll processor like ADP and the like. No matter how one slices it that is a batch/transactional process with tens of thousands of input files.

Think about the electronic filing systems for the IRS. Literally millions of XML files containing 1040s and schedules for every citizen coming in. While the XML-to-flat parsing can be handled by a wanna-be computer, the full audit filtering and form verification is a Big Iron process. Even DEC failed miserably at this when the IRS installed what was at the time one of the largest OpenVMS clusters in the world. Yes it was cheaper with more computational power, but didn't have a big enough pipe.

It doesn't matter if you can hook a worthless x86 up to a fiber optic cable coming from a SAN or other large storage device, the little wanna-be can't drink the full volume from the fire hose.

seasoned_geek

unread,
Dec 10, 2016, 9:10:31 AM12/10/16
to
No. There were quite a few articles at the time about the use of 27 as the dividing line. It was known to be a "temporary" fix. There was a reason for it, I just don't remember exactly why 27 was the magic number. I know a lot of shops used it for their date input routine.

seasoned_geek

unread,
Dec 10, 2016, 9:14:59 AM12/10/16
to
Thank you! It was not so much the OS as an intermediate and/or storage data type that had the size limit. With a range of up to 127, your "base" year could still be 1900. This little time bomb doesn't go off for another decade.

johnwa...@yahoo.co.uk

unread,
Dec 10, 2016, 10:49:32 AM12/10/16
to
VMS internally used quadword time [1], a rather neat
concept especially when combined with lots of runtime
support routines. It would have been even better if
someone had been forward-thinking enough to consider
stuff like timezones, and maybe fake time for testing.

How quadword time is presented externally is a
different matter.


But on the whole, VMS quadword time beats anything
I've used elsewhere in the last few decades (largely
POSIX-like boxes and Windows boxes). In fact the last
non-trivial from-scratch project I worked on needed
a *lot* of time manipulation (not presentation,
manipulation, including high resolution timestamps and
arithmetic) and although it wasn't a VMS project we
adopted something akin to quadword time and the
associated support routines, because the OS and
language native facilities weren't up to it.



OpenVMS Programming Concepts: 27.1: System Time Format

The operating system maintains the current date
and time in 64-bit format. The time value is a
binary number in 100-nanosecond (ns) units offset
from the system base date and time, which is 00:00
o'clock, November 17, 1858 (the Smithsonian base
date and time for the astronomic calendar).

Time values must be passed to or returned from
system services as the address of a quadword
containing the time in 64-bit format. A time
value can be expressed as either of the following:

* An absolute time that is a specific date or
time of day, or both. Absolute times are always
positive values (or 0).
* A delta time that is an offset from the current
time to a time or date in the future. Delta times
are always expressed as negative values and cannot
be zero. The binary format number for delta time
will always be negative.

If you specify 0 as the address of a time value,
the operating system supplies the current date and
time.
[etc]
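
As a rough illustration of the quoted format: a quadword time is a 64-bit
count of 100 ns ticks since 17-Nov-1858, so converting a Unix timestamp is
one offset plus one multiply. A small sketch in C doing the arithmetic by
hand (on OpenVMS itself you would use the system services such as SYS$GETTIM
rather than this); 40587 is the number of days from 17-Nov-1858 to the Unix
epoch:

    #include <stdio.h>
    #include <stdint.h>

    #define TICKS_PER_SEC     10000000ULL   /* 100-ns ticks per second */
    #define BASE_TO_UNIX_DAYS 40587ULL      /* 1858-11-17 .. 1970-01-01 */

    /* Absolute quadword time for a given Unix timestamp (seconds). */
    static uint64_t unix_to_quadword(uint64_t unix_sec)
    {
        return (BASE_TO_UNIX_DAYS * 86400ULL + unix_sec) * TICKS_PER_SEC;
    }

    /* Per the quoted text, a delta time is a negative tick count. */
    static int64_t seconds_as_delta(uint64_t sec)
    {
        return -(int64_t)(sec * TICKS_PER_SEC);
    }

    int main(void)
    {
        printf("Unix epoch as quadword ticks: %llu\n",
               (unsigned long long)unix_to_quadword(0));
        printf("10-second delta time:         %lld\n",
               (long long)seconds_as_delta(10));
        return 0;
    }

At 100 ns per tick a signed 64-bit count spans tens of thousands of years;
as the post above says, how it is presented externally is a different matter.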

seasoned_geek

unread,
Dec 10, 2016, 11:36:38 AM12/10/16
to

seasoned_geek

unread,
Dec 10, 2016, 11:38:08 AM12/10/16
to
Actually I used to have a client which had that exact problem with one of their factory systems. It was written in FORTRAN. Once every 10 years they pay someone to hack a mod and do something with the old data. Management deems it a cost-effective solution.

seasoned_geek

unread,
Dec 10, 2016, 11:42:42 AM12/10/16
to
On Friday, December 9, 2016 at 10:08:56 AM UTC-6, David Froble wrote:
>
> This topic caused me to realize that I forgot what pivot year I'd used. So a
> search into the past, to find where such hackery might reside. Found it, and
> noticed some other "stuff".
>
> Short walk down memory lane ....
>
> Pivot year was 50. Also found code that allowed "YESTERDAY", "TODAY", and
> "TOMORROW". I'd pretty much forgotten about that. Allowable dates are 1970 to
> 2035. Talk about short sighted hackery ....
>
> :-)

So, do you retire on December 31, 2034 or do you hang around to cash in on Y2035? <Grin>

seasoned_geek

unread,
Dec 10, 2016, 11:47:26 AM12/10/16
to
On Saturday, December 10, 2016 at 4:52:40 AM UTC-6, Paul Sture wrote:
> b) the overhead of a 4 digit year for data entry was seen as a waste of
> time. Some applications had quite a lot of dates, and although it was
> a hangover from traditional data entry onto punched cards, keystrokes
> did get measured back then.

Ah yes. Before there were Systems Analysts they were called Efficiency Experts. Not only did they count keystrokes, but the number of times during a given entry sequence a user would have to leave "home row" to either navigate or use the numeric keypad (once keyboards got numeric keypads).

Bill Gunshannon

unread,
Dec 10, 2016, 12:46:38 PM12/10/16
to
All this is interesting, but has nothing to do with the Y2K
problem. Unless you think that data entry clerks could type
in a quad-word value. Oh wait, that is even bigger than the
four character date that was the original problem. In any
event, the problem really originated with having only 80 positions
for data and continued on as a legacy long after it was no longer
relevant. And the lesson people should take away from this is
to pay attention to inertia. What is there now is going to be
there for a long time and all the talking in the world is not
going to change that.

bill


Bill Gunshannon

unread,
Dec 10, 2016, 4:09:56 PM12/10/16
to
On 12/10/16 11:36 AM, seasoned_geek wrote:
> On Thursday, December 8, 2016 at 10:18:16 AM UTC-6, Bill Gunshannon wrote:
>> And then we have the comment that the IRS ever considered using DEC
>> for their hardware.
>
> http://www.nytimes.com/1986/01/28/business/irs-ends-computer-contract.html

This article is about a court case. They voided a contract that might
have let DEC in, but never happened. It is typical press. Talks about
the system like it's a single computer that stops everything when
anything goes wrong. Also, they mention that the current system was
installed by Sperry. That's Univac/UNISYS and the "system" they are
talking about was an upgrade to the already existing 1100 system.
Note also that they say the DEC bid did not meet very specific
requirements of the RFP. Business as usual. A company I was working for
many years ago once lost a bid to DEC who won without submitting any
results for the required benchmarks, submitting instead a letter saying,
"If the machine we are bidding actually existed today it would have the
best benchmark results." Go figure.


>
> https://books.google.com/books?id=tBXQZbbSyeQC&pg=PA23&lpg=PA23&dq=IRS+computerworld+digital+equipment+corporation&source=bl&ots=aMJCLf24DA&sig=QWlemDz3zlUsyoRbcD6O6sfkjjU&hl=en&sa=X&ved=0ahUKEwjQ-7SlgOrQAhVQ_mMKHU1tBpQQ6AEIHDAB#v=onepage&q=IRS%20computerworld%20digital%20equipment%20corporation&f=false

This is about a particular smaller system. The tax lien system.
Surely you didn't think the IRS was just one system. They have
numerous systems doing multiple tasks. The primary system is
still a UniSYS 2200 (1100 compatible) running a system written
in COBOL and Assembler.

>
> https://books.google.com/books?id=YzfuC0kmxM4C&pg=PA12&lpg=PA12&dq=IRS+computerworld+digital+equipment+corporation&source=bl&ots=wBP8-TpmoW&sig=de4R9I8hE0GeoXsAjr2vL6YswD8&hl=en&sa=X&ved=0ahUKEwiJ1KrRgOrQAhVX-mMKHVFcCg8Q6AEIIjAE#v=onepage&q=IRS%20computerworld%20digital%20equipment%20corporation&f=false
>

This one is more about increasing the bureaucracy than computers.
It does mention their "1960's era" computer system. Yeah, that's
a UniSYS 1100/2200. And people here should be very familiar with
these kinds of comments. We all know that "legacy" means bad. And
like the comments here about Unix, the immediate assumption is that
nothing has changed and the systems are exactly the same as they were
in the 60's. Of course, a major part of this "legacy", "60's" attitude
is because of the use of COBOL.

bill




David Froble

unread,
Dec 10, 2016, 9:38:03 PM12/10/16
to
Yep!

If it ain't broke, don't fix it.

For a long time it wasn't broke.

For me, the fix of the date, even though I used a quick and dirty fix, wasn't
the problem. As with the punched cards, the output (screens and reports) was
the real issue. Sort of hard to stretch the screen, or paste in some extra
paper. So it wasn't just "fix the date"; in all too many cases it was re-work
the screen display or the report. Now, that was significant work.

And as usual, easiest thing was to stick to the 2 characters for the year, with
the users "knowing" (yeah, right) that "01" was actually 2001.

While there was plenty of it, I didn't like Y2K work ....

David Froble

unread,
Dec 10, 2016, 9:40:15 PM12/10/16
to
People good at typing would not go to the numeric keypad. I'm not even good,
and I use the numerics on the top row.

David Froble

unread,
Dec 10, 2016, 9:48:05 PM12/10/16
to
Actually, December 31, 2035. But who wants to pick nits?

At that time I will be 89 years old. If I make it that far. Family history
says there is a chance. So, yeah, I might still need some work then.

:-)

The reality is, the applications that were using that hack are no longer in use,
for various reasons. But not because of the date.

Bill Gunshannon

unread,
Dec 10, 2016, 10:03:13 PM12/10/16
to
Professional data entry clerks can touch-type on a keypad faster than
you (or they) can touch-type the top row of a standard keyboard. But
to bring this back to reality, we were talking about punch cards, and
I don't remember ever seeing a punch card machine with a keypad. :-)

bill

David Froble

unread,
Dec 10, 2016, 11:02:12 PM12/10/16
to
If they're already on the keypad, otherwise not.

Jan-Erik Soderholm

unread,
Dec 11, 2016, 4:36:35 AM12/11/16
to
For one single digit, maybe. But a SSN (9 digits), our "personal number"
(10 digits) or a date (8 digits) is usually faster on the numeric
keypad for someone used to it.



Bill Gunshannon

unread,
Dec 11, 2016, 9:18:29 AM12/11/16
to
Well, for a single number, maybe. But a professional can change to the
keypad without looking (yes, they have the same tactile assistance most
keyboards have so you don't have to look to place your hand in the right
place). But, as I said, I don't remember the keypunches I used ever
having a numeric keypad.

bill



Steven Schweda

unread,
Dec 11, 2016, 9:36:37 AM12/11/16
to
> [...] I don't remember the keypunches I used ever having a
> numeric keypad.

Not a separate keypad, but a close look at the keyboard
for an IBM 029 might be educational. Perhaps you can find
one on this new Inter-Web thing.

Bill Gunshannon

unread,
Dec 11, 2016, 9:48:00 AM12/11/16
to
But note there is no tactile mark making it easy to have your hand slip
to the wrong keys. Keypunching was an art, not just a skill. I did a
lot of program punching but luckily never had to do data. While the
layout is different, the shift between alpha and numeric reminds me a
lot of the old teletypes I used to work on.

One facet of this business I don't miss.

bill

Bob Koehler

unread,
Dec 12, 2016, 9:47:22 AM12/12/16
to
Same POS we get on tablet keyboards. It was a PITA then, and it
still is.

VAXman-

unread,
Dec 12, 2016, 9:56:55 AM12/12/16
to
Which is why I have installed one of the extended keyboards on my tablet. ;)

FWIW, I hated the 029 keyboard and punch cards too.

--
VAXman- A Bored Certified VMS Kernel Mode Hacker VAXman(at)TMESIS(dot)ORG

I speak to machines with the voice of humanity.

Stephen Hoffman

unread,
Dec 12, 2016, 11:49:41 AM12/12/16
to
On 2016-12-10 15:49:30 +0000, johnwa...@yahoo.co.uk said:

> But on the whole, VMS quadword time beats anything I've used elsewhere
> in the last few decades (largely POSIX-like boxes and Window boxes).

Even OpenVMS itself has something better than that native quadword
format. UTC.

But few folks have migrated and use the replacement UTC format within
OpenVMS. Compatibility precludes wholesale replacement of the archaic
quadword even with all its limitations and problems, after all.

The time and date APIs did improve markedly with the integration of
DECdtss stuff into OpenVMS as well as the UTC work circa V7.3, but few
folks know about or use that API, either.

One of the other boxes I deal with hides all this dreck inside objects,
with a date range from ~3000 BCE to ~7000 CE, with better than
millisecond precision. Which is decent. With better support for
formating and processing than is available with the OpenVMS system
services and RTLs, though it's possible to slog through that with a mix
of system services and RTL calls and the DECdtss-integrated timekeeping
bits.

http://labs.hoffmanlabs.com/node/735
http://labs.hoffmanlabs.com/node/124

See the "falsehoods" links in that second article for even more, too.
But then times and dates and timezones and related calculations and
conversions are all of a dog's breakfast irrespective of the platform.

Bill Gunshannon

unread,
Dec 12, 2016, 1:26:10 PM12/12/16
to
On 12/12/16 9:56 AM, VAX...@SendSpamHere.ORG wrote:
> In article <U37NXx...@eisner.encompasserve.org>, koe...@eisner.nospam.decuserve.org (Bob Koehler) writes:
>> In article <ebe32b31-cc8a-46c8...@googlegroups.com>, Steven Schweda <sms.an...@gmail.com> writes:
>>>> [...] I don't remember the keypunches I used ever having a
>>>> numeric keypad.
>>>
>>> Not a separate keypad, but a close look at the keyboard
>>> for an IBM 029 might be educational. Perhaps you can find
>>> one on this new Inter-Web thing.
>>
>> Same POS we get on tablet keyboards. Is was a PITA then, and it
>> still is.
>
> Which is why I have installed one of the extended keyboards on my tablet. ;)
>
> FWIW, I hated the 029 keyboard and punch cards too.
>

I wasn't fond of the 029 although when you're working in an IBM shop
there usually isn't an alternative. The Univac ones I used later
were a little better.

Hating punch cards is funny. What alternative was your choice? :-)

bill

Phillip Helbig (undress to reply)

unread,
Dec 12, 2016, 3:55:08 PM12/12/16
to
In article <00B137E6...@SendSpamHere.ORG>, VAXman-
@SendSpamHere.ORG writes:

> Which is why I have installed one of the extended keyboards on my tablet. ;)

Bluetooth?

Is there a way to get an LK 411 to work with an iPad?

seasoned_geek

unread,
Dec 12, 2016, 4:11:43 PM12/12/16
to
On Saturday, December 10, 2016 at 3:09:56 PM UTC-6, Bill Gunshannon wrote:
> bill

Well, I could not find DEC Professional content on-line (at least with Google). I was a midnight computer operator at AirFone Inc finishing up my degree when the story came out about the IRS creating the largest VMS cluster to date for income tax processing. This would have been somewhere between 1985-1987. I remember the story quite well because it went into a big discussion of the HSC hardware of the day. JPO had not yet left there to go work for DEC and it was still called DEC, with a campus in Elk Grove Village, IL. DEC was crowing about the cluster the IRS was setting up. Then the throughput had something to do with delayed tax refunds that year and it wasn't talked about anymore.

seasoned_geek

unread,
Dec 12, 2016, 4:18:51 PM12/12/16
to
On Saturday, December 10, 2016 at 3:09:56 PM UTC-6, Bill Gunshannon wrote:
> On 12/10/16 11:36 AM, seasoned_geek wrote:
>
> >
> > https://books.google.com/books?id=tBXQZbbSyeQC&pg=PA23&lpg=PA23&dq=IRS+computerworld+digital+equipment+corporation&source=bl&ots=aMJCLf24DA&sig=QWlemDz3zlUsyoRbcD6O6sfkjjU&hl=en&sa=X&ved=0ahUKEwjQ-7SlgOrQAhVQ_mMKHU1tBpQQ6AEIHDAB#v=onepage&q=IRS%20computerworld%20digital%20equipment%20corporation&f=false
>
> This is about a particular smaller system. The tax lien system.
> Surely you didn't think the IRS was just one system. They have
> numerous systems doing multiple tasks. The primary system is
> still a UniSYS 2200 (1100 compatible) running a system written
> in COBOL and Assembler.
>

Actually, now that I look at it, this solves the issue. I never said the IRS used VAX exclusively but you said they never let VAX in. As I'm sure you've been told many times in your life

"size doesn't matter"

Bill Gunshannon

unread,
Dec 12, 2016, 7:05:32 PM12/12/16
to
And I didn't say they used anything exclusively. They have a lot
of systems (including the Zilog systems mentioned in one of the
articles). But the main system is a UniSYS mainframe and it is
one of the largest, if not the largest, Information Systems in
the world. And attempts to replace it have proven fruitless up
to this point. Of course, if they ever do what is always being
threatened, and simplify the tax code it would probably become
doable. Unfortunately it would probably end up on Windows Servers.

bill

David Froble

unread,
Dec 12, 2016, 7:14:05 PM12/12/16
to
Why would you want to waste a valuable LK 411 on a POS ipad?

Phillip Helbig (undress to reply)

unread,
Dec 13, 2016, 2:48:41 AM12/13/16
to
In article <7ff9db6e-a064-452f...@googlegroups.com>,
seasoned_geek <rol...@logikalsolutions.com> writes:

> As I'm sure you've been told many times in your life
>
> "size doesn't matter"

True; many women say that size doesn't matter; what's important is how
often one has to change the batteries. :-)

Phillip Helbig (undress to reply)

unread,
Dec 13, 2016, 2:54:49 AM12/13/16
to
In article <o2nedp$lqo$1...@dont-email.me>, David Froble
Well, I have about a dozen LK 411 keyboards, so I have enough. I don't
have a Tadpole, and in any case newer versions of VMS aren't supported
on it. At home (like now) I am of course working directly on a VMS
system (graphics monitor, CDE, DECterm, NEWSRDR). But I am not always
at home. There are probably no internet cafés with VMS systems, and
probably no such systems in hotel rooms. An iPad provides internet
connectivity, and I can log in to my cluster at home via a VT emulator.
These days screens are available which are larger than a VT. I even
have a smart-cover keyboard, which is better than the on-screen
keyboard, and actually quite similar to a DEC keyboard, though it has
just the alphanumeric part, not the other three sections. BOSS can be
used to switch between processes without windowing. (I haven't
investigated an X server yet.) It would be nice to have a proper
keyboard. There are USB LK keyboards, and probably some USB to
BlueTooth converter, but it would be interesting to know if it actually
works.

Alternatively, if you can provide me with a VMS laptop, with a proper
keyboard, I'll take it.

Stephen Hoffman

unread,
Dec 13, 2016, 9:33:41 AM12/13/16
to
On 2016-12-12 20:55:07 +0000, Phillip Helbig (undress to reply) said:

> In article <00B137E6...@SendSpamHere.ORG>, VAXman-
> @SendSpamHere.ORG writes:
>
>> Which is why I have installed one of the extended keyboards on my tablet. ;)
>
> Bluetooth?

Most Bluetooth keyboards will work with iOS, and with Bluetooth-capable Macs.

> Is there a way to get an LK 411 to work with an iPad?

LK411 used the old PS/2 keyboard connection, and not anything that
Apple or anybody else particularly uses in recent times. It was
neither USB nor Bluetooth. So... No. No LK411. Not without building
your own adapter to USB or Bluetooth.

A USB keyboard usually does work via the so-called camera kit USB
adapter for iOS, or via one of the available docks.

The Apple extended wired Keyboard is pretty close to an LK layout, and
does work with iOS via adapter.

If you scrounged one of the LK USB keyboards such as the LK463, that
will probably (mostly) work, but I'd expect that the DEC-specific keys
will not be supported by iOS nor by the terminal emulator.

Bluetooth keyboard cases and keyboards are available, and do work with
iOS devices, including the Apple wireless keyboard.

It's also possible to create a custom software keyboard within iOS, so
it's theoretically possible to create a soft LK keyboard and build or
port a terminal emulator that can recognize that. Not a weekend
project, but likely all feasible.

Extended keyboards are not the path forward for even command-line
applications, though. If you're still tied to EDT, then EVE (with the
EDT keypad selected) and LSEDIT both work similarly to EDT — with the
keypad, when that's available — and all three can be used from
commands. The LSEDIT commands are somewhat more OpenVMS-like than the
Unix-like commands of EDT though, and LSEDIT is a far better choice for
programming than is EDT. If you're inclined to drag your fingers
forward, emacs and vim are both options, and both are customizable to
the keyboard. http://vim.wikia.com/wiki/PuTTY_numeric_keypad_mappings

Related:
http://labs.hoffmanlabs.com/node/428
http://labs.hoffmanlabs.com/node/768
http://labs.hoffmanlabs.com/node/1088

I've been using Panic Prompt on iOS, and that does pretty well. But
I've moved clear of the keypad.

VAXman-

unread,
Dec 13, 2016, 9:46:00 AM12/13/16
to
An LK463, a 2S LiPo battery, 5v voltage regulator, an HC-06 and a bit of
solder can make you a Bluetooth LK keyboard. ;)

Bill Gunshannon

unread,
Dec 13, 2016, 10:21:27 AM12/13/16
to
On 12/13/16 9:33 AM, Stephen Hoffman wrote:
> On 2016-12-12 20:55:07 +0000, Phillip Helbig (undress to reply) said:
>
>> In article <00B137E6...@SendSpamHere.ORG>, VAXman-
>> @SendSpamHere.ORG writes:
>>
>>> Which is why I have installed one of the extended keyboards on my
>>> tablet. ;)
>>
>> Bluetooth?
>
> Most Bluetooth keyboards will work with iOS, and with Bluetooth-capable
> Macs.
>
>> Is there a way to get an LK 411 to work with an iPad?
>
> LK411 used the old PS/2 keyboard connection, and not anything that Apple
> or anybody else particularly uses in recent times. It was neither USB
> nor Bluetooth. So... No. No LK411. Not without building your own
> adapter to USB or Bluetooth.
>

Build or buy.


http://lifehacker.com/convert-any-usb-keyboard-to-bluetooth-with-a-diy-adapte-1786324129

http://handheldsci.com/kb

https://learn.adafruit.com/convert-your-model-m-keyboard-to-bluetooth-with-bluefruit-ez-key-hid/code

And probably a dozen other examples.

bill

Phillip Helbig (undress to reply)

unread,
Dec 13, 2016, 12:21:31 PM12/13/16
to
In article <o2p0pf$baj$1...@dont-email.me>, Stephen Hoffman
<seao...@hoffmanlabs.invalid> writes:

> > Is there a way to get an LK 411 to work with an iPad?
>
> LK411 used the old PS/2 keyboard connection, and not anything that
> Apple or anybody else particularly uses in recent times. It was
> neither USB nor Bluetooth. So... No. No LK411. Not without building
> your own adapter to USB or Bluetooth.

I guess that can't be bought off the shelf. :-(

> The Apple extended wired Keyboard is pretty close to an LK layout, and
> does work with iOS via adapter.
>
> If you scrounged one of the LK USB keyboards such as the LK463, that
> will probably (mostly) work, but I'd expect that the DEC-specific keys
> will not be supported by iOS nor by the terminal emulator.

Yes, the LK463 is essentially the usual VMS keyboard with USB. However,
if the DEC-specific keys don't work, there is little point.

Paul Sture

unread,
Dec 13, 2016, 6:54:15 PM12/13/16
to
I have a Logitech keypad for my iPad, and I can see the value. Some
folks have managed to migrate away from their desktops to tablets,
though the ones I have come across are more writers than developers.

To answer Phillip's question, it's doubtful without writing your
own low level software. Given the existence of Apple and third
party keyboards for the iPad which use Bluetooth, that's one way
forward but would need some kind of adaptor for the LK 411.

--
Irregular English verbs:
I have an independent mind
You are an eccentric
He is round the twist

seasoned_geek

unread,
Dec 14, 2016, 11:38:54 AM12/14/16
to
On Tuesday, December 13, 2016 at 8:33:41 AM UTC-6, Stephen Hoffman wrote:
>
> LK411 used the old PS/2 keyboard connection, and not anything that
> Apple or anybody else particularly uses in recent times. It was
> neither USB nor Bluetooth. So... No. No LK411. Not without building
> your own adapter to USB or Bluetooth.
>

One can purchase PS/2 to USB adapters for less than $8.

http://www.newegg.com/Product/ProductList.aspx?Description=ps2%20usb%20adapter&Submit=ENE

I have several for the rare times I encounter a machine which doesn't have PS/2 connectors. Those two machines would be my netbook and laptop. I don't buy desktop machines without PS/2 connectors. I'm actually typing on an 8100 Elite SFF that has an i7 quad-core in it yet came with PS/2 __and__ an actual honest-to-God serial port. Not some take-up-a-slot, barely-works-some-of-the-time serial card, but an actual serial port that works with every flavor of Linux I have.

Stephen Hoffman

unread,
Dec 18, 2016, 1:07:42 PM12/18/16
to
On 2016-12-14 16:38:52 +0000, seasoned_geek said:

> On Tuesday, December 13, 2016 at 8:33:41 AM UTC-6, Stephen Hoffman wrote:
>>
>> LK411 used the old PS/2 keyboard connection, and not anything that
>> Apple or anybody else particularly uses in recent times. It was
>> neither USB nor Bluetooth. So... No. No LK411. Not without building
>> your own adapter to USB or Bluetooth.
>
> One can purchase PS/2 to USB adapters for less than $8.
>
> ...
>
> I have several for the rare times I encounter a machine which doesn't
> have PS/2 connectors. Those two machines would be my netbook and
> laptop. I don't buy desktop machines without PS/2 connectors.

Setting out to purchase what is now long-outdated gear is not a
particularly robust path forward.

> I'm actually typing on an 8100 Elite SFF that has an I7 quad-core in it
> yet came with PS/2 __and__ an actual honest to God serial port. Not
> some take up a slot barely works some of the time serial card, but an
> actual serial port that works with every flavor of Linux I have.

Yes, adapters which usually don't support the LK keys and, having tried
more than a few, sometimes don't work at all with LK keyboards. LK
keyboards and OpenVMS itself use keyboard modes that most PS/2 systems
and software don't, and that adapter vendors may or may not have tested
with. For details, see the scan code set 3 discussions around KVMs. But
again, I've tried a half-dozen examples of these adapters and had zero
success. And that's without the added 30-pin or Lightning connection
involved...
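For the curious, the mode in question is PS/2 scan code set 3, which the host selects with keyboard command 0xF0; many adapters and KVMs presumably only ever exercise the default set 2, which would explain the mixed results. A rough sketch of that handshake is below; the send_byte callback and the FakeKeyboard are stand-ins for real PS/2 bus access, not anything DEC- or adapter-specific.

# Sketch of the PS/2 "select scan code set" handshake (host command 0xF0)
# that a home-built LK-to-USB or LK-to-Bluetooth adapter has to get right.
# send_byte() stands in for real PS/2 bus access (microcontroller firmware,
# bit-banged clock/data); the FakeKeyboard exists only so the sketch runs.

ACK = 0xFA                   # keyboard acknowledges each host byte with 0xFA
RESEND = 0xFE                # keyboard asks for a resend of a bad byte
CMD_SELECT_SCAN_SET = 0xF0   # argument 1, 2 or 3 selects a set; 0 queries it

def select_scan_set(send_byte, wanted=3):
    """Ask the keyboard to switch to the given scan code set (3 for LK-style use)."""
    if send_byte(CMD_SELECT_SCAN_SET) != ACK:
        raise IOError('keyboard did not ACK the 0xF0 command')
    if send_byte(wanted) != ACK:
        raise IOError('keyboard refused scan code set %d' % wanted)

class FakeKeyboard:
    """Stand-in for a real keyboard so the handshake above can be exercised."""
    def __init__(self):
        self.expect_argument = False
        self.scan_set = 2                  # PC keyboards default to set 2
    def send_byte(self, byte):
        if self.expect_argument:           # previous byte was 0xF0
            self.expect_argument = False
            self.scan_set = byte
            return ACK
        if byte == CMD_SELECT_SCAN_SET:
            self.expect_argument = True
            return ACK
        return RESEND                      # other commands not modelled here

if __name__ == '__main__':
    kbd = FakeKeyboard()
    select_scan_set(kbd.send_byte, wanted=3)
    print('fake keyboard now in scan code set', kbd.scan_set)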

seasoned_geek

Dec 19, 2016, 9:05:46 AM
to
On Sunday, December 18, 2016 at 12:07:42 PM UTC-6, Stephen Hoffman wrote:
> >
> > I have several for the rare times I encounter a machine which doesn't
> > have PS/2 connectors. Those two machines would be my netbook and
> > laptop. I don't buy desktop machines without PS/2 connectors.
>
> Setting out to purchase what is now long-outdated gear is not a
> particularly robust path forward.
>

PS/2 is only long dead according to non-tech companies like Apple. I was on multiple shopping sites this past month configuring new machines for purchase, and they still come with PS/2. One was a small-form-factor HP desktop.

Ahhh, Apple, butcher of CUPS: on a whim it ripped support for robust, proven PostScript out of the package without warning, replacing it with PDF support, where PDF is more of a "generalized guide" without any significant rules, thus breaking _every_ Linux printer driver in every package. Adding insult to injury, without warning or communication to the community they also ripped out support for serial and parallel ports, simply because their worthless products didn't include them.

> > I'm actually typing on an 8100 Elite SFF that has an I7 quad-core in it
> > yet came with PS/2 __and__ an actual honest to God serial port. Not
> > some take up a slot barely works some of the time serial card, but an
> > actual serial port that works with every flavor of Linux I have.
>
> Yes, adapters which usually don't support LK keys, and — having tried
> more than a few — sometimes don't work at all with LK keyboards. LK
> keyboards and OpenVMS itself uses keyboard modes that most PS/2 systems
> and software doesn't, and that adapters may or may not have been tested
> with. For details, see the scanset 3 discussions around KVMs. But
> again, I've tried a half-dozen examples of these adapters, and had zero
> success. And that's without the added 30-pin or Lightning connection
> involved...

Hmmm... methinks you meant this response to go with the previous one and didn't mean to paste in/leave the serial-port comment above it, since your reply says nothing about serial ports.

I'm in no position to do any testing. I tossed out my last two LK-series keyboards during an electronics recycling drive several years ago; one was still new in the box. The last time I looked into it and enlisted a few low-level geeks from the geek channels, the consensus was that the LK not working under Linux on the el-cheapo x86 family of chips wasn't a hardware problem but a driver problem. None of us took the time to fix it, though; it was one of those take-an-hour-to-test-it things. As I remember, all of the keystrokes were getting into the driver via the PS/2 port, but the driver was tossing whatever it didn't understand.
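For what it's worth, on current Linux kernels that symptom usually shows up as atkbd "Unknown key pressed" lines in the kernel log, and the usual workaround is to map the offending scancodes with setkeycodes(8) rather than patch the driver. Below is a small sketch that turns such log lines into suggested setkeycodes commands; the exact log wording varies by kernel version, so the regex, and the example keycode 187 (KEY_F17), are assumptions to adjust.

#!/usr/bin/env python3
# Turn atkbd "Unknown key pressed" kernel-log lines into setkeycodes(8)
# suggestions, so scancodes the driver would otherwise toss get mapped.
# The log format is typical of Linux atkbd but varies by kernel version,
# so the regex is an assumption; 187 (KEY_F17) is just an example target.
import re
import sys

UNKNOWN_KEY = re.compile(r'Unknown key pressed \(.*code (0x[0-9a-f]+)', re.I)
EXAMPLE_KEYCODE = 187    # KEY_F17 in the kernel keycode table; pick your own

def main():
    seen = set()
    for line in sys.stdin:                  # e.g. pipe `dmesg` into this script
        match = UNKNOWN_KEY.search(line)
        if not match:
            continue
        scancode = match.group(1)
        if scancode in seen:
            continue
        seen.add(scancode)
        bare = scancode[2:]                 # setkeycodes wants bare hex digits
        print('setkeycodes %s %d    # map unknown scancode %s'
              % (bare, EXAMPLE_KEYCODE, scancode))

if __name__ == '__main__':
    main()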

Tests, of course, happened on real computers, not Apple products with their obsolete BSD implementation.

Stephen Hoffman

Dec 19, 2016, 1:15:17 PM
to
On 2016-12-19 14:05:41 +0000, seasoned_geek said:

> On Sunday, December 18, 2016 at 12:07:42 PM UTC-6, Stephen Hoffman wrote:
>>>
>>> I have several for the rare times I encounter a machine which doesn't
>>> have PS/2 connectors. Those two machines would be my netbook and
>>> laptop. I don't buy desktop machines without PS/2 connectors.
>>
>> Setting out to purchase what is now long-outdated gear is not a
>> particularly robust path forward.
>>
>
> PS/2 is only long dead according to non-tech companies like Apple. I
> was on multiple shopping sites this past month configuring new machines
> for purchase and they still come with PS/2. One was a small form factor
> HP desktop.

Other companies that refer to the PS/2 stuff as legacy ports include
Microsoft, and there have been similar statements from companies as far
back as Compaq circa 2000, with its legacy-free iPaq desktop boxes.
Having deployed those, I can say they were nice boxes, too. But I
digress.

https://en.wikipedia.org/wiki/Legacy_port
https://en.wikipedia.org/wiki/Legacy-free_PC
https://en.wikipedia.org/wiki/PC_System_Design_Guide

Etc.

But then I'm half expecting to see requests to reinstate more than a
few other legacy tools, interfaces, and software packages. Trying to
swim upstream against the tide has always been popular in IT, after
all. That approach even works for a while. Then everybody involved
gets tired of the effort, and the {whatever} gets replaced.

seasoned_geek

Dec 28, 2016, 10:36:41 AM
to
Thankfully it will be at least 50 years, if not longer, before that has a chance to pass. I realize Apple and Microsoft may wish for it, and Apple certainly committed a crime against humanity by removing serial and parallel support from CUPS while at the same time replacing proven PostScript, which actually had a standard, with the artsy-fartsy, barely-a-guideline-exists PDF data requirement WITHOUT TELLING ANYONE BEFORE RELEASING IT. But hey, those two non-tech companies really don't matter.

Nobody in the embedded-systems world, especially the medical-device world, would _ever_ put anything made by either of those two companies in an FDA-regulated product. If the prison time didn't kill you, the lawsuits from the late-night-television-commercial lawyers would take everything you ever hoped to have, including personal assets, once they pierced the veil of corporate protection, which can happen with medical products; hence the television-commercial lawyers.

The legacy USB port cannot function in an industrial environment: it cannot be shielded well enough, it is not protected against vibration, and it cannot transmit far enough. Some of the devices I've written code for are in automotive plants at locations where, quite literally, there is 3-phase power arcing overhead. Some of my scale software is most likely still in use at landfills today. Those links _had_ to be long-pull, slow-baud RS-232 so the inbound and outbound scales could both transmit back. Wireless anything was not an option, because right beside the scale area were multiple GE jet engines burning off enough landfill gas to generate electricity for 10,000 or so homes. At least that was the first site we did it for, and they planned to roll it out to all sites, because selling electricity made the landfills more profitable and got the socks-with-sandals crowd off their back.
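For flavor, polling a weight indicator over that kind of long, slow serial run looks roughly like the sketch below. It assumes the third-party pyserial package, and the port name, the 1200 baud rate, and the 'ST,GS,...' frame format are all stand-ins, since every scale indicator speaks its own dialect.

#!/usr/bin/env python3
# Rough sketch of polling a truck-scale indicator over a long, slow RS-232
# run.  Assumes the third-party pyserial package; the port name, 1200 baud,
# and the frame format are stand-ins -- real indicators each have their own
# protocol, usually documented in the vendor manual.
import serial

PORT = '/dev/ttyS0'     # the long-pull RS-232 line out to the scale house
BAUD = 1200             # slow on purpose: long cable, electrically noisy site

def read_weight(ser):
    """Read one frame from the indicator and pull the weight out of it."""
    frame = ser.readline().decode('ascii', errors='replace').strip()
    # hypothetical frame: 'ST,GS,+00123.4,kg'  (status, gross, weight, unit)
    fields = frame.split(',')
    if len(fields) < 4 or fields[0] != 'ST':
        return None                       # motion, error, or garbled frame
    return float(fields[2]), fields[3]

if __name__ == '__main__':
    with serial.Serial(PORT, BAUD, timeout=2) as ser:
        reading = read_weight(ser)
        if reading:
            weight, unit = reading
            print('gross weight: %.1f %s' % (weight, unit))
        else:
            print('no stable reading')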

I really really really really wish I had not given away my first edition of this book:

http://www.barnesandnoble.com/w/c-programmers-guide-to-serial-communications-joe-campbell/1000156675?ean=9780672302862

I did that when I got the later edition. The first edition had a beautiful history of RS-232 and the 25-pin connector. Can anyone guess where that pin-out and connector came from? The industrial revolution. Before there was much, if any, fiction about something we now call a computer, there was a 25-pin industrial control standard with 2 anchor screws baked into the standard.

The first edition had such a perfect, well-researched and well-referenced chapter on that. I kick myself every month for having let that copy go, and I have not been able to find another first edition. Why did I let it go? Because I was working tons on VMS, having basically gotten out of the embedded world. For the past decade I've been back in the embedded world.

Most of the places making industrial-quality embedded systems, even the vending machines that go into factory environments, purchase "engineering desktops." Most of those come from HP, because HP must still have one engineer; Apple and Microsoft certainly don't. An "engineering desktop" has a physical serial port along with PS/2 connectors. Why the PS/2 connectors? Because you can shield those, and they don't suffer from vibration like the legacy USB ports do.

You know what I really find funny? When you buy the cheaper USB sticks, the ones without a shroud that just flick open, they look exactly like a hunk of ISA bus.