80s and 90s


Francesco Nigro

Sep 16, 2015, 2:00:22 AM
to mechanical-sympathy
Hi everyone!
In a recent presentation (http://www.slideshare.net/c24tech/john-davies-high-performance-java-binary-from-javazone-2015?utm_campaign=Javazone&utm_content=20914085&utm_medium=social&utm_source=twitter), one piece of advice on one of the last slides hit me hard:
"Start coding intelligently like we used to in the 80s and 90s"
I'm an almost-young programmer working at a company that lives off building (J2EE/Spring) portals, and on paper I can't disagree with the advice, given the environment surrounding me. But I didn't live through the 80s/90s... do any of you have a thought or an experience to share?
It's important... if the feedback is good, I could ask Mr. T to change the name of the group to Vintage Sympathy :)

Henri Tremblay

Sep 16, 2015, 10:39:34 AM
to mechanica...@googlegroups.com
Quick feedback:

0x10 is not 32. It is 16... Not a good way to start a presentation.

However, they use Censum and JMH. Good for them :-)

To be serious:
  • They are right: Java is really memory intensive, and we used to work in a much more compact style in the 90s and 80s (but I'm more of a 90s guy)
  • Reducing memory usage by putting data in byte buffers is indeed a good solution. It is only to be done when really needed, because it still makes the code more cryptic. However, it really reduces the allocation rate and the pressure on the GC
  • As a side note, value types are supposed to help with that too
However, they are probably hiding a thing or two. On a modern CPU, you need to pad against false sharing and stay aligned on memory bus sizes. So sometimes you need to be a bit bigger to keep being faster. They are probably doing it but not mentioning it here.
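The padding trick can be sketched in a few lines. This is a minimal illustration assuming 64-byte cache lines; the class and field names are invented, and Java 8's @sun.misc.Contended annotation (with -XX:-RestrictContended) automates the same idea:

```java
// Minimal sketch of cache-line padding, assuming 64-byte lines.
public class PaddedCounter {
    // Seven longs (56 bytes) on each side keep `value` from sharing
    // a cache line with a neighbouring counter, avoiding false sharing.
    long p1, p2, p3, p4, p5, p6, p7;
    volatile long value;
    long q1, q2, q3, q4, q5, q6, q7;

    void increment() { value++; } // safe for a single writer thread
}
```

Note the deliberate trade-off: the object is larger, but writers on different counters no longer invalidate each other's cache lines.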

But overall, not a bad presentation to put things in perspective.

Cheers,
Henri


--
You received this message because you are subscribed to the Google Groups "mechanical-sympathy" group.
To unsubscribe from this group and stop receiving emails from it, send an email to mechanical-symp...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Francesco Nigro

Sep 16, 2015, 2:56:35 PM
to mechanical-sympathy
Thanks for sharing, Henri,

There are one or two errors/omissions in the presentation, but the main ideas come across more or less...
What really intrigues me is that sentence... maybe this is the first time in history that we have too much power for our (simple?) tasks. And the shadow of "big data" is a sort of Occam's razor that raises the level of the challenge for programmers (or the hardware?) again, sending us back to the 80s/90s... but with (I hope) the experience gained in the years since...

Henri Tremblay

Sep 16, 2015, 3:40:40 PM
to mechanica...@googlegroups.com
Oh! Now we are talking philosophy :-)

In the 80s/90s we were thinking a bit too much about the technical issues. Today, we are able to code many more things in a really short time. However, this leaves more room for overdesign. And people tend to think less about performance. Pretty much all of Java was built under the assumption that Moore's law would soon make the performance issues irrelevant (UTF-16 characters, synchronized collections, multi-threading, ...). We soon discovered that it wasn't true, but we are now stuck with the legacy.

Also, over the years, something funny tends to happen. People rediscover things grandpa used to know. For instance, parallel and functional programming used to be common in the 60s and 70s. It disappeared in the 80s and 90s, so people forgot about it. It is now coming back because we need it. Same thing for BigDecimal. Anyone who has worked on a banking application in Cobol knows that you need a decimal type. But people forgot about it, which is why it was added as an afterthought in Java (and it sadly isn't a primitive type...) (and yes, I know some on this list are BigDecimal haters who much prefer dealing with floating-point rounding issues. They have really good reasons to be)
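The floating-point rounding issue behind the BigDecimal debate is easy to demonstrate (a tiny illustrative snippet, not from the presentation; the class name is made up):

```java
import java.math.BigDecimal;

public class DecimalDemo {
    public static void main(String[] args) {
        // Binary floating point cannot represent 0.1 or 0.2 exactly:
        System.out.println(0.1 + 0.2); // prints 0.30000000000000004
        // BigDecimal built from strings keeps exact decimal semantics:
        System.out.println(new BigDecimal("0.1").add(new BigDecimal("0.2"))); // prints 0.3
    }
}
```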

But he got something right. A BigData problem might need a big data answer. But the answer can also be to transform the BigData problem into a smaller data problem ;-) And we should never forget that.


Greg Young

Sep 16, 2015, 3:44:35 PM
to mechanica...@googlegroups.com
"Also, along the years, something funny tend to happen. People
rediscover things grandpa used to know. For instance, parallel and
functional programming used to be common in the 60s and 70s. It
disappeared in the 80s and 90s. So people forgot about it. It is now
coming back because we need it. Same thing for BigDecimal. Anyone who
has worked on a banking application in Cobol knows that you need a
decimal type. But people forget about it. Which is why it was added as
an afterthought in Java (and it sadly isn't a primitive type...) (and
yes, I know some in this list are BigDecimal haters that much prefer
deal with floating points rounding issues. They have really good
reasons to be)"

To be fair, I worked on banking/gambling software in the 90s, and no, we did not use BigDecimals; we used "pennies" and handled things as integers (usually).
--
Studying for the Turing test

Michael Barker

Sep 16, 2015, 4:30:11 PM
to mechanica...@googlegroups.com
To be fair I worked in banking/gambling software in the 90s and no we
did not use BigDecimals we used "pennies" and handled things as
integers (usually).

We still do: 64-bit milli-pennies.
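The "pennies as integers" style can be sketched roughly like this (the class, method names, and scale here are illustrative, not anyone's actual system):

```java
// Illustrative sketch: amounts held as 64-bit milli-pennies
// (1/1000 of a penny), so sums and comparisons stay exact.
public class MilliPennies {
    static final long MILLIS_PER_PENNY = 1_000L;
    static final long MILLIS_PER_POUND = 100L * MILLIS_PER_PENNY;

    static long fromPoundsAndPence(long pounds, long pence) {
        return pounds * MILLIS_PER_POUND + pence * MILLIS_PER_PENNY;
    }

    // Integer division truncates; a real system would distribute the
    // remainder explicitly (e.g. largest-remainder allocation).
    static long splitEvenly(long amountMillis, int ways) {
        return amountMillis / ways;
    }
}
```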

Vitaly Davidovich

Sep 16, 2015, 4:40:10 PM
to mechanical-sympathy

As the saying goes "the more things change, the more they stay the same ..."

Machines have gotten more powerful, but the workloads and performance expectations have gone up quite a bit as well.  With mobile, power consumption is also at the fore, even in consumer devices.

sent from my phone

Greg Young

Sep 16, 2015, 4:43:49 PM
to mechanica...@googlegroups.com
Last place I worked with such systems, we did the same. As the next post says, the more things change, the more they stay the same. If someone had suggested floats, they would have been met with many raised eyebrows; the same goes for BigDecimal.

Michael Barker

Sep 16, 2015, 4:54:11 PM
to mechanica...@googlegroups.com
Although, I'd be very happy if our CPUs supported a Decimal64 type.

Mike.

Kirk Pepperdine

Sep 16, 2015, 6:19:06 PM
to mechanica...@googlegroups.com
On Sep 16, 2015, at 9:40 PM, Henri Tremblay <henri.t...@gmail.com> wrote:

Oh! Now we are talking philosophy :-)

In the 80/90s we were thinking a bit too much about the technical issues. Today, we are able to code much more things in a really short time.

Agreed, also the level of skill required to build complex systems today is much lower.

Also, along the years, something funny tend to happen. People rediscover things grandpa used to know. For instance, parallel and functional programming used to be common in the 60s and 70s. It disappeared in the 80s and 90s.

I’d disagree here. What is commonly known as functional programming is what I’d call standard coding in a language that treats code as a first-class citizen. The popular programming languages we use today don’t do this, and thus we need structures, constructs, and programming idioms to make up for this fundamental flaw in their design. FP makes about as much sense in Smalltalk as selling snow in Tuktoyaktuk would; it’s just a natural part of the environment.

So people forgot about it. It is now coming back because we need it. Same thing for BigDecimal. Anyone who has worked on a banking application in Cobol knows that you need a decimal type. But people forget about it. Which is why it was added as an afterthought in Java (and it sadly isn't a primitive type...) (and yes, I know some in this list are BigDecimal haters that much prefer deal with floating points rounding issues. They have really good reasons to be)

Exposing primitives in Java (the language) was a mistake. Smalltalk used primitives; you just never saw them in the code. A double-edged sword: Smalltalk numbers had an infinite range of values. Of course there was a performance hit when you moved past the boundary that could be supported by a primitive. However, you still see that today if you think you need to use something like a BigDecimal.


But he got something right. A BigData problem might need a big data answer. But the answer can also be to transform the BigData problem into a smaller data problem ;-) And we should never forget that.

I would say that John got a lot right. The advantages of SDO are huge. On one system alone they went from 80 servers to 2 simply by switching to SDO’s binary FPML format. I helped John take his benchmark from 400,000 TPS to 25,000,000 TPS with a few tweaks here and there and some brilliant work on his team’s part. In more practical situations where he is forced to use BigDecimal, he can easily maintain 6-7,000,000 TPS.

I’d say it’s back to binary!!!

Kind regards,
Kirk

Henri Tremblay

Sep 16, 2015, 9:04:59 PM
to mechanica...@googlegroups.com
:-)

Let me clear up some misinterpretations just for the sake of clarity.

Yes :-) For games, we used fixed point (so basically pennies). In Cobol it depends. There was a decimal type at some point, but otherwise they were using fixed point too. And I was also using an integer value + an integer scale for international banking (because the precision varies). One of my nice stories about that involves my boss in the early 2000s. He was having a meeting with the architecture team (a really bad and naive team; they were all fired some months later) because all the calculations tended to be just a bit off. After 15 minutes, he came out of the meeting yelling "That's nonsense! I've been using integers for amounts for 30 years! What is this float you are talking about?!?" I went to calm him down and explain what they were talking about... and then went to the architecture team to give them a little math course...

About FP, I haven't said it was well used or implemented. I said it was rediscovered ;-)

Exposing primitives was a big mistake. In fact, I don't really want BigDecimal to be a primitive. I just want to be able to do bd1 + bd2, like I can in C# (also, nullable types in C# have the syntactic sugar int?, which I would love too... but that's another story)

And don't get me wrong: the presentation is great and I love C24. They have wonderful products that I would have loved to have when I was writing my own SWIFT parser, because Tibco was selling one that just didn't work. They also used to give away beer-opener USB keys. Hard not to like them ;o)
 

Kirk Pepperdine

Sep 17, 2015, 2:43:36 AM
to mechanica...@googlegroups.com

> On Sep 17, 2015, at 3:04 AM, Henri Tremblay <henri.t...@gmail.com> wrote:
>
> :-)
>
> I'll remove some misinterpretation just for the sake of clarity.
>
> Yes :-) For games, we used fixed point (so basically pennies). In Cobol it depends. There was a decimal type at some point. But otherwise they were using fixed points too. And I was also using Integer value + Integer scale for international banking (because the precision varies). One of my nice stories about that was my boss in the early 2000s. He was having a meeting with the architecture team (a real bad and naive team. They were all fired some months after). A meeting because all calculations tend to be just a bit of. After 15 minutes, he went out of the meeting yelling "That's nonsense! I've been using Integer for amount for 30 years! What is that float you are talking about?!?" I went to calm him and explain what they were talking about... and then went to the architecture team to give them a little math course…

Indeed, but of course we all know that 16-bit representations of decimal data leave huge gaps on the number line at any reasonable level of precision.

>
> About FP, I haven't said it was well used or implemented. I said it was rediscovered ;-)
>
> Exposing primitive was a big mistake. In fact, I don't really want BigDecimal to be a primitive. I just want to be able to do bd1 + bd2. Like I can in C# (also, optionals in C# have a syntactic sugar int? which I would love too... but that's another story)

indeed….
>
> And don't get me wrong, the presentation is great and I love C24. They have wonderful products that I would love to have had when I was doing my own swift parser because Tibco was selling one that just wasn't working. They also used to give beer opener USB keys. Hard to not like them ;o)

Been working with binary-formatted data streams since forever. In fact, when working with Cray systems, we turned everything into a binary stream. Funny, today we turn everything into a text stream. And I wonder why the Cray system still seems faster than anything we have today despite the slower clocks. Maybe it’s time playing tricks on me, but I could compile a $6i7 load of code before I could get my finger off the return key.

Regards,
Kirk


Eric DeFazio

Oct 2, 2015, 10:05:15 AM
to mechanical-sympathy
Hello Francesco,

I find the activity of "Rapid Prototyping" in Java fun (I think this is what Java excels at); it's easy to:
  • take an idea in my head
  • convert it to working code to get it to do something for me
  • show the code to someone else (even a Junior developer could use it, understand it, and support it)
  • allow me to trace / debug the code (without errant pointers and segfaults)
I think the Virtual Machine is a wonderful piece of engineering. 
Unfortunately, the problems I have with Java stem partly from its "strengths" and partly from the "industry".
  • The conventional OO "easy way" of doing things has abstraction overhead (you pay for the luxury of having a Garbage collector and Object Abstraction)
  • Most Java developers I've met have only a very cursory understanding of the cost of the Objects and the Garbage Collector
  • People too often point and scream "Premature optimization!..." (to cover up for their lack of understanding) and are overconfident that their working-but-bloated-mess-with-every-conceivable-design-pattern-and-open-source-project is "well-architected", instead of writing concise, readable, and efficient code with minimal dependencies.
  • The industry seems to endorse "Invented Here" (https://en.wikipedia.org/wiki/Invented_here) and jar-hell (Junior developers are "more productive" when they can use an external jar/API, but they don't really understand how to best use, configure, and optimize the library, or the "cost" of said library)
  • Most Java code has been  written by "junior" developers (Java is easy to teach, and they can get things done fast, therefore there is lots of "opportunities for improvement")
  • "Throw-away" prototypes often end up in production (it's easy to get something working, much harder to get things working efficiently)
  • "Throw more hardware at it" (especially in the cloud) approach often becomes an "out" for poorly designed systems.
  • Periodic Rolling restarts becomes another "out" for covering up memory leaks and bad engineering
Anyways, I appreciate that in Java (if you care about real engineering performance) you have the OPTION of doing things like:
  • going off-heap (Unsafe) (but it's hidden away so Junior Developers don't trip over themselves)
  • reading/writing directly to Buffers   
  • etc.
Or, if you want to sling some code real quick you can do so and it'll run "good enough"
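The "reading/writing directly to buffers" option above can be sketched as a fixed-layout record in a direct (off-heap) ByteBuffer; the record name and field layout here are invented for illustration:

```java
import java.nio.ByteBuffer;

// Illustrative flyweight: one trade record packed at fixed offsets in a
// direct ByteBuffer, instead of a graph of objects the GC must trace.
public class TradeBuffer {
    static final int ID_OFFSET = 0, QTY_OFFSET = 8, PRICE_OFFSET = 12, SIZE = 20;
    final ByteBuffer buf = ByteBuffer.allocateDirect(SIZE);

    void write(long id, int qty, long priceMilliPennies) {
        buf.putLong(ID_OFFSET, id)
           .putInt(QTY_OFFSET, qty)
           .putLong(PRICE_OFFSET, priceMilliPennies);
    }

    long id()    { return buf.getLong(ID_OFFSET); }
    int  qty()   { return buf.getInt(QTY_OFFSET); }
    long price() { return buf.getLong(PRICE_OFFSET); }
}
```

The cost is exactly the crypticness mentioned earlier in the thread: field access becomes offset arithmetic, but allocation per record drops to zero.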

To use a cooking analogy: Java lets me be a chef; the JIT (sous chef) and GC (dishwasher) handle the cleanup efficiently, and I can focus on the problem/algorithms...

Unfortunately, for high-performance "Big Data" types of problems, where the dishes pile up faster than the dishwasher can get to them, you run into problems, and (conventionally designed) Java programs can be the cause of the pile-up.

james bedenbaugh

Oct 12, 2015, 10:03:35 PM
to mechanical-sympathy
At the risk of exposing my advanced age in this industry. . .

I started messing with coding in the late 70's. I came of age in the mid-80's with my first gigs at a big Fortune 100 company. I started coding with the usual suspects on IBM 360 and 370 machines:
  • COBOL
  • RPG II/III
  • FORTRAN
  • ASSEMBLER
  • BASIC
I even still know the Hollerith Code, what an over-punch is and why Assembler must be used in banking applications.
 
Scripting was confined to. . .well, we didn't have much. JCL was the closest thing we had. There were some nice 4GL tools we could use - such as DYL-260 and then DYL-280 - that made writing quick-and-dirty apps easy.  I did a lot of early work with dBase II and III and some other PC-based packages, but they were departmental apps. The reality is that many software companies made coding a cheap commodity. When I coded banking applications in Assembler, it was because fractions of pennies are important and COBOL's rounding techniques were not robust enough.

And RAM? I was around when it was said that no one would ever need a computer in their house, or that 640K of RAM should be enough for anyone. The closest I came to OO was Lisp, but there was ZERO interest in the industry for it. Procedural code was the only game in town. The only way you could do kewl stuff was to work on Defense contracts or with NASA.

There were no RDBMSs to speak of - DB2 and Oracle were still in their infancy - data was accessed (if you were lucky) via keyed access such as VSAM, but ISAM was still around. IMS databases were a big deal. Flat files abounded. Ever heard of DOS/VSE? I can still remember when a 16GB drive with controller, etc., from IBM cost about <Austin Powers mode>$1 million dollars</Austin Powers mode> apiece. I can get that on a flash drive now for under $10 and it's faster. Forget about networks - phone lines were so slow that it could take over a minute for a single page - in ASCII text - to appear.

The bottom line was that we were thrifty with all our resources because we had to be. I got one of the first 286, 386 and 486 machines - each time thinking I had gone to developer heaven because they were so "Big". I can remember getting a hard drive on my PC - I was still using 8" floppies on TRS-80 machines and the old Xenix boxes. A 5MB drive was. . .awesome.

Hardware has gotten so fast and cheap that tight code almost doesn't exist unless you are using C or Assembler, and even then it's iffy. I'm glad to see fellows like Gil and Martin taking this on in languages like Java. OO is great, but I work with in-memory data storage all the time and you wouldn't believe the things I see coded as data structures.

The good news is that a lot of the work we did back then translated into good habits today. For instance, a LOT of design patterns you see and hear about weren't invented in the 90's or 2000's - we did that stuff decades ago - someone just discovered what we did and gave it a name. 

I do miss the old code reviews. Junior developers (anyone with less than 10 years' experience) were subject to brutal reviews by Senior Developers (usually 15-20 years' experience) - I saw people coming out of those almost crying because their code had been ripped apart. There was a joke back then that programmers ate their own. Those events quickly separated those who could make it in the industry from those who couldn't. Today you'd be fired for conducting those types of reviews - too harsh for today's sensitive work environment - but the bottom line was we wrote better, tighter, and cleaner code (whatever that means).

Ok, back to yelling at the neighbors kids to get off my grass. . .

:)~

Martin Thompson

Oct 13, 2015, 4:51:24 AM
to mechanical-sympathy
Hi James,

Thanks for the nice read this morning. :-)

I started a bit later than you; most of my professional coding began in the early 90s. The brutal code reviews still lingered on then. Imagine what it is like having a core Tandem Computers developer assigned as your mentor. Luckily for me, he was a great coach and not brutal on the young me.

One observation I would like to make on the review and feedback point. I think it is so important to give feedback, and it cannot be too fluffy, otherwise it is vague. If we want our industry to advance and be more inclusive, it would help if we could be more focused on the actions and the work product, not the person. I spent some time in the late 90s working with security and cryptography professionals. I found it really enlightening how they took an "adversarial" approach to design review, but up front it was always made clear that the attack is on the work and not the person.

We need to get more comfortable giving feedback that encourages learning, while learning ourselves at the same time. Some of the most effective programmers I know today were not always so gifted. They are relentless learners. When we start most software projects there is always so much we do not know. Why not just get comfortable admitting this and plan accordingly? Praise people for seeking out and mitigating the risks and unknowns. Current location is nowhere near as important as the vector towards the goal.

We need to learn from our history and combine that with the fact that every project is new. It is all about learning. Not knowing some things, or making mistakes, is a natural part of learning. To "beat" people up for this is just immature and naive; it could even be construed as bullying. If people are encouraged to learn and coaching is provided, and someone still does not advance, then maybe they are not cut out for this. If we just give brutal feedback, then we narrow the field to those who can both take the feedback and have the ability; that is a limited and self-reinforcing selection.

That's my old man rant for this morning. Now back to sitting on my porch and finishing that presentation for Joker Conf.

Martin...

p.s. James I remember Xenix and how it fell to Linux :-)

Kirk Pepperdine

Oct 13, 2015, 7:32:23 AM
to mechanica...@googlegroups.com
Hi James,

I was going to ask if you’d done any IMP or Register IMP. As for OO, I started with it in the mid-80s. A couple of us who graduated together managed to get the official waterfall methodology tossed from a government department by completely crushing the engineering department’s delivery schedule. We delivered ahead of time (for one piece it was 6 months ahead on what was scheduled to be a year-long effort). It got everyone in the other departments we partnered with very interested in what we were doing, and they started adopting OO also. At the time, we were just having fun writing software in a very agile manner without realizing how disruptive we were being. The catalyst in this case was Dave (Smalltalk) Thomas, though I don’t think he knows much, if anything, about what he unleashed on the many departments where we eventually had a hand in changing how they wrote and delivered software.

Regards,
Kirk


james bedenbaugh

Oct 13, 2015, 10:59:27 AM
to mechanical-sympathy
Martin, Thanks for the reply.

I guess the operative phrase for me was "the code was ripped apart." The problem for most developers is that the code is so much a part of them that they can't help feeling personally attacked. I used to be this way, but not so much anymore. Now I just laugh at myself.

Your point concerning aggressive learning is very important - my grandchildren sometimes look at me with awe at my overall knowledge - but I always tell them the same thing: It's not that I'm smart, I've just lived a long time and I paid attention. 

I'm really interested in seeing what happens in the next 10 years with the industry. I've seen it turn over in about 4 major evolutionary leaps during my career. We are in the midst of the 5th and I've noticed the time between each leap has been compressed. Anyway, good luck at the conference. 

james bedenbaugh

Oct 13, 2015, 11:08:29 AM
to mechanical-sympathy
Hi Kirk. Never have done any of that.

I remember the early tools we used for project delivery - I remember the first PERT chart software I used. Yep - the method that delivered the Polaris submarine. I guess in those days the engineering styles were so rigid because it was so easy to screw things up, since we worked at such a low level on the machines. Nowadays, there are so many layers of abstraction and so much fault tolerance built in that there isn't much care for the precision levels we were held to.

You know, I know or have worked with close to 40 languages - scripting or otherwise - and Smalltalk isn't in the toolbox. Someday when I'm retired and sitting on the back porch smoking a good Cuban, I'm going to fire up the ole laptop and see what the fuss is all about.

Cheers.

Martin Thompson

Oct 14, 2015, 3:24:05 AM
to mechanica...@googlegroups.com

Your point concerning aggressive learning is very important - my grandchildren sometimes look at me with awe at my overall knowledge - but I always tell them the same thing: It's not that I'm smart, I've just lived a long time and I paid attention. 

Nice way of putting it :-) 
 
I'm really interested in seeing what happens in the next 10 years with the industry. I've seen it turn over in about 4 major evolutionary leaps during my career. We are in the midst of the 5th and I've noticed the time between each leap has been compressed. Anyway, good luck at the conference. 

Thanks. At the airport now ready to fly.

Martin...
 

ben cotton

Oct 14, 2015, 7:25:36 AM
to mechanica...@googlegroups.com
> We need to learn from our history and combine that with fact that every project is new. It is all about learning.

Beautifully said, Martin.  Thank you.  You should know that I treasure this community forum as a place that I can count on to learn new and exciting things. Thank you again.


> That's my old man rant for this morning.

:-)  ...My old man rant for this morning is this: if you are young and concerned that those of us who became professional programmers in the 80s and 90s somehow 'did it better' (than those who started programming this century) ... well ... DO NOT HAVE THAT CONCERN. It is true that we solved a lot of hard problems writing code with fewer resources than today's young programmers have, but we did nothing better. For me, it has always been a great time to be a programmer. I do not resent that today's young programmers have so many resources superior to what I had when I started the craft. But, admittedly, I might rant "damn, I wish I had at least some of all this when I was that age". :-)

Francesco Nigro

Oct 14, 2015, 8:18:55 AM
to mechanical-sympathy
Thanks for sharing, guys... every answer is full of wisdom :)

Gary Mulder

Oct 15, 2015, 1:14:09 PM
to mechanica...@googlegroups.com
On 14 October 2015 at 12:25, ben cotton <bendc...@gmail.com> wrote:

> We need to learn from our history and combine that with fact that every project is new. It is all about learning.

Beautifully said, Martin.  Thank you.  You should know that I treasure this community forum as a place that I can count on to learn new and exciting things. Thank you again.

While a lot has changed, the fundamentals stay the same. One should always have well-thumbed copies of The Mythical Man-Month and The Art of Computer Programming on one's shelf.

The first is good for throwing at managers. The second makes for good bedtime reading.

Regards,
Gary

Howard Chu

Nov 4, 2015, 9:18:23 AM
to mechanical-sympathy


On Wednesday, October 14, 2015 at 12:25:36 PM UTC+1, Ben Cotton wrote:


:-)  ...My old man rant for this morning is this: If you are young and concerned that those of us who became professional programmers in the 80s and 90s somehow 'did it better' (than those that started programming this century) ... well ... DO NOT HAVE THAT CONCERN.  It is true that we solved a lot of hard problems writing code with less resources than today's young  programmers, but, we did nothing better.   For me, it has always been a great time to be a programmer.   I do not resent that today's young  programmers have so many superior resources than i had when I started the craft.  But, admittedly, I might rant "damn, wish I had at least some of all this when I was that age".  :-)

My first programmable device was an HP-25 calculator with 49 instructions of memory, back in the early 1980s as a teenager. I spent years programming that before I got hold of something more powerful - an HP-15C. You learn to make every byte count when you want to get something done. You also learn about exotic topics like "synthetic programming" - making the device do things its designers never intended for you to do. (I did the same on the 6502, exploring the whole 256 byte opcode space and discovering strange and wonderful undocumented instructions...) Even later in college, working on DECstation 3100s, which had relatively gargantuan CPU and memory resources, I was still counting bytes and machine cycles - which led to my writing the world's fastest integer multiplier for the MIPS R3000.

Necessity is the mother of invention - if you have never been forced to program in tightly constrained environments, you will never develop something radically more efficient than the status quo.

Francesco Nigro

Nov 5, 2015, 2:09:46 PM
to mechanical-sympathy
"Necessity is the mother of invention - if you have never been forced to program in tightly constrained environments, you will never develop something radically more efficient than the status quo."

In 2015, I chose to develop and design my libs first for the Raspberry Pi and only then for my company's uber-servers, for exactly this reason :D
