
Moore's Law

270 views

John Larkin

Apr 6, 2016, 5:08:30 PM


http://www.eetimes.com/document.asp?doc_id=1329362&


"Moore's law is alive and well, even though the number of people that
say it is over also doubles every two years."


--

John Larkin Highland Technology, Inc
picosecond timing precision measurement

jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com

Phil Hobbs

Apr 6, 2016, 5:11:14 PM
On 04/06/2016 05:08 PM, John Larkin wrote:
>
>
> http://www.eetimes.com/document.asp?doc_id=1329362&
>
>
> "Moore's law is alive and well, even though the number of people that
> say it is over also doubles every two years."

"Moore's law is alive and well because it gets redefined every two years."

Cheers

Phil Hobbs


--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net

Martin Riddle

Apr 6, 2016, 8:42:38 PM
On Wed, 06 Apr 2016 14:08:15 -0700, John Larkin
<jjla...@highlandtechnology.com> wrote:

>
>
>http://www.eetimes.com/document.asp?doc_id=1329362&
>
>
>"Moore's law is alive and well, even though the number of people that
>say it is over also doubles every two years."

Intel just abandoned their tick-tock cycle; 10 nm is giving them
headaches.

Cheers

DecadentLinuxUserNumeroUno

Apr 7, 2016, 2:09:01 AM
On Wed, 6 Apr 2016 17:11:10 -0400, Phil Hobbs
<pcdhSpamM...@electrooptical.net> Gave us:

>On 04/06/2016 05:08 PM, John Larkin wrote:
>>
>>
>> http://www.eetimes.com/document.asp?doc_id=1329362&
>>
>>
>> "Moore's law is alive and well, even though the number of people that
>> say it is over also doubles every two years."
>
>"Moore's law is alive and well because it gets redefined every two years."
>
>CHeers
>
>Phil Hobbs


Heheheh Just like John's IQ.

John Larkin

Apr 7, 2016, 10:04:43 AM
By one of the definitions of "infinite", I am infinitely intelligent,
because being more intelligent wouldn't make any difference.

And I'm Usually Right.


--

John Larkin Highland Technology, Inc

lunatic fringe electronics

krw

Apr 7, 2016, 3:37:28 PM
On Thu, 07 Apr 2016 07:04:41 -0700, John Larkin
<jjla...@highlandtechnology.com> wrote:

>On Thu, 07 Apr 2016 02:08:52 -0400, DecadentLinuxUserNumeroUno
><DL...@DecadentLinuxUser.org> wrote:
>
>>On Wed, 6 Apr 2016 17:11:10 -0400, Phil Hobbs
>><pcdhSpamM...@electrooptical.net> Gave us:
>>
>>>On 04/06/2016 05:08 PM, John Larkin wrote:
>>>>
>>>>
>>>> http://www.eetimes.com/document.asp?doc_id=1329362&
>>>>
>>>>
>>>> "Moore's law is alive and well, even though the number of people that
>>>> say it is over also doubles every two years."
>>>
>>>"Moore's law is alive and well because it gets redefined every two years."
>>>
>>>CHeers
>>>
>>>Phil Hobbs
>>
>>
>> Heheheh Just like John's IQ.
>
>By one of the definitions of "infinite", I am infinitely intelligent,
>because being more intelligent wouldn't make any difference.

Because he wouldn't understand anyway.
>
>And I'm Usually Right.

...and he's AlwaysWrong.

DecadentLinuxUserNumeroUno

Apr 7, 2016, 4:25:28 PM
On Thu, 07 Apr 2016 15:37:20 -0400, krw <k...@nowhere.com> Gave us:

snip
>
>...and he's AlwaysWrong.

Nope. And also not wrong about you. You dead yet, putz?
Bwuahahahahahahaha!

Apparently I did something right. My health is an order of magnitude
better than all you jackasses who spent the last decade jacking off at
the mouth about me.

rickman

Apr 7, 2016, 4:56:56 PM
Except for when you're not.

--

Rick

sean....@gmail.com

Apr 7, 2016, 8:19:32 PM
Well, it turned into a sigmoid curve, as it must. But it hit the buffers at absolutely astronomical performance rates and numbers of active elements. It happened so fast that people don't really understand what they have in their possession. In terms of human-level AI, current CPUs are more than fast enough; the main constraint is memory. The more the better. What is a neural net except an analog hash table? Then it is just a question of how you connect these hash tables together and what learning mechanism you apply. That is all doable with current hardware. It is just taking humans a bit of time to catch up with the extraordinary hardware that has been created. Google DeepMind is showing what is possible. Even they, though, are not using the best types of neural net.
Anyway, memory density is still growing exponentially, with the possibility of nearly infinite memory in 3D cubes containing nanoparticles. Nearly infinite memory would allow construction of something well beyond human capacities.
I would say that within 5 years you are going to see some pretty weird things happening. At a minimum you are going to see a surge in industrial automation. In particular, the cost of automation is going to fall dramatically. That's quite positive: if you want to do something or make something, then the burden of finding money, week in week out, for wages is gone.
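The "analog hash table" view of a neural net can be made concrete with a toy associative lookup (a sketch of the analogy only; the keys, values, and query here are invented for illustration):

```python
# Toy "analog hash table": instead of hashing a key exactly, retrieve
# the value whose stored key is most *similar* to the query vector.
# All data here is made up for illustration.
keys = [[1.0, 0.0], [0.0, 1.0]]     # stored "addresses"
values = ["cat", "dog"]             # stored contents

def assoc_lookup(query):
    """Return the value whose key has the largest dot product with query."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    return values[scores.index(max(scores))]

print(assoc_lookup([0.9, 0.1]))  # -> cat
```

A real neural net replaces the hard argmax with smooth, learned weightings, but the lookup-by-similarity structure is the same.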

bill....@ieee.org

Apr 8, 2016, 6:14:53 AM
On Friday, April 8, 2016 at 12:04:43 AM UTC+10, John Larkin wrote:
> On Thu, 07 Apr 2016 02:08:52 -0400, DecadentLinuxUserNumeroUno
> <DL...@DecadentLinuxUser.org> wrote:
>
> >On Wed, 6 Apr 2016 17:11:10 -0400, Phil Hobbs
> ><pcdhSpamM...@electrooptical.net> Gave us:
> >
> >>On 04/06/2016 05:08 PM, John Larkin wrote:
> >>>
> >>>
> >>> http://www.eetimes.com/document.asp?doc_id=1329362&
> >>>
> >>>
> >>> "Moore's law is alive and well, even though the number of people that
> >>> say it is over also doubles every two years."
> >>
> >>"Moore's law is alive and well because it gets redefined every two years."
> >>
> >>CHeers
> >>
> >>Phil Hobbs
> >
> >
> > Heheheh Just like John's IQ.
>
> By one of the definitions of "infinite", I am infinitely intelligent,
> because being more intelligent wouldn't make any difference.

In other words, he's got intelligence, but doesn't use it, preferring to let the Murdoch media do his thinking for him.

> And I'm Usually Right.

For rather small values of "usually".

--
Bill Sloman, Sydney

krw

Apr 8, 2016, 12:22:55 PM
Of course, you're AlwaysWrong.

John S

Apr 8, 2016, 1:10:16 PM
On 4/6/2016 4:08 PM, John Larkin wrote:
>
>
> http://www.eetimes.com/document.asp?doc_id=1329362&
>
>
> "Moore's law is alive and well, even though the number of people that
> say it is over also doubles every two years."
>
>

<yawn>

Another worthless thread.

DecadentLinuxUserNumeroUno

Apr 8, 2016, 2:34:07 PM
On Fri, 8 Apr 2016 12:10:43 -0500, John S <Sop...@invalid.org> Gave us:
And one not so worthless....

http://spectrum.ieee.org/computing/software/linux-at-25-why-it-flourished-while-others-fizzled

John Larkin

Apr 8, 2016, 6:38:32 PM
Design something cool and post it here.

bill....@ieee.org

Apr 8, 2016, 8:16:18 PM
On Saturday, April 9, 2016 at 8:38:32 AM UTC+10, John Larkin wrote:
> On Fri, 8 Apr 2016 12:10:43 -0500, John S <Sop...@invalid.org> wrote:
>
> >On 4/6/2016 4:08 PM, John Larkin wrote:
> >>
> >>
> >> http://www.eetimes.com/document.asp?doc_id=1329362&
> >>
> >>
> >> "Moore's law is alive and well, even though the number of people that
> >> say it is over also doubles every two years."
> >>
> >>
> >
> ><yawn>
> >
> >Another worthless thread.
>
> Design something cool and post it here.

So John can copy it and sell it as one of his own insanely good ideas ...

--
Bill Sloman, Sydney

omni...@gmail.com

Apr 8, 2016, 8:55:29 PM
Gordon Moore said something doubles every 18 months, not two years. That two-year rumor is just proof of the slowing of progress. It was not a law; it was a trend, and that trend is over now.

Les Cargill

Apr 8, 2016, 10:54:53 PM
It's a law like pan laws or control laws. It's a model that's
been ... knighted And Given Official Heft And Status.

--
Les Cargill

rickman

Apr 9, 2016, 10:58:39 AM
On 4/8/2016 8:55 PM, omni...@gmail.com wrote:
> Gordon Moore said something doubles every 18 months. Not two years. That two year rumor is just proof of the slowing down of progress. It was not a Law, it was a trend that is over, now.

From wikipedia, "The period is often quoted as 18 months because of
Intel executive David House, who predicted that chip performance would
double every 18 months (being a combination of the effect of more
transistors and the transistors being faster).[17]"

17. "Moore's Law to roll on for another decade". Retrieved 2011-11-27.
"Moore also affirmed he never said transistor count would double every
18 months, as is commonly said. Initially, he said transistors on a chip
would double every year. He then recalibrated it to every two years in
1975. David House, an Intel executive at the time, noted that the
changes would cause computer performance to double every 18 months."

Initially Moore *observed* a doubling every year in the number of
transistors on a chip and projected this would continue for a decade. A
decade later he changed the projection to a doubling every two years.
This has held pretty well for decades now.
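As a back-of-the-envelope check of the two-year version (a sketch, using the 4004's roughly 2,300 transistors in 1971 as the baseline):

```python
# Project transistors per chip under a fixed doubling period, starting
# from the Intel 4004 (about 2,300 transistors, 1971).
def projected_transistors(year, base_year=1971, base_count=2300,
                          doubling_years=2):
    """Transistor count projected from a baseline with fixed doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(f"{projected_transistors(2016):.3g}")  # -> 1.36e+10
```

That lands within a factor of two of the largest chips actually shipping in 2016, so the two-year doubling has indeed held up remarkably well.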

--

Rick

DecadentLinuxUserNumeroUno

Apr 9, 2016, 12:17:46 PM
On Sat, 9 Apr 2016 10:58:34 -0400, rickman <gnu...@gmail.com> Gave us:
The first IC, by TI and/or Fairchild in 1960, had ten transistor
elements.

Intel's 4004 in 1971 (released then; designed in 1970) had 2,300
transistor elements.
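Those two data points imply a doubling time close to Moore's original one-year observation; a quick check (a sketch fitting an exponential through the figures quoted above):

```python
import math

# Doubling time implied by two (year, transistor-count) data points,
# assuming smooth exponential growth between them.
def doubling_time(y0, n0, y1, n1):
    """Years per doubling for growth from n0 at year y0 to n1 at y1."""
    return (y1 - y0) / math.log2(n1 / n0)

print(f"{doubling_time(1960, 10, 1971, 2300):.2f}")  # -> 1.40
```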

John Larkin

Apr 9, 2016, 1:21:31 PM
On Fri, 8 Apr 2016 17:55:22 -0700 (PDT), omni...@gmail.com wrote:

>Gordon Moore said something doubles every 18 months. Not two years. That two year rumor is just proof of the slowing down of progress. It was not a Law, it was a trend that is over, now.

It's good for a while longer, maybe a bit slower progression, but the
cost of design and masks for 5 nm multi-billion-transistor chips is
going to mean that only mega-volume-sales chips will be at the leading
edge. Maybe Gordon didn't think about the ultimate limit being cost
and complexity, rather than resolution.


--

John Larkin Highland Technology, Inc

lunatic fringe electronics

Phil Hobbs

Apr 9, 2016, 1:51:19 PM
On 04/09/2016 01:21 PM, John Larkin wrote:
> On Fri, 8 Apr 2016 17:55:22 -0700 (PDT), omni...@gmail.com wrote:
>
>> Gordon Moore said something doubles every 18 months. Not two years. That two year rumor is just proof of the slowing down of progress. It was not a Law, it was a trend that is over, now.
>
> It's good for a while longer, maybe a bit slower progression, but the
> cost of design and masks for 5 nm multi-billion-transistor chips is
> going to mean that only mega-volume-sales chips will be at the leading
> edge. Maybe Gordon didn't think about the ultimate limit being cost
> and complexity, rather than resolution.

Back in the palmy days of Mead-Conway VLSI scaling, transistors got
faster as they got smaller. Thresholds went down with gate oxide
thickness, so you could reduce VDD and save power. It was very
nice--for a new device node, we just needed the litho folks to get off
their duffs and improve the resolution.

'Taint like that any more--not since 65 nm or so. As transistors
shrink, they now get slower. And leakier. And the threshold voltages
get more and more poorly controlled due to dopant atom statistics. And
we've about run out of entries in the periodic table to try fixing those
problems. (Hafnium oxide gate dielectric, for instance.)

So you need to throw lots more transistors at the same functions to keep
them working right, and turn off most of the chip most of the time to
prevent it from turning to lava. (That's a bit of hyperbole, but less
and less so.)
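The scaling arithmetic behind the paragraphs above can be sketched numerically (an idealized model, not anyone's actual process data): under classic Dennard scaling, shrinking linear dimensions by a factor k scales C, V, and gate delay by 1/k, so per-transistor dynamic power C*V^2*f falls by k^2, exactly offsetting the k^2 more transistors per unit area. That bargain is what broke down around 65 nm.

```python
# Idealized Dennard scaling: shrink linear dimensions by factor k.
def dennard_scale(C, V, f, k):
    """Capacitance and voltage scale as 1/k; frequency scales up by k."""
    return C / k, V / k, f * k

def dynamic_power(C, V, f):
    """Dynamic switching power, proportional to C * V^2 * f."""
    return C * V * V * f

C, V, f = 1.0, 1.0, 1.0                   # normalized device
Cs, Vs, fs = dennard_scale(C, V, f, k=2)
print(dynamic_power(Cs, Vs, fs))          # -> 0.25, i.e. 1/k^2
```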

And chip area is limited to about 2 cm square by yield and
thermomechanical stress issues. It's been stuck there for decades.

That's why Moore's law is over. You can concentrate on counting
transistors, but if most of them are there to compensate for the badness
of the rest, it's sort of academic.

The CPU speed of my desktop computers peaked in about 2006, at 3.5 GHz.
My present Supermicro boxes run at 2.3 GHz. When I left IBM in 2009,
their Blue Gene supercomputer ran at (iirc) 700 MHz for power reasons.

Cheers

Jim Thompson

Apr 9, 2016, 2:12:39 PM
I get all kinds of inquiries from managers wringing their hands... "I
spent all this money at BigNameExpensiveConMen, Inc, for analog
functions on 65nm and they don't work for shit." >:-}

...Jim Thompson
--
| James E.Thompson | mens |
| Analog Innovations | et |
| Analog/Mixed-Signal ASIC's and Discrete Systems | manus |
| San Tan Valley, AZ 85142 Skype: Contacts Only | |
| Voice:(480)460-2350 Fax: Available upon request | Brass Rat |
| E-mail Icon at http://www.analog-innovations.com | 1962 |

The touchstone of liberalism is intolerance

jurb...@gmail.com

Apr 9, 2016, 6:53:07 PM
>"Initially Moore *observed* a doubling every year in the number of
>transistors on a chip and projected this would continue for a >decade. A
>decade later he changed the projection to a doubling every two years.
>This has held pretty well for decades now."

Moore's law has to contend with the law of diminishing returns. This is a reality. Throwing the GPU into the CPU does not make a better CPU. What's more, look at how the speed of these processors is stuck at a couple of GHz. Electrons only move so fast.

I predicted they would figure out optical processors eventually, and they probably will, but when is "eventually"?

In all things there is a theoretical limit. Even if a car engine has a VE of friggin 200, it cannot exceed its theoretical compression ratio, and if you separate the components, even a turbo or supercharger cannot do it. What's more, the crank can only stand so much force.

They have TV sets with 1080p resolution. Do you think they are going to go higher? They are getting to the point where people can't see any better.

Computers? They are so fast now that even Microsoft can't write software to slow them down enough anymore. They've been outdone by the people who wrote Solidworks and Rhinoceros. But at least those programs DO something. Maybe PSpice, but I only have LTspice. It can take a few seconds to analyze and do a simulation, but then this computer is from 2008. My brand new Win 10 computer at work is no faster.

What did we gain?

Well, we stimulated the economy and put a bunch of geeks to work. And that is what it is. We really do not need any more; they have to push it or they're out of a job. I'd use Win 98SE on a 486 if I could get the damn thing to connect to the router. It was simply not that slow. The other problem is the browser.

The internet is the biggest impetus for this buy-and-throw-out deal we got now. Friggin nags me to upgrade my browser to see a picture hosted on one of those sites; go fuck yourself. I can do everything else I want just the way it is; I have to change to see a picture? Tell you what, you want to show me a picture, get Dropbox and use the \public directory. Simple, no ads, no frames, no nothing, just the damn picture. Want a caption, take a super duper advanced graphics program like Paint Shop Pro 4 from 1995 or so and "expand canvas", then go to the little "A" and insert some text. Be sure to pick the right font size first; that can be tricky at times.

I would use DOS. Hell, it is probably easier to write programs for DOS, but you supply the GUI. People will not give up their mouse.

Anyway, the law of diminishing returns is really a factor here and one of the reasons I won't buy anything anymore. Plus they always take out functions I want.

Bottom line, putting more transistors on a chip is not an advance. You throw the GPU in there, that is just two chips in one package. Alright, they can communicate a bit faster, but not all that much. Eventually they might put the sound card in the CPU too and call that an advancement. But it ain't; it is just saving them a few bucks. Next maybe the wifi receiver goes in, but still all they have done is to cram more stuff into one package.

And then, when just ONE of those billions of transistors goes bad in there, you have to shitcan the whole thing. How convenient for them.

That's progress.

John Larkin

Apr 9, 2016, 7:15:24 PM
On Sat, 9 Apr 2016 13:51:13 -0400, Phil Hobbs
<pcdhSpamM...@electrooptical.net> wrote:

Maybe the only chips worth doing at sub-10 nm are memory.

Tim Williams

Apr 10, 2016, 2:49:36 AM
"Jim Thompson" <To-Email-Use-Th...@On-My-Web-Site.com> wrote in
message news:2chigbti6e4nsev5m...@4ax.com...
> I get all kinds of inquiries from managers wringing their hands... "I
> spent all this money at BigNameExpensiveConMen, Inc, for analog
> functions on 65nm and they don't work for shit." >:-}

How much analog is even done under 200nm?

Think I recall the RF stuff more or less peaks around there.

Tim

--
Seven Transistor Labs, LLC
Electrical Engineering Consultation and Contract Design
Website: http://seventransistorlabs.com

DecadentLinuxUserNumeroUno

Apr 10, 2016, 5:16:30 AM
On Sun, 10 Apr 2016 01:49:04 -0500, "Tim Williams"
<tiw...@seventransistorlabs.com> Gave us:

>"Jim Thompson" <To-Email-Use-Th...@On-My-Web-Site.com> wrote in
>message news:2chigbti6e4nsev5m...@4ax.com...
>> I get all kinds of inquiries from managers wringing their hands... "I
>> spent all this money at BigNameExpensiveConMen, Inc, for analog
>> functions on 65nm and they don't work for shit." >:-}
>
>How much analog is even done under 200nm?
>
>Think I recall the RF stuff more or less peaks around there.
>
>Tim

Perhaps, but photonics takes it all into a new realm.

http://tinyurl.com/hypyeq9

krw

Apr 10, 2016, 8:50:14 AM
On Sat, 9 Apr 2016 15:53:00 -0700 (PDT), jurb...@gmail.com wrote:

>>"Initially Moore *observed* a doubling every year in the number of
>>transistors on a chip and projected this would continue for a >decade. A
>>decade later he changed the projection to a doubling every two years.
>>This has held pretty well for decades now."
>
>Moore's law has to come to odds with the law of diminishing returns. This is a reality. Throwing the GPU in the CPU does not make a better CPU. What's more, look at how the speed of these processors is stuck at a couple of GHz. Electrons only move so fast.
>
>I predicted they would figure out optical processors eventually, and they probably will, but when is eventually ?
>
>In all things there is a theoretical limit. Even if a car engine has a VE of friggin 200, it cannot exceed its theoretical compression ratio, and if you separate the components, even a turbo or supercharger cannot do it. What's more, the crank can only stand so much force.
>
>They have TV sets with 1080p resolution. do you think they are going to go higher ? They are getting to the point where people can't see any better.
>
>Computers ? They are so fast now that even Microsoft can't write software to slow them down enough anymore. They've been outdone by the people who wrote Solidworks and Rhinocerous. But at least those programs DO something. Maybe Pspice, but I aonly have LTspice. It can take a few second to analyze and do a simulation, but then this computer is like from 2008. My brand new Win 10 coputer at work is no faster.
>
>What did we gain ?
>
>Well we stimulated the economy and put a bunch of geeks to work. Ad that is what it is. We really do not need any more, they have to push it or they're out of a job. I'd use Win 98SE on a 486 if I could get the damn thing to connect to the router. It was simply not that slow. The other problem is the browser.
>
>The internet is the biggest impetus for this buy and throw out deal we got now. Friggin nags me to upgrade my browser to see a picture hosted on one of those sites, go fuck yourself. I can do everything else I want just they way it is, I have to change to see a picture ? Tell you what, you want to show me a picture get Dropbox and use the \public directory. Simple, no ads, no frames, no nothing, just the damn picture. Want a caption, take a super uper duper advanced graphics program like Paint Shop Pro 4 from 1995 or so and "expand canvas" and then go to the little "A" and insert some text. Be sure to pick the right font size first, that can be tricky at times.
>
>I would use DOS. Hell, it is probably easier to write programs for DOS but you supply the GUI. People will not give up their mouse.

You're an idiot, too.

>Anyway, the law of diminishing returns is really a factor here and one of the reasons I won't buy anything anymore. Plus they always take out functions I want.
>
>Bottom line, putting more transistors on a chip is not advance. You throw the GPU in there, that is just two chips in one package. Alright they can communicate a bit faster, but not all that much. Eventually they might put the soundcard in the CPU too and call that an advancement. But it ain't, it is just saving them a few bucks. Next maybe the wifi receiver goes in, but still all they have done is to cram more stuff into one package.
>
>Ad then, when just ONE of those billions of transistors goes bad in there, you have to shitcan the whole thing. How convenient for them.
>
>That's progress.

You've totally ignored the improvements in architecture, allowed by
unlimited transistor budgets.

Jim Thompson

Apr 10, 2016, 11:40:47 AM
On Sun, 10 Apr 2016 01:49:04 -0500, "Tim Williams"
<tiw...@seventransistorlabs.com> wrote:

>"Jim Thompson" <To-Email-Use-Th...@On-My-Web-Site.com> wrote in
>message news:2chigbti6e4nsev5m...@4ax.com...
>> I get all kinds of inquiries from managers wringing their hands... "I
>> spent all this money at BigNameExpensiveConMen, Inc, for analog
>> functions on 65nm and they don't work for shit." >:-}
>
>How much analog is even done under 200nm?
>
>Think I recall the RF stuff more or less peaks around there.
>
>Tim

There are idgits trying it... you know the kind of "analog"
engineer... can't solder, but can run a simulator :-(

I had an inquiry where a manager wanted me to "review" his expensive
non-functional analog designs on 65nm and make suggestions for
improvements... on-site in Massachusetts for six months. I declined
on-site, but would do it remotely using my own tools. Manager
declined. Probably hired yet another robot :-(

Les Cargill

Apr 10, 2016, 5:16:20 PM
jurb...@gmail.com wrote:
>> "Initially Moore *observed* a doubling every year in the number of
>> transistors on a chip and projected this would continue for a
>> >decade. A decade later he changed the projection to a doubling
>> every two years. This has held pretty well for decades now."
>
> Moore's law has to come to odds with the law of diminishing returns.
> This is a reality. Throwing the GPU in the CPU does not make a better
> CPU. What's more, look at how the speed of these processors is stuck
> at a couple of GHz. Electrons only move so fast.
>
> I predicted they would figure out optical processors eventually, and
> they probably will, but when is eventually ?
>

Not very soon. Optical is the domain of companies saddled with crushing
debt from all those crashes over the last 20 years. Others know
far more than I, but I wouldn't expect to see it, frankly. SFAIK
(which is pitifully little) not much processing happens in the optical
domain.

> In all things there is a theoretical limit. Even if a car engine has
> a VE of friggin 200, it cannot exceed its theoretical compression
> ratio, and if you separate the components, even a turbo or
> supercharger cannot do it. What's more, the crank can only stand so
> much force.
>
> They have TV sets with 1080p resolution. do you think they are going
> to go higher ? They are getting to the point where people can't see
> any better.
>

They are going higher. It's called 4K and you can see them at Best Buy
and at Sams and they are spectacular.

Right now, it's custom demo reels to show them off; there is little if
any program material commercially available beyond Blu-ray. I'm not
usually impressed by video, but these you can get lost in.

> Computers ? They are so fast now that even Microsoft can't write
> software to slow them down enough anymore. They've been outdone by
> the people who wrote Solidworks and Rhinocerous. But at least those
> programs DO something. Maybe Pspice, but I aonly have LTspice. It can
> take a few second to analyze and do a simulation, but then this
> computer is like from 2008. My brand new Win 10 coputer at work is no
> faster.
>
> What did we gain ?
>

Nobody really wants to face the realities of software. If you do, as I
do, high-performance* realtime, it's really sort of lonely
out here. I can read about some people doing some things some
ways on the Internet but it's mostly people knitting .NET,
Java or Python together. Or worse.

*which is always domain dependent.

Windows itself is no excuse for any of jitter, latency or the like.
You can tune out a Linux distro for pretty good high rate performance,
but it doesn't really come that way stock.

> Well we stimulated the economy and put a bunch of geeks to work.

And that's it. Some time about 1990, it stopped being "woe betide you
who choose to practice this". Too many 200 pound lumps stuck on the
couch, I guess.


> Ad
> that is what it is. We really do not need any more, they have to push
> it or they're out of a job. I'd use Win 98SE on a 486 if I could get
> the damn thing to connect to the router.


No, because FAT32 and the propensity for Win98 to just
latch. Make that Win2k and I am right behind you. I do a lot of work
at home in a Win2k VM.

> It was simply not that slow.
> The other problem is the browser.
>
> The internet is the biggest impetus for this buy and throw out deal
> we got now. Friggin nags me to upgrade my browser to see a picture
> hosted on one of those sites, go fuck yourself. I can do everything
> else I want just they way it is, I have to change to see a picture ?

but that's what the Innernet *is* to most people.

> Tell you what, you want to show me a picture get Dropbox and use the
> \public directory. Simple, no ads, no frames, no nothing, just the
> damn picture. Want a caption, take a super uper duper advanced
> graphics program like Paint Shop Pro 4 from 1995 or so and "expand
> canvas" and then go to the little "A" and insert some text. Be sure
> to pick the right font size first, that can be tricky at times.
>
> I would use DOS. Hell, it is probably easier to write programs for
> DOS but you supply the GUI. People will not give up their mouse.
>
> Anyway, the law of diminishing returns is really a factor here and
> one of the reasons I won't buy anything anymore. Plus they always
> take out functions I want.
>
> Bottom line, putting more transistors on a chip is not advance. You
> throw the GPU in there, that is just two chips in one package.

It's analogous to body count in Vietnam, or tractors
in the Soviet Union: "Glorious Soviet production
of transistors up 10% this year."

> Alright they can communicate a bit faster, but not all that much.
> Eventually they might put the soundcard in the CPU too and call that
> an advancement. But it ain't, it is just saving them a few bucks.

If all else fails, do packaging.

> Next maybe the wifi receiver goes in, but still all they have done is
> to cram more stuff into one package.
>

I would actively not buy a rolled in wifi widget.

> Ad then, when just ONE of those billions of transistors goes bad in
> there, you have to shitcan the whole thing. How convenient for them.
>
> That's progress.
>

"I went back to the store
They gave me four more
The guy told me at the door
It's a piece of crap " - Neil Young, "Piece Of Crap"

--
Les Cargill



DecadentLinuxUserNumeroUno

Apr 10, 2016, 6:32:41 PM
On Sun, 10 Apr 2016 16:17:20 -0500, Les Cargill <lcarg...@comcast.com>
Gave us:

>Not very soon. Optical is the domain of companies saddled with crushing
>debt from all those crashes over the last 20 years. Others know
>far more than I, but I wouldn't expect to see it, frankly. SFAIK
>(which is pitifully little ) not much processing happens in the optical
>domain.


IBM, Intel, Samsung, etc. are doing great things in photonics. There
is 100% optical memory now, and there are even processors in the works.
You guys have apparently not been keeping up. I posted a link to a
google images page and most of the images have links attached to them.
Go take a look at what is going on.

Optical and quantum computing are converging. It is happening.

"Not very soon"? Wake up and notice the photonic control we have.

DecadentLinuxUserNumeroUno

Apr 10, 2016, 6:36:14 PM
On Sun, 10 Apr 2016 16:17:20 -0500, Les Cargill <lcarg...@comcast.com>
Gave us:

>They are going higher. It's called 4K and you can see them at Best Buy
>and at Sams and they are spectacular.
>
>Right now, it's custom demo reels to show them off - there is little if
>any program material commercially available beyond BluRay but I'm not
>usually impressed by video, but these, you can get lost in.

Looks pretty good doing CAD work. They are far better than 1080 LED
displays and even the old fine-pitch CRTs. 4K and beyond is on its way
in. Wake up. You sound like a bunch of Luddite old gits.

Phil Hobbs

Apr 10, 2016, 7:10:10 PM
> IBM, Intel, Samsung, etc. are doing great things in photonics. There
> is 100% optical memory now, and there are even processors in the works.
> You guys have apparently not been keeping up. I posted a link to a
> google images page and most of the images have links attached to them.
> Go take a look at what is going on.
>
> Optical and quantum computing are converging. It is happening.
>
> "Not very soon"? Wake up and notice the photonic control we have.

I'll crispen that up to "never". There are a lot of amazing things you can do with photons, but a practical general purpose processor competitive with CMOS is not one of them.

Optical interconnection between chips, and even to a limited degree on-chip, sure. (I spent some years working to help make that happen.)

There's a very basic problem with optical logic: a few electrons sitting on a gate can control many electrons at the drain, but a few photons can't control many in that way.

Cheers

Phil Hobbs

rickman

Apr 10, 2016, 9:26:14 PM
On 4/10/2016 5:17 PM, Les Cargill wrote:
> jurb...@gmail.com wrote:
>
>> In all things there is a theoretical limit. Even if a car engine has
>> a VE of friggin 200, it cannot exceed its theoretical compression
>> ratio, and if you separate the components, even a turbo or
>> supercharger cannot do it. What's more, the crank can only stand so
>> much force.
>>
>> They have TV sets with 1080p resolution. do you think they are going
>> to go higher ? They are getting to the point where people can't see
>> any better.
>>
>
> They are going higher. It's called 4K and you can see them at Best Buy
> and at Sams and they are spectacular.
>
> Right now, it's custom demo reels to show them off - there is little if
> any program material commercially available beyond BluRay but I'm not
> usually impressed by video, but these, you can get lost in.

I wonder whether a 4K set bought today will continue to work well
with the various sources in a couple of years. I've seen plenty of
issues with sets not properly supporting various video modes. Since 4K
is so new, with so little program source material, will it all still be
properly compatible when program material is out? Or I guess I'm
wondering if the set makers have the bugs out.

--

Rick

Przemek Klosowski

Apr 11, 2016, 9:39:10 PM
On Wed, 06 Apr 2016 17:11:10 -0400, Phil Hobbs wrote:

> On 04/06/2016 05:08 PM, John Larkin wrote:
>>
>>
>> http://www.eetimes.com/document.asp?doc_id=1329362&
>>
>>
>> "Moore's law is alive and well, even though the number of people that
>> say it is over also doubles every two years."
>
> "Moore's law is alive and well because it gets redefined every two
> years."
>
Remember that Moore's original formulation was about the number of
transistors per chip. The side effects of process shrink (speed bumps,
leakage improvements) are icing on the cake, and that icing is indeed
pretty much gone, but the increase in the number of transistors is still
going strong.

Les Cargill

unread,
Apr 11, 2016, 9:51:34 PM
to
rickman wrote:
> On 4/10/2016 5:17 PM, Les Cargill wrote:
>> jurb...@gmail.com wrote:
>>
>>> In all things there is a theoretical limit. Even if a car engine has
>>> a VE of friggin 200, it cannot exceed its theoretical compression
>>> ratio, and if you separate the components, even a turbo or
>>> supercharger cannot do it. What's more, the crank can only stand so
>>> much force.
>>>
>>> They have TV sets with 1080p resolution. do you think they are going
>>> to go higher ? They are getting to the point where people can't see
>>> any better.
>>>
>>
>> They are going higher. It's called 4K and you can see them at Best Buy
>> and at Sams and they are spectacular.
>>
>> Right now, it's custom demo reels to show them off - there is little if
>> any program material commercially available beyond BluRay but I'm not
>> usually impressed by video, but these, you can get lost in.
>
> I wonder if a 4K set is bought today if it will continue to work well
> with the various sources in a couple of years?

That's a really good question.

> I've seen plenty of
> issues with sets not properly supporting various video modes. Since 4K
> is so new with so little program source material, will it all still be
> properly compatible when program material is out? Or I guess I'm
> wondering if the set makers have the bugs out.
>

I figure to let it ride for a while myself. But it's stunning. I think
everyone will want this when it's firmed up. I at least temporarily
lost the sensation that I was watching a screen. It's completely
immersive. HD is nice, but it's still a TV, a screen.

I have no idea whatsoever how it may be that program material gets to
it. Dunno if the standard streaming services will support it or not.

The downside will be that it'll probably need to be an 80 inch screen
to get the full effect.

--
Les Cargill

rickman

unread,
Apr 11, 2016, 10:21:50 PM
to
On 4/11/2016 9:52 PM, Les Cargill wrote:
>
> The downside will be that it'll probably need to be an 80 inch screen
> to get the full effect.

Ok by me! I've yet to buy any flat panels so when I get one I won't
mind making it really big!

--

Rick

colin_...@yahoo.com

unread,
Apr 12, 2016, 5:50:17 AM
to
The broadcasters are also pushing high dynamic range, I.E. finally moving away from the colour limitations of forty years ago. They like it because the quality is outstanding but the bandwidth is only another 20% or so.

Colin

dca...@krl.org

unread,
Apr 12, 2016, 8:21:04 AM
to
On Monday, April 11, 2016 at 9:51:34 PM UTC-4, Les Cargill wrote:

> The downside will be that it'll probably need to be an 80 inch screen
> to get the full effect.
>
> --
> Les Cargill

Correct. On screens smaller than about 40 inches viewed from six or more feet, there is not much difference.
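A minimal sketch of the arithmetic behind that rule of thumb, assuming roughly 1 arcminute of visual acuity (a common rule of thumb, not a hard limit) and a 16:9 panel:

```python
import math

# If a pixel subtends less than ~1 arcminute at the eye, you can't
# resolve it, so extra resolution buys nothing at that distance.

ACUITY_RAD = math.radians(1 / 60)  # 1 arcminute in radians

def pixel_pitch(diagonal_in, horizontal_pixels):
    """Horizontal pixel pitch (inches) of a 16:9 panel."""
    width = diagonal_in * 16 / math.hypot(16, 9)
    return width / horizontal_pixels

def resolvable(pitch_in, distance_in):
    """True if a pixel of this pitch subtends more than 1 arcminute."""
    return pitch_in > distance_in * math.tan(ACUITY_RAD)

d = 6 * 12  # six feet, in inches
print(resolvable(pixel_pitch(40, 1920), d))  # False: 1080p already beats the eye at 40"
print(resolvable(pixel_pitch(80, 1920), d))  # True: 1080p pixels show at 80"
print(resolvable(pixel_pitch(80, 3840), d))  # False: 4K hides them again
```

Which also matches the 80-inch point upthread: at six feet a 1080p pixel on an 80" panel is above the acuity limit, while the 4K pixel is back below it.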

Dan

krw

unread,
Apr 12, 2016, 9:19:30 AM
to
They're cheap now. For largeish screens, I don't see why one would
not want 4K. Even the entry-level sets are stunning.
>
>The downside will be that it'll probably need to be an 80 inch screen
>to get the full effect.

I don't think so. 55-65" screens are stunning, as well. We have
three flat screens (1080P - two 40 and 42") in the house now. I was
ready to pull the trigger on a high-end 65" Samsung but SWMBO
wasn't convinced so we let the deal escape. The best deal I can find
now is $1K more than a month ago (new models).

krw

unread,
Apr 12, 2016, 9:20:48 AM
to
That's the theory. The reality is something different. The sets are
far better all around.

Phil Hobbs

unread,
Apr 12, 2016, 11:08:31 AM
to
Litho generations have slowed down a lot. This is already an old plot:
http://tinyurl.com/ht39olm .

This one is a bit more recent:
http://tinyurl.com/grq5rf9 .

Litho put on a bit of a burst of speed around 2000, but that's ancient
history now.

rickman

unread,
Apr 12, 2016, 6:31:58 PM
to
How about 18 inches viewed from two feet? One reason I haven't bought
a big screen is it would be far enough away that my laptop is actually a
much bigger screen. I wouldn't mind having a really big screen I can
use as a computer monitor. But for it to actually be "large" it would
have to be reasonably close.

--

Rick

Bob Engelhardt

unread,
Apr 12, 2016, 7:38:59 PM
to
On 4/12/2016 6:31 PM, rickman wrote:

> How about 18" inches viewed from two feet? One reason I haven't bought
> a big screen is it would be far enough away that my laptop is actually a
> much bigger screen. ...

When I watch our 52" TV (from 6'), it's a step down from my 24" monitor
(at 2').

jurb...@gmail.com

unread,
Apr 12, 2016, 7:43:19 PM
to
>"The downside will be that it'll probably need to be an 80 inch screen
>to get the full effect."

I think the only market for anything that big will be in the US. I have heard from Europeans, and mainly they don't have the room.

The next real advancement I think will be 3D. I mean real 3D with no glasses. I watched a demo of a new thing they were selling to architects to display a full 3D model without them having to build it. Remember those things at coin shops where the coin is inside the cabinet but they have this mirror that makes it appear to float in the air? Well, that thing must use some sort of similar principle. Somehow the LCD cells have to control reflectivity through a bunch of layers. Bottom line, it can be done, but they did not mention the price. I would guess it is one of those things that if you have to ask you can't afford it.

Some people will never like LCDs; those are the type who buy plasma TVs, which will probably soon be obsolete. I think the people who like them like them because they are actually watching phosphors. I doubt they'll ever get to 4K resolution in a plasma; if they did, it would probably cost more than the house it is in.

Diminishing returns also apply to people. I remember them selling those super tweeters that went up to 50 kHz. Why, to annoy the dog?

I think that part of this really good TV picture thing, if you are far enough away, is due to them getting the colorimetry and gamma right. Seems they're pretty much done with flesh correction since NTSC is a thing of the past. Now it can be a lot more accurate. In fact people used to call NTSC "Never The Same Color". They fixed that in PAL, and upped the resolution some. So we invented it here, but it was not the best system. I was told the only worse system was the one in Russia during the Soviet days. I also hear their CRTs sucked, worse than a 1990s Zenith.

But my speed with TV is my little 10" Panasonic. Never watch it though; it is not even plugged in, let alone to a box or source. Well, yeah, an old VHS camcorder. I don't use that anymore because there is almost nothing on which to play the tapes.

krw

unread,
Apr 12, 2016, 7:59:17 PM
to
On Tue, 12 Apr 2016 16:43:14 -0700 (PDT), jurb...@gmail.com wrote:

>>"The downside will be that it'll probably need to be an 80 inch screen
>>to get the full effect."
>
>I think the only market for anything that big will be in the US. I have heard from Europeans and mainly, they don't have the room.
>
>The next real advancement I think will be 3D. I mean real 3D with no glasses. I watched a demo of a new thing they were selling to architects to display a full 3D model without them having to build it. Remember those things at coin shops where the coin in inside the cabinet but they have this mirror that makes it appear to float in the air ? Well that thing must use some sort of similar principle. Somehow the LCD cells have to control reflectivity through a bunch of layers. Bottom line, it can be done but they did not mention the price. I would guess it is one of those things that if you have to ask you can't afford it.

I have no use for 3D. I doubt that the glasses have anything to do
with it. I don't like the effect.

>Some people will never like LCDs, those are the type who buy plasma TVs, which will probably soon be obsolete. I think the people who like them like them because they are actually watching phosphors. I doubt they'll ever get to 4K resolution in a plasma, if they did, it would probably cost more than the house it is in.

I much prefer plasmas, though modern LCDs are getting incredibly
close. The issue is dynamic range (how black is the black?). It's
the main difference in picture between $1.5K TVs and $3K TVs (65",
sort of size), IMO.

Yes, plasma TVs are all but obsolete. LCD manufacturing costs have
cut them off at the knees. The only plasmas left are "professional"
or "commercial" TVs. The 42" plasma TV in our family room will soon
be replaced with a 65" LCD but I missed last year's sell off.

>The diminishing returns also applies to people. I remember them selling those super tweeters that went up to 50 KHz. Why, to annoy the dog ?

We're a long way from that in TVs. There's probably a few more
decades of improvements left.

>I think that part of this really good TV picture thing, if you are far enough away, is due to them getting the colorimetry and gamma right. Seems they're pretty much done with flesh correction since NTSC is a thing of the past. Now it can bealot more accurate. In fact people used to call NTSC "Never The Same Color". They fixed that in PAL, and upped the resolution some. So we invented it here, but it was not the best system. I was told the only worse system was the one in Russia during the Soviet days. I also hear their CRTs sucked, worse than a 1990s Zenith.

Huh?

>But my speed with TV is my little 10" Panasonic. Never watch it though, it is not even plugged i, let alone to a box or source. Well yeah, a old VHS camcorder. I don't use that anymore because there is almost nothing on which to play the tapes.

Then you should have no opinion in such matters.

rickman

unread,
Apr 12, 2016, 8:40:25 PM
to
On 4/12/2016 7:43 PM, jurb...@gmail.com wrote:
>> "The downside will be that it'll probably need to be an 80 inch
>> screen to get the full effect."
>
> I think the only market for anything that big will be in the US. I
> have heard from Europeans and mainly, they don't have the room.

Not many have the room in the US either. Most walls have doors and
windows unless interior. Even then it can be hard to find six feet of
uninterrupted wall space in a good location unless the house is fairly
large.

--

Rick

tabb...@gmail.com

unread,
Apr 12, 2016, 8:55:31 PM
to
On Wednesday, 13 April 2016 00:43:19 UTC+1, jurb...@gmail.com wrote:

> I think that part of this really good TV picture thing, if you are far enough away, is due to them getting the colorimetry and gamma right. Seems they're pretty much done with flesh correction since NTSC is a thing of the past. Now it can bealot more accurate. In fact people used to call NTSC "Never The Same Color". They fixed that in PAL, and upped the resolution some.

PAL sort-of fixed it. By shifting the hue in the other direction every other line, the error cancelled out much of the time. But not when the colours got anywhere close to saturated.
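A toy numeric illustration of that cancellation (my own sketch, not a real decoder model): treat chroma as a complex number U + jV. A constant differential phase error rotates it by phi; because PAL transmits V negated on alternate lines, the error appears as -phi on those lines after decoding, so averaging a line pair cancels the hue shift and leaves only a cos(phi) saturation loss.

```python
import cmath, math

# Toy PAL model. Chroma = U + jV as a complex number; a constant
# differential phase error rotates it by phi in the channel. On
# V-switched lines the signal is conjugated before the channel and
# un-conjugated by the decoder, turning +phi into -phi.

def pal_two_line_average(chroma, phi):
    normal_line = chroma * cmath.exp(1j * phi)
    switched_line = (chroma.conjugate() * cmath.exp(1j * phi)).conjugate()
    return (normal_line + switched_line) / 2  # = chroma * cos(phi)

c = complex(0.3, 0.4)     # arbitrary hue/saturation
phi = math.radians(20)    # 20 degrees of differential phase error
out = pal_two_line_average(c, phi)
print(round(cmath.phase(out) - cmath.phase(c), 6))  # hue error: 0.0
print(round(abs(out) / abs(c), 3))                  # saturation: 0.94, i.e. cos(20 deg)
```

So a simple (delay-line) PAL decoder trades a visible hue error for a much less visible desaturation, which is consistent with the point that heavily saturated colours are where the trick shows its limits.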


> So we invented it here, but it was not the best system. I was told the only worse system was the one in Russia during the Soviet days. I also hear their CRTs sucked, worse than a 1990s Zenith.

Criticising Russian goods is fashionable, but mostly bs. It means nothing. Just don't buy the chocolates.

Some African countries had worse systems. They just stayed with old black & white only standards long after the world had gone colour.


NT

Les Cargill

unread,
Apr 12, 2016, 8:58:07 PM4/12/16
to
Oh, they are indeed. But the 80 inch is just more impressive because
there's no perceptible effect from the increased pixel size. So it
moves the bang factor up for the larger screen.

> We have
> three flat screens (1080P - two 40 and 42") in the house now. I was
> ready to pull the trigger on a 65" high-end Samsung 65" but SWMBO
> wasn't convinced so we let the deal escape. The best deal I can find
> now is $1K more than a month ago (new models).
>


Yeah, prices are still up and down. It's in the early adopter
arc.

--
Les Cargill

Richard Henry

unread,
Apr 12, 2016, 9:43:01 PM4/12/16
to
On Thursday, April 7, 2016 at 5:19:32 PM UTC-7, sean....@gmail.com wrote:
> Well it turned into a sigma curve as it must.

Sigmoid

DecadentLinuxUserNumeroUno

unread,
Apr 13, 2016, 2:15:20 AM
to
On Tue, 12 Apr 2016 18:42:54 -0700 (PDT), Richard Henry
<pome...@hotmail.com> Gave us:

>On Thursday, April 7, 2016 at 5:19:32 PM UTC-7, sean....@gmail.com wrote:
>> Well it turned into a sigma curve as it must.
>
>Sigmoid

Mandelbrot.

krw

unread,
Apr 13, 2016, 9:21:54 AM
to
On Tue, 12 Apr 2016 19:59:05 -0500, Les Cargill
I think the 4K prices have bottomed, more or less. I wanted to grab a
model-year-end closeout. Didn't happen and, of course, they aren't
discounting the new models as much (~$1K more).

0 new messages