On 21/07/2022 17:01, Mut...@dastardlyhq.com wrote:
> On Thu, 21 Jul 2022 16:01:54 +0200
> David Brown <david...@hesbynett.no> wrote:
>> On 21/07/2022 02:06, Chris Vine wrote:
>>> On Thu, 21 Jul 2022 01:29:08 +0200
>>> Manfred <non...@add.invalid> wrote:
>>>> One major problem I see with all those wannabe C++ successors is that C++
>>>> has built a foundation over several decades, which counts for the
>>>> reliability and stability of the language (until the committee manages
>>>> to ruin this by continuing to do what it has been doing of late).
>>>> These qualities are very valuable in projects and organizations where
>>>> product quality matters.
>>>>
>>>> In order to gain a comparable level of recognition, a successor of C++
>>>> would have to build a similar foundation, and die along the way.
>>>
>>> Indeed, as Keynes said, "In the long run we are all dead". No one knows
>>> what language people (if they then exist) will be programming in in 100
>>> years' time.
>>>
>>
>> We know some of it. We'll still have C, Cobol, and a bit of Fortran :-)
>
> It'll depend heavily on how hardware evolves, and I suspect the hardware
> that exists in 100 years will bear little to no resemblance to what we
> have now, either in physical construction or logical operation. Perhaps it'll
> be quantum, perhaps it'll be something that hasn't even been thought of yet.
>
C, Cobol and Fortran have been around for 50 years or more, and are all
still in serious use. Other languages come and go - and some have come,
but haven't gone yet (like C++). So the best guess we have as to the
languages of the future is these apparently everlasting languages. (I
don't claim they will be the /only/ languages, or even the most popular
ones - just that they'll still be around and in use.)

Quantum computers are an expensive excuse to play with cool toys. I
think it is unlikely that they will ever be cost-effective for solving
real-world problems (as distinct from completely artificial ones
invented just to suit quantum computers). They /might/ turn out to be
helpful for certain specific optimisation problems. But for "everyday"
computing, they haven't a chance, and never will.
(Feel free to contact me in a hundred years if I turn out to be wrong!)