XML tree


Vishal Oza

Mar 8, 2017, 9:38:30 PM
to ISO C++ Standard - Future Proposals
I just want to know if there is interest in representing XML, XHTML (I know XML and XHTML are essentially the same thing, but I am saying this for completeness), and HTML as a C++ std container. The reason I ask is to allow C++ to work on XML with algorithms and iterators rather than raw loops, recursion, and pointers.
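A minimal sketch of what that could look like, assuming a hypothetical value-semantic `xml_node` type (the name and interface here are invented for illustration, not taken from any actual proposal):

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Hypothetical sketch only: "xml_node" and its interface are illustrative,
// not part of any existing or proposed std:: API.
struct xml_node {
    std::string name;
    std::vector<xml_node> children;

    // Exposing child iterators lets standard algorithms replace raw loops.
    auto begin() const { return children.begin(); }
    auto end() const { return children.end(); }
};

// Count direct children with a given tag via an algorithm, not a loop.
std::size_t count_children(const xml_node& parent, const std::string& tag) {
    return std::count_if(parent.begin(), parent.end(),
                         [&](const xml_node& c) { return c.name == tag; });
}
```

With child iterators exposed this way, `std::find_if`, `std::any_of`, and the rest of `<algorithm>` apply directly instead of hand-written loops over raw pointers.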

Klaim - Joël Lamotte

Mar 9, 2017, 5:13:34 AM
to std-pr...@isocpp.org
You mean the DOM?


--
You received this message because you are subscribed to the Google Groups "ISO C++ Standard - Future Proposals" group.
To unsubscribe from this group and stop receiving emails from it, send an email to std-proposals+unsubscribe@isocpp.org.
To post to this group, send email to std-pr...@isocpp.org.
To view this discussion on the web visit https://groups.google.com/a/isocpp.org/d/msgid/std-proposals/6d10a34f-a079-4f8e-88be-db8f17810314%40isocpp.org.

Vishal Oza

Mar 9, 2017, 10:54:55 AM
to ISO C++ Standard - Future Proposals
Yes. In my proposal I would suggest having XML node objects be real objects, and limiting the use of pointers to handles for the underlying implementation.
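The "real object" idea could be sketched like this; `node`, `impl`, and `rename` are hypothetical names invented for illustration, not from the proposal:

```cpp
#include <memory>
#include <string>
#include <utility>

// Sketch: the public node type has value semantics, and the only pointer
// is a private handle to the underlying implementation (pimpl style).
class node {
    struct impl { std::string name; };  // underlying representation
    std::unique_ptr<impl> p_;           // the one hidden pointer (handle)

public:
    explicit node(std::string name)
        : p_(std::make_unique<impl>(impl{std::move(name)})) {}

    // Copying copies the underlying state, so nodes behave like values.
    node(const node& other) : p_(std::make_unique<impl>(*other.p_)) {}
    node& operator=(node other) { p_.swap(other.p_); return *this; }
    node(node&&) noexcept = default;

    const std::string& name() const { return p_->name; }
    void rename(std::string n) { p_->name = std::move(n); }
};
```

Deep-copying in the copy constructor is one possible choice; a proposal could equally pick shared or copy-on-write state. The point is only that the pointer stays an implementation detail.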

Klaim - Joël Lamotte

Mar 10, 2017, 11:46:05 AM
to std-pr...@isocpp.org
It has been proposed before at least once, I think.



dgutson .

Mar 10, 2017, 12:53:44 PM
to std-proposals
On Fri, Mar 10, 2017 at 1:46 PM, Klaim - Joël Lamotte <mjk...@gmail.com> wrote:
It has been proposed before at least once, I think.


First JSON, now XML, then YAML, and let's keep counting.
What is the problem with having libraries and letting the interface be part of their quality, not only the implementation?
I really don't think that the STL should be such an elephant covering every single niche. There are library vendors for that, Boost being one of them.
 

On 9 March 2017 at 16:54, Vishal Oza <vic...@gmail.com> wrote:
Yes, in my proposal I would suggest to have node xml objects as real object, and limit the use of pointers to stuff that handle to underlining implmenation

--
You received this message because you are subscribed to the Google Groups "ISO C++ Standard - Future Proposals" group.
To unsubscribe from this group and stop receiving emails from it, send an email to std-proposals+unsubscribe@isocpp.org.
To post to this group, send email to std-pr...@isocpp.org.
To view this discussion on the web visit https://groups.google.com/a/isocpp.org/d/msgid/std-proposals/f86266f7-c953-4232-8a7a-7d77a600b15e%40isocpp.org.

--
You received this message because you are subscribed to the Google Groups "ISO C++ Standard - Future Proposals" group.
To unsubscribe from this group and stop receiving emails from it, send an email to std-proposals+unsubscribe@isocpp.org.
To post to this group, send email to std-pr...@isocpp.org.



--
Who’s got the sweetest disposition?
One guess, that’s who?
Who’d never, ever start an argument?
Who never shows a bit of temperament?
Who's never wrong but always right?
Who'd never dream of starting a fight?
Who get stuck with all the bad luck?

Nicol Bolas

Mar 13, 2017, 2:37:50 PM
to ISO C++ Standard - Future Proposals


On Friday, March 10, 2017 at 12:53:44 PM UTC-5, dgutson wrote:




First JSON, now XML, then YAML, and let's keep counting.
What is the problem with having libraries and letting the interface be part of their quality, not only the implementation?
I really don't think that the STL should be such an elephant covering every single niche. There are library vendors for that, Boost being one of them.

While I understand the argument in general, for the specific cases mentioned here there are very good arguments for providing such things. It's all a matter of user space and ease of use.

C++ has a very small standard library, which means that if you want to do something of significance in C++, you have to either write it yourself or use a library. The problem with the latter is that C++ is also terrible at making it easy to incorporate other libraries into your build.

Just look at Boost. It's a gigantic ball of stuff. If Boost had an XML processor, how much other stuff would you have to include just to be able to process XML? Do you really need all of the other stuff Boost provides? Probably not.

And what of smaller libraries? Some of them use CMake as their build system. Others use straight-up makefiles. Others do something else. And they all vary in quality.

Oh sure, there is the danger of standardizing something that isn't used very often, or some technology that goes out of fashion or whatever. But at some point, C++ needs to grow up and start providing people with tools beyond basic stuff like containers and algorithms. Because people need those things, and the C++ world makes it very difficult to get them. Either we provide them, or they will move on to languages that do.

dgutson .

Mar 13, 2017, 4:14:41 PM
to std-proposals
On Mon, Mar 13, 2017 at 3:37 PM, Nicol Bolas <jmck...@gmail.com> wrote:



While I understand the argument in general, for the specific cases mentioned here, there are very good arguments for providing such things. It's all a matter of user-space and ease of usability.

C++ has a very small standard library. Which means that if you want to actually do something in C++ of significance, you have to either write it yourself or use a library. The problem with the latter is that C++ is also terrible at making it easy to incorporate other libraries into your build.

Just look at Boost. It's a gigantic ball of stuff. If Boost had an XML processor, how much other stuff would you have to include just to be able to process XML? Do you really need all of the other stuff Boost provides? Probably not.

And what of smaller libraries? Some of them use CMake as their build system. Others use straight-up makefiles. Others do something else. And they all vary in quality.

This sounds to me more like a problem related to modules and/or standardizing the build system (feasible or not, that is beyond the scope of this thread).

 


Nicol Bolas

Mar 13, 2017, 4:40:53 PM
to ISO C++ Standard - Future Proposals
On Monday, March 13, 2017 at 4:14:41 PM UTC-4, dgutson wrote:
This sounds to me more like a problem related to modules and/or standardizing the build system (feasible or not, that is beyond the scope of this thread).

That is what makes the argument compelling. Solving the build system problem is out of scope for the committee. But making the standard library more useful out-of-the-box is not out of scope. So why not do something that helps alleviate the problem to the extent that they can?

What precisely is the advantage of having a tiny standard library? Ease-of-implementation might be one, but C++ is already a difficult language to implement. And writing a quality standard library implementation is quite difficult as well. So adding a few more things on top of that isn't exactly overburdening implementers.

dgutson .

Mar 13, 2017, 5:39:05 PM
to std-proposals
On Mon, Mar 13, 2017 at 5:40 PM, Nicol Bolas <jmck...@gmail.com> wrote:
That is what makes the argument compelling. Solving the build system problem is out of scope for the committee. But making the standard library more useful out-of-the-box is not out of scope. So why not do something that helps alleviate the problem to the extent that they can?

I insist: you solve the issue you mentioned with modules. Wait for them, and you and everybody else will be happier XML users.
 

What precisely is the advantage of having a tiny standard library? Ease-of-implementation might be one, but C++ is already a difficult language to implement. And writing a quality standard library implementation is quite difficult as well. So adding a few more things on top of that isn't exactly overburdening implementers.

That's it; no more contributions from me to this thread.
 


Bengt Gustafsson

Mar 14, 2017, 7:25:50 PM
to ISO C++ Standard - Future Proposals
I think Nicol nailed it when it comes to describing the situation. What I would like to see is a set of libraries that are layered on top of the current standard library and reach further into application land; XML parsing is a very good example. The main difference from the current library would be that it is a source-code component rather than a specification. This way the burden on compiler vendors is minimized, but to be standard-compliant the libraries must be delivered with the compiler offering.

Maybe ISO is not the right "host" for such an offering, but someone must put pressure on the compiler vendors to take the effort to integrate these libraries and distribute them in as easy-to-use a form as the current libraries. To me it seems that the old conflict between open-source and commercial interests has abated enough to make something like this possible. There are so many good libraries out there, but it takes so much time to get them into a usable state of compilation: so much wasted programmer time.

olafv...@gmail.com

Mar 15, 2017, 11:58:05 AM
to ISO C++ Standard - Future Proposals
On Monday, March 13, 2017 at 9:40:53 PM UTC+1, Nicol Bolas wrote:
This sounds to me more like a problem related to modules and/or standardizing the build system (feasible or not, that is beyond the scope of this thread).

That is what makes the argument compelling. Solving the build system problem is out of scope for the committee. But making the standard library more useful out-of-the-box is not out of scope. So why not do something that helps alleviate the problem to the extent that they can?

What precisely is the advantage of having a tiny standard library? Ease-of-implementation might be one, but C++ is already a difficult language to implement. And writing a quality standard library implementation is quite difficult as well. So adding a few more things on top of that isn't exactly overburdening implementers.

Or you make it easier (outside of the ISO committee) to consume third-party libs, and you solve problems for a lot more areas and users.

Bengt Gustafsson

Mar 15, 2017, 4:05:51 PM
to ISO C++ Standard - Future Proposals, olafv...@gmail.com
I think we need some endorsement from the committee, or at least from major compiler vendors/suppliers, that they will supply those libraries selected by (some other) committee; otherwise we will not get libraries with the "plug and play" feeling that other languages mostly have but that C++ (and C) sorely lacks. That is, rather strict guidelines for eligibility.

Matthew Woehlke

Mar 16, 2017, 12:39:43 PM
to std-pr...@isocpp.org
On 2017-03-13 16:40, Nicol Bolas wrote:
> On Monday, March 13, 2017 at 4:14:41 PM UTC-4, dgutson wrote:
>> This sounds to me more like a problem related to modules and/or standardizing the
>> build system (feasible or not, that is beyond the scope of this thread).
>
> That is what makes the argument compelling. Solving the build system
> problem is out of scope for the committee. But making the standard library
> more useful out-of-the-box is *not* out of scope. So why not do something
> that helps alleviate the problem to the extent that they can?

One of the reasons libraries like Boost, Qt, etc. are so successful is
because they are willing and able to make compatibility-breaking changes
every few years, which "core C++" historically has not been willing to do.

There are several concerns with a large standard library:

- There is no single implementation. Unless we somehow develop a system
of "blessing" a particular existing implementation when a new (hunk of)
library is added to the standard, the workload of simply implementing new
stuff is... large. There are of course licensing issues with adopting an
existing implementation. Also, the various implementations are almost
certainly going to have divergence in QoI (quality of implementation).

- Unwillingness to break compatibility will make it difficult to keep up
with new developments in software design. Just look at e.g. QML compared
to how GUI development happened in the Qt 3.x days. Or look at web
rendering technology over the last decade. Bad design choices become
immutable. Codifying solutions to these sorts of problems into a
slow-moving standard is just asking for them to become rapidly obsolete.

- Monopoly. Software, like many things, thrives on competition. Having
multiple libraries to do the same thing is both a curse and a blessing.
The library that is the best fit for my application may not be the best
fit for yours, but having choices allows each of us to use what works
best *for our use cases*. See for example the recent JSON thread and its
heated debate on DOM vs. SAX vs. StAX. One size does not fit all.
Competing libraries drive progress.
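The DOM vs. SAX trade-off mentioned above can be illustrated with a toy contrast; these types are invented for the example and match no real library's API:

```cpp
#include <functional>
#include <string>
#include <vector>

// DOM style: the parser materializes a whole tree that callers then query
// at will, at the cost of holding the entire document in memory.
struct dom_node {
    std::string tag;
    std::vector<dom_node> children;
};

// SAX style: the parser pushes events to a callback and keeps no tree,
// which uses far less memory but forces the caller to track state.
// Here we fake "parsing" by walking an already-built tree and emitting
// one event per element.
void sax_walk(const dom_node& n,
              const std::function<void(const std::string&)>& on_element) {
    on_element(n.tag);
    for (const auto& c : n.children) sax_walk(c, on_element);
}
```

A DOM suits random access and repeated traversal; SAX suits streaming huge documents with little memory; StAX sits in between by letting the caller pull events. That is exactly why one interface rarely fits every use case.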

> Writing a *quality* standard library implementation is quite
> difficult as well. So adding a few more things on top of that isn't
> exactly overburdening implementers.

You assume that library implementers aren't *already* overburdened. I'm
not sure I can agree with that...

Going back to your previous comments, the real reason we have these
issues is not lack of library standardization, it is lack of packaging
and distribution standardization. If installing and using third-party
C++ libraries was as easy as installing and using third-party Python
modules, we wouldn't be having this conversation.

The work-in-progress Common Package Specification¹ hopes to alleviate
the 'consuming' side.

(¹ https://github.com/mwoehlke/cps)

--
Matthew

Matthew Woehlke

Mar 16, 2017, 12:50:32 PM
to std-pr...@isocpp.org, olafv...@gmail.com
On 2017-03-15 16:05, Bengt Gustafsson wrote:
> On Wednesday, March 15, 2017 at 4:58:05 PM UTC+1, olafv...@gmail.com wrote:
>> Or you make it easier (outside of ISO committee) to consume third-party
>> libs and you solve problems for a lot more areas and users.
>
> I think we need some endorsement from the commitee or at least from major
> compiler vendors/suppliers that they will supply those libraries selected
> by (some other) commitee or we will not get libraries with the "plug and
> play" feeling that other languages mostly have, but C++ (and C) sorely
> lacks. That is, rather strict guidelines to be eligible.

I don't think this is the job of compiler vendors. Rather, it is the job
of operating system distributors. Linux distributions have effectively
solved this problem a long, long time ago. What we need is better tools
for MacOS and *especially* Windows to install, update and otherwise
manage "third party" libraries.

App stores, which are sharply targeted at end user applications, aren't
the answer. While they do a good job on the user experience end, I'm not
aware that they do dependency management. There is also the problem of
dealing with differing library versions.

The battle between open and closed source software comes into play here
also. Proprietary software has the choice of either bundling all its
dependencies, effectively distributing a monolithic build, or else
ceasing to function whenever shared libraries are updated. Open source
software can be packaged by the distributor and rebuilt on demand as
needed, and anyone can contribute fixes as needed when libraries change.
In a nutshell, proprietary software is excluded from playing in the
shared libraries sandbox, while open source software is part of a
community where everyone can work together.

(Making libraries "frozen in time" is not a viable option either; see my
previous post in this thread.)

--
Matthew

Olaf van der Spek

Mar 16, 2017, 3:07:10 PM
to Matthew Woehlke, std-pr...@isocpp.org
2017-03-16 17:49 GMT+01:00 Matthew Woehlke <mwoehlk...@gmail.com>:
> I don't think this is the job of compiler vendors. Rather, it is the job
> of operating system distributors. Linux distributions have effectively
> solved this problem a long, long time ago. What we need is better tools
> for MacOS and *especially* Windows to install, update and otherwise
> manage "third party" libraries.

vcpkg?

> App stores, which are sharply targeted at end user applications, aren't
> the answer. While they do a good job on the user experience end, I'm not
> aware that they do dependency management. There is also the problem of
> dealing with differing library versions.
>
> The battle between open and closed source software comes into play here
> also. Proprietary software has the choice of either bundling all its
> dependencies, effectively distributing a monolithic build, or else
> ceasing to function whenever shared libraries are updated. Open source
> software can be packaged by the distributor and rebuilt on demand as
> needed, and anyone can contribute fixes as needed when libraries change.
> In a nutshell, proprietary software is excluded from playing in the
> shared libraries sandbox, while open source software is part of a
> community where everyone can work together.

Updates to shared libs should NOT break old consumers.
If an update is incompatible it should be installed side-by-side with
the old version.


--
Olaf

Matthew Woehlke

Mar 16, 2017, 3:22:56 PM
to std-pr...@isocpp.org
On 2017-03-16 15:07, Olaf van der Spek wrote:
> 2017-03-16 17:49 GMT+01:00 Matthew Woehlke wrote:
>> I don't think this is the job of compiler vendors. Rather, it is the job
>> of operating system distributors. Linux distributions have effectively
>> solved this problem a long, long time ago. What we need is better tools
>> for MacOS and *especially* Windows to install, update and otherwise
>> manage "third party" libraries.
>
> vcpkg?

News to me. Maybe it will help; that would be nice if it does... I'm
guessing it relies on users to use CMake in order to be able to consume
the resulting packages? Or do they have their own system for that?

>> App stores, which are sharply targeted at end user applications, aren't
>> the answer. While they do a good job on the user experience end, I'm not
>> aware that they do dependency management. There is also the problem of
>> dealing with differing library versions.
>>
>> The battle between open and closed source software comes into play here
>> also. Proprietary software has the choice of either bundling all its
>> dependencies, effectively distributing a monolithic build, or else
>> ceasing to function whenever shared libraries are updated. Open source
>> software can be packaged by the distributor and rebuilt on demand as
>> needed, and anyone can contribute fixes as needed when libraries change.
>> In a nutshell, proprietary software is excluded from playing in the
>> shared libraries sandbox, while open source software is part of a
>> community where everyone can work together.
>
> Updates to shared libs should NOT break old consumers.
> If an update is incompatible it should be installed side-by-side with
> the old version.

In an ideal world, yes, but how often is that the case? This would
require that all libraries are installed in a way that their headers and
other artifacts are isolated in version-specific locations, so that
multiple versions can be installed. For most packages on most POSIX-like
platforms, that is simply not the case.

Major libraries of better quality do that, but many don't. (I know Qt
tries hard to keep both SC (source compatibility) and BC (binary
compatibility) within a minor revision, and their
major versions are usually co-installable. I would assume, but can't
verify from firsthand experience, that GTK, Boost and the like do also.
OTOH, I don't think PROJ does, and I'm not sure about such critical
libraries as libjpeg, libpng, zlib, etc., although on the plus side
there is usually a direct correlation between how often a library is
likely to *have* compatibility breaks and how likely they are to take
steps to mitigate consequences for their users. The problem children are
usually smaller, more "niche" libraries with sporadic releases and often
few developers.)

--
Matthew

Olaf van der Spek

Mar 16, 2017, 3:27:19 PM
to std-pr...@isocpp.org
2017-03-16 20:22 GMT+01:00 Matthew Woehlke <mwoehlk...@gmail.com>:
> On 2017-03-16 15:07, Olaf van der Spek wrote:
>> 2017-03-16 17:49 GMT+01:00 Matthew Woehlke wrote:
>>> I don't think this is the job of compiler vendors. Rather, it is the job
>>> of operating system distributors. Linux distributions have effectively
>>> solved this problem a long, long time ago. What we need is better tools
>>> for MacOS and *especially* Windows to install, update and otherwise
>>> manage "third party" libraries.
>>
>> vcpkg?
>
> News to me. Maybe it will help; that would be nice if it does... I'm
> guessing it relies on users to use CMake in order to be able to consume
> the resulting packages? Or do they have their own system for that?

No, if you use the IDE (or maybe just MSBuild) you only have to
install a pkg once ('globally'); you don't have to add any per-project
settings.

>>> App stores, which are sharply targeted at end user applications, aren't
>>> the answer. While they do a good job on the user experience end, I'm not
>>> aware that they do dependency management. There is also the problem of
>>> dealing with differing library versions.
>>>
>>> The battle between open and closed source software comes into play here
>>> also. Proprietary software has the choice of either bundling all its
>>> dependencies, effectively distributing a monolithic build, or else
>>> ceasing to function whenever shared libraries are updated. Open source
>>> software can be packaged by the distributor and rebuilt on demand as
>>> needed, and anyone can contribute fixes as needed when libraries change.
>>> In a nutshell, proprietary software is excluded from playing in the
>>> shared libraries sandbox, while open source software is part of a
>>> community where everyone can work together.
>>
>> Updates to shared libs should NOT break old consumers.
>> If an update is incompatible it should be installed side-by-side with
>> the old version.
>
> In an ideal world, yes, but how often is that the case? This would
> require that all libraries are installed in a way that their headers and
> other artifacts are isolated in version-specific locations, so that

Headers? I'm confused, how are headers a problem for deploying
closed-source software?

> multiple versions can be installed. For most packages on most POSIX-like
> platforms, that is simply not the case.
>
> Major libraries of better quality do that, but many don't. (I know Qt
> tries hard to keep both SC and BC within a minor revision, and their
> major versions are usually co-installable. I would assume, but can't
> verify from firsthand experience, that GTK, Boost and the like do also.
> OTOH, I don't think PROJ does, and I'm not sure about such critical
> libraries as libjpeg, libpng, zlib, etc., although on the plus side
> there is usually a direct correlation between how often a library is
> likely to *have* compatibility breaks and how likely they are to take
> steps to mitigate consequences for their users. The problem children are
> usually smaller, more "niche" libraries with sporadic releases and often
> few developers.)

Distributing closed-source software for Linux is problematic, true.


--
Olaf

Matthew Woehlke

Mar 16, 2017, 4:10:41 PM
to std-pr...@isocpp.org
On 2017-03-16 15:27, Olaf van der Spek wrote:
> 2017-03-16 20:22 GMT+01:00 Matthew Woehlke wrote:
>> On 2017-03-16 15:07, Olaf van der Spek wrote:
>>> 2017-03-16 17:49 GMT+01:00 Matthew Woehlke wrote:
>>>> I don't think this is the job of compiler vendors. Rather, it is the job
>>>> of operating system distributors. Linux distributions have effectively
>>>> solved this problem a long, long time ago. What we need is better tools
>>>> for MacOS and *especially* Windows to install, update and otherwise
>>>> manage "third party" libraries.
>>>
>>> vcpkg?
>>
>> News to me. Maybe it will help; that would be nice if it does... I'm
>> guessing it relies on users to use CMake in order to be able to consume
>> the resulting packages? Or do they have their own system for that?
>
> No, if you use the IDE (or maybe just MSBuild) you only have to
> install a pkg once ('globally'), you don't have to add any per-project
> settings.

I don't think we're on the same page...

Let's say I am writing software A that uses build system X and needs
library B. Let's say that I used vcpkg to get B on my system. How does A
know how to use B? Does vcpkg itself provide anything on top of what B
would provide if I just acquired B from its upstream and built it "the
usual way"? If it does, must X be some specific build system to take
advantage of it?

I suspect the answer is either "X must be CMake" or "X must be VS", in
which case we haven't fully solved the problem. (CPS¹ would help with
that...)

(¹ https://github.com/mwoehlke/cps)

>>> Updates to shared libs should NOT break old consumers.
>>> If an update is incompatible it should be installed side-by-side with
>>> the old version.
>>
>> In an ideal world, yes, but how often is that the case? This would
>> require that all libraries are installed in a way that their headers and
>> other artifacts are isolated in version-specific locations, so that
>
> Headers? I'm confused, how are headers a problem for deploying
> closed-source software?

They are a problem if I want to build any software that uses such
libraries. Remember, we're not talking about "leaf" applications:
usually you would not install multiple versions of those. We're talking
about *shared* libraries that are used by other applications.

Anyway, headers aren't the only problem; what about libraries with
run-time data files? (PROJ comes to mind again...) Headers are just the
most obvious thing likely to differ by version but not include version
information. *Every* artifact of an installed shared library needs to
reside in a version-specific location in some manner in order for
multiple versions of that library to be co-installable.

That means that either you have many, many versions of *every* library,
so that every installed version has the *exact* version of all its
dependencies that it was built against (and you can never update²), or
else Windows needs to figure out something like SO-versioning.

(² Admittedly this is a mixed curse. On the one hand, the end
application knows exactly what they've got library-wise. On the other,
the application can't see bug fixes *or security updates* in a library
without rebuilding the application, which is only practical for open
source applications that are built by the end user, Gentoo-style, or
where the OS distributor distributes the build. Not being able to update
the library without also updating the application is not much different
from just using static libraries, at which point you're effectively just
dodging/ignoring the problem. Historically, Linux distributions have
looked VERY POORLY on this sort of thing.)

--
Matthew

Olaf van der Spek

unread,
Mar 16, 2017, 5:02:19 PM
to std-pr...@isocpp.org
2017-03-16 21:10 GMT+01:00 Matthew Woehlke <mwoehlk...@gmail.com>:
>>>> vcpkg?
>>>
>>> News to me. Maybe it will help; that would be nice if it does... I'm
>>> guessing it relies on users to use CMake in order to be able to consume
>>> the resulting packages? Or do they have their own system for that?
>>
>> No, if you use the IDE (or maybe just MSBuild) you only have to
>> install a pkg once ('globally'), you don't have to add any per-project
>> settings.
>
> I don't think we're on the same page...
>
> Let's say I am writing software A that uses build system X and needs
> library B. Let's say that I used vcpkg to get B on my system. How does A
> know how to use B? Does vcpkg itself provide anything on top of what B
> would provide if I just acquired B from its upstream and built it "the
> usual way"? If it does, must X be some specific build system to take
> advantage of it?
>
> I suspect the answer is either "X must be CMake" or "X must be VS", in
> which case we haven't fully solved the problem. (CPS¹ would help with
> that...)
>
> (¹ https://github.com/mwoehlke/cps)

X can be CMake or VS; if it's not, I guess you'll have to point X to
the include and lib dirs vcpkg created.

>>>> Updates to shared libs should NOT break old consumers.
>>>> If an update is incompatible it should be installed side-by-side with
>>>> the old version.
>>>
>>> In an ideal world, yes, but how often is that the case? This would
>>> require that all libraries are installed in a way that their headers and
>>> other artifacts are isolated in version-specific locations, so that
>>
>> Headers? I'm confused, how are headers a problem for deploying
>> closed-source software?
>
> They are a problem if I want to build any software that uses such
> libraries. Remember, we're not talking about "leaf" applications:
> usually you would not install multiple versions of those. We're talking
> about *shared* libraries that are used by other applications.

Ah... vcpkg only concerns itself with distributing the build
dependencies, not with further distribution to end users.

> Anyway, headers aren't the only problem; what about libraries with
> run-time data files? (PROJ comes to mind again...) Headers are just the
> most obvious thing likely to differ by version but not include version
> information. *Every* artifact of an installed shared library needs to
> reside in a version-specific location in some manner in order for
> multiple versions of that library to be co-installable.
>
> That means that either you have many, many versions of *every* library,
> so that every installed version has the *exact* version of all its

You don't need the exact versions, you merely need a compatible version.
I think on Windows this currently causes tons of duplication.

> dependencies that it was built against (and you can never update²), or
> else Windows needs to figure out something like SO-versioning.
>
> (² Admittedly this is a mixed curse. On the one hand, the end
> application knows exactly what they've got library-wise. On the other,
> the application can't see bug fixes *or security updates* in a library
> without rebuilding the application, which is only practical for open
> source applications that are built by the end user, Gentoo-style, or
> where the OS distributor distributes the build. Not being able to update
> the library without also updating the application is not much different
> from just using static libraries, at which point you're effectively just
> dodging/ignoring the problem. Historically, Linux distributions have
> looked VERY POORLY on this sort of thing.)

How so?
AFAIK Linux distros have no trouble shipping security updates of shared libs.

Thiago Macieira

unread,
Mar 17, 2017, 2:01:20 AM
to std-pr...@isocpp.org
On Thursday, 16 March 2017 at 09:38:14 PDT, Matthew Woehlke wrote:
> One of the reasons libraries like Boost, Qt, etc. are so successful is
> because they are willing and able to make compatibility-breaking changes
> every few years, which "core C++" historically has not been willing to do.

Note that the libraries implementing the standard have done compatibility
breaks too. The standard does not say how its classes must be implemented,
only what they have to do once implemented.

> There are several concerns with a large standard library:
>
> - There is no single implementation. Unless we somehow develop a system
> of "blessing" a particular existing implementation when a new (hunk of)
> library is added to the standard, the work load simply implementing new
> stuff is... large. There are of course licensing issues with adopting an
> existing implementation. Also, the various implementations are almost
> certainly going to have divergence in QoI.

Don't forget that creating more work for the Standard Library developers (of
whom there is a limited quantity) will most likely cause a drop in quality too
or a delay in providing full functionality.

> Going back to your previous comments, the real reason we have these
> issues is not lack of library standardization, it is lack of packaging
> and distribution standardization. If installing and using third-party
> C++ libraries was as easy as installing and using third-party Python
> modules, we wouldn't be having this conversation.

Or Go. All dependencies are listed by their repository URL. Go has no static
or shared library. It forces open source.


--
Thiago Macieira - thiago (AT) macieira.info - thiago (AT) kde.org
Software Architect - Intel Open Source Technology Center

Matthew Woehlke

unread,
Mar 17, 2017, 11:29:40 AM
to std-pr...@isocpp.org
On 2017-03-17 02:01, Thiago Macieira wrote:
> On Thursday, 16 March 2017 at 09:38:14 PDT, Matthew Woehlke wrote:
>> One of the reasons libraries like Boost, Qt, etc. are so successful is
>> because they are willing and able to make compatibility-breaking changes
>> every few years, which "core C++" historically has not been willing to do.
>
> Note that the libraries implementing the standard have done compatibility
> breaks too. The standard does not say how its classes must be implemented,
> only what they have to do once implemented.

Yes, but mostly BC (binary compatibility) breaks. I'm thinking more of
SC (source compatibility), which is the only level of compatibility
that can be talked about at the specification level, and is also more
restrictive. (You can break BC without breaking SC, but you generally
can't break SC without also breaking BC.)

Well, there have been *some* SC breaks (`auto` for example), but they
tend to take a long time to push through. Compare to something like Qt,
GTK, etc., which allow SC breaks every few years and are generally
aggressive about removing "bad" API.

>> There are several concerns with a large standard library:
>>
>> - There is no single implementation. Unless we somehow develop a system
>> of "blessing" a particular existing implementation when a new (hunk of)
>> library is added to the standard, the work load simply implementing new
>> stuff is... large. There are of course licensing issues with adopting an
>> existing implementation. Also, the various implementations are almost
>> certainly going to have divergence in QoI.
>
> Don't forget that creating more work for the Standard Library developers (of
> whom there is a limited quantity) will most likely cause a drop in quality too
> or a delay in providing full functionality.

Er... right? I was certainly *trying* to imply that with the above ;-).

>> Going back to your previous comments, the real reason we have these
>> issues is not lack of library standardization, it is lack of packaging
>> and distribution standardization. If installing and using third-party
>> C++ libraries was as easy as installing and using third-party Python
>> modules, we wouldn't be having this conversation.
>
> Or Go. All dependencies are listed by their repository URL. Go has no static
> or shared library. It forces open source.

Or perl. Or... :-)

--
Matthew

Thiago Macieira

unread,
Mar 17, 2017, 12:22:59 PM
to std-pr...@isocpp.org
On Friday, 17 March 2017 at 08:29:35 PDT, Matthew Woehlke wrote:
> Yes, but mostly BC breaks. I'm thinking more of SC, which is the only
> level of compatibility that can be talked about at the specification
> level, and is also more restrictive. (You can break BC without breaking
> SC, but you generally can't break SC without also breaking BC.)

Sure you can. The easiest ways: #ifdef and ambiguous overloads.

> >> Going back to your previous comments, the real reason we have these
> >> issues is not lack of library standardization, it is lack of packaging
> >> and distribution standardization. If installing and using third-party
> >> C++ libraries was as easy as installing and using third-party Python
> >> modules, we wouldn't be having this conversation.
> >
> > Or Go. All dependencies are listed by their repository URL. Go has no
> > static or shared library. It forces open source.
>
> Or perl. Or... :-)

Perl is like Python: not a compiled language. Go is compiled but does not
accept pre-compiled modules.

Matthew Woehlke

unread,
Mar 17, 2017, 1:11:57 PM
to std-pr...@isocpp.org
On 2017-03-17 12:22, Thiago Macieira wrote:
> On Friday, 17 March 2017 at 08:29:35 PDT, Matthew Woehlke wrote:
>>>> Going back to your previous comments, the real reason we have these
>>>> issues is not lack of library standardization, it is lack of packaging
>>>> and distribution standardization. If installing and using third-party
>>>> C++ libraries was as easy as installing and using third-party Python
>>>> modules, we wouldn't be having this conversation.
>>>
>>> Or Go. All dependencies are listed by their repository URL. Go has no
>>> static or shared library. It forces open source.
>>
>> Or perl. Or... :-)
>
> Perl is like Python: not a compiled language. Go is compiled but does not
> accept pre-compiled modules.

Python modules can be pure Python code, but they can also include
pre-compiled code. In fact, a significant portion¹ of my
distro-installed Python modules are .so's.

Granted, the distribution system for these is either left to the OS
vendor or assumes availability of source code, however the latter case
also usually means use of a standard build system (i.e. setuptools).
Which... goes back to my original point: we need a standard way to
distribute libraries whose final installed form is a compiled artifact.
Python has generally solved this problem, and by nature doesn't have the
problems that C and C++ have with consuming third party libraries (or
modules, or whatever nomenclature applies).

Stuff like cppan², vcpkg³ and CPS⁴ could help here. FLOSS-based OS
distros generally solve the packaging and distribution problem, so we
just need to solve the "how to consume it" problem (CPS); that leaves
mainly Windows, which still needs to solve packaging and distribution.
It's not clear if vcpkg will solve that for end users, or only for
developers (leaving application developers still needing to solve the
problem for their end users). I'm not very familiar with cppan either,
but at a glance, it looks like it may be more promising than vcpkg as an
end to end solution.

I think the best solution would be to converge on something like cppan
(which — unlike vcpkg — is cross platform), for build systems to learn
to integrate with cppan in order to obtain dependencies on demand, and
for OS vendors to adopt and package cppan so that it is able to install
vendor-packaged dependencies when available and to keep
non-vendor-packaged dependencies segregated in a way that plays nice
with those parts of the file system that are OS-managed.

The third step is not strictly necessary, but OS vendors will likely
have sufficient motivation to do it anyway if/when cppan becomes
popular. The second step can be as easy as teaching the build system to
suggest a cppan command to run when a dependency is not found.

Of course, this all works great for open source libraries. Vendors of
proprietary software will still have to carry the burden of producing
pre-built binaries for all supported platforms. I would hope that cppan
could still make it possible for users to obtain such packages in a
standard way, if suitable packages exist.

This brings up a related problem: what all is involved in identifying if
a pre-built binary will run on a user's platform? We need to know at
least the ISA, kernel, and kernel version, but what else? Standard
library version? Compiler? Compiler version? Others? (Not all of these
will matter for all kernels.)

(¹ At a rough count, I would say I have about 80 Python modules that are
either wholly or partly pre-compiled code, out of maybe 500 modules
installed. These numbers may not count "built-in" modules in either
category, and I believe the latter number is an upper bound and might be
significantly inflated.)

(² https://cppan.org/)

(³ https://github.com/Microsoft/vcpkg)

(⁴ https://github.com/mwoehlke/cps)

Thiago Macieira

unread,
Mar 17, 2017, 4:32:35 PM
to std-pr...@isocpp.org
On Friday, 17 March 2017 at 10:11:52 PDT, Matthew Woehlke wrote:
> On 2017-03-17 12:22, Thiago Macieira wrote:
> > On Friday, 17 March 2017 at 08:29:35 PDT, Matthew Woehlke wrote:
> >>>> Going back to your previous comments, the real reason we have these
> >>>> issues is not lack of library standardization, it is lack of packaging
> >>>> and distribution standardization. If installing and using third-party
> >>>> C++ libraries was as easy as installing and using third-party Python
> >>>> modules, we wouldn't be having this conversation.
> >>>
> >>> Or Go. All dependencies are listed by their repository URL. Go has no
> >>> static or shared library. It forces open source.
> >>
> >> Or perl. Or... :-)
> >
> > Perl is like Python: not a compiled language. Go is compiled but does not
> > accept pre-compiled modules.
>
> Python modules can be pure Python code, but they can also include
> pre-compiled code. In fact, a significant portion¹ of my
> distro-installed Python modules are .so's.

That's not Python compiled code (that would be .pyc, though those are
more of a cache than compiled output). Those are native-code
implementations of some Python functionality. The same happens in
almost all languages other than C and C++.

It doesn't change the fact that Python is actually an interpreted language. Go
isn't.

> Which... goes back to my original point: we need a standard way to
> distribute libraries whose final installed form is a compiled artifact.
> Python has generally solved this problem, and by nature doesn't have the
> problems that C and C++ have with consuming third party libraries (or
> modules, or whatever nomenclature applies).

The interesting thing is that what makes C and C++ so attractive for
system programming is that there is no requirement to deploy them in
any particular way. Each system can choose how to deploy them to suit
its needs.

> This brings up a related problem: what all is involved in identifying if
> a pre-built binary will run on a user's platform? We need to know at
> least the ISA, kernel, and kernel version, but what else? Standard
> library version? Compiler? Compiler version? Others? (Not all of these
> will matter for all kernels.)

Yes to all of the above (including the "others").

Nevin Liber

unread,
Mar 17, 2017, 5:05:27 PM
to std-pr...@isocpp.org
On Fri, Mar 17, 2017 at 1:01 AM, Thiago Macieira <thi...@macieira.org> wrote:
Don't forget that creating more work for the Standard Library developers (of
whom there is a limited quantity) will most likely cause a drop in quality too
or a delay in providing full functionality.

This argument, of course, can be used against adding *any* more standard libraries, modifying current libraries, etc.  Maybe we should rename this list to std-no-more-proposals.

Library vendors are on the committee.  Why not make a proposal and let them make the choice?
-- 
 Nevin ":-)" Liber  <mailto:ne...@eviloverlord.com>  +1-847-691-1404

Matthew Woehlke

unread,
Mar 17, 2017, 5:10:01 PM
to std-pr...@isocpp.org
Python modules are generally easy to install. Some Python modules,
which presumably are in the "easily installed" category, may consist of
compiled C/C++/etc. code. Ergo, Python is a legitimate example of a
language that has apparently solved the packaging and distribution problem.

It may be that the above is true only for open source modules... but so
what? That just means that solving the problem for C++ may be harder for
proprietary software.

I don't see how furthering this discussion is beneficial to C++.

>> Which... goes back to my original point: we need a standard way to
>> distribute libraries whose final installed form is a compiled artifact.
>> Python has generally solved this problem, and by nature doesn't have the
>> problems that C and C++ have with consuming third party libraries (or
>> modules, or whatever nomenclature applies).
>
> The interesting thing is that the reason that makes C and C++ so attractive
> for system programming is that there is no requirement to deploy it any
> particular way. Each system can choose how to deploy it to suit its needs.

Well, then, we have a catch-22. Being able to easily consume third party
libraries requires some form of standardization.

I don't see why having such a system wouldn't allow users to opt out of
that system (you can do something similar with Python by installing
stuff in non-standard paths), but then the burden is on you to find and
use your third party components.

>> This brings up a related problem: what all is involved in identifying if
>> a pre-built binary will run on a user's platform? We need to know at
>> least the ISA, kernel, and kernel version, but what else? Standard
>> library version? Compiler? Compiler version? Others? (Not all of these
>> will matter for all kernels.)
>
> Yes to all of the above (including the "others").

Whee! Well, I was at least rather hoping to pin down what "others"
includes...

One of the problems CPS needs to solve is how to specify what platform a
package pertains to, so that a user searching for a package can ignore
packages that aren't built for their system (e.g. ignore i686 packages
when building x86_64 software). I'd appreciate any insight into that
problem...

--
Matthew

Thiago Macieira

unread,
Mar 17, 2017, 6:19:22 PM
to std-pr...@isocpp.org
On Friday, 17 March 2017 at 14:09:56 PDT, Matthew Woehlke wrote:
> Python modules are generally easy to be installed. Some python modules,
> which presumably are in the "easily installed" category, may consist of
> compiled C/C++/etc. code. Ergo, Python is a legitimate example of a
> language that has apparently solved the packaging and distribution problem.

Not exactly. The modules that contain compiled C and C++ code most often do so
because they link to a C or C++ library, which must be installed before the
Python module is compiled.

Since that isn't a solved problem, deployment of Python modules that include
compiled code is not a solved problem.

Not to mention that every time I set up a new computer, I have to hunt down
a compatible Python-ZOAP module, because they never compile properly.