I'm not trying to say that we should (or should not) bring back the old Concepts proposal, but rather that we should fix the fundamental issues first (the ones which affect the 99%) before coming up with solutions for new features that are influenced by them.
I've started to think that maybe the community is going about this the wrong way: maybe the first step, and the highest priority above everything else, should be to standardize a module system so we can get away from the textual inclusion we have now. Then come back to features, features which can influence (and be influenced by) compilation times, and see how they perform.
To me it does not seem to make much sense to accept/reject solutions which are disadvantaged by an old, outdated system (textual includes) whose issues everyone knows and which will eventually be replaced.
All the C++ projects I've worked on so far have had this build time problem, even the smallest.
On 9/30/13 11:57 PM, Robert Zeh wrote:
> On Mon, Sep 30, 2013 at 10:07 AM, snk_kid <korcan....@googlemail.com> wrote:
>
>> Then come back to features, features which can have
>> an influence (and influenced by) compilation times and see how they perform.
>>
> I agree that the compile times for projects over 1 million lines of code
> makes life difficult --- it has certainly made my life difficult.
Why don't precompiled headers work for you?
Adjusting the language to make compilers faster seems conceptually
backwards, or a red flag. I haven't looked into modules much, but I'd
assumed the motivation was elsewhere. I thought header repetition was a
solved problem.
Loading precompiled header files also takes significant time, but
modules wouldn't necessarily be any better. I'm sure that can be
improved, but it's a job for compiler vendors.
Standardizing an interoperable format equivalent to a precompiled header
would take forever. PCHes are just AST dumps. Standardizing a
compressed, serialized AST for every C++ construct? That's much harder
than standardizing an ABI. And when a compiler has to perform meaningful
conversion because the native AST differs, performance is lost and the
exercise becomes pointless.
So we already have a speedy binary format on every platform used for
serious work, and we're never going to unify these formats anyway… what
are modules supposed to fix? Just to remove the PCH build-system
boilerplate by letting the compiler cache things more easily? Seems minor.
On Monday, September 30, 2013 10:00:07 PM UTC-7, David Krauss wrote:
> On 9/30/13 11:57 PM, Robert Zeh wrote:
>> On Mon, Sep 30, 2013 at 10:07 AM, snk_kid <korcan....@googlemail.com> wrote:
>>
>>> Then come back to features, features which can have
>>> an influence (and influenced by) compilation times and see how they perform.
>>>
>> I agree that the compile times for projects over 1 million lines of code
>> makes life difficult --- it has certainly made my life difficult.
> Why don't precompiled headers work for you?

They cause many problems. They're often buggy.
They play very poorly with the preprocessor, yet have to play with it, whereas modules just side-step the preprocessor entirely.
Precompiled headers impose a large number of limitations. They're not universally available. Setting them up differs between compilers, IDEs, and build systems.
They can often _increase_ build time if used naively, and even when used intelligently they don't help nearly as much as modules can.
Precompiled headers are a stopgap hack to work around the severe limitations and problems of the C preprocessor's textual-inclusion dependency mechanism.
> Adjusting the language to make compilers faster seems conceptually
> backwards, or a red flag. I haven't looked into modules much, but I'd
> assumed the motivation was elsewhere.

Not fixing a language that is often characterized by its atrocious compile times, when a clearly identified solution that is widely known from similar languages is on the table, is a lot more backwards as I see it. There is a very clear problem and a clear fix; why not do it? :)
> Loading precompiled header files also takes significant time, but
> modules wouldn't necessarily be any better. I'm sure that can be
> improved, but it's a job for compiler vendors.

Other languages have long laid all this to rest. I'd suggest looking into the module systems - and the build time improvements brought about by them - in compiled languages like C#, Java, D, Go, etc.
The C include pattern is a huge problem that simply does not scale, at either the parsing or the I/O level. That's not surprising, considering that the whole C preprocessor was a hack to work around shortcomings in the original C language itself (you might be aware of what C coding was like before the preprocessor was tacked on: lots of manual copying of prototypes and structure definitions from big printed manuals). The original C language had no way to include dependencies without pasting things in; the preprocessor is just a tool to automate that pasting. Modules fundamentally change and fix the entire problem rather than just hacking it to be barely tolerable.
<pedantry> Well, it's not a hard problem to solve at the basic level of having such an interoperable format. It's an impossible problem to solve in a way that provides any significant benefit.
A portable PCH would likely just be a big C++ header file run through the preprocessor, maybe with a terser syntax. </pedantry>
> So we already have a speedy binary format on every platform used for
> serious work, and we're never going to unify these formats anyway… what
> are modules supposed to fix? Just to remove the PCH build-system
> boilerplate by letting the compiler cache things more easily? Seems minor.

Modules simplify the entire coding process. No other language's design requires such an awkward split as header files. They're sometimes touted as a clean separation between implementation and interface, but that's rarely true. Templates make it untrue.
Class definitions usually make it untrue, since the header needs to declare both public and private members.
The needs of the linker - which, on a related subject, is in general pretty terrible - mandate putting a lot of private symbols in headers to share things between translation units. Look at the pattern adopted by many or most libraries, which have to split headers between internal and external headers, simply because headers by themselves don't solve any problem beyond the awkward translation unit semantics inherited from C.
Modules let C++ behave more like all the other modern C-like languages. They make the problems of header dependencies _disappear_. They make the issues with public and private symbols _explicit_ rather than poorly abstracted. They remove the code duplication between a header file and a cpp file. They remove the arbitrary (from the standpoint of a newcomer) distinction between which code must be in a header and which code can/should be in a cpp. If fully embraced, they can make most of the hairier problems of the linker and dynamic library generation go away, too. And yes, if designed and implemented well, they can make the compilation and linking process significantly shorter in cases where a PCH is infeasible or counter-productive, and without all the nasty side-effects of a unity build.
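For concreteness, here is a rough sketch of the module style. All names are hypothetical, and the surface syntax shown is the one eventually standardized in C++20 (the proposals under discussion here used similar import/export keywords, so treat the exact spelling as an assumption):

```cpp
// math_utils.cppm - a single module interface unit: no header/source
// split, no include guards, no duplicated declarations.
export module math_utils;

export int twice(int x) { return x * 2; }  // explicitly public

int helper(int x) { return x + 1; }        // not exported: invisible to importers

// ---- client.cpp --------------------------------------------------------
// import math_utils;     // loads a compiled artifact; no textual pasting,
//                        // and the importer's macros cannot change its meaning
// int n = twice(21);     // OK: exported
// int m = helper(1);     // error: not part of the module's interface
```

Note how visibility is stated at the definition itself, rather than being implied by which of two files a declaration happens to live in.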