How would you describe Modules?


wolfei...@gmail.com

Jun 29, 2013, 9:39:42 PM
to std-dis...@isocpp.org
I'm creating a presentation on modules, and I was looking for some punchy descriptions of it. Something like, I dunno, "Compiler-managed dependencies" or "The death of header files" or something like that. What would you guys suggest?

Róbert Dávid

Jun 30, 2013, 5:01:10 PM
to std-dis...@isocpp.org, wolfei...@gmail.com


On Sunday, June 30, 2013 3:39:42 AM UTC+2, wolfei...@gmail.com wrote:
I'm creating a presentation on modules, and I was looking for some punchy descriptions of it. Something like, I dunno, "Compiler-managed dependencies" or "The death of header files" or something like that. What would you guys suggest?

I have yet to see any real description of C++ modules, so it is unclear what the feature is actually about. It was perhaps Herb who said in one of his talks that Modules mean different things to different people. Sure, "The death of header files" sounds good, but is it really about that? (I'd even say that if it is, then it isn't worth the trouble at all - go get a real IDE instead.)

Regards, Robert

Klaim - Joël Lamotte

Jun 30, 2013, 5:53:55 PM
to std-dis...@isocpp.org, wolfei...@gmail.com

On Sun, Jun 30, 2013 at 11:01 PM, Róbert Dávid <lrd...@gmail.com> wrote:
I have yet to see any real description of C++ modules, so it is unclear what the feature is actually about. It was perhaps Herb who said in one of his talks that Modules mean different things to different people. Sure, "The death of header files" sounds good, but is it really about that? (I'd even say that if it is, then it isn't worth the trouble at all - go get a real IDE instead.)

I remember that one of the more recent videos (I'm not sure which one) with Modules implementers mentioned that Modules are basically just an alternative to header files,
which will certainly be preferred by new libraries once the feature is available. In practice, the current work seems to concentrate on making sure headers are "enhanced" by modules
to allow a smooth transition to modules from current and older projects.

I'd say "modernized compilation model", but it's not really a punchline.

Joel Lamotte

martinho....@gmail.com

Jul 1, 2013, 7:43:03 AM
to std-dis...@isocpp.org, wolfei...@gmail.com

Would you mind explaining what features would a "real IDE" have that mitigate the pains of header files?

Róbert Dávid

Jul 8, 2013, 7:17:22 PM
to std-dis...@isocpp.org, wolfei...@gmail.com, martinho....@gmail.com

Would you mind explaining what features would a "real IDE" have that mitigate the pains of header files?

This question reads like "what features would a program - one that solves some problems - have?" Come on, be a bit more specific.
What are those "pains" that header files cause? And, to get back on topic: which of these will be solved by modules?

Regards, Robert

Klaim - Joël Lamotte

Jul 9, 2013, 10:20:06 AM
to std-dis...@isocpp.org, wolfei...@gmail.com, martinho....@gmail.com

On Tue, Jul 9, 2013 at 1:17 AM, Róbert Dávid <lrd...@gmail.com> wrote:
What are those "pains" header files cause?

Did you read the rationale from the Modules paper?
If not you should.

Joel Lamotte

Róbert Dávid

Jul 11, 2013, 3:55:48 PM
to std-dis...@isocpp.org, wolfei...@gmail.com, martinho....@gmail.com

Well, I did, but all I see there is 'faster compilation'; every other benefit explained is foggy.
A bigger benefit I see in Modules is that they provide some kind of DLL-ish/.so-ish concept, which has been missing from the standard since, well, forever. But that has absolutely nothing to do with header files, so my question still stands.

Regards, Robert

Klaim - Joël Lamotte

Jul 12, 2013, 1:46:17 PM
to std-dis...@isocpp.org, wolfei...@gmail.com, martinho....@gmail.com

On Thu, Jul 11, 2013 at 9:55 PM, Róbert Dávid <lrd...@gmail.com> wrote:
A bigger benefit I see in Modules is that they provide some kind of DLL-ish/.so-ish concept, which has been missing from the standard since, well, forever. But that has absolutely nothing to do with header files, so my question still stands.

No, Modules apparently are not about defining library linking.
Yes, it's faster compilation first; the other, foggier benefit would be helping the compiler easily find the code asked for on import, which basically means that
where the headers are located is not important. The current include system is based on filesystem addresses, which forces all kinds of setups just to
make sure that you are using the right headers from the right directory.

Basically, it would simplify defining the relationships between the different modules that make up your application,
both by easing the setup of these relationships and by helping the compiler and other tools help you, because they can finally know
the boundaries of each module.

This is obviously just what I understand so far from the concept of Modules as explained in online documents; it might be incorrect or differ from the work-in-progress implementation in Clang.

Joel Lamotte


Martinho Fernandes

Jul 12, 2013, 1:51:03 PM
to Klaim - Joël Lamotte, std-dis...@isocpp.org, DeadMG
On Fri, Jul 12, 2013 at 7:46 PM, Klaim - Joël Lamotte <mjk...@gmail.com> wrote:
>
> On Thu, Jul 11, 2013 at 9:55 PM, Róbert Dávid <lrd...@gmail.com> wrote:
>>
>> A bigger benefit I see in Modules that it gives some kind of
>> DLL-ish/.so-ish concept, what is missing from the standard since.. forever,
>> but that has absolutely nothing to do with header files, so my question
>> still stands..
>
>
> No Modules apparently are not about defining library linking.
> Yes it's faster compilation first, and the other foggy benefits would be
> helping the compiler easily find the code asked on import, which basically
> mean that
> where the headers are is not important - the current include system is based
> on filesystem adresses which forces all kind of setups just to
> make sure that you are using the right headers from the right directory.....
>

No, the current system as defined does not care about how the
filesystem is structured. The fact that all the implementations we
have are based on that premise is not due to the standard. The
standard leaves the "where to find the headers" bit completely
implementation-defined. Blame the implementations for sticking to the
traditional model all these years.

Róbert Dávid

Jul 12, 2013, 2:48:17 PM
to std-dis...@isocpp.org, wolfei...@gmail.com, martinho....@gmail.com


On Friday, July 12, 2013 7:46:17 PM UTC+2, Klaim - Joël Lamotte wrote:

On Thu, Jul 11, 2013 at 9:55 PM, Róbert Dávid <lrd...@gmail.com> wrote:
A bigger benefit I see in Modules is that they provide some kind of DLL-ish/.so-ish concept, which has been missing from the standard since, well, forever. But that has absolutely nothing to do with header files, so my question still stands.

No, Modules apparently are not about defining library linking.

Modules - like anything in the standard, including TRs/TSs - do not define shared libraries; they just define something that is quite close to a library. In the same way, there is zero standard text about header files and .cpp files, just includes and compilation units. You are free to represent them in any way: it is an equally conforming implementation to read all compilation units and included headers from an SQL database, a web server, whatever; it just "happens to be" that all implementations are file-based. The standard also does not tell you how to represent the results of compilation and linking.

With modules, you will be able to "combine" the compilation results of a few compilation units. Due to the above, the standard does not define libraries themselves. But don't tell me that this is not something like the libraries (both static and shared) that current C++ implementations create.

(It is a good theoretical question whether a strictly conforming implementation can produce a .DLL without a main function, to be used from a non-C++ language.)

Regards, Robert

Klaim - Joël Lamotte

Jul 12, 2013, 3:30:40 PM
to std-dis...@isocpp.org

On Fri, Jul 12, 2013 at 7:51 PM, Martinho Fernandes <martinho....@gmail.com> wrote:

No, the current system as defined does not care about how the
filesystem is structured. The fact that all the implementations we
have are based on that premise is not due to the standard. The
standard leaves the "where to find the headers" bit completely
implementation-defined. Blame the implementations for sticking to the
traditional model all these years.

I think that's incorrect in practice, because it's all about headers, which are files, so you still need to locate them.
As there is no module-grouping declaration in the language to define which headers are associated with which module,
there is no other way than using a filesystem location to identify the file.

Basically, in theory you are right, but in practice it's impossible to do otherwise -
not without a way to group files under a common name that is known to the compiler and which would be a module.

Joel Lamotte

Klaim - Joël Lamotte

Jul 12, 2013, 3:34:32 PM
to std-dis...@isocpp.org, wolfei...@gmail.com, martinho....@gmail.com

On Fri, Jul 12, 2013 at 8:48 PM, Róbert Dávid <lrd...@gmail.com> wrote:
Modules - like anything in the standard, including TRs/TSs - do not define shared libraries; they just define something that is quite close to a library. In the same way, there is zero standard text about header files and .cpp files, just includes and compilation units. You are free to represent them in any way: it is an equally conforming implementation to read all compilation units and included headers from an SQL database, a web server, whatever; it just "happens to be" that all implementations are file-based. The standard also does not tell you how to represent the results of compilation and linking.

OK, so there is no header file either; that clarifies it, thanks.
I guess it's all filesystem-based just because that's the only common way to share your work.

Joel Lamotte.

Patrick Michael Niedzielski

Jul 12, 2013, 3:50:38 PM
to std-dis...@isocpp.org
On Fri, 2013-07-12 at 21:30 +0200, Klaim - Joël Lamotte wrote:
>
> On Fri, Jul 12, 2013 at 7:51 PM, Martinho Fernandes
> <martinho....@gmail.com> wrote:
>
>
> No, the current system as defined does not care about how the
> filesystem is structured. The fact that all the
> implementations we
> have are based on that premise is not due to the standard. The
> standard leaves the "where to find the headers" bit completely
> implementation-defined. Blame the implementations for sticking
> to the
> traditional model all these years.
>
> I think it's incorrect in practice because it's all about headers,
> which are files, so you still need to locate them.

A side note: the standard doesn't require that headers be actual files
in any filesystem, and is very careful to refer to them only as
"headers", not as "header files". A conforming implementation is
allowed to have them, for example, as streams of preprocessor tokens
internal to the compiler. They tend not to be implemented this way, but
Martinho is indeed correct in saying that the standard does not mandate
anything about where to find the headers, or even that they must be
files.

Have a look at section 16.2 (quoted from n3485):

16.2 Source file inclusion

1. A #include directive shall identify a header or source file
that can be processed by the implementation.

2. A preprocessing directive of the form
# include < h-char-sequence > new-line
searches a sequence of implementation-defined places for a
header identified uniquely by the specified sequence between the
< and > delimiters, and causes the replacement of that directive
by the entire contents of the header. How the places are
specified or the header identified is implementation-defined.

3. A preprocessing directive of the form
# include " q-char-sequence " new-line
causes the replacement of that directive by the entire contents
of the source file identified by the specified sequence between
the " delimiters. [...]

Cheers,
Patrick Niedzielski

wolfei...@gmail.com

Jul 13, 2013, 7:07:00 AM
to std-dis...@isocpp.org
Er, I think that could very easily be read as

"A[n] #include directive shall identify a (header or source) file that can be processed by the implementation."

Patrick Michael Niedzielski

Jul 13, 2013, 8:18:06 AM
to std-dis...@isocpp.org
I don't think that's the case, because the term "header" appears in many
places in the standard, but "header file" would only appear there under
that reading. Also, the next two paragraphs put "header" and "source
file" in opposition to one another.

Nicol Bolas

Jul 14, 2013, 10:19:56 PM
to std-dis...@isocpp.org, wolfei...@gmail.com, martinho....@gmail.com


On Thursday, July 11, 2013 12:55:48 PM UTC-7, Róbert Dávid wrote:


On Tuesday, July 9, 2013 4:20:06 PM UTC+2, Klaim - Joël Lamotte wrote:

On Tue, Jul 9, 2013 at 1:17 AM, Róbert Dávid <lrd...@gmail.com> wrote:
What are those "pains" header files cause?

Did you read the rationale from the Modules paper?
If not you should.

Joel Lamotte


Well, I did, but all I see there is 'faster compilation'; every other benefit explained is foggy.

That's because there is no other reason for modules. The original proposal for modules talks about DLLs and such, but the primary focus of the current modules definition and implementation (which admittedly is not a formal proposal yet; it's being built as an implementation first, with guidance by the committee, in order to prove that it can actually work and does what it is expected to, thus avoiding the `export` issue) is to massively reduce compile times.

Modules are basically automatically generated, fine-grained precompiled headers. That's their purpose.

The fundamental problem with C++'s compilation model is that every .cpp file must compile every header. Every time. Every time you #include <vector>, the compiler must include all of that code and compile it. Even though the compilation results will be exactly the same every time, it still has to do it. If you make one change to a .cpp file, the compiler still has to recompile <vector>.

If you change <vector>, then every .cpp file that includes it must be recompiled. And remember: recompiling a .cpp means recompiling every header that .cpp includes, directly or indirectly. So if you change <vector>, and a .cpp file includes <iostream>, you have to recompile that .cpp's include of <iostream> too.

The idea with the current incarnation of modules is that you can use headers, but the system will substitute import directives for #includes. When it first includes a module header file, it will build a symbol definition file for that header. All subsequent includes will just read the symbol definitions, which are pre-compiled, not live C++ source.

In this way, modules replicate the functionality of headers as much as possible. Included among the symbols for a module are macros too; it was deemed vital to allow macros to be part of modules, and for one module to rely on macros defined by another.

The current version of modules isn't necessarily about getting rid of headers (though you can do that too). The system is designed to allow you to use headers as normal and get the compilation benefits of modules via the use of an external module map file.
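For concreteness, the "external module map file" mentioned above can be sketched. The snippet below follows the style of Clang's work-in-progress module-map format; the syntax is Clang-specific rather than standardized, and the module and header names are invented for illustration:

```
// module.modulemap - hypothetical example in the style of Clang's draft syntax
module mylib {
    header "mylib.h"       // existing headers, grouped under one module name
    header "mylib_util.h"
    export *               // re-export everything these headers declare
}
```

Given something like this, the compiler can treat an ordinary #include of "mylib.h" as an import of the mylib module, without the library's sources having to change.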

Róbert Dávid

Jul 15, 2013, 10:09:50 AM
to std-dis...@isocpp.org


On Monday, July 15, 2013 4:19:56 AM UTC+2, Nicol Bolas wrote:


On Thursday, July 11, 2013 12:55:48 PM UTC-7, Róbert Dávid wrote:


On Tuesday, July 9, 2013 4:20:06 PM UTC+2, Klaim - Joël Lamotte wrote:

On Tue, Jul 9, 2013 at 1:17 AM, Róbert Dávid <lrd...@gmail.com> wrote:
What are those "pains" header files cause?

Did you read the rationale from the Modules paper?
If not you should.

Joel Lamotte


Well, I did, but all I see there is 'faster compilation'; every other benefit explained is foggy.

That's because there is no other reason for modules. The original proposal for modules talks about DLLs and such, but the primary focus of the current modules definition and implementation (which admittedly is not a formal proposal yet; it's being built as an implementation first, with guidance by the committee, in order to prove that it can actually work and does what it is expected to. Thus avoiding the `export` issue) is to massively reduce compile times.

Modules are basically automatically generated, fine-grained precompiled headers. That's their purpose.

The fundamental problem with C++'s compilation model is that every .cpp file must compile every header. Every time. Every time you #include <vector>, the compiler must include all of that code and compile it. Even though the compilation results will be exactly the same every time, it still has to do it. If you make one change to a .cpp file, the compiler still has to recompile <vector>.

If you change <vector>, then every .cpp file that includes it must be recompiled. And remember: recompiling a .cpp means recompiling every header that .cpp includes, directly or indirectly. So if you change <vector>, and a .cpp file includes <iostream>, you have to recompile that .cpp's include of <iostream> too.
Ah, come on - <vector> is part of the C++ implementation, just like the compiler or the linker. Does it come as a surprise to anyone that I need to recompile when I change the compiler, unless it is ABI-compatible? Is it a surprise to anyone that I need to recompile when I change the standard headers to ABI-incompatible ones? How do modules prevent the necessity of recompiling stuff for a changed vector module?
 
The idea with the current incarnation of modules is that you can use headers, but the system will substitute import directives for #includes. When it first includes a module header file, it will build a symbol definition file for that header. All subsequent includes will just read the symbol definitions, which are pre-compiled, not live C++ source.
Precompiling headers on first include (with the same feature set) seems like something that any current implementation could do; we don't need a standard extension for that. But I think that is not the actual solution to the problem you described: we need to recompile dependencies when the interface provided by the header changes, something that MSVC does with its "managed incremental build" feature (how successfully is a different question).

In this way, modules replicate the functionality of headers as much as possible. Included among the symbols for a module are macros too; it was deemed vital to allow macros to be part of modules, and for one module to rely on macros defined by another.

The current version of modules isn't necessarily about getting rid of headers (though you can do that too). The system is designed to allow you to use headers as normal and get the compilation benefits of modules via the use of an external module map file.

So far, Modules is just "let's make the include problem more complex by throwing import into the picture, and force implementers into one given kind of performance optimization"? This sounds quite alarming to me. There should be more to it - I know smart people are working on it. Do Modules allow something that I could not do without them?

Regards, Robert

corn...@google.com

Jul 15, 2013, 1:51:43 PM
to std-dis...@isocpp.org


On Monday, July 15, 2013 4:09:50 PM UTC+2, Róbert Dávid wrote:

Ah, come on - <vector> is part of the C++ implementation, just like the compiler or the linker. Does it come as a surprise to anyone that I need to recompile when I change the compiler, unless it is ABI-compatible? Is it a surprise to anyone that I need to recompile when I change the standard headers to ABI-incompatible ones? How do modules prevent the necessity of recompiling stuff for a changed vector module?

You're missing the point. First, Nicol was using <vector> only as an example; any third-party header - or even one of your own - that is used by a lot of your code qualifies. Second, it's not that you have to recompile; given the way C++ works, of course you have to recompile. The point is that you have to recompile the code of <vector> once for every single .cpp file you have. Not once. Once per .cpp. Modules turn this N headers × M source files issue into an N headers + M source files issue.
 
 
Precompiling headers on first include (with the same feature set) seems like something that any current implementation could do; we don't need a standard extension for that.

We actually do. The problem is that headers currently have absolutely no structure. They are affected by the complete preprocessor environment that is active when they're included, and they can modify that environment (and invariably do, at least with their inclusion guards). The order of headers matters. There are headers that define different things depending on some macro. If optimizing this stuff were possible, some compiler would have done it - there's certainly enough demand.

Modules define boundaries. Not every header can be a module, it has to obey some restrictions. As a benefit, import order no longer matters, and modules don't affect each other. Configuration macros become well-defined entities.

These are significant changes to the compilation model, and affect how code inside headers is interpreted. That's why a standard change is necessary and this whole thing can't just be implemented as a compiler optimization.
 

So far, Modules is just "let's make the include problem more complex by throwing import into the picture, and force implementers into one given kind of performance optimization"?

Modules are an addition to the language, so the language gets more complex. As long as we care about backwards compatibility, that's the way it will be. I'm not sure what you mean by "the include problem", but I don't think it will become more complex - modules are a way out, a simpler way of doing things. Of course, if you have to understand both complex headers and the restrictions that modules have, you will have to know more than before, but that is always the case when you're on a transition path.
 
This sounds quite alarming to me. There should be more to it - I know smart people are working on it. Do Modules allow something that I could not do without them?

Yes, lots of things.

Modules allow you to write headers without worrying that the user #defined "min" and "max" and your template implementation will run afoul of that. Without worrying that the macros "signals", "slots" and "emit" that you offer to your users as some cool DSL conflict with some third party library that the user would also like to use. And without worrying that the new template trick you added to your library will increase the build time of your client's project by 20% because he includes that trick in every single .cpp of his application.

Modules allow you to turn up your warning level to full and treat warnings as errors, without having to fix all the false positives in the third party headers you include: the compiler can easily compile the modules with different settings.

Modules allow tools (linters, refactorers, autoformatters) to really affect your code only, because boundaries are clearly delineated.

Modules could allow debuggers to load the module description instead of decoding debug info for things like type definitions. Something like this is planned for LLDB.

Modules offer an interesting extension point. For example, Clang's implementation will probably allow specifying autolink directives in the module maps, which, thanks to modules having a properly structured dependency tree, will even work with order-dependent linkers.

I think modules do quite a lot. Let's not get greedy and try to do too much.

Sebastian

Nicol Bolas

Jul 15, 2013, 2:00:29 PM
to std-dis...@isocpp.org


On Monday, July 15, 2013 7:09:50 AM UTC-7, Róbert Dávid wrote:


On Monday, July 15, 2013 4:19:56 AM UTC+2, Nicol Bolas wrote:


On Thursday, July 11, 2013 12:55:48 PM UTC-7, Róbert Dávid wrote:


On Tuesday, July 9, 2013 4:20:06 PM UTC+2, Klaim - Joël Lamotte wrote:

On Tue, Jul 9, 2013 at 1:17 AM, Róbert Dávid <lrd...@gmail.com> wrote:
What are those "pains" header files cause?

Did you read the rationale from the Modules paper?
If not you should.

Joel Lamotte


Well, I did, but all I see there is 'faster compilation'; every other benefit explained is foggy.

That's because there is no other reason for modules. The original proposal for modules talks about DLLs and such, but the primary focus of the current modules definition and implementation (which admittedly is not a formal proposal yet; it's being built as an implementation first, with guidance by the committee, in order to prove that it can actually work and does what it is expected to. Thus avoiding the `export` issue) is to massively reduce compile times.

Modules are basically automatically generated, fine-grained precompiled headers. That's their purpose.

The fundamental problem with C++'s compilation model is that every .cpp file must compile every header. Every time. Every time you #include <vector>, the compiler must include all of that code and compile it. Even though the compilation results will be exactly the same every time, it still has to do it. If you make one change to a .cpp file, the compiler still has to recompile <vector>.

If you change <vector>, then every .cpp file that includes it must be recompiled. And remember: recompiling a .cpp means recompiling every header that .cpp includes, directly or indirectly. So if you change <vector>, and a .cpp file includes <iostream>, you have to recompile that .cpp's include of <iostream> too.
Ah, come on - <vector> is part of the C++ implementation, just like the compiler or the linker. Does it come as a surprise to anyone that I need to recompile when I change the compiler, unless it is ABI-compatible? Is it a surprise to anyone that I need to recompile when I change the standard headers to ABI-incompatible ones? How do modules prevent the necessity of recompiling stuff for a changed vector module?

I just told you. Here's a .cpp file:

#include <vector>
#include <iostream>

int main()
{
    std::vector<float> v{5.0, 4.3};
    for (auto &i : v)
        std::cout << i << std::endl;
}

`<vector>` and `<iostream>` may be "part of the C++ implementation," but they are still header files, and thus will be treated no differently than any other #inclusion in your code. If it helps you understand the issue, don't think of them as system files. You can mentally replace `<vector>` and `<iostream>` with `"myVector"` and `"myIosystem"` and change the classes accordingly. It changes nothing about how the compiler goes about handling them.

In regular C++, in order to compile this .cpp file, both `<vector>` and `<iostream>` must be included. And "inclusion" in this case means that you copy the gigantic block of source code of both headers into this file and compile them as though this was a single, giant .cpp file.

I took a look at my Visual Studio standard library headers. `<vector>`, after chasing down some of its inclusions (including `<stdexcept>`, which needs `<string>`), comes out to no less than 138,697 lines, and probably more than that. `<iostream>` comes out to no less than 72,363 lines.

So this code, which looks like less than 10 lines, is really over 200,000 lines of code. Every time you make a single change in this file, you must recompile all 200,000 lines of code. Why do you need to compile 200,000 lines of code, if the compiled results of <vector> and <iostream> never changed?

If you change `<vector>`, you must recompile all 200,000 lines of code, even though the 72,000 lines belonging to <iostream> have absolutely no dependencies on `<vector>` at all. The compiled result of `<iostream>` is completely unchanged. So why spend that time compiling 72,000 lines, when you only need to compile 138,000 + 10? Again, if these being standard library headers bothers you, you can think of them as just some large, template code headers that are part of your project.

The idea with the current incarnation of modules is that you can use headers, but the system will substitute import directives for #includes. When it first includes a module header file, it will build a symbol definition file for that header. All subsequent includes will just read the symbol definitions, which are pre-compiled, not live C++ source.
Precompiling headers on first include (with the same feature set) seems like something that any current implementation could do; we don't need a standard extension for that.

We do need a standard extension for that. And here's why.

#include works by copying text. The C++ standard requires that it works by copying text. And therefore, all of the definitions included by one #include must be visible to any subsequent code. Not just code written in the .cpp file doing the including, but also in code included after it.

Modules don't work that way. If you include a module, it is independent of any modules you may have previously included. Now, that module may itself include other modules, and what it includes will be visible within that module and can be exposed to people who include that module. But the order in which you include modules cannot change the resulting code. The order you #include things theoretically could.

So the C++ standard must have language that allows modules to be compiled independently.

But I think that is not the actual solution to the problem you described: we need to recompile dependencies when the interface provided by the header changes, something that MSVC does with its "managed incremental build" feature (how successfully is a different question).


In this way, modules replicate the functionality of headers as much as possible. Included among the symbols for a module are macros too; it was deemed vital to allow macros to be part of modules, and for one module to rely on macros defined by another.

The current version of modules isn't necessarily about getting rid of headers (though you can do that too). The system is designed to allow you to use headers as normal and get the compilation benefits of modules via the use of an external module map file.

So far, Modules is just "let's make the include problem more complex by throwing import into the picture, and force implementers into one given kind of performance optimization"?

... I'm not sure what you're getting at. The point is to provide what pretty much every other major language besides C does: compile only the code that needs to be compiled. The C/C++ compilation model simply doesn't give compilers a way to know that changes to `<vector>` will have no impact on `<iostream>`. So if a file includes both, and you change `<vector>`, it must recompile `<iostream>` too.

Go, D, C#, Java - all of these major languages have automatic module mechanisms to detect this stuff and sort it out. Modules are about providing the same to C and C++.
 
This sounds quite alarming to me. There should be more to it - I know smart people are working on it. Do Modules allow something that I could not do without them?

Yes.

Currently, if you #include a file, everyone who #includes you will gain access to that stuff whether they want to or not. We have come up with idioms to avoid exposing these internal details in header-only libraries, like using a `detail` namespace or whatever. But those details are still exposed; we just tell people to ignore them.

With modules you can explicitly state that some things you #include are internal to your module and are not to be exposed to those who include you. Bits of them can be exposed of course (private members of classes the user can use may have to be exposed). But the user can't name them directly, though type deduction could name them anyway.

Regards, Robert

Róbert Dávid

unread,
Jul 15, 2013, 2:35:35 PM7/15/13
to std-dis...@isocpp.org

`<vector>` and `<iostream>` may be "part of the C++ implementation," but they are still header files, and thus will be treated no differently than any other #inclusion in your code. If it helps you understand the issue, don't think of them as system files. You can mentally replace `<vector>` and `<iostream>` with `"myVector"` and `"myIosystem"` and change the classes accordingly. It changes nothing about how the compiler goes about handling them.

<> and "" includes are treated differently, but that does not change your point.
 
If you change `<vector>`, you must recompile all 200,000 lines of code, even though the 72,000 lines belonging to `<iostream>` have absolutely no dependencies on `<vector>` at all. The compiled result of `<iostream>` is completely unchanged. So why spend that time compiling 72,000 lines, when you only need to compile 138,000 + 10? Again, if these being standard library headers bothers you, you can think of them as just some large, template code headers that are part of your project.

This problem is solved by exactly what you described here (for a sec ignoring the need for the extension you continue with):

The idea with the current incarnation of modules is that you can use headers, but the system will substitute import directives for your #includes. When it first includes a module header file, it will build a symbol definition file for that header. All subsequent includes will just read the symbol definitions, which are pre-compiled, rather than re-parsing live C++.

#include works by copying text. The C++ standard requires that it works by copying text. And therefore, all of the definitions included by one #include must be visible to any subsequent code. Not just code written in the .cpp file doing the including, but also in code included after it.

That's not entirely correct. The standard requires that it behave as if the text were copied. You can do any trick you want, as long as it is transparent. If you can detect that nothing has changed from line x to line y (including macros that could mess up the code), you can just pick up the result from the previous compilation. Implementations don't do that (except for the mentioned Managed Incremental Build) because it is hard; I guess Modules will help this a lot.
 
Currently, if you #include a file, everyone who #includes you will gain access to that stuff whether they want to or not. We have come up with idioms to avoid exposing these internal details in header-only libraries, like using a `detail` namespace or whatever. But those details are still exposed; we just tell people to ignore them.
 
With modules you can explicitly state that some things you #include are internal to your module and are not to be exposed to those who include you. Bits of them can be exposed of course (private members of classes the user can use may have to be exposed). But the user can't name them directly, though type deduction could name them anyway.

That's a great example, thank you.

Regards, Robert

wolfei...@gmail.com

unread,
Jul 16, 2013, 10:01:09 AM7/16/13
to std-dis...@isocpp.org
On Monday, July 15, 2013 7:35:35 PM UTC+1, Róbert Dávid wrote:
That's not entirely correct. The standard requires that it behave as if the text were copied. You can do any trick you want, as long as it is transparent. If you can detect that nothing has changed from line x to line y (including macros that could mess up the code), you can just pick up the result from the previous compilation. Implementations don't do that (except for the mentioned Managed Incremental Build) because it is hard; I guess Modules will help this a lot.

That's not entirely correct either. The ODR gives compilers a massive amount of leeway for not repeatedly including files. It's just that no implementation currently utilizes this. 

Nicol Bolas

unread,
Jul 16, 2013, 2:08:27 PM7/16/13
to std-dis...@isocpp.org, wolfei...@gmail.com

But that's not really possible in the general case. #including the same file from multiple translation units is not required to produce ODR-equivalent definitions of the same symbol, thanks to things like macros. Nowadays, a compiler would have to detect whether macros interfere in the creation of symbols, so that it could determine by some measure whether a header is standalone.

That's one of the main things modules change that makes everything work: the only things that change the definitions in a module (besides command-line compile options) are the modules that it directly or indirectly imports. Every module will be compiled the same way all the time.

That is not true of a header.

wolfei...@gmail.com

unread,
Jul 18, 2013, 3:38:50 PM7/18/13
to std-dis...@isocpp.org, wolfei...@gmail.com
But that's not really possible in the general case. #including the same file from multiple translation units is not required to produce ODR-equivalent definitions of the same symbol.

For quite a few kinds of symbols, it actually is: for example, class definitions, class templates, and inline function definitions. At least, that's how I read it.