IAR long compile time when tests get longer


Mike Bechtold

Jul 15, 2020, 8:43:24 PM7/15/20
to cpputest
I am building unit test code, and when my tests get longer than about 200 lines, the compiler takes much longer to compile the file. Once the code compiles, execution does not have any problems. If my expected values are off, I have to recompile this single file. If I split the test into parts, that sometimes fixes the compile time, but it would be nice to know the limit.

Thanks
Mike

James Grenning

Jul 15, 2020, 10:53:55 PM7/15/20
to cpputest

Hi Mike

Are you saying a 200 LOC test file takes a long time to compile? How long?

There must be more to this story.
Where does the time go? Preprocessor, compile, or link?
* Preprocessor: Is there a heavy include load? (You could test this by putting #if 0 around all the source in the file giving you the problem; see the sketch after this list.)
* Linker: watch the build and get out your stopwatch.
* What kinds of change trigger the long build?
* How long does a full build from a clean state take?
* Change one source file, then build and link.
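To make the preprocessor check concrete, here is a minimal sketch of isolating the include load from the cost of the test bodies themselves. The test group and function names are placeholders, not from Mike's project; only the CppUTest macros and header are real:

#include "CppUTest/TestHarness.h"
#include "MyModule.h"   /* hypothetical production header under test */

TEST_GROUP(MyModule)
{
};

#if 0  /* temporarily disable the test bodies; if the file still compiles slowly,
          the time is going to the include load rather than to macro expansion
          inside the tests */
TEST(MyModule, LongScenario)
{
    CHECK_EQUAL(0, MyModule_DoSomething());  /* hypothetical call */
}
#endif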

What kind of computer is it?

Do you have a dependency-aware incremental build or do you do a clean build each time?

BTW: One thing you should move toward is running unit tests off-target.

HTZH James


James Grenning -- Author of TDD for Embedded C - wingman-sw.com/tddec
Join my live-via-the-web TDD Training
wingman software
wingman-sw.com -- blog --@jwgrenning -- facebook


Mike Bechtold

Jul 17, 2020, 10:31:36 AM7/17/20
to cppu...@googlegroups.com
Thanks for the interest. I'm not exactly sure where the time is going. In IAR it looks like, when it gets to the files that have long test functions, it takes minutes to process the file, check for changes, and then report that the compile of the file is complete. These tests have several CHECK_EQUAL calls. My first hunch is that the compiler is taking a long time expanding all the CHECK_EQUALs from #defines into code. I have been using CppUTest for about three years now, and over those three years this would be my third test group that exhibits this issue. It gets progressively worse as more tests are added and the longer they get.

In the past I have commented out blocks of code, and there was no change in performance. Splitting the test functions into smaller tests sometimes helps. The difficulty is that there may be a static variable inside the function under test that holds the state machine's state, and the flow has to be maintained in order to test all the states.
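One way to split such a test without losing the required flow is to drive the state machine to a known point with a shared helper called at the start of each shorter TEST. This is only a sketch: sm_reset, sm_run, sm_currentState, and the EVENT_/STATE_ names are hypothetical, and it assumes the module exposes some way to clear its static state:

TEST_GROUP(StateMachine)
{
    void setup()
    {
        sm_reset();               /* hypothetical: clears the static state */
    }
};

static void advanceToIdle(void)
{
    sm_run(EVENT_POWER_ON);       /* drive the machine through its startup  */
    sm_run(EVENT_SELF_TEST_DONE); /* so each shorter TEST starts from Idle  */
}

TEST(StateMachine, EntersIdleAfterStartup)
{
    advanceToIdle();
    CHECK_EQUAL(STATE_IDLE, sm_currentState());
}

TEST(StateMachine, LeavesIdleOnCommand)
{
    advanceToIdle();
    sm_run(EVENT_COMMAND_RECEIVED);
    CHECK_EQUAL(STATE_ACTIVE, sm_currentState());
}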

My unit tests are running off-target; I'm just using IAR as the compiler and debugger, and the target is executed in simulator mode. The approach I was going with is to show the other developers that we are using familiar tools when unit testing. I'm still working on convincing them that unit testing has its benefits; I'm totally sold on it through the TDD method. I think they still look at it as an extra burden added after the code is complete, and not as a method of verifying their design.

I have compiled the cpputest library using IAR ARM 8.30.  

It should be possible to compile and run this code with CppUTest in a different environment, such as Visual Studio. I am using Google Test for other projects that I could not easily set up this way (mainly non-ARM projects). If you have a link to how to compile using Visual Studio, that would be great.

Mike


James Grenning

Jul 20, 2020, 9:54:23 PM7/20/20
to cppu...@googlegroups.com

Hi Mike

You might track this down by taking things out of your build and finding the tipping point.

Here is my VS cpputest starter kit. Maybe that would help.

https://github.com/jwgrenning/cpputest-starter-project-vs

msbec...@gmail.com

Sep 9, 2020, 10:43:18 AM9/9/20
to cpputest
Thanks for the help. In the past I did a bisect of compile times; it was hard to measure. A practice I am moving forward with is making my TEST()s shorter, which seems to help, but I still don't have a threshold number of lines for a test.

Also, thanks for the VS starter kit. I finally got some time to start using it and was able to get all my code to compile in VS. It is much faster, even with the problem files; Studio just seems to rip right through them. So it must be a limit or flaw in IAR.

I have not been able to execute cleanly through all the tests. VS is good at catching things that the IAR compiler/debugger can't, like uninitialized variables being used. There is one big difference I am running into right now: enums in IAR ARM are treated as 8-bit, while VS wants to treat them as 32-bit. This becomes relevant in communications messages that use structures containing enums and other types.

Example

typedef enum {
  NULLMSGTYPE = 0,
  .....
  DATABLOB,
  COMMANDMESSAGE_C2AB
} msgType_et;

typedef enum {
  D_TO_A_NULL_MSG = 0,
  ....
  LOCKOUT_MODE = 0xF2,
  STANDBYREQUEST = 0xF3,
  LAST_VALUE = 0xFFFFFFFFU  // Force cmdValue_et type to 32 bits
} cmdValue_et;

#pragma pack(push,1)
struct structMsg {
  msgType_et  msgType;   // 8
  dataType_et dataType;  // 8
  cmdValue_et cmdValue;  // 32
  uint32_t    msgData;
  uint32_t    CRCvalue;
};
#pragma pack(pop)  // Restore pack() state

Studio wants to treat this struct as 20 bytes; IAR wants 14.
I feel like I want to switch to Studio, but this is a big hang-up. This is one of several structures of mine that rely on being packed with 1-byte alignment.
I tried to do some digging. I'm hoping for an easy solution like a -fshort-enums compiler switch, rather than the hard path of modifying the code (not liking that).
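One way to make the mismatch fail loudly at compile time on whichever toolchain is in use is a size assertion next to the struct definition. A sketch only, assuming C11 static_assert (or C++11) is available and that 14 bytes is the intended on-the-wire size:

#include <assert.h>  /* provides static_assert in C11 and later */

/* Fails the build on any compiler whose enum sizing or packing breaks the wire format. */
static_assert(sizeof(struct structMsg) == 14,
              "structMsg must be 14 bytes; check enum sizes and packing");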

Thanks in advance
Mike

Ryan Hartlage

Sep 9, 2020, 4:36:53 PM9/9/20
to cpputest
Hey Mike,

Per the C spec, enumeration constants have type `int`, and the underlying type of an enum is implementation-defined. Embedded compilers will often pack enums into smaller integer types to save space, but even when this isn't the case, depending upon the size of an enum is inherently non-portable, because `sizeof(int)` itself is not portable.

What I would recommend is to change how you define your enums so that you always specify a size. For example, instead of:

typedef enum {
  foo_bar,
  foo_baz
} foo_t;

You can do:

enum {
  foo_bar,
  foo_baz
};
typedef uint8_t foo_t;

By doing this you are signaling to clients that `foo_t` should be used instead of a named enum type. Since enum assignments aren't checked in C anyway, this has exactly the same compile-time semantics, except that you now control the size. My guess is that you use fixed-width types like `uint8_t`, `uint16_t`, etc. everywhere else in your code, so why let your compiler decide the size of your enums? This makes your code portable to the desktop, portable across different embedded platforms, and will probably save you some memory. A win-win-win!
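Applied to the message struct from earlier in the thread, the pattern might look like the sketch below. Only the enumerator values shown in the thread are included (the elided ones are marked), and dataType_et is assumed to follow the same treatment since its definition wasn't posted:

#include <stdint.h>

enum {
  NULLMSGTYPE = 0,
  /* ...other message types elided in the thread... */
  DATABLOB,
  COMMANDMESSAGE_C2AB
};
typedef uint8_t msgType_et;    /* wire format wants 8 bits here  */

enum {
  D_TO_A_NULL_MSG = 0,
  /* ...other commands elided in the thread... */
  LOCKOUT_MODE = 0xF2,
  STANDBYREQUEST = 0xF3
};
typedef uint32_t cmdValue_et;  /* wire format wants 32 bits here */

typedef uint8_t dataType_et;   /* assumed: defined the same way elsewhere */

#pragma pack(push,1)
struct structMsg {
  msgType_et  msgType;   /* 1 byte on every compiler now  */
  dataType_et dataType;  /* 1 byte on every compiler now  */
  cmdValue_et cmdValue;  /* 4 bytes on every compiler now */
  uint32_t    msgData;
  uint32_t    CRCvalue;
};
#pragma pack(pop)

With this, the LAST_VALUE = 0xFFFFFFFFU trick is no longer needed, and the struct should come out at 14 bytes on both IAR and Visual Studio, so a size assertion like the one shown earlier would pass on both.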

Ryan



msbec...@gmail.com

Sep 9, 2020, 11:14:38 PM9/9/20
to cpputest
I have tried this, but I'm not pleased with the result in the debugger. Now the data type is no longer associated with the variable, and type checking also goes away.

C++ gives the ability to write typedef enum : size { red, blue, green } color, where the size of the enum can be specified, but this same mechanism is not available when the same code is compiled as C. So there is a conditional compile for those enums to define the size when building as C++ for WIN32, but everywhere else the enum is compiled at the smallest size that holds its largest element.
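For reference, that conditional compile might look something like this. A sketch only, assuming a C++11-capable compiler on the WIN32 side; the type name is illustrative:

#include <stdint.h>

#if defined(__cplusplus)
/* C++11: the underlying type can be fixed explicitly */
typedef enum : uint8_t { red, blue, green } color_et;
#else
/* C: the underlying type is implementation-defined (IAR picks the
   smallest type that holds the largest enumerator) */
typedef enum { red, blue, green } color_et;
#endif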

In IAR ARM, the base type for an int is 32 bits, but where enums are concerned the type is the smallest one that holds the largest value. For example, typedef enum { red, white, blue } color would be 8 bits, but if the value assigned to blue were 0xFFFF, the enum's size would be 16 bits.

This code is not intended to be portable to the PC, but I'm using CppUTest with both IAR and Visual Studio. Can GCC handle smaller-sized enums? If so, where can I get started with a GCC setup?

Thanks again
Mike

Ryan Hartlage

Sep 10, 2020, 5:52:26 AM9/10/20
to cpputest
Hey Mike,

Just to be clear, C never type-checks enums, so you aren't losing that. Enums are, unfortunately, just integers. Yes, you're losing the ability for the debugger to show you the enumerated value, but you're not losing correctness, because C never gave you that in the first place. I've found that having portable code is worth opening another editor tab and cross-referencing the enum's values while in the debugger.
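A standalone two-line illustration of that point (nothing here is from Mike's code):

typedef enum { RED, GREEN, BLUE } color_t;
color_t c = 42;  /* accepted in C: any int converts to the enum type; at best you get a warning */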

There is a flag `-fshort-enums` for gcc that you could try that I believe gives you the behavior you want. Of course by relying upon this you're not writing standards-compliant, portable C. Even if you're not interested in portability with the PC, you are giving up portability to other microcontroller platforms.
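The effect can be checked directly in a test build. A sketch only, assuming C11; the exact size depends on the enumerator values and the target ABI:

#include <assert.h>  /* static_assert in C11 */

enum small { SMALL_A = 0, SMALL_B = 1 };

/* With gcc -fshort-enums, an enum whose enumerators all fit in a byte
   occupies 1 byte; without the flag it is typically sizeof(int) == 4. */
static_assert(sizeof(enum small) == 1, "build with -fshort-enums");

Note that -fshort-enums changes the ABI, so every object linked into the test build needs to be compiled with the same flag.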

On Windows you could try MinGW or Cygwin to run gcc, but I've found that both of those offer a pretty miserable experience (probably better than what you're getting with IAR, though). My team switched to *nix machines (Mac and Linux) because they offered an order of magnitude faster compile times. It may be worth checking out WSL or WSL2 for running a Linux distribution on Windows in order to use gcc at full performance.

Ryan


