icecream on mac

Francesco Di Mizio

Aug 2, 2019, 5:57:21 AM
to icecream-users
Hey guys,

fairly new to this. As a test setup I am running the scheduler and an agent on my Mac Pro, and another agent on a Mac mini. Everything was installed with brew.
I can't tell whether the cluster is working correctly.

The iceccd connection seems to be fine. From the scheduler:

[12825] 2019-08-02 11:39:19: accepted 172.28.34.53
[12825] 2019-08-02 11:39:19: login BERDTM6336 protocol version: 39 []

iceccd:

BERDTM6336:~ king$ sudo /usr/local/Cellar/icecream/1.2_1/sbin/iceccd -vvv
No icecc user on system. Falling back to nobody.(Error: No such file or directory)
[44267] 2019-08-02 11:39:13: ICECREAM daemon 1.2.0 starting up (nice level 5)
[44267] 2019-08-02 11:39:13: 8 CPU(s) online on this server
[44267] 2019-08-02 11:39:13: allowing up to 8 active jobs
[44267] 2019-08-02 11:39:13: not detaching
[44267] 2019-08-02 11:39:13: entered process group
[44267] 2019-08-02 11:39:13: ignoring localhost lo0
[44267] 2019-08-02 11:39:13: broadcast en0 172.28.35.255
[44267] 2019-08-02 11:39:13: Netnames:
[44267] 2019-08-02 11:39:13: ICECREAM
[44267] 2019-08-02 11:39:13: ignoring localhost lo0
[44267] 2019-08-02 11:39:13: broadcast en0 172.28.35.255
[44267] 2019-08-02 11:39:13: scheduler not yet found/selected.
[44267] 2019-08-02 11:39:13: Suitable scheduler found at 172.28.34.83:8765 (version: 39)
[44267] 2019-08-02 11:39:13: scheduler not yet found/selected.
[44267] 2019-08-02 11:39:16: scheduler is on 172.28.34.83:8765 (net ICECREAM)
[44267] 2019-08-02 11:39:16: Connected to scheduler (I am known as 172.28.34.53)

Now I've also installed icemon (3.2.0), but I am not able to see the cluster. All I see is the scheduler. Only once did I manage to see another agent bubble. Perhaps this is just a bug in icemon? Can I assume my cluster is set up OK?

Now from the agent:

$ which clang
/usr/local/opt/icecream/libexec/icecc/bin/clang
francescodimizio@BERLTM5548 ~/projects/icecream
$ clang++ -Wall -std=c++11 hw.cpp -o hw


Scheduler:
[12825] 2019-08-02 11:54:03: handle_local_job /Users/francescodimizio/projects/icecream/hw 4
[12825] 2019-08-02 11:54:04: handle_local_job_done 4

Agent:
[92781] 2019-08-02 11:54:03: accepted 7 local unix domain socket
[92781] 2019-08-02 11:54:03: send JobLocalBeginMsg to client
[92781] 2019-08-02 11:54:03: pushed local job 4
[92781] 2019-08-02 11:54:04: scheduler->send_msg( JobLocalDoneMsg( 4) );


Why do I see that handle_local_job? Is it normal?

Henry Miller

Aug 2, 2019, 6:49:32 AM
to icecrea...@googlegroups.com
That all looks normal.

> Now I've also installed icemon (3.2.0), but I am not able to see the cluster. All I see is the scheduler. Only once did I manage to see another agent bubble. Perhaps this is just a bug in icemon? Can I assume my cluster is set up OK?

Hard to say, I'm not a max user so I don't know if icemon works there. 

> Now from the agent:
>
> $ which clang
> /usr/local/opt/icecream/libexec/icecc/bin/clang
> francescodimizio@BERLTM5548 ~/projects/icecream
> $ clang++ -Wall -std=c++11 hw.cpp -o hw

Here is your problem: icecream will not handle linking, only compiling. By not passing -c to clang, you got the default behaviour, which is to compile and link in one step. Icecream detected that and ran the build locally.

The correct compile command is:
$ clang++ -Wall -std=c++11 -c hw.cpp -o hw.o

I don't know the link command line offhand. For a small program like this, icecream is a net loss, so I'd use the command you did and be done with it. For anything more complex I'd use a build tool such as CMake (I use CMake personally, but there are others, each with their own pros and cons; feel free to choose whatever you want). The build tool knows the correct command lines so I don't have to remember them.
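For reference, a plausible two-step build of the hello-world above (standard clang usage; per the explanation above, only the -c step can be farmed out, and the link step always runs locally):

$ clang++ -Wall -std=c++11 -c hw.cpp -o hw.o
$ clang++ hw.o -o hw

With the icecc symlink directory first in PATH, as in the original post, both commands go through the icecc wrapper, but only the compile is eligible for remote execution.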


> Scheduler:
> [12825] 2019-08-02 11:54:03: handle_local_job /Users/francescodimizio/projects/icecream/hw 4
> [12825] 2019-08-02 11:54:04: handle_local_job_done 4
>
> Agent:
> [92781] 2019-08-02 11:54:03: accepted 7 local unix domain socket
> [92781] 2019-08-02 11:54:03: send JobLocalBeginMsg to client
> [92781] 2019-08-02 11:54:03: pushed local job 4
> [92781] 2019-08-02 11:54:04: scheduler->send_msg( JobLocalDoneMsg( 4) );


> Why do I see that handle_local_job? Is it normal?

Francesco Di Mizio

Aug 2, 2019, 10:23:34 AM
to icecream-users
Hi, and thanks for the reply. This was just an example, of course. I have a big CMake project and was getting the same local_job there. Hence I wanted to see icecream ship a single file to another client.
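(A hypothetical way to watch what the client decides, assuming the ICECC_DEBUG environment variable described in icecream's documentation, is to run the compile with verbose client logging:

$ ICECC_DEBUG=debug clang++ -Wall -std=c++11 -c hw.cpp

which should log whether the job ran locally or was shipped to a remote daemon.)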

Now:

$ clang++ -Wall -std=c++11 -c hw.cpp

never returns and the client says:

[26242] 2019-08-02 16:16:31: accepted 7 local unix domain socket
[26242] 2019-08-02 16:16:31: get_native_env  (clang)
[26242] 2019-08-02 16:16:31: start_create_env clang

and that's all. Where did you read about splitting compilation and linking? I don't see anything in the docs.

Lubos Lunak

Aug 16, 2019, 5:39:41 AM
to icecrea...@googlegroups.com
On Friday 02 of August 2019, Francesco Di Mizio wrote:
> Hi, and thanks for the reply. This was just an example, of course. I have a
> big CMake project and was getting the same local_job there. Hence I wanted
> to see icecream ship a single file to another client.
>
> Now:
>
> $ clang++ -Wall -std=c++11 -c hw.cpp
>
> never returns and the client says:
>
> [26242] 2019-08-02 16:16:31: accepted 7 local unix domain socket
> [26242] 2019-08-02 16:16:31: get_native_env (clang)
> [26242] 2019-08-02 16:16:31: start_create_env clang

That looks like your icecc-create-env script hangs (or just takes a long time). Try running it manually and see what happens.
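A sketch of such a manual run, assuming the Homebrew install path from earlier in the thread and the --clang option (both are assumptions; check icecc-create-env --help for your version):

$ /usr/local/Cellar/icecream/1.2_1/bin/icecc-create-env --clang /usr/bin/clang

If it completes, it should leave a .tar.gz environment tarball in the current directory; if it hangs, that is the same hang the daemon hit in start_create_env.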

> and that's all. Where did you read about splitting compilation and linking?
> I don't see anything in the docs.

"Like distcc, Icecream takes compile jobs from a build and distributes it
among remote machines allowing a parallel build." (second sentence in
README.md).
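In other words, only individual compile jobs are distributed, so a single compile on an otherwise idle machine may well stay local. To actually see jobs go remote, run a parallel build; a hypothetical invocation for the CMake project mentioned above (the -j value is only an illustration):

$ export PATH=/usr/local/opt/icecream/libexec/icecc/bin:$PATH
$ cmake --build build -j16

With more parallel jobs than local slots (the daemon above allows 8), the surplus compiles become candidates for remote execution.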

--
Lubos Lunak