No space left on device when generating Highcharts_Belchertown/json/year.json.tmpl


Scott Grayban

Sep 23, 2019, 11:09:47 PM
to weewx-user
Getting this error and can't trace why it's happening as there is plenty of space on the SD card.

cheetahgenerator: Generate failed with exception '<type 'exceptions.IOError'>'
cheetahgenerator: **** Ignoring template /etc/weewx/skins/Highcharts_Belchertown/json/year.json.tmpl
cheetahgenerator: **** Reason: [Errno 28] No space left on device
 ****  Traceback (most recent call last):
 ****    File "/usr/share/weewx/weewx/cheetahgenerator.py", line 332, in generate
 ****      fd.write(str(compiled_template))
 ****  IOError: [Errno 28] No space left on device

# /bin/df
Filesystem     1K-blocks    Used Available Use% Mounted on
/dev/root       30196004 9176308  19462784  33% /
devtmpfs          443152       0    443152   0% /dev
tmpfs             447760       0    447760   0% /dev/shm
tmpfs             447760   45412    402348  11% /run
tmpfs               5120       4      5116   1% /run/lock
tmpfs             447760       0    447760   0% /sys/fs/cgroup
weewx_html         20480   12836      7644  63% /home/weewx/public_html
/dev/mmcblk0p6     70498   23008     47490  33% /boot
tmpfs              89552       0     89552   0% /run/user/0
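A quick way to rule out the wrong-mount trap is to run df against the exact path being written, so the small weewx_html tmpfs is reported rather than the roomy root filesystem. A minimal sketch, using the path from this thread as the default:

```shell
#!/bin/sh
# Show block and inode usage for the filesystem that actually holds a
# given path. Default path is the public_html mount from this thread;
# pass another path as the first argument.
target=${1:-/home/weewx/public_html}
if [ ! -e "$target" ]; then
    target=/    # fall back to root if the path does not exist here
fi
df -P  "$target"    # block usage for that one mount
df -Pi "$target"    # inode usage for that one mount
```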

Thomas Keffer

Sep 23, 2019, 11:29:42 PM
to weewx-user
Things other than a lack of file space can cause this error. Take a look at this StackOverflow thread: https://stackoverflow.com/questions/6998083/python-causing-ioerror-errno-28-no-space-left-on-device-results-32766-h
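For reference, the linked thread describes ENOSPC being raised by a per-directory link limit (roughly 32,000 subdirectories on ext3) rather than a full disk. A hedged sketch for counting entries in the report output directory (the path is this thread's setup; adjust to yours):

```shell
#!/bin/sh
# Count directory entries under the report output directory; a very
# large count can trigger ENOSPC on older filesystems even when free
# blocks and inodes remain. Path defaults to the setup in this thread.
dir=${1:-/home/weewx/public_html}
[ -d "$dir" ] || dir=/tmp    # fall back so the sketch runs anywhere
find "$dir" -maxdepth 1 | wc -l
```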

-tk

--
You received this message because you are subscribed to the Google Groups "weewx-user" group.
To unsubscribe from this group and stop receiving emails from it, send an email to weewx-user+...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/weewx-user/751a4bd2-3dce-4393-a62d-6655d9b6dd50%40googlegroups.com.

Scott Grayban

Sep 23, 2019, 11:35:11 PM
to weewx-user
Well, it's not inodes:

# /bin/df -i
Filesystem      Inodes  IUsed   IFree IUse% Mounted on
/dev/root      1925760 277090 1648670   15% /
devtmpfs        110788    412  110376    1% /dev
tmpfs           111940      1  111939    1% /dev/shm
tmpfs           111940    580  111360    1% /run
tmpfs           111940      6  111934    1% /run/lock
tmpfs           111940     12  111928    1% /sys/fs/cgroup
weewx_html      111940     75  111865    1% /home/weewx/public_html
/dev/mmcblk0p6       0      0       0     - /boot
tmpfs           111940     11  111929    1% /run/user/0

Andrew Milner

Sep 24, 2019, 1:05:35 AM
to weewx-user
Size of swapfile - or even lack of a swap file perhaps?

Scott Grayban

Sep 24, 2019, 1:06:50 AM
to weewx-user
Not that either...

# free -m
              total        used        free      shared  buff/cache   available
Mem:            874         512          95          54         266         253
Swap:          1749         106        1642

Andrew Milner

Sep 24, 2019, 2:29:47 AM
to weewx-user
Looking at the error again - could it be out of memory rather than file space? As you are on a Pi, have you tried adjusting graphics memory vs CPU memory sizes? What size memory RPi are you using? My RPi3 gives
MemTotal 927  Used 269  Free 177  Shared 54  Buff/cache 480  Available 540
Swap total 99  Used 0  Free 99

so I seem to have a lot more free memory available, but I do run headless and only access via ssh - so have little need for graphics memory.

Scott Grayban

Sep 24, 2019, 2:58:49 AM
to weewx-user
GPU is 64MB

No X running. Only weewx and hostapd running

# free -m
              total        used        free      shared  buff/cache   available
Mem:            937         160         504          33         272         693
Swap:          1875           0        1875



Scott Grayban

Sep 24, 2019, 3:03:34 AM
to weewx-user
Oh and I also have mosquitto and apache running.

I did a reboot again -- maybe I can figure out why I am getting this error. It seems to be untraceable at this point.

Andrew Milner

Sep 24, 2019, 3:44:10 AM
to weewx-user
Just out of curiosity, what happens if you stop weewx and run the reports manually using wee_reports?

Scott Grayban

Sep 24, 2019, 3:53:31 AM
to weewx-user
Same thing...

# wee_reports 
Using configuration file /etc/weewx/weewx.conf
Generating for all time


Sep 24 00:47:42 weewx-pi wee_reports[4524]: cheetahgenerator: **** Ignoring template /etc/weewx/skins/Highcharts_Belchertown/json/year.json.tmpl
Sep 24 00:47:42 weewx-pi wee_reports[4524]: cheetahgenerator: **** Reason: [Errno 28] No space left on device
Sep 24 00:47:42 weewx-pi wee_reports[4524]: ****  Traceback (most recent call last):
Sep 24 00:47:42 weewx-pi wee_reports[4524]: ****    File "/usr/share/weewx/weewx/cheetahgenerator.py", line 332, in generate
Sep 24 00:47:42 weewx-pi wee_reports[4524]: ****      fd.write(str(compiled_template))
Sep 24 00:47:42 weewx-pi wee_reports[4524]: ****  IOError: [Errno 28] No space left on device



It's only this one report --> /etc/weewx/skins/Highcharts_Belchertown/json/year.json.tmpl
All the others generate, and I have been working this error for weeks... just can't figure out why. Settings are all correct, or all the reports would be failing.
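Worth noting: if only the largest template fails, its output size is the natural suspect. A quick sketch to list the generated files largest-first so they can be compared against the free space on the mount (the json/ subdirectory under public_html is an assumption based on the skin layout in this thread):

```shell
#!/bin/sh
# List files largest-first; compare the biggest against the free space
# reported for its mount. The json/ path is an assumption from the
# Belchertown skin layout discussed in this thread.
dir=${1:-/home/weewx/public_html/json}
[ -d "$dir" ] || dir=/tmp    # fall back so the sketch runs anywhere
ls -lS "$dir" | head -n 10
```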

Andrew Milner

Sep 24, 2019, 5:00:53 AM
to weewx-user
Perhaps you should speak directly to Pat if the issue is only with the Highcharts year plot of Belchertown, and see if he can reproduce the fault on his test systems. That route is more likely to resolve the problem, I would have thought.

Scott Grayban

Sep 24, 2019, 5:05:01 AM
to weewx-user
Ok thanks.

Glenn McKechnie

Sep 24, 2019, 5:14:22 AM
to weewx...@googlegroups.com
Thoughts...

/home/weewx/public_html is 20 Meg in size, 13 Meg is used, leaving ~7 Meg
for Belchertown and admin.
Is that enough for your setup? (Here Belchertown takes 1.5 Meg.)

I'm assuming by the inode count that public_html is in tmpfs?
Is there a size limit for tmpfs in /etc/default/tmpfs?

How large is the file that
/etc/weewx/skins/Highcharts_Belchertown/json/year.json.tmpl generates?

And what happens if you point the html space at the physical
disc (temporarily)? What is the size of the directory then (du -h)?

Finally, if none of this is on a tmpfs, what does 'sudo dmesg' say -
in particular, any write errors?

And the type of the weewx_html partition - ext4? Journal reservation
will take a chunk of that.
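The tmpfs checks above can be done in one pass; a minimal sketch (the mount point is from this thread, and reading the kernel log may need root):

```shell
#!/bin/sh
# Show any tmpfs mounts with their options -- a tmpfs returns ENOSPC at
# its size= limit no matter how much RAM is free -- then scan the
# kernel log for recent write errors (dmesg usually needs root).
mount -t tmpfs
dmesg 2>/dev/null | grep -iE 'error|i/o|enospc' | tail -n 20 || true
```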


--


Cheers
Glenn

rorpi - read only raspberry pi & various weewx addons
https://github.com/glennmckechnie

Scott Grayban

Sep 24, 2019, 5:25:49 AM
to weewx-user
I increased the tmpfs size and that was it. I didn't associate it with the space issue because I thought I had enough showing in free.
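For anyone landing here later: a tmpfs defined in /etc/fstab is typically enlarged by editing its size= option and remounting. A hypothetical sketch using the mount name and path from this thread (not Scott's actual config):

```shell
# Hypothetical /etc/fstab line, growing the tmpfs from 20M to 64M:
#   weewx_html  /home/weewx/public_html  tmpfs  defaults,noatime,size=64m  0  0

# Apply the new size without a reboot (needs root):
mount -o remount,size=64m /home/weewx/public_html
```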

Andrew Milner

Sep 24, 2019, 6:38:46 AM
to weewx-user
Yes, but you would have been looking AFTER Cheetah had abandoned its processing due to lack of space - so any partially written files would have been deleted by the time you came to look, as Cheetah shut down.

Pat

Sep 25, 2019, 10:16:28 PM
to weewx-user
Off topic, but maybe relevant. Please upgrade to the latest version of Belchertown. There have been a lot of optimizations since the version you're running. For example, the "sub-skin" Highcharts_Belchertown has been fully removed.

Pat

Sep 25, 2019, 10:18:13 PM
to weewx-user
And before I forget, here are the instructions on how to upgrade from your older version. As always, please have a working backup.
