build robots_file fails


Matthew Leingang

May 17, 2016, 10:21:12 AM
to nikola-discuss
Hello,

I'm in the process of migrating about 140 HTML files and a dozen images to a Nikola site.  I suddenly started getting this error in the build process.  I've narrowed it down to the robots_file target:

$ nikola build robots_file
Scanning posts......done!
Traceback (most recent call last):
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/doit/doit_cmd.py", line 168, in run
    return command.parse_execute(args)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/doit/cmd_base.py", line 122, in parse_execute
    return self.execute(params, args)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/doit/cmd_base.py", line 405, in execute
    return self._execute(**exec_params)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/doit/cmd_run.py", line 253, in _execute
    return runner.run_all(self.control.task_dispatcher())
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/doit/runner.py", line 247, in run_all
    self.run_tasks(task_dispatcher)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/doit/runner.py", line 210, in run_tasks
    if not self.select_task(node, task_dispatcher.tasks):
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/doit/runner.py", line 117, in select_task
    if node.ignored_deps or self.dep_manager.status_is_ignore(task):
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/doit/dependency.py", line 579, in status_is_ignore
    return self._get(task.name, "ignore:")
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/doit/dependency.py", line 221, in get
    self._db[task_id] = json.loads(task_data.decode('utf-8'))
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/json/__init__.py", line 318, in loads
    return _default_decoder.decode(s)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/json/decoder.py", line 343, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/json/decoder.py", line 359, in raw_decode
    obj, end = self.scan_once(s, idx)
ValueError: Expecting ',' delimiter: line 1 column 36721 (char 36720)

Not really sure what I did wrong.  In the meantime, can I disable automatic building of the robots_file?  I don't need it yet.
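If it helps with diagnosis: the traceback appears to come from doit's dependency database (the .doit.db file in the site root), where one stored entry is no longer valid JSON.  A rough sketch for finding the bad entry, assuming doit's default dbm backend and that default path:

import dbm
import json

# Diagnostic sketch (assumptions: doit's default dbm backend and a
# ".doit.db" file in the current directory): walk the state database
# and report any stored value that no longer parses as JSON.
with dbm.open('.doit.db', 'r') as db:
    for key in db.keys():
        try:
            json.loads(db[key].decode('utf-8'))
        except ValueError as err:
            print(key, err)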

Thanks in advance for any clues.

Best,
Matthew

Roberto Alsina

May 17, 2016, 10:37:23 AM
to nikola-discuss
You can do this in your conf.py:

DISABLED_PLUGINS = ['robots']
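That keeps Nikola from loading the robots plugin, so the robots_file task shouldn't be generated at all.  You can check with an ordinary build; with the plugin disabled, robots_file should simply no longer show up among the tasks run:

$ nikola build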

However, could you email me a copy of your sitemap? It seems to be broken, and I would like to know if it's the sitemap plugin creating a broken one, or the robots plugin not being smart enough.


Matthew Leingang

May 17, 2016, 12:05:35 PM
to nikola-discuss, ral...@kde.org
Thanks for the workaround.  Is the sitemap the rss.xml file?  I don't have a file called sitemap. 

Best,
Matthew
(attachment: rss.xml)

Chris Warrick

May 17, 2016, 1:46:26 PM
to nikola-discuss
On 17 May 2016 at 18:05, Matthew Leingang <mlei...@gmail.com> wrote:
>
> Thanks for the workaround. Is the sitemap the rss.xml file? I don't have a file called sitemap.

It’s sitemap.xml. Maybe your build failed too early for that?

Either way, I’d recommend removing .doit.db and trying to build again.
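Something along these lines, assuming the database is at its default location in the site root (the glob is there because some dbm backends split it across more than one file):

$ rm .doit.db*
$ nikola build

That throws away doit's cached task state, so the next build starts from scratch and rewrites the database.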

--
Chris Warrick <https://chriswarrick.com/>
PGP: 5EAAEA16

Matthew Leingang

May 17, 2016, 3:05:01 PM
to nikola-discuss
Aha!  That did the trick.  This is my first time with doit, so I didn't know about cleaning that database.

Best,
Matthew

Chris Warrick

May 17, 2016, 3:07:14 PM
to nikola-discuss
On 17 May 2016 at 21:05, Matthew Leingang <mlei...@gmail.com> wrote:
> Aha! That did the trick. This is my first time with doit so I didn't know
> about cleaning that database.
>
> Best,
> Matthew

It got corrupted.  It might be a good idea to scan your disk for errors, just in case.  Also, you can try re-enabling robots.txt; it should work fine now.
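Re-enabling it should just be a matter of dropping the workaround from conf.py and rebuilding, roughly:

# conf.py: remove the workaround, or set it back to the (assumed) default
DISABLED_PLUGINS = []

$ nikola build

After that, robots.txt should be generated again.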