A call to wget with my username, password, and a "-U
$MOZILLA_USER_AGENT" could work in theory, but even if the page did
not need any JavaScript to render its initial contents, the HTML would
be hard to parse, some comments would be truncated, etc...
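For concreteness, here is a minimal sketch of what such a wget call
could look like - the user-agent string and the cookies.txt file are
placeholders, not things I have tested, and wget would still need a
valid session to see anything non-public:

```shell
# Sketch only: the user-agent string and cookies.txt are assumptions;
# the page id is the one from the example further down in this message.
MOZILLA_USER_AGENT='Mozilla/5.0 (X11; Linux x86_64; rv:31.0) Gecko/20100101 Firefox/31.0'
PAGE_URL='https://www.facebook.com/754664537913868'
# Build the command as a string first, so it can be inspected before running:
CMD="wget -U '$MOZILLA_USER_AGENT' --load-cookies cookies.txt -O page.html '$PAGE_URL'"
echo "$CMD"
```

Running the echoed command would save the (hard-to-parse) HTML to
page.html, which is exactly the situation I am trying to avoid.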
I did try to parse the HTML for a while, months ago, and I also tried
parsing the output of selecting the whole text of the page in the
browser with ctrl-A and then copying-and-pasting that into an Emacs
buffer... both approaches were very frustrating, and I was stumbling
all the time on corner cases and on rules that I had to guess. Having
the contents of posts as JSON will probably make things much easier.
By the way: is it possible to load fbcmd's functions from "php5 -a"
and call them directly? I am using Debian stable, with PHP 5.5. I
run interactive programs from Emacs, using this trick here - the demo
starts at 0:16 -
but the interactive mode of "php5 -a" is a bit limited - for example,
autoloads don't work... it would certainly be much easier to just
run these things,
  $p_id = '754664537913868';
  echo get_json_of_page_from_id($p_id);
  echo get_json_of_page_from_url($p_url);
than to also have to implement command-line options for calling these
functions...
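To make the question concrete, what I would like is a session roughly
like the one below - the path to fbcmd's entry point is a pure guess
on my part, and I don't know whether requiring it actually defines
these functions or whether its autoloading works at all outside the
fbcmd CLI:

```
$ php5 -a
Interactive shell

php > // The require path below is a guess - fbcmd's real entry point
php > // may be elsewhere, and may not work outside the fbcmd CLI:
php > require '/usr/share/php/fbcmd/fbcmd.php';
php > $p_id = '754664537913868';
php > echo get_json_of_page_from_id($p_id);
```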
Cheers, TIA &c =),
Eduardo Ochs