My goal is just to get the text from each specific URL.
I'm comfortable with Python, but I haven't had much success with the comcraw package.
Can you give me some hints or starting points :)? Greetings :D
Sebastian Nagel
Jan 19, 2023, 6:22:00 AM1/19/23
to common...@googlegroups.com
Hi Jonathan,
> with the comcraw package.
I guess you mean [1]?
> I have a large (200k) list of ulrs
Given the size of the list, you're better off using the columnar index.
I've described the general idea and procedure in [2].
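The general idea of the columnar-index lookup can be sketched in Python. The snippet below builds an Athena/Presto-style SQL query that joins a list of input URLs against Common Crawl's columnar index table; the table name (`ccindex.ccindex`), the crawl label, and the column names are assumptions based on the cc-index table schema and should be checked against the current schema before running:

```python
# Hypothetical sketch: build a SQL query joining input URLs against the
# Common Crawl columnar index. All table/column names are assumptions to
# verify against the actual cc-index table schema.

def build_index_query(urls, crawl="CC-MAIN-2023-06"):
    """Return a SQL string fetching WARC record locations for the given URLs."""
    # Escape single quotes and inline the URLs as a VALUES list.
    values = ",\n        ".join(
        "('{}')".format(u.replace("'", "''")) for u in urls
    )
    return (
        "WITH input_urls (url) AS (VALUES\n        {values})\n"
        "SELECT cc.url, cc.warc_filename,\n"
        "       cc.warc_record_offset, cc.warc_record_length\n"
        "FROM ccindex.ccindex AS cc\n"
        "JOIN input_urls ON cc.url = input_urls.url\n"
        "WHERE cc.crawl = '{crawl}'\n"
        "  AND cc.subset = 'warc'"
    ).format(values=values, crawl=crawl)

print(build_index_query(["https://example.com/", "https://example.org/page"]))
```

For a 200k-URL list you would not inline the URLs as a VALUES clause; instead, upload them as a file and register them as their own table, then join against that.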
But there are other ways to do this as well, maybe even
more efficient ones. See [3] for just one example. Which programming
language and platform do you want to use? Do the 200k URLs correspond
to 200k web pages or to web sites (domains)? Do you want only the most
recent capture per web page, or also the history?
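Once an index lookup has yielded the WARC filename, offset, and length of a capture, the record itself can be fetched with an HTTP range request against the public Common Crawl data endpoint. A minimal stdlib-only sketch (the WARC parsing here is deliberately naive, splitting on blank lines; a real implementation would use a WARC library such as warcio):

```python
# Hedged sketch: fetch one capture from Common Crawl via an HTTP range
# request. warc_filename, offset and length are assumed to come from a
# columnar-index lookup; the endpoint URL is the public HTTP data mirror.
import gzip
import urllib.request

CC_PREFIX = "https://data.commoncrawl.org/"

def byte_range_header(offset, length):
    """HTTP Range header covering one WARC record (inclusive byte range)."""
    return {"Range": "bytes={}-{}".format(offset, offset + length - 1)}

def fetch_record(warc_filename, offset, length):
    """Return the raw HTTP payload bytes of one WARC response record."""
    req = urllib.request.Request(CC_PREFIX + warc_filename,
                                 headers=byte_range_header(offset, length))
    with urllib.request.urlopen(req) as resp:
        data = resp.read()
    # Each record is a self-contained gzip member; decompress it alone.
    record = gzip.decompress(data)
    # Naive split: WARC headers, then HTTP headers, then the body.
    _warc_headers, _, rest = record.partition(b"\r\n\r\n")
    _http_headers, _, body = rest.partition(b"\r\n\r\n")
    return body
```

Extracting readable text from the returned HTML (the original goal) is then a separate step, e.g. with an HTML parser of your choice.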