ScraperWiki
1–30 of 860
Welcome to the ScraperWiki Google Group!
If you're stuck on some code, or would like to ask for help finding/scraping/cleaning data, then you've come to the right place.
Alessandro Zonin
9/20/23
Social media scraping with Phantombuster and Apify
Dear all, after a period of extensive experimentation and exploration in the dynamic field of social
Concha Catalan
9/4/20
Spanish Civil War non-profit needs help scraping
Hello, If interested, please send me an email. Thank you! Concha Catalan ihr.world Make a donation /
matze
12/25/17
fetch all the data out of this european network - for volunteering organisations in Europe
Dear experts, first of all - merry Christmas and season's greetings to all of you. Can we fetch
Gabriela Campedelli
2
10/10/17
Facebook data without Opengraph
I found a code that is able to login but I have to understand more about cookies because it is giving
Martin Kaspar, Thad Guidry
2
9/25/17
Re: [scraperwiki] a list of all US hospitals - without the number of beds?
Yes, you can get that kind of list by contacting the US Dept of Health and Human Services. just email
matze, Thad Guidry
2
9/25/17
....a list of all US hospitals with the number of beds?
If I recall the data is in one or many of these https://catalog.data.gov/dataset?ext_prev_extent=-
ai...@sensiblecode.io
8/1/17
Jobs: We're hiring software engineers - please spread the word.
Sensible Code make products that turn messy data into valuable information. We're looking for
Sérgio Spagnuolo
6/12/17
Cannot see the output
Hello, I am trying to extract some tweets using Yanofsky's nice code. I can run it alright in my
Karl Norrena, Aine McGuire
2
12/20/16
unable to open database file
Hello Karl, Thank you for your message and for using our service :-) I'll pass this to our
Francis Irving
8/2/16
Remote software engineer job at ScraperWiki
Hi! We're hiring a software engineer at ScraperWiki, for the first time we're offering remote
Francis Irving
3/31/16
Project Coordinator / Marketing job at ScraperWiki
Hi all! Just to let you know, we've a Project Coordinator job going at ScraperWiki. https://blog.
Marc NEVOUX
2/2/16
e-Réputation
Hello everyone, I'm looking for a developer who can help me change/clean/remove malicious data on
Helical Insight
8/25/15
Is your BI tool really intelligent ?
Is your #BI implementation ready to scale up for ALL of your future requirements? https://lnkd.in/
Francis Irving, Thad Guidry
3
8/24/15
We're hiring! Technical Architect
Didn't know that about sunshine! Near the end of the job posting we point out... It takes only 1½
afzal bilakhiya
7/25/15
AISHA MARINE / MARINE AND INDUSTRIAL PRODUCTS PLEASE CONTACT US.
RESPECTED SIR/MADAM., Good days, We would like to inform you that we are engaged in business of
Jefferson Furtado
7/10/15
Releases spending information only daily... How to scrape it?
Hey folks! This site here: http://187.11.133.15/portaldatransparencia/ releases information only
Paul Bradshaw, Thomas Levine
2
6/27/15
404 pages and masking IP address
Set up your own proxy or hire one. http://proxymesh.com/ Configure the proxy like this. export
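The reply in this thread suggests routing requests through a proxy service such as ProxyMesh by exporting an environment variable. A minimal Python sketch of the same idea, assuming the proxy host, port, and credentials below are placeholders, not real endpoint details:

```python
import os
import urllib.request

# The thread's advice: point HTTP_PROXY / HTTPS_PROXY at your proxy.
# Host and credentials here are placeholders for illustration only.
os.environ["HTTP_PROXY"] = "http://user:password@us.proxymesh.com:31280"
os.environ["HTTPS_PROXY"] = os.environ["HTTP_PROXY"]

# urllib (and libraries such as requests) pick these variables up
# automatically when building outgoing requests.
proxies = urllib.request.getproxies()
print(proxies["http"])  # http://user:password@us.proxymesh.com:31280
```

With the variables set, subsequent `urllib.request.urlopen` or `requests.get` calls are routed through the proxy without further configuration.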
Филипп Кац, Steven Maude
2
5/7/15
PhantomJS
On 06/05/15 18:26, Филипп Кац wrote: > Just realised there is a preinstalled selenium in my
Филипп Кац, Peter Waller
2
5/5/15
SQL error: database or disk is full
Hi, This was a temporary issue with our platform which should now be resolved. Thanks for getting in
Carlos Weffer, Peter Waller
3
4/8/15
Latest Version of my Datasets are gone
Hi Peter You are right. I was not aware that my scraper classic account had been migrated into the
ad...@indokasih.com, Patrick Comerford
2
4/1/15
This dataset is empty.
I had the same problem. The weird thing is that the data is there. If you click Query with SQL and
Patrick Maynard
2/9/15
Uber API
Passing this along from a coworker. Has anyone successfully run a Python script to query Uber's
Martin Kaspar
2
2/5/15
request: accredited EVS organisations and volunteering programmes - ++ 3900 records
Hello again, one additional note: regarding the scraping interests for the scraper of the
Jonathan Cox, …, 'Dragon' Dave McKee
5
1/12/15
Help: how to use scraperwiki library to save data locally
If you're looking to run queries over it, you should just be able to install sqlite3 (sudo apt-
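The reply in this thread points out that the scraperwiki library saves to a local SQLite file you can query directly. A sketch using only the standard library, assuming the default filename `scraperwiki.sqlite` and default table name `swdata` that the library uses (the sample row is hypothetical):

```python
import sqlite3

# scraperwiki.sqlite.save(["id"], row) writes rows to a local file named
# "scraperwiki.sqlite" (table "swdata" by default). Once that file exists,
# the standard-library sqlite3 module can query it; here we create an
# equivalent file by hand for illustration.
conn = sqlite3.connect("scraperwiki.sqlite")
conn.execute("CREATE TABLE IF NOT EXISTS swdata (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT OR REPLACE INTO swdata VALUES (1, 'example')")
conn.commit()

# Run ad-hoc queries over the saved data.
for row in conn.execute("SELECT id, name FROM swdata"):
    print(row)  # (1, 'example')
conn.close()
```

The same file can also be opened interactively with the `sqlite3` command-line tool (`sudo apt-get install sqlite3` on Debian/Ubuntu), which is what the reply was suggesting.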
SkyScraper, 'Dragon' Dave McKee
3
1/5/15
How to submit a form
Thanks Dragon, much appreciated. Happy new year to you. On Monday 5 January 2015 12:53:42 UTC+1
Victoria Parsons
12/16/14
JOB: TBIJ hiring a developer
Hi all, We're looking for someone who can build tools for investigations as well as being a data
Desmond Purcell, …, Páll Hilmarsson
3
11/21/14
Web Scraper - List of UK Starbucks Location Addresses
The Starbucks website has an undocumented API for store locations: https://openapi.starbucks.com/location
Andrey Tomashevskiy, …, Gasper Zejn
9
10/29/14
pdftoxml Error
Another alternative you could try is installing pypdf2xml [1], and then running from StringIO import
yaseen hussain, JJ M
2
9/17/14
Request for Help
You do realise that TripAdvisor specifically prohibits scraping in their terms and conditions? http://
Nilanjan Bhattacharya, Steven Maude
2
8/29/14
how do you output data to a csv using Ruby
Hi Nilanjan, I don't use Ruby personally, but I'd guess the easiest way is probably save your
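The question here was about Ruby, but the suggested approach (save your rows, then write them out as CSV) is the same in any language. Since most examples in this group are Python, a standard-library sketch with hypothetical scraped rows:

```python
import csv
import io

# Hypothetical scraped rows; in practice these would come from the
# scraperwiki datastore rather than a hard-coded list.
rows = [
    {"name": "Alice", "score": 10},
    {"name": "Bob", "score": 7},
]

# Write a header row followed by one CSV row per record.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "score"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Swapping `io.StringIO()` for `open("out.csv", "w", newline="")` writes the same output to a file instead.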