Hi all,
I'm very new to browser automation, so I'm looking for best practice here. I could hack around this with a loop that sleeps and checks whether browser.url has changed from the form URL, but I'm hoping there's something more deliberate in the Splinter/Selenium API than that.
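Concretely, the hack I have in mind is something like this (the function name, timeout, and interval are just my own illustration, not anything from the Splinter API):

```python
import time

def wait_for_url_change(browser, old_url, timeout=30, interval=0.5):
    # The sleep-and-poll hack: return True once browser.url differs
    # from old_url, or False if `timeout` seconds pass first.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if browser.url != old_url:
            return True
        time.sleep(interval)
    return False
```

It works, but it feels like reinventing something the driver should already know how to do.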
My code is automating a form submission which includes a file upload.
browser.visit(url)
browser.attach_file("file", filename)
browser.find_by_name('commit').first.click()
Fortunately the form can pre-fill most fields from URL parameters, so I'm doing that in the visit(url) call. When I click the commit button it starts uploading the file. I assumed that the click() call wouldn't return until the server had finished receiving the file and sent back a response, so I'm checking that the response has the right title/URL/content:
project_url = "{}/{}/{}".format(
app.config['GITLAB_WEB_URL'],
path,
project
)
if (browser.title.startswith("Import in progress")
and browser.url == project_url + "/import"
and browser.is_text_present("Please wait while we import the repository")):
msg = "Project is importing, check at: {}".format(project_url)
else:
# something went wrong
However, the screenshots I'm taking in the else branch show that the browser is still on the form page, so it must still be uploading the file for the form submission. I looked through the full driver-and-element-api page on readthedocs.io, and it wasn't obvious to me whether there's a method to wait for the browser to have loaded the response once the form submission has really completed.
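For context, the stopgap I'm leaning toward is just polling the three conditions from my check above until they hold or a timeout expires. This wraps my existing browser and project_url; the function name and the timeout/interval defaults are mine, purely for illustration:

```python
import time

def wait_for_import_page(browser, project_url, timeout=60, interval=1.0):
    # Poll until the browser has left the form and is showing the
    # "Import in progress" page; give up after `timeout` seconds.
    deadline = time.monotonic() + timeout
    while True:
        if (browser.title.startswith("Import in progress")
                and browser.url == project_url + "/import"
                and browser.is_text_present(
                    "Please wait while we import the repository")):
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval)
```

Is there a cleaner, built-in way to block until the post-submit page has actually loaded?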