Sure! If you have Google Chrome, open the AI clip page and press Ctrl+Shift+J, then click the Elements tab in the Developer Tools window that opens up. From there you can right-click anything on the page, choose Inspect, and Chrome will highlight that element's HTML in the Elements panel so you can take a better look. W3Schools has CSS references that can help you learn about selectors, which are how you tell ParseHub which elements you're looking for. You can change a regular ParseHub auto Select command into a CSS selector with the green button in the left sidebar while the command is selected.

The big one here is the CSS selector for the summary, .flex:has(input) + div:contains("Title:"). It means: find an element with class "flex" that has an input box inside it (the YouTube URL text box), then take the sibling right after it, but only if that sibling is a div containing the text "Title:", which is the sign the summary is ready. In ParseHub the command is set to wait up to 2 minutes for that element to appear if it isn't there yet. (If you want to sanity-check the selector outside ParseHub first, there's a quick local test sketched below.)

There's also a loop: For each item in URLs, where URLs is the array in the Start Value. You can fill that array by opening the CSV with your URLs in Excel, copying the column, pasting it into Visual Studio Code, and running a find & replace with regular expressions turned on: \n -> ",\n". Then add an opening quote before the first URL and a closing quote after the last one by hand, and paste the result into the Start Value between the [] brackets. (Or script the whole conversion; that's also sketched below.)
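To sanity-check the selector outside ParseHub, here's a rough local test in Python with BeautifulSoup (pip install beautifulsoup4). The HTML below is a made-up, simplified stand-in for the real page, and BeautifulSoup's soupsieve engine spells the non-standard :contains() as :-soup-contains(), so the selector differs slightly from what you'd paste into ParseHub:

    from bs4 import BeautifulSoup

    # Made-up, simplified stand-in for the AI clip page: the "flex" div holding
    # the YouTube URL input, followed by the div that shows up once the summary is done.
    html = """
    <div class="flex"><input type="text" value="https://youtube.com/watch?v=abc123"></div>
    <div>Title: Some Video -- Summary: blah blah blah</div>
    """

    soup = BeautifulSoup(html, "html.parser")

    # ParseHub selector:  .flex:has(input) + div:contains("Title:")
    # soupsieve spells the :contains() part as :-soup-contains()
    summary_div = soup.select_one('.flex:has(input) + div:-soup-contains("Title:")')

    print(summary_div.get_text(strip=True) if summary_div else "summary not ready yet")

If that prints the Title line, the same selector shape (with plain :contains()) should behave in ParseHub.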
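And if you'd rather skip the Excel/VS Code round trip, here's a minimal sketch that builds the Start Value array in one go. The file name urls.csv and the first-column assumption are just placeholders for whatever your file actually looks like:

    import csv
    import json

    # Pull the URL column out of the CSV (assumes the URLs sit in the first column;
    # if there's a header row, slice it off with urls = urls[1:]).
    with open("urls.csv", newline="") as f:
        urls = [row[0].strip() for row in csv.reader(f) if row and row[0].strip()]

    # json.dumps produces the quoted, comma-separated array, brackets included,
    # ready to paste straight into the Start Value as the URLs list.
    print(json.dumps(urls, indent=2))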
Also, if the URLs were collected as a list earlier in the same scrape (like the list1 our project adds entries to), then later in the same run you could make a loop For each item in list1 and refer to item.URL in the expression box in the left sidebar for Extracts, Ifs, and so on. You can even make a new project that grabs the last known data from your old YouTube-URLs-only project using
https://www.parsehub.com/api/v2/projects/{PROJECT_TOKEN}/last_ready_run/data?api_key={API_KEY} and filling in the {} placeholders with the info from Project Settings on your old YouTube URL project. In the new project, make a Go to template that uses this as the GET address (between double quotes), then add Extract obj as the first command in the new template, and pick JSON object as the thing to extract in the drop-down list where it usually says Element text. After that you can do things like For each item in obj.list1 and use item.URL in later commands. When you're done, Extract obj again but this time set it to 0; that cleans the workspace back up for your results. Hope this helps!
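P.S. If you ever want to peek at that last-run data outside ParseHub (just to see what the new project will be looping over), here's a minimal Python sketch against the same endpoint. PROJECT_TOKEN and API_KEY are placeholders for the values from Project Settings, and the "list1"/"URL" keys assume that's what your old project named them:

    import requests  # pip install requests

    PROJECT_TOKEN = "your_project_token"   # from Project Settings on the old project
    API_KEY = "your_api_key"               # from your ParseHub account settings

    resp = requests.get(
        f"https://www.parsehub.com/api/v2/projects/{PROJECT_TOKEN}/last_ready_run/data",
        params={"api_key": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    obj = resp.json()

    # Same idea as "For each item in obj.list1" / item.URL inside ParseHub,
    # assuming the old project saved its results as list1 entries with a URL field.
    for item in obj.get("list1", []):
        print(item["URL"])

Inside ParseHub itself you don't need any of this; the Go to + Extract obj setup above does the same thing with no code.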