Exceeded maximum execution time


Michael Timpano

Feb 13, 2024, 4:13:16 PM
to Google Apps Script Community
I am hoping I can get some direction around the issue of exceeding the maximum execution time on a Google Apps Script run.

I have been searching this forum and online for a workaround for this issue. I am not 100% certain that I have structured the code in the most efficient and effective way. I have read about async approaches, and about another method where you track the last row that was processed and continue the batch run from there. I haven't tried either of those methods and would like to find out the best way to approach this scenario.

I am making an API call to retrieve order numbers for a particular month (this takes a few seconds to complete, and there are around 2100 numbers/rows). I have successfully returned this data into an array declared as let allOrderIds = [];
The script then calls another function to process each order ID, which is required for the next API call to be passed into a URL:

function processOrderTransactions(orderIds) {
  let allTransactionsData = [];
  orderIds.forEach(orderId => {
    const transactionsUrl = `${rootUrl}orders/${orderId}/transactions.json`;
    try {
      let response = UrlFetchApp.fetch(transactionsUrl, params);
      if (response.getResponseCode() === 200) {
        console.log(`Successfully fetched transactions for order ID: ${orderId}`);
        const transactionsData = JSON.parse(response.getContentText());
        const data = transactionsData.transactions;
        if (data) {
          data.forEach(txn => {
            allTransactionsData.push([txn.order_id, txn.amount]);
          });
        }
      } else {
        console.error(`Failed to fetch transactions for order ID ${orderId}: ${response.getResponseCode()}`);
      }
    } catch (error) {
      console.error(`Error fetching transactions for order ID ${orderId}:`, error.toString());
    }
  });
  appendTransactionsToSheet(allTransactionsData);
  console.log("Data writing complete. Process finished.");
}
Thanks in advance for any feedback or input on this. P.S. Creating Apps Scripts is kind of addicting!!!

Michael O'Shaughnessy

Feb 13, 2024, 10:20:30 PM
to google-apps-sc...@googlegroups.com
The "standard" approach I have seen is to set up a timer and check the elapsed time. If time is "up", create a trigger to run the script again. Then, once the long-running process has finished, you delete the trigger.
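A minimal, self-contained sketch of that elapsed-time check (the names `runBatch`, `processItem`, and `MAX_RUNTIME_MS` are made up for illustration; the trigger and property calls are Apps Script services, so they are shown only in comments):

```javascript
// Hypothetical sketch of the timer-and-continue pattern. The clock is passed
// in as nowFn so the pure logic can be exercised outside Apps Script.
const MAX_RUNTIME_MS = 4.5 * 60 * 1000; // stop well before the 6-minute limit

function runBatch(items, processItem, startIndex, nowFn) {
  const start = nowFn();
  for (let i = startIndex; i < items.length; i++) {
    if (nowFn() - start > MAX_RUNTIME_MS) {
      // In Apps Script you would persist `i` (e.g. with PropertiesService)
      // and schedule a continuation, e.g.:
      // ScriptApp.newTrigger('continueJob').timeBased().after(60 * 1000).create();
      return i; // index to resume from on the next run
    }
    processItem(items[i]);
  }
  return -1; // finished; delete the continuation trigger in Apps Script
}
```

The key design point is that the function returns (rather than blocks) when time is nearly up, so the next triggered run can pick up at the saved index.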

Here is a link that discusses this approach:

Hope this helps!

--
You received this message because you are subscribed to the Google Groups "Google Apps Script Community" group.
To unsubscribe from this group and stop receiving emails from it, send an email to google-apps-script-c...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/google-apps-script-community/11e69c59-4829-4918-b3a6-4bfd91d2b1b4n%40googlegroups.com.

Keith Andersen

Feb 13, 2024, 11:44:46 PM
to google-apps-sc...@googlegroups.com
Michael Timpano,
Are you getting an error in the script editor?

Have you tried running the script on a reduced dataset to see if your script actually runs and is capable of outputting the desired data?

Before we can assess the script properly, some debugging needs to take place.

Michael Timpano

Feb 14, 2024, 7:52:48 AM
to google-apps-sc...@googlegroups.com
Hello Keith,

I am not getting any errors in the script or logs.  I did test a single order id number and it returned the desired data to the spreadsheet. Thanks for your response. 



Michael Timpano

Feb 14, 2024, 7:53:42 AM
to google-apps-sc...@googlegroups.com
Hello Michael, 

Thank you for the linked resource.  I will review it.  Thanks for your response. 


Michael Timpano

Feb 15, 2024, 6:01:26 PM
to Google Apps Script Community
@Michael O'Shaughnessy and @Keith Andersen

The sample code you provided was helpful, but I am confused about how to structure it. The example uses data that already exists in a spreadsheet, whereas I am working with order numbers stored in an array and then batch-processing that array starting from the last order ID processed. The hope is to write the data all at once when the script completes processing. Maybe I should write to the sheet in batches?
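Writing in batches is usually the right instinct: one `setValues()` call per block of rows is far cheaper than one call per row. A sketch of the pure chunking half (the `chunkRows` name is made up; the actual sheet write is shown only as a comment):

```javascript
// Split accumulated rows into fixed-size blocks so each block can be written
// with a single Range.setValues() call instead of one call per row.
function chunkRows(rows, chunkSize) {
  const chunks = [];
  for (let i = 0; i < rows.length; i += chunkSize) {
    chunks.push(rows.slice(i, i + chunkSize));
  }
  return chunks;
}

// In Apps Script, each chunk would then be appended in one call:
// sheet.getRange(sheet.getLastRow() + 1, 1, chunk.length, chunk[0].length)
//      .setValues(chunk);
```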

This is what I am working with so far:
function processOrdersBatch(orderIds) {
  const startTime = new Date().getTime();
  const maxExecutionTime = 280000; // leaving some margin from 6-minute limit

  // Retrieve the last processed order ID, if any
  let lastProcessedOrderId = getLastProcessedOrderId();
  let startIndex = lastProcessedOrderId ? orderIds.indexOf(lastProcessedOrderId) + 1 : 0;


  for (let i = startIndex; i < orderIds.length; i++) {
    const orderId = orderIds[i];
    // Process the order ID (e.g., fetch transactions, calculate totals, etc.)
    processOrder(orderId);

    // Check if approaching the execution time limit
    if (new Date().getTime() - startTime > maxExecutionTime) {
      console.log("Approaching max execution time, stopping...");

      setLastProcessedOrderId(orderId);
      createTriggerIfNeeded(); // Correctly adjusted to target a wrapper function if needed
      return; // Exit function to ensure trigger creation logic is not bypassed

    }
  }
  // Delete the trigger if all orders have been processed
  deleteTriggerIfNeeded();
}

function setLastProcessedOrderId(orderId) {
  const scriptProperties = PropertiesService.getScriptProperties();
  scriptProperties.setProperty('lastProcessedOrderId', orderId.toString());
}

I don't know if this is the most effective and efficient way to reduce script run times. The script was successfully writing each row to the sheet one at a time, but ran into an error when the trigger ran the second time.

try {
    let response = UrlFetchApp.fetch(transactionsUrl, params);
    if (response.getResponseCode() === 200) {
      console.log(`Successfully fetched transactions for order ID: ${orderId}`);

      const transactionsData = JSON.parse(response.getContentText()).transactions;
      transactionsData.forEach(txn => {
        let data = [txn.order_id, txn.id, txn.kind, txn.gateway, txn.status, txn.message, txn.created_at, txn.processed_at, txn.amount];
        allTransactionsData.push(data);
      });

      // Update the last processed order ID
      setLastProcessedOrderId(orderId);
    } else {
      console.error(`Failed to fetch transactions for order ID ${orderId}: ${response.getResponseCode()}`);
    }
  } catch (error) {
    console.error(`Error fetching transactions for order ID ${orderId}:`, error.toString());
  }
  console.log("All transactions have been fetched. Now writing to sheet...");

  if (allTransactionsData.length > 0) {
    appendTransactionsToSheet2(allTransactionsData, spreadsheetId, transactionsSheetName);
    console.log("Data writing complete. Process finished.");
  }


I hope that makes sense in what I am trying to do. Thanks for your help. 

George Ghanem

Feb 19, 2024, 8:28:31 PM
to google-apps-sc...@googlegroups.com
You are missing the getLastProcessedOrderId function, or maybe you just did not copy-paste it into the email.

Just a question on this: is there not a way for you to request all the order IDs in one request rather than making one request at a time? Each URL request can take 3-5 seconds to process, and doing 2100 of them will certainly never be possible within the 6-minute window.
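One concrete option worth checking here: `UrlFetchApp.fetchAll()` accepts an array of request objects and issues the fetches in parallel, which can cut the total wall-clock time considerably (subject to the API's own rate limits). A sketch reusing the `rootUrl`/`params` names from the thread (the `buildRequests` helper is hypothetical):

```javascript
// Build one request object per order so they can be fetched in parallel
// with UrlFetchApp.fetchAll(). rootUrl and params mirror the names used
// earlier in this thread.
function buildRequests(orderIds, rootUrl, params) {
  return orderIds.map(orderId => ({
    url: `${rootUrl}orders/${orderId}/transactions.json`,
    ...params,
    muteHttpExceptions: true, // inspect errors per-response instead of throwing
  }));
}

// In Apps Script:
// const responses = UrlFetchApp.fetchAll(buildRequests(allOrderIds, rootUrl, params));
```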

Your code seems to be on the right path to do it via additional triggers.


Michael Timpano

Feb 20, 2024, 10:28:06 AM
to google-apps-sc...@googlegroups.com
@George Ghanem thank you for getting back to me. 

I have a function that pulls all the order ID numbers into an array and returns those values:

function getOrderIdsForTrans(period, ordersEndpoint) .... return allOrderIds // Return the array of order IDs

Then I call this function to pass each order ID number in and retrieve the data for each one. Is there a better and faster way of retrieving this data?
function processOrderTransactions(orderId, spreadsheetId, transactionsSheetName) {
  const rootUrl = ROOT;
  let allTransactionsData = [];

  if (!orderId) {
    console.error("Invalid order ID:", orderId);
    return; // Skip this iteration or handle the error appropriately
  }


  // Construct the URL for fetching transactions for a specific order
  const transactionsUrl = `${rootUrl}orders/${orderId}/transactions.json`;
  console.log(`Fetching transactions for order ID: ${orderId}`);

  try {
    let response = UrlFetchApp.fetch(transactionsUrl, params);
    if (response.getResponseCode() === 200) {
      console.log(`Successfully fetched transactions for order ID: ${orderId}`);
      const transactionsData = JSON.parse(response.getContentText()).transactions;

      // Process each transaction
      transactionsData.forEach(txn => {
        allTransactionsData.push([
          orderId, txn.id, txn.kind, txn.gateway, txn.status,
          txn.message, txn.created_at, txn.processed_at, txn.amount, txn.currency
        ]);
      });

    } else {
      console.error(`Failed to fetch transactions for order ID ${orderId}: ${response.getResponseCode()}`);
    }
  } catch (error) {
    console.error(`Error fetching transactions for order ID ${orderId}:`, error.toString());
  }
  return allTransactionsData;
}


Keith Andersen

Feb 20, 2024, 11:06:16 AM
to google-apps-sc...@googlegroups.com
Any call to fetch data from a URL is going to consume time. Processing all at once times out.

In a previous test, a smaller number of URL calls actually worked.

So, in testing, I would find the largest number of URL calls that still works. That will give you an idea of how far you must divide up the URL calls.

Once that is determined, you can create a process where you:
1 manually run the function X times to accomplish the task, or

2 set up a timed trigger (every 10-15 minutes?) with a counter to run the function X times; upon reaching the counter max, the trigger still executes, but the function is set not to run.

I would start by collecting the 2100 URLs and writing them to a sheet. Then, manually or on a timed trigger, create a function that collects X number of rows from the sheet and executes the needed task to gather the URL-specific data.

For example, collect rows 1-500 and process them in one function call; the second function call handles 501-1000, etc. You could set the ending row in a helper cell somewhere, or in PropertiesService, and have the function pick it up as the next array starting point.
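That row-windowing bookkeeping can be captured in a small pure helper (the names here are made up; in Apps Script the last processed row would live in a helper cell or PropertiesService):

```javascript
// Compute the next inclusive [startRow, endRow] window of rows to process,
// or null when every row has already been handled.
function nextWindow(lastProcessedRow, batchSize, totalRows) {
  const startRow = lastProcessedRow + 1;
  if (startRow > totalRows) return null; // all rows processed
  const endRow = Math.min(startRow + batchSize - 1, totalRows);
  return [startRow, endRow];
}
```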

Hope that makes sense.

Passions: God, Family, Scriptures, Learning, Data Management, Google Sheets + App Script and much more!

Michael Timpano

Feb 20, 2024, 11:34:07 AM
to google-apps-sc...@googlegroups.com
@Keith Andersen thanks for your help. I did implement something similar to what you were suggesting. I didn't want to write data to the sheet until each batch was processed. This approach seems to work well for larger datasets. This is part of the script, in case it helps anyone else dealing with the time-out issue.
    // Check if time limit is approaching
    if (new Date().getTime() - timeStart > 280000) { // Adjusted to 4.6 minutes
      console.log("Approaching Google Apps Script time limit. Scheduling continuation.");
      PropertiesService.getScriptProperties().setProperty('LAST_PROCESSED_ORDER_ID', String(orderIds[i]));
      ScriptApp.newTrigger('continueJob')
        .timeBased()
        .after(60000) // Schedule after 1 minute
        .create();
      break; // Important to break here to temporarily stop execution
    }
  }
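One detail to watch with this pattern: each continuation creates a new trigger, and a script can only hold a limited number of them, so the finished job should clean them up. The selection logic can be sketched separately (`staleTriggers` is a made-up name; the deletion call itself is an Apps Script service, shown in comments):

```javascript
// Pick out continuation triggers by handler name so they can be deleted.
// Works on anything exposing a getHandlerFunction() method, so the logic
// is testable outside Apps Script.
function staleTriggers(triggers, handlerName) {
  return triggers.filter(t => t.getHandlerFunction() === handlerName);
}

// In Apps Script, once the batch is finished:
// staleTriggers(ScriptApp.getProjectTriggers(), 'continueJob')
//   .forEach(t => ScriptApp.deleteTrigger(t));
```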

Keith Andersen

Feb 20, 2024, 11:47:51 AM
to google-apps-sc...@googlegroups.com
Nice. 

Does setting and deleting the trigger call for authorisation?



Michael Timpano

Feb 20, 2024, 11:50:27 AM
to google-apps-sc...@googlegroups.com
Aside from the first time I authorized the script run, there was no trigger call for authorization.

Keith Andersen

Feb 20, 2024, 11:54:13 AM
to google-apps-sc...@googlegroups.com
I have a feeling that the call to check the time is time-consuming in and of itself.

How many batch runs are you making and what's the total time?


Michael Timpano

Feb 20, 2024, 11:58:56 AM
to google-apps-sc...@googlegroups.com
The initial function took 292.487s then the continueJob function ran 2 more times to complete the batch at 289.797s and 24.683s.



Keith Andersen

Feb 20, 2024, 12:08:28 PM
to google-apps-sc...@googlegroups.com
Good work figuring it out!
Cheers
Keith 


Michael Timpano

Feb 20, 2024, 12:11:36 PM
to google-apps-sc...@googlegroups.com
2 weeks of my time I will never get back!...lol, but learned something new with apps script. 👍

Keith Andersen

Feb 20, 2024, 12:14:30 PM
to google-apps-sc...@googlegroups.com
👍😁😉
We both have that T shirt!


George Ghanem

Feb 21, 2024, 6:03:35 PM
to google-apps-sc...@googlegroups.com
Hi Michael,

I would not know if there is a faster way to pull the order info. You should review the API documentation for the service you are using to see if another API call is more efficient. They may give you the ability to get info on a set of order IDs instead of one by one. The only way to find out is to look through their documentation.


Michael Timpano

Feb 21, 2024, 6:13:32 PM
to google-apps-sc...@googlegroups.com
Thanks George,  I appreciate your help. 
