I have the drive mapped on the server where the macro is located, and I have tried a couple of local locations for hosting the macros as well (I mapped them in using the User Settings). However, whenever I schedule my workflow, it tells me that it cannot find my macro file.
FYI, the macro I created is a knockoff of the Publish to Tableau tool, so our team can easily switch between our Prod/Dev/QC environments using our functional ID without having to create a new tool or go looking for the URL.
I sort of figured it out. When I saved it to the Gallery (we still aren't sure how we are going to keep all of our workflows organized), it messed up all of my dependencies. When I scheduled the version saved to my hard drive, everything worked great. I believe saving without the assets selected may solve this issue.
I have several apps published to the Gallery with Macros and other data on shared drives. You are correct - unselect the assets during the publishing routine and the app in the Gallery will get them from their location.
The problem that I would currently like some help with is around scheduling a workflow with a macro in it. My macro is saved on the server in a folder intended to hold everyone's macros. My installation is pointed at that macro folder as the 'Macro Repository', as recommended in the help pages. The installation that runs on the server is also pointed at that same macro folder as its default (and only) macro repository. I can run the macro without errors from either my machine or the server's installation, but when scheduled, I get an error that the macro cannot be found. Do I also need to somehow point the scheduler (controller) at that 'macro repository', or am I doing something else wrong? In all cases, if I edit the XML and fill in the full file path, it works fine from all machines and the scheduler.
You have the option to run only the currently selected code (Run > Run selected code in the Script Editor). However, depending on how complex your macro is, this will not work if the selection includes for loops or relies on variables declared outside it.
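As a language-agnostic illustration of that limitation (a Python stand-in, not your macro language), running only a selected line in isolation fails when it depends on names declared earlier in the script:

```python
# The full script declares a variable, then updates it inside a loop.
total = 0
for x in [1, 2, 3]:
    total += x
print(total)  # 6

# Selecting and running ONLY the line "total += x" would fail with a
# NameError, because neither `total` nor `x` exists in that selection.
```

The same principle applies in most script editors: the selection must be self-contained.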
In general, the solution to this dilemma is robust error checking. Wherever the macro errors out and stops, add an if statement on the preceding line that checks for the failing condition proactively, then prompt the user to redraw their region until they draw a usable one.
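The re-prompt pattern described above can be sketched like this (Python for brevity; the validation rule and prompt function are hypothetical stand-ins):

```python
# Proactive error checking: validate the input *before* the step that
# would otherwise fail, and re-prompt until the condition holds.
def get_valid_region(prompt_fn, is_usable):
    region = prompt_fn()
    while not is_usable(region):   # the "if before the failing line" idea,
        region = prompt_fn()       # looped until the user draws a usable region
    return region

# Demo with canned responses standing in for repeated user redraws
attempts = iter([None, "", "valid-region"])
result = get_valid_region(lambda: next(attempts), lambda r: bool(r))
print(result)  # valid-region
```

In a real macro the prompt would be the region-drawing dialog and the check would test whatever condition currently triggers the error.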
Thanks for your prompt response; I really appreciate your help! I was able to change the properties and re-run the macro; however, there is only one data output for some reason (see screenshot below). That does not seem to make sense, because I'd expect at least thousands of rows of output. Do you happen to know what the issue might be? Thank you!
Hi @cq,
As a leader in the Alteryx Community, I have the ability to identify & mark accepted solutions on behalf of community members - and recently did so on this thread. If you have any questions or concerns with the solution(s) I selected please let me know by replying to this post.
I do want to point out that you don't necessarily need to do step 2 (you don't need to select the Macro Output tool) - just go to the Interface Designer, then click the gear icon to change the Output Mode to 'Auto Configure by Name'. I tested @jasperlch's solution and it worked for me as stated, so I've marked it as the answer.
This problem has occurred when loading several files within a macro where one or more of them contained only the header row and no data. I have tried a number of permutations of Input / Dynamic Input / Dynamic Rename tools, but I have not yet figured out how to fix the problem.
I am operating under Version 2022.1.1.42590, and getting to the Properties panel wasn't obvious. In case anyone else is wondering: to get to the properties option that fixes the "schema mismatch" problem, I had to first make sure the Interface Designer is active:
View > Interface Designer (Ctrl+Alt+D) > click the Properties icon > update the Output Mode to 'Auto Configure by Name'.
Hey @Daniel_Kaeppel, if you go to Options > Advanced Options > Workflow Dependencies, is your Workflow still referencing your macro from an absolute location there (i.e. locally)? If you try changing this to 'All Relative' or 'All UNC', does that remedy the issue?
@Daniel_Kaeppel when you save your workflow to gallery, did you package the assets (macro) within the workflow? You can do this in the workflow settings > manage workflow assets. I think that'd be the easiest way as the method I outlined above relies on you saving the macro to gallery as well.
I've used a Sample tool configured with 'First N (4) rows', then a Join to match, then output the unmatched remaining records to an external file... to repeat the process. No luck getting any variation of macro (standard, batch, iterative) to loop through the records 4 at a time.
To do this with a batch macro, you need to identify the batches outside the macro, then feed in all the data along with a list of the batches. There is no need to sample the first 4 rows within the macro; they will be fed in 4 at a time.
Alternatively, with an iterative macro you would sample the first 4 rows then feed them out of a separate output, and iterate the remaining records. The macro would have two separate outputs, and you need to specify in the interface designer which output gets iterated back to the input.
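To make the batch-macro approach concrete outside of Alteryx, here is a rough Python sketch of the same idea: tag every record with a batch ID up front, then process each group of 4 in turn (the input data and `process` step are made up for illustration):

```python
# Hypothetical input rows standing in for the real data stream
records = list(range(1, 11))

BATCH_SIZE = 4
# Assign a batch number to each record, as you would before
# feeding all rows plus the batch list into a batch macro
batched = [(i // BATCH_SIZE, r) for i, r in enumerate(records)]

def process(batch):
    # Stand-in for whatever the macro does with each group of rows
    return sum(batch)

results = []
for batch_id in sorted({b for b, _ in batched}):
    group = [r for b, r in batched if b == batch_id]
    results.append(process(group))

print(results)  # [10, 26, 19]
```

The key point is the same as above: the batching logic lives outside the per-batch processing, which only ever sees one group at a time.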
I'm happy to help further, but the best approach might depend on what you actually want to do with your data. I assume the current process is a simplification for sharing, but what you're doing is possible without any macros at all!
I have an Excel macro that copies a specific sheet from the source workbook to a destination workbook; both are on SharePoint. The macro resides in the destination workbook. I want to create a scheduled flow to run the macro every business day, but I'm having some issues.
My Power Automate version does not have a "Run Excel Macro" action, so I am trying to use the "Run Script" action, which appears as though it can be used to trigger a macro and not just an Office Script. In the script field I tried a few things, but it keeps giving the error that it's a Bad Request - unable to parse script reference.
I tried the following naming conventions for the script field:
Daily Orders!Module1CopySheet (without the dot)
I even tried copying/pasting my entire macro to the script field but to no avail.
Can someone please advise on how to make this happen, or whether there is an alternative method to run this routine job?
@coolbeans23
I've had all sorts of suggestions from AI - it's still hit and miss. I'm not an expert on macros, so someone else might be able to help regarding coding scheduled macros or Windows Task Scheduler. Or have a Google.
Hmm... two different AIs said you can run Excel macros from Power Automate in the cloud (they could possibly be wrong) using a "Run Excel Macro" or "Run Script" action (I only have the latter in my version). I don't have Office Scripts.
Perhaps I can fill you in with more detail. The source workbook has a specific sheet that is connected to a database and displays data, some of which needs to be corrected; a user remediates any incorrect data in the database, then refreshes the Excel sheet to see their changes reflected. My goal is to capture changes to the data for each day of the month to see what has been corrected (if any).

My destination workbook has a macro that makes a static copy of the sheet from the source workbook and creates a new tab with the current date, i.e. Orders 4.20.2024, Orders 4.21.2024, Orders 4.22.2024, etc. I just need an automated way for the macro to run every business day at 11pm (even if my computer is shut down), so I thought Power Automate could help accomplish this.
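If Power Automate can't trigger the VBA macro directly, one possible workaround (a rough sketch, not a tested solution) is to skip VBA and have a small Python script, scheduled via Windows Task Scheduler on an always-on machine or server, take the same date-stamped snapshot with openpyxl. The file paths and sheet name below are made-up assumptions, and this assumes the SharePoint workbooks are synced to local paths:

```python
from datetime import date
from openpyxl import load_workbook  # third-party: pip install openpyxl

def snapshot_sheet(src_path, dst_path, sheet_name="Orders"):
    """Copy the *values* of one sheet into a new date-stamped tab
    in the destination workbook (a static snapshot, no formulas)."""
    src_ws = load_workbook(src_path, data_only=True)[sheet_name]
    dst_wb = load_workbook(dst_path)

    # New tab named like "Orders 4.20.2024", matching the convention above
    today = date.today()
    new_title = f"{sheet_name} {today.month}.{today.day}.{today.year}"
    dst_ws = dst_wb.create_sheet(new_title)

    # Copy cell values one by one to make the static copy
    for row in src_ws.iter_rows():
        for cell in row:
            dst_ws.cell(row=cell.row, column=cell.column, value=cell.value)

    dst_wb.save(dst_path)
    return new_title

if __name__ == "__main__":
    # Hypothetical local paths; in practice these would be the synced files
    snapshot_sheet("source.xlsx", "destination.xlsx")
```

Running this from a server-side scheduler would also satisfy the "even if my computer is shut down" requirement, which a task on your own machine would not.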
I am designing a macro that has a drop-down question that gets populated from a database table. The trouble I have is that the macro seems to cache the database entries from the last time I opened the macro, ran it, and saved it. I know it's possible; for example, the "Salesforce Input" tool does exactly that: you enter your credentials, hit Connect, and it fetches all available objects to choose from. If I add an object in Salesforce and go back into the Alteryx tool, it automatically refreshes the drop-down list and the new object is there.
Thanks, I stumbled across a YouTube video which in essence explained the same thing. The issue was indeed that I populated the Drop Down tool with a data flow within the same macro - this didn't run until the macro ran.
I didn't need a second workflow to fix it, though. As all the data is in a database, I just had to change the Drop Down tool to use an external source and use a SELECT statement as that source. Now the Drop Down tool has no dependency on any other tools.
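As a rough sketch of that idea (using Python's sqlite3 purely for illustration, with a made-up table), the drop-down's values come straight from a query at configuration time rather than from an upstream flow that only runs with the macro:

```python
import sqlite3

# Build an in-memory demo table so the sketch is self-contained
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE environments (name TEXT)")
conn.executemany("INSERT INTO environments VALUES (?)",
                 [("Prod",), ("Dev",), ("QC",)])

# The equivalent of the Drop Down tool's external-source SELECT statement:
# fetch fresh values directly from the database each time
options = [row[0] for row in
           conn.execute("SELECT DISTINCT name FROM environments ORDER BY name")]
print(options)  # ['Dev', 'Prod', 'QC']
```

Because the list is re-queried each time rather than cached, new rows in the table show up immediately, which is the behavior described above.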
Just to take it one step further: in the Salesforce tool I can select an object, which then populates a list box with its columns. Would you know how to populate a second drop-down tool based on the selection in the first?
Thanks again - if possible, I would want to avoid a chained flow to achieve this. The Salesforce Input tool could somehow do it. Unfortunately, with the latest version of Alteryx Designer I can't open the Salesforce Input tool anymore to see how it was done. I know the new tools are Python based, which surely allows for more flexibility.
Please help and suggest whether there is any workaround for my situation, as described below. I have to query data from different databases (for example: DB1, DB2, DB3), but the field names across these databases are exactly the same, so the query remains the same. I want to build a macro such that in the DROP-DOWN I would select the required database, and the query would then run against it.
Is this possible with a macro Drop-Down? Please help.
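The underlying idea is straightforward: one fixed query, with the drop-down selection only choosing the connection. A minimal Python sketch (sqlite3 stands in for the real databases, and the table/connection names are hypothetical):

```python
import sqlite3

# Stand-ins for three real connection strings with identical schemas
CONNECTIONS = {
    "DB1": ":memory:",
    "DB2": ":memory:",
    "DB3": ":memory:",
}

# The query is identical regardless of which database is selected
QUERY = "SELECT id, amount FROM orders"

def run_query(selected_db):
    conn = sqlite3.connect(CONNECTIONS[selected_db])
    # Seed demo data so this sketch runs standalone; a real database
    # would already contain the `orders` table
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.execute("INSERT INTO orders VALUES (1, 9.5)")
    return conn.execute(QUERY).fetchall()

print(run_query("DB2"))  # [(1, 9.5)]
```

In the macro, the drop-down value would play the role of `selected_db`, updating the data source while the query text stays constant.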