I’ve been using Alteryx now for a little over 2 months and have gotten quite comfortable with the tools in the first few collections (e.g. In/Out, Preparation, Join, etc.). While I haven’t used all of them, I can quickly throw together a workflow that connects, joins, parses, filters & summarizes data and then outputs a tidy TDE for my analysis. But for some reason the Interface tools, and the macros and apps they enable, have confused me. Unlike my experience with the other tools in Alteryx, all of my initial attempts to use the Interface tools failed.
I’ve read numerous community posts and watched several videos about how to set up a macro, and even had a couple of one-on-one meetings with Mac Roach, but my synapses just weren’t making the connections I needed them to…until this week, when the right amount of laziness, stubbornness and OCD combined to break through my mental block. So for those of you who may be similarly confused by how to set up macros and apps in Alteryx, I offer you my experience in the hopes it shortcuts your learning time.
At our company, business users like myself don’t have direct access to our data warehouse tables but have to query the data via Business Objects universes. The intent is for people to do all of their analysis inside of Business Objects, but there’s no way I’m going to do that if I have a tool like Tableau at my disposal. So my team has set up dozens of queries in Business Objects, which delivers the data to us in the form of multiple Excel files saved to a network directory. We then use Alteryx to scrub, augment and union these files together into TDEs.
To make the Alteryx workflow run faster, we tend to copy the files from the network directory over to a local directory (e.g. our laptop or a shared team directory). I initially did this inside of Windows Explorer (copy/paste) but learned how to build a workflow in Alteryx to do all of that for me. Its key steps are:
- Identify a source and destination directory, and the file naming structure of the files I want to copy.
- Join the source and destination directory inputs by file name. There will generally be the same number of files in both directories (since the main purpose is just to update existing files), but I added a couple of steps to accommodate cases where the number of files isn’t the same, or when there are no files yet in the destination directory.
- Check the time-stamp of both the source and destination files and tag the source files that are newer than their destination counterparts.
- Create a command formula that will be sent to the Run Command tool to execute a copy/paste of the new files.
- Copy/paste the new files from the source to the destination directory.
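For readers who think better in code, the steps above can be sketched as a short script. This is a minimal Python sketch of the same logic, not the actual Alteryx workflow; the `*.xlsx` file spec and the idea of returning the copied file names are my assumptions for illustration:

```python
import shutil
from pathlib import Path

def copy_newer_files(src_dir, dst_dir, file_spec="*.xlsx"):
    """Copy files matching file_spec from src_dir to dst_dir whenever the
    source copy is newer than its destination counterpart (or missing there)."""
    src, dst = Path(src_dir), Path(dst_dir)
    # Accommodate the case where the destination directory is empty or missing.
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for src_file in src.glob(file_spec):
        dst_file = dst / src_file.name
        # Tag source files that are newer than their destination counterparts.
        if not dst_file.exists() or src_file.stat().st_mtime > dst_file.stat().st_mtime:
            shutil.copy2(src_file, dst_file)  # copy2 preserves the time-stamp
            copied.append(src_file.name)
    return copied
```

Because `copy2` preserves the source time-stamp, running the routine a second time copies nothing until a source file is actually refreshed, which mirrors the "only copy what's newer" behavior of the workflow.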
These steps are generic enough that I found myself using this same workflow for a number of different data projects, manually changing out the directory and file naming specs each time. I knew that this workflow was a perfect candidate for a macro, since only Step 1 (i.e. the inputs) ever changed. What I wanted to be able to do was to select the project I was working on and then have the appropriate directory and file spec variables passed into this routine.
Based on what I had read, I would first need to turn the copy/paste workflow into a macro that I could then call from an analytical app (which is where I would be able to select the project I wanted to run the workflow for).
Building the Macro
All of the tools needed for building a macro (or app) are found in the Interface collection of tools.
The only thing I need to do to turn this into a macro is add steps that will allow me to feed different values into the two directory tools. There are two tools that are needed to enable that:
- Control Parameter – this tool defines the configuration parameter(s) the macro exposes when it is called from any other module. In other words, it tells the user of the other module what this macro needs fed into it. So the Label setting (the only setting the Control Parameter has) should be clear and descriptive (e.g. “Source Directory”, etc.).
- Action – this is what takes the value entered into the Control Parameter and passes it into the macro workflow.
Since I have two Directory tools in my workflow, and each Directory tool needs two inputs (a directory folder and a file spec), I will need to set up my interface tools to capture four variables. Since the file spec is the same for both the source and destination directory, I only need three Control Parameters (i.e. inputs) but four Action tools, since I need the same file spec variable passed to two different Directory tools.
As stated before, the Control Parameters are very easy to set up. Just give them a descriptive label. The Action tools, while also easy, are a bit more involved. Ultimately, though, all we need to do is tell each Action tool how to replace the values that are currently in the Directory tool.
To do that, we keep the default action type (Update Value) and then select which of the Directory tool variables we are replacing. In the example below, we are going to be replacing the directory variable. So we select that and then check “Replace a specific string”. Now this Action tool is set up to take whatever value is passed through the Control Parameter and feed that into the Directory tool, replacing the value that was put there initially.
We do the same thing for the Action tool that updates the File Spec; the only difference is that on this Action tool we select the FileSpec variable.
Once we’ve set up the four Action tools, we can save this module as a macro and move on to the next step.
Calling a Macro from an App
To add our macro to a new or existing workflow, just right-click on the canvas and select Insert -> Macro. We just need to browse to where we saved our macro and select it. It is added to the canvas as a lovely black (or sometimes blue) circle. I haven’t figured out what the colors mean, but it doesn’t seem to matter.
Initially it will be angry (red exclamation point!) until it is configured. In the Configuration window, we can see the three Control Parameters we set up in the prior step – destination directory, source directory and file spec. It is asking us to Select a Field, but there is no input stream yet that will give us a field to select. So we need to add one.
There are a number of ways to add an input to a macro, but since I want to be able to pass different values based on the project I select, I just need to set up a template input. And the easiest way to do that (in my experience) is via the Text Input tool. Just add a new Text Input tool to the canvas and add three columns, one for each of the variables we need to pass to the macro. It doesn’t matter what values we place in each of those fields (since they aren’t going to actually get used), but for clarity I like to add values that look like what I’ll be using.
Once we connect the Text Input tool to the Macro tool, we can then assign the fields in the Text Input tool to each of our macro parameters.
So far so good. But now we need to override the Text Input fields with the values we actually want passed to the Macro tool. To accomplish that we will again utilize the Action tools, only in a slightly different way.
Before we add the Action tools, though, we need to add a tool that lets the user make a selection that will determine which directory and file spec values to use. There are drop-down menus, list menus, radio buttons, etc, all of which can be used to present the user with a series of choices. In this case, I am going to use the Drop Down tool since I want to be able to select a single value from a list of several options.
The Drop Down tool can be configured to display lists via a number of different sources, but for this example I just want to type in a list of project names that I can choose from. The “Manually set values” option lets me do that.
After that I need to add one Action tool for each of the three configuration settings for my macro. Unlike before, where we used the default action type of “Update Value”, we need to select “Update Value with Formula”. This will present us with options for what we want to update. Under the Data menu, you will see the three values that we added to our Text Input template. Those are the values we want to replace. Select one of them and then click the little button to the right of the Formula bar at the bottom. This will open the Edit Formula window, where we can enter the formula we want to use. In this case, I want a different source directory to be used depending on which project is selected in the drop-down. A simple If statement will do the trick.
if [#1] = "Waterfall" then
elseif [#1] = "Project Connect" then
elseif [#1] = "PLSID" then
else "" endif
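The same branching reads naturally as a lookup table if you're sketching the logic outside Alteryx. Here's a minimal Python version; the project names come from the drop-down above, but the directory paths are hypothetical placeholders, not the real ones:

```python
# Map each project to its source directory (paths are hypothetical examples).
SOURCE_DIRS = {
    "Waterfall": r"\\share\data\waterfall",
    "Project Connect": r"\\share\data\connect",
    "PLSID": r"\\share\data\plsid",
}

def source_dir_for(project):
    # Mirrors the If/elseif/else formula: unknown projects fall back to "".
    return SOURCE_DIRS.get(project, "")
```

A lookup table like this also scales a bit more gracefully than a chained If statement as the number of projects grows.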
After we’ve done that for all three Action tools, we have a workflow that looks like this:
To run this app, we just click the magic wand button next to the green “play” button. A window with the drop-down menu will be presented, where we can make a selection and then click the Finish button to run the macro.
Tethering a Second Module to the App
So now I have an app that can kick off the Copy Files macro and feed it different directory and file spec variables based on the project I am working on. Pretty cool. To make this even better, I’d like to kick off the module that unions the files and generates the TDE once the files have been copied to the destination directory. That way I can get all of my data prep done with a single click of a button.
For this to work, I only need to expand upon what we’ve already built. The key addition is another macro that can kick off a module based on an input parameter. I’ve tried to get this to work with the Run Command tool, but it’s been very inconsistent. Luckily, there is a set of macros available from Adam Riley (and others) that includes a Conditional Runner tool. This tool has a single parameter, namely the location and name of the module we want to run. Since the module will differ based on the project I’m working on, I need to wrap the Runner tool inside of another macro so that I can change the module that is run.
Just as with the first macro, the key steps are:
- Add a Control Parameter with a descriptive label (Module to Run).
- Add an Action tool to take the Control Parameter value and pass it to the Runner tool.
- Add a Text Input tool that has a single field (“Module Path”) that serves as the input template.
Once that macro is saved, it can be added to the App workflow by right-clicking and selecting Insert Macro. To accommodate this second macro, a few enhancements to the App workflow are required:
- Add a field to the Text Input tool for “Module to Run”. This will then be connected to the Run Module macro’s Control Parameter.
- Add a fourth Action tool that will generate the “Module to Run” value using a formula similar to the ones in the other three Action tools, e.g.:
- Add a Block Until Done tool (the tool with the big grey hand in the middle) and attach the two macros to it, with the Copy Files macro attached to the first output and the Run Module macro attached to the second output. The Block Until Done tool will make sure that all of the files are copied before the module that uses them is run.
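In script terms, what Block Until Done enforces is just sequential execution: the dependent step doesn't start until the step before it has fully finished. A tiny Python sketch of that contract (the step names are illustrative, not part of the Alteryx setup):

```python
def run_in_order(*steps):
    """Run each (name, callable) step to completion before starting the next,
    mirroring the Block Until Done tool's output ordering (first output,
    then second)."""
    completed = []
    for name, step in steps:
        step()             # blocks until this step finishes
        completed.append(name)
    return completed
```

Wiring the Copy Files macro to the first output and the Run Module macro to the second is the canvas equivalent of calling these steps in that order.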
Now I have a single, simple module that I can use to kick off all of my key data prep workflows. I just select the project I’m working on and go do something else while Alteryx copies the requisite files, does the data preparation and generates the TDE. No more need to open up multiple workflows or run them independently.
I suspect there are other ways to achieve this (Alteryx Server being one of them), but I am really happy with the efficiency this has added to my daily and weekly routines and that (once my brain connected the dots on how macros worked) it didn’t take very long to set up.
Thanks for reading.