www.dataflowmapper.com
Video Transcript
Welcome to a deep dive into DataFlowMapper's Logic Builder. In our previous tutorial, we used a simple Concatenate function to create a FullName field. Today, we're going all in. I'll show you how to use functions, variables, conditional logic, and even our manual Python editor to transform data the way you need. Doing this in Excel would mean a messy, hard-to-read nested formula, and a custom Python script would be a one-off solution. The Logic Builder gives us a reusable, readable, and powerful alternative.
Our problem today is messy, inconsistent data. We have a CSV of transactions where the product's SKU, or Stock Keeping Unit, is a mess. It has inconsistent casing, different separators like underscores and spaces, and some products are missing a SKU entirely. Our goal is to create a perfectly standardized SKU for every single product.
Let's load our source file and pull in the headers we need from our destination template. The transformation we need for our 'SKU' field is too complex for a simple one-to-one mapping. It requires multiple steps: we need to make the text uppercase, replace all the different separators with a consistent hyphen, and handle the blank ones. This is a perfect job for the Logic Builder.
We'll begin by opening the Logic Builder on the field mapping card for the destination field 'SKU'. We'll build this logic step by step. Let's navigate to the Functions tab. Our first step is to make the SKU uppercase. I'll click 'Add Function', name it 'convertToUpper', and select the 'UPPER' function from our library, using the source 'SKU' field as the input. Then I'll click Apply.
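In plain Python terms, that first step is roughly equivalent to the sketch below. The sample row and its values are invented for illustration, and the code the Logic Builder actually generates may be structured differently.

    # Sample source row -- assumed shape and values, purely for illustration
    row = {"SKU": "abc_123 x/1", "FirstName": "Jane", "LastName": "Doe"}
    # Uppercase the source SKU value
    convertToUpper = str(row["SKU"]).upper()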
Now for the powerful part: function chaining. We need to replace the separators, but we want to do it on the uppercased version we just created. So, I'll create a new function called 'removeUnderscores' and choose the 'REPLACE' function. For the first parameter, instead of choosing a source field, I'll select the output of our first function, 'convertToUpper'. This is function chaining. I'll replace underscores with hyphens. Then, I'll chain another 'REPLACE' to handle spaces by following the same process: adding a function, naming it 'removeSpaces', and using the output of our previous function, 'removeUnderscores', as the input.
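Conceptually, the chain we just built reads like this in plain Python, with each step consuming the previous step's output. This continues the earlier sketch and is illustrative only, not the exact generated code.

    # Each chained function takes the previous function's output as its input
    removeUnderscores = convertToUpper.replace("_", "-")
    removeSpaces = removeUnderscores.replace(" ", "-")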
Next, let's use a variable to store this result before we handle the empty SKUs. On the Variables tab, I'll create a variable named 'cleanedSKU' and assign it the final result of our 'removeSpaces' function. Notice how many of the inputs in the Logic Builder have a slider to the left. That slider allows you to switch between entering static text or choosing from the dropdown of source fields, functions, and variables.
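In the equivalent plain-Python sketch, the variable is simply a named handle for the chained result:

    # Store the fully cleaned value under a reusable name
    cleanedSKU = removeSpaces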
Let's create one more function before we build the logic to handle empty SKUs. I'll name this function 'skuEmpty' and search for the 'is_empty' function. It will return true if the SKU is empty and false otherwise.
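A hedged sketch of what that check amounts to in plain Python; the library's actual is_empty function may treat whitespace or other edge cases differently.

    # True when the source SKU is missing or blank
    skuEmpty = row["SKU"] is None or str(row["SKU"]).strip() == ""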
Now we can navigate to the If/Then tab to handle the empty SKUs. If our 'skuEmpty' function returns true, we'll set the final result to 'UNKNOWN-SKU'. Otherwise, in the 'Else' block, we'll set it to our 'cleanedSKU' variable. For reference, a result can be set either within an If/Then block or directly in the Return Result tab. Every transformation must set a result at the end, which determines the output of that field's logic.
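Putting the conditional and the result together, the logic we just built behaves roughly like this sketch. Whether the generated code uses a variable named result or a return statement is an assumption here.

    # Choose the final value for the destination SKU field
    if skuEmpty:
        result = "UNKNOWN-SKU"
    else:
        result = cleanedSKU
    # This value becomes the output of the field's logic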
Everything we've done has generated clean Python code, which we can see right here. Now, let's see the true power of this hybrid approach. Let's say we also want to replace a forward slash from our SKU, but we forgot to create that function. I'll switch to the Manual tab.
Here is all the code that our visual builder created, and it's fully editable. I'm just going to copy and paste one of our previous functions that used 'REPLACE'. I'll paste it after my 'removeSpaces' function, rename it 'removeSlashes', change the space to a forward slash, and pass it the result of 'removeSpaces'. Then, I'll change the line where we set 'cleanedSKU' so it uses the result of our new 'removeSlashes' function. I can click 'Parse to UI', and the system recognizes my manual change and keeps the visual representation intact. You can seamlessly switch between visual building and manual coding as long as the code follows patterns the Logic Builder recognizes. You might have noticed that the generated code preview no longer matches the manual tab once you've made edits; if you click save, the Logic Builder will overwrite the generated code with whatever is in the manual tab. Let's click 'Save Logic' and finish up.
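In terms of the earlier sketch, that manual edit amounts to inserting one more chained step and repointing the variable, roughly like this (a sketch of the change, not the literal generated code):

    # Manually added step: also replace forward slashes with hyphens
    removeSlashes = removeSpaces.replace("/", "-")
    # Repoint the variable at the new final step
    cleanedSKU = removeSlashes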
If you watched our previous video on the basics of DataFlowMapper, you'd be familiar with this template and you'd know we already mapped most of these fields. Instead of doing that all over again, let's save some time and use our AI copilot. We can click 'AI Complete Mapping'. Let's exclude the 'SKU' field since we already created that logic. We'll tell the AI to map our fields and make 'FullName' a concatenation of 'FirstName' and 'LastName'. The AI will analyze the requirements and orchestrate the full mapping. Once ready, you'll see the results of the AI's mapping with an explanation, a confidence score, and code previews. We can approve and apply all of these.
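For reference, the logic the AI proposes for 'FullName' is a simple concatenation, along these lines. This is an illustrative sketch that reuses the assumed sample row from earlier; the field-access syntax and any separator handling in the AI's actual code may differ.

    # Join first and last name with a single space
    FullName = f"{row['FirstName']} {row['LastName']}".strip()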
Now that our mapping is done, we'll save the mapping file and name it, and then we can go back to the transformation engine. Let's click 'Transform'. The result is perfect—5,000 records transformed with no errors. We can review the data, and our 'SKU' is now uppercase, uses hyphens, and the missing ones are clearly marked. It looks like our AI did a great job mapping the rest of the data as well. We just performed a multi-step data cleaning process by chaining functions, using variables, applying conditional logic, and even tweaking the raw Python.
That's the power of the Logic Builder. It's a complete toolkit that grows with your needs, from simple visual steps to complex, custom code. You can see all the functions available in the function library in the app or by visiting DataFlowMapper.com/documentation. To start building your own powerful transformations, sign up for a free trial at DataFlowMapper.com.