www.dataflowmapper.com
Video Transcript
Welcome to DataFlowMapper. Today, we'll be doing an overview of our AI Copilot and all of its features. To demonstrate the Copilot's capabilities, we'll be tackling one of the biggest time-sinks in data onboarding: interpreting technical requirements and manually building out every single mapping and transformation for an import. I'm going to show you how our AI can take a standard import specification document and build a complete, production-ready mapping to get clean, validated, and transformed data in just a few minutes.
Here's a common scenario. We have a source file with customer data that's pretty messy. Fields are missing, formatting is inconsistent, and some records contain bad data we don't want imported. Our destination system has strict import requirements, and we have an import specification sheet here that outlines them. It specifies everything: required fields, data types, formatting rules like email and phone number validation, and the specific allowed values for fields like 'account_status'. Feel free to pause the video and review the document.
Manually, this is no small task. You'd have to go field by field, create logic for formatting, set up validation rules, and hope you didn't misinterpret anything. It's a tedious process if you don't already have a similar mapping or existing validations for the import format. We're going to do this the fast way. Let's upload our source file, click 'Create Mapping', and upload our destination template to get started.
This is where the magic happens. We're going to use our most powerful AI feature first: 'AI Complete Mapping'. Instead of simple instructions, I'm going to copy the entire contents of that import specification sheet and paste it directly into the prompt. We're asking the AI to act like a senior implementation specialist and figure it out from scratch to get us transformed, validated data that's ready for import. I'll tell it to analyze the fields, source data, and the provided spec sheet to create any mappings, logic, filters, and validations that it deems necessary.
Now, DataFlowMapper's AI is reading that document just like a developer would. It's identifying the required fields, understanding the data types, and, most importantly, translating the rules into actual transformation logic and validation checks. Let's give it a moment and let the AI go to work.
And here we go. The AI returns a complete plan. It correctly mapped the one-to-one fields and created the necessary validations, transformations, and filters. It built 'full_name' from our source 'FirstName' and 'LastName' fields, and it generated custom logic for 'account_status' to ensure it's one of the required values. It even created a validation rule for the 'contact_email' field. This is incredible. And this isn't a black box: we can review each suggestion, see the code it generated, and the Copilot even provides a rationale for each transformation. We have full control to approve or decline each suggestion. The AI did the heavy lifting and generated clean, readable Python that fits right into our Logic Builder.
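The generated code itself isn't shown on screen, but as a rough illustration, field logic of this kind often looks something like the plain-Python sketch below. The field names mirror the demo; the ALLOWED_STATUSES set and the "pending" fallback are assumptions, since the spec sheet's exact value list isn't shown in the transcript.

```python
# Hypothetical sketch of the kind of per-field logic the Copilot generates.
# ALLOWED_STATUSES and the fallback value are assumptions for illustration.
ALLOWED_STATUSES = {"active", "inactive", "pending"}

def full_name(row):
    # Combine the source FirstName and LastName into the destination full_name.
    return f"{row.get('FirstName', '').strip()} {row.get('LastName', '').strip()}".strip()

def account_status(row):
    # Normalize casing/whitespace and fall back to a default when the
    # value isn't one of the allowed statuses.
    value = row.get("AccountStatus", "").strip().lower()
    return value if value in ALLOWED_STATUSES else "pending"
```

Each of these would correspond to one suggestion in the plan, which you could approve, tweak in the Logic Builder, or decline.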
The 'Complete Mapping' feature did 95% of the work. But what if we need to refine just one field? Let's look at 'internal_notes'. I'll open the Logic Builder for the 'internal_notes' field. When opening the Logic Builder, you might get a warning that the code couldn't be parsed. That just means it can't be represented in the no-code interface; it will still run properly during transformation and can still be edited in the manual tab.
Say we want to prepend every note with 'Internal Note:'. We can use the AI Logic Assist for this.
Inside the Logic Builder, I'll click on 'AI Logic Assist'. My prompt will be simple: 'Prepend the text "Internal Note: " to the output of this field's transformation logic.' The AI generates the exact Python code needed using our CONCATENATE function. I can apply it and save, and now that specific field's logic is updated without affecting anything else. This shows how you can go from a massive, file-wide transformation to homing in on a single field. This works for filters and validations as well.
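For illustration, a plain-Python equivalent of that one-field update might look like the sketch below. The actual generated code uses DataFlowMapper's CONCATENATE function, whose exact signature isn't shown in the video, and 'internal_notes' as the source key is an assumption.

```python
# Hypothetical plain-Python equivalent of the CONCATENATE-based logic the
# AI generated; the source key "internal_notes" is an assumption.
def internal_notes(row):
    # Prepend the label to whatever the field's existing transformation
    # produced (here, the raw note text stands in for that output).
    note = row.get("internal_notes", "")
    return "Internal Note: " + note
```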
Let's save our mapping file, go back to the Transformation Engine, and click 'Transform' to see the final output of the AI's mapping.
And the result is perfect. The data is clean, the 'account_status' is standardized, and if we had an invalid email, our filter and validation rule would have caught it. We just automated what would have been hours of tedious, manual work by letting the AI read the documentation for us.
We went from a raw data file and a technical document to a complete transformation with business logic and validations in minutes. But what if you don't have a complex technical document and just need to align two files with similar column names? That's the perfect job for our AI Copilot's 'Suggest Mappings'.
I'll clear out our mapping so we have a clean slate. With our source and destination fields loaded, I'll click 'Suggest Mappings'. The AI analyzes both sets of headers and instantly pairs the obvious matches, like 'name' to 'full_name' and 'email' to 'contact_email'. It's less powerful than 'AI Complete Mapping' since it doesn't create logic, but it's perfect for handling the straightforward 1-to-1 mappings in seconds or knocking out the low-hanging fruit when building a new mapping.
That's the power of AI in DataFlowMapper. It's not just about mapping fields; it's about understanding requirements and automating the entire process. To try this yourself, sign up for a free trial.