Scale MarTech Systems and Teams with Test Automation: Stack Moxie & AvePoint Webinar
AvePoint’s Tech Stack
[00:00:00] Neil Harrington: My name’s Neil Harrington. I’m the VP of sales and alliances at Stack Moxie. We’re an automated testing and monitoring platform for your marketing and sales tech stacks. We work with fast-growing companies to ensure they have quality throughout their marketing and sales tech.
I’m joined by Dawit Tesfaye, the director of marketing technology at AvePoint, one of Microsoft’s largest partners. It feels like every year they win Partner of the Year at Microsoft. They have solutions across the spectrum to help you migrate, manage, and understand your Office 365 instance.
They’re a phenomenal organization. I’m excited to have Dawit here today because as we’ve been working together, watching him grow his organization and understand how he thinks about technology and integrations and ensuring everything’s working as expected has been really eye-opening to me.
I think he’s one of the premier thought leaders on how to do that and has a great eye for where the future is going. Also, more importantly, on how to make sure analytics are finished at the end of the quarter and that they’re all correct. So with that, I will turn it over to Dawit. Take it away.
[00:01:08] Dawit Tesfaye: Thank you. I truly appreciate that. That introduction is going to be hard to live up to. So I’m going to share my screen real quick so that you all can follow along. Awesome. So before we go through the agenda, one of the key things that I want to do real quick is just go over what I’ve been going through the past couple of days.
Right? So over the past two days, for those that don’t know, the MarTech conference has been going on, and I’ve been sitting in on as many sessions as I can. It’s been great so far, I’ve been learning a lot, and it’s just been a wonderful experience. Not to knock the conference, but one thing I have noticed is that I’ve heard a lot of people just talk about data in general.
[00:02:00] A lot of them focus on bad data and how detrimental that is to achieving your goals. While I agree with them, and I think all the points they’ve made have been very valid, the one thing I don’t really hear is them identifying the root causes of that bad data and the different tools and technologies out there that can help automate and scale identifying those root issues. So one of the things I hope you get from this presentation is a sense of what we’re doing at AvePoint in regards to how we’re identifying a lot of the root causes of that bad data, and how we’re using Stack Moxie as our tool of choice for the automation of our entire QA platform.

Real quick, one of the things I noticed from the keynote at the end of the sessions yesterday: Scott Brinker (for those of you who don’t know, the person who is essentially behind MarTech) said the big thing about big ops is that it’s not just the data that’s growing, it’s the interactions that every single one of us as an individual is having per day with data. He went on to say that’s actually growing even faster than the data in the world. The reason that stood out to me, and one of the key reasons I wanted to read it out to you, is that again, you see that big focus on data. You’ll see that throughout this presentation, but I want to take a different twist and focus more on the quality aspect of that data. Not just why bad data is bad for your entire tech stack and your ability to achieve your goals, but also what you can do to identify where that bad data is coming from.
And actually just plug all of those leaks and holes and whatnot. So that being said, what we’re going to be going over today is essentially:
- Why QA is critical for your MarTech stack
- How we at AvePoint solved a lot of our LinkedIn marketing attribution gaps
- How we resolved a few GDPR errors that helped us get more opt-in contacts
And then, one key focus for me: I’ve been hearing it a little bit in some of the sessions at the MarTech conference, that there is a focus on people, whether that’s external or internal. Right now, I’m going to be talking a lot about the internal aspect, and you’ll see a little bit more in my presentation on the external aspect. Internally, it’s about creating career advancement opportunities. People hear “automate,” and they assume job losses or things of that nature, right? “The robot is going to take my job” or “you’ve automated all of my tasks,” but that’s not necessarily the case. The case here is that I want to automate the redundant tasks that people do in order to free up their time, so that they can actually spend more of it doing insightful analysis, being proactive, and looking into what I call innovation.
And that’s one of the pillars of the MarTech department that we’re building out at AvePoint. So that being said, I’m going to move on to the next slide. Neil did a very good job of introducing AvePoint and giving you guys a big overview of what we do. We are one of Microsoft’s largest 365 data management solutions providers. We offer our full suite of SaaS solutions to migrate, manage, and protect that 365 data and other data points as well. We have over 8 million users, and a big focus of ours is digital transformation, governance, and compliance.
With all that said, why does that matter, other than a shameless plug for AvePoint? I believe that’s the context for what we will be discussing throughout our conversation today. We at AvePoint are in the B2B marketing space, and as such, we typically fall into that mindset of selling to other organizations, and we tend to call those accounts.
That’s not new for anyone that’s been part of a major sales organization. The reality in this space is there tends to be more than just one buyer in a long, drawn-out purchasing cycle. Each account or organization could have multiple people within the buying process, and all of those people play a different role, serve a different purpose, and perform different actions, whether that be on your website, your app, or any other platform that your customers are using. They’re all going through a very different evaluation process. The CIO might be going through their own set of evaluation checklists that they need to check off, which will be completely different from an IT manager who’s in the weeds and has a completely different checklist.
[00:07:00] So that being said, we need to capture each and every one of their data points accurately. And in order to provide them with what they need and target them efficiently, that data has to be accurately collected. That being said, QA then plays a major role for us in our marketing organization as we move more toward data-driven strategies to effectively target and capture our audience.
And with that, our objective then becomes managing this customer data, right? So when we’re looking at the overall MarTech landscape, the common problem that we see today is navigating customer data within a fragmented technology environment. There are certain things that we need to know about our audience, right?
- Who are our customers?
- What actions are they taking?
- Where are those actions taking place?
Our initiatives at AvePoint are then to develop a data model that supports the customer journey, and in order to do that, we first have to build a solid data foundation, which relies on clean data.
[00:08:00] Therefore, QA tools like Stack Moxie become critical for this initiative to be successful.
I’m going to pause on that point real quick, just because I think it’s a very important one. For a long time at AvePoint, we’d been doing manual testing. And what we found is that yes, we do find a lot of errors, but we can’t really scale that. We don’t have the manpower and resources to go ahead and manually test everything. So, when one of my colleagues brought Stack Moxie to my desk, a bright light bulb went off. It was just like, yes, this is exactly what we need: something that can automate a lot of the testing that we do, do it at scale, and run multiple different types of tests. I’ll go further into those, but I did want to make that point at the beginning of this conversation.
[00:09:00] Neil Harrington: Let me ask you about that, by the way. So, when we were chatting you had said, “Hey, I was actually thinking about building sort of a testing solution or scripts myself.” How was that decision process? Walk me through what you were thinking, because I think a lot of people want to build that, right?
When M.H. Lines, our CEO, and I were at Microsoft, we wanted to go build it. Right? Microsoft would have given us money to go build it. And it was like, “Oh, we have our day job that we’re already working a ton on, and we can’t go build this. And we’ve got to steal an engineer, which isn’t really going to happen.”
But even at Microsoft, we had the budget, and we couldn’t do it.
Dawit Tesfaye: Yeah, absolutely. And I think that’s a very good question. It’s the classic build-versus-buy debate, and I’ve actually been having this debate a lot internally. At the beginning, yes, I did want to build it, mainly because I did so much of the testing manually that I knew what I wanted to find.
The issue there became what you said: it needs my time, I need to borrow engineers’ time, and all of that. And so when you take a step back and look at the bigger picture of things, is that really feasible? Can we spare some of the devs from the team to go ahead and start building this application for us to use for testing?
And then the biggest aspect of the build-versus-buy debate that a lot of people gloss over is the maintenance. Right? You’re going to have to update it. You’re going to have to make sure all the API connections are always working, that they’re pulling the data accurately and working correctly, and all that.
And if there happens to be a problem, that problem becomes yours to solve. So again, looking into that, I had a couple of tools in mind for robotic process automation and things like that. But when my colleague mentioned Stack Moxie and I looked into it, I found that it really goes deep into everything. For example, we use Dynamics as our CRM, and we have Marketo as our marketing automation platform.
Stack Moxie is able to go in and actually perform all of the tests that I had done manually in the past. It does that within Marketo and within your CRM, and they also have integrations for other platforms that you’re using. So I sat there and looked at it, and the time to implementation was very short when it came to purchasing Stack Moxie. Those kinds of things went into the deciding factor as to why we decided to go with Stack Moxie, as opposed to building something that probably would have taken a lot longer and had a lot more issues before it became even good enough for a beta version, let alone a full release.
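As an editor’s aside, the kind of cross-system check being described here (verifying that a lead looks the same in the marketing automation platform as in the CRM) can be sketched in a few lines. This is a hypothetical illustration, not Stack Moxie’s actual implementation; the records would normally come from each platform’s API, and the field names are invented.

```python
# Hypothetical sketch: compare a lead's fields as they appear in two
# systems (marketing automation vs. CRM) and report any mismatches.
# Real tools do this over live API calls; here the records are plain
# dicts so the comparison logic is easy to see.

def diff_lead(marketo_record, crm_record, fields):
    """Return {field: (marketo_value, crm_value)} for fields that disagree."""
    mismatches = {}
    for field in fields:
        m_val = marketo_record.get(field)
        c_val = crm_record.get(field)
        if m_val != c_val:
            mismatches[field] = (m_val, c_val)
    return mismatches

if __name__ == "__main__":
    marketo = {"email": "jane@example.com", "utm_source": "linkedin", "country": "US"}
    crm = {"email": "jane@example.com", "utm_source": "web", "country": "US"}
    # The utm_source drifted between systems; a monitor would alert on this.
    print(diff_lead(marketo, crm, ["email", "utm_source", "country"]))
```

A monitor built on a check like this would run on a schedule and alert whenever the mismatch set is non-empty.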
[00:12:00] And so, we’re moving very fast in our go-to-market strategies. I needed something just as fast to get built into our MarTech stack so that we could go ahead and progress on our go-to-markets.
So one of the first key things that I did at AvePoint, prior to taking the role of director of marketing technology, was perform a MarTech gap analysis. From our gap analysis, we identified two key issues that showed data and attribution not being set up correctly.
And for us to achieve our objective of customer data management, we need to have a solid data foundation, collected and processed in a centralized location. We knew from this gap analysis that we were not able to do that with our data.
[00:13:00] In the past, we’ve typically used band-aid solutions like adding middleware or point solutions, and that resulted in what we call runaway stack complexity.
You typically see stacks with 50-plus point solutions, essentially becoming a nightmare to manage. We also saw a large volume of unusable, siloed data; for example, different sets of contact information for the same people. So you have myself having a couple of records in CRM, a couple of records in Marketo, and a couple of records in another platform.
And again, that becomes a nightmare, right? Are you targeting the right record of mine? Which one shows up in a list that you’re creating in order to send out an email blast, for example? And so another question that came up is, with all these siloed platforms, systems, and tools, who owns the data?
For example, in CRM, is it marketing or is it sales who owns that data? So in order for us to go ahead and build this data model that supports the customer journey, we know that our attribution setup will have to be done correctly. And our gap analysis has shown that there are definitely some holes, especially with certain platforms. For example, LinkedIn does not always pass accurate values to Marketo, etc.
I mean, we’ll go further into the LinkedIn part, but they don’t really play friendly with too many other platforms. And so again, this was another key light bulb moment where I was like, all right, Stack Moxie, you can go ahead and help us routinely monitor these values; and not just monitor, but also test to see why and where the bad data is coming from.
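To make “routinely monitor these values” concrete, here is a small hedged sketch (the URLs and field names are invented for illustration) of checking whether the `utm_source` recorded on a lead’s landing URL matches the channel the spend actually came from:

```python
# Hypothetical UTM check: flag leads whose recorded landing URL is
# missing utm_source, or carries a utm_source that disagrees with the
# channel the spend came from (e.g. LinkedIn ads).
from urllib.parse import urlparse, parse_qs

def utm_source(url):
    """Extract utm_source from a URL's query string, or None if absent."""
    params = parse_qs(urlparse(url).query)
    values = params.get("utm_source")
    return values[0] if values else None

def flag_bad_attribution(leads, expected_source):
    """Return the leads whose utm_source is missing or wrong for this channel."""
    return [lead for lead in leads
            if utm_source(lead["landing_url"]) != expected_source]

if __name__ == "__main__":
    leads = [
        {"email": "a@example.com", "landing_url": "https://example.com/?utm_source=linkedin"},
        {"email": "b@example.com", "landing_url": "https://example.com/?utm_source=google"},
        {"email": "c@example.com", "landing_url": "https://example.com/"},
    ]
    # Two of the three leads bought through LinkedIn would be misattributed.
    print(flag_bad_attribution(leads, "linkedin"))
```

Run periodically against leads imported from a paid channel, a check like this surfaces exactly the mislabeling discussed later in the session.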
And so one of the key outcomes of that marketing gap analysis is that we need to come up with a framework that actually solves our problems. Going away from point solutions and adding middleware to solve one issue at a time, we need a full framework that solves all the issues.
[00:15:00] You can summarize all three of the goals you see on the slide, data collection, data integrity and quality, and data activation, in just one sentence: we want a data hub that can centralize data from multiple sources, clean it, and unify it.
That will then give us the ability to segment that data and turn around and send it back to our multiple marketing platforms for activation purposes. Data integrity and quality is essentially the key aspect of this slide, right? This not only enhances our marketing capabilities to pull in new prospects or leads, but also improves our current customer experience by better serving the needs of our current customers.
Therefore, customer experience becomes the end goal of this framework. By following these three objectives of data collection, data integrity, and data activation, we’re making it easier to give leads and prospects what they need in order to make a decision about our products.
And we’re also able to then target accurately and better serve our customers and our prospects by doing that.
Neil Harrington: I don’t want to get too much into the weeds. So are you guys pushing this to a data warehouse or a CDP? Somebody comes in from your website and goes to Marketo; are you then replicating that in a data warehouse or a CDP, and then pushing it back to data augmentation tools and then over to your CRM?
Dawit Tesfaye: That is a perfect question. Yes, actually, we’re doing all of that. So we are pushing all of this data into a CDP, and from the CDP, we’re pushing it into a data warehouse. So we’ll have the data in both: one copy in the warehouse and one in the CDP. And then from the CDP, we’re also pushing it back to our marketing platforms.
So the data comes in from Marketo, and we’re able to push it right back to Marketo after it’s been cleaned, synthesized, and all of that. We’re able to push it back to Google Analytics, back to Dynamics, and to all our different other marketing platforms.
Neil Harrington: And then my guess is you’ve got a data analytics team that, since you’re a Microsoft shop, is using Power BI across everything. Because I think the CDP and the data warehouse are the most talked-about things in marketing operations right now, and have been for three years.
It might be for another half-decade. So yeah, I just wanted to understand how that stack looks.

Dawit Tesfaye: Yeah, absolutely. And the CDP is critical to the stack, as it is the backbone of the entire marketing tech stack. For us, the reason we’re pushing to a data warehouse is for our BI teams to be able to apply advanced analytics to that data, as well as machine learning.
And that will, again, help us serve our customers: give them a better experience on our website and give them exactly what they need, whatever that may be.
Ariel Sasso: So, when performing a gap analysis, how did you determine what the key metrics were that you should focus on?
Dawit Tesfaye: That’s a really good question. When we were looking at our gap analysis, we essentially took a step back. We looked at our overall KPIs for the marketing organization in general.
And we assessed each and every one of those KPIs. Whether it be converted leads, CPAs, sessions on the website, bounce rates, a lot of those vanity metrics, we looked at those and asked, okay, how are we collecting this data? What technology do we have in place?
That’s giving us the ability to collect each and every one of these data points. And from that perspective, we worked backwards: how is that data flowing through the entire tech stack? We were looking for certain gaps or breaks where there could be potential data leaks, and that’s essentially what the gap analysis was looking to identify: those specific gaps. For example, looking at ROI as a KPI, how are we arriving at that specific metric? Are we using an attribution method, and which attribution method are we using, whether it be last touch, first touch, or whatnot? And going even further into that, how are we building that model?
Are we using UTM parameters and things of that nature, and where are we using them? So going backwards, it’s like, all right, where can those UTM parameters break? And that becomes a key part. So, for example, one of the gaps we identified was with LinkedIn: it doesn’t always send the correct attribution value for the UTM parameters.
And in that case, that affects the ROI for LinkedIn. One of the examples that we solved for was that some leads were inaccurately tagged in our Marketo system. They were coming in from LinkedIn, but they didn’t have the LinkedIn parameter in their UTM values; they had a completely different one.
And so we were attributing them to something else. Meanwhile, our CPA in LinkedIn looks huge, and we’re not really showing a good return for it, because we are inaccurately labeling these leads as something other than LinkedIn. Solving for those gaps is essentially the whole idea behind the MarTech gap analysis.
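The ROI effect described here can be seen with a toy attribution model. This is an editorial sketch, not AvePoint’s actual model: credit each lead’s journey to a channel under first-touch or last-touch rules, and a single mislabeled `utm_source` visibly shifts credit away from LinkedIn.

```python
# Toy first-touch vs. last-touch attribution over each lead's ordered
# list of touchpoint channels. A mislabeled source changes which
# channel gets the credit, which is how a channel's CPA can end up
# looking worse than it really is.
from collections import Counter

def attribute(journeys, model="last"):
    """Count one credit per journey: first touch or last touch."""
    credit = Counter()
    for touches in journeys:
        if not touches:
            continue
        channel = touches[0] if model == "first" else touches[-1]
        credit[channel] += 1
    return credit

if __name__ == "__main__":
    journeys = [
        ["linkedin", "email", "webinar"],
        ["linkedin", "email"],   # correctly labeled
        ["web", "email"],        # actually LinkedIn, mislabeled as "web"
    ]
    print(attribute(journeys, model="first"))  # "web" steals a first-touch credit
```

Comparing the two models on the same journeys also shows why the choice of attribution method itself changes the reported ROI per channel.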
Ariel Sasso: That’s a great answer.
[00:21:00] Neil Harrington: Yeah. What I was going to say is, in DevOps there’s this concept called observability. The idea is that you just assume everything’s broken as its natural state, that it’s not all working correctly, and then you verify. And you’re going to say, well, that makes sense.
But I think as we talk to higher-ups or other people within the organization, they forget about that. We can joke that our MarTech stack is glued together with baling wire and duct tape, but what does that actually mean for the people working through and doing the gap analysis of where things are breaking? It means adopting the mindset of assuming it’s broken and then coming back to verify.
So I think that was a good answer.
Dawit Tesfaye: Yeah. We’re definitely trying to do more of that at AvePoint; we’re just observing how the data’s actually flowing right now. Because what we found is, when we assume that it’s working correctly, we’re obviously biased.
We want it to be working correctly, so it’s hard to find those specific gaps at that point. Right now we’re taking a step back, and we’re like, okay, let’s just look at the actual flow, observe it, and see how it’s actually moving versus what we think it’s doing.
And that’s helped us out tremendously. When we do find these gaps, a lot of these different findings actually help set up our testing in Stack Moxie, because now I know what errors I should be looking for. I can also set up broad tests in Stack Moxie where maybe I don’t know the specific error, but I do know there is something wrong over there.
So let me just go ahead and set up a test that pulls in all the different values, without going over limits, and that also gives me an idea of, okay, I knew there was an error here, but wow, there was a different error over there, so we have to address that issue as well. And then, just to give you guys a brief overview of our tech stack.
I mean, this is a part of it; it’s definitely not all of it, but these are some of the major players. Stack Moxie, again, is central to making sure that we find a lot of the errors and that we fix them. Then there’s the part that’s going to be critical, the people part, which we’ll get to later. Dynamics is our CRM of choice.
And then Marketo is our marketing automation platform. Asana, which we’ll get into in a little bit, is for that people part. We also have a privacy tool, a very stringent one, and it has caused some issues in data collection, but, again, we’re able to identify some of those issues with Stack Moxie and then find creative ways around them.
Segment, as mentioned before, is our CDP, and that’s critical to the entire framework. Segment does a lot; we’ll be using the full power of it.
[00:24:00] And then LinkedIn is key to our lead prospecting strategies as well. One of my big goals when building this MarTech stack is essentially to build a technology stack that supports the overall business objectives and goals, while also enabling marketing teams to get higher-quality leads by modeling behaviors and patterns and understanding what users and customers are actually doing within their journey.
And providing what they need at the right moment; I think that’s critical. Again, the whole observability aspect of it is looking at the customer journey as it’s actually happening, versus going in with a bias, thinking: okay, someone comes to the site, downloads an e-book, is excited about the e-book, a salesperson contacts them, and they buy one of our SaaS solutions.
That’s an assumption, and it doesn’t always go down that way. The observability aspect of it is big for us. And then the other thing regarding our tech stack is the execution of it. The trick to doing this effectively is to build a platform-based infrastructure on a solid data foundation.
One that eliminates silos, since that was a big part of our gap, streamlines our data integration, and finds solutions to our existing and future business goals. So that is basically what I have been tasked with solving. We’ll walk through a few of those points.
And so now that comes out to our objectives: we want to centralize our systems, we want to find all this stuff that’s falling through the cracks, fix broken attribution, and then essentially go through a digital transformation. One of the things that I wanted to note here on this slide is that these objectives help us achieve three main themes from our marketing mandate: performance, impact, and loyalty. For performance, we’re focused on optimizing marketing campaign performance across every campaign, channel, audience segment, product line, and market. When we’re looking at impact, we’re looking at connecting marketing performance with sales and revenue across new and existing customers.
The existing customers are also very key. And for loyalty, it’s improving that customer experience across the entire customer journey; we want to increase retention. So when you take a step back and look at those three things I talked about, one thing that plays a key role in all of them is QA.
One of the principles of technology in general is that to grow exponentially, we must create value by leveraging technology in at least one key building block; in this case, that’s going to be insights, analysis, and data. We can then use technology that helps us analyze and automate routine activities to disrupt traditional operations.
[00:27:00] And that’s kind of how I look at Stack Moxie in this instance: we’re essentially using it to automate routine activities, such as checking to see if forms are broken, or finding the cracks that data is falling through.
And so, the framework that we set out to build: we’re going from gaps to solutions. We’re no longer looking at point solutions or middleware to go ahead and build a Frankenstack, or whatever it’s called. We are focused on building an ecosystem, and we’re using a methodical approach to building that framework.
So we’re looking at it as a solution-based approach to MarTech management. And in order to start laying the foundation for our data collection, cleansing, and activation process, we first needed to make sure that the data we’re collecting is clean and accurate. Again, that’s where QA plays a huge role.
In order to do that efficiently, we need that QA to be automated, and we need different ways of testing. It’s not just one way of testing things; we’ll go into the different testing methodologies in a little bit, but the idea being, we want to make sure nothing is falling through the cracks.
We were going to implement automation and processes to keep data clean at the source. So here, we’re connecting customer data from every first-party touchpoint, ensuring that the data is consistent, and unifying that user history data. That’s what Neil was alluding to earlier with the CDP: being able to get that full view of your user’s history.
Again, that full view of the user’s history is meaningless if it’s filled with bad data, right? So you can see why, before we put a framework like this into place, we also have to identify all the root causes of bad data in our tech stack.
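For readers who want a concrete picture of the unification step, here is a minimal, hypothetical sketch of assembling a “golden record” from per-platform records. Real identity resolution in a CDP is far more involved (matching rules, schemas, conflict policies), and the field names below are invented.

```python
# Minimal golden-record sketch: merge one person's records from several
# platforms. Records are ordered by freshness and newer non-empty
# values overwrite older ones. Real CDPs use richer matching and
# conflict-resolution rules than "latest wins".

def merge_records(records):
    """records: dicts with an 'updated_at' sort key plus arbitrary fields."""
    golden = {}
    for record in sorted(records, key=lambda r: r["updated_at"]):
        for key, value in record.items():
            if value not in (None, ""):
                golden[key] = value  # newer records win ties
    return golden

if __name__ == "__main__":
    records = [
        {"email": "jane@example.com", "updated_at": 1, "company": "Acme", "phone": ""},
        {"email": "jane@example.com", "updated_at": 2, "company": "Acme Corp"},
    ]
    print(merge_records(records))
```

Even this toy version makes the point in the surrounding discussion: if any source record is mislabeled or dirty, the merged golden record inherits that error.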
Neil Harrington: So I’m looking at this, and it says connect customer data from every first-party touchpoint. One thing we also didn’t touch on in the tech stack: are you guys using anything like a behavioral analytics tool within the application, like a Mixpanel or an Amplitude, and bringing that sort of cohort analysis back into your marketing?
Dawit Tesfaye: That is a very good question, and I’ll tell you why. We’re currently in the process of looking at different platforms that could do that for us. We’re looking at Amplitude, Pendo, and some other platforms. However, we do have what we call telemetry data for our products themselves.
And this goes back to build versus buy: we decided to build this feature and then quickly realized we can’t really scale it. So we are going to be moving to a platform that can actually help us do that, with little to no code, so that we can get it into the products as quickly as possible.
So again, quickly touching on that unified user history. The reason I want to keep drilling in this point is that we’ve come to the point where our goal becomes to centralize all data points in order to achieve a unified record of our user or customer’s history. A big theme right now in MarTech is the concept of the golden record.
Everyone is looking to achieve the full 360-degree view of their customers and taking different approaches to doing so, but mainly what you’ll hear over and over again are the three letters CDP. Neil mentioned that earlier, and along with CDP come other topics such as identity resolution, activation, orchestration, and things of that nature. What most people don’t go into at great length when discussing these topics is that you want to build these records with accurate data, like I just mentioned, right?
If I have, for example, a golden record of myself coming in from all these different platforms, what happens if the data from any of those platforms isn’t accurate, isn’t labeled correctly, is dirty, or has a schema that doesn’t match another platform’s records?
That then becomes a really big issue for the BI team, and not just the BI team; for the marketing team in general, because they’re relying on these golden records to build out their segmentations, their advanced segmentation, and so on. So before we even get to this step, we really had to make sure our data was clean.
We had to clean all the sources that we’re gathering data from. So, for example, in this diagram, if you look at Dynamics here, we had to make sure that’s clean; the Marketo data has to be clean. That in and of itself took months, going through the Marketo data and just scrubbing it.
And so, again, we’re using Stack Moxie to essentially check the integration between Marketo, Dynamics CRM, the web, and all of that, and to keep it all moving.
And so I definitely wanted to throw this in there. I was at Inbound in 2018; that’s a conference hosted by HubSpot, and it’s when they first introduced the flywheel concept to the marketing world. Everyone was basically mind-blown. One thing that caught my attention, as Dharmesh and Brian, the co-founders of HubSpot, were describing their flywheel, was the concept of friction.
Reducing friction was the key to making that flywheel successful. He had an entire formula for the friction and everything, but we’re not going to get into the mathematics here. What I want you all to take away from it is that, to me, friction is essentially user experience.
And the reason that caught on with me is the quote that you see on your screen: you don’t need to make your product 10 times better; you need to make your experience 10 times lighter. Improving that experience by 10 times is much, much easier than improving your product by 10 times, right?
You’re not going to go out there and hire hundreds more engineers to work on your product. It’s just a lot easier to get the marketing team to start collecting clean data and figure out how you can make that experience a lot lighter. That’s the easier road to take. And so, as I mentioned at the beginning of our session, to me, customer experience is integral to the overall MarTech stack.
Going out there and getting new prospects and new leads is a big aspect of it, but retention, to me, also plays a key part, and I think that goes back to the business objectives we’re trying to satisfy as well.
And so with that being said, that wraps up most of the MarTech focus of our session. Now I'm going to go more into the weeds of some of the testing that we do. And before we get into that part, I want to give you all a quick break with this quote. At the beginning of my career, a colleague said this statement to me, and it's essentially stuck with me ever since.
[00:35:00] Any tool or system is only as good as the data that you input into it. For years and years, it's just been ingrained into my head, and it's become very clear. Prior to AvePoint, I was a lead MarTech consultant at a marketing and digital communications agency, working with all sorts of clients, right?
Small to mid-size to Fortune 500. One of the things I'd see is people spending millions on these big, flashy products and platforms, and you get in as a consultant and you're sitting there like, okay, that's cool, but can I see the data that's going into this process? Because you might have the best-of-breed technology, but at the end of the day, if you're putting bad data into it, honestly, you might as well buy the cheapest thing on the market, because they're both about to give you the same outcome.
And so I see this more and more now. As Scott has said, with big ops, data is becoming everything for dev ops and marketing ops in general. And to me, that has switched my perception of how I want to shape where our MarTech department is going.
I want to put more of a focus on that quality aspect of the data, as opposed to "let me go out and buy all the best platforms out there to make this MarTech stack look shiny." Because to me, the end goal is, again, customer experience, and just having good data for our BI teams to go ahead and analyze, and also visualize for our C-level and our key stakeholders.
So with that being said, we're going to go straight into our testing process. So again, when I started, AvePoint was doing a lot of things manually, and what that entailed was filling out each form, basically. Not just forms, right? All user actions on the website — doing all of that manually and seeing how that data trickles down through the entire tech stack.
And so, if I was to go to one of our websites, I'm looking at all the different events that we're capturing, whether they be click events or any other event listeners and whatnot, filling out forms and seeing what happens to that data when I fill out a form. Right? So it goes to our Marketo platform.
Did they capture my UTM parameters when I filled it out? Are all fields being sent to Marketo accurately? Is my lead record duplicated, or what's going on there, right? And then also seeing, okay, so that's happened — it's in our marketing automation platform.
It should also be in our CRM. Did that record also get synced to the CRM? Did all the same fields get sent over accurately? Is there any issue with our field mapping and stuff like that? And so you can see the process is very long and drawn out. Part of that process is going through multiple CSVs, Excel files, and whatnot.
If you love Excel, which I do, this is a great way to spend days in Excel. But one of the things I found, especially as I moved up to more of a director-level position, is that I don't have the time for this. On average, I was spending about 10 hours per issue, and issues came up left and right.
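The manual check Dawit describes — did the form's field values and UTM parameters survive the trip into the marketing automation platform — is mechanical enough to script. A minimal sketch of that comparison (the field names here are illustrative, not AvePoint's actual schema, and in practice the "stored" record would come from the platform's API rather than a dict):

```python
REQUIRED = ["email", "country", "utm_source", "utm_medium", "utm_campaign"]

def validate_synced_lead(submitted, synced):
    """Compare what a form submitted against what the marketing
    automation platform stored; return a list of failure messages."""
    failures = []
    for field in REQUIRED:
        if not synced.get(field):
            failures.append(f"missing: {field}")
        elif synced[field] != submitted.get(field):
            failures.append(f"mismatch: {field}")
    return failures

# Example: the UTM source was dropped somewhere in the pipeline.
sent = {"email": "a@example.com", "country": "DE", "utm_source": "linkedin",
        "utm_medium": "cpc", "utm_campaign": "q3-launch"}
stored = dict(sent, utm_source="")
print(validate_synced_lead(sent, stored))  # → ['missing: utm_source']
```

Running a function like this against every test submission replaces the days-in-Excel comparison step, which is essentially what an automated monitor does on a schedule.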
And so, one of the things I was explaining, not just to the Stack Moxie team but also to part of our C-suite, was that I'm tasked with building out an entire infrastructure — I can't spend 10-plus hours a day on an issue. One small issue — for example, I fill out a form in Marketo and find something wrong with one of the integrations, and the data's not passing through accurately.
[00:39:00] And I'm in there for 10 hours. That's 10 hours I took away from building the entire infrastructure, which, from my scope, should be my main focus — versus having someone else, or another tool or something, find and help alleviate these issues. So we look at it from a scalability perspective, right? You can't scale manual testing; we just don't have enough resources for that.
We're not going to go out and hire hundreds of people to just do a lot of this manual user testing and whatnot. And we also identified earlier in the session that we need multiple types of tests, right, to achieve our overall MarTech objectives. So like I said, it's not just going to be filling out forms.
Right. We also need to test our emails — is the footer in our emails broken, are links and things not working within an email — and other things on the website side. So there are multiple tests that need to be run simultaneously. Some of them have to be done in real time, and some of them we can do ad hoc.
But that requires some sort of automation, because again, people are the critical resource. And that speaks to the last point: we do need the ability to automate these tests.
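The email-footer check mentioned above is another test that automates cleanly: scan the rendered email HTML for links that are empty, placeholders, or unresolved template tokens before the send goes out. A rough sketch using only the Python standard library (the sample footer markup and token syntax are illustrative, not any particular email platform's):

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect suspicious hrefs in an email body before it goes out:
    empty links, '#' placeholders, and unresolved template tokens."""
    def __init__(self):
        super().__init__()
        self.broken = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href in ("", "#") or "{{" in href:
                self.broken.append(href)

footer = ('<a href="https://example.com/privacy">Privacy</a>'
          '<a href="#">Unsubscribe</a>'
          '<a href="{{unsubscribe_link}}">Manage preferences</a>')
audit = LinkAudit()
audit.feed(footer)
print(audit.broken)  # → ['#', '{{unsubscribe_link}}']
```

A fuller version would also issue an HTTP request to each collected URL and flag non-200 responses, which is the part that catches links that render fine but lead nowhere.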
Neil Harrington: I also feel that it's always funny that testing goes back to being in an Excel sheet or a Google sheet. And there are like hundreds of rows on each one, and then 30, 40 different tabs on the bottom, and the person has to go and check the tabs and then come back.
And then to your comment about the footer — two things always seem to happen. Like, someone would somehow screw up a footer, which sounds like, oh, how do you screw up a footer? But everybody that's ever tested or sent an email knows that gets broken all the time.
The one thing that would keep me up at night was that we would send out a Japanese email to a French customer — mixing those segments up somehow on accident. That always kept me up at night, because we did it once on a test, and it always freaks me out that we're sending the wrong language to the wrong segments.
Not even worrying about GDPR rules — just straight from a customer experience point of view, that is a horrible way to send an email.
Dawit Tesfaye: Yeah, absolutely. I mean, my bosses and I, every time we get emails that were sent in error, we just go ahead and forward them to each other.
Just, hey guys, it's not just us that makes mistakes. It's kind of one of those "we're trying to avoid making those mistakes" things. And again, that's why we want a lot of these tests in place — but yeah, they do happen. And that speaks to all the different testing capabilities that Stack Moxie gives us.
[00:42:00] Right. So we're looking at smoke testing our integrations — are they up and running? Real-time monitoring, which I've talked a lot about — we use that to see if the leads are performing as expected. And we need to monitor this routinely, just to make sure that not only are things not falling through the cracks, but are they coming in the way that we expect them to?
Campaign testing is something we're starting to do more of now. We're testing specific campaigns before we go live, and then after we go live too, right? Everything might be perfect, and then as soon as the developer hits the push button from staging to production, something happens in that process.
So we have to test after things go live as well. And I'm not going to walk through all of these, but one of the key things we're working on right now is regression analysis. One of the things we've noticed is that we tend to find and fix a lot of issues, but then they tend to regress, and there are very insightful outcomes from that, because it's not always a tech issue.
Right — it could also be a people issue or a process issue. And so that ties back, again, to reworking the entire MarTech department. I look into it and it's like, okay, we've solved the tech issues, but then the regression analysis shows us that there are process issues.
And so now I have to go in and rewrite a lot of the different processes and just make sure those are done accurately, so that we don't regress on the website or in an app or something like that. And then black box testing — knowing what you don't know — that's the fun part. Out here on the right-hand side,
you'll see — and I'll give you all a better screenshot on the next couple of slides — this is one of the real-time monitoring tests that we have up. We're looking at all the UTM tags that are coming in from our Google pay-per-click. The reason we're looking at all of them, again,
goes back to that observability aspect that Neil brought up earlier. We want to see if everything is coming in accurately — and if so, pat ourselves on the back, we're doing a good job — but that's not always the case, right? And so it helps us analyze which campaigns and which regions the bad UTMs are coming from.
Is it a specific region? Do we then need to go out to that region, talk to those field marketers, and see what their process is? Maybe it's just as simple as they didn't understand that you need to put UTM parameters on the links. Or maybe it's an actual thing where they don't know how to do it in Marketo and stuff like that.
So that's how we're identifying a lot of the leaks in the data sources — using the real-time monitoring.
[00:45:00] Neil Harrington: So this is awesome, thank you. I love the screenshot. But when I'm thinking about, oh, I should get into testing, it feels a lot like, I should also go to the gym every day — but I don't end at five o'clock, and at six-thirty, when I eventually just stop, I'm like, now I gotta go do dinner.
I gotta do all these other things. So how do you think about it — say we do some testing today, of course it's all manual, but then how do you think about implementing this, and where do you start? Because you're a busy guy, as you said. Where do you pick a starting point, and how do you implement?
Dawit Tesfaye: Yeah, that’s a really good question. It’s I don’t think there is one certain approach to it. Right. I think at first it began with me and it psych sending out some of these broader and bigger tests and, finding those gaps and issues and trying to get those fakes. But at the end of the day, when we want to scale it and build that for the full department you, the way I want to go about it is, I don’t want to put it all just on MarTech, right?
So I want everyone to share that responsibility. And so while yes, we are going to be building a QA team within the MarTech department, for example, the dev team also needs to have their own QA team. Right. That goes back to the example of the regression issue that I spoke about. So in our department in MarTech, we will do a QA of the website from an SEO perspective.
And we'll find all these issues and send them over to the dev team. They fix them, but then they don't keep on top of it. So the next time we do an SEO audit, we find the same issues, because they've regressed, right? They might've fixed the issue at that time, but they didn't fix the root cause of that issue.
And so you see that back and forth, and it's like, all right, we'll have the main QA team in MarTech, but the dev team also needs to be doing all their different testing — branch testing, or any of those different tests between different staging environments and whatnot.
And the same applies to the rest of the marketing organization, right? Again, we'll be responsible for the main QA of our website and other products and stuff like that. But I'm relying on the analytics team to be like, hey, this data that's coming in is not accurate, or it just doesn't make sense.
I ran certain different types of analysis on it, and these are the graphs. And looking at those graphs — okay, yeah, that makes no sense, so your data is bad. So let's go ahead and either run some smoke testing over there or do some real-time monitoring, and take that observability approach to it.
And so that's how we're going to build it — with MarTech being the central aspect of it, and then I'm relying on other departments to feed into that central QA team. So, quickly touching on attribution, right — and this goes back to that example I just gave from the BI team — attribution is the backbone of our business intelligence and our ROI analysis. In all the examples you saw, we used Stack Moxie to make sure that we're
essentially collecting the correct UTM variables and passing them through the different platforms. One of the things that we found was that LinkedIn typically doesn't play friendly — their API is not friendly. And we found leads not being correctly attributed to LinkedIn, and we also found leads being passed through without emails.
One of the key insights we actually found was that the leads not being correctly attributed to LinkedIn wasn't an actual LinkedIn problem — which is what I was hoping it was. It was actually that the campaigns were being set up incorrectly in Marketo. And again, that then becomes a process problem.
And so we'll then need to address that process with that team and see why they're having issues setting those campaigns up correctly in Marketo. And that's huge for our ROI analysis, because when we do those different types of analyses, that's how we actually distribute our budgets to the different channels that we advertise on.
And so, if we're not attributing correctly to LinkedIn, we're taking money away from LinkedIn, and in that case, that's incorrect, because a lot of these leads were from LinkedIn — they're just not being tagged correctly. And so, again, the key point here is to identify whether the issue is part of the LinkedIn platform or Marketo, and then continuously monitor those leads in real time.
[00:50:00] So here's another example of that, and this is the one that I really wanted you all to see. This is a test we have set up in Stack Moxie — a real-time monitoring test. What it does is look at what happens when a lead comes in from LinkedIn and syncs to Marketo,
and from Marketo syncs over to Dynamics. In this test, we're testing whether the email for that lead is there, and whether the country field is filled out correctly, because that's how we route our different leads from a sales perspective. But then again, speaking specifically to that ROI piece, the UTM parameters are key here.
If those are done incorrectly, then that really affects our budgeting and a whole lot of strategic planning, and that goes up to the higher-ups. And so tests like this one have helped us significantly in finding a lot of the issues and errors that we're seeing between those platforms.
And again, we're working on testing a lot of them and getting these up and running routinely, just to be able to catch things that are wrong, but also confirm things that are right as well. Any questions on this?
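The LinkedIn → Marketo → Dynamics monitor described here — assert that the email, routing country, and UTM fields survived each hop — can be approximated as a pairwise comparison of the same lead's record in the two systems. This is only a sketch of the idea, not Stack Moxie's actual implementation; the field names and sample records are invented, and real records would be fetched from each platform's API:

```python
ROUTING_FIELDS = ("email", "country", "utm_source", "utm_medium")

def run_sync_monitor(pairs):
    """For each (marketo_record, crm_record) pair, verify the fields that
    sales routing and attribution depend on made it across intact.
    Returns {lead_id: [bad fields]} for anything needing attention."""
    report = {}
    for mkto, crm in pairs:
        bad = [f for f in ROUTING_FIELDS
               if not crm.get(f) or crm.get(f) != mkto.get(f)]
        if bad:
            report[mkto["id"]] = bad
    return report

pairs = [
    ({"id": 1, "email": "a@example.com", "country": "JP",
      "utm_source": "linkedin", "utm_medium": "paid-social"},
     {"id": 1, "email": "a@example.com", "country": "JP",
      "utm_source": "linkedin", "utm_medium": "paid-social"}),
    ({"id": 2, "email": "b@example.com", "country": "FR",
      "utm_source": "linkedin", "utm_medium": "paid-social"},
     {"id": 2, "email": "b@example.com", "country": "",  # country lost in sync
      "utm_source": "marketo-email", "utm_medium": "paid-social"}),
]
print(run_sync_monitor(pairs))  # → {2: ['country', 'utm_source']}
```

The second pair mimics the misattribution Dawit mentions: the lead arrived from LinkedIn but shows up downstream tagged as a Marketo email, which is exactly the kind of mismatch a routine monitor surfaces.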
Neil Harrington: My one question would be, how long did it take you to set this up?
Dawit Tesfaye: So setting up the test is actually easy. It doesn’t really take that much time, but it’s actually thinking through what you need to do. Like all the different things that you need to check. Right. And that, for me, that came from the manual testing that we did.
So it was easy to take the learnings from that and apply them over to Stack Moxie, because the tool then just makes it easy for you to, for example, check the email field on this specific lead. And then you can switch between the different platforms and check the email and the different values there.
So the actual setup of the test within Stack Moxie is not hard; it’s the planning of the test that you need to actually spend some time on.
Neil Harrington: Yeah. Another thing this also shows is that it looks like it took about 40 minutes for this to run through. It's funny, because everybody always thinks it takes two minutes for a lead to go from LinkedIn to Marketo to Dynamics, and this makes it obvious that it's not two minutes — it's much longer than that — so it flags performance as well.
Ariel Sasso: I have a question for implementing this. Did you find that it was easy to get adoption from leadership and the rest of your team to buy into the idea of QA and testing automation?
Dawit Tesfaye: Yes, especially when we pointed out to them how it affects ROI, things of that nature.
Absolutely. But even moving past ROI and whatnot — when you're making those key business decisions at the strategic level and you're doing it with data that's not accurate, then your results tend to be poor. That's just the very quick, easy explanation for them.
And then just actually walking them through why that's the case, right? Showing them where LinkedIn failed — where the person is attributed as coming from a Marketo email when in fact the person came from LinkedIn — and what that does at a strategic level: if I'm pulling money away from LinkedIn, I'm putting it into other marketing channels.
But if that marketing channel is not bringing in the new leads that we need, then that affects how the sales team operates, and revenue in general. And they see the big picture there when that happens.
And then, looking at — we'd alluded to some of the GDPR issues that we were having. So we installed Osano, which is a really great privacy tool, but a very stringent one, right? We want to be in compliance with all the laws, and so we decided, let's go ahead and get this platform.
And it essentially does the job for you — it takes away that whole privacy headache, which is a big issue right now in MarTech. So we're like, all right, fine, let's install it and let it deal with it. But then we also identified that we were losing data. And using Stack Moxie, we were able to identify where we were losing the most data.
Right. In this case, we were looking at which countries we were losing data from, and in which campaigns. And I mean, it's not surprising that we found a lot of the data was being lost among the European countries, because again, they take privacy much more seriously. And we also looked into why we were losing the data —
the technical aspects. So one of the first things we did was set out to test our hypothesis: is this new tool actually causing us to lose data? We used Stack Moxie to run some real-time monitoring cases as well as some ad hoc tests, and we did find that the tool was causing some data issues, and that we couldn't see certain parameters from some of our key platforms.
And so we evaluated our privacy settings for those specific countries, again, in accordance with those laws, and we adjusted the settings in the tool in a creative way to build trust with the users — again, looking at it from a user experience and a customer experience perspective.
It's like, maybe they're just not seeing it — the banner is too small, or the buttons are not easy for them to find and click accept, stuff like that. So we tested a bunch of those different ideas, and when we implemented them — made the banner more visible and things of that nature — what we found was that yes, some of these people are willing to give us their data.
They just, at that point, were not able to see it, or it was hidden and stuff like that. So once we got more people to accept and give us some of their data, we were then able to better hone our marketing. And that's how we actually increased our conversions while regaining most of our data from those regions.
And on this note, right — again, I've been tasked with scaling the MarTech team and the systems at AvePoint. For me, one of the biggest things when I set out to do that was: let me come up with three main pillars of growth. And to me, those three pillars are integrations, QA, and innovation.
I mean, they're very straightforward — I chose them for that reason, right? I wanted this to be as simple as possible so that there is no confusion as to what the purpose of the department is. However, the idea is to automate and centralize as much as possible in pillars one and two.
So, for integrations, automate and centralize as much of that as possible. Same thing with QA — and again, Stack Moxie has helped with the automation of our testing and our QA processes. When you automate as much as you can in one and two, what that does is free up people's time for pillar three, which is innovation. And innovation can be anything, starting with being proactive, right? Going out there and finding new ways to do things, as opposed to being reactive — which, in our case, is what I've found, especially from our gap analysis: we tend to be reactive. We find an issue, we fix it, then we look for the next issue, and we fix it, right? The idea is to put that in the past — to stop looking for issues and start looking for new ways to do things: solutions. And that's a big goal of mine. So what I want to do is build out a data-driven environment in which processes and QA are automated as much as possible.
That way, we can give the team members in MarTech more time to be innovative while we're focusing on customer experience and overall business objectives and goals. The big takeaway here is we're doing all of this with a focus on customer experience, right? A lot of what I talked about today was data and making sure it's accurate and whatnot.
And while I focused on data, I'd also like to point out that technology and data won't solve all problems. In my experience, especially working with other organizations, the key focus is people, right? Whether it be internally, like your employees, or externally, like your customers — that's where the value comes from.
So we're taking a people-first approach, then process, data, and technology, in that order. And then there was this quote, right? I was watching The Social Dilemma, and Edward Tufte said there are only two industries that call their customers "users": illegal drugs and the software industry. Our goal is to regard them as people.
And when he said that, to me, it came back around to: we need to focus on our customers as people, right? That goes into empathetic marketing. But when you look at it overall, and you're looking at data, it's no longer business-centric — rather, it becomes customer-centric as a result of moving all of that customer data to the center of the organization.
[01:00:00] Neil Harrington: Yeah, it's amazing — it's hard when we hear "digital transformation." That's been a buzzword for over a decade, and then Microsoft made it worse when Satya said, hey, we saw more digital transformation in 30 days than we did in 10 years. But it's like, what does that even mean?
These are very buzzy words, and it's like, well, hey, we should be thinking about it in terms of the customer. You're a hundred percent right. And it's small, chunkable things — like UTM codes not being attached and attributed correctly — and then thinking about how that moves through the data warehouse onto the CRM so that we can report off of it. It was great to see you get granular on that.
Dawit Tesfaye: Yeah. If anyone has any questions, I will give out my email and stuff. So if you guys want to discuss any of this further, I’m more than willing to. I can speak about this for days.
Neil Harrington: No, this has been awesome. Thank you so much for spending some time. And, we’re going to have you back in a year to see how the organization’s built out and what sort of innovations you’ve done.
Because I always think innovation is also a buzzword. We joke that there's never a silver bullet — there's a thousand lead bullets, right? You're always just constantly optimizing and constantly iterating, and that's what really creates innovation, not just the three-o'clock-in-the-morning aha moment. I'm excited to have you back in a year to see what sort of innovations you guys have come up with.
Dawit Tesfaye: Absolutely.
[01:01:39] Neil Harrington: Awesome. Thank you Dawit for joining us today. Thank you everyone in the audience, and I really look forward to hearing from you soon.
If you would like to have a free instance of Stack Moxie, you can go to stackmoxie.com.
We have a free, always-free instance that you can try, so you can start playing around with it. Ariel Sasso is our head of customer success, and she's always happy to walk you through what we're doing and how to bring QA to your team as well. Thank you.