How do you get ad campaign data from HubSpot and Opportunity Data from Salesforce in one place?
This question was asked by Swarnendu Dutta, one of the Marketing Ops community members, and the replies were too good not to share broadly on our blog.
Below is a breakdown of replies from the members of the MO Pros community with ideas on solving this interesting challenge.
If you’re sending campaign spend and conversions into SFDC, it should be simple to pull those into a report. I usually make a dashboard (for a specific campaign, campaign type, etc.) that pulls in multiple reports.
A data warehouse/data lake, if you have the team and the patience.
Swarnendu Dutta, to track ad campaign spend (Facebook, LinkedIn, and Google Ads) in Salesforce, I use unique virtual credit cards (VCCs) for each campaign. Yes, a small plug for my own company, Extend. In the past, I would spend hours with multiple tabs open, manually checking ad spend across platforms and then updating Salesforce and spreadsheets. With auto-refill VCCs, I mapped charges on each card to Salesforce campaigns once. Now I can report cost per acquisition and ROAS in Salesforce in near real time. Tracking ad spend with unique virtual credit cards is scalable and saves you time.
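The cost-per-acquisition and ROAS reporting described above comes down to two simple ratios. A minimal sketch in Python, with hypothetical campaign figures standing in for the real VCC charges and Salesforce conversion data:

```python
# Sketch of the CPA and ROAS math behind the Salesforce reporting described
# above. The spend, conversion, and revenue figures are hypothetical.

def cost_per_acquisition(spend: float, conversions: int) -> float:
    """CPA = total ad spend / number of conversions."""
    return spend / conversions

def return_on_ad_spend(revenue: float, spend: float) -> float:
    """ROAS = attributed revenue / total ad spend."""
    return revenue / spend

# Example: one LinkedIn campaign, charged to its own virtual card.
spend, conversions, revenue = 5000.0, 25, 20000.0
print(cost_per_acquisition(spend, conversions))  # 200.0 per conversion
print(return_on_ad_spend(revenue, spend))        # 4.0x return
```

Because each campaign has its own card, spend per campaign never needs to be untangled from a shared statement, which is what makes the mapping a one-time setup.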
I recently came across CIFL (Coding Is For Losers) and was going to watch their videos to see if it’s legit, but does anyone else know what they’re doing and whether it’s worth the effort for an agency setting up pipelines for clients? https://youtu.be/fhGBtnQocfQ
I would begin by assessing lead attribution tools such as Dreamdata or CaliberMind. If your data is normalized and cleaned up with your processes, it can take as little as a few weeks to have clear reports on your conversion metrics from all your ad spend. CaliberMind utilizes Fivetran as well, so if you were looking to export to an analytics tool, you’d be halfway there.
I’ve tackled this in the past with tools like www.klipfolio.com
I’ll let you decide what is best. You can either get a connector like SuperMetrics that runs its own ETL pipeline in the background, letting you add those tables to your reports directly without ever seeing the underlying database tables. Or you can run your own data warehouse: use a tool like Singer for the E and L portions of ETL, pipe the data into BigQuery or Snowflake, handle the T there, and then query it with whatever data-vis software you prefer.
For a smaller company, I’d say expect about $50 per month per data source that you connect to your warehouse if you are using Stitch. Fivetran is more expensive. Keboola is also well priced for SMBs, but it works natively with Snowflake, so if you are going the Google route you’ll end up with an intermediary Snowflake DB before copying to BQ, which is awkward. Other ETL tools get cheaper at higher volumes but aren’t cost-effective at lower volumes. Your warehouse costs will initially be very cheap or even free: you get 10 GB of storage and 1 TB of queries a month, and additional storage and processing cost fractions of a penny, though it will add up.

Setting up dashboards will require BQ SQL knowledge, and once you start optimizing your warehouse your engineering time will shoot up. However, any dashboards you build before optimizing will need to be redone, and that’s a cost you eliminate by optimizing your warehouse right away.

If you are just getting started, I recommend implementing GA4 on your site and using its free BQ connection. You’ll have to be a pretty big company to exceed the BQ free tier with just your analytics data, and it will let you do more with that data and expose you to well-optimized raw data without incurring costs.

With the GA4 raw data, you’ll want to create optimized reporting tables. I won’t go into details here because it’s a big subject: play with the raw data, then start reading or taking a course on designing and optimizing your warehouse. Use dbt for creating optimized reporting tables; it’s free and fantastic. Once you are comfortable with GA4, start using ELT tools.
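The “optimized reporting table” idea above is essentially pre-aggregating raw event-level rows into one row per campaign per day, so dashboards never scan the raw export. A rough Python sketch of that rollup, using hypothetical sample rows (in practice you would express this as a dbt SQL model materialized in BigQuery):

```python
from collections import defaultdict

# Hypothetical event-level rows, shaped like a raw ads/analytics export.
raw_events = [
    {"date": "2022-03-01", "campaign": "brand_search", "cost": 12.5, "conversions": 1},
    {"date": "2022-03-01", "campaign": "brand_search", "cost": 7.5,  "conversions": 0},
    {"date": "2022-03-01", "campaign": "retargeting",  "cost": 4.0,  "conversions": 2},
]

# Roll up to one row per (date, campaign): the shape of the reporting table
# a dbt model would materialize for the dashboard layer.
rollup = defaultdict(lambda: {"cost": 0.0, "conversions": 0})
for row in raw_events:
    key = (row["date"], row["campaign"])
    rollup[key]["cost"] += row["cost"]
    rollup[key]["conversions"] += row["conversions"]

for (date, campaign), totals in sorted(rollup.items()):
    print(date, campaign, totals)
```

The dashboard then queries a handful of small pre-aggregated rows instead of the full raw table, which is where the query-cost savings come from.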
Very interesting discussion. In summary:
1. ETL (Fivetran/SuperMetrics) + data warehouse (BigQuery) + BI layer (Data Studio)
2. Google ecosystem: Google Sheets with connectors to Data Studio; GA + Google Ads data going directly to Google Data Studio
I have seen some SaaS marketing teams use Funnel.io + Google Data Studio. It costs $499 per month, plus a little handiwork.
Finally, what is the total engineering and MarOps time to set this up and maintain it? And what is the total tool cost: Zapier/SuperMetrics/Fivetran plus a data warehouse?
Tools that combine attribution, analytics, and custom metrics will be better suited than attribution-only tools.
Thanks, guys. I will tell you my approach; please let me know what you think about it. We are a small team, so implementing a data lake might not be the best way. I am using Google Data Studio and getting the ads data and traffic from GA. I have a Google Sheet where I add the raw data from SFDC for conversions and then map it to the correct campaign name (a few leads have different campaign names). Then I add this Google Sheet as a data source in Data Studio and merge it with the Google Ads data source. Right now, updating the raw data in Google Sheets is a manual process, but I am thinking of using Zapier for it.
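The blend described above, SFDC conversions from the sheet joined to Google Ads spend on campaign name, can be sketched as a simple keyed merge. All campaign names and figures below are hypothetical:

```python
# Sketch of the Data Studio blend described above: join SFDC conversions
# to Google Ads spend on campaign name. Names and figures are made up.
ads_data = {
    "q1_linkedin_abm": {"spend": 3000.0, "clicks": 450},
    "q1_google_brand": {"spend": 1200.0, "clicks": 900},
}

# Raw SFDC rows from the Google Sheet, after mapping mismatched lead
# campaign names back to the ads campaign name.
sfdc_conversions = [
    {"campaign": "q1_linkedin_abm", "opportunities": 4},
    {"campaign": "q1_google_brand", "opportunities": 2},
]

blended = []
for row in sfdc_conversions:
    ads = ads_data.get(row["campaign"], {"spend": 0.0, "clicks": 0})
    blended.append({
        "campaign": row["campaign"],
        "spend": ads["spend"],
        "opportunities": row["opportunities"],
        # Cost per opportunity, guarded against zero-conversion campaigns.
        "cost_per_opp": ads["spend"] / row["opportunities"] if row["opportunities"] else None,
    })

for row in blended:
    print(row)
```

The campaign-name mapping step is the fragile part; once Zapier (or any sync) writes SFDC rows with normalized campaign names, the join itself is trivial.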