
Empowering Data-Driven Decisions: A Web Analytics Success Story for a Small Government Agency

  • Writer: Will
  • Dec 17, 2025
  • 9 min read

Executive Summary


Client: A small government agency responsible for managing and developing community resources and supporting the local economy.


The client hosted a website with information about its activities, initiatives, and community events. They knew basic facts about the website's performance, such as its most popular pages, number of sessions, and event counts; however, they lacked the resources (mainly time and people) to turn their Google Analytics 4 data into actionable insights about how to improve site performance.


Backcountry Draft was contracted to help with the following items:

  1. Integrate marketing activity data such as email, display ads, and social media with web analytics data

  2. Provide a reporting tool that would help the client understand web performance in relation to their specific goals

  3. Identify action items and areas for improvement based on website performance


Within a month, Backcountry Draft had combined web analytics data (Google Analytics 4), email marketing data (proprietary tool), and weather data (government source) into a single data source and built a custom Tableau dashboard to visualize it. The new dashboard and reporting capabilities immediately provided value by reducing time spent on analytics and data wrangling, creating connections between marketing activities and site performance, and improving data hygiene.


Introduction


Client Background


Our client owns a website that plays an important role in communicating events, current agency initiatives, and community information. Alongside the website, the client also has an email newsletter, multiple social media channels, and physical kiosks with signage throughout their town. The website is regularly updated and properly maintained, and the client often uses their other communication channels to direct individuals to the website. Overall, they produce quality content and messaging for their online channels.

While their website had Google Analytics 4 installed, they lacked the time and resources to measure website performance and the impact of their marketing efforts.


Without effective analytics, our client made decisions based on intuition and was unable to definitively measure the impact of their work. The lack of benchmarks, KPIs, and measurement led to an inability to demonstrate ROI for marketing activities, along with inefficient resource allocation and spend.


The client's primary challenge was understanding what actions they should take to improve website performance.


Purpose of Study


This case study examines the impact of enabling non-native analytics and custom data sources. We will discuss the Backcountry Draft methodology and our approach to solving our client's challenges while keeping costs and billed consulting hours to a minimum. We will also cover initial results, client feedback, and outcomes. This project resulted not only in new reporting capabilities but also in new data-driven processes, and we will describe those changes and how efficient reporting affected our client.


Challenge


Limited Resources Lead to Limited Results


Being a small government agency, our client has limited staff and budget. While the team is fully capable of managing, developing, and using their website while strategically integrating it into their overall mission, they were unable to use their data to its fullest extent. It's a story we hear all too often at Backcountry Draft: the team understands data and knows how to use it, but they simply don't have the time or people to build integrated datasets, dashboards, and reporting tools. Limited staffing and budget lead to limited results, because teams can't devote the time necessary to build the infrastructure required to take full advantage of their data. Beyond limiting what a team can do, this is often extremely frustrating: our clients know exactly what they could achieve with their data; they simply lack the resources to do it.


Data without Structure


Our client collects data from multiple sources. Their website runs Google Analytics 4, their email provider collects basic email metrics (recipients, opens, click-through rate, etc.), and Meta provides engagement data for Instagram and Facebook posts.

These datasets, however, weren't structured in a way that allowed them to be readily joined. For example, email data was aggregated at the email level while web analytics data was at the date level, and combining them would have required manual adjustments that were prohibitively time-consuming for our client.

Unstructured data exacerbated the issues caused by limited resources, further preventing our client from properly using their data to make decisions.


Taking Shots in the Dark


Without the data necessary to fully understand the impact of their actions, our client's decisions were mainly based on intuition, and they weren't able to attribute changes in performance (good or bad) to any specific action.

Without proper measurement tools or attribution, attempts to improve website performance were inefficient, time-consuming, and unable to prove any return on investment. These issues resulted in marketing campaigns and promotional efforts with minimal effectiveness and an inability to appropriately demonstrate value.


Method


Understanding the Client's Problems and Data


Backcountry Draft's approach always starts with taking the time to understand our clients' problems and data. While we believe "data is data" and similar processes and methods can be applied to a wide range of problems, those processes and methods should be adjusted to fit each client's individual needs.


Introductory calls serve an incredibly important function in that they give us the best view into our client's current state. We're able to see their existing data usage, decision-making process, and any additional growth opportunities. During this phase, our goal is to get the clearest understanding of how the client uses data and how they define success for the project.


For this project, we started with a basic requirements-gathering session where we asked about source systems, short- and long-term goals, and major pain points. During this introductory call, we discovered a handful of easy opportunities to improve their data, such as implementing more descriptive campaign codes for newsletters and links.
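To make that recommendation concrete, here is a minimal sketch of what a descriptive campaign code can look like, using the standard UTM parameters that Google Analytics 4 recognizes. The URL, source names, and campaign names below are hypothetical illustrations, not the client's actual setup.

```python
from urllib.parse import urlencode

# Hypothetical community-events page on the client's site.
base_url = "https://www.example-agency.gov/events/summer-market"

# Before: a vague tag that says almost nothing about the send.
vague = urlencode({"utm_source": "email", "utm_campaign": "newsletter"})

# After: a descriptive tag identifying channel, campaign, and link placement.
descriptive = urlencode({
    "utm_source": "newsletter",
    "utm_medium": "email",
    "utm_campaign": "2025-06-community-events",
    "utm_content": "summer-market-header-link",
})

print(f"{base_url}?{vague}")
print(f"{base_url}?{descriptive}")
```

Codes like these let each newsletter link be tied back to a specific campaign and placement once the traffic appears in Google Analytics 4.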


After our introductory calls and requirements-gathering sessions, we defined the project's primary question as "How do marketing efforts affect website user behavior?" and outlined the following goals for this project:


  • create a single custom dataset that includes

    • Website performance (Google Analytics 4)

    • Email performance (proprietary government provider)

    • Weather data to track impact on community events (government provider)

  • visualize the data with an easy-to-use dashboard that would answer the above question

  • interpret the data and provide regular recommendations to improve web performance


The final step in the introductory process is preliminary dashboard design approval. We take everything we've learned and create a mock dashboard design that we think will solve the client's problem. For this project, we proposed the data visualizations we thought would best answer our client's questions and, using a combination of sample data and static mockups, created a simple, semi-functional dashboard sample to give the client an idea of the deliverable they would receive.


With a firm understanding of our client's current state and where they want to be at the end of the project, we begin creating a solution that will close that gap and achieve the project's goals.


Creating a Solution that Solves Their Problems


With the project goals clearly outlined, we started developing a solution that would not only fulfill those goals but do so without an overly complicated deliverable.


During our introductory calls, we learned how our client defines success for their website and which metrics matter to them. These metrics and KPIs guide how we construct their data and the fields we extract from the source systems, and in order to keep everything as simple as possible, we only extract the fields necessary to solve our client's problems.


The data requirements for this project were fairly simple, and we decided to create a Google Analytics 4 extract via Looker Studio. We chose this path for a number of reasons. First, a direct integration between Google Analytics 4 and Looker Studio already exists; this project had a limited budget, and we didn't have the available hours to build a custom connection. Additionally, extracts from Looker Studio are correctly aggregated and automatically deduplicated, which further simplifies the workload and reduces the necessary hours. Finally, using Looker Studio allowed us to select only the necessary fields while giving us flexibility to expand the dataset in the future. Extracts from Looker Studio may not be the most sophisticated solution; however, this method saved us time (and the client's budget) while providing the data necessary to build the solution for this specific project.


Once we had the raw data extracts needed to build the single dataset, we started making the connections. Because this project was fairly simple, the data didn't require much preparation before it could be joined. Data granularity posed the largest challenge, as the datasets were aggregated at different levels. Since the client wanted to better understand website performance and the factors that affect it, we wanted the data at the hour level. This approach let the dashboard display visitors per hour and peak hours, and it helped us better understand whether social media post and email send times affected website performance.
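The production flow runs through flat-file extracts and Tableau Prep (described below), but the hour-level shaping can be sketched in Python. The file and column names here are hypothetical stand-ins for the real extract fields.

```python
import pandas as pd

# Load the hour-level GA4 extract; the file and column names are hypothetical.
ga4 = pd.read_csv("ga4_hourly_extract.csv", parse_dates=["date"])

# Build a single datetime key at hour granularity for joining the other sources.
ga4["datetime_hour"] = ga4["date"] + pd.to_timedelta(ga4["hour"], unit="h")

# The kinds of questions the dashboard answers: visitors per hour and peak hours.
visitors_per_hour = ga4.groupby("datetime_hour")["sessions"].sum()
peak_hours = ga4.groupby("hour")["sessions"].mean().sort_values(ascending=False).head(5)
print(peak_hours)
```

The single hourly key is the design choice that matters here: once every source carries the same datetime_hour column, the later joins become trivial.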


The Google Analytics 4 data exported at the hour level; however, the email and weather data were aggregated differently. Starting with email, the extract contained one row per email with basic performance data aggregated for each send. Thankfully, the extract included the date and hour of send, which we used to join it to the web analytics data. While this didn't provide detailed email performance data, it did indicate when each email went out, and the goal of this project centered on web performance, not email.
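Continuing the same hypothetical sketch, the email join reduces to a left join on that hourly key, so hours with no send simply carry empty email fields.

```python
import pandas as pd

# Hour-level web frame, as in the previous sketch (hypothetical column names).
ga4 = pd.read_csv("ga4_hourly_extract.csv", parse_dates=["date"])
ga4["datetime_hour"] = ga4["date"] + pd.to_timedelta(ga4["hour"], unit="h")

# Email extract: one row per email with its send date and hour plus aggregate metrics.
emails = pd.read_csv("email_sends.csv", parse_dates=["send_date"])
emails["datetime_hour"] = emails["send_date"] + pd.to_timedelta(emails["send_hour"], unit="h")

# Left join onto the hourly web data: hours without a send get nulls in the email columns.
combined = ga4.merge(
    emails[["datetime_hour", "email_subject", "recipients", "open_rate"]],
    on="datetime_hour",
    how="left",
)
combined["email_sent"] = combined["email_subject"].notna()
```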


Finally, we connected the weather data by densifying it to the hour level. This method repeats each day's weather figures for every hour of that day, and while it doesn't provide the most precise results, it gave us a general connection between weather and site performance for community event pages.
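The densification step follows the same pattern: each daily weather row is repeated across the 24 hours of its day before joining on the hourly key. Again, this is an illustrative sketch with hypothetical file and column names; the real flow is built in Tableau Prep.

```python
import pandas as pd

# Daily weather extract: one row per date (hypothetical file and column names).
weather = pd.read_csv("daily_weather.csv", parse_dates=["date"])

# Densify to the hour level: repeat each day's values across its 24 hours.
hours = pd.DataFrame({"hour": range(24)})
weather_hourly = weather.merge(hours, how="cross")
weather_hourly["datetime_hour"] = (
    weather_hourly["date"] + pd.to_timedelta(weather_hourly["hour"], unit="h")
)

# weather_hourly now joins onto the combined web/email frame on "datetime_hour",
# giving a rough hourly weather signal for the community event pages.
```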


Implementing Something that Will Last


We wanted the end deliverable to not only solve our client's problem but also provide impactful results that would be integrated into their daily decision-making processes. As consultants, the products we provide should be able to function with little or no involvement from us; the goal is to create value, not dependencies.


With this concept in mind, we created a minimally involved process for updating, using, and maintaining our deliverables.


Starting with the dashboard, we wanted to make sure our client could easily interpret the visualizations and use them to help achieve their goals. The introductory calls once again play an important role, because we use that time to ensure our client can interpret the suggested visualizations and to gather requirements for any additional items they want to see in their dashboard. Once the initial tools were built, we gathered feedback and made sure the client was happy and able to integrate the new data capabilities into their workflow.


Outside of the front-end deliverable, the dataset was also built to last. The project combines the flat-file extracts described above in a Tableau Prep flow to create the single dataset. Using this method, we can select the columns we want to use and easily add fields in the Prep flow. This flexibility lets the dataset grow without heavy development costs as our client's requirements change and grow with their organization over time.


Our method produced a cost-effective tool that's easy for our client to use and for us to update as our client's organization grows and changes.


Results


New Processes and Handover


When a client goes from having little access to their data to having everything in a centralized tool, we want to prevent them from becoming overwhelmed and guide them through how they can integrate their new capabilities into their workflows.


With the dashboard completed, we helped adjust existing processes and reporting to integrate the new tool. As part of our project, we provide a monthly reporting document that highlights trends and gives insights based on the previous month's performance. We regularly meet with the client to discuss the findings and provide recommendations. These review calls also serve as opportunities for us to learn how the client uses our tools and whether there are improvements we can make or questions we can answer.


Improved Performance and Decision Making


The most immediate impact for our client was visibility. All of their web performance data became instantly and readily available in the exact format they needed. This new capability allowed them to immediately judge the performance of social media posts and advertising campaigns and adjust their communication strategies in real time.


With the improved performance feedback from the dashboard, our client can manage their advertising budget more effectively, moving money from underperforming campaigns into more effective ones and reducing wasted spend.


With a small team, time can be one of the most valuable resources, and our project greatly reduced the time required to understand web performance. Our client can focus on more impactful activities rather than wrangling insights from multiple sources.


Overall, our project reduced the time spent working with data and creating reports, provided an improved and immediate understanding of website performance as it relates to marketing and communication efforts, and enabled more efficient and effective budget allocation for advertising and social media posts.





