Tag: ai search

  • Inside Bing’s New AI Performance Report: What 20,000 Copilot Citations Taught Us

    Bing launched an AI Performance report inside Webmaster Tools earlier this month. We pulled our data the same day.

    91 days of Copilot citation data. 19,717 total citations across 86 pages. One page accounting for 69% of all of them.

    We’ve been tracking AI search visibility for clients using Scrunch and our AI Grader for months. But this is different. This is Microsoft showing us exactly how often — and why — Copilot pulls our content as a source when generating answers.

    The data is early, imperfect, and worth looking at closely. You can explore the full interactive dashboard or read on for the highlights.

    Summary statistics: 19,717 total Copilot citations, 86 unique pages cited, 5,804 peak citations in a single day, 400+ unique grounding queries

    What the AI Performance Report Shows

    Microsoft released this as a public preview in February 2026. Anyone with a verified site in Bing Webmaster Tools can access it.

    You get three data exports:

    • Daily overview — total citations and number of unique pages cited, by day
    • Page-level stats — which URLs get cited and how often
    • Grounding queries — the retrieval queries that triggered citations

    No API access yet. Fabrice Canel from Microsoft confirmed on X that API support is on their backlog but didn’t give a timeline. For now, it’s CSV exports from the dashboard.
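    Until the API arrives, the CSV exports are easy to work with directly. Here is a minimal sketch of summarizing the daily-overview export with pandas — the column names `Date` and `Citations` are assumptions, so check the headers in your own export (the inline sample below stands in for the downloaded file):

    ```python
    import pandas as pd
    from io import StringIO

    # Stand-in for the daily-overview CSV from Bing Webmaster Tools.
    # Column names ("Date", "Citations") are assumed -- verify against your export.
    csv_export = StringIO(
        "Date,Citations\n"
        "2025-12-07,5804\n"
        "2025-12-08,4100\n"
        "2026-02-01,34\n"
        "2026-02-02,28\n"
    )
    daily = pd.read_csv(csv_export, parse_dates=["Date"])

    # Total citations and the peak day
    total = int(daily["Citations"].sum())
    peak = daily.loc[daily["Citations"].idxmax()]

    # Average daily citations per month, to spot decay over time
    monthly = daily.set_index("Date")["Citations"].resample("MS").mean()
    ```

    The same pattern works for the page-level and grounding-query exports; only the assumed column names change.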

    Our Numbers

    We pulled 91 days of data for searchinfluence.com, covering November 12, 2025 through February 10, 2026.

    The timeline tells a simple story: citations spiked hard in early December, then fell off.

    Daily Copilot citations line chart showing a massive spike on December 7 reaching 5,804 citations, with a steady decline through January and February

    December 7 hit 5,804 citations in a single day. That spike almost certainly corresponds to our AI SEO Tracking Tools 2026 analysis gaining traction in Copilot’s retrieval index. By late January, daily citations had dropped below 50.

    Average daily citations by period showing Dec 1-8 averaged 1,520, February averaged 34

    The period breakdown makes the decline even clearer. Dec 1-8 averaged 1,520 citations per day. February: 34. That’s a 97% drop in two months.

    A few possible explanations: the analysis was written for a specific moment in time and may be aging out of Copilot’s freshness window, new competing content entered Bing’s index, or Microsoft changed how Copilot’s retrieval weights sources. We’re still looking into it.

    One Page Captures Almost Everything

    Of the 86 pages Copilot cited across the full period, one captured 69% of all citations.

    Top 10 cited pages bar chart. AI SEO Tracking Tools 2026 Analysis leads with 13,599 citations

    The top four pages — all AI SEO content — accounted for 90% of total citations. Everything else on the site combined makes up the remaining 10%.

    Citation concentration donut chart showing AI SEO Tools 2026 Analysis at 69%

    That concentration is more extreme than what we see in traditional search. Google distributes traffic across many pages because users click through a list of results. AI search works differently — it picks one or two sources to ground its answer, and those sources absorb almost everything.

    Building deep authority on your strongest topics matters more than spreading thin across many. In AI search, being the second-best resource on a topic might mean getting zero citations.

    The Grounding Queries Are the Most Useful Part

    The third export — grounding queries — is where we found the most actionable data. It also revealed something about how Copilot’s retrieval system works under the hood.

    These queries aren’t what users typed into Copilot. They’re what Copilot’s retrieval system searched for internally when it needed a source to ground its answer.

    Look at these examples. Nobody types queries like this into a search box:

    • “accuracy of AI SEO GEO platforms tracking position in AI shopping guides”
    • “AI search optimization GEO platforms competitor tracking pricing features positioning”
    • “push data to analytics platforms or tag managers from AI search optimization GEO platforms”

    Those read like machine-generated retrieval queries — Copilot decomposing a user’s conversational question into keyword-dense search queries optimized for Bing’s index.

    Then there’s query fanout. Same user question, multiple retrieval variants:

    Query fanout chart showing four clusters of the same question rephrased different ways

    The “optimize content for AI search” cluster shows five variations of the same query. “Track AI model versions” shows four. Same intent, rephrased to catch different documents in the index.

    This matters for interpreting the numbers. One user conversation likely generates 3-5 citation events through this fanout process. So our “19,717 citations” probably represents closer to 4,000-6,000 actual user conversations. The raw numbers are inflated by the retrieval architecture itself.
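    The adjustment is simple division. A quick sanity check, assuming the 3-5 fanout multiplier we inferred from the clusters (an estimate, not a documented figure from Microsoft):

    ```python
    # Fanout-adjusted estimate: the report counts citation events, and one user
    # conversation appears to trigger roughly 3-5 retrieval queries. The 3-5
    # multiplier is our assumption, read off the fanout clusters in the export.
    raw_citations = 19_717

    low_estimate = raw_citations // 5   # if fanout averages 5 queries per conversation
    high_estimate = raw_citations // 3  # if fanout averages 3 queries per conversation
    ```

    That lands at roughly 3,900-6,600 conversations, which is where our 4,000-6,000 range comes from.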

    But the query themes are accurate. Over 400 unique grounding queries, clustered into clear topic areas:

    Grounding query themes donut chart

    AI SEO tool comparisons dominate — pricing, features, platform coverage, specific vendor evaluations. Higher ed marketing shows up as a secondary cluster. Both line up exactly with the content areas where we’ve invested the most over the past year.
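    A rough way to surface themes like these from the grounding-query export is plain term counting. The sketch below uses the three example queries quoted earlier and an ad hoc stopword list — a starting point for eyeballing clusters, not a production clustering pipeline:

    ```python
    import re
    from collections import Counter

    # Three grounding queries from the export (the full file has 400+ rows)
    queries = [
        "accuracy of AI SEO GEO platforms tracking position in AI shopping guides",
        "AI search optimization GEO platforms competitor tracking pricing features positioning",
        "push data to analytics platforms or tag managers from AI search optimization GEO platforms",
    ]

    # Count recurring terms after dropping filler words to see what the
    # retrieval system keeps reaching for.
    stopwords = {"of", "in", "or", "to", "from", "the", "and"}
    terms = Counter(
        word
        for query in queries
        for word in re.findall(r"[a-z]+", query.lower())
        if word not in stopwords
    )
    ```

    Even on three queries, the dominant terms ("ai", "platforms", "geo", "tracking") match the tool-comparison cluster the dashboard shows.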

    What This Means for Content Strategy

    Four things stood out from the data.

    Structured comparison content earns citations. The page capturing 69% of all citations is a detailed tool-by-tool comparison with pricing, features, trade-offs, and named vendors. AI retrieval systems need specific, structured data to ground their answers. High-level overviews without specifics don’t get pulled in.

    Grounding queries are a new form of keyword research. These aren’t the same queries that show up in Google Search Console. They represent what AI retrieval systems search for when answering user questions — a different target than traditional SEO keywords. If you have access to this data, use it to find content gaps and understand exactly what people are asking AI about your topic areas.

    AI systems cite a narrow set of pages. Even on days with 5,000+ citations, only 15-18 unique pages got referenced. Copilot picks a small number of authoritative sources rather than pulling from a wide set. Depth beats breadth.

    Citation decay is real and fast. Our 97% decline from December to February suggests either content freshness matters in AI retrieval, competitive content displaced us, or both. Publish-and-forget doesn’t work for traditional SEO, and it works even less for AI visibility.

    What We Can’t See Yet

    An honest look at the gaps, because there are several.

    This is Copilot only. No equivalent data exists yet from ChatGPT, Perplexity, Gemini, or Google AI Overviews. The query themes likely transfer across platforms — people ask similar questions regardless of which AI they use — but citation volumes could look very different elsewhere.

    No click-through data. Citations don’t equal traffic. We don’t know how many users clicked through from a Copilot answer to our site versus just reading the AI-generated response. Microsoft may add this metric later, but for now we can measure AI visibility, not engagement.

    No competitive view. We can see our own citations but not what other sites Copilot cited alongside ours for the same queries. Knowing who else gets cited — and for which queries — would make this data significantly more useful.

    The data is still in preview. Microsoft has said more data is coming throughout 2026. What we have now is a starting point.

    What We’re Doing With This

    We’re using the grounding queries to map content gaps. 400+ queries show us exactly what Copilot users are asking about our topic areas. Where our existing content doesn’t fully answer those queries, that’s where we’re focusing next.

    For clients, we’re adding Copilot citation metrics to monthly reports. “Your site was cited X times in AI search this month across Y pages” is a concrete number. Most of the industry is still guessing about AI visibility. This is actual measurement, even if it’s limited to one platform.

    And we’re layering this data alongside what we already track through Scrunch (AI visibility across ChatGPT, Perplexity, and other platforms) and our AI Grader (content readiness scores). Three data sources covering three layers: content quality, AI visibility, and actual citations. Together, they give us the closest thing to a full picture of AI search performance that exists right now.

    Check Your Own Data

    If you want to see your Copilot citation numbers, verify your site in Bing Webmaster Tools and look for the AI Performance section. The report is available for all verified sites.

    Want to see how your content scores for AI search readiness right now? Try the AI Grader — it takes about 30 seconds.

  • 30+ AI Search in Higher Education Stats [2026]

    Higher education discovery is becoming more distributed, more automated, and more competitive.

    Students now rely on a mix of AI tools, traditional search engines, and social platforms as they evaluate programs. Institutional strategies, however, do not always reflect how these new search elements work together.

    Below, we’ve compiled over 30 statistics that show how student search behavior has shifted and how institutions are responding (or aren’t). Use them to identify your visibility gaps, validate your priorities, and guide your strategic updates for 2026 and beyond.

    How Students Search for Higher Education Programs Today

    AI tool usage and trust in the research process

    • 50% of prospective students use AI tools at least once a week.
    • 1 in 3 prospects trust AI tools as a source for program research.
    • 79% of prospects read Google’s AI-generated overviews when they appear in search results.
    • 56% of students are more likely to trust a brand that is cited by AI.

    Search engines and university websites remain core discovery channels

    • 84% of prospects use traditional search engines to explore professional and continuing education programs.
    • 63% of prospects rely on university websites during their research process.
    • 77% of prospects trust university-owned websites over other sources.
    • 82% of prospects are more likely to consider programs that appear on the first page of search results.

    Search behavior is expanding across multiple platforms

    • 84% of prospects use search engines to research professional education opportunities.
    • 61% of prospects use YouTube.
    • 50% of prospects use AI tools.

    Social platforms still influence consideration

    • Nearly 70% of prospects say frequent recommendations on social media increase their likelihood of considering a product or service.
    • YouTube (57%), LinkedIn (49%), and Facebook (43%) are the top social media platforms for program research.

    How prospects search and what content they want

    • Multi-word search phrases dominate how prospects search for programs.
    • Prospects under age 35 show nearly twice the interest in professional and continuing education compared to older audiences.
    • 65% of prospects want clear program summaries in social content.
    • 54% of prospects look for career guidance and outcomes.
    • 50% of prospects want testimonials and real student perspectives.

    This data is drawn from AI Search in Higher Education: How Prospects Search in 2025, a research study conducted by Search Influence in partnership with UPCEA in March 2025. The study is based on survey responses from 760 prospective adult learners and examines where students search for programs, how they use AI tools and alternative platforms, and which sources they trust most during the decision-making process.

    Institutional Readiness for AI Search in Higher Education

    AI search strategy adoption across institutions

    • 60% of institutions say they are in the early stages of exploring AI search.
    • 30% of institutions report having a formal AI search strategy in place.
    • 10% of institutions have not started or do not believe AI search will significantly impact student discovery.

    Challenges slowing AI search adoption

    • 70% of institutions cite limited bandwidth or competing priorities as their biggest barrier.
    • 36.67% of institutions report a lack of in-house expertise or training.
    • 26.67% of institutions cite unclear ROI, lack of leadership buy-in/institutional support, or uncertainty about how AI search works as slowing progress.

    What institutions are prioritizing in AI search strategy

    • 59.26% of institutions prioritize the accuracy of AI-generated information about their programs.
    • 48.15% of institutions focus on improving visibility and competitive positioning in AI-driven results.
    • 22.22% of institutions say other initiatives currently take priority.
    • 14.81% of institutions are waiting to see how AI search evolves before acting.

    Tracking and Measuring Visibility in AI-Generated Search Results

    Awareness and monitoring of AI search visibility

    • 56.7% of institutions know their institution appears in AI-generated answers.
    • 26.7% of institutions have seen their institution referenced once or twice, but do not actively track it.
    • 13.3% of institutions are unsure whether they appear in AI-generated responses.
    • 64.29% of institutions that track AI visibility use dedicated tools or formal tracking methods.
    • 28.57% of institutions do not formally track their AI visibility.

    The above insights are based on the AI Search in Higher Education Snap Poll, conducted by UPCEA in October 2025. The poll surveyed 30 UPCEA member institutions to understand how colleges and universities are responding to AI-driven changes in student search behavior.

    Frequently Asked Questions About AI Search in Higher Education

    What is AI search, and how is it changing higher education discovery?

    AI search describes how people use AI-powered tools and summaries to find and compare information across many sources at once. Rather than navigating page by page, users increasingly rely on AI to surface key context and options early. In higher education, this behavior is already widely adopted, with 50% of prospective students using AI tools at least weekly and 79% reading AI-generated overviews when they appear. As a result, early impressions of programs are often formed before a student reaches a university website.

    Does AI search optimization replace traditional SEO for higher education marketing?

    No, AI search optimizations do not replace traditional SEO strategies. Rather, they build on them. AI-powered tools still rely on well-organized, relevant, and authoritative content to generate accurate summaries and recommendations. For higher education, that means strong technical foundations, clear program pages, and credible signals remain essential. AI search adds a new layer of visibility, but it only works effectively when the underlying SEO structure is sound.

    What risks do institutions face if they ignore AI search?

    Ignoring AI search increases the risk of being invisible or misrepresented during early research. Because AI-generated summaries often guide program awareness, institutions that do not appear may never enter a prospect’s consideration set. Research shows that while 56.7% of institutions believe they appear in AI-generated answers, many do not actively track that visibility, creating blind spots that can quietly undermine recruitment efforts. Awareness without measurement leaves exposure gaps.

    Can institutions influence what AI tools say about their programs?

    Yes, organizations can influence AI outputs by improving the clarity and consistency of the information AI systems reference. AI tools commonly draw from authoritative, well-structured content when generating summaries. For higher education institutions, this means program pages, admissions information, and outcome-based content play a direct role in how programs are described. Influence comes from strong content foundations rather than direct control.

    How should marketing teams prepare for continued changes in AI search?

    Marketing teams should approach AI search as an extension of modern discovery, not a separate channel. Preparation includes understanding how information is summarized, ensuring content is accurate and extractable, and monitoring visibility across AI-driven environments. Higher education teams that align content strategy with student research behavior are better positioned to adapt as AI search continues to evolve. The goal is sustained visibility, not one-time optimization.

    What This Means for Higher Education Marketing Teams

    Student behavior has moved faster than institutional strategy, creating visibility gaps at the earliest stages of discovery.

    AI-generated answers now play a meaningful role in which programs make it into a prospect’s consideration set, raising the stakes for how institutions appear in those environments. As this shift accelerates, accuracy, clarity, and consistency across owned content directly influence how programs are represented and trusted.

    To explore the full state of AI search in higher education, download AI Search in Higher Education: How Prospects Search in 2025 today

  • [Search Influence x UPCEA] Unpacking New Research on AI Search in Higher Education

    AI Search in Higher Education Series image for webinars and labs

    This blog post was updated by Ren Horst on November 4, 2025 following the webinar event.

    50% of prospective students use AI tools at least weekly to research information online. 79% read Google’s AI Overviews, and more than half say they’re more likely to trust the institutions AI cites.

    These search behaviors are no longer emerging trends. They’re the new reality for enrollment marketing.

    On October 23, UPCEA and Search Influence hosted the live webinar “AI Search in Higher Education: How Prospects Search in 2025,” unveiling findings from the AI Search Research Study.

    The session explored what’s shaping prospective student behavior today, plus how higher ed marketers can adapt their visibility strategies for AI-driven search.

    How AI Is Changing Institutional Visibility

    The new research highlights a fundamental shift in how students discover, evaluate, and ultimately choose higher education programs. Traditional search engines remain important, but AI-driven platforms are shaping decisions in ways that enrollment marketers can’t ignore.

    For higher ed institutions, the implications are clear: If you’re not present in AI-powered search experiences, you may be invisible to a significant portion of your prospective students.

    “AI Search in Higher Education: How Prospects Search in 2025” unpacked the study findings, explained why they matter, and showed how to position your institution for visibility in 2026 and beyond.

    Key Takeaways From the Webinar

    SEO then and now

    Search strategy has evolved from keyword targeting to context and credibility. AI engines understand meaning through entities, relationships, and trusted citations, not just keyword density. The webinar discussed how this shift changes on-page SEO priorities, emphasizing entity optimization, structured data, and semantically rich content to help AI engines interpret institutional expertise.

    Authority and content signals

    Institutional visibility depends on demonstrating trust through structure, accuracy, and reputation. AI platforms prioritize content that’s organized, verifiable, and supported by credible references. The webinar explored tactics for improving these signals, like earning links, highlighting faculty expertise, and securing third-party mentions that reinforce authority beyond your own website.

    Measuring AI visibility

    Understanding your reach in AI search is becoming possible through emerging analytics. Tools such as Google Analytics 4, Search Console, and Scrunch can reveal AI-driven traffic, question-based visibility, and citation frequency. The webinar covered how marketers are starting to quantify AI exposure, sharing practical ways to integrate early visibility tracking into institutional reporting.

    The Opportunity Ahead

    AI is fundamentally changing how students search and how institutions are seen. Yet, many schools haven’t updated their strategies to reflect how AI engines surface information.

    Those who act now will gain visibility in AI and Google, positioning their programs to be found, considered, and chosen.

    Turn AI into an opportunity, not a threat. Watch the full webinar replay and download the AI Search Research Study to understand what this shift means for your enrollment goals.

  • The July Influencer: Can AI Search Affect Your Paid Ads Strategy? Plus: AI SEO Tracking Tools

    Stay ahead in digital marketing with The Influencer, Search Influence’s monthly newsletter covering SEO, digital advertising, and content strategy. Get top tips to fuel your online growth, expert insights from the Search Influence team, and our latest company news. Don’t miss out!

    Graphic for How Will AI Search Affect Paid Ads? What Marketers Need to Know blog post

    How Will AI Search Affect Paid Ads? What Marketers Need to Know

    AUDREY REYNAUD | 11-MINUTE READ

    How will AI search affect paid ads? Get answers and ROI-boosting tactics from Search Influence. Act now to adapt, cut costs, and stay ahead in AI SERPs.

    Read More


    SI Essential AI SEO Tracking Tools

    The 8 Best AI SEO Tracking Tools: A Side-by-Side Comparison

    WILL SCOTT | 19-MINUTE READ

    AI search is changing everything about SEO. Learn about the best AI SEO tracking tools that turn data into strategy with Search Influence.

    Read More


    How to Secure Stakeholder Buy-In for an SEO Roadmap

    PAULA FRENCH | 18-MINUTE READ

    You know you need SEO. A roadmap will show you exactly how and why you need it. Now, with AI inside.

    Read More


    AI Search Interview on Search Engine Land showing Danny Goodwin and Will Scott

    Video Interview: Will Scott on AI content reuse, schema and semantics, and generative engine optimization

    WILL SCOTT | 27-MINUTE WATCH

    Hear the latest on AI SEO in a video interview with Will Scott by Search Engine Land editorial director Danny Goodwin.

    Read More