Top Traffic Pages Finder
Use cases
A Jupyter notebook that connects to the Google Search Console API via OAuth and ranks pages by organic traffic.
Aggregates clicks, impressions, CTR, and average position at the page level.
Automatically categorises pages into traffic ranges (0, 1-100, 101-1000, 1001-5000, 5001-10000, 10001+) and identifies zero-traffic pages for review.
Exports a four-sheet Excel workbook with Plotly visualisations.
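The page-level aggregation described above could be sketched as follows. This is a minimal illustration, not the notebook's actual code: the column names and sample rows are hypothetical, assuming GSC data arrives as a pandas DataFrame with one row per page/query pair.

```python
import pandas as pd

# Hypothetical raw GSC export: one row per (page, query) pair.
raw = pd.DataFrame({
    "page": ["/a", "/a", "/b"],
    "clicks": [10, 5, 0],
    "impressions": [200, 100, 50],
    "position": [3.0, 6.0, 12.0],
})

# Aggregate to page level: sum clicks and impressions, recompute CTR,
# and take an impression-weighted average position.
raw["pos_x_impr"] = raw["position"] * raw["impressions"]
pages = raw.groupby("page", as_index=False).agg(
    clicks=("clicks", "sum"),
    impressions=("impressions", "sum"),
    pos_x_impr=("pos_x_impr", "sum"),
)
pages["ctr"] = pages["clicks"] / pages["impressions"]
pages["avg_position"] = pages["pos_x_impr"] / pages["impressions"]
pages = pages.drop(columns="pos_x_impr").sort_values("clicks", ascending=False)
print(pages)
```

Recomputing CTR and position after the groupby matters: averaging the per-query CTR or position directly would give low-impression queries the same weight as high-impression ones.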
Platform
Jupyter Notebook (requires Python environment)
Input
Google OAuth credentials
GSC property and date range
Output
Excel workbook with top pages and traffic analysis
Features
- OAuth authentication via google-searchconsole library
- Traffic range categorisation (0, 1-100, 101-1000, 1001-5000, 5001-10000, 10001+)
- Zero-traffic page isolation for review
- Keyword-to-page relationship mapping
- Four-sheet Excel export with xlsxwriter
- Plotly interactive traffic distribution charts
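The traffic-range buckets and zero-traffic isolation listed above could be implemented with `pd.cut`. A sketch under the assumption that clicks are already aggregated per page; the bin edges follow the ranges stated in this document.

```python
import pandas as pd

# Hypothetical per-page click totals.
pages = pd.DataFrame({
    "page": ["/a", "/b", "/c", "/d"],
    "clicks": [0, 42, 2500, 15000],
})

# Bucket pages into the traffic ranges used by the notebook.
# pd.cut uses right-closed intervals, so (-1, 0] captures zero-click pages.
bins = [-1, 0, 100, 1000, 5000, 10000, float("inf")]
labels = ["0", "1-100", "101-1000", "1001-5000", "5001-10000", "10001+"]
pages["traffic_range"] = pd.cut(pages["clicks"], bins=bins, labels=labels)

# Isolate zero-traffic pages for manual review.
zero_traffic = pages[pages["clicks"] == 0]
print(pages)
```

The same `traffic_range` column can then feed a Plotly bar chart of page counts per bucket.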
How to use
1. Set up OAuth credentials and authenticate
2. Enter your GSC property URL and country code
3. Select date range and search type
4. Run to aggregate page-level metrics
5. Review traffic distribution and zero-traffic pages
6. Download Excel workbook with four analysis sheets
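The final export step could look like the sketch below, using pandas with the xlsxwriter engine. The four sheet names and sample DataFrames are my assumptions about how the workbook is organised, not confirmed details of the tool.

```python
import io

import pandas as pd

# Hypothetical aggregated results (in the notebook these come from the GSC query).
pages = pd.DataFrame({"page": ["/a", "/b"], "clicks": [120, 0]})
ranges = pd.DataFrame({"traffic_range": ["0", "1-100"], "pages": [1, 1]})
zero = pages[pages["clicks"] == 0]
keywords = pd.DataFrame({"query": ["example"], "page": ["/a"], "clicks": [120]})

buffer = io.BytesIO()  # in the notebook this would be a file path instead
with pd.ExcelWriter(buffer, engine="xlsxwriter") as writer:
    pages.to_excel(writer, sheet_name="Top Pages", index=False)
    ranges.to_excel(writer, sheet_name="Traffic Ranges", index=False)
    zero.to_excel(writer, sheet_name="Zero Traffic", index=False)
    keywords.to_excel(writer, sheet_name="Keywords", index=False)
    sheet_names = list(writer.sheets)  # capture before the workbook closes

print(sheet_names)
```

Writing all four sheets inside one `ExcelWriter` context produces a single workbook; the file is finalised when the context manager exits.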
Want me to run this for you?
I offer this as a managed service. You get the insights without touching the tool.
Related Tools
GSC Coverage Visualiser
Search Console: Visualise indexing issues from Search Console coverage reports with interactive Plotly treemaps and sunbursts.
GSC Data Exporter
Search Console: Bulk download Search Console data beyond the 1,000 row limit with automatic batch processing.
GSC Question Finder
Search Console: Extract question-based keywords from Search Console using regex pattern matching.
Let's work together
Monthly retainers or one-off projects. No lengthy reports that sit in a drawer.
Let's Talk