SEO Audit Report
Search Visibility
Analysis
thebelroyhotel.com.au
Audit Date
04 Apr 2026
Business Type
Pub / Bar / Bistro / Functions
Platform
Squarespace
Prepared by
Taussig · taussig.ai
SEO Health Score
37/100
Clinical Assessment
Good content.
Almost invisible to search.
The Belroy has well-written, venue-specific content and a clean technical foundation — but zero meta descriptions, zero canonical tags, zero Open Graph tags, and a LocalBusiness schema node that contains no usable data. On top of this, Squarespace's default robots.txt blocks all major AI crawlers including Google-Extended, removing the site from AI Overviews entirely. The remediation list is long but the highest-impact fixes require no code — just Squarespace settings and one schema block in Code Injection.
01
Weighted category scores
Category Score Weight Weighted Status
Content / E-E-A-T  42/100  23%  9.7  WARN
Technical SEO  62/100  22%  13.6  WARN
On-page Factors  18/100  20%  3.6  FAIL
Schema Markup  8/100  10%  0.8  FAIL
Core Web Vitals  28/100  10%  2.8  FAIL
AI Search Readiness  45/100  10%  4.5  FAIL
Image Optimisation  30/100  5%  1.5  FAIL
Total 37/100 100% 36.5 FAIL
02
Three fixes. Two hours of work.
Priority Queue (no code required)
i
Remove the AI crawler blocks from robots.txt.
Squarespace's default robots.txt blocks Google-Extended, GPTBot, ClaudeBot, anthropic-ai, Amazonbot, and 20+ other AI crawlers. Google-Extended specifically controls participation in AI Overviews (SGE). Blocking it removes the site from all AI-generated search summaries. This was almost certainly not an intentional decision — it's a Squarespace default. Fix in Squarespace: Settings > Advanced > External Services > uncheck "Block AI crawlers". One click, immediate effect.
ii
Add meta descriptions to every page via Squarespace page settings.
Not one of the six audited pages has a meta description. Google is auto-generating snippets from page content — typically less compelling and impossible to A/B test. Each page has a gear icon > SEO tab > Description field. Add a 150–165 character description with a CTA to each priority page: homepage, menu, functions, bar, contact, about. No code, 30 minutes total.
iii
Replace the empty LocalBusiness schema with the complete block in Section 05.
The existing JSON-LD schema contains a name field with no value, an address field set to an empty string, and a broken image URL. Google cannot extract any local business information from it. The replacement schema in Section 05 provides BarOrPub type, full NAP (77 Christie Street, St Leonards NSW 2065), phone, email, structured opening hours, geo coordinates, and social sameAs links. Paste into Squarespace: Settings > Advanced > Code Injection > Header. 20 minutes.
03
Critical
LocalBusiness schema contains no usable data
The JSON-LD LocalBusiness block on every page has name: (empty), address: "" (empty string), and an image URL with no filename or extension that resolves to a broken Squarespace CDN path. Google can detect the schema type but cannot extract name, address, phone, hours, or any entity data. This is the primary reason the site is not eligible for local pack results or a Knowledge Panel.
Fix: replace entirely with the BarOrPub schema in Section 05. Paste into Settings > Advanced > Code Injection > Header. Remove the existing broken block first.
Critical
Zero meta descriptions — site-wide
None of the six audited pages (homepage, about, menu, functions, contact, bar) have a meta description. Google auto-generates SERP snippets from page content in the absence of meta descriptions. Auto-generated snippets are typically less compelling, less keyword-relevant, and cannot include a call to action. For a hospitality venue where "book a table" or "find us" conversions start from the SERP, this is a direct revenue signal failure.
Fix: Squarespace page gear icon > SEO tab > Description. 150–165 characters per page with a CTA. Priority pages: homepage, menu, functions, bar.
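As a hypothetical example of the length and shape to aim for, a 159-character homepage description (illustrative copy, drawn from the venue details elsewhere in this report):

```text
St Leonards corner pub with terrace bar, bistro, sports bar, TAB and function spaces. Formerly the Gilroy Hotel. Book a table or enquire about your next event.
```

Paste into the Description field in the page's SEO tab. Keep each page's version unique and under 165 characters so Google does not truncate it.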
Critical
No canonical tags on any page
Not one of the audited pages outputs a canonical self-reference tag. Without canonicals, Google makes its own canonicalisation decisions across URL variants (with/without trailing slash, query parameters, campaign UTM variants). On Squarespace, where /home may render the same content as /, the absence of canonicals actively invites duplicate content treatment. Squarespace should output these by default — their absence suggests they may have been disabled in settings.
Fix: check Squarespace Settings > SEO > ensure "Canonical tags" is enabled. If the setting is on but tags are still missing, add a canonical link tag via Code Injection per page: <link rel="canonical" href="[page URL]">
Critical
No Open Graph tags on any page
og:title, og:description, og:image, og:url, og:type, og:locale — all absent on every page. Every link to the site shared on Facebook, LinkedIn, Instagram (bio link), or messaging apps will produce an unstyled, uncontrolled snippet. For a hospitality venue that relies on social referral and event promotion, uncontrolled link previews directly reduce click-through from shared content.
Fix: Squarespace has a Social Sharing section in page settings where an image and description can be set per page. For og:title and og:type, use Code Injection. Alternatively, enable Squarespace's built-in social sharing metadata in Settings > SEO > Social Sharing Image.
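A minimal sketch of the full tag set for the homepage, pasted via Code Injection > Header on that page. The title and description mirror recommendations made elsewhere in this report; the image value is a placeholder (a dedicated 1200×630 share image is preferable to the logo):

```html
<!-- Open Graph tags for the homepage (illustrative values) -->
<meta property="og:type" content="website">
<meta property="og:title" content="The Belroy Hotel — Pub, Bar &amp; Bistro in St Leonards">
<meta property="og:description" content="St Leonards corner pub with terrace bar, bistro, sports bar and function spaces.">
<meta property="og:url" content="https://www.thebelroyhotel.com.au/">
<meta property="og:image" content="[share image URL]">
<meta property="og:locale" content="en_AU">
</meta>
```

Repeat per page with page-specific title, description, and URL values.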
Critical
AI crawlers blocked in robots.txt (Squarespace default)
The robots.txt file disallows: anthropic-ai, ClaudeBot, GPTBot, Google-Extended, Amazonbot, CCBot, FacebookBot, and 20+ others. Google-Extended is the specific crawler that controls participation in Google AI Overviews (SGE). Blocking it means the site cannot appear in AI-generated search summaries for queries like "best pub in St Leonards" or "where to watch sport in St Leonards Sydney". This is Squarespace's default setting and is almost certainly not intentional.
Fix: Squarespace > Settings > Advanced > External Services > uncheck "Block AI crawlers". Alternatively, edit robots.txt directly in Settings > Advanced > SEO > Custom robots.txt and remove the specific Disallow entries for Google-Extended and GPTBot.
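If editing robots.txt directly, the entries to delete look like the following. The exact contents of the Squarespace default file vary, so treat this as an illustrative excerpt, not the literal file:

```text
# Delete these default blocks from Custom robots.txt:
User-agent: Google-Extended
Disallow: /

User-agent: GPTBot
Disallow: /
```

Leave any Disallow rules that were added intentionally (e.g. admin or cart paths) in place.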
High
Multiple H1 tags per page (3–13 per page) — Squarespace split-text issue
Every page has between 3 and 13 H1 elements because the Squarespace template renders decorative split-text as separate H1 nodes. The homepage has 8 H1s including "WELCOME TO", "The Belroy", "YOUR ST LEONARDS LOCAL", "TERRACE", "SPORTS". The functions page has 13. No page has a single semantically clear H1 that matches its primary keyword. Google treats the first H1 it encounters as the primary heading signal — reading "WELCOME TO" teaches it nothing about the page.
Fix: work with a Squarespace developer to change decorative split-text elements from H1 to styled paragraph or display tags. Each page should have exactly one H1 that matches its primary keyword: Homepage: "Pub, Bar & Bistro in St Leonards, Sydney" — Functions: "Function & Event Spaces at The Belroy Hotel" — Menu: "Bar Menu & Bistro Dining at The Belroy Hotel".
High
/home URL listed at priority 1.0 in sitemap alongside /
The sitemap.xml lists /home (priority 1.0, modified 04/01/2026) and the root URL / as separate entries. If /home renders the same content as /, Google sees two URLs competing for the same content. Without a canonical on /home pointing to /, either URL could be indexed as the primary root.
Fix: add a canonical tag to /home pointing to https://www.thebelroyhotel.com.au/ via Code Injection on that page. Or in Squarespace settings, check if /home is a real distinct page or a navigation alias — if the latter, remove the page URL alias.
High
21 tag/category filter pages in sitemap — thin pages consuming crawl budget
The sitemap includes 21 URLs under /whats-on/tag/ and /whats-on/category/ (e.g. /whats-on/tag/schnitzel, /whats-on/tag/nrl, /whats-on/category/events). These are auto-generated Squarespace filter pages with no standalone content — they simply list posts filtered by tag. Including them in the sitemap signals to Google that they are indexable content pages, which wastes crawl budget and invites Google to assess the site as carrying thin content.
Fix: in Squarespace Blog/Events settings, disable SEO indexing for tag and category pages (this sets them to noindex). Once they have dropped out of the index, optionally add custom robots.txt Disallow rules for /whats-on/tag/ and /whats-on/category/. Do not add the Disallow rules first: a robots.txt block prevents Google from recrawling the pages and seeing the noindex directive.
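Once the filter pages have dropped from the index, the optional robots.txt additions would look like this (illustrative; verify the paths against the live sitemap before deploying):

```text
User-agent: *
Disallow: /whats-on/tag/
Disallow: /whats-on/category/
```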
High
No image width/height attributes — CLS risk across all pages
No audited page outputs explicit width or height attributes on img elements. The browser cannot reserve layout space without declared dimensions, causing Cumulative Layout Shift (CLS) as images load and push content. CLS is a Core Web Vitals metric and a direct Page Experience ranking signal. The large hero/banner images on the homepage and functions pages are the highest-CLS-risk elements.
Fix: in Squarespace image block settings, explicitly set image dimensions. Squarespace does not always pass these through to the img tag, so a developer may need to inject width/height via JavaScript or use CSS aspect-ratio on image containers.
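One workaround is CSS aspect-ratio on the image elements, which reserves the layout slot before the image file arrives. The selector below is an assumption; match it to the template's actual image wrapper class before deploying:

```css
/* Reserve layout space for images before they load.
   Selector is illustrative; replace with the template's real wrapper class. */
.sqs-block-image img {
  aspect-ratio: 3 / 2; /* set per image group to the true ratio */
  width: 100%;
  height: auto;
}
```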
High
No LCP image preload — largest contentful paint unoptimised
No <link rel="preload" as="image"> tags are present. The hero images on the homepage and key landing pages are the likely LCP candidates but are not preloaded, meaning they are discovered late in the render waterfall. LCP is a Core Web Vitals metric.
Fix: identify the LCP image URL per priority page (use Chrome DevTools > Performance or PageSpeed Insights). Add a preload hint via Squarespace Code Injection > Header: <link rel="preload" as="image" href="[LCP image URL]" fetchpriority="high">.
High
No Twitter Card meta tags on any page
twitter:card, twitter:title, twitter:description, and twitter:image are all absent. Twitter/X previews for all shared URLs will be unstyled. Combined with the absence of OG tags, every social share of the site produces an uncontrolled, unbranded link preview.
Fix: Squarespace Code Injection > Header (global): add twitter:card, twitter:site, and per-page: twitter:title, twitter:description, twitter:image matching OG values.
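A sketch of the global and per-page tags. Values are illustrative and should mirror the OG tags; the venue's X/Twitter handle, if one exists, is unverified:

```html
<!-- Global (Code Injection > Header) -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:site" content="[verify X/Twitter handle]">

<!-- Per page, mirroring the OG values -->
<meta name="twitter:title" content="The Belroy Hotel — Pub, Bar &amp; Bistro in St Leonards">
<meta name="twitter:description" content="St Leonards corner pub with terrace bar, bistro, sports bar and function spaces.">
<meta name="twitter:image" content="[share image URL]">
```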
High
Homepage title lacks venue-type keyword
Homepage title: "The Belroy Hotel - Your St Leonards Local" (45 chars). This does not contain the words "pub", "bar", "bistro", or "restaurant" — the primary category keywords users search. "Hotel" is present but may not match search intent for a food and drink venue. Title should include the venue type to match keyword intent.
Fix (homepage title): "The Belroy Hotel — Pub, Bar & Bistro in St Leonards" (52 chars). Interior page pattern: "[Keyword] | The Belroy Hotel, St Leonards" e.g. "Function Spaces & Event Hire | The Belroy Hotel, St Leonards" (60 chars).
Medium
No reviews or ratings — E-E-A-T and conversion signal gap
No visible customer reviews, star ratings, or review schema appear on any page. For a hospitality venue, third-party validation is a primary trust signal for both users (conversion) and Google (E-E-A-T quality assessment). AggregateRating schema can pull from Google Business Profile or TripAdvisor review counts to display star ratings in search results.
Fix: source a Google Business Profile review aggregate (e.g. 4.2 stars, 340 reviews). Add AggregateRating schema to the LocalBusiness block. Add a "What our guests say" section to the homepage or about page with representative reviews and a link to Google Reviews.
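The fragment below, nested inside the LocalBusiness block from Section 05, is a sketch using the illustrative figures above. Replace them with the live Google Business Profile numbers, and check Google's current review snippet guidelines before deploying: markup for ratings collected on third-party platforms may be ignored or flagged.

```json
"aggregateRating": {
  "@type": "AggregateRating",
  "ratingValue": "4.2",
  "reviewCount": "340"
}
```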
Medium
Homepage lacks entity-defining opening statement
The first text on the homepage is "WELCOME TO" followed by split decorative headings before reaching indexable body copy. AI search models and Google's understanding algorithms prefer an entity-first opening that definitively names and classifies the business: "The Belroy Hotel is a pub and bistro at 77 Christie Street, St Leonards, Sydney, NSW." The current opening ("WELCOME TO The Belroy Your new local rising out of an old favourite...") is engaging prose but not an entity definition.
Fix: add a brief, fact-dense intro paragraph early in the homepage body: "The Belroy Hotel is a corner pub at 77 Christie Street, St Leonards NSW 2065 — formerly the Gilroy Hotel, reopened November 2023. Two levels of indoor and open-air bar and bistro spaces, TAB, live sport, weekly events and function hire for up to [X] guests."
Medium
loading="lazy" absent on all images
No image element on any audited page has an explicit loading attribute. Below-fold images will load immediately on page request, increasing initial page weight and delaying above-fold rendering. This is a Squarespace platform constraint but can be mitigated with JavaScript injection.
Fix: add a global Code Injection script to Squarespace footer: iterate all img elements not in the viewport and add loading="lazy". Alternatively, wait for a Squarespace platform update — this is a known platform gap they have been addressing incrementally.
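A sketch of that footer script, assuming it runs after the DOM is available. The decision logic is split into a small helper so it can be checked in isolation:

```javascript
// Sketch for Squarespace Code Injection > Footer (assumes it loads on every page).
// Adds loading="lazy" to images that start below the first viewport so the
// browser defers fetching them until the user scrolls near them.

// Pure helper: an image is safe to defer only if its top edge
// sits below the initial viewport.
function shouldLazyLoad(imageTop, viewportHeight) {
  return imageTop > viewportHeight;
}

// Browser-only wiring, guarded so the helper can be exercised anywhere.
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', function () {
    var viewportHeight = window.innerHeight;
    document.querySelectorAll('img:not([loading])').forEach(function (img) {
      if (shouldLazyLoad(img.getBoundingClientRect().top, viewportHeight)) {
        img.setAttribute('loading', 'lazy');
      }
    });
  });
}
```

Images already above the fold keep eager loading, so this does not delay the likely LCP image.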
Medium
sameAs social profile URLs absent from schema
The LocalBusiness schema block contains no sameAs array. Social profile URLs (Instagram, Facebook) are not linked to the entity in schema. Google uses sameAs links to build entity confidence and connect profile pages to the Knowledge Graph entry for the business.
Fix: add sameAs to the replacement schema block (provided in Section 05). Verify the exact Instagram and Facebook profile slugs before deploying.
Medium
All interior page titles use generic prefix pattern
Interior pages follow the pattern "[Page Name] — The Belroy Hotel - Your St Leonards Local". "About", "Menu", "Functions" as the leading keyword are weak — they describe the page type, not the search intent. "Menu" ranks for nothing. "Functions" is better but could lead with "Function Spaces & Event Hire" for keyword depth.
Fix: rewrite to keyword-first format. Examples: "Bar Menu & Bistro Dining — The Belroy Hotel, St Leonards" | "Function & Event Spaces for Hire — The Belroy Hotel, St Leonards" | "Sports Bar with Live TV — The Belroy Hotel, St Leonards, Sydney".
Low
WebSite schema image uses protocol-relative URL
The WebSite schema block uses //images.squarespace-cdn.com/... — a protocol-relative URL. While browsers resolve this correctly, schema validators and some parsers expect an absolute https:// URL.
Fix: change to https://images.squarespace-cdn.com/... in the schema replacement block.
Low
Some images missing alt text
Several images on audited pages have empty alt attributes. While some images have good alt text (e.g. "The Belroy Hotel - exterior", "chicken & leek pie with gravy"), decorative and food images on the menu and bar pages appear to lack descriptive alt text.
Fix: audit all images in Squarespace image block settings. Add keyword-relevant alt text to food/venue images. Retain empty alt only for purely decorative background elements.
Low
No font preload hints
No <link rel="preload" as="font"> tags found. Web fonts loaded late in render increase First Contentful Paint and can cause layout flash. The Squarespace template's primary web font is not preloaded.
Fix: identify the primary font family's WOFF2 URL from the page source (Network tab, filter by font). Add <link rel="preload" as="font" type="font/woff2" href="[URL]" crossorigin> in Code Injection > Header.
04
Title tag audit
Page Title Text Chars Status
Homepage The Belroy Hotel - Your St Leonards Local 45 SHORT — no venue-type keyword
About About — The Belroy Hotel - Your St Leonards Local 54 Generic prefix, weak keyword
Menu Menu — The Belroy Hotel - Your St Leonards Local 54 Generic prefix — "Menu" ranks for nothing
Functions Functions — The Belroy Hotel - Your St Leonards Local 56 Acceptable length, weak keyword
Contact Contact — The Belroy Hotel - Your St Leonards Local 56 Generic — should include address/suburb
Sports Bar Sports Bar — The Belroy Hotel - Your St Leonards Local 55 Best of the set — add "Sydney" or "NSW"
H1 count per page
Systemic H1 issue — Squarespace split-text template
Every page has between 3 and 13 H1 elements. This is caused by the template rendering decorative split-text components (e.g. "WELCOME TO" + "The Belroy" displayed across two lines as a visual effect) as separate H1 DOM nodes. This is a template architecture issue, not a content editing issue, and requires developer access to resolve properly.
Page H1 Count First H1 Text Status
Homepage  8  "WELCOME TO"  FAIL — 8 H1s, first is meaningless
About  5  "NEW PUB"  FAIL — 5 H1s, none describe the page
Menu  6  "EAT"  FAIL — 6 H1s, split decorative text
Functions  13  "FUNCTIONS"  FAIL — 13 H1s, worst offender
Contact  3  "COMING TO"  FAIL — 3 H1s
Sports Bar  6  "SPORTS BAR"  WARN — 6 H1s, first is acceptable
05
BarOrPub schema replacement
Replace both existing JSON-LD blocks via Squarespace Settings > Advanced > Code Injection > Header. Verify Instagram/Facebook slugs, geo coordinates, and opening hours before deploying. NAP sourced from the live contact page.
Verify before deploying
Opening hours (Mon–Sat 10:00–03:00, Sun 10:00–22:00) and address (77 Christie Street, St Leonards NSW 2065) sourced from the live contact page. Phone ((02) 9439 2213) sourced from the footer. Confirm all three are current before deploying schema.
{
  "@context": "https://schema.org",
  "@type": "BarOrPub",
  "@id": "https://www.thebelroyhotel.com.au/#organization",
  "name": "The Belroy Hotel",
  "alternateName": "Belroy Hotel",
  "description": "St Leonards corner pub with terrace bar, bistro, sports bar and function spaces. Formerly the Gilroy Hotel, reopened November 2023.",
  "url": "https://www.thebelroyhotel.com.au",
  "telephone": "+61294392213",
  "email": "info@thebelroyhotel.com.au",
  "logo": "https://images.squarespace-cdn.com/content/v1/6513beae14bb3152b962d387/31bd422f-9c9c-4818-b9f3-46dd2717c179/TheBelroy_StLeonards_Logo_Curved_Off-White.png",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "77 Christie Street",
    "addressLocality": "St Leonards",
    "addressRegion": "NSW",
    "postalCode": "2065",
    "addressCountry": "AU"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": -33.8291,
    "longitude": 151.1966
  },
  "hasMap": "https://maps.google.com/?q=77+Christie+Street+St+Leonards+NSW+2065",
  "openingHoursSpecification": [
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday"],
      "opens": "10:00",
      "closes": "03:00"
    },
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": ["Sunday"],
      "opens": "10:00",
      "closes": "22:00"
    }
  ],
  "servesCuisine": ["Australian", "Pub food", "Bistro"],
  "priceRange": "$$",
  "amenityFeature": [
    { "@type": "LocationFeatureSpecification", "name": "TAB", "value": true },
    { "@type": "LocationFeatureSpecification", "name": "Outdoor seating", "value": true },
    { "@type": "LocationFeatureSpecification", "name": "Function rooms", "value": true },
    { "@type": "LocationFeatureSpecification", "name": "Live sport screens", "value": true }
  ],
  "sameAs": [
    "https://www.instagram.com/thebelroyhotel",
    "https://www.facebook.com/thebelroyhotel"
  ]
}
06
Sequenced implementation plan
Phase Item Where Effort Priority
Day 1 Remove AI crawler blocks (Google-Extended, GPTBot, ClaudeBot, anthropic-ai) Squarespace: Settings > Advanced > External Services 10 min CRITICAL
Day 1 Replace LocalBusiness schema with complete BarOrPub block (Section 05) Squarespace: Settings > Advanced > Code Injection > Header 20 min CRITICAL
Day 1 Add meta descriptions to all 6 priority pages (150–165 chars each) Squarespace: Page gear icon > SEO tab > Description 30 min CRITICAL
Day 1 Enable canonical tags (check Squarespace SEO settings; verify output) Squarespace: Settings > SEO 15 min CRITICAL
Week 1 Add OG tags to all pages (title, description, image, url, type, locale) Squarespace: Page social settings + Code Injection per page 2 hrs CRITICAL
Week 1 Add Twitter Card meta tags (card, site, title, description, image) Squarespace: Code Injection 1 hr HIGH
Week 1 Add canonical to /home pointing to / Squarespace: Code Injection on /home page 10 min HIGH
Week 1 Noindex /whats-on/tag/* and /whats-on/category/* filter pages Squarespace: Blog/Events settings or custom robots.txt 30 min HIGH
Week 1 Add preload hint for hero/LCP image on homepage and functions page Squarespace: Code Injection > Header (per page) 1 hr HIGH
Month 1 Fix H1 hierarchy — change split-text elements to paragraph/display styles Squarespace developer mode / template CSS 3–4 hrs HIGH
Month 1 Rewrite title tags — keyword-first format, include venue type and suburb Squarespace: Page SEO settings 1 hr MEDIUM
Month 1 Add entity-definition paragraph to homepage (fact-dense, schema-aligned) Squarespace: page editor 30 min MEDIUM
Month 1 Add AggregateRating schema sourced from Google Business Profile review count Squarespace: Code Injection > Header 1 hr MEDIUM
Month 1 Add loading="lazy" to below-fold images via JS injection Squarespace: Code Injection > Footer (global script) 1 hr MEDIUM
Ongoing Audit and populate alt text on all images across all pages Squarespace: image block settings (per image) 2 hrs MEDIUM
Ongoing Add web font preload hints for primary typeface Squarespace: Code Injection > Header 30 min LOW
Ongoing Build press/awards section on About page; source TripAdvisor or GBP review link Page editor 2 hrs LOW
07
Squarespace platform constraints
Several fixes (meta descriptions, OG tags, schema, canonicals) can be made without developer access via Squarespace page settings and Code Injection. H1 hierarchy fixes and image dimension attributes require template/CSS access and a Squarespace developer. Some Core Web Vitals improvements (lazy loading, explicit image dimensions) are partially constrained by the platform and may require waiting for Squarespace platform updates or developer-mode workarounds.
Core Web Vitals — proxy scores only
Core Web Vitals scores in this report are proxy signals from HTML analysis. Real LCP, CLS, and INP values require a Google PageSpeed Insights run or Chrome User Experience Report (CrUX) data. Run a PageSpeed Insights audit for authoritative scores before actioning CWV-specific work. Squarespace sites typically score 40–65 on PageSpeed due to platform JS overhead.
Google Business Profile
NAP data in the replacement schema (77 Christie Street, St Leonards NSW 2065 / (02) 9439 2213 / info@thebelroyhotel.com.au) must match the Google Business Profile exactly — character for character including abbreviation style. Discrepancies between schema NAP and GBP NAP reduce local pack ranking confidence. Verify before deploying.
AI crawler blocking — confirm intent with site owner
The robots.txt AI crawler blocks are a Squarespace default that was almost certainly applied without the owner's knowledge. Before removing them, confirm this with the client. Some venue operators choose to block AI scrapers for content licensing reasons — if that is intentional, document it and accept the AI search visibility trade-off.