Activity
[Weekly contribution heatmap, Dec–Nov]

Memberships

Write & Rank Blog Post With AI

Private • 53 • $10/m

AI SEO Academy

Public • 1.1k • Free

AI Ranking (FREE)

Public • 2.6k • Free

4 contributions to AI SEO Academy
🆕 Black Friday Deal: 20% Off Moonlit For 3 Months
Moonlitplatform.com is running an exclusive promo! Use the coupon code JONTY20 to get a 20% discount on your first 3 months. Valid until December 3rd. Here's what that looks like in terms of credits (x3):
- Pro Plan: up to 6k free credits
- Team Plan: up to 18k free credits
- Business Plan: up to 60k free credits
That's a lot of extra app runs! A Moonlit app uses anywhere from 5 to 100 credits depending on its complexity. How to get the discount: just add the coupon code to the coupon field at checkout.
7 likes • 14 comments • New comment 16h ago
🆕 Black Friday Deal: 20% Off Moonlit For 3 Months
1 like • 2d
@Jonathan Boshoff Here are the two apps I demoed: Ultimate Title Tags & Met - This tool didn't work and used about 350 credits. EEAT Checker - This tool worked and used about 200 credits.
0 likes • 2d
@Jonathan Boshoff Thank you!
Why include your sitemap in your robots.txt file
I spoke with an experienced web developer yesterday who mentioned he didn't have his sitemap in his robots.txt file. I was more than a little surprised. A sitemap should be included in your robots.txt file because it helps search engines find and understand how to crawl and index your site's pages. This improves your site's crawlability and indexing, which leads to better search engine rankings.

Referencing the sitemap in your robots.txt file also allows search crawlers to find and identify it more reliably, and it reduces the workload for the search engine bots indexing your website. If the sitemap is not referenced in robots.txt, it will need to be submitted manually to each search engine directly. This usually takes 8-10 seconds a page, or you can do up to 500 pages via Google Search Console (more via Bing).

Organises content: using multiple sitemap links in a robots.txt file helps organise content AND ensures that search engines crawl and index the site more effectively. Most bot crawls usually only cover about 20% of your website's pages… we can index 100% of your pages every day for you if you like. According to Google, 95% of website pages get ZERO traffic. This tip will help ecommerce businesses and large website owners get more organic traffic.

If you want to increase or improve your website traffic, or want a free website audit and site traffic overview, DM me or comment below with your URL. https://digitalmarketinggroundworks.com/svs/?Source=JohnJBRussell #seotips #seoadvice #seoagency #websites #organictraffic #webtraffic #sales #ecommerce #retailers #googletips #searchengines
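For reference, pointing crawlers at a sitemap is a one-line directive in robots.txt. A minimal sketch of what that looks like (the domain and sitemap filenames are placeholders, not from the post):

```
# robots.txt — served at the site root, e.g. https://www.example.com/robots.txt
User-agent: *
Allow: /

# One or more Sitemap directives; multiple lines are allowed for large sites
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/blog-sitemap.xml
```

Note the Sitemap directive takes an absolute URL, and it is read by both Google and Bing.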
5 likes • 10 comments • New comment 18h ago
Why include your sitemap in your robots.txt file
1 like • 2d
You have it backwards. The correct approach is to add your sitemap to the robots.txt file. The robots.txt file is typically the first file bots check when visiting a website.
Introduce yourself here
Add a comment here to say hi and tell us a little about yourself! (If you're up for it)
58 likes • 426 comments • New comment 20h ago
Introduce yourself here
1 like • 7d
Greetings from Michigan, I'm Rory. Been doing SEO for nine years. Currently working at an agency helping brands grow in various industries. Looking to streamline my workflows and scale processes using AI. This community is legit and offers a heap of value. Cheers!
Keyword Auto Optimizer (With GSC Integration)
This tool helps you optimize pages for striking distance keywords in GSC! Just enter a URL and hit run. You'll get a list of optimization placements.

Try the GSC integration version: https://lnkd.in/ggYacFBW
Try the manual keyword version: https://lnkd.in/gTXwJ4XW

How it works:
1. Scrapes your existing published web page
2. Connects to Google Search Console and pulls the highest-performing queries from the last 90 days
3. Counts and scores the n-gram usage of your page text
4. Determines where and how to increase usage of unused and underoptimized queries
5. Outputs optimized text with increased keyword frequency (marked in bold)

Note: It's not perfect. It gets confused by abbreviations and industry-specific jargon, and it can recommend the same placement multiple times. Use your judgment when optimizing.

Does it work? I recently used this tactic to increase clicks to a page by 52%! It also got clicks from 30% more queries! I've used this tactic for years and recently automated it.

Why does it work? Optimizing for the queries your page already ranks for helps it gain more visibility and higher rankings, and helps it rank for new related queries. It also helps increase rankings for the other queries. Rising tides and such.

The Python this app uses is cool:
- Turns page text into n-grams
- Scores n-gram relevance to keywords using cosine similarity
- Filters out low-relevance n-gram/keyword pairs
- Feeds the most relevant pairs to Claude 3.5 Sonnet for placements

Full details here
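The post doesn't share the app's source, but the n-gram scoring step it describes might look roughly like the minimal Python sketch below. The TF-IDF vectors, the bigram window, and the 0.3 cutoff are illustrative assumptions, not the app's actual implementation:

```python
# Hypothetical sketch of the n-gram relevance scoring described above.
# Assumptions (not from the post): TF-IDF vectors stand in for whatever
# embedding the real app uses, and 0.3 is an arbitrary relevance cutoff.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def ngram_phrases(text: str, n: int = 2) -> list[str]:
    """Split page text into word n-grams (candidate placement phrases)."""
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]


def score_pairs(page_text: str, gsc_queries: list[str], cutoff: float = 0.3):
    """Score each (n-gram, query) pair with cosine similarity; keep relevant ones."""
    grams = sorted(set(ngram_phrases(page_text)))
    # Fit one shared TF-IDF space so n-gram and query vectors are comparable.
    vec = TfidfVectorizer().fit(grams + gsc_queries)
    sims = cosine_similarity(vec.transform(grams), vec.transform(gsc_queries))
    pairs = [
        (gram, query, float(sims[i, j]))
        for i, gram in enumerate(grams)
        for j, query in enumerate(gsc_queries)
        if sims[i, j] >= cutoff
    ]
    # Highest-similarity pairs first; per the post, these would then be fed
    # to Claude 3.5 Sonnet to propose specific placements in the page text.
    return sorted(pairs, key=lambda p: -p[2])


if __name__ == "__main__":
    text = "Our guide covers title tag optimization and meta description tips."
    queries = ["title tag optimization", "meta description length"]
    for gram, query, score in score_pairs(text, queries):
        print(f"{score:.3f}  {gram!r} -> {query!r}")
```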
5 likes • 17 comments • New comment Aug 24
1 like • Jul 14
Excellent work! This is helpful!
Rory Pq
Level 2 • 15 points to level up
@ro-pq-5492
Greetings! I'm Rory, an SEO specialist from Michigan.

Active 22h ago
Joined Jun 25, 2024
United States