Simple SEO improvements for this week
Most of my posts come from my daily job: prep for stuff I’m doing, planning on doing, or have done. In this case, the Coding Blocks Slack group led me down the path. We were talking about AdSense in the #rants channel, and I had commented on how I have never had a payout from AdSense. I was then informed by @swharden that I wasn’t actually displaying ads on my blog!
This of course led me down a rabbit hole of Google “stuff”: AdSense, Google Search Console, Analytics, and an SEO optimization resource.
The TL;DR of my AdSense problem: I had hit the “verification threshold” of $10 in revenue but not yet the “payment threshold” of $100, and the verification process involves snail-mailing a PIN to my residence… which happened to go to my old house… whoops! So now, hopefully within the next 3-4 weeks, I’ll have another PIN so that I can start making those sweet, sweet pennies every now and then! But in the interim, and seemingly for the past 8 or so months, I’ve been out of luck for any hopes of ad revenue /sad.
Anyway, that was the start of my journey pre-rabbit hole, and here are a few pretty quick updates I was able to make to get my site more SEO friendly.
At this point I had submitted for my new PIN, but the blog had been dusty and neglected for a while, so I figured I would take care of some things regarding its SEO. I ran my site through https://freetools.seobility.net/ and one of the pieces of advice was to not use H1s in the HTML (at least I think it was this site that stated that). So I followed the directions and changed all of the Markdown `# `s to `## `s. This was a pretty simple find-and-replace and took all of 5 minutes for the entirety of my posts.
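The find-and-replace can also be scripted. Here is a minimal sketch using sed; the `demo_posts` directory and file are stand-ins for illustration, since in practice you would point this at Hexo’s `source/_posts/*.md`:

```shell
# One-off migration sketch: demote a post-level H1 ("# ") to an H2 ("## ").
# demo_posts/post.md stands in for a real post; adjust the path for your blog.
# GNU sed syntax shown; on macOS the in-place flag is `sed -i '' ...`.
mkdir -p demo_posts
printf '# My Post Title\n\nSome body text.\n' > demo_posts/post.md
sed -i 's/^# /## /' demo_posts/*.md
head -n 1 demo_posts/post.md
```

Note this only rewrites headings at the start of a line; a post that contains `# ` inside a fenced code block would also be touched, so it is worth spot-checking the diff afterward.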
A sitemap! This was a term I hadn’t heard in years! I didn’t know sites actually still used these, but apparently they are quite useful for crawlers. I use the Hexo blog engine, which thankfully has an npm plugin for generating a sitemap as part of the static site generation process:
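The post doesn’t name the plugin here, but assuming the widely used `hexo-generator-sitemap` package, installation is a one-liner in the blog’s project directory:

```shell
# Install the sitemap generator as a project dependency
npm install hexo-generator-sitemap --save
```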
Then it was a matter of adding some configuration to my Hexo `_config.yml`:
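The exact snippet isn’t in this copy of the post; assuming `hexo-generator-sitemap` with its default options, the configuration is just a couple of lines (`path` controls the generated file name):

```yaml
# _config.yml
sitemap:
  path: sitemap.xml
```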
and then providing my sitemap.xml file to Google Search Console.
I don’t know whether this step actually helps SEO, but it was recommended to me nonetheless: https://support.google.com/adsense/answer/9889911?hl=en. I added another static file to my Hexo blog, which at a minimum seems to be a “best practice” for transparency reasons.
I don’t have much more to say about this; it was a quick seven-line JSON file saved as sellers.json, and it is now available on the site.
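The actual file isn’t reproduced in this copy of the post. For illustration, a minimal file using the field names from the IAB sellers.json 1.0 spec might look like the following; the email, seller ID, and domain are all placeholders, not real values:

```json
{
  "contact_email": "owner@example.com",
  "version": "1.0",
  "sellers": [
    {
      "seller_id": "pub-0000000000000000",
      "name": "Example Blog",
      "domain": "example.com",
      "seller_type": "PUBLISHER"
    }
  ]
}
```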
One additional thing I didn’t realize I needed was a robots.txt. This file, described here, is used to control which files on a site a crawler has access to, assuming the crawler follows the rules.
Initially, I had set up my robots.txt to have the following:
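The original snippet isn’t in this copy of the post, but based on the description that follows (allow everything, advertise the sitemap), a permissive robots.txt looks like this; the domain is a placeholder for your own:

```txt
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```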
The above file allows crawlers access to all pages of the site, and also shares the location of my sitemap, which lists every page on the site. This helps search engines like Google return more relevant search results when keywords from my posts are matched.
After reading a bit more through https://developers.google.com/search/docs/beginner/seo-starter-guide, I saw that it recommends your robots.txt disallow crawlers from crawling your site’s own “search pages”. The point of a search engine is to find actual content, not to find search pages within search pages, so this makes a lot of sense!
I have at least a few “search” type pages built into the static site that is my blog, namely:
We’ll add those to the robots.txt:
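The updated file isn’t in this copy of the post, and the specific pages on my blog aren’t listed above, so the paths below are only illustrative (common Hexo-generated index pages); substitute whatever “search” type pages your own site generates, and your own domain:

```txt
User-agent: *
Disallow: /tags/
Disallow: /categories/
Disallow: /archives/
Sitemap: https://example.com/sitemap.xml
```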
These were just a few of the low-hanging fruit out there regarding SEO and my blog. There are definitely more things to take care of, but this is what I had time for in some downtime this week!
Obviously one of the most beneficial things you can do for your SEO… is to actually post more content… I’m working on it, sometimes!
- Photo by “Merakist” on Unsplash.