Does anyone have a link that explains, in simple terms, how to integrate Google Structured Data (with examples)?
The various documentation (even Google's) goes into too much detail, and a newbie like me will just give up. ;-(
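In case it helps, the gentlest entry point is usually JSON-LD: a small script tag you paste into the page. Here is a minimal sketch that builds one with Python's standard library; the article details (headline, author, date) are made up and would be replaced with your page's real data:

```python
import json

# Hypothetical article data -- replace with your page's real details.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Beaches of my country",
    "datePublished": "2022-06-01",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# Wrap it in the script tag that goes into the page's <head> or <body>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(article, indent=2)
    + "\n</script>"
)
print(snippet)
```

You can paste the resulting tag into a page and validate it with Google's Rich Results Test before worrying about the more advanced markup types.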
Hello and thanks,
Also, Google is reporting a redirect error for the page
I have the home page
which is the French version and is currently indexed correctly. The English version of the home page is:
I can't see any redirect error when I access the link.
> Crawl
> Last crawl: Jun 1, 2022, 9:13:08 PM
> Crawled as: Googlebot desktop
> Crawl allowed?: Yes
> Page fetch: Failed: Redirect error
> Indexing allowed?: N/A
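One common cause of "Failed: Redirect error" is a chain that loops or is too long, even when a browser happens to land somewhere sensible. A small sketch of the check, using a hypothetical redirect map instead of live HTTP requests (URLs and the loop are invented for illustration):

```python
# Hypothetical redirect map: URL -> its Location target (None = final page).
REDIRECTS = {
    "https://mysite.com/en": "https://mysite.com/",
    "https://mysite.com/": "https://mysite.com/en",  # bounces back: a loop
    "https://mysite.com/fr/": None,
}

def follow(url, max_hops=10):
    """Follow redirects; return (final_url, hop_count) or raise on a loop."""
    seen = []
    while url is not None:
        if url in seen:
            raise ValueError("redirect loop: " + " -> ".join(seen + [url]))
        if len(seen) >= max_hops:
            raise ValueError("too many redirects")
        seen.append(url)
        url = REDIRECTS.get(url)
    return seen[-1], len(seen) - 1

print(follow("https://mysite.com/fr/"))  # resolves cleanly, zero hops
```

Running the same check from `https://mysite.com/en` would raise, which is roughly what Googlebot reports as a redirect error.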
Hi and thanks for your replies,
@binayjha, unfortunately the list in GSC is incomplete. I have a YouTube video, published since Oct. 2021, with a URL to a page on the website. This page has also been indexed by Google since that time, but it does not appear under GSC > Links > Top linking sites > youtube.com
@Gamerseo, do you mean the backlink checker by Ahrefs? If so, unfortunately it is a paid service.
A few of my pages are actually "Discovered - currently not indexed" in GSC. I have read about this status and also checked the pages on seobility.net, but I can't find what I am doing wrong. Any hint? Thanks!
I was wondering whether I could find my site's inbound links specified in YouTube videos' descriptions (when I don't know which videos contain these links).
I have googled: "mysite.com" site:youtube.com. Is this a reliable way? Are there better ways?
In Google Search Console, I'm getting: LCP issue: longer than 4s (mobile), status Poor, for 1,075 URLs. Further, I have read that LCP does not affect ranking.
On the other hand, I'm trying to improve TTFB, which actually is an SEO metric.
Can anyone confirm whether I've got this right?
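For reference, GSC buckets LCP by Google's published Core Web Vitals thresholds: good up to 2.5 s, "needs improvement" up to 4 s, poor above that. A tiny helper makes the mapping explicit:

```python
def lcp_status(seconds):
    """Classify an LCP value per Google's published Core Web Vitals thresholds."""
    if seconds <= 2.5:
        return "Good"
    if seconds <= 4.0:
        return "Needs improvement"
    return "Poor"

print(lcp_status(4.3))  # the "longer than 4s" bucket GSC flags as Poor
```

So "longer than 4s" is exactly the boundary of the Poor bucket being reported for those 1,075 URLs.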
Thanks, yes, I am in GSC. I just didn't notice the option to export to Excel. Thanks for pointing me to that.
However, it gives me a list of about 650 URLs, and I noticed that some of the URLs indexed in Google search results are not included in this list.
I was looking for a way to extract a list of URLs from Google, since I suspect that duplicate URLs have been indexed. I followed this article:
My issue: Search results for
site:mysite.com gave me only 3 pages of 100 results, but I know that I have more than 1,000 pages indexed. Anyway, if I search for some keywords found on a page not in this list, I will find it in the Google results. Any hint on how to get my 1,000+ pages listed?
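Rather than paging through site: results (which Google deliberately truncates), one way to hunt for duplicates is to normalize the URLs you export from GSC and group variants of the same page. A sketch with invented URLs, assuming the duplicates differ only by query string:

```python
from urllib.parse import urlsplit

# Hypothetical export of indexed URLs (e.g. from GSC's Excel export).
urls = [
    "https://mysite.com/thispath/beach",
    "https://mysite.com/thispath/beach?start=20",
    "https://mysite.com/thispath/beach?start=40",
    "https://mysite.com/thispath/sunset",
]

def canonical(url):
    """Strip the query string so paginated variants collapse to one page."""
    parts = urlsplit(url)
    return "{}://{}{}".format(parts.scheme, parts.netloc, parts.path)

groups = {}
for u in urls:
    groups.setdefault(canonical(u), []).append(u)

# Pages with more than one indexed variant are duplicate candidates.
dupes = {page: vs for page, vs in groups.items() if len(vs) > 1}
print(dupes)
```

Anything that shows up with several variants here is a candidate for a canonical tag or a parameter cleanup.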
It will definitely give your site that extra push. Additionally, it is a good thematic backlink. So the answer is yes, such a link will have a positive effect on SEO. But YouTube links are rather hard to get indexed.
I have a website about my country. I came across a vlogger who posts videos on YouTube almost every week, talking about my country. He told me he can add links to my website in the video descriptions. Will this help my SEO? I know it would if I published new videos myself, but here it's about videos that might be a few weeks to several years old...
I want to redirect:
but it should leave the following alone
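Since the exact URLs aren't shown here, a generic sketch of the logic: redirect everything matching one pattern while exempting specific paths. The paths and patterns below are entirely hypothetical; the same condition would translate into a RewriteCond/RewriteRule pair or your server's equivalent:

```python
import re

# Hypothetical rule: redirect everything under /old/ to /new/,
# except /old/keep-me, which should be left alone.
REDIRECT_RE = re.compile(r"^/old/(.*)$")
EXEMPT = {"/old/keep-me"}

def target(path):
    """Return the redirect target for `path`, or None to leave it alone."""
    if path in EXEMPT:
        return None
    m = REDIRECT_RE.match(path)
    return "/new/" + m.group(1) if m else None

print(target("/old/page"))     # redirected
print(target("/old/keep-me"))  # left alone
```

The key point is simply that the exemption check runs before the pattern match, so the "leave alone" URLs never enter the rewrite.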
For years I've been working on my URLs to avoid spaces. However, it's time-consuming when you have a lot of dynamic pages. I just stumbled upon this URL of a well-known website. Is Google more flexible about having spaces in URLs now?
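Worth noting that a space never travels "as a space": browsers and crawlers percent-encode it to %20 on the wire, and the round trip is lossless. Python's stdlib shows it (the path is made up):

```python
from urllib.parse import quote, unquote

path = "/photos/sunset at the beach"   # hypothetical page path with spaces
encoded = quote(path)                  # spaces become %20; "/" is preserved
print(encoded)
assert unquote(encoded) == path        # decodes back losslessly
```

So the question is less whether Google can handle spaces (the encoded form is a valid URL) and more whether you want %20s in your visible URLs instead of hyphens.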
5-20 photos (new pages) a day is perfectly fine, I wouldn't be worried about crawl budget at all in this case, even though your website is fairly new.
The linking situation: while it might be ambiguous, think about it. It's OK to link live anchors across the forum, e.g. a related thread or a thread that already answered someone's question. Yet live links to third-party sites can be potentially harmful (for both parties). Linking to a source that contains spam/malicious content is harmful for the forum; and often, when for example discussing penalties and grey/black-hat strategies, as an OP you don't want your site linked to directly (which Google could easily find out...). BUT we still need to leave room for users to link to highly reputable sources to back up their opinions, like studies, white papers, Google documentation, etc.
To add to this: whenever you insert a link here, it asks you for a link description. I know, yet most users just copy-paste the link without using the editor's link button.
If you have any suggestions how to make linking better/more effective around here, I'll be happy to discuss.
It depends on how many resources Google has for your site (crawl budget). If it's a new site and you add thousands of new photos every day, I would not expect them to crawl them all and refresh their index on a daily basis; just part of it at best. However, if you gain better ranking positions, CTR, etc., they eventually will.
Also, please, do not link-spam the board. We use code for URLs. Whenever someone uses nice anchor text, it's a red flag and your post may be deleted. So, just use links and sanitize them.
Hi and thanks,
It's a 10-week-old website. I'm not sure what to answer about the crawl budget. Coverage is 809 valid pages / 0 errors. I do add some 5-20 photos every day.
> Also, please, do not link-spam the board. We use code for URLs. Whenever someone uses nice anchor text, it's a red flag and your post may be deleted. So, just use links and sanitize them.
This might be ambiguous. So one should always use the code button to enter their URL? As far as I understand, a spam link is one which is not related to the post, or where someone deliberately posts a link that makes no sense at all being included in the message. To add to this: whenever you insert a link here, it asks you for a link description, i.e.
[enter link description here](http://example.com)
I have a page "Latest photos" which is updated every day. The page contains a list of photo entries, and each photo gets its own page/link.
I'm requesting reindexing of the page in Google Search Console every day. Will Google also index the individual links to each photo entry?
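Rather than requesting reindexing by hand every day, the usual route for a page like this is a sitemap that lists each photo page with a lastmod date, regenerated whenever photos are added. A minimal generator sketch with invented URLs:

```python
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical photo pages added today.
photo_pages = [
    "https://mysite.com/photos/sunset-1",
    "https://mysite.com/photos/beach-2",
]

def sitemap(urls, lastmod):
    """Build a minimal XML sitemap with one <url> entry per page."""
    entries = "\n".join(
        "  <url><loc>{}</loc><lastmod>{}</lastmod></url>".format(escape(u), lastmod)
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries
        + "\n</urlset>"
    )

xml = sitemap(photo_pages, date(2022, 6, 1).isoformat())
print(xml)
```

Submit the sitemap once in GSC and Googlebot can discover the individual photo pages from it, instead of relying on the listing page alone.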
I have the following URLs, which are indexed in GSC:
https://mysite.com/thispath/beach (page 1)
https://mysite.com/thispath/beach?start=20 (page 2)
https://mysite.com/thispath/beach?start=40 (page 3)
https://mysite.com/thispath/beach?start=60 (page 4)
and so on... Each page shows a list of 20 beach photos.
I also have others, like:
https://mysite.com/thispath/sunset (page 1)
https://mysite.com/thispath/sunset?start=20 (page 2)
https://mysite.com/thispath/sunset?start=40 (page 3)
https://mysite.com/thispath/sunset?start=60 (page 4)
I have consolidated all pages so that all photos will now show on
(the same URLs that used to be page 1 only) using lazy load. I do think it's a good idea for better SEO, right?
Can I use some wildcards to remove all URLs containing ?start=20, ?start=40, ?start=60, etc.? Or remove all URLs with "?" in them?
I only have the option to use a prefix in 'Removals' > 'Temporary removals'.
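Since the Temporary removals tool only matches by prefix, a `?start=` in the middle of the URL can't be targeted with one rule; you would need one prefix per path (e.g. `https://mysite.com/thispath/beach?start=`), or handle it via noindex/canonical instead. A small filter, over a hypothetical URL list, shows how to pull out exactly the paginated variants you'd want deindexed:

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical list of indexed URLs exported from GSC.
urls = [
    "https://mysite.com/thispath/beach",
    "https://mysite.com/thispath/beach?start=20",
    "https://mysite.com/thispath/sunset?start=40",
]

def is_paginated(url):
    """True if the URL carries a ?start= pagination parameter."""
    return "start" in parse_qs(urlsplit(url).query)

paginated = [u for u in urls if is_paginated(u)]
print(paginated)
```

Grouping that output by path also tells you how many per-path prefix removal requests you would need.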