Why are the pages of my site not indexed?
Hello friends. My site's pages are not indexed. Please, someone check my site; if needed, I will give access to the site and Search Console.
www.tahashir.com
What is your question?
Your website is indexed without www in the URL, since the www version of your URL is redirected to the non-www version. Please check:
https://www.google.com/search?q=site%3Atahashir.com
Please let me know if you are experiencing any other issue in this regard.
Articles and products are not indexed automatically.
Your site's pages are indexed; to see them, type "site:tahashir.com" into Google. If many pages are still not indexed, first find them and make a list. Then, to index those pages, send a request to Google from Google Search Console.
You can use the Test Live URL button and then request indexing. Google is taking time. You can also earn some backlinks to the respective URLs for faster inclusion.
I find it strange that you did not understand my problem, my dear friends. My problem is that the pages of my site are not indexed automatically; I must request indexing through Search Console, and this is illogical.
It looks like the website's sitemaps are not being updated with newly created URLs. I checked a blog page under a category URL, linked from the top menu; this page was not found in the sitemap. There is also a page at /blog/ that is not linked from any page on the website, yet it appears in both the page-sitemap and the post-sitemap. The generated sitemaps need to be overhauled.
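If you want to verify the sitemap problem yourself, below is a minimal sketch in Python that fetches a sitemap and reports whether specific URLs are listed in it. The sitemap filename and the test URLs are assumptions based on the pages mentioned above; substitute whatever your plugin actually generates.

```python
# A minimal sketch, assuming a Yoast-style post-sitemap.xml; swap in
# the sitemap file(s) your site actually generates.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://tahashir.com/post-sitemap.xml"  # assumed path
URLS_TO_CHECK = [
    "https://tahashir.com/blog/",              # page mentioned above
    "https://tahashir.com/some-new-article/",  # hypothetical new post
]

# XML namespace used by the sitemap protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

req = urllib.request.Request(SITEMAP_URL, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req) as resp:
    tree = ET.parse(resp)

# Note: a sitemap *index* lists child sitemaps under <sitemap><loc>;
# this sketch assumes a plain URL-set sitemap with <url><loc> entries.
listed = {loc.text.strip() for loc in tree.iterfind(".//sm:loc", NS) if loc.text}

for url in URLS_TO_CHECK:
    status = "listed" if url in listed else "MISSING"
    print(f"{status}: {url}")
```

If a URL prints MISSING, regenerate the sitemaps (or fix the plugin that builds them) before requesting indexing again.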
Product pages have very thin unique content. Add some unique content, such as product descriptions.
I found the problem: Google's algorithms cannot crawl all of the code, and you see these errors by mistake. But I don't know what the solution is.
You should contact your developer for these fixes.
Here are some steps you can take to troubleshoot and fix issues with Google indexing your website:
Check if your website is blocked by robots.txt: Make sure your website is not blocked by your robots.txt file, as this can prevent Google from crawling and indexing your pages (see the robots.txt sketch after this list).
Check for crawl errors: Use Google Search Console to check for crawl errors on your website. This will show you any issues Google encountered while trying to crawl your website, and you can fix them to allow Google to index your pages.
Check your website's loading speed: A slow-loading website can prevent Google from crawling and indexing your pages. Use tools like Google PageSpeed Insights to identify any issues and optimize your website's loading speed (see the PageSpeed Insights sketch after this list).
Check for duplication: Google may have trouble indexing your website if it finds duplicate content on your pages. Use tools like Copyscape to check for duplication and make sure your content is unique.
Check for malware: Google may not index your website if it detects malware on your pages. Use tools like Google Safe Browsing to check for malware and take the necessary steps to fix it (see the Safe Browsing sketch after this list).
Check for technical issues: There may be other technical issues preventing Google from indexing your website, such as incorrect redirects, incorrect use of noindex tags, or broken links. Use tools like Google Search Console to identify and fix these issues (see the noindex/redirect sketch after this list).
Submit a sitemap: If you have recently made changes to your website, you can use a sitemap to help Google discover and index your pages. Use the sitemap submission feature in Google Search Console to submit your sitemap.
Request re-indexing: If you have fixed any issues and your pages are still not being indexed, you can ask Google to re-index your website. Use the URL Inspection tool in Google Search Console (the replacement for the old "Fetch as Google" feature) and click Request Indexing.
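For the robots.txt step above, here is a minimal sketch using Python's standard library, assuming the default robots.txt location; test whichever pages are not being indexed.

```python
# A minimal sketch; the URLs below are assumptions, so test whichever
# pages are failing to be indexed.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://tahashir.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for url in ["https://tahashir.com/", "https://tahashir.com/blog/"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'allowed' if allowed else 'BLOCKED'} for Googlebot: {url}")
```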
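For the loading-speed step, here is a minimal sketch that queries the public PageSpeed Insights v5 API; the page URL and the mobile strategy are assumptions, and light usage works without an API key.

```python
# A minimal sketch against the public PageSpeed Insights v5 API.
import json
import urllib.parse
import urllib.request

PAGE = "https://tahashir.com/"  # assumed page to audit
api = (
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
    + urllib.parse.urlencode({"url": PAGE, "strategy": "mobile"})
)

with urllib.request.urlopen(api) as resp:
    data = json.load(resp)

# Lighthouse reports the performance category score on a 0-1 scale.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score for {PAGE}: {score * 100:.0f}/100")
```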
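For the malware step, here is a sketch of a lookup against the Google Safe Browsing v4 Lookup API; you need your own API key from Google Cloud, and the key placeholder, client id, and threat types shown are assumptions.

```python
# A sketch of the Safe Browsing v4 Lookup API; API_KEY is a
# placeholder, so create your own key in Google Cloud first.
import json
import urllib.request

API_KEY = "YOUR_API_KEY"
ENDPOINT = (
    "https://safebrowsing.googleapis.com/v4/threatMatches:find?key=" + API_KEY
)

payload = {
    "client": {"clientId": "indexing-check", "clientVersion": "1.0"},
    "threatInfo": {
        "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
        "platformTypes": ["ANY_PLATFORM"],
        "threatEntryTypes": ["URL"],
        "threatEntries": [{"url": "https://tahashir.com/"}],
    },
}

req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# An empty response object means no threats were matched.
print(result.get("matches") or "No Safe Browsing threats found.")
```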
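For the technical-issues step, here is a minimal sketch that follows redirects and checks for an X-Robots-Tag header or a meta robots noindex tag; the test URL is an assumption.

```python
# A minimal sketch: follow redirects and look for header- or
# meta-level noindex directives. The URL is an assumption.
import re
import urllib.request

URL = "https://www.tahashir.com/"  # the www variant that should redirect

req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req) as resp:
    final_url = resp.geturl()                    # URL after any redirects
    x_robots = resp.headers.get("X-Robots-Tag")  # header-level directive
    body = resp.read().decode("utf-8", errors="replace")

meta_noindex = bool(re.search(r"<meta[^>]+noindex", body, re.IGNORECASE))
print(f"Final URL after redirects: {final_url}")
print(f"X-Robots-Tag header: {x_robots}")
print(f"meta robots noindex present: {meta_noindex}")
```

If the final URL differs from the one you requested, or either noindex check reports True, that would explain why the page is excluded from the index.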
If you want to speed up the indexing process, get some backlinks.