Martin Senko

@ms

Prague, Czech Republic

https://seoforum.com

Registered: 5 months ago
Reputation: 195
May 15, 2018
  • Martin Senko upvoted SEO Basics in 2018

    I think this is a good question even for those who are more experienced. We all know what worked in 2010, 2015, etc. But what are your most recent recommendations? From what I see, backlinks still do their work. PR, social media mentions, Google My Business. It all works - at least for us. If we add a good navigation structure, HTTPS, and mobile-friendly design, that's good enough to bring pages up for most keywords in general. However, when it comes to more competitive keywords, you need something more to get on the first page. Share what works for you.

  • Martin Senko replied to thread SEO Basics in 2018

    There is a LOT going on in SEO these days. It's more content-driven than ever, but if you know the right paths, you can boost your existing content as well.

    1. **AI / RankBrain** - takes into account not just how many searchers click on your page, but also how they behave, i.e. how long they stay on your page, bounce rate... Search results change accordingly.
    2. **Mobile** - everything should be mobile in 2018. Most searchers use their mobile devices (phones/tablets), so it's natural for Google to prioritize results with mobile users in mind (mobile-first, actually, as they became the _majority_).
    3. **Site speed** - with mobile users, your site has to be quick. They use mobile data and may have low signal quality... Your site has to be quick and efficient. So save them some bandwidth and load pages as fast as possible. Use lazy loading instead of loading all the bandwidth-consuming content at once.
    4. **Linkless mentions** - not a 2018 invention, but we should keep in mind more than ever that search engines track linkless mentions as well, so it's not all about traditional links. Bing has confirmed it, and Google very likely does it too.
    5. **Voice search** - adoption of voice search is on the rise.
    6. **Video and visual content** - don't miss the visual boat. Shoot videos, include visual content in your articles. It makes it easier for people to understand the topic, and therefore Google likes it.
    7. **AI everywhere** - SERP manipulation techniques will take a huge hit. Guest posting, backlink schemes, keyword stuffing - you name it. No, Google won't filter it all, because they can't, but they will get better and better at filtering.

    There is a lot more coming, but this is what came to mind when I saw the question. Good luck in 2018 - we will need it :)
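Point 3 mentions lazy loading. As a minimal sketch (assuming a simple post-processing step over your HTML, using a regex; a real pipeline should use a proper HTML parser), the native `loading="lazy"` attribute can be added to `<img>` tags that lack it:

```python
import re

def lazify_images(html: str) -> str:
    """Add loading="lazy" to <img> tags that don't already set a
    loading attribute, so below-the-fold images are fetched only
    when the user scrolls near them."""
    return re.sub(
        r"<img(?![^>]*\bloading=)([^>]*)>",
        r'<img loading="lazy"\1>',
        html,
    )

print(lazify_images('<img src="hero.jpg" alt="Hero">'))
# <img loading="lazy" src="hero.jpg" alt="Hero">
```

Images that already declare a `loading` attribute (e.g. `loading="eager"` for above-the-fold content) are left untouched by the negative lookahead.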

Apr 20, 2018
  • Martin Senko replied to thread Robots.txt and Sitemap Error

    No problem, glad to help @akhtar. You can accept the answer if it answered your question.

  • Martin Senko upvoted Robots.txt and Sitemap Error

    Google Search Console is indicating that my sitemap has been blocked by the robots.txt file. While testing it through the robots.txt testing tool, it says ALLOWED. Wondering why Google says the sitemap contains URLs which are blocked by robots.txt. How do I resolve this issue?

  • Martin Senko replied to thread Robots.txt and Sitemap Error

    Your robots.txt file looks good to me, even though `Allow` isn't defined in the original standard. Major crawlers do know this directive, though. See (from [http://tools.seochat.com/tools/robots-txt-validator](http://tools.seochat.com/tools/robots-txt-validator)):

    > The official standard does not include the Allow directive even though major crawlers (Google and Bing) support it. If both Disallow and Allow clauses apply to a URL, the most specific rule - the longest rule - applies. To be on the safe side, in order to be compatible with all robots, if one wants to allow single files inside an otherwise disallowed directory, it is necessary to place the Allow directive(s) first, followed by the Disallow. It is still nonstandard.

    I guess an empty robots.txt would work just fine, just like having no robots.txt at all.
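You can check how a rules file behaves with Python's standard-library `urllib.robotparser` (which, like simpler crawlers, honors the first matching rule, so the "Allow first" ordering quoted above matters). The robots.txt content and URLs here are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: /private/ is blocked, except one file that is
# explicitly allowed. Allow is placed before Disallow, as the quoted
# validator advises, for compatibility with order-sensitive parsers.
ROBOTS_TXT = """\
User-agent: *
Allow: /private/press-kit.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("*", "https://example.com/private/press-kit.html"))  # True
print(parser.can_fetch("*", "https://example.com/private/secret.html"))     # False
```

Running your sitemap URLs through a check like this is a quick way to see which ones a parser considers blocked.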

  • Martin Senko upvoted blog traffic and adsense issue

    I have been running a blog [http://www.aspmantra.com](http://www.aspmantra.com) for the last 4 years, but I'm not getting good traffic. I want to earn money, and AdSense is approved for this domain. For the last 3 or 4 months I have not been getting 100% of my earnings. I am only getting 2% of it; the rest is deducted as invalid traffic.

  • Martin Senko replied to thread blog traffic and adsense issue

    Are you using any traffic generation services? If your traffic is auto-generated by bots (high bounce rate, low time spent on site, zero interactions, etc.), that is not a good signal, and you should take some steps to make sure your AdSense account is not going to be banned.

  • Martin Senko replied to thread Robots.txt and Sitemap Error

    Post your robots.txt and the exact URLs you're getting errors for in your Search Console. Can't really tell without seeing your robots file.

Jan 14, 2018
  • Martin Senko upvoted Re: Let's Encrypt: Is it equally good for SEO as any other paid certificate?

    Simply use Let's Encrypt and never look back :P I use it for my affiliate websites, and they are doing pretty well in Google. There is no point overpaying for fancy certificates to make Google love you. Think users first. As long as your users can transport their requests to your server and vice versa securely, that's all you should care about. I think that's also what Google cares about: making their users secure in the first place. The whole point of Google pushing you to use an SSL certificate for your website is not about making it harder for you to rank. It's all about users and their security. That's why I think it's fair and we should accept it.

Dec 28, 2017
  • Martin Senko published thread Google increased length of snippets

    So, what do you think about the longer snippets in search results? I think it's pretty good, and this change could improve CTR. Not dramatically, but definitely in a positive way - for good content and useful sites in general. Have you seen any CTR growth lately?

Dec 17, 2017
  • Martin Senko replied to thread Duplicate keyword in URL

    There is no point implementing such a redundant URL structure. It's over-optimization, and it could be bad for your site, even though this isn't ordinary keyword stuffing. I'd suggest using simple URLs. Make them easy to read, short and descriptive. Do not overuse keywords just for the sake of optimization. Think visitors-first, just like Google does, and you'll be good. It's honestly a bad-looking URL structure, and I'd avoid it if possible.
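As an illustration of "simple, short and descriptive", here is a hypothetical slug builder (the function name and inputs are made up for this example) that joins URL parts while dropping repeated segments, so `some-product.com/some-product/some-product-blue` style repetition never gets generated:

```python
import re

def make_slug(*parts: str) -> str:
    """Join path parts into one lowercase, hyphen-separated slug,
    skipping any part whose slug already appeared earlier."""
    seen = []
    for part in parts:
        slug = re.sub(r"[^a-z0-9]+", "-", part.lower()).strip("-")
        if slug and slug not in seen:
            seen.append(slug)
    return "-".join(seen)

print(make_slug("Some Product", "Some Product", "Blue"))  # some-product-blue
```

The duplicate "Some Product" collapses into a single segment, which is both easier to read and free of the keyword repetition discussed above.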

Dec 7, 2017
  • Martin Senko replied to thread SEO Forum Feature Requests

    @martinkhan Sounds good to me. Having upvoted content one click away from the homepage could be good for quick navigation through your favourite content without digging deep in your profile. Right now, you can only find upvoted threads or replies by searching through your activity feed, which is definitely time-consuming and not very practical. So, thank you for the smart and useful suggestion. Putting it on my to-do list.

  • Martin Senko upvoted Re: SEO Forum Feature Requests

    How about having a list of upvoted threads and replies? I don't see a favourite or like feature that would let me revisit my favourite threads and replies, but I think a list of upvoted stuff could do that. Just an idea.

  • Martin Senko upvoted Duplicate keyword in URL

    Can duplicate keywords in URLs boost page importance for that keyword? Or is it potentially subject to penalisation? Think of some-product.com/some-product/some-product-{specific}, where the specific part would be a color, size, model, version or so. To me, it looks quite spammy, but I saw that on a site. They do well in G's SERP. I would rather not post their actual website for now, but you get what I mean.

  • Martin Senko upvoted Re: Let's Encrypt: Is it equally good for SEO as any other paid certificate?

    If you are asking because you are considering https over http, then **the answer is https**, regardless of the certificate. Read more details about [https vs. http](http://blog.searchmetrics.com/us/2015/03/03/https-vs-http-website-ssl-tls-encryption-ranking-seo-secure-connection/). If you're comparing any paid certificate vs. Let's Encrypt, the more important question is the acceptance of the Certification Authority (CA) that issued your certificate. As far as Google and Let's Encrypt go, these days you're good to go with it. And yes, it does an equally good job securing your website as virtually any other certificate, assuming the same parameters. If you are a bank or another financial (security-critical) organization, there are better options, though. However, I don't think that's the case.

  • Martin Senko replied to thread Let's Encrypt: Is it equally good for SEO as any other paid certificate?

    SEO Forum uses Let's Encrypt as well. It's just a pretty common solution these days, and it works for 99% of websites. I don't think there is anything wrong with an LE certificate, even if you are a bigger website. It's all about security first, not the name of the CA, and LE uses the same encryption, the same algorithms... so why pay for it? LE is a widely used certificate, so unless you need EV, which you likely do not, use LE and enjoy.

  • Martin Senko upvoted Does Google index hash URL segment?

    A hash in a URL means an anchor on that page. But if I use links like `something.com/page.html#section`, does Google index this link as a separate page, so the index would now contain both `something.com/page.html` and `something.com/page.html#section`? Obviously, where I see a problem with that is duplicate content, which could be a big issue for some sites (including mine). I just don't believe Google would be so stupid as to index it that way. Or at least, there must be a way to fix it (something like `rel=canonical`). I have no evidence that I was penalized for using #section, but it got me thinking lately...
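One reason duplicate content is unlikely here: the fragment is purely client-side and is never sent to the server, so crawlers generally work with the defragmented URL. Python's standard library shows the split directly (the URL is just an illustration):

```python
from urllib.parse import urldefrag

url = "https://something.com/page.html#section"
base, fragment = urldefrag(url)

print(base)      # https://something.com/page.html
print(fragment)  # section
```

A crawler normalizing links this way would index only `base`, so `page.html` and `page.html#section` collapse into one entry.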

  • Martin Senko upvoted SEO and analytics tools you use

    This is probably secret stuff that most people won't share, but if you don't mind sharing: What apps do you use to monitor and analyze your SEO, ranking positions, backlinks, social signals and traffic in general? No need to state the obvious - I know most, or maybe all, webmasters use Google Analytics. I am more interested in learning about new, maybe a bit more specific stuff like Mixpanel, if you know what I mean.

  • Martin Senko replied to thread SEO and analytics tools you use

    I try to keep it simple. Too many analytics tools leave me overloaded with data I can't really use, and it takes so much time to actually analyze all the data to make good use of it. This is my list of tools, sorted by priority:

    1. Google Analytics
    2. Google Tag Manager (for any website having more than one tag; if I need only Analytics, I don't use Tag Manager)
    3. Facebook Pixel (for remarketing)
    4. Other remarketing pixels
    5. Custom analytics tools for tracking conversions on third-party websites

    I have no experience with Mixpanel, but from what I've heard, it seems to be a really comprehensive tool for nearly anything one could think of. @martinkhan do you use Mixpanel? Do you recommend it? If so, why? What's best about it?

Dec 4, 2017
  • Martin Senko upvoted Hreflang on multiple domains

    I need some help and guidance regarding the **hreflang** tag. We are working on a website, say example.com. We are planning to launch the same website in German under a different domain - example.de. The content should remain the same on the .de website, and we don't want to get hit with a Google penalty for duplicate content. How can I use the hreflang tag on both of the websites? I am confused about using the hreflang tag on both websites. Do I need to add both tags on the two domains? I'd appreciate your thoughts. Thanks guys
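To answer the "both tags on two domains?" part: yes, hreflang annotations must be reciprocal, and each page lists the full set of alternates including itself. A small sketch (the domains and the helper function are hypothetical) that builds the tags to paste into the `<head>` of both sites:

```python
# Hypothetical two-domain setup: example.com (English), example.de (German).
ALTERNATES = {
    "en": "https://example.com/",
    "de": "https://example.de/",
}

def hreflang_tags(alternates: dict, x_default: str = None) -> str:
    """Build the <link rel="alternate"> tag set. The SAME full set,
    including a self-reference, goes into the <head> of every listed
    page -- that reciprocity is what makes hreflang valid."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    ]
    if x_default:
        # Optional fallback for users matching none of the languages.
        tags.append(f'<link rel="alternate" hreflang="x-default" href="{x_default}" />')
    return "\n".join(tags)

print(hreflang_tags(ALTERNATES, x_default="https://example.com/"))
```

With correct reciprocal hreflang, the two domains are treated as language alternates rather than duplicates.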

  • Martin Senko upvoted What does soft 404 error mean?

    I see this quite often in SEO discussions, blog posts, etc.:

    > Soft 404 error

    But what does that really mean? Is there a hard 404 too? How are these different?

  • Martin Senko replied to thread What does soft 404 error mean?

    Hi Lisa. The guys at Search Engine Journal published a nice article about 404 errors / error pages. I highly recommend reading it, so you better understand what they are, when you should use a 404 and, eventually, how to fix 404 errors. Check the full article here: [404 vs. soft 404 errors](https://www.searchenginejournal.com/fix-404-vs-soft-404-errors/) However, to answer your question: a soft 404 means the page is missing, but the server responded with 200 (even though it should be returning 404 or another 4xx HTTP response code). Google, with all its data, knows the common patterns of 404 pages, so even after getting a 200, they may flag the page in their index as missing (404). What you should do is: return the 404 code AND show a 404 page when the page is missing. Otherwise, you're telling the user/search engine that the requested page exists.
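A rough sketch of the same idea from a crawler's point of view: flag responses where the status code says 200 OK but the body looks like an error page. The phrase list is an illustrative assumption, not Google's actual heuristic:

```python
# Hypothetical patterns a "missing page" body might contain.
NOT_FOUND_PHRASES = ("page not found", "does not exist", "no longer available")

def looks_like_soft_404(status_code: int, body: str) -> bool:
    """True when the server claims success (200) but the content
    reads like an error page -- the soft 404 pattern."""
    if status_code != 200:
        return False  # a real 4xx/5xx response is not a *soft* 404
    text = body.lower()
    return any(phrase in text for phrase in NOT_FOUND_PHRASES)

print(looks_like_soft_404(200, "<h1>Page not found</h1>"))  # True
print(looks_like_soft_404(404, "<h1>Page not found</h1>"))  # False
```

The second call returns False because a genuine 404 status plus an error page is exactly the correct behavior.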

Dec 2, 2017
  • Martin Senko upvoted How different is using subdomain vs. new domain in terms of SEO?

    I have read many opinions that search engines don't pass nearly any of the main domain's authority to a subdomain, because it's basically a different domain. Some people say that a subdomain does receive some of its main domain's authority. I have no real experience with this, but I think it makes sense that a subdomain is much closer to the main domain and therefore should have at least some advantage over a newly registered domain. What do you think?

  • Martin Senko published thread Site Issues and Errors

    SEO Forum has been developed by people. People make mistakes. Therefore, SEO Forum surely has some bugs and may occasionally throw an unexpected error or behave in a way that is not intended. The best thing you can do in such a scenario:

    1. Navigate to this thread (Site Issues and Errors)
    2. Post the issue you experienced here.

    OR (if you want it to be more of a discussion than a plain issue report):

    1. Create a new thread
    2. Tag it `Meta` & `Site Issues and Errors`
    3. Post the issue in the separate thread.

    We will do our best to fix it ASAP. Thank you for reporting!

  • Martin Senko published thread SEO Forum Feature Requests

    Once again, welcome to SEO Forum! It's a pleasure for me to welcome you here, in our own tiny piece of the internet where we meet to brainstorm and exchange our experience with nearly anything, but mostly **internet marketing**, **search engine optimization** and related topics. To make sure you enjoy your stay, this is the thread for you - the people with ideas. Feel free to suggest new features, changes and edits to the site, so we can grow together and make this an even better and nicer place to be. I will personally try to go through all of your requests and prioritize them according to your upvotes. Don't worry about posting anything. Really.