Google Penguin w/ Meta Keywords
-
It's getting really hard to filter through all the Penguin articles flying around right now, so excuse me if this has already been addressed:
I know that Google no longer uses meta keywords as a ranking signal (VERY old news). But I'm wondering if they are starting to look at them as a bigger spam indicator, since Penguin targets over-optimization.
If yes, has anyone read a good article indicating so?
The reason I ask is because I have two websites, one is authoritative and the other… not so much. Recently my authoritative website has taken a dip in rankings, a significant dip. The non-authoritative one has increased in rankings… by a lot.
Now, the authoritative website pages that use meta-keywords seem to be the ones that are having issues… so it really has me wondering. Both websites compete with each other and are fairly similar in their offerings.
I should also mention that the meta-keywords were implemented a long time ago… before I took over the account.
Also important to note: I never purchase links and never use any spammy techniques. I am as white hat as it gets, which has me really puzzled as to why one site dropped so drastically.
-
Thanks for the link.
I have them on lots of pages and am not worried about them... but they are not "overstuffed".
-
It is pretty much covered in this article by Danny Sullivan.
Excerpt: So use the tag? Sure, if you want to take a chance that by overstuffing it, you’ll cause Bing to think you’re spamming. Be safe, be smart, save your time. Don’t use it.
-
Can anyone suggest one reason why Google would consider meta keywords a sin?
If nobody has a good reason, then I doubt that Google would penalize for them.
-
While I haven't seen any Penguin update that mentioned meta keywords specifically, there was a lot of talk about over-optimization, and meta keywords can certainly fall into that arena. In the old days, webmasters would stuff the meta keywords tag with tons of similar words. The tag has no upside at all and can potentially be a red flag to both Bing and Google if the keywords in it are not consistent with the page content.
All that said, I would look for other reasons why the site may have been consumed by the Penguin. But if your meta keywords are excessive and repetitious, I would definitely remove them.
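To make the "not consistent with page content" point above concrete, here is a minimal sketch of an audit you could run on your own pages. It is a hypothetical helper (not anything Google or Bing actually runs) that parses a page with Python's standard-library `html.parser` and flags any meta keyword that never appears in the page's visible text:

```python
from html.parser import HTMLParser

class MetaKeywordAuditor(HTMLParser):
    """Collect the meta keywords list and the visible text of an HTML page."""
    def __init__(self):
        super().__init__()
        self.keywords = []
        self.text = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "keywords":
            # Split the comma-separated keyword list, normalized to lowercase
            self.keywords = [k.strip().lower()
                             for k in a.get("content", "").split(",") if k.strip()]

    def handle_data(self, data):
        self.text.append(data.lower())

def stray_keywords(html):
    """Return meta keywords that never appear in the page's text content."""
    p = MetaKeywordAuditor()
    p.feed(html)
    body = " ".join(p.text)
    return [k for k in p.keywords if k not in body]

# Example page: two keywords are supported by the copy, one is not.
page = """<html><head>
<meta name="keywords" content="letting agents, glasgow flats, cheap insurance">
<title>Letting Agents in Glasgow</title></head>
<body><p>We are letting agents covering Glasgow flats and houses.</p></body></html>"""

print(stray_keywords(page))  # → ['cheap insurance']
```

Keywords the check flags are exactly the kind of inconsistency the answer above warns about; if the list is long, removing the tag entirely is the simpler fix.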