Recent Articles

Halloween Inspired SEO Tricks to Keep Spiders at Bay

A Jack o'lantern

Over the years, I’ve made a habit of prodigiously extolling search engine best practices, as opposed to taking shortcuts designed to trick search engines into believing that an online destination is something it is not. This Halloween, I’ve decided to produce an essay antithetical to my digital morals and beliefs by way of parody, and embrace the dark side of search engine spoofs.

Fear of spiders?

Not a problem. There are many ways to keep unwanted arachnids from crawling through your content.

For starters, why not avoid using visible text on your website? Embed as much content as possible in images and pictures. Better yet, make your site into one big splash page that appears to scroll to infinity and beyond.

Also, make certain that the imagery loads as slowly as possible. Consider yourself lucky that you will be able to streamline your web metrics around paid search campaigns and not worry about those pesky organic referral terms [not provided] by Google anymore. Keeping spiders out of your content is your first step toward complete search engine invisibility.

If your site is inherently text heavy, consider using dropdown and/or pop-up boxes for navigation. Configure these with Flash or have them require JavaScript actions to function. If possible, put the rest of your web content exclusively in frames. Designing a website in such a manner is another great way to keep all those bad robots out of your site.

When it comes to URL structures, try to include as many ampersands, plus signs, equal signs, percentage signs, session IDs, and other dynamic parameters as possible in a multifaceted, splendidly deep file structure. That way, your website will be made up of really long URL strings that can confuse humans and spiders alike. Even better, add filters and internal site search functionality, metrics tags, and other superfluous attributes to your URLs, just to keep the search engines guessing about your site structure. Get ready to burn your site’s crawl equity to the ground, while watching your bandwidth spend soar, when you wrap your site up like a mummy with this navigational scheme.

If you really want to turn your website into a graveyard for search engine spiders, consider using completely unnecessary redirects on as many different URLs as possible, taking multiple hops along the way. Combine permanent and temporary redirects with soft 404 errors that can keep your content alive in search indices forever. Make certain to build in canonical tag conflicts, XML sitemap errors, perpetual calendars and such, reveling in the knowledge that you will never have to waste precious development time fixing broken links again!

Content creation budget got you down? Build in new economic efficiencies by using the exact same content across as many domains as your budget can spawn. Invest in machine-generated content instead of having to listen to those troublesome user reviews. Make “Spamglish” the official language of your website. Since you don’t have to worry about looking at what keywords Google allows to send traffic to your Frankensite, feel free to target irrelevant keywords on as many pages as possible.

Additionally, try to keep all the title tags exactly the same on the critically important pages within your site. Spiders don’t have good eyesight – sometimes you have to shout to get their attention. Consider keyword stuffing as a way to make certain that the search engines understand precisely what your site is all about. If you don’t have room to stuff unnecessary contextual redundancies into your web content, consider using hidden text that flickers like a ghost when users mouse over what looks like dead space.

Still not convinced you can hide your site from the search engines this Halloween?

Break out the big tricks, my friends, because we’ve got some link building treats to share.

If your ultimate goal is to bury a domain name for all eternity, make certain that you participate in as many link farming free-for-all sites as possible. When you get a chance to do so, go ahead and “splog” others’ guest books and forums. In addition to buying site-wide text links, demand that your backlinks be placed in the footers. While you’re at it, sell a similar “service” to others.

Consider hiding some links in places that surprise visitors, and always embrace bad linking neighborhoods. You know the type of sites I’m talking about… they’re the spooky ones that any sensible business person, paranormal leanings or not, would avoid.

Have a wonderful Halloween this year, secure in the knowledge that you too can make a website completely disappear!


Disclaimer: I don't actually recommend that you try any of the above; everything in this particular column should be taken with a serious dose of tongue-in-cheek.


Original Article Post by P.J. Fusco @ Search Engine Watch

The Value of Referrer Data in Link Building

referrer-links

Before we get into this article, let me first state: link building is not dead. There are a lot of opinions floating around the web on both sides; this is just mine. Google has shut down link networks, and Matt Cutts continues to make videos on what types of guest blogging are OK. If links were dead, would Google really put in this effort? Would anyone get an “unnatural links” warning?

The fact is, links matter. What has died are the links that are easy to manipulate. Some may say link building is dead, but what they mean is, “The easy links that I know how to build are dead.”

What does this mean for those of us who still want high rankings and know we need links to get them?  Simply, buckle up, because you have to take off your gaming hat and put on your marketing cap.  You have to understand people and you have to know how to work with them, either directly or indirectly.

I could write a book on what this means for link building as a whole, but this isn't a book, so I'll try to keep focused.  In this article, we're going to focus on one kind of link building and one source of high quality link information that typically goes unnoticed: referrer data.

I should make one note before we launch in: I'm going to use the term "referrer data" loosely, to provide additional value. We'll get into that shortly, but first, let's see how referrer data helps and how to use it.

The Value Of Referrer Data

Those of you who have ignored your analytics can stop reading now and start over with “A Guide To Getting Started With Analytics.”  Bookmark this article and maybe come back to it in a few weeks.  Those of you who do use your analytics on at least a semi-regular basis and are interested in links can come along while we dig in.

The question is, why is referrer data useful?  Let's think about what Google's been telling us about valuable links: they are those that you would build if there were no engines.  So where are we going to find the links we'd be happy about if there were no engines?  Why, in our traffic, of course.

Apart from the fact that traffic is probably one of the best indicators, if not the best indicator, of the quality and relevancy of a link to your site, your traffic data can also help you find the links you didn't know you had and what you did to get them. Let's start there.

Referrers To Your Site

Every situation is a bit different (OK – sometimes more than a bit) so I'm going to have to focus on general principles here and keep it simple. 

When you look to your referrer data, you're looking for a few simple signals.  Here's what you're looking for and why:
  1. Which sites are directing traffic to you?  Discovering which sites are directing traffic to you can give you a better idea of the types of sites you should be looking for links from (i.e. others that are likely to link to you, as well). You may also find types of sites you didn't expect driving traffic. This happens a lot in the SEO realm, but obviously can also happen in other niches.  Here, you can often find not only opportunities, but relevancies you might not have predicted.
  2. What are they linking to? The best link building generates links you don't have to actively build. The next best are those that drive traffic. We want to know both. In looking through your referrer data, you can find the pages and information that appeal to other website owners and their visitors. This will tell you who is linking to you and give you ideas on the types of content to focus on creating (a short scripting sketch after this list shows one way to pull these signals out of an export). There's also nothing stopping you from contacting the owner of the site that sent the initial link and informing them of an updated copy (if applicable) or other content you've created since that they might also be interested in.
  3. Who are they influential with? If you know a site is sending you traffic, logically you can assume the people who visit that site (or the specific sub-section, in the case of news-type sites) are also interested in your content, or at least more likely to be interested than an audience built through standard link prospecting. Mining the followers of that publisher for social connections to get your content in front of them can increase your success rate in link strategies ranging from guest blogging to pushing your content out via Facebook paid advertising. Admittedly, this third use of referrer data is more akin to refining a standard link list, but it's likely a different audience than you would otherwise have encountered (and one with a higher-than-standard success rate for link acquisition or other actions).
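To make the first two signals concrete, here is a minimal sketch, assuming a referral-traffic export saved as a hypothetical referrers.csv with referrer_url, landing_page, and sessions columns (the file name and column layout are placeholders; adjust them to whatever your analytics package actually exports):

```typescript
import { readFileSync } from "fs";

// Parse a simple comma-separated export, skipping the header row.
const rows = readFileSync("referrers.csv", "utf8")
  .trim()
  .split("\n")
  .slice(1)
  .map(line => line.split(","));

const byDomain = new Map<string, number>();  // signal 1: which sites send traffic
const byLanding = new Map<string, number>(); // signal 2: what they link to

for (const [referrer, landingPage, sessions] of rows) {
  const domain = new URL(referrer).hostname;
  const visits = Number(sessions) || 0;
  byDomain.set(domain, (byDomain.get(domain) ?? 0) + visits);
  byLanding.set(landingPage, (byLanding.get(landingPage) ?? 0) + visits);
}

// Sort each tally and keep the top entries for review.
const top = (counts: Map<string, number>) =>
  [...counts.entries()].sort((a, b) => b[1] - a[1]).slice(0, 10);

console.log("Top referring domains:", top(byDomain));
console.log("Most-referred landing pages:", top(byLanding));
```

Nothing here replaces human review; the output is simply a shortlist of domains and pages worth a closer look.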
As I noted above, I plan to use the term referrer data loosely.  As if point three wasn't loose enough, we're going to quickly cover a strategy that ties nicely with this: your competitor's referrer data.

Competitor Data

You probably can't call up a competitor and ask them for their traffic referrer data (if you can, I wish I was in your sector).  For the rest of us, I highly recommend pulling backlink referrer data for your competitors using one of the many great tools out there.  I tend to use Moz Open Site Explorer and Majestic SEO personally, but there are others.

What I'm interested in here are the competitor URLs that other sites link to. While the homepage can yield interesting information, it can often be onerous to weed through, so I generally relegate it to separate link time frames.

Generally, I will put together a list of the URLs linked to, then review these as well as the pages linking to them. This helps give us an idea of potential domains to target for links, but more importantly, it tells us the types of relevant content that others are linking to.
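As a rough sketch of that workflow, assuming you've exported a competitor's backlinks to a hypothetical backlinks.csv with source_url and target_url columns (most backlink tools can export something similar):

```typescript
import { readFileSync } from "fs";

// Count how many referring pages point at each competitor URL.
const linkCounts = new Map<string, number>();

for (const line of readFileSync("backlinks.csv", "utf8").trim().split("\n").slice(1)) {
  const [, targetUrl] = line.split(",");
  linkCounts.set(targetUrl, (linkCounts.get(targetUrl) ?? 0) + 1);
}

// The most-linked-to pages hint at which content formats are earning links.
const mostLinked = [...linkCounts.entries()]
  .sort((a, b) => b[1] - a[1])
  .slice(0, 20);

console.log(mostLinked);
```

Reviewing the top of that list alongside the pages linking to it is usually enough to spot the content themes worth emulating.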

If we combine this information with the data collected above when mining our own referrer data, we are left with more domains to seek links on and broader ideas for content creation. You'll probably also find other ways the content is being linked to. Do they make top lists? Are they producing videos or whitepapers that are garnering links from authority sites? All of this information meshes together to make the effort you put into your own referrer mining more effective, allowing you to produce a higher number of links per hour than you could from your own data alone.

Is This It?

No. While mining your referrer data can be a great source of information regarding the types of links you have and should be seeking more of, it's limited to the links and traffic sources you already have. It's a lot like looking to your analytics for keyword ideas (prior to (not provided), at least). It can only tell you what's working among what you already have.


A diversified link profile is the key to a healthy long term strategy.  This is just one method you can use to help find what works now and keep those link acquisition rates up while exploring new techniques.


Original Article Post by Dave Davies @ Search Engine Watch

Matt Cutts on SEO, PageRank, Spam & the Future of Google Search at Pubcon Las Vegas

pubcon-keynote

Matt Cutts kicked off day two at Pubcon with another of his signature keynotes, dispelling myths, talking about spammers, and, at the urging of the audience, addressing Jason Calacanis’ keynote from day one.

First, Cutts spoke about Google’s “Moonshot changes,” which he broke down into these areas:
  • Knowledge Graph
  • Voice search
  • Conversational search
  • Google Now
  • Deep Learning
He explained that Deep Learning is the ability to make relationships between words and apply them to other words, and described how it can help improve search and handle the nuances of search queries.

Deep Learning in Regular and Voice Search

He explained that voice search is changing the types of search queries people use, but also that it can be refined without repeating previous parts of the user’s search queries. It does this when it knows the user is still searching the same topic, but drilling down to more specifics.

Cutts shared an example in which he searched for the weather and continued refining the query without having to keep retyping “What is the weather?” because Google recognizes that the user is still refining the previous search. Asking “Will it rain tomorrow?” in voice search brings up weather results for his location, Las Vegas, Nevada. When he follows with “What about in Mountain View?” Google shows the weather for Mountain View, knowing it is a refined voice query. Asking “How about this weekend?” then shows Saturday’s weather for Mountain View.

Hummingbird, Panda & Authorship

Next up, Cutts spoke about Hummingbird, saying he feels that a lot of the blog posts about how to rank with Hummingbird are not that relevant. The fact is, Hummingbird was out for a month and no one noticed. Hummingbird is primarily a core quality change. It doesn’t impact SEO that much, he said, despite the many blog posts claiming otherwise.

Of most interest to some SEOs is that Google is looking at softening Panda. Sites caught in the grey areas of Panda, if they are quality sites, could start ranking again.

Google is also looking at boosting authority through authorship. We have seen authorship becoming more and more important when it comes to search results and visibility in those results; Cutts confirmed this is the direction in which Google will continue to move.

Google on Mobile Search Results

Next, he discussed the role of smartphones and their impact on search results. This is definitely an area SEOs need to continue to focus on, as it is clear that sites that are not mobile-friendly will see a negative impact on their rankings in the mobile search results.

Smartphone ranking will take several things into account, he explained:
  • If your phone doesn’t display Flash, Google will not show Flash sites in your results.
  • If your website is Flash heavy, you need to consider its use, or ensure the mobile version of your site does not use it.
  • If your website routes all mobile traffic to the homepage rather than the internal page the user was attempting to visit, it will be ranked lower.
  • If your site is slow on mobile phones, Google is less likely to rank it.
Cutts was pretty clear that with the significant increase in mobile traffic, not having a mobile-friendly site will seriously impact the amount of mobile traffic Google will send you. Webmasters should begin prioritizing their mobile strategy immediately.

Penguin, Google’s Spam Strategy & Native Advertising

Matt next talked about their spam strategy. When they originally launched Penguin and the blackhat webmaster forums had spammers bragging how they weren’t touched by Penguin, the webspam team’s response was, “Ok, well we can turn that dial higher.” They upped the impact it had on search results. Cutts said that when spammers are posting about wanting to do him bodily harm, he knows his spam team is doing their job well.

He did say they are continuing their work on some specific keywords that tend to be very spammy, including “payday loans,” “car insurance,” “mesothelioma,” and some porn keywords. Because they are highly profitable keywords, they attract the spammers, so they are working on keeping those specific areas as spam-free as possible through their algorithms.

He discussed advertorials and native advertising and how Google is continuing to penalize those who use them without proper disclaimers showing that the content is paid advertising. Google has taken action on several dozen newspapers in the US and UK that were not labeling advertorials and native advertising as such, and that were passing PageRank. He did say there is nothing wrong with advertorials and native advertising if they are disclosed as such; it’s only when they are not disclosed that Google will take action.

Spam networks are still on Google’s radar and they are still bringing them down and taking action against them.

Bad News for PageRank Fans

For PageRank devotees, there is some bad news. PageRank is updated internally within Google on a daily basis, and every three months or so that information would be pushed out to the Google toolbar so it would be visible to webmasters. Unfortunately, the pipeline used to push the data to the toolbar broke, and Google does not have anyone working on fixing it. As a result, Cutts said we shouldn’t expect to see any PageRank updates anytime soon, certainly not this year. He doesn’t know if they will fix it, but they are going to judge the impact of not updating it. The speculation that PageRank could be retired is not that far off from the truth, as it currently stands.

Communication with Webmasters, Rich Snippets, JavaScript & Negative SEO

Google continues to increase their communication with webmasters. They made new videos covering malware and hacking, as Google is seeing these problems more and more, yet not all webmasters are clear about what these issues are and how to fix them. They are working on including more concrete examples in their guidelines, to make it easier for people to determine the types of things that cause ranking issues and to point webmasters in the right direction to fix them.

Cutts stressed that he is not the only face for Google search. They have 100 speaking events per year and do Hangouts on Air to educate webmasters. They hold Webmaster Office Hours, to increase communication and give users the chance to engage and ask questions of the search team.

Google is becoming smarter at being able to read JavaScript, as it has definitely been used for evil by spammers. However, Cutts cautions that even though they are doing a better job at reading it, don’t use that as an excuse to create an entire site in JS.

Rich snippets could get a revamp and they will dial back on the number of websites that will be able to display rich snippets. “More reputable websites will get rich snippets while less reputable ones will see theirs removed,” says Matt.

Matt also said negative SEO isn’t as common as people believe and is often self-inflicted. One person approached Matt to say a competitor was ruining them by pointing paid links at their site. Yet when he looked into it, he discovered paid links from 2010 pointing to the site, and said there was no way competitors would have bought paid links back in 2010 to point at the site, since the algorithm penalizing paid links didn’t roll out until a couple of years after those links went live.

The Future of Google Search: Mobile, Authorship & Quality Search Results

On the future of search, he again stressed the importance of mobile site usability. YouTube traffic on mobile has skyrocketed from 6 percent two years ago, to 25 percent last year, to 40 percent of all YouTube traffic this year. Some countries have more mobile traffic than desktop traffic. Cutts reiterated, “If your website looks bad in mobile, now is the time to fix that.”

Google is also working on machine learning and training their systems to be able to comprehend and read at an elementary school level, in order to improve search results.

Authorship is another area where Google wants to improve, because tying an identity to an authorship profile can help keep spam out of Google. They plan to tighten up authorship to combat spam and they found if they removed about 15 percent of the lesser quality authors, it dramatically increased the presence of the better quality authors.

They are also working on the next generation of hacked site detection, where Cutts said he is not talking about ordinary blackhat, but “go to prison blackhat.” Google wants to prevent people from being able to find any results for the really nasty search queries, such as child porn. Cutts said, “If you type in really nasty search queries, we don’t want you to find it in Google.”

Cutts’ current advice (again) to webmasters is that it's important to get ready for mobile. He spoke about the convenience for website visitors when you use Google's autocomplete web form annotations, which make it easier for people to fill out forms on your site. The markup to add to forms is easy to implement, and will be available in the next few months.

The next generation of the algorithm will look at the issue of ad-heavy websites, particularly those with a large number of ads placed above the fold. This is really not a surprise, as it makes for a bad user experience and Google has previously announced that their page layout algorithm is targeting this. But sites using JavaScript to make it appear to Googlebot that the ads aren’t above the fold should look at replacing the ads before the algorithm impacts them.

Matt Cutts Q&A

During Q&A, Cutts discussed links from press release sites. He said Google identified the sites that were press release syndication sites and simply discounted them. He does stress that press release links weren’t penalized, because press release sites do have value for press and marketing reasons, but those links won’t pass PageRank.

The problem of infinite scrolling websites was raised, such as how Twitter just keeps loading more tweets as you continue to scroll down. He cautioned that while Google tries to do a good job, other search engines don’t handle infinite scrolling as well. He suggested any sites using infinite scrolling also have static links, such as a pagination structure, so bots can reach all the information even if they don’t wait for the page’s infinite loading.

Someone asked about whether being very prolific on blogs and posting a ton of posts daily has any impact on search rankings. Cutts used the Huffington Post as an example, as they have a huge number of authors, so logically they have many daily posts. However, he says posting as much as your audience expects to see is the best way to go.

In closing, Cutts said they are keeping a close eye on the mix of organic search results with non-organic search results and says he would also like to hear feedback on it.

While no new features were announced during his keynote at Pubcon, Cutts packed his presentation with many takeaways for webmasters.


Original Article Post by Jennifer Slegg @ Search Engine Watch

9 Ways to Prepare for a Future Without Cookie Tracking

It was over a year ago that I first wrote about do not track legislation, and luckily for most organizations the browser-provided imperative is only loosely supported or regulated today, with very few sites interpreting and complying with the preference.

For the most part, do not track legislation is misunderstood by the general public, and even by regulators, in terms of its definition, its usage, and, most importantly, the privacy implications and confidence it is meant to instill.

From a digital practitioner’s standpoint, “do not track” is the least of my worries, but recent news about Microsoft and Google pursuing cookie-less tracking capabilities indicates to me that education on how digital information is collected and shared will become even more important in the near future.

Rather than panicking, there is a lot we can do today to enact guiding principles that will likely ease a transition into tighter privacy controls in the future.

Education

One of the biggest problems facing digital marketing and analytics practitioners will be education. The industry has evolved so quickly that much of the technology that we rely on every day is likely taken for granted.

Personalization is one such area that relies on tracking, profiling, and delivering a lot of information about visitor preferences and behavior, which many of us likely take for granted.

One might argue that personalization is a byproduct of contextual advertising, and without underlying tracking technologies, wouldn't be possible to deliver.

Teasing apart a key delivery mechanism such as a session or persistent cookie will be very challenging, but explaining the importance of cookies and their usage to visitors and customers even more so.

What can you do to prepare?

1. Ensure your privacy policy is up to date and fully transparent.
2. Explain what tracking technologies are used (savvy users will know how to check for this themselves anyway; see the console snippet below).
3. Explain which cookies are employed and for what reason.
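As a small aid for point 2, here is a quick check you can run in the browser console to list the first-party cookies a page sets. It won't show HttpOnly or third-party cookies, so treat it as a starting point for your documentation, not a complete inventory:

```typescript
// List the names of first-party cookies visible to scripts on this page.
const cookieNames = document.cookie
  .split(";")
  .map(c => c.trim().split("=")[0])
  .filter(name => name.length > 0);

console.log(`Cookies set on ${location.hostname}:`, cookieNames);
```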

Usage

It’s probably safe to say that aside from a few specific highly regulated industries and regions, most digital marketing practitioners don’t spend much time or due diligence reviewing data usage models with third-party vendors and their technology.

Regulators focus on both the collection and the usage of data in these scenarios, particularly when third parties are involved, because in many cases these partners assume ownership of the data collected on your digital properties. This is the same reason many browsers automatically block third-party cookies: to ensure data collection services and the usage of visitor information are entrusted to the right recipients.

What can you do to prepare?

4. Explain how data collected is used.
5. Explain how disabling functionality may affect user experience or functionality.
6. Ensure the corresponding language in your privacy policy and acceptable use policy is complementary.

Consent

In my opinion, this is where most of the opportunity is for much of North America. Very few companies actually gather consent in a clear and concise manner.

To be brutally honest, most of us think that relying on a one-line opt-in box at the bottom of a registration page, with a link to a hundred-page disclosure, is acceptable. From a legal standpoint, it will probably cover you against litigation, but from a customer experience perspective, hundreds of pages of disclosure tend to make the average Joe either uninterested or a little paranoid.

What can you do to prepare?

7. Humanize your terms and conditions. Less legalese and more transparency.
8. Separate your service level agreement and delivery conditions from your data consent.
9. Introduce ways visitors and customers can quickly and easily opt in to and out of the technology that enables digital marketing and personalization (a minimal sketch follows).
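For point 9, here is a minimal sketch of an opt-out hook, assuming Google Analytics is the technology in question. The cookie name and property ID below are placeholders for illustration; the window property is Google Analytics' documented per-property opt-out flag:

```typescript
const TRACKING_ID = "UA-XXXXXX-Y"; // placeholder property ID

// Read the visitor's stored preference from a first-party cookie.
function hasOptedOut(): boolean {
  return document.cookie.split(";").some(c => c.trim() === "analytics_opt_out=1");
}

// Persist the choice and toggle the documented opt-out flag for this property.
function setAnalyticsOptOut(optOut: boolean): void {
  document.cookie = `analytics_opt_out=${optOut ? 1 : 0}; path=/; max-age=31536000`;
  (window as any)[`ga-disable-${TRACKING_ID}`] = optOut;
}

// Apply the stored choice before any tracking call fires.
(window as any)[`ga-disable-${TRACKING_ID}`] = hasOptedOut();
```

Wire setAnalyticsOptOut() to a visible toggle in your privacy settings rather than burying it in the legalese.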

Conclusion

Think about the steps you can take today to instill greater confidence in your digital business and marketing efforts. Sometimes little things go a long way toward earning the respect and trust of visitors and customers, making future tracking capabilities or regulatory guidelines much easier to transition into.

Have you done anything to prepare your website and visitors for the future of tracking and digital marketing personalization?


Original Article Post by Garry Przyklenk @ Search Engine Watch

Google Rolls Out AdWords Ad Rank Algorithm Update

In a quiet but not entirely unexpected move, Google has announced they are updating and improving the Google AdWords Ad Rank algorithm to take into account some of the new features they have rolled out this year, primarily their new ad extensions.

Ad extensions and formats can now affect the positioning of your ads on the Google search results page. Google uses the example: If two otherwise identical ads were to appear with the same bid and quality score, the ad with the ad extensions most likely to perform would appear in the higher ad position.

Ad Rank will also play a factor in whether or not extensions appear for your ads; Google notes that a higher Quality Score or bid (or a combination thereof) increases the likelihood of extensions appearing.
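To make that tiebreak concrete, here is a purely illustrative sketch. It is not Google's published formula, just a simple multiplicative model that captures the behavior described above, and the expectedExtensionImpact values are invented for the example:

```typescript
interface Ad {
  bid: number;                     // max CPC bid
  qualityScore: number;            // 1-10
  expectedExtensionImpact: number; // assumed relative multiplier; 1.0 = no extensions
}

// Toy ranking score: with bid and Quality Score equal, extensions break the tie.
const adRank = (ad: Ad): number => ad.bid * ad.qualityScore * ad.expectedExtensionImpact;

const withSitelinks: Ad = { bid: 2.0, qualityScore: 7, expectedExtensionImpact: 1.1 };
const noExtensions: Ad = { bid: 2.0, qualityScore: 7, expectedExtensionImpact: 1.0 };

console.log(adRank(withSitelinks) > adRank(noExtensions)); // true
```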

According to the announcement, ads with extensions Google expects to perform well may see a lower cost per click and higher click-through rate, while those ads without the benefit of their favorable projections could see their CPCs go up.

You may see lower or higher average CPCs in your account. You may see lower CPCs if your extensions and formats are highly relevant, and we expect a large positive performance impact relative to other competitors in the auction. 

In other cases, you may see higher CPCs because of an improvement in ad position or increased competition from other ads with a high expected impact from formats.

Google is pushing advertisers to use extensions in their ads and now that it has an impact on Ad Rank, advertisers who haven’t added extensions will need to look at incorporating them in their campaigns.

Google also reminds advertisers that the ad platform will automatically choose which ad extensions to show, based on best CTR performance. The change currently affects only advertisements placed on the Google search results page.

If you haven’t ventured into ad extensions for AdWords yet, Google also recently added some help pages to assist advertisers in learning about different ad extensions and how to enable them.


Original Article Post by Jennifer Slegg @ Search Engine Watch

Become a Leading SEO Mechanic with Both Google & Bing Webmaster Tools

Webmaster Tools offerings from both Google and Bing can provide a wealth of insight to business owners. To get the whole spectrum of insights, marketers must learn just what they can do with both Google and Bing Webmaster Tools. Using both together allows you greater insight into the factors contributing to the success—or lack thereof—of your SEO strategy.

Internet Marketing Ninjas COO Chris Boggs and Grant Simmons, director of SEO and social product at The Search Agency, shared their advice on better integrating data from Google Webmaster and Bing Webmaster Tools earlier this year at SES San Francisco.

Google Webmaster Tools: Proactively Monitor and Have a Plan in Place to React (P.R.E.P.A.R.E)

Internet Marketing Ninjas COO/CMO and SEMPO Chairman Chris Boggs started the presentation with the topic everyone really wanted to hear: Google Webmaster Tools (GWT). He started with SEO diagnostic principles and explained that you need to be both proactive and reactive when monitoring SEO. Marketers need to have a plan as well as the ability to manage from a reactive perspective, he said. If you come across something in your diagnoses, your analytics are going to be a good second opinion. Without tools, it’s just a guessing game.

Once you have this in mind, you can start digging into GWT by focusing on a few things first:

1. Quick Barometers
Boggs referred to the “Brand 7 Pack” as a company’s homepage plus the six sitelinks that appear in search results. If you don’t have seven, you have an SEO problem, he said. Your social entities such as Google+ should also be ranking, and your titles should be clear and easy to understand. If you want to see what your domain looks like from Google’s perspective and check the cleanliness of your page titles, type in “site:” and then your domain name without the “www.” Below is a screenshot of a website with a good 7 pack:

macys-7-pack-google-serp

You can then go to your Webmaster Tools account to diagnose any problems you may see and determine exactly where the problem lies and how to fix it. From a reactive mode perspective, look at your analytics and verify. It’s very important for SEOs to live by this mantra. Webmaster Tools isn’t something to take for granted. Have an agency or consultant monitor the findings in GWT and relay information to design, development, and marketing teams.

2. HTML Improvements
Visit the HTML Improvements category to determine if your titles and descriptions look bad on a Google SERP. You can see if Google agrees, then click on anything with blue writing to learn more about the problem.

Boggs was asked after the presentation what tool might get users in trouble if they don’t understand it, and this was his answer. He explained that almost every site is going to have some duplicate descriptions and titles, so he wouldn’t try to get that number down to zero. You don’t need to remove every single warning from GWT.

How to Find the Tool: Located under Search Appearance.

3. Sitelinks
You can visit the sitelinks tab to demote a certain sitelink (one of the links under your company homepage shown on a search results page like in the screenshot above). Google is going to automatically generate links to appear as your sitelinks, but you can tell Google if you don’t want something there.

How to Find the Tool: Located under Search Appearance.

4. Search Queries
Here, you can look at the top pages as well as the top queries for your site. Most people will just take the default information, but Boggs stressed that there are tabs for a reason. Look at the top queries as well as use those “more” tabs to get more information.

How to Find the Tool: Located under Search Traffic.

5. Links
You can click on “links to your site” to get a full list of those linking back the most, but the tool that many forget to use is the “internal links” tool. Internal links are very important; Boggs explained it’s worth the time to go through and look at the number of these internal links and then download the table so you can really slice it and dice it.

How to Find the Tools: Located under Search Traffic.

6. Manual Actions and Malware
With this tool, no news is good news. If you get a manual action warning, it means you need to do something, probably substantial, in order to keep your rankings where they are. Malware is another section to check, and another place where you don’t want to see anything listed.

How to Find the Tool: Find Manual Actions under Search Traffic and Malware under Crawl.

7. Index Status
If the number of pages Google has indexed is many times the number of pages your site actually has, you might have a problem. The advanced tab here gives you a much better look at that data.

How to Find the Tool: Located under Google Index.

8. Content Keywords
What you want to look for here are the words you are using in your content. You don’t want to see a lot of “here” or promotional phrases. Identify where your gaps are or where you have too much content.

How to Find the Tool: Located under Google Index.

9. Crawl Errors
Google now has a feature phone tab to help you with crawl errors. You have to understand any crawl errors that might occur, and remember that you should provide data that is specific to mobile as well. You can also take a look at your crawl stats, including time spent downloading a page, and make sure there are no spikes.

How to Find the Tools: Both located under Crawl.

Finally, Boggs explained that Google Webmasters Tools should be thought of proactively by pairing it with Google Analytics. What kinds of things is GWT telling you when it comes to your analytics and how that data is affected? Consider this screenshot from Boggs’ presentation:

gwt-ga-more-less-obvious

In the end, Boggs explained that expertise is knowing the most basic things about SEO and doing them repeatedly, perfectly, every time. You’re going to come across situations where there are a lot of hooks and changes in the algorithm. Something someone might have done one to five years ago could be a very bad move now. That’s part of the game.

Bing Webmaster Tools: Bing Stands for “Bing Is Not Google”

Grant Simmons, director of SEO and social product at The Search Agency, began his presentation with the quote “Bing stands for Bing is not Google,” and the laughter among the marketers and SEOs just about said it all. It’s true; Bing is often not taken as seriously as Google because it just isn’t as popular, yet Bing Webmaster Tools (BWT) does offer some good insights that Google does not.

Once you’re signed up and logged in, consider the top things that you should look at first to really get a handle on BWT:

1. Dashboard
You want to make sure that the pages you think you have are the ones Bing has indexed. If that number isn’t what you expected, ask yourself a few questions: Are they crawling my site frequently? Am I not updating my site? These are all quick things you can see right from the dashboard, and you can even look at search keywords to see how people are finding you.

Quick Fact: Bing doesn’t use Google Analytics.

2. Diagnostic Tools
The diagnostic tools category comprises seven subcategories: keyword research, link explorer, fetch as Bingbot, markup validator, SEO analyzer, verify Bingbot, and site move.

How to Find the Tool: This is a category all on its own!

3. SEO Analyzer
This tool works great when analyzing just one URL. You simply type in the URL and hit “Analyze” to get an overview of the SEO connected with that URL on the right hand side of the page. The tool will highlight any issue your site is having on the page; if you click on that highlighted section, Bing will give you the Bing best practice so you can make improvements.

How to Find the Tool: Located under Diagnostics & Tools.

4. SEO Reports
This tool gives you a look at what is going on with your whole site (as opposed to just one URL). You will get a list of SEO suggestions and information about the severity of each issue, as well as a list of links associated with that particular error. The tool runs automatically every other week for all of the sites you have verified with BWT (so not your competitors’ sites).

How to Find the Tool: Located under Reports & Data.

5. Link Explorer
You can run this tool on any website to get an overview of the top links associated with that site (only the top links, however, which is considered one of the limitations of the tool). Export the links into an Excel spreadsheet and then slice and dice the information as you’d like.

How to Find the Tool: Located under Diagnostics & Tools.

6. Inbound Links
Link Explorer is probably one of the more popular tools when it comes to BWT, so it’s certainly worth mentioning. However, according to Simmons, Inbound Links is a better tool that doesn’t have as many limitations. This tool will show you trends over time so you can really see if there is value on deep page links. You can see up to 20,000 links, as well as the anchor text used, with the ability to export.

How to Find the Tool: Located under Reports & Data.

7. Crawl Information
It’s important to remember that the Bing bots are different from the Google bots, and the crawl information tool can help give you insight. From a high level, Simmons explained that when the tool gives you the stats, you should be looking at the challenges you might have from, say, a migration you did last year. Are your 301s still in place? Are they still driving traffic? Should any of the 302 pages be made permanent? It’s also a good idea to look at the last time your site was crawled. If it’s been a while, remember Bing likes fresh content and you may need to make some updates. Again, this information is exportable.

How to Find the Tool: Located under Reports & Data.

8. Index Explorer
Simmons said this is one of the coolest things found in BWT, one reason being that Google doesn’t really have anything like it. You can see stats for a particular page, which can be good to see based on a subdirectory or section of your site. The tool has great filters and offers an awesome visual representation of crawled and indexed pages.

How to Find the Tool: Located under Reports & Data.

Of course, there is a lot more to BWT than just the eight features listed above, including the keyword research tool, geo targeting, the disavow tool (Bing was the first to offer this), and crawl control. Its features are very comparable to Google’s, with excellent navigation and even a few extra capabilities. Simmons concluded the presentation by saying that we should really focus on BWT to make a difference.

Do you think Boggs and Simmons singled out the best tools in both GWT and BWT? Simmons will speak to attendees at SES Chicago in early November on what it takes to become a leading SEO mechanic, alongside Vizion Interactive’s Josh McCoy. Keep an eye out at SEW for coverage!


Original Article Post by Amanda DiSilvestro @ Search Engine Watch

The Impact of Penguin 2.1: Recovery, Knockout Punches & Fresh Hits


penguin21-impact

On Friday, October 4th, Matt Cutts announced the release of Penguin 2.1. Based on the amount of Penguin work I do, that meant one thing. Matt just threw a serious wrench into my Friday night (and weekend plans). Similar to previous Penguin updates, I began heavily analyzing websites hit by Penguin 2.1 to identify new findings and insights.

Needless to say, the past two and a half weeks have been fascinating, as I’ve now dug into 36 sites hit by Penguin 2.1. This latest update has definitely left a path of destruction across both large and small websites, from around the world.

A Tale of 3 Penguin Victims

This post is designed to give you a peek behind the curtains, into the world of Penguin. I will focus on three different websites, with three different outcomes.

The first story is a happy one, as I’ll explain more about a company that recovered during Penguin 2.1. The second company unfortunately took it on the chin, twice: it was first hit by Penguin 2.0, only to get hit harder by 2.1. The third represents an all-too-common example of a company not understanding what its SEO agency was doing, and being blindsided by a Penguin 2.1 hit. Let’s begin.

1. Penguin 2.1 Brings Recovery

I know there are a lot of people that don’t believe websites can recover from Penguin. But they can; I’ve written several case studies about those recoveries in case you want to learn more. Once Penguin 2.1 hit, I quickly started reviewing the reporting of previous Penguin victims to see if there was any impact from our refreshed, icy friend.

During this most recent update, two websites I’ve been helping with Penguin hits recovered. I’ll focus on one of those sites in this post. While analyzing the site’s reporting, I saw a distinct bump in Google organic traffic starting on Friday, October 4th and increasing during the weekend. Note, this was a client with multiple issues, and was hit by both Panda and Penguin (historically). That’s actually a common scenario for a number of the companies contacting me. For this company in particular, I helped them identify technical problems, content issues, and link problems, and they have worked hard to rectify their issues.

A Penguin Recovery During the 2.1 Update:
penguin21-recovery

The company was originally hit by a previous Penguin update, but delayed tackling its link issues as it worked on technical problems and content issues. If you know how I feel about the gray area of Panda or Penguin, you know I believe you should move as quickly as possible, while maintaining focus, in order to recover from algorithm hits. The reality, though, is that not every company can move at light speed.

This company was no different. They had seen improvements from technical fixes and content work, and finally started to address Penguin over the past few months (after Penguin 2.0 rolled out). Unfortunately, Penguin was inhibiting their growth, even if they had showed signs of progress based on other SEO work.

During late spring and summer, unnatural links were removed as much as possible, while links that could not be manually removed were disavowed. By the way, that’s the approach I recommend. I’m not a big fan of disavowing all bad links, and I never have been.

Based on links downloaded from Google Webmaster Tools, Majestic SEO, and Open Site Explorer, the company tackled its unnatural link situation the best it could. Now they just needed another algorithm update to see if their hard work paid off. I recommend to any company hit by an algorithm update that they should keep driving forward as if they weren’t hit. Keep producing great content, keep leveraging social to get the word out, keep building natural links, etc.

When October 4th arrived, a spike in organic search traffic followed. The site’s Google organic traffic was up 43 percent following Penguin 2.1 (and up 67 percent to the specific landing pages that had been impacted heavily by the previous Penguin hit). The filter had been lifted and the site was being rewarded for its recovery work.

Key Takeaways:
  • Move quickly and keep a strong focus on what you need to tackle link-wise. Although this company recovered, it delayed its Penguin work for some time (and the negative impact remained).
  • Be thorough. Don’t miss links you need to nuke. Penguin is algorithmic and there is a threshold you need to pass.
  • Remove as many unnatural links as you can manually, and then disavow the rest. Avoid the knee-jerk reaction to disavow all of them.
  • After your Penguin work has been completed, keep your head down and drive forward. Act as if you aren’t being impacted by Penguin. You’ll send the right signals to Google throughout the downturn in traffic.

2. A Penguin 2.0 and 2.1 Combination Punch

The second example I wanted to explain is an unfortunate one-two punch from Penguin. You wouldn’t think a Penguin could pack a combination punch, but it has in several situations I’ve analyzed recently (where companies reached out to me complaining of a Penguin 2.1 hit after a Penguin 2.0 hit). Worse, this was after they thought they had addressed their unnatural link problem thoroughly.

After getting pummeled by Penguin 2.0 on May 22nd, the company gathered its troops, thought they identified all of their unnatural links, and worked hard on removing them. After what seemed to be a thorough cleanup, they eagerly awaited another Penguin update. When Penguin 2.1 was announced by Matt Cutts, they watched their reporting with intense focus, only to be thoroughly disappointed with the outcome. They got hit even worse.

The Initial Penguin 2.0 Hit:
penguin21-combo1

The Second Penguin Hit on Oct 4th:
penguin21-combo2

So what happened? Quickly reviewing the site’s link profile revealed a problem: companies put a stake in the ground and remove as many unnatural links as they can at a given point in time. They don’t continue analyzing their links to see if more unnatural links pop up and that’s a dangerous mistake. I saw many unnatural links in their profile that were first found during the summer and fall of 2013. Many showed up after their Penguin work had been completed. Those links are what got them hit by Penguin 2.1.

Fresh Unnatural Links Caused the Penguin 2.1 Hit:
penguin21-combo3

The combination punch I mentioned above is a strong reminder that Penguin never sleeps. Don’t assume you are done with your link removals because you have a spreadsheet from a few months ago. You need to continually review your link profile to identify potential problems. If this company had done that, it would have picked up the many additional unnatural links showing up this summer and fall, and dealt with them accordingly. I believe that if it had, it could have avoided the nasty one-two punch of Penguin.

Key Takeaways:
  • Your Penguin work is ongoing. Don’t drop the ball.
  • Have your SEO continually monitor your link profile for unnatural links (whether that’s an internal SEO, agency, or consultant).
  • The one-two punch of Penguin is killer (and can be backbreaking). Avoid multiple algorithm hits. They aren’t fun to live through.
  • Unnatural links have an uncanny way of replicating across low-quality sites and networks. I have clearly seen this during my Penguin analyses. Beware.

3. A Fresh Hit, Care of a “Trusted” Third Party

In April, I wrote a column titled Racing Penguin, where I explained an unnatural links situation that looked like negative SEO. However, it ended up being a “trusted” third party that was assisting a company with its marketing efforts. Unfortunately, that situation is all too common, as businesses outsource SEO and aren’t tracking what those third parties are doing.

After Penguin 2.1 was released, I received a call from a business owner blindsided by the latest update. After showing the business owner many of the unnatural links impacting the website, he was blown away. He made it very clear that he never set up those links.

I took him through a process I normally take blindsided Penguin victims through to try and determine how the links were set up. I also explained that I’ve been contacted about negative SEO many times since Penguin 1.0, but it almost always ends up not being negative SEO. The trail typically leads to someone connected to the company (and a high percentage of those people had the right intentions, but the wrong execution).

A Penguin Hit Timeline Can Bring Answers:
penguin21-timeline

That was the case for this business owner. He contacted several people who had helped him in various capacities over the past few years, but one vendor came back with a quick and affirmative response. As it turns out, the business owner hired an agency to help with SEO and they began a linkbuilding campaign. They built links all right… just not the ones a business owner wants to build. The links were Penguin food, plain and simple.

The business owner was left trying to clean up the Penguin mess. Instead of running his business, he’s dealing with link exports from various tools, contacting webmasters, and getting familiar with the disavow tool. Yes, this business owner is getting a Masters Degree in Penguin Sciences.

How to avoid this situation? My advice is the same as it’s always been. Know who you are hiring and what they will be doing; get it in writing and make sure you know what has been completed. Ask hard questions, get clear answers, and keep the pulse of your website. Algorithms like Penguin and Panda can cause serious damage to your business, as Google organic traffic can plummet overnight. Then you’ll have months of hard recovery work ahead. Avoid this situation at all costs.

Key Takeaways:
  • Thoroughly vet the SEO agency or consultant you are planning to hire. Don’t get caught hiring a company that makes your situation worse. That’s not the point.
  • Know what your agency or consultant will be completing for you, and get that in writing.
  • Communicate with your SEO agency or consultant on a regular basis to receive updates on the projects being completed. Ask for written updates, screenshots and examples as the projects continue. Don’t get caught at the end of the project with lingering questions.
  • When you see an increase in rankings, ask why that’s happening. Try to understand the tactics being used to impact SEO. Unfortunately, there are some tactics that cause short-term gains only to cause serious, long-term problems.
  • Take the initiative to review your own SEO reporting to better understand what’s going on. Learn your way around Google Webmaster Tools, Google Analytics, and link analysis tools like Majestic SEO and Open Site Explorer. You might be able to nip serious SEO problems in the bud.

Summary: Penguin 2.1 Bringeth and Taketh Away

I hope these three examples provided a view into the world of Penguin. In my opinion, Penguin 2.1 was bigger and badder than Penguin 2.0. The good news is that not every site was impacted negatively. There were recoveries. Though they’re often overshadowed by the destruction, it’s important to know that Penguin victims can recover. It just takes a lot of hard work for that to happen.

If you’ve been impacted by Penguin 2.1, you need to download and analyze your inbound links, flag unnatural links, remove as many as you can manually, and then disavow what you can’t remove. As I mentioned in the second case above, don’t stop analyzing your links once the initial phase has been completed. Continually monitor your link profile to make sure additional unnatural links don’t appear. Remember, another Penguin update might be right around the corner. Good luck.
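If you reach the disavow stage, here is a minimal sketch of generating the file, assuming you've collected the flagged URLs you couldn't get removed manually in a hypothetical flagged-links.txt, one URL per line. The domain: syntax matches the format Google's disavow tool accepts, but deciding which links belong in that file is still manual, judgment-heavy work:

```typescript
import { readFileSync, writeFileSync } from "fs";

// Flagged unnatural links that could not be removed manually, one URL per line.
const flaggedUrls = readFileSync("flagged-links.txt", "utf8").trim().split("\n");

// Disavowing at the domain level catches sitewide links and future repeats.
const domains = new Set(flaggedUrls.map(u => new URL(u.trim()).hostname));

const disavowFile = [
  "# Unnatural links we could not get removed manually",
  ...[...domains].sort().map(d => `domain:${d}`),
].join("\n");

writeFileSync("disavow.txt", disavowFile);
console.log(`Wrote ${domains.size} domains to disavow.txt`);
```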



Original Article Post by Glenn Gabe @ Search Engine Watch

Searcher Personas: A Case for User-Centric SEO

It wasn't so long ago that, when educating the uninitiated on the SEO process from the bottom up, we would explain that keywords are foundational to SEO – start with your keywords and work up from there.

Lately, we've seen some very fascinating (though not unexpected) developments from Google that put users ever more firmly in the driver's seat.

Hummingbird was a big stride toward better semantic and conversational search. "(Not provided)" took website visitor keywords data away from us. Keyword search volume data moved deeper inside AdWords with the Keyword Planner. Meanwhile, user segmentation was introduced to Google Analytics, giving marketers the ability to perform cohort analysis.

The message is clear: Google is moving away from keywords. Today's SEO is about the user and the way people explore using search queries. In fact, digital marketing as a whole is moving further into user-centricity; we, as SEO professionals, are on the bandwagon whether we like it or not.

So how do we put the user-centric concept into practice for SEO?

Keyword research is as important as ever, but we now start with searcher personas (which are very similar to user personas, marketing personas and customer personas). We use the keyword research as a data source to better understand those personas, a concept well established in marketing.

Personas: The Fundamentals

There are two main functions of the persona: to provide context around the users represented by the persona and to create a sense of empathy for those users.

In the ideal world, we would be capable of understanding and being empathetic towards each and every potential customer. Since that's impossible at the scale we work in, the idea is to group target customers together and give each group the qualities of an individual human. So each persona you create will effectively "speak for" all users represented by it.

To understand the concept of the persona, you need to understand the concept of the archetype. The definition of an archetype is "…the original pattern of which all things of the same type are copies."

Read up on Carl Jung, a famous Swiss psychoanalyst, to further your knowledge in this area. A classic example of the archetype is "The Shadow," who is expressed in popular culture by a variety of characters like Darth Vader, Agent Smith or Mr. Hyde.
qualities-of-the-shadow

Although each of these characters has unique qualities, they also share a number of collective qualities: those of "The Shadow."

When we're building a persona, we will use data (like keyword search volumes, market research, user polls, web analytics, etc.) to find those collective qualities that are similar across a large group of people. When it comes down to actually creating our persona, we want a character that is like a real person.

A real person isn't in the 35-45 year old age bracket. A real person was born at a specific time on a specific day.

In creating personas, we seek precision over accuracy. We will give our persona a specific age, knowing that it doesn't accurately reflect the age of each person represented in our persona.

Incorporating Searcher Personas into Your SEO Work

Ideally, when beginning your persona development efforts, you will take a "top-down" approach, where you begin by creating digital marketing personas that will work across all channels (not just search engine marketing). You continue by performing deeper analysis on the habits of each persona from the natural search perspective.

I prefer this approach because the same high-level personas can be used to tie together all digital (and maybe even offline) marketing efforts – everyone involved in marketing and content production (not just the SEO) uses the same personas.

When you're ready to start building personas, think about your process. Consider something like this:
  • Choose target personas. What types of people are you looking to attract to your website? Group them together into 3-5 types of people and give them titles that reflect who they are.
  • Sticky noting. Gather some team members together and brainstorm to flesh out each of your personas using your own existing knowledge and assumptions. Some people use actual sticky notes; I prefer a big Excel spreadsheet.
  • Define business context. Check your work to be certain that each persona is aligned with your business objectives and offerings, and that you fully understand the context. Add this to your sticky noting.
  • Gather data. This is possibly the most challenging portion of the process. Data is not easy to get a hold of and the good stuff is usually very expensive. It's extremely important that you use the right data and interpret it correctly, as you don't want to let poor data interpretation steer your personas off course. For search, your keyword research is an important data source.
  • Create cards. Gather all of your sticky noting and data together, then summarize it all on one shareable, printable, visually attractive "card" (a PowerPoint slide works like a charm) for each persona. Remember that you are aiming for a precise, not accurate, portrait of your persona.
The end result may look something like this:
Example persona card: "The Trade Buyer"
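
To make the "gather data" step a bit more concrete, here is a minimal sketch of how keyword research might be rolled up by persona. The keywords, search volumes, and theme patterns below are made up for illustration; in practice this work usually happens in a spreadsheet or keyword tool, but the logic is the same: match each keyword to the persona whose themes it reflects, then total the search demand each persona represents.

// Hypothetical keyword research data (term + monthly search volume).
var keywords = [
  { term: 'wholesale craft supplies', volume: 2400 },
  { term: 'bulk order discount', volume: 880 },
  { term: 'handmade gift ideas', volume: 5400 },
  { term: 'unique birthday gifts', volume: 3600 }
];

// Hypothetical theme patterns for each persona.
var personas = {
  'The Trade Buyer': /wholesale|bulk|trade/i,
  'The Gift Shopper': /gift|birthday|handmade/i
};

// Total the monthly search volume represented by each persona.
var volumeByPersona = {};
keywords.forEach(function (kw) {
  Object.keys(personas).forEach(function (name) {
    if (personas[name].test(kw.term)) {
      volumeByPersona[name] = (volumeByPersona[name] || 0) + kw.volume;
    }
  });
});

console.log(volumeByPersona);
// { 'The Trade Buyer': 3280, 'The Gift Shopper': 9000 }

A roll-up like this tells you not only who your personas are, but roughly how much search demand each one accounts for.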

Ultimately, you want everyone involved in marketing to be thinking and talking (and dreaming) about the target personas when developing and executing marketing initiatives. The personas are the "people" you are marketing to, whose needs you are serving.

There is much more to share on the subject of building data-driven personas and how they change the keyword research process. For future discussions, we would love to know: do you already use (or plan to use) personas in your search engine marketing initiatives?

Wes Walls of iProspect contributed to this post.


Original Article Post by Guillaume Bouchard @ Search Engine Watch

Demographics and Interests: Coming to a Google Analytics Profile Near You

I was really excited to see some new information in one of my Google Analytics profiles this week. What wasn’t so exciting is that out of the approximately 20 profiles I flit between, only one sported this new feature. With that said, my inner analytics nerd wanted to dig deeper.

How many of you are seeing “age” and “interest” data in your Google Analytics dashboards? I’d guess about 20 percent at this point in time, but it will be coming to all profiles as Google rolls the feature out.

I’m pretty excited about this for a variety of reasons, not the least of which is that, as marketers, the more information we’re armed with, the better we can do our jobs. Age and interest brackets that we can apply custom segments and filters against offer a window into an entirely new way to market.

Why Age & Interests Belong in Google Analytics

Consider this: you run a mom-and-pop gift shop. You have a pretty good idea of your in-store clientele and what appeals to them, but you also sell those products online. Do the age and interest demographics of your online customers match those of your foot-traffic customers? Now you can tell.

Now you can design and fashion your online storefront to match the ages and interests of your online traffic. Never assume that one audience is the same as the other.

I am under 40 (barely) and my mother is over 60; we like some of the same things, but we would never shop for them in the same way. I do almost all of my shopping online, while my mom would hesitate and feel utterly exposed if she had to enter her credit card in an online form. We might both like the same item, but you need to market to us differently.

The sticky wicket in this whole new view of analytics is implementation. It does take an edit to the tracking script, so depending on your skill level, you may need some help to get it going. Once you do, voilà! New reports appear.

Let’s walk through the implementation.

Accessing and Activating New Features

First, check to see if you even have access to this new feature. Open a profile and click on “Audience.” If you see “Demographics,” “Interests,” and “Geo” in the menu, you’re in business. If you don’t see “Interests,” you haven't yet been given access.
The Audience menu in Google Analytics, showing Demographics, Interests, and Geo

If you click on “Interests,” you’re given a message that states your view isn’t configured for this data quite yet. There are a few steps here; they’re pretty easy to complete, but still necessary.

First, we need to access the Admin panel to enable the features:

The Google Analytics Admin settings

Click on “Admin,” then choose the correct “Account” and “Property,” then “Property Settings.” Under Property Settings, scroll to the bottom of the page and you’ll see a selection for “Enable Demographics and Interest Reports.” Slide the button over to “Yes” and save the profile view.

The "Enable Demographics and Interest Reports" setting under Property Settings

Above, you can see a note stating that you’re required to make a small change to your tracking code. You can see how by clicking “learn more” above, or just read on.

The code change is fairly straightforward; we just need to replace one line of the script. The new line allows your tracking code to support the display advertising platform.

We already know that display advertising can target by interest and age/gender, so we’re using the same script to collect that information from our visitors, even if we aren't actively participating in display advertising.

To enable this feature, locate this line in your tracking script:
ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
and replace it with:
ga.src = ('https:' == document.location.protocol ? 'https://' : 'http://') + 'stats.g.doubleclick.net/dc.js';
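
For context, here is roughly where that line sits once the swap is made, shown within the full classic asynchronous tracking snippet. This is a minimal sketch assuming you are on the older ga.js-style code (the kind the line above comes from); "UA-XXXXX-Y" is a placeholder for your own web property ID.

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']); // placeholder: replace with your own property ID
_gaq.push(['_trackPageview']);

(function() {
  // Load the DoubleClick flavor of the tracking script (dc.js) instead of ga.js
  // so Demographics and Interests data can be collected.
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://' : 'http://') + 'stats.g.doubleclick.net/dc.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();

Everything else in the snippet stays the same; only the ga.src line changes.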

Now, you're collecting the new data. Please note that data will only be collected moving forward, so the sooner you implement, the more quickly you’ll see data.

Note: You may run into a message about thresholds. This appears if you have a fairly low-volume site. Google will hold back some of the data if your sample set is so small that it could reveal too much about an individual user. Google wants to deliver the data, but not too much information (I call this the creep factor); it isn’t going to make it possible for us to determine the behavior or demographics of an individual visitor.

The Reports

Let’s take a quick look at the reports you can see within Interests, Age, and Gender. The “Interests” report, in the left column, gives you a pretty cool view of what your website visitors are interested in:

The Google Analytics Interests report

If you click on any one of these interest items, you see a new report with the gender and age information for that category. You can also apply custom segments and other metric filters to this report.

Want to know which interest categories are the highest converting? Scroll up to the top and click “ecommerce.” That information is at your fingertips.

Once you’ve collected a good set of information, you can start tweaking your online marketing strategies to take advantage of it. If your audience is young, market with design and culture trends that resonate with them. If you target and attract an older population, consider your website’s scrolling and font sizes and try to improve usability for those customers.

Are you seeing Age and Interest breakouts in your Google Analytics? If so, tell us what you think about them in the comments.


Original Article Post by Carrie Hill @ Search Engine Watch

Instagram Coming to Windows Phone 8 After Nokia's 'Screaming About It' for 18 Months

instagram logo

Nokia announced on Tuesday that Instagram is finally coming to Windows Phone 8, and the firm told us that it's something it has been "screaming about" for a long time.

SEW's sister publication, The INQUIRER, caught up with Samuli Hanninen, vice president of software program management for Lumia smart devices, who told us that Nokia has long been calling for Instagram to come to the Windows Phone mobile operating system.

The firm has been anything but shy about this, with Nokia having released an app called #2InstaWithLove earlier this year to plead with the hipster photo-sharing service to come to Windows Phone.

Hanninen said, "We have been screaming about Instagram for the past 18 months. It's something we really wanted to come to the Windows Phone operating system, and now it is finally here.

"Instagram wasn't just one of the major photography apps that Lumia devices have been missing, it's one of the main apps in general that has been absent from the operating system," he added, telling us that the arrival of the app in a few weeks will boost Nokia's already "market leading" imaging tools.

Hanninen also spoke with us about the Finnish firm's new tablet, the Nokia Lumia 2520. Unlike other tablets on the market, Nokia's debut Windows 8.1 slate has a major focus on photography, despite taking photos with a tablet often being viewed as not the done thing, to put it lightly.

"Camera technology on a tablet is by no means as important as it is on a smartphone, but it's important - and it's something Nokia wants to push," Hanninen said.

"Here in Abu Dhabi, we've seen people who only have a tablet device - not a smartphone - and who want to take pictures using it. Currently, tablet makers don't focus on photography so these people are missing out, so that's something Nokia is looking to change."

It looks like we'll see more tablets being waved in the air at concerts, then. Yippee. 

This article was originally published on the Inquirer.

Google Doodle Pays Homage to 'Salsa Queen' Celia Cruz

Google pays homage today to Celia Cruz, a Cuban-American entertainer who reigned as the “Salsa Queen” to adoring fans worldwide. A singer and dancer, Cruz recorded many gold albums and won several Grammy Awards over the course of her career. She died in 2003 at the age of 77.
The Celia Cruz Google Doodle

Cruz grew up in Cuba but moved to the United States, where she resided in New Jersey for the majority of her life. Cruz was reportedly interested in singing at an early age, but her father wanted her to stay in school to become a teacher. 

However, Cruz had different plans. In her twenties, she joined a Cuban orchestra as its lead singer. “Café con Leche” was the name of the group that made Cruz famous, and under this name she and her fellow musicians toured for many years.

Cruz performed her entire life and said of retirement: “It is absolute death, and I’m not talking about artists, because some performers change the focus of their career … I believe inactivity is a cancer in the soul… I’ve always thought that I’ll retire when God takes away my abilities… like Miguelito Valdez, I want to bid farewell to life while on stage.”


Today, on what would have been her birthday, the official site for Celia Cruz is opening up the stage to her fans, asking them to post a photo, video, or message on social media using the hashtag #dearcelia.


Original Article Post by Jessica Lee @ Search Engine Watch
 