Build One Way Links by Pinging

What is pinging?

Pinging is a way of letting search engines know that you’ve created a new post, or updated an existing post on your blog. A ‘ping’ is a message sent to a ‘ping server’ notifying it that the blog has been updated with new content.

What is a ping server?

A ping server is a web-based service that accepts messages in the XML-RPC format. It uses the information in these XML-RPC messages to publish a list of blogs which have updated content. Some of these ping servers run their own blog search engines, and some propagate the information to a number of other ping servers and search engines.
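
To make that concrete, here is a minimal sketch in Python (using the standard-library xmlrpc.client module) of what such an XML-RPC ping looks like. The blog name and URL below are placeholders, and the call follows the common weblogUpdates.ping convention that most ping servers accept; treat it as an illustration rather than any particular platform's implementation.

import xmlrpc.client

def send_ping(server_url, blog_name, blog_url):
    # Notify a single ping server that blog_url has fresh content.
    server = xmlrpc.client.ServerProxy(server_url)
    # The standard call takes the blog's name and URL; many servers also
    # accept an extendedPing variant that includes the feed URL.
    return server.weblogUpdates.ping(blog_name, blog_url)

# Placeholder blog details; a typical reply is a struct such as
# {'flerror': False, 'message': 'Thanks for the ping.'}
print(send_ping("http://rpc.pingomatic.com/", "My Example Blog",
                "http://www.example.com/blog/"))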

How does pinging work?

Many blog publishing platforms, such as WordPress, have a built-in pinging tool. In WordPress, pinging is turned on by default and pings go out to a single ping server, which in turn updates several search engines.

You can also manually notify a ping server that you have updated content by going to its website and submitting your blog’s name and URL.

One popular ping server is PingOMatic, found at http://pingomatic.com/. When your blog pings PingOMatic, it notifies a number of search engines that your blog has been updated. Many of these search engines run their own ping servers as well.

These search engines can then update their listings with new content almost instantly. This of course will give you one-way backlinks to your blog from these search engines, many of which have a high Google PageRank.

4 things you should know about pinging

1. Only ping when you have updated content. Otherwise, your site might be blacklisted as a spam blog, also known as a splog.

2. If you tend to publish a post and then re-write it frequently, make sure you only ping when you actually post new content, not on every update. If you are using WordPress, there are a few plugins that ensure pings are only sent when a post is first published (see the sketch after this list for the general idea).

3. It’s a good idea to ping several ping services, especially the ones that notify several search engines. Don’t overdo it, though, and if you are pinging a niche search engine, make sure your blog actually falls into that niche.

4. If a search engine is already notified by a ping server you use, don’t add that search engine’s own ping service to your list of servers to ping. Because blog spammers have abused ping servers, pinging the same search engine multiple times might get your site blacklisted so that it doesn’t appear in the search engines at all.
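
As a rough illustration of point 2, here is one way a “ping only on new posts” guard could work if you are scripting your own pings rather than relying on a plugin. The file name and helper functions are purely illustrative, not part of WordPress or any other platform.

import json
import os

SEEN_FILE = "pinged_posts.json"   # hypothetical local record of pinged posts

def already_pinged():
    # Load the set of post URLs that have been pinged before.
    if os.path.exists(SEEN_FILE):
        with open(SEEN_FILE) as f:
            return set(json.load(f))
    return set()

def maybe_ping(post_url, ping_func):
    # Ping only for a post URL we have never seen before; edits to an
    # existing post fall through silently.
    seen = already_pinged()
    if post_url in seen:
        return False
    ping_func()
    seen.add(post_url)
    with open(SEEN_FILE, "w") as f:
        json.dump(sorted(seen), f)
    return True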

Popular ping servers

Finally, here’s a list of popular ping servers to get you started.

http://rpc.pingomatic.com
http://pingoat.com/goat/RPC2
http://blogsearch.google.com/ping/RPC2
http://api.feedster.com/ping
http://api.my.yahoo.com/RPC2
http://ping.blo.gs/
http://ping.feedburner.com
http://ping.syndic8.com/xmlrpc.php
http://rpc.technorati.com/rpc/ping
http://rpc.weblogs.com/RPC2
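
If you want to ping several of these from a script, a simple loop like the following (Python, standard library only) would do. The server subset, blog details and two-second pause are illustrative choices, not requirements.

import time
import xmlrpc.client

PING_SERVERS = [
    "http://rpc.pingomatic.com",
    "http://blogsearch.google.com/ping/RPC2",
    "http://rpc.weblogs.com/RPC2",
    # ...trim or extend using the list above
]

def ping_all(blog_name, blog_url):
    for server_url in PING_SERVERS:
        try:
            server = xmlrpc.client.ServerProxy(server_url)
            reply = server.weblogUpdates.ping(blog_name, blog_url)
            print(server_url, "->", reply)
        except Exception as exc:
            # Old ping endpoints come and go; note the failure and move on.
            print(server_url, "failed:", exc)
        time.sleep(2)   # a small, polite pause between pings

ping_all("My Example Blog", "http://www.example.com/blog/")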




Why You Should Be Careful with Ads on Your Site

There is nothing wrong with trying to make money from a website. You put hard work into it, you provide value to other people, so you should get compensated for it.

The mistake many website owners and bloggers make, however, is to get greedy once they see real money coming in. That is when they introduce one of the following to the site:
  • too many ads
  • ads heavily blended with the content
  • ads with flashy colors
  • unrelated ads
  • animated ads that are distracting
While in the short run these “methods” might increase your revenues, over the long term they will actually hurt your profitability. Having too many ads or intrusive ones will hurt the user experience and make you lose readers along the way.

And you don’t need to take my word for it. Recently I came across a very short post from a Digg user asking Digg to remove the video ads on the front page. Not only were these ads distracting because they played video, they were also irrelevant, since they linked to a dating site.

Guess what: that little post created a huge buzz within the Digg community and received almost 4,000 diggs.
Now, if Digg users, who are loyal and very attached to their site, won’t stand for intrusive ads, what makes you think your readers will?


SEO Jargon

The guys over at SEOmoz have published the “Complete Glossary of Essential SEO Jargon.” Below you will find my favorite entries:
Cloak: The practice of delivering different content to the search engine spider than that seen by human users. This Black Hat tactic is frowned upon by the search engines and carries a virtual death penalty of the site/domain being banned from the search engine results.
Doorway (gateway): A web page that is designed to attract traffic from a search engine and then redirect it to another site or page. A doorway page is not exactly the same as cloaking but the effect is the same in that users and search engines are served different content.
Google bomb: The combined effort of multiple webmasters to change the Google search results, usually for humorous effect. The “miserable failure” - George Bush, and “greatest living American” - Stephen Colbert Google bombs are famous examples.
Google bowling: Maliciously trying to lower a site’s rank by sending it links from a “bad neighborhood” - kind of like yelling “Good luck with that infection!” to your buddy as you get off the school bus. There is some controversy as to whether this works or is just an SEO urban myth.
Google dance: The change in SERPs caused by an update of the Google database or algorithm. The cause of great angst and consternation for webmasters who slip in the SERPs. Or, the period of time during a Google index update when different data centers have different data.
Hub (expert page): A trusted page with high quality content that links out to related pages.
Keyword Cannibalization: The excessive reuse of the same keyword on too many web pages within the same site. This practice makes it difficult for users and search engines to determine which page is most relevant for the keyword.
Latent Semantic Indexing (LSI): This mouthful just means that the search engines index commonly associated groups of words in a document. SEOs refer to these same groups of words as “long tail searches.” The majority of searches consist of three or more words strung together. See also “long tail.” The significance is that it might be almost impossible to rank well for “mortgage,” but fairly easy to rank for “second mortgage to finance monster truck team.” Go figure.
Link Condom: Any of several methods used to avoid passing link love to another page, or to avoid the possible detrimental results of endorsing a bad site by way of an outgoing link, or to discourage link spam in user generated content.
Sandbox: There has been debate and speculation that Google puts all new sites into a “sandbox,” preventing them from ranking well for anything until a set period of time has passed. The existence or exact behavior of the sandbox is not universally accepted among SEOs.

Keep your Website Fresh for SEO

Keeping it fresh is another thing that is great for SEO. Make sure to update your website once a week. 

The changes don't have to be big. They could be as small as changing the homepage text or posting comments on your services page. Any change counts toward keeping your site fresh, which search engines love. One of the worst things you can do for SEO is to let your website go stale.

No one likes outdated information and neither do search engines.

Webmasters Still Feeling Aftershocks of Google SERP changes

By: Navneet Kaushal

At the beginning of this month, I reported on the Google SERP changes for July 2008, along with the confusion and chaos they had created. It seems that webmasters are still feeling the aftershocks of the Google SERP changes and, according to some, these massive fluctuations are still happening.

WebmasterWorld had a long thread regarding these fluctuations at the beginning of the Google SERP changes. That thread has now been expanded into a 'Part 2' series discussing the ongoing fluctuations in rankings in Google SERPs. Here are some of the posts from that thread:

"I run a large, well-established website that has been around for years, generating millions of visitors per month across millions of pages of documents. We have always been diligent about tracking all of our website metrics so that we understand user behavior, where our audience is coming from, and can use the data in order to improve user experience. 

Recently we have been experiencing *very* erratic Google organic traffic which jumps up and down by 30%-80%. The cycle has now repeated itself 6 times over about 6 weeks time. While there have been times in the past where our Google organic traffic has increased and decreased, it always has done it in a measured manner; we have never before seen erratic behavior from Google. 

Here are the traffic specifics:
June 3, Google organic drops by 30% vs. normal
June 4, Google organic traffic returns to normal
June 9, Google organic again drops by 30% vs. normal
June 17, Google organic returns to normal
June 19, Google organic again drops by 30% vs. normal
June 27, Google organic returns to normal
July 9, Google organic again drops by 30% vs. normal
July 11, Google organic returns to normal
July 12, Google organic again drops, but this time by 80% vs. normal
July 13, Google organic returns to normal 


While we are constantly in the process of refining our site, the only major change over the last couple months has been to our "related articles" component which does what it sounds like: if you are looking at article A, here are a handful of other articles that are highly relevant to the one you are viewing. Over time, we have been tuning the algorithm that generates these links so as to improve relevancy. 

I have also noticed some other artifacts:
Googlebot spidering activity has increased, reaching a plateau of about 140% of pre-link-change levels; on some days approaching 1 million pages/day.
The number of pages indexed in our Google Webmaster Tools sitemap reports jumped by 12%.

Any ideas about what might be going on here?"
"Its an update to my previous post, well i had fixed the metas errors in google webmaster tool. Its been little over a week now and errors are reducing by like 30-50 a day. its probably going to take a month for google bot to remove it completely."
"it can be a tip of an ice berg before it hit you badly. Start looking for potential problems in your pages e.g dupe content, meta titles/desc and key word stuffing and any thing which is breaching google tos."
"We are not doing any purposeful keyword stuffing of any sort, however we have over 5 million articles on our site and it is certainly possible that when we are pulling over "related articles" that the titles are very similar.
Is it normal to see organics go up and down repeatedly prior to a penalty?"
"I think that's extremely unlikely. Google wants webmasters to use their service - and all it does is give you reports and a way to communicate as the authenticated responsible person. In other words, it's a report only, and not a cuase of anything. 

Depending on how you generate a sitemap, you may introduce spidering problems, expose duplicated URLs and so on - that I can see possibly causing some trouble. But not just a WMT account on its own. And I've got scores of them as background experience, too."

