Latest Updates


Monday, July 25, 2011

Should You Rebuild Your Website?

If your site boasts thousands of pages of content, but no longer attracts the amount of traffic it used to, it may be time to rebuild. That's especially true if your site has been around for many years without an overhaul. But how can you tell? And where do you begin?
Michael Martinez offers an instructive personal case history on Xenite.org. This science fiction themed website shared a server with some of Martinez's other websites. When the server reached the end of its service life, Martinez faced the unenviable task of “digging into code and applications that were several years old” and basically obsolete, according to his case study. He would have to upgrade the operating system and the hardware as well.
So Martinez and his partner dived in to do the upgrade – and basically everything broke. Forum functionality, email, you name it; “In fact, the whole server went dark for up to a week,” he recalled. But the bigger problem was that he couldn't just move everything over easily. Xenite.org includes tens of thousands of HTML pages, all of them hand-coded. With that many pages, “the idea of importing them into a CMS with little to no opportunity to fix problems is not very enticing.”
Martinez started moving the site over in April. By the end of May, with thousands of broken pages and more Perl scripts to fix than he wanted to think about, he realized that he hadn't touched many of these pages in years...and wondered whether his traffic would even notice if they no longer existed. So he took a quick look at Google Analytics, and discovered “the unmistakable dip of a Panda update in our Google referral traffic beginning around May 9.”
It was then that Martinez decided that he needed to do more than just set the site back up more or less as it had been before. He'd written earlier about getting rid of content that is no longer serving visitors, and letting go of websites that no longer served their purpose. In this case, it meant deleting everything from www.xenite.org, except for a few sub-domains. “The old Xenite has been swept away and I have no intention of ever restoring it to the light of day again. This is precisely the kind of medicine I have been prescribing for people who are suffering from the Panda Syndrome,” Martinez explained.
If you own a problem website, or you're looking at upgrading or overhauling your site, it's worth your time to dig deeper to see what's really going on. How many visitors are you getting? What trends do you see over time? Are there lots of pages that no one has visited in years? Is your site a Panda victim? Then it may be time to wipe the slate clean and consider your options.
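If you want to ground that decision in data rather than instinct, a few lines of code can surface the pages nobody visits anymore. The sketch below is only an illustration: it assumes a hypothetical CSV export from your analytics tool (called pageviews_last_year.csv here) with one row per page and a pageviews column.

```python
# Minimal sketch: list pages with very few views over the past year.
# Assumes a hypothetical analytics export "pageviews_last_year.csv"
# with columns: page, pageviews
import csv

THRESHOLD = 10  # pages with fewer yearly views than this are candidates for pruning

stale_pages = []
with open("pageviews_last_year.csv", newline="") as f:
    for row in csv.DictReader(f):
        views = int(row["pageviews"])
        if views < THRESHOLD:
            stale_pages.append((row["page"], views))

# Least-visited pages first, so the clearest candidates are reviewed first
for page, views in sorted(stale_pages, key=lambda item: item[1]):
    print(f"{views:>6}  {page}")
```

Pages that show up in a report like this are the ones to consider consolidating, redirecting, or deleting outright, much as Martinez did.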

How to Increase Website Page Views

In this search engine optimization tutorial you will learn ways to increase page views on your website. A must-read for SEOs looking to increase traffic and revenue!
Increasing website page views should be an important goal for any webmaster. From a user-experience point of view, “page views” are a much more important indication of website quality than “visits” or “unique visits”. Consider the following scenario:
Website A:
22,000 unique visitors
23,456 page views

Website B:
15,792 unique visitors
48,102 page views
Statistically speaking, you might think of Website A as the more powerful, higher-quality, better-optimized site because of its higher unique visitor count. But it's not, for the following reasons:
  • Website A has fewer page views than Website B. Even though Website A attracts far more unique visitors, those visits translate into very few page views. Compute the average page views per visit for Website A:
Average page views per visit = Total page views / Unique visitors = 23,456 / 22,000 ≈ 1.07
This means the typical visitor is not interested in reading the other pages of Website A: on average, each visitor views a single page and then leaves the site.
  • Website B is the far superior site, because of its higher page-view total and its higher average page views per visit (both calculations are worked out in the short sketch after this comparison):

  • Average page views per visit = 48,102 / 15,792 ≈ 3.05

  • This means that a typical user reads or checks about three pages before finally leaving the site, which points to better quality and a better user experience on Website B. It also implies that Website B makes the most of each unique visitor by leading them to view more than one page, and it translates into higher average time on site for each visitor - a clear indication that the website is serving the right content to the right type of visitor. With respect to SEO, Website B optimizes its content and targets its visitors better than Website A, which in turn translates into higher page views.
    Website A's numbers suggest that it may be targeting the wrong type of visitor or serving content that does not match what those visitors want. This can be a sign that the site's SEO fails to deliver overall “success” despite an increase in visits.
    Financially, it is actually higher page views that translate into better website income (whether from AdSense, sales, CPM, and so on), not higher unique visitor counts. It's troubling how many SEO companies focus on increasing unique visitors and use that figure as their measure of success, when a more accurate measure of “success” is higher page views.
    In time, if Website B keeps adding more valuable content, it will surpass Website A in unique visitors as well, and the difference between the two sites will grow beyond comparison.
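    To make the arithmetic concrete, here is a minimal sketch that computes the page-views-per-visit figures used in the comparison above (the traffic numbers are the hypothetical ones from this example):

```python
# Average page views per visit for the two hypothetical sites above
def pages_per_visit(page_views, unique_visitors):
    return page_views / unique_visitors

site_a = pages_per_visit(23_456, 22_000)   # ~1.07
site_b = pages_per_visit(48_102, 15_792)   # ~3.05

print(f"Website A: {site_a:.2f} page views per visit")
print(f"Website B: {site_b:.2f} page views per visit")
```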
    So what makes a site like Website B a success in SEO?
    Ways You Can Increase Website Page Views
    The good news is that increasing page views is not an extremely complex task. It requires good planning by the website owner around the site's design and navigation flow, planning around the type of content to serve to readers, and research into attracting the right type of visitors.
    To illustrate the process, let's use a highly successful website with very high page views, http://www.youtube.com/, and suppose a user searches for the keyword “How to train for a 5k run video”.
    When the user clicks one of the first-page results in Google, a YouTube video with the title tag “How to Train for a 5K” appears.
    This gives us the first tip:
    TIP #1: A more descriptive and accurate title tag earns high-quality, well-targeted visitors from a search engine.
    The user expects to see video content related to training for a 5K run. The YouTube video title appears relevant to the visitor because it nearly matches the search query. The user will then do one of the following in response to the content YouTube provides:
    1.) The user plays the video. A typical YouTube video runs from one minute to more than five minutes. This type of content is satisfying because the user can watch the process and pick up extra details.
    This gives us the second tip:
    TIP #2: Put engaging, substantial and detailed content on your website.
    This does not apply only to video, but to quality text content as well. Well-written text with good images provides a more engaging reading experience, and the detail in the content encourages the reader to spend quality time on your website. If your content is too thin, the user may not be willing to spend more time with it - though keep the length within reasonable limits.
    2.) While viewing the content, the user might read comments from other viewers to get more insight about the video, or leave a comment after watching it. Reading comments, and being able to submit them, keeps visitors on the page longer and increases the chance that they view another piece of content on your website, which leads to our third tip:
    TIP #3: Enable user comments on your website. Moderate them, and make sure they are of good quality and useful to your visitors.
    What happens next?
    3.) While looking at the YouTube page for “How to Train for a 5K”, the user notices that YouTube suggests videos on related topics, and clicks another video for more details. This is the fourth tip:
    Tip #4: Show links to related content alongside the content the visitor is currently viewing or reading.
    This is the most important factor in increasing page views. Without related-content links, the user must either use the search box on your website or leave the site.
    Visibility of suggested or “related” content is very important. In the YouTube example above, the suggested videos are placed to the side of the video, but that placement is not applicable to all types of websites. In WordPress, for example, the side section of the page is commonly reserved for the sidebar and navigation.
    Find a way to integrate related-content links into your pages. If you are using WordPress, there is a plug-in called “Yet Another Related Posts Plugin”: http://wordpress.org/extend/plugins/yet-another-related-posts-plugin/
    You can add that plug-in to your WordPress website and it will automatically generate links at the end of each blog post pointing to related content. For example, this page reviews the plug-in and shows a list of related posts at the bottom: http://www.webhostingpro.co.za/wordpress-plugin-review-yet-another-related-posts-plugin
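    To show the general idea behind such plug-ins, here is a toy sketch - not YARPP's actual algorithm, just an illustration - that suggests related posts by counting shared tags. All of the post slugs and tags below are made up for the example.

```python
# Toy illustration of "related posts": score every other post by how many
# tags it shares with the current post, and keep the best matches.
posts = {
    "how-to-train-for-a-5k":  {"running", "training", "beginner"},
    "choosing-running-shoes": {"running", "gear"},
    "triathlon-basics":       {"triathlon", "training", "beginner"},
    "bike-maintenance":       {"cycling", "gear"},
}

def related_posts(current, catalog, limit=3):
    current_tags = catalog[current]
    scored = [
        (len(current_tags & tags), slug)
        for slug, tags in catalog.items()
        if slug != current
    ]
    # Keep only posts sharing at least one tag, best matches first
    return [slug for score, slug in sorted(scored, reverse=True) if score > 0][:limit]

print(related_posts("how-to-train-for-a-5k", posts))
# ['triathlon-basics', 'choosing-running-shoes']
```

    A real plug-in does this against your post database and caches the results, but the underlying idea is the same: surface links the current reader is likely to click.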
    On that page, the related posts listed at the bottom also cover WordPress topics. Going back to the YouTube example: after clicking through the related videos, the visitor might want to search for something like “How to run a 5k faster” that is not shown among the related videos. In that case, the user enters the query in the YouTube search box. This gives us the fifth tip for increasing page views:
    Tip #5: Add a search box across your entire website, particularly on pages with content. The search results should also be high quality and relevant to the query.
    Yet a lot of websites do not have a search box, so visitors leave when they cannot find anything else relevant. In WordPress, search boxes are widgets, and the same is true of Blogger/Blogspot; however, for best results (particularly if you are aiming for high relevance) you should use the Google Custom Search widget. You can embed the Google search box directly in your website, and its results will be more relevant than those of your default search box. To do this in WordPress, you can read this tutorial: http://tools.devshed.com/c/a/Blog-Help/Add-Google-Custom-Search-to-WordPress-Blogs/
    While reading your content, the visitor might also find internal links with relevant anchor text pointing to other related pages on your website. This is the sixth and final tip:
    Tip #6: Another big factor in increasing page views is adding internal links within your content, using relevant anchor text. These internal links should point to other relevant pages on your website.
    A good example of this is Wikipedia.org. If you start reading any Wikipedia article, you may be surprised at how easily you spend hours on the site. The primary reason is that while reading the content, you click one of its internal links pointing to another related page; when you land on that page, you read and click yet another internal link. As a result, both the amount of time you spend reading and the number of pages you view become very high.
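    If you want a rough measure of how well a page is interlinked, a short script can count its internal links. The sketch below uses only Python's standard library; the URL is a placeholder to replace with one of your own pages.

```python
# Minimal sketch: count the internal links on a single page.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

PAGE_URL = "https://www.example.com/some-article/"  # placeholder URL

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(PAGE_URL, href))

html = urlopen(PAGE_URL).read().decode("utf-8", errors="replace")
collector = LinkCollector()
collector.feed(html)

site = urlparse(PAGE_URL).netloc
internal = [link for link in collector.links if urlparse(link).netloc == site]
print(f"{len(internal)} internal links found on {PAGE_URL}")
```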

    How to Drive Your SEO Crazy

    It's understood that nobody actually sets out to drive their SEO crazy, but you wouldn't know it to listen to SEOs tell their side of the story. Many problems can be boiled down to a lack of understanding and a lack of communication. In this article, I'm going to list some of the common mistakes that site owners make which drive their SEOs crazy – and derail their own SEO campaigns.
    I got this list from Stoney deGeyter, writing for Search Engine Guide. I plan to expand a little on many (but not all) of the mistakes he mentions to show you why each one is a bad idea, and what you should do instead. This way, you'll avoid the horror of, as deGeyter puts it, “waking up in the morning to find that Google has forgotten who you are and kicked you to the curb like a drunken date the morning after.”
    The first mistake deGeyter mentions is overwriting your SEO's changes by editing an outdated copy of your website and publishing it live – and worse, forgetting to tell your SEO about it. Your SEO created, or helped you create, the new version of your website for a reason: it's supposed to perform better than your old one. If your new site needs work, editing an old copy of it and then publishing that one will undo all of the positive changes (and hard work) your SEO already put in. This is why you should always discuss changes to your site with your SEO first.
    The second way you can mess up your SEO campaign is by uploading a robots.txt file that “disallows” the search engines from crawling and indexing your entire site. Sadly, that's a pretty simple mistake for someone who is not technically inclined to make, if they're trying to do things they don't fully understand. You're trying to keep scraper bots from getting at your content? That's great; just don't disallow the bots that really DO need to see your content! This is one mistake that can keep Google from seeing your site at all. There's nothing wrong with wanting to learn how to handle the technical aspects of your website, but if you're not entirely sure of what you're doing (and maybe even if you are), let your SEO check over these kinds of things. His or her job involves making sure you're visible to the search engines.
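    One way to avoid this mistake is to test a robots.txt draft before you publish it. The sketch below uses Python's standard urllib.robotparser module to confirm that a sample rule set (an illustrative example, not a recommendation) blocks a hypothetical scraper bot while still letting Googlebot reach ordinary content.

```python
# Sanity-check a robots.txt draft before uploading it.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: BadScraperBot
Disallow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot should still be able to reach ordinary content pages
print(parser.can_fetch("Googlebot", "/articles/seo-tips.html"))       # expect True
# The scraper bot we actually want to block should be refused
print(parser.can_fetch("BadScraperBot", "/articles/seo-tips.html"))   # expect False
```

    If the first check ever prints False, the file is locking the search engines out and should not go live.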
    Along those lines, the third mistake deGeyter mentions involves changing and re-developing your CMS. Yes, sometimes the CMS needs to be updated – but if it isn't done cautiously, it can lead to well-ranked URLs losing their ranking. That's another technical aspect of your site and its ranking in which you need to have your SEO as well as your website developer involved. Sometimes, you can't avoid changing URLs. In that case, you can at least set up proper 301 redirects as soon as possible – and your SEO can help you with that as well.
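    How you actually implement 301 redirects depends on your server or CMS (Apache rewrite rules, nginx configuration, a plug-in, and so on), but conceptually it is just a mapping from retired URLs to their replacements. Here is a minimal, purely illustrative sketch using Python's standard library; the old and new paths are hypothetical.

```python
# Conceptual sketch of permanent (301) redirects from old URLs to new ones.
# In practice you would configure these in your web server or CMS instead.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping from retired URLs to their replacements
REDIRECTS = {
    "/old-cms/article.php?id=42": "/articles/how-to-train-for-a-5k/",
    "/old-cms/about.php": "/about/",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            self.send_response(301)             # permanent redirect
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), RedirectHandler).serve_forever()
```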
    Have you ever heard the saying “if it isn't broken, don't fix it”? That brings us to the fourth mistake site owners make that drives SEOs crazy: changing all of the website's URLs to be “keyword friendly” when you're already ranking very well for your keywords. I have to give site owners credit for trying to understand SEO here. Yes, keywords are important; yes, URLs and page titles are important; and yes, it's possible that keyword-friendly URLs can help your standing in the SERPs. But before you make those kinds of changes, please check with your SEO. Anything that involves a change to a URL is pretty serious; you're likely to lose whatever ranking that page held previously. That's why you should avoid changing URLs when your pages are ranking well.

    Google Shortcuts and Circles

    Are you looking for more proof that Google is taking online social networking seriously, and that Google+ has some staying power? Look no further than a recent purchase by the search giant, and the many uses the digiterati are finding for one major feature of the new social network.
    The purchase this time is not a company, but a URL. It's g.co, acquired from .CO Internet SAS. This organization handles .co domain names. Google's hardly the first big company to acquire one of these URLs; Twitter owns T.Co and Overstock boasts O.Co. The interesting question is what the search company will do with their brand new domain.
    In that area, they're quite forthcoming. It's not going to be a URL shortener, as you might expect from Google's increased involvement on social networks; the search firm already owns the public URL shortener goo.gl. Rather, g.co will be used to host URLs that send users only to web pages on Google properties.
    It's a brilliant move. As the company itself explains in a blog post, you can't always tell what website you're going to be directed to when you click on a shortened URL. If you know that a g.co URL will only lead to official Google products and services, you'll feel a lot safer and more confident that you won't land somewhere dangerous when you make that click.
    As Rachel King explains on ZDNet, this action by the search company will mean that seeing g.co in a link “legitimizes the link and makes it more trustworthy to the end user.” It offers other advantages, too, such as “streamlin[ing] access to Google sites in general, especially when users want to share something quickly on social media sites...Having a shorter URL also makes it easier and cheaper when using URL addresses in advertisements.”
    That latter point won't actually be very useful to advertisers in this case, unless they're linking to Google properties. On the other hand, creating an ad with a link to a Blogger blog for your company – or a Google+ profile, even though businesses aren't supposed to have their own pages on Google+ yet – could represent an interesting new channel for promoting your business on the web. But don't rush to try this yet, as Google is still rolling out the service. At the time of writing, in fact, Google displayed only a landing page at g.co. We'll see how this develops.

    Increase Website Authority with Wikipedia Guidelines

    This search engine optimization article aims to teach you how to increase your website's authority by following Wikipedia guidelines.
    Becoming an authority is important to a website. It means you are a “reliable”, “factual” or “accurate” source of information. This will increase your “web trust” in the eyes of the major search engines like Google, Bing and Yahoo. It also increases visitor and reader confidence in your content and services.
    There have been some attempts to measure authority, such as this article: http://www.seochat.com/c/a/Google-Optimization-Help/Measuring-Website-Authority-in-Google/, which shows that authority level in Google is related to these three factors:
    1: Age of the domain
    2: Link popularity
    3: Size of the website
    One big question remains: “Why are some websites treated as more trusted sources by search engines and Wikipedia despite having a similar or lower authority level?” Or, to state the question another way: how do you become an “authoritative” website?
    Wikipedia, the biggest online encyclopedia, released some information on how your website can become authoritative in nature. This article will take a look at this information and extract useful input for SEO and website owners or blog authors.
    What does Wikipedia consider a “trusted resource”?
    On the Internet, almost any website or blog can claim to be an “expert” on something. But Wikipedia looks beyond that. If you are writing for Wikipedia, there is a strict set of rules that every contributor must follow when citing sources. Failure to follow these guidelines can result in an article being edited or even removed from the site. The primary reason is that Wikipedia only allows articles backed up by a “trusted resource”.
    But again, since anyone can claim to be an expert on something, it can be hard for Wikipedia contributors to decide which information can be trusted. This is where the guidelines become very important. One rule for adding article topics to Wikipedia is that if there is no “verifiable and trusted resource” on a topic, then an article on that topic should not be added. This is why you often see “self-promotion” articles on Wikipedia get deleted: there is no reliable source supporting the topic. Wikipedia does not immediately delete those types of articles; instead it shows warnings to any interested contributor, as on this page: http://en.wikipedia.org/wiki/OfficeSIP_Messenger. Take note of this warning: “If notability cannot be established, the article is likely to be merged, redirected, or deleted.”
    “Verifiable and trusted resource” means third party content sources. The most important verifiable and trusted resources are as follows:
    1.) Educational journals – the journals published by prominent universities and colleges.
    2.) Books authored and published by universities
    3.) Mainstream newspapers
    4.) Expert authors of blogs, self-published sources
    Source: http://en.wikipedia.org/wiki/Wikipedia:SOURCES#Self-published_sources
    #1 to #3 are straightforward. The content published in those sources is reliable and verifiable because:
    1.) Many people are involved in fact-checking during the editing and publishing process. For example, before an article or research paper can be published in an academic journal, it is reviewed by experts or a panel of editors.
    2.) The content is supported by facts and research that can be verified.
    3.) The authors are publicly recognized experts in their fields, which makes these publications trusted sources of information. For example, a mathematics textbook written by a Ph.D. in mathematics is more likely to be accurate than one published without any such review.
    #4 warrants a more in-depth discussion in the next section.
    Definition of “Expert Authors”
    Most websites on the Internet are blogs that self-publish their content, which means the blog author alone is responsible for fact-checking it. In general, Wikipedia does not treat self-published sources as reliable, since anyone can publish blog content and claim to be an expert.
    However, self-published material written by an “expert author”, as Wikipedia defines the term, can be considered a reliable source. Remember that it is hard to show you are an authority if your content does not reflect your expertise. So how can you become a “real expert author”?
    1.) First, you need to have expertise in a certain topic. For example, if you want to start a blog about bone fractures, you should be a licensed orthopedist. That way, your content reflects your expertise in treating bone fractures and in advising your readers about bone-related illnesses.
    2.) It's not enough just to be certified to practice a profession. You should have some work previously published by “reliable third-party publications”.
    Source: http://en.wikipedia.org/wiki/Wikipedia:SOURCES#Self-published_sources
    What are these reliable third-party publications? They are as follows:
    1.) Official publications of your professional organization. For example, if your profession is electrical engineering, your official publications might come from the Institute of Electrical and Electronics Engineers, such as here: http://www.ieee.org/publications_standards/. If you have written an article published by the IEEE, that is great, because it helps establish your authority as an author.
    2.) If you are a student in a particular field (for example, a mathematics major), your official publication might be your university's journal or another university's mathematics journal.
    Best Practices for Web Authors to Become an Authority
    There are millions of blogs on the Internet, created for different purposes: advertising, search engine optimization, sharing information, or even self-promotion. Think carefully about your primary purpose for starting a blog, because it has a strong impact on the blog's authoritativeness. A website or blog created for “black hat” purposes - spamming, SEO content farming, link advertising, affiliate content farming, PageRank selling, and so on - will have a very low reputation in the web industry and will find it impossible to become an “authority” or a “trusted resource”. Reputation on the web is not easy to attain. So what best practices should a webmaster or blog author follow to increase a site's or blog's authority?
    1.) Only start a blog with topics matching your core expertise. This will let you create the best content possible. Also, being an expert author of that topic means you will not easily run out of article topics for your blog.
    2.) Limit your blog to a single niche that reflects your core expertise. This gives search engines the best chance of identifying your website as an authority on the subject matter. A multi-niche website covering sports, entertainment, and medicine, for example, is at a disadvantage if you are a single self-publishing author, because you cannot be an expert in all of those topics.
    You should have one major niche (with sub-niches) for the best long-term results. This still gives you plenty of article ideas that belong to that niche.
    Case Example: Mr. X is an athletic coach in running, cycling and triathlons. He writes some articles that are published in reliable sports magazines, such as USA Triathlon Magazine. Mr. X's sole expertise is triathlon. He decided to create a blog that shares his information about these sports. What niche should he select?
    Answer: First, he could write a blog entirely about triathlons. But this topic is so specific that he might run out of article ideas in the long term. It is also a very narrow niche that attracts only one kind of visitor: triathletes.
    To increase the potential for a larger audience and more content in the future, he might instead decide to write a blog about “multi-sports” in general, with posts on running, triathlons, cycling, and so on. As a result, the site will attract many visitors who are interested in these sports.
    To search engines, Mr. X is an expert author on “multi-sports”. Many websites, including Wikipedia, might cite Mr. X's articles even though they are self-published, because:
    a.) He has shown some expertise in these types of sports. He practices and coaches a lot in that field.
    b.) He has some published articles in reliable sports publications.
    3.) Boost your authority by showcasing the articles you have published in the reliable publications of your field. For example, Mr. X can create a page on his site called “About Mr. X” or “My published articles” and link from it to his articles in those publications.
    4.) When writing content, implement strict quality practices whenever possible: fact-checking, citations, and spelling and grammar checks.
    5.) Make sure your website conveys a professional image to your visitors. Avoid excessive ads, pop-ups, spammy comments, spammy external links, and link pages on your site.
    6.) Moderate comments and publish only helpful ones. Although Wikipedia does not consider website or blog comments a reliable source, curating helpful comments makes for a more user-friendly website.
    7.) Make sure your website or blog does not fall under Wikipedia's definition of “questionable sources”: http://en.wikipedia.org/wiki/Wikipedia:SOURCES#Selfpublished_and_questionable_sources_as_sources_on_themselves
    a.) The content is promotional in nature. A common example is content written primarily to induce the reader to buy something (affiliate-based content).
    b.) Content that relies heavily on personal opinion without any citations or sources.
    By following the tips in this tutorial, you will dramatically increase the quality of your blog or website content, which translates into a better user experience and a higher authority level for your site.

    Yahoo Site Explorer Merging with MWT

    Say good-bye to Yahoo Site Explorer. The essential tool for webmasters looking to optimize for the Yahoo search engine seems to have become redundant. Both Microsoft and Yahoo stated that YSE will be shut down later this year, and encouraged switching to Microsoft Webmaster Tools.
    In a post on Yahoo's search blog, Herman Minocha, the product manager for Yahoo Site Explorer, acknowledged that a year ago his company said that they would continue to support and extend the service, even after completing the transition to Microsoft-powered searches. However, after input from users and the team from Bing Webmaster Central, the companies realized that “Having two webmaster portals for a single source for organic results does not add enough value,” Minocha noted.
    So when, exactly, will this shutdown take place? Minocha says that it will happen “once all markets are transitioned” to the Microsoft search platform. He gives no more specific date than “later this year,” but notes that the search blog will continue to update readers on the status of the transition.
    In the meantime, Minocha says that webmasters should start using Bing Webmaster Center, but they don't need to stop using Yahoo Site Explorer just yet. “In a large part of the world, we have not yet transitioned” to the Microsoft search platform, Minocha explained, so “to continue to receive relevant organic traffic from Yahoo! Search results, we encourage you to continue using Site Explorer as you have in the past until we fully switch over.”
    If you didn't see the writing on the wall for the Yahoo search engine, this is the clearest indication yet that it will cease to have a separate existence. In fact, Minocha explicitly states that webmasters will not need to optimize separately for the Yahoo and Bing search engines in the future. That will happen as soon as this year, when Yahoo's organic results are completely powered by Bing in all parts of the world.
    While this event will simplify search engine optimization for many people, it's hard to see Yahoo fade away without mixed feelings. Much older than Bing, and several years older than Google, Yahoo stayed relevant as a search site much longer than earlier entries to the field, such as Excite, Lycos, and even Ask. Once the transition to Bing-powered results is complete, will the combined Microsoft-Yahoo search engine become a stronger competitor to Google, or will the combination simply fall victim to the search behemoth's competitive strength as so many others have before? There's no telling, but even if it does, Yahoo will probably continue to exist as a source of news and various information, as its associated sites and services still see a huge amount of traffic.
    If you're using the Site Explorer APIs, time is running out. Yahoo had intended to shut those down at the end of last year, but extended the deadline. They will now be closed on September 15, 2011.
