DNN SEO
DotNetNuke & Search Engine Optimization


Want to know how search engines process URLs and domain names? Here is a video from Nathan Buggia of the Live Search Webmaster Center.

URLs and Domains (SMX East 2008)

 

Here is a great video from Matt Cutts that outlines the steps to remove your URL from Google's index using Google Webmaster Tools and some other methods.



Matt Cutts Discusses Webmaster Tools

 

From the Google AdWords Blog:

"Did you know that 20% of the queries Google receives each day are ones we haven’t seen in at least 90 days, if at all? With that kind of unpredictable search behavior, it's extremely difficult to create a keyword list that covers all relevant queries using only exact match."

 




DotNetNuke.com has announced that the 2nd Release Candidate for DNN v.5 has been released. No word yet on when we will get the RTM version.

 

I've started a new blog dedicated to SEO tools, especially those that Google provides. When you search for SEO tools, you generally get a lot of spam and some mediocre tools. I hope to provide a one-stop shop for the search engine optimization tools that all webmasters need.

Please visit me @ www.GoogleSEOTools.net and voice your opinion....

 



Want to join us on Facebook? Go ahead and join us @ the DNN SEO Facebook Group, and invite your friends to spread the word.






 

From the October Google Webmaster Chat Q&A:

Adrienne, San Francisco: For SEO, I'd like to improve my rankings by removing technical obstacles (starting with dynamic URL parameters). What are the most important site fixes to make, and how can I document before-and-after success metrics using Google Webmaster Tools?

JohnMu: To check the crawlability of your site, I would recommend crawling your site with a crawler like Xenu's Link Sleuth (freeware, for Windows). Doing that gives you a rough look at how search engines view your site and can point you towards areas where crawlers get stuck in a loop or start crawling duplicates based on the URL parameters.
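
If you'd rather script a quick check yourself, here is a minimal sketch of the same idea in Python. It's only an illustration (the starting URL is a placeholder, and a real crawler should also honor robots.txt and crawl politely): it stays on one host and flags paths that show up with many different query strings, which is exactly where crawlers tend to get stuck in loops or pick up parameter-based duplicates.

import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse, urldefrag

class LinkParser(HTMLParser):
    # Collects the href of every <a> tag on a page.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start, limit=200):
    host = urlparse(start).netloc
    seen, queue = set(), [start]
    queries_by_path = {}  # path -> set of query strings seen for it
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            continue  # skip pages that fail to load
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute, _ = urldefrag(urljoin(url, href))  # drop #fragments
            parts = urlparse(absolute)
            if parts.netloc != host or parts.scheme not in ("http", "https"):
                continue  # stay on the one host
            queries_by_path.setdefault(parts.path, set()).add(parts.query)
            if absolute not in seen:
                queue.append(absolute)
    # Paths crawled with many distinct query strings are duplicate suspects.
    for path, queries in sorted(queries_by_path.items()):
        if len(queries) > 5:
            print("%s crawled with %d query-string variants" % (path, len(queries)))

crawl("http://www.example.com/")  # placeholder URL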

 

From the October Google Webmaster Chat Q&A:

Jane, Ireland: Does the geotargeting feature in Webmaster tools hold as much weight as having a country-specific TLD?

Kaspar aka Guglarz: Hi Jane!
Google uses a bunch of signals, like the location of the server or the TLD, to determine which users might be interested in a site's content. Geotargeting is a way for webmasters who use non-country-specific TLDs like .net to tell Google what their target group is, if the site is specifically targeted to users from a particular area. Think of the site of a small, local hardware store or a vet, for example: their main target users would be people living in the nearby area. Geotargeting is not to be used for language targeting, though.

[Post-chat edit: Using the tool may have some effect on non-country-restricted searches, but it probably won't be the same as having the country-specific TLD. Most sites will see results somewhere between the two extremes (no effect, and total equivalence with ccTLDs).]

 

From the October Google Webmaster Chat Q&A:

Anonymous: Sitemaps question - How do we know what pages in the sitemap are NOT indexed? The report shows us how many are in the submitted sitemap and how many are indexed, but not which ones are or are not indexed...

JohnMu: One way you can do that is to set up separate Sitemap files for the different parts of your website. Doing that you can find which areas of your site are not being crawled and indexed as much as you would like. Perhaps you'll also stumble upon areas that you don't want indexed completely?
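
As an illustration of that advice (the file names and URLs below are made up), you can tie the per-section Sitemap files together with a Sitemap index file and submit that; Webmaster Tools will then report submitted vs. indexed counts for each section separately:

<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: one child Sitemap per site section, so the
     Sitemaps report shows indexing coverage section by section. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-blog.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-forums.xml</loc></sitemap>
</sitemapindex>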

 

From the October Google Webmaster Chat Q&A:

Anonymous: Duplicate content - How many unique URLs (domains) are permitted to point at the same destination before it becomes a problem?
e.g. domain1.com, domain2.com, domain3.com all point to domain.com, which has its own pages...

Maile Ohye: Hi there, I'm happy to help with this question, but I'm not sure how it's duplicate content. In your example it seems like domain.com is the only site hosting content. It's common for companies to buy misspelling <-- did i spell that right? of their domain and then 301 to the correct version. Can you further explain the scenario for us?

Translation: It's not a problem.....

Kevin, Boston Ma: My company controls the main domain out of Italy with a .eu extension. We have an English version of the site. Will the European domain hurt our ranking?

Kaspar aka Guglarz: No, you won't have any ranking disadvantages due to the .eu domain :-)

 

On Tuesday DotNetNuke Corporation announced that it will release a new version of DNN, titled DotNetNuke Professional Edition. For the time being, the company is not releasing many details. Shaun Walker announced it in his OpenForce keynote in Las Vegas; it will be released in 2009 and will contain additional professional modules.

Prices and conditions will be announced soon. The DotNetNuke application framework and project modules will remain free; they will be called DotNetNuke Community Edition.

Stay tuned for more details.....

 

From the October Google Webmaster Chat Q&A:


tewmonkey, Cardiff, Wales: Until recently (the last six months or so), a high ranking was achievable by submitting articles to article directories (providing they were 40%-60% unique); that no longer seems to be the case. Have links from article sites been de-valued at all?

Matt Cutts: In my experience, not every article directory site is high-quality. Sometimes you see a ton of articles copied all over the place, and it's hard to even find original content on the site. The user experience for a lot of those article directory sites can be pretty bad too. So you'd see users landing on those sorts of pages have a bad experience. 

If you're thinking of boosting your reputation and getting to be well-known, I might not start as the very first thing with an article directory. Sometimes it's nice to get to be known a little better before jumping in and submitting a ton of articles as the first thing.

 

From the October Google Webmaster Chat Q&A:

Cynthia, San Francisco: We recently went through a rebranding of our company name. The old domain name was successful in page ranking; however, the new domain name has terrible page ranking. Do 301 redirects transfer the site equity from the old domain to the new domain?

Answer: Hi Cynthia, This is a pretty common question, so we actually did a blog post about it recently. In short, 301's are the best way to retain users and search engine traffic when moving domains. You can find the full post here: http://googlewebmastercentral.blogspot.com/2008/04/best-practices-when-moving-your-site.html
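
As a rough illustration of what such a move can look like on an IIS 7 server (where many DNN sites live), the old domain's site can issue a site-wide permanent redirect from its web.config; the destination domain below is a placeholder, and on IIS 6 you would configure the same permanent redirect through the IIS manager instead:

<!-- Illustrative only: web.config on the OLD domain's site issues a
     301 for every request; with exactDestination="false" the requested
     path is appended, so old deep links land on the matching new URLs. -->
<configuration>
  <system.webServer>
    <httpRedirect enabled="true"
                  destination="http://www.newdomain.com"
                  httpResponseStatus="Permanent"
                  exactDestination="false" />
  </system.webServer>
</configuration>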

 

From the October Google Webmaster Chat Q&A:

Anonymous: In addition to an XML sitemap, does it make any sense to also have an HTML sitemap on the same website? Does an HTML sitemap help improve the ranking?

JohnMu: An HTML sitemap file can help search engines, especially those that don't use XML Sitemap files. Also, the 404 widget in Webmaster Tools (which you can place on your 404 pages) will use "/sitemap.htm" and similar files to help users find the content they're looking for. So yes, I would recommend making HTML sitemap files; however, I'd focus on the user and not the search engines.
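
For what it's worth, an HTML sitemap doesn't need to be anything fancy. A sketch of a minimal /sitemap.htm (the page names are made up) is just a plain, user-readable page of links to the main sections:

<!-- /sitemap.htm: a plain, user-focused index of the site's main pages. -->
<html>
<head><title>Site Map - Example.com</title></head>
<body>
  <h1>Site Map</h1>
  <ul>
    <li><a href="/products/">Products</a></li>
    <li><a href="/blog/">Blog</a></li>
    <li><a href="/about/">About Us</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</body>
</html>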

 

From the October Google Webmaster Chat Q&A:


Anonymous: Sitemaps question: If my website has an extremely large number of pages, like Amazon.com, should I include every single URL that I want indexed in my XML sitemap?  If not, how should I go about populating my XML sitemap?

Wysz: Feel free to use your Sitemap to list all of your pages... that's what it's for! :) However, if you have many duplicate URLs for the same content, then you may want to only list your preferred versions of the URLs in your Sitemap.
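
If listing every page means more URLs than fit in one file, note that the Sitemaps protocol caps each file at 50,000 URLs, so very large sites split the list across several files tied together by an index file like the one shown earlier. Here is a rough Python sketch of that splitting; the URLs and file names are placeholders:

from xml.sax.saxutils import escape

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
LIMIT = 50000  # the Sitemaps protocol's per-file URL cap

def write_sitemaps(urls, prefix="sitemap"):
    names = []
    for i in range(0, len(urls), LIMIT):
        name = "%s-%d.xml" % (prefix, i // LIMIT + 1)
        with open(name, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="%s">\n' % NS)
            for url in urls[i:i + LIMIT]:
                f.write("  <url><loc>%s</loc></url>\n" % escape(url))
            f.write("</urlset>\n")
        names.append(name)
    # One index file pointing at every chunk; submit the index to Google.
    with open("%s_index.xml" % prefix, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="%s">\n' % NS)
        for name in names:
            f.write("  <sitemap><loc>http://www.example.com/%s</loc></sitemap>\n" % name)
        f.write("</sitemapindex>\n")

# Example: 120,000 placeholder URLs become three sitemap files plus an index.
write_sitemaps(["http://www.example.com/page-%d" % n for n in range(1, 120001)])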

 

From the October Google Webmaster Chat Q&A:

TylerDee, TX: Are .gov and .edu back links still considered more "link juice" than the common back link?

Matt Cutts: This is a common misconception--you don't get any PageRank boost from having an .edu link or .gov link automatically. Hah John, I beat you to it! If you get an .edu link and no one is linking to that .edu page, you're not going to get any PageRank at all because that .edu page doesn't have any PageRank.

JohnMu: We generally treat all links the same - be it from .gov or .edu or .info sites.
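
Matt's point, that the .edu page itself needs PageRank before its link can pass any, falls straight out of how PageRank is computed: a page's score is built from the scores of the pages linking to it, so a page nobody links to has almost nothing to hand on. A toy power-iteration sketch in Python (the link graph is entirely made up) makes this concrete:

DAMPING = 0.85  # standard damping factor from the PageRank paper

def pagerank(links, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                # Each page splits its own rank among its outbound links.
                new[target] += DAMPING * rank[page] / len(outlinks)
        rank = new
    return rank

# Made-up graph: popular.edu has inbound links, orphan.edu has none.
links = {
    "a.com":       ["popular.edu"],
    "b.com":       ["popular.edu"],
    "popular.edu": ["mysite.com"],
    "orphan.edu":  ["mysite.com"],
    "mysite.com":  [],
}
for page, score in sorted(pagerank(links).items(), key=lambda x: -x[1]):
    print("%-12s %.3f" % (page, score))

Running it, the link from the well-linked page passes along noticeably more weight than the link from the orphan page, even though both are .edu domains in the example.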

 

From the October Google Webmaster Chat Q&A:


Scott, Minnetonka: Do inbound links from other sites owned by the same company help or hurt rank?

Matt Cutts: I find that inbound links from the same company tend to break down into two camps. In one camp you'll find mom-and-pops that have a few sites, and that can make sense if those sites are linked; in the other camp, I've seen SEOs have 1,000 or 2,000 different domains and cross-link them. I definitely would not recommend that.

I think a lot of the litmus test in my mind is whether it makes sense to a regular person for those domains to be interlinked. If you look at a product like Coke, people aren't surprised to see that they have coca-cola.co.nz and several other domains. If you go to coke.com, it's perfectly reasonable to ask users which country they're coming from, and then send them to one of a bunch of domains. But if a regular user lands on example.com and finds 20 or 30 cross-links at the bottom of the page and they look like off-topic or cookie-cutter or spammy domains, that's going to look bad to almost anyone.

Maile Ohye: Hey Scott, I'm not trying to give you the run around, but this is a bit situation specific... overall, though, I wouldn't look at these links as helping or hurting your rank when written in a helpful manner to the user.

So, if you run a sporting goods site, and you link to your sister companies for camping and tailgating equipment, that's good for the user. More happy users can lead to more buzz, leading to better rankings.

If your sister companies are just linked at the footer of the page, in hopes of cross-advertising or getting more links, it's not likely to add value to ranking or the user. In extreme cases, if it's a bad neighborhood, these links will certainly not help you.

Put yourself in the user's seat, and do what makes sense for these links. Good luck!

 

From the October Google Webmaster Chat Q&A:

Anonymous: Suppose my website supports English and French.  Should the English version of a particular page and the French version have different URLs?  Any other best practices for multi-lingual site architecture?

Matt Cutts: If you can afford it, I would do domain.com and domain.fr. If that's not possible, I would consider doing en.domain.com and fr.domain.com. If that's not possible, then domain.com/en and domain.com/fr can work. In webmaster tools, you can geographically target a site (and I believe parts of a site such as fr.domain.com), which will help as well.

 

From the October Google Webmaster Chat Q&A:


Rick Rayn, Indiana: What weight do the age of a site and the amount of time a domain is registered for have on its search placement?

Matt Cutts: In the majority of cases, it actually doesn't matter--we want to return the best information, not just the oldest information. Especially if you're a mom/pop site, we try to find ways to rank your site even if your site is newer or doesn't have many links. I think it is fair for Google to use that as a signal in some circumstances, and I try never to rule a signal out completely, but I wouldn't obsess about it.

 

This is important for everyone, but it’s a particular challenge for online retailers. If you’re selling the same widget that 50 other retailers are selling, and everyone is using the boilerplate descriptions from the manufacturer, this is a great opportunity. Write your own product descriptions, using the keyword research you did earlier to target actual words searchers use, and make product pages that blow the competition away. Plus, retailer or not, great content is a great way to get inbound links.

 

The page title is the single most important on-page SEO factor. It’s rare to rank highly for a primary term (2-3 words) without that term being part of the page title. The meta description tag won’t help you rank, but it will often appear as the text snippet below your listing, so it should include the relevant keyword(s) and be written so as to encourage searchers to click on your listing.
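
As a quick illustration (the store and keywords are made up), a product page's head might look like this, with the primary term in the title and the description written as click-worthy snippet copy:

<!-- Illustrative <head>: the primary term "hiking backpacks" leads the
     title, and the description doubles as the search-result snippet. -->
<head>
  <title>Hiking Backpacks - Example Outdoor Store</title>
  <meta name="description"
        content="Compare lightweight hiking backpacks with free shipping
                 and a 30-day return policy at Example Outdoor Store." />
</head>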

 

Be patient


SEO isn’t about instant gratification. Results often take months to see, and this is especially true the smaller you are, and the newer you are to doing business online.

 

Cool video from John Reese on Traffic Secrets 2.0


 

Today we are publishing our first SEO review. We were asked by Nacho, a Microsoft MVP and the webmaster of Messenger Adictos, a popular Spanish website about Windows Live.

Google PageRank: 5
Very good PageRank; it could be higher if more focus were placed on the social media aspect of the website.

Meta Tags
It has meta tags for the title, description, and keywords, all very important, and they are customized for each page on the website, which allows search engines to distinguish and rank multiple parts of the site. Keyword density is a little high, though; perhaps more diversity is needed here.

Sitelinks
Google has successfully generated Sitelinks, which means the site has become "worthy enough" in the eyes of Google; Sitelinks let your users access different parts of the website directly from the search results.

SERPs
Search engine results placement is an area where Messenger Adictos could use a lot of SEO work. Currently it ranks #1 for "Messenger Adictos" in Google, Yahoo & Live, but anyone typing that into Google was going to visit the site anyway.

I conducted a rank check on the other keywords they have included in the meta tags, but the only one that ranked was "Messenger 9" on Yahoo @ #116.

More SEO work should be invested in this area, since organic search and the SERPs are the #1 gateway for bringing more traffic to a website.

Conclusion
www.MessengerAdictos.com is search engine friendly, has a lot of users, and has a good PageRank. In order to increase traffic and users, more work should be done on organic search and SERP placement.

 

The robots.txt file on your website tells search engines how you want them to crawl it, specifically which directories or files you don't want appearing in the SERPs.

With DNN you should pay special attention to this, because you will often see DNN's supporting directories show up in search results.

The easiest way to avoid this is to take the sample robots.txt file that DotNetNuke.com provides and build your way up from there.
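
As a starting point, a DNN-oriented robots.txt usually blocks the framework's supporting directories. The list below is only illustrative; the folder names are typical of DNN installs, but verify them against your own site and the official sample before relying on them:

# Illustrative robots.txt for a DNN site. Directory names are typical
# of DNN installs; check them against your own site and the official
# DotNetNuke.com sample before deploying.
User-agent: *
Disallow: /admin/
Disallow: /App_Browsers/
Disallow: /App_Code/
Disallow: /App_Data/
Disallow: /App_GlobalResources/
Disallow: /bin/
Disallow: /Components/
Disallow: /Config/
Disallow: /controls/
Disallow: /Install/
Disallow: /Providers/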

You can use the robots.txt analysis tool in Google Webmaster Tools to check your robots.txt file and make sure all your ducks are in a row.