Domain Crawling


Domain crawling is probably the best solution for maintaining both a main site and a regional version. With domain crawling the regional listing is far more comprehensive than with the other mechanisms explained above. Some pages, although regional, may be listed in the main listing as well.
__________________
WinHost Web Hosting

Site Cluttered with Old Files


Hi, there is a problem with my site:
the site is cluttered with old files.

What to do now ?

Hi,

Whenever possible, upload your site using the utilities that come with your website development software. For instance, if you made your site with SiteStudio, FrontPage or Dreamweaver, use their integrated web publishing tools. If you made your site with simple text editors, or if your site-building software does not have a publishing utility, use a freestanding FTP client, such as CuteFTP, SmartFTP, or the built-in web-based FTP agent.

The catch is that site publishing tools don’t remove your old web content from the server. For instance, if you used SiteStudio to upload a site with 30 pages and later published an updated 10-page version of this site, your directory on the server will have all the new pages plus the old pages that haven’t been overwritten. If you publish many versions of the website, the site may become cluttered with old files.
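If the clutter has already built up, one option is to delete remote files that your latest publish no longer contains. A minimal sketch using Python’s ftplib (the credentials and the `/htdocs` directory below are placeholders, and the comparison assumes a flat site directory):

```python
import ftplib

def stale_files(remote_names, local_names):
    """Names present on the server but absent from the latest local site."""
    return sorted(set(remote_names) - set(local_names))

def clean_remote(host, user, password, local_names, remote_dir="/htdocs"):
    """Connect over FTP and delete files the latest publish no longer contains."""
    with ftplib.FTP(host, user, password) as ftp:  # placeholder credentials
        ftp.cwd(remote_dir)
        for name in stale_files(ftp.nlst(), local_names):
            ftp.delete(name)  # remove the leftover file
```

Always review the list `stale_files` returns before deleting anything, and keep a backup copy of the remote directory.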

Spamming – A Definite No-No


A couple of years ago spamming may have worked wonders for your website. However, with the sophisticated algorithms being developed by all popular search engines, spamming can only backfire. Algorithms these days can easily detect spam and will not only ignore your website but may ban it outright.
Besides, instead of spending considerable time and effort on spamming, you can always follow other proven strategies and achieve a higher rank with most search engines. Spamming can also easily irritate readers. Think about it: if your homepage has unnecessary repetitions of a particular keyword, it is bound to frustrate a reader. Consequently your site, instead of being content rich, would be junk rich. This can have nothing but a negative impact on your business.

To delete postmaster or to change its quota


When we create a mailbox, we get a postmaster mailbox too. How can we delete it or change its quota?

You get the postmaster mailbox for free, and you can neither delete it nor change its quota. Webmaster is a regular mailbox and it counts towards your total mailboxes.

Hi,

With due respect to the members, I just want to ask whether I can change the mail quota, and how?

Regards,

Welcome Mub,

For each of your mailboxes you get a default amount of disk space for storing incoming and outgoing mail. Load statistics for each of your mailboxes can be found in the Properties section for every individual mailbox.
They show how much disk space you are using out of your mailbox quota. You can’t store more MBs than your mailbox quota allows. You can change your mailbox quota by clicking the Change icon next to the statistics readings for each individual mailbox.
When you change the quota:
– the recurrent fee for the days remaining to the end of the current billing period is refunded;
– you are charged the recurrent fee for the increased mailbox quota, prorated for the rest of the billing period.
Please note that your hosting plan may be configured to prevent you from setting a very high mailbox quota.

What if I do want to store more MBs than my mailbox quota allows?

Sorry, but you can’t store more MBs than your mailbox quota allows.

Search Engine Cloaking


Search engine cloaking is a technique used by webmasters to enable them to get an advantage over other websites. It works on the idea that one page is delivered to the various search engine spiders and robots, while the real page is delivered to real people. In other words, browsers such as Netscape and MSIE are served one page, and spiders visiting the same address are served a different page.
The page the spider sees is a bare-bones HTML page optimized for the search engines. It won’t look pretty, but it will be configured exactly the way the search engines want it to be for it to rank high. These ‘ghost pages’ are never actually seen by any real person, except of course the webmasters who created them.
When real people visit a site using cloaking, the cloaking technology (which is usually based on Perl/CGI) sends them the real page, which looks good and is just a regular HTML page.
The cloaking technology can tell the difference between a human and a spider because it keeps a list of known spider IP addresses, and no two IP addresses are the same. When a visitor arrives at a site that uses cloaking, the script compares the visitor’s IP address with the addresses in its list of search engine IPs. If there is a match, the script knows a search engine is visiting and sends out the bare-bones HTML page set up for nothing but high rankings.
There are two types of cloaking. The first is called User Agent Cloaking and the second is called IP Based Cloaking. IP-based cloaking is the better method, as IP addresses are very hard to fake, so your competition won’t be able to pretend to be any of the search engines in order to steal your code.
User Agent Cloaking is similar to IP cloaking, in that the cloaking script compares the User Agent text string sent when a page is requested with its list of search engine names (user agent = name) and then serves the appropriate page.

The problem with User Agent cloaking is that agent names can be easily faked. Search engines can easily devise a new anti-spam method to beat cloakers: all they need to do is fake their agent name and pretend to be a normal person using Internet Explorer or Netscape. The cloaking software will then serve the spider the non-optimized page, and your search engine rankings will suffer.
To sum up, search engine cloaking is not as effective as it used to be. This is because the search engines are becoming increasingly aware of the different cloaking techniques used by webmasters and are gradually introducing more sophisticated technology to combat them. Search engines may also consider cloaking unethical if it is not used properly.
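The dispatch logic described above amounts to a simple lookup. This sketch is purely illustrative (the IP addresses and agent fragments are made up, and, as the section notes, using this in practice risks a ban):

```python
# Illustrative only: the spider addresses and agent fragments are invented.
SPIDER_IPS = {"66.249.64.1", "207.46.13.5"}       # hypothetical crawler IPs
SPIDER_AGENTS = ("Googlebot", "Slurp", "msnbot")  # hypothetical agent names

def page_for(ip, user_agent):
    """Serve the bare-bones page to crawlers, the real page to everyone else."""
    if ip in SPIDER_IPS:                             # IP-based: hard to fake
        return "optimized.html"
    if any(a in user_agent for a in SPIDER_AGENTS):  # UA-based: trivially faked
        return "optimized.html"
    return "real.html"
```

The user-agent branch is exactly what an engine can defeat by sending a browser-like agent string, which is why the IP-based check is described as the stronger of the two.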


Does separating Domain Name Keywords with hyphens affect rankings?


Domain names play a huge role in search engine optimization and in obtaining a high search engine ranking. Type any keyword into Google, for example, and the chances are high that more than 80% of the first 10 sites contain that particular keyword in their domain names. Thus, the domain name is one area that can be fully utilized (obviously assuming that you are optimizing a brand new site from scratch and not an existing one).
It is also a good idea to include hyphens (-) between each of the keywords within your domain name. This tells a search engine spider that each word is a separate word, not one continuous word. The search engines which use keywords in the domain name as part of their ranking formula will not be able to recognize the keywords unless they are separated by a hyphen (or a slash or underscore for sub-directories). Search engines prefer the use of hyphens in domain names because they can produce more accurate search results by being able to recognize the specified keywords in your URL.
Keep in mind that a domain with words separated by hyphens will be harder for users to remember (and may also decrease the value of the domain). However, if keywords are separated with hyphens, they may be interpreted as such by search engines, thereby helping the site rank higher.
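The difference is easy to see if you split a domain the way a keyword-aware engine might. A small sketch (the splitting rule is a simplification; real engines use their own word-breaking logic):

```python
import re

def keywords_from_domain(domain):
    """Split a domain into candidate keywords on hyphens and underscores."""
    host = domain.lower()
    if host.startswith("www."):
        host = host[4:]
    name = host.rsplit(".", 1)[0]  # drop the TLD
    return [w for w in re.split(r"[-_]", name) if w]

keywords_from_domain("www.cheap-web-hosting.com")  # → ['cheap', 'web', 'hosting']
keywords_from_domain("www.cheapwebhosting.com")    # → ['cheapwebhosting']
```

The hyphenated domain yields three separate keywords, while the run-together one yields a single token that matches nothing.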


Taking advantage of Alphanumeric Search Engine listings


To have a good web site marketing plan, you must realize that alphabetical priority of domain names is still used by some search engines as a key factor in their ranking formula. Alphabetical hierarchy is even more important to web site marketing because this method is used by directories, which strictly list sites in alphabetical order based on the results of the keyword search.
Your web site marketing plan must acknowledge Yahoo!. Yahoo! is the number one directory and search site, with 52.7 million different visitors each month, accounting for 69.1 percent of all Internet surfers. An astounding 53.4 percent of all search-related traffic comes from Yahoo!, which can amount to half the traffic many sites receive. Many prefer to use Yahoo! because each site submitted to Yahoo! is human-reviewed, delivering more accurate search results for their visitors.
To include alphabetical hierarchy in your web site marketing strategy, realize that alphabetical priority does not consist only of letters; it also includes the hyphen and numbers, which rank higher than an “A”.
This is the order: -0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ
This means that the domain “www.1-800-ABC-NEWS.com” will rank higher with the search sites that use alphabetical priority in their ranking formula than just “www.ABC.com”.
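This ordering matches ASCII, where the hyphen and the digits sort before the letters, so you can check how a set of names would fall with a case-insensitive sort (the names below are made up for illustration):

```python
# '-' (45) and '0'-'9' (48-57) precede 'A'-'Z' (65-90) in ASCII.
names = ["ABC", "1-800-ABC-NEWS", "-AAA", "Zebra", "9Lives"]
ranked = sorted(names, key=str.upper)  # case-insensitive alphabetical priority
print(ranked)  # ['-AAA', '1-800-ABC-NEWS', '9Lives', 'ABC', 'Zebra']
```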


Click Popularity


Another factor influencing search engine placement in some search engines is Click Popularity. The number of users clicking on links to your page from the search results is counted. Pages that are frequently clicked will get a popularity boost.
Your site is awarded a certain number of points each time someone clicks your link from the search results. If your Web site already has a high ranking, you will get fewer points compared to a low-ranking site. This way all sites have an equal chance to get click-through points awarded.
Don’t be tempted to click your own link over and over again. Repeated clicks from the same IP will be detected. Clicking on your link and quickly returning to the search engine again might actually hurt your rank. The search engines will believe you did not find anything interesting at the page. That is not a good search engine optimization strategy.
How can you influence click popularity then? By putting some work into your page title and description meta tag. These are the main factors influencing people’s decision to click your link. High-quality content will make visitors stay at your search engine optimized web site and will stop them from quickly returning to the search engine.
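No engine publishes its formula, but the rules described above can be sketched with made-up point values (everything here is hypothetical, for illustration only):

```python
# Hypothetical scoring: the point values and thresholds are invented.
def click_points(current_rank, seen_ips, ip, dwell_seconds):
    """Award click-through points following the rules described above."""
    if ip in seen_ips:            # repeated clicks from one IP are ignored
        return 0
    seen_ips.add(ip)
    if dwell_seconds < 5:         # quick return to the results page: no credit
        return 0
    return min(10, current_rank)  # already-high-ranking sites earn fewer points
```

A site at position 1 earns less per click than a site at position 8, which is how every site keeps an equal chance of accumulating click-through points.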

Registering a keyword rich domain


Domain name registration is one of the most important things to consider when you are designing a site for high search engine placement. Here are some dos and don’ts of domain name registration.
Some Webmasters use shared domains or sub-domains available for free from popular Web hosting services, or some kind of free domain name redirect service. This might be cheap, but if you want a good search engine placement, it’s not an option.

• Some search engines do ban free Web hosts because search engine spammers frequently use them for hosting duplicate sites (mirrors) and doorway pages.
• Sharing domains or IP addresses with spammers can get your search engine position penalized or your entire site banned. Note this statement from AltaVista: “You could wind up being penalized or excluded simply because the underlying IP address for that service is the same for all the virtual domains it includes.”
• Most search engines limit the number of submissions or number of listings for each domain. This will make it very hard to get your site indexed. Other sites on the same domain might already take all the available spots.
• If you do manage to get your site indexed, the search engine will have a hard time finding the “theme” of your site if you are sharing a domain with other sites on many different subjects. Pages are no longer ranked one by one; all content within the domain is considered.
• Without your own domain, you will be forced to start working from scratch again if the host goes out of business or decides to change your URL. Many Webmasters have lost their search engine positions, link popularity and Web traffic because of this.
Keywords in the domain name are crucial. It makes sense to put your primary keywords into your domain name.
• Separate multiple keywords, like my-keyword-phrase.com, instead of typing it all as one word: mykeywordphrase.com. This makes it possible for the search engines to understand your keyword phrase correctly.
• Keep in mind that Yahoo and some other search engines reject domain submissions with URLs in excess of 54 characters. You would be wise to stay under the 55-character limit when choosing a domain name.
• Directories like Yahoo, LookSmart and ODP will not look for keywords in the text of your page, and editors will often edit keywords out of your title and description. This leaves your internet domain name as the single most important place to put keywords for your site.
• Too many dashes in a domain name might trigger the spam filters of some search engines.
• Yet another benefit of keyword rich domain names is in reciprocal linking. If the domain name keywords appear within the text of incoming links, you will get a major boost in ranking, especially in Google.


Increase DB Quota


Hi,

Can I increase DB Quota ?

Hi,

You can always increase your DB quota limit by paying a recurrent fee for the increased amount, which is usually less than the payments for overlimit usage. Your hosting plan can also be configured to prevent you from setting a very high DB quota.
You can change your DB quota limit by clicking the Change icon next to the statistics readings for each individual DB. In this case:

MS SQL DB:
1. The recurrent fee for the days remaining to the end of the current billing period is refunded.
2. You are charged the recurrent fee for the increased DB quota limit, prorated for the rest of the billing period.

MySQL and PostgreSQL DB:
1. The recurrent fee for the days remaining to the end of the current billing period is refunded.
2. Your DB quota is prorated to the days elapsed from the beginning of the billing period, and if your DB usage is more than this, you are charged an extra fee for the overlimit.
3. You are charged the recurrent fee for the increased DB quota limit, prorated to the days left to the end of the billing period.
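Worked out with made-up numbers, the refund and the new charge net out like this (the fees, the day of the change, and the 30-day period are purely illustrative; any overlimit fee from step 2 would be added on top):

```python
def net_due_on_quota_change(old_fee, new_fee, day, period=30):
    """Net amount billed when the quota is raised on a given day of the period."""
    days_left = period - day
    refund = old_fee * days_left / period  # step 1: unused old fee refunded
    charge = new_fee * days_left / period  # step 3: new fee for the days left
    return round(charge - refund, 2)

net_due_on_quota_change(3.00, 6.00, day=10)  # → 2.0
```

Doubling a $3 quota fee on day 10 of a 30-day period refunds $2.00 of the old fee and charges $4.00 for the new quota, so $2.00 is billed net.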

The DB quota statistics show how much disk space you are using out of your DB quota.

The DB quota doesn’t stop you from using more disk space (except for MS SQL DBs), but if you go over it, you will be charged an extra fee for the excess.