Author Topic: Check server page tool..


Offline polonus

  • Avast Überevangelist
  • Probably Bot
  • Posts: 33916
  • malware fighter
Check server page tool..
« on: April 30, 2008, 04:02:37 PM »
Look here:
http://www.bruceclay.com/cgi-bin/checkredirect.cgi?seoprivate=ON&evaluate=http%3A%2F%2Fwww.fravia.com

Change it to the page you want to check, e.g.
http://www.bruceclay.com/cgi-bin/checkredirect.cgi?seoprivate=ON&evaluate=http%3A%2F%2Fforum.avast.com/index.php

http://www.bruceclay.com/cgi-bin/checkredirect.cgi
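
If you want to build such a check link for any page from a script instead of editing the URL by hand, the evaluate parameter is just the target address percent-encoded. A minimal sketch in Python (the seoprivate and evaluate parameter names come from the links above; the helper function is only an illustration):

Code:
# Build a check URL for an arbitrary page by percent-encoding it into the
# "evaluate" query parameter, mirroring the example links above.
from urllib.parse import urlencode

TOOL = "http://www.bruceclay.com/cgi-bin/checkredirect.cgi"

def build_check_url(page):
    # seoprivate=ON and evaluate=<page> are the parameters shown in this post.
    return TOOL + "?" + urlencode({"seoprivate": "ON", "evaluate": page})

print(build_check_url("http://forum.avast.com/index.php"))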

     
Code:
            <P><strong>First</strong>, check your server for problems:</p>
            <P>This tool will perform a SOCKET TCP/IP Read for your site, a Request Read, and a Browser-type Get, comparing various header and source data to see if the site is the same and error free for each method.</p>
       
            <center>
              <form method="get" action="http://www.bruceclay.com/cgi-bin/checkredirect.cgi">
                <input type="hidden" name="seoprivate" value="ON">
                <table>
                  <TR>

                    <TD>Your HTTP File </TD>
                    <TD><input type="text" name="evaluate" size=90 value="http://www."></TD>
                  </TR>
                </TABLE>
             
                  <input type="Submit" value="Check -- Allow 2 minutes">
              </form>
            </center>
            <P>The purpose of the Check Server Page is to check the configuration of your web server for errors that could create problems for search engines. This information is very helpful since search engines often reduce web site rankings due to web server errors they encounter. At the very least, even if you encounter a common error that will not cause you to be dropped from the index, a "cleaner" site is likely to rank above you in the search engine results if all else is equal.</P>

            <P>The following is a listing of the common codes produced by web servers:
            </P>
            <center>
              <table bgcolor="#f0f0f0" border="1" width="60%">
                <tr>
                  <td align=center>Code</td>
                  <td align=center>Description</td>
                </tr>

                <tr>
                  <td>200</td>
                  <td>OKAY - No errors returned from server</td>
                </tr>
                <tr>
                  <td>206</td>
                  <td>Partial Content - Server error</td>

                </tr>
                <tr>
                  <td>301</td>
                  <td>Permanently Moved - Acceptable to search engines</td>
                </tr>
                <tr>
                  <td>302</td>

                  <td>Document found elsewhere - Redirect code - usually unacceptable to search engines</td>
                </tr>
                <tr>
                  <td>304</td>
                  <td>Not modified since last retrieval - Not likely for a search engine spider</td>
                </tr>
                <tr>

                  <td>400</td>
                  <td>Bad Request - Usually a spider or browser error.</td>
                </tr>
                <tr>
                  <td>401</td>
                  <td>Authentication required - Password protected access</td>
                </tr>

                <tr>
                  <td>403</td>
                  <td>Access forbidden - always protected</td>
                </tr>
                <tr>
                  <td>404</td>
                  <td>Document not found at this location - Page not found</td>

                </tr>
                <tr>
                  <td>408</td>
                  <td>Request timeout - probable server problem</td>
                </tr>
                <tr>
                  <td>500</td>

                  <td>Internal Server Error - Script failed</td>
                </tr>
              </table>
   
            </center>
            <P><strong>Second</strong>, if ANY errors are reported, correct them. The tool has a help file describing remedies for each section of the report. For instance, an error for robots.txt means that you do not have one or that it is corrupted; add one to the httpdocs directory (the same directory as your home page) even if it is empty. If you are on a dirty IP list, contact your ISP and complain. If redirects are flagged, your server may be mis-configured... read the bottom of the report, since the remedy is well defined right there.</P>
         
          <!-- END Tech Tips -->
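The page quoted above explains that the tool does a raw SOCKET TCP/IP read, a request read and a browser-type GET, compares the results, and reports the status codes listed in the table. If you just want a quick local look at the same things, here is a rough Python sketch (not the tool itself): it fetches a page once over a plain socket and once through http.client, prints both status lines plus the Server header, looks the status code up in an abridged copy of the table above, and also checks whether /robots.txt answers, as the remedies paragraph recommends. The host and path are just the forum example from this post, and plain port-80 HTTP is assumed.

Code:
# Rough home-grown version of the comparison described above: request the
# same page over a raw socket and via http.client, then compare results.
import socket
import http.client

STATUS_NOTES = {            # abridged from the table quoted above
    200: "OK - no errors returned from server",
    301: "Permanently moved - acceptable to search engines",
    302: "Found elsewhere - redirect, often a problem for search engines",
    404: "Document not found at this location",
    500: "Internal server error - script failed",
}

def socket_get(host, path="/"):
    """SOCKET-style read: hand-written HTTP GET, returns the status line."""
    with socket.create_connection((host, 80), timeout=10) as s:
        request = ("GET {} HTTP/1.1\r\nHost: {}\r\n"
                   "Connection: close\r\n\r\n").format(path, host)
        s.sendall(request.encode("ascii"))
        reply = s.recv(4096).decode("iso-8859-1", "replace")
    return reply.splitlines()[0]          # e.g. "HTTP/1.1 200 OK"

def client_get(host, path="/"):
    """Library-style read via http.client, returns (status, reason, headers)."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", path, headers={"User-Agent": "Mozilla/5.0 (check)"})
    resp = conn.getresponse()
    status, reason, headers = resp.status, resp.reason, dict(resp.getheaders())
    conn.close()
    return status, reason, headers

if __name__ == "__main__":
    host, path = "forum.avast.com", "/index.php"   # example target from this post
    print("socket :", socket_get(host, path))
    status, reason, headers = client_get(host, path)
    print("client :", status, reason)
    print("server :", headers.get("Server", "(no Server header)"))
    print("meaning:", STATUS_NOTES.get(status, "see the table above"))
    # Also check that /robots.txt exists, as the report remedies recommend.
    rstatus, _, _ = client_get(host, "/robots.txt")
    print("robots.txt status:", rstatus)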
   

polonus
« Last Edit: April 30, 2008, 04:09:47 PM by polonus »
Cybersecurity is more of an attitude than anything else. Avast Evangelists.

Use NoScript, a limited user account and a virtual machine and be safe(r)!