Google/Blogspot blocking proxy access

May 17, 2006 at 8:46 am 2 comments

For the past week or so, maybe longer, I've been getting a "503 – Connect failed" message every time I try to connect to a Blogspot.com blog through my SSH proxy.

I'd had this issue in the past, and then it seemed to go away. But now it's back, and I can't read Lewlew's or Morrigan's or Jefftoo's blogs – or any others hosted on Blogspot.com. (Including my own.)

Well, after going to the source (Doh!), I realized that this is an across-the-board issue between Blogspot and my proxy provider. Here's what I found.

The good folks at Cotse.com and Cotse.net, in mulling over this dilemma, don't seem to be considering that perhaps Google (owner of Blogspot, and with whom they're having other issues) ain't interested in receiving connections from computers they can't catalog and trace back.

Looks like I might need to create some alternatives for my blog activities. Oddly, today I'm able to sign in at Blogger.com through my proxy. I guess it's only the blog readers they're tracking – for the moment.

If you use Blogspot, whether to blog or to read, Cotse asks you to contact the Blogspot people to request that they remedy the situation. If you do blog there, you might be missing out on readership because of this issue.

Update – June 5, 2006: Cotse.net reports that Blogger/Google has finally unblocked their pages for users of Cotse's proxy surfing services. No comment, explanation, or apology was made. Cotse expressed thanks to the Electronic Frontier Foundation for offering to help find a solution.


Entry filed under: Privacy, Writer's Life.


2 Comments

  • 1. David  |  May 19, 2006 at 10:09 pm

    Make sure to update the link in your blogroll.

    The new link is…

    http://jeffersoniantoo.wordpress.com/

    Thanks! 🙂

  • 2. David  |  January 24, 2008 at 4:17 pm

    Also, just as a side note, remember it is extremely important to set up a proper robots.txt for your proxy. If you don't, search engines will crawl the web through your proxy and Google will index other people's websites under your domain. That may seem great, but it's pretty unethical if you ask me, and it causes a lot more problems than the "fresh content" is worth.

    Here is the proper format for PHProxy 5

    User-agent: *
    Disallow: /index.php?

    Just put that in your robots.txt file and upload it to the root of your server.

    There are equivalent rules for CGIProxy and Glype.
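    As a rough sketch only (these assume the default entry scripts, nph-proxy.cgi for CGIProxy and browse.php for Glype, and the filenames can differ from install to install, so check your own setup), the rules would look something like this, depending on which script you run:

    # CGIProxy (assumes the default nph-proxy.cgi entry script)
    User-agent: *
    Disallow: /nph-proxy.cgi

    # Glype (assumes the default browse.php handler)
    User-agent: *
    Disallow: /browse.php?

    Same idea either way: keep crawlers away from any URL that routes through the proxy script.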

    If you're worried you can't do well in the search engines without stealing content, check out my site at http://www.proxybolt.com. I guarantee you I am doing very well in the search engines and have a ton of visitors without having to steal.

    To be clear, this is not about how to block a proxy, but rather how proxy owners can stop search engines from indexing other people's sites through their proxy.

