Borked links

This might be a dumb question, but is there any query you can run (or an already existing special page I can access) that would show broken links? (Not red links, though it's fine if you have to weed through those.) I've been to two pages where references are fully out of date. I'm guessing it's not really possible, as often there are redirect pages to "this page cannot be found", which would mean, ironically, that the reference link is going *somewhere*, just not where we want it to go.

I'm hoping that made sense... it's way too early for me to be up, but if you can't sleep - you wiki!

Godot (I live in the Infinite monkey cage) 14:38, 21 December 2011

There's a bot that does that. You should ask Blue, she does the botting these days.

-- Nx / talk 14:40, 21 December 2011

Are you talking about internal links to redirect pages? Double redirects? I'm a little confused.

Blue (is useful) 04:15, 27 December 2011

She's talking about external links.

Nowwhat? 04:17, 27 December 2011

Oh, I understand. Yeah, a bot can do that. The best way to do it would be to have the bot add a template like (broken link) next to each broken external link, which would both link to, and add the page to, a category called "Pages with broken external links."

Blue (is useful) 04:22, 27 December 2011
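
A rough sketch of what that tagging pass could look like, assuming Python 2's urllib2; the URL regex, the helper names and the exact {{broken}} wikitext are illustrative, not the real bot:

    import re
    import urllib2

    def is_broken(url):
        """Return True if the server answers 404 or 400 for the URL."""
        try:
            urllib2.urlopen(url)
            return False
        except urllib2.HTTPError, err:
            return err.code in (404, 400)
        except urllib2.URLError:
            return False   # unreachable hosts are left for a human to judge

    def tag_broken_links(wikitext):
        """Append the tagging template after each dead external link."""
        def tag(match):
            url = match.group(0)
            if is_broken(url):
                # the template itself would add the category to the page
                return url + " {{broken}}"
            return url
        return re.sub(r'https?://[^\s\]<>"]+', tag, wikitext)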

That would be a great thing. Then we could remove the links or try to find other citations.

Godot (I live in the Infinite monkey cage) 05:00, 27 December 2011

How does it know that they're broken?

Peter Urist for Mod! 06:52, 27 December 2011
        import urllib2  # Python 2 standard library
        try:
            urllib2.urlopen(url)
        except urllib2.HTTPError, err:
            # 404 (Not Found) and 400 (Bad Request) mark the link as dead
            if err.code == 404 or err.code == 400:
                print "Broken link:", url

I hope you know Python.

Blue (is useful) 06:55, 27 December 2011

So you can do that...

Will that get all useless links, do you think?

Peter Urist for Mod! 07:00, 27 December 2011

Surely there is some way to obtain the errorlevel in Python without having to catch an exception?

ListenerX (TalkerX) 07:25, 27 December 2011
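
For reference, there is: drop down to Python 2's lower-level httplib, which hands back the status as a plain number instead of raising. A minimal sketch (HEAD request, naive URL handling, plain HTTP only):

    import httplib
    import urlparse

    def status_of(url):
        """Return the HTTP status code, with no exception for 4xx/5xx."""
        parts = urlparse.urlsplit(url)
        conn = httplib.HTTPConnection(parts.netloc)
        conn.request("HEAD", parts.path or "/")
        return conn.getresponse().status

    # status_of("http://example.com/missing") would return 404 directly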

Probably, but would it be faster or simpler?

Blue (is useful) 08:06, 27 December 2011

Simpler, certainly. As for faster: I understand that unnecessary use of exceptions in C++ is inefficient, though I am unsure whether that applies to Python.

ListenerX (TalkerX) 08:16, 27 December 2011

As Nx is our resident techie and this is his talk page, I'll defer to him on this. I'm not extensively knowledgeable about Python anyway.

Blue (is useful) 08:25, 27 December 2011

Wait, aren't you using Pywikipediabot's weblinkchecker.py? Where is that code snippet from?

-- Nx / talk 09:55, 27 December 2011

Oh, that exists. Well, I kind of reinvented the wheel a bit.

Blue (is useful) 16:26, 27 December 2011

What's wrong with using an exception?

Python uses exceptions for error handling, and I think it's a more elegant way than having to check the return value against some arbitrary value that's defined as an error and may be -1, 0, None, etc., depending on what the function is.

-- Nx / talk 09:27, 27 December 2011
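
To put the two styles side by side (both helpers are illustrative; note that the sentinel version has to wrap the exception away just to manufacture its magic value):

    import urllib2

    # Exception style (urllib2's own): the error object carries its code,
    # and the failure path is explicit at the call site.
    def status_eafp(url):
        try:
            urllib2.urlopen(url)
            return 200
        except urllib2.HTTPError, err:
            return err.code

    # Sentinel style: success and failure share one return channel, and
    # every caller must remember which magic value means failure.
    def status_sentinel(url):
        try:
            urllib2.urlopen(url)
        except urllib2.HTTPError:
            return -1   # or should it be 0? or None? it varies
        return 0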

I meant that Blue should try to find some function in Python that just returned 404 or whatever if the link was broken, without raising an exception.

ListenerX (TalkerX) 09:44, 27 December 2011

Basically, Python's URL module raises exceptions that carry HTTP error codes, and the ones we're looking for are 404 (Not Found) and 400 (Bad Request).

Blue (is useful) 06:57, 27 December 2011

But it picked up a redirect as well. Is that covered under 400?

Peter Urist for Mod! 07:03, 27 December 2011

Perhaps. I'll have to investigate further. I'll hold off on running the bot until we're absolutely sure it works, because it will go through every single mainspace article. And then we can use it for Fun, CP and Recipe. So we should be sure.

Blue (is useful) 07:11, 27 December 2011
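
For a sense of the moving parts, a full run might look roughly like this under the Pywikipediabot framework (modern pywikibot names shown, and tag_broken_links is the sketch from earlier; both are assumptions, not the actual bot):

    import pywikibot

    site = pywikibot.Site()
    # namespace 0 is mainspace; Fun, CP and Recipe would be extra passes
    for page in site.allpages(namespace=0):
        old_text = page.text
        new_text = tag_broken_links(old_text)
        if new_text != old_text:
            page.text = new_text
            page.save("Bot: tagging broken external links")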

Some kind of hidden suppression template may be helpful, in case of links that aren't broken but get picked up anyway.

Peter Urist for Mod! 07:53, 27 December 2011

As in suppress {{broken}} itself? Not sure what you mean.

Blue (is useful) 08:08, 27 December 2011

I mean you should be able to replace a false positive with '{{unbroken}}' or something and that would prevent it being caught in the next round of the bot.

For instance if you wanted to say "The claim was first made here but has since been removed" or similar without having to remove the template again when you next ran the bot.

Peter Urist for Mod! 08:19, 27 December 2011

I see, yes, that would be good to have. It's analogous to {{nostub}}.

Blue (is useful) 08:23, 27 December 2011
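
The suppression check itself could be a single scan before tagging; {{unbroken}} is the hypothetical name from above:

    import re

    def is_suppressed(wikitext, url):
        """True if the URL is already followed by an {{unbroken}} marker."""
        pattern = re.escape(url) + r'\s*\{\{unbroken\}\}'
        return re.search(pattern, wikitext) is not None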

That link returns a 301, so the bot shouldn't be picking it up.

-- Nx / talk 09:34, 27 December 2011

Actually, I think the bot is supposed to change URLs when it detects a redirect, but I'm not sure.

-- Nx / talk 09:41, 27 December 2011
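
Since urllib2 follows redirects transparently, one simple way to notice them is to compare the URL actually served against the one requested; a minimal sketch:

    import urllib2

    def redirect_target(url):
        """Return the final URL if the link redirects, else None."""
        response = urllib2.urlopen(url)
        final = response.geturl()   # URL after any 301/302 hops
        return final if final != url else None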