Fugliness

Still seems a bit buggy/clunky to me, though I very much like the overall idea. Just sayin'.

Blue (pester) 20:42, 29 November 2010

The idea is great, but it makes things take up far too much space. It's not just the sig/buttons, it's all the extra whitespace. With just 5 comments, this topic was already taking up half a screen. That will make long topics on WIGO:CP or SB almost unreadable.

SuspectedReplicant retire me 20:56, 29 November 2010

I agree with you there - I'm not sure places like TWIGO:CP and the Saloon Bar would benefit from this system. Perhaps for that type of informal discussion a modified, streamlined version of LiquidThreads would be more appropriate, but they're obviously not developing something like that.

The forumspace, however, would benefit from looking more like an organized forum.

Blue (pester) 21:03, 29 November 2010

TWIGO:CP and the Saloon Bar would benefit more because they have a lot of unrelated threads, a higher chance of edit conflicts (ECs), and a huge db size overhead. The Forum is actually a partial solution to the same problem LQT is trying to solve, because it separates threads into individual pages; LQT goes further and makes each comment a separate page.

-- Nx / talk 21:13, 29 November 2010

I am getting substantial load times on this page. Is it just me?

Tmtoulouse (talk) 21:15, 29 November 2010

Nope. It's incredibly slow to load, and seems to be screwing up Firefox too - it's started taking up a lot more processor time than usual.

SuspectedReplicant retire me 21:20, 29 November 2010

Problem is, if you had the same amount of comment traffic as we have now, this version of LQT would increase the screen length by, what, four times? That's a hell of a lot of scrolling, and to my mind that makes things less readable, even with the benefit of doing away with the occasional EC (and they only ever happen to me occasionally, so to me they're not a big deal).

Kels (talk) 21:22, 29 November 2010

This layout on those pages would be a total disaster. We'd make the World's Highest Website look like a speed bump.

SuspectedReplicant retire me 21:22, 29 November 2010

I've decreased the padding and hidden the toolbar (the Reply, Parent and More buttons in the lower right of each comment) until you hover over a comment. Better?

-- Nx / talk 21:43, 29 November 2010

That is actually quite a bit better.

Blue (pester) 21:54, 29 November 2010

I agree it's looking pretty good; it's just that the load times on this page seem excessive.

Tmtoulouse (talk) 21:56, 29 November 2010

Yeah, LQT could do with some optimizations. The problem is that since each comment is a separate page, there's a performance overhead - similar to how template signatures slow down the saloon bar drastically.

-- Nx / talk 22:00, 29 November 2010

Isn't this essentially the same as the Best of Conservapedia, in that you have hundreds of little pages being displayed on one page using Ajax? Why does that load so much faster, then?

- π 00:08, 30 November 2010

Not Ajax; it's assembled server-side, but it still has to collect all those pages (in the case of bestof, they're just database entries, not MW pages) and parse them separately. I think the separate parsing is the problem.

-- Nx / talk 07:13, 30 November 2010
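
To make the cost concrete, here is a rough sketch of the per-comment work being described, using ordinary MediaWiki calls; it is not LQT's actual code, and $commentTitles is invented for illustration. A normal talk page does one content fetch and one parse for the whole page, while a threaded page does the pair once per comment:

// Hypothetical sketch: render a thread by fetching and parsing each comment's own page.
$options = ParserOptions::newFromUser( $wgUser );
$html = '';
foreach ( $commentTitles as $title ) {        // assumed: one Thread:... page per comment
    $article = new Article( $title );
    $text = $article->getContent();           // cheap: one revision text fetch
    $out = $wgParser->parse( $text, $title, $options );   // expensive: a full parse per comment
    $html .= $out->getText();
}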

Definitely looks better now, although I still think the big talk pages will be too big - it's just the way this works. Plus it's so slow! SB takes 4s to load. This takes 20. I shudder to think how long it would take on the long talk pages.

SuspectedReplicant retire me 12:55, 30 November 2010

I'll do some profiling and try to figure out a way to speed things up.

-- Nx / talk 19:58, 30 November 2010

Without wanting to be called Captain Obvious, and without knowing the table structure, it looks like some serious indexing is needed - presumably there's some kind of ThreadId along with a PageId? A multipart index on those two - PageId first - would be my guess. But yes, profiling is the way to go. Actually, it already seems quicker than it was earlier.

SuspectedReplicant retire me 20:06, 30 November 2010
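
For what it's worth, the composite index being suggested would look roughly like the following; the table and column names are invented for illustration, since LQT's actual schema isn't quoted in this thread, so the real names would need checking first.

// Hypothetical example: a multipart (page first, then thread) index, created through MW's DB layer.
$dbw = wfGetDB( DB_MASTER );
$dbw->query( 'CREATE INDEX lqt_page_thread ON lqt_thread (page_id, thread_id)' );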

I think the bottleneck is in the parser, though - at least, that's my experience with wigo, which has a similar performance problem (lots of small, separate pieces of wikitext).

-- Nx / talk 20:07, 30 November 2010

I know you know more about MW than I do, so I shall defer to your better knowledge... but as a SQL DBA I would still suggest a quick look at the indexing.

SuspectedReplicant retire me 20:19, 30 November 2010

OK, here's an excerpt from an example output. Out of 13 seconds (it's slower when profiling is on), Article::getContent takes up only 0.314916 seconds. There are 254 Article::getContent calls, but only 157 Article::loadContent calls and 174 Revision::loadText calls (the latter is what loads the revision text from the db). Revision::loadText takes up only 0.088527 seconds, so the database is definitely not the bottleneck. The parser is. In fact, the number of wfMsgReal calls greatly concerns me...

(format is seconds count name)

13.215451      1 - -total
3.611685  26819 - wfMsgReal
3.447692   1287 - Parser::parse
3.399038   1287 - Parser::parse-OutputPage::parse
2.164450    138 - WikiEditorHooks::addModules
2.131158   1292 - Parser::internalParse
1.788212   2005 - Parser::replaceVariables
1.526407    694 - Parser::transformMsg
1.496620    694 - Parser::preprocess
1.345063   3222 - PPFrame_DOM::expand
0.996161   1142 - Parser::braceSubstitution
0.904751   3313 - Preprocessor_DOM::preprocessToObj
0.775065      1 - Setup.php
0.762118      1 - Setup.php-SetupSession
0.580727   2004 - Parser::clearState
0.411560   3308 - Preprocessor_DOM::preprocessToXml
0.314916    254 - Article::getContent
0.302848    157 - Article::loadContent
0.278424   3313 - Preprocessor_DOM::preprocessToObj-loadXML
0.274240    795 - Database::query

-- Nx / talk 21:46, 30 November 2010
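
Put on a per-call basis (and bearing in mind that these profiler totals may overlap): wfMsgReal averages roughly 3.61 s / 26,819 calls ≈ 0.13 ms each, Parser::parse about 3.45 s / 1,287 ≈ 2.7 ms each, and Revision::loadText about 0.089 s / 174 ≈ 0.5 ms each, so the expense is in parsing and in the sheer number of message calls rather than in the database reads.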

You're right that it's not the major cause, but if I were a DBA on a system taking a third of a second to return this little content... I'd be concerned. Is that the only call getting content? Remember - I know nothing about MW, but loadContent seems to take a while too.

I know MySQL is only an open source database, not a proper product, but I wouldn't expect a database to be taking this long over a simple query based on a very few params.

SuspectedReplicant retire me 21:53, 30 November 2010

The db stuff is much less than that actually - I'll have to insert more profiling calls to figure out why loadContent takes so long (I have an idea), but the actual db access is in Revision::loadText, which is only 0.088 seconds. The profiling info also contains all the database queries, but I didn't write them down - they were really quick, so it's definitely not the database. And even 0.3 seconds is relatively small compared to the total.

-- Nx / talk 22:09, 30 November 2010
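
For reference, MediaWiki's profiler is driven by paired wfProfileIn()/wfProfileOut() calls, so narrowing down where the time goes means wrapping the suspect spans with extra pairs, roughly like this (the function and section labels here are placeholders, not real LQT code):

// Sketch of finer-grained profiling; labels just need to be matched and unique.
function loadContentTimed() {
    wfProfileIn( __METHOD__ );

    wfProfileIn( __METHOD__ . '-fetch' );
    // ... fetch the revision text ...
    wfProfileOut( __METHOD__ . '-fetch' );

    wfProfileIn( __METHOD__ . '-parse' );
    // ... parse or transform it ...
    wfProfileOut( __METHOD__ . '-parse' );

    wfProfileOut( __METHOD__ );
}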

True, but I hope you're right that the DB stuff is less than 0.3s - it should be (essentially) zero for a system of this size.

I'm not trying to annoy here, but please just take a look at the indexing anyway? Yes, it's mainly down to the parser, but every little helps.

SuspectedReplicant retire me 22:20, 30 November 2010

I found the cause and fixed it; it should now be much better. This page is now slightly slower than the saloon bar. I'll see what other optimizations I can do.

-- Nx / talk 22:59, 1 December 2010

That is a lot smoother.

- π 23:06, 1 December 2010

That is much better. Thanks, Nx.

SuspectedReplicant retire me 13:45, 2 December 2010