
User:ReyBrujo/Dumps

Dumps

April 29, 2007

Articles with more than 10 external links

Articles with more than 10 external links as of April 29, 2007. Only articles in the main space are considered.

External links  Article ID  Article
160 3 Main Page
60 1035 KS devanagari
30 1040 यूनिकोड
27 815 युनिकोड
18 890 जर्मनी
17 879 रूस
17 817 कश्‍मीर
15 894 जापान
13 918 नेपाल
12 908 तुर्किये
12 953 मलयेशिया
10 919 नेदरलैंड्स
10 931 पुर्तगाल
10 935 फिलिपीन्स
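-- The query below (as used on this page) counts external links per non-redirect
-- article in the main namespace, joining the MediaWiki externallinks and page tables.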
SELECT COUNT(el_from) AS total, el_from, page_title
FROM externallinks, page
WHERE externallinks.el_from = page_id AND page_is_redirect = 0 AND page_namespace = 0
GROUP BY el_from
ORDER BY total DESC;
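The query as written returns counts for every article; the 10-link threshold was presumably applied by hand when building the table above. A minimal variation, assuming the same MySQL/MediaWiki schema, applies the threshold directly with a HAVING clause:

SELECT COUNT(el_from) AS total, el_from, page_title
FROM externallinks, page
WHERE externallinks.el_from = page_id AND page_is_redirect = 0 AND page_namespace = 0
GROUP BY el_from
HAVING total > 10  -- keep only articles with more than 10 external links
ORDER BY total DESC;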
Sites linked more than 10 times

Sites linked more than 10 times as of April 29, 2007. Only articles in the main space are considered.

Link count Site
20 http://www.unicode.org
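-- In the query below, SUBSTRING_INDEX(el_to, '/', 3) keeps only the scheme and host
-- of each external link (e.g. http://www.unicode.org), so links are grouped per site.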
SELECT COUNT(el_to) AS total, SUBSTRING_INDEX(el_to, '/', 3) AS search
FROM externallinks, page
WHERE page_id = el_from AND page_namespace = 0
GROUP BY search
ORDER BY total DESC;

Additional information


Some more information about this dump (a sketch of matching count queries follows the list):

  • 493 articles that are in the main space and not redirects
  • 513 articles and redirects in the main space
  • 674 pages in all namespaces
  • 28 redirects in all namespaces
  • 3791 external links in every namespace
  • 718 external links in the main space
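These totals can presumably be reproduced with simple counting queries against the same tables used elsewhere on this page; a minimal sketch, assuming the standard MediaWiki/MySQL schema:

-- Articles in the main namespace that are not redirects
SELECT COUNT(*) FROM page WHERE page_namespace = 0 AND page_is_redirect = 0;
-- Articles and redirects in the main namespace
SELECT COUNT(*) FROM page WHERE page_namespace = 0;
-- Pages in all namespaces
SELECT COUNT(*) FROM page;
-- Redirects in all namespaces
SELECT COUNT(*) FROM page WHERE page_is_redirect = 1;
-- External links in every namespace
SELECT COUNT(*) FROM externallinks;
-- External links in the main namespace
SELECT COUNT(*) FROM externallinks, page WHERE el_from = page_id AND page_namespace = 0;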

Very probable spambot pages


If index.php is found in a page title, it is very likely that the page was created by a spambot. These pages should be deleted and, if possible, protected.

Article ID Article
2489 W/index.php
2523 Talk:W/w/w/index.php?title=W/w/w/index.php
2516 MediaWiki talk:Ipb expiry invalid/w/w/index.php

Possible spambot pages


Pages ending with / that were possibly created by spambots.

Article ID Article
789 Wikipedia:Broken/
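-- This query (used for both spambot sections) matches titles containing index.php,
-- /wiki/ or /w/, or ending with /, all typical of spambot-created pages.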
SELECT page_id, page_title, page_namespace
FROM page
WHERE page_title LIKE '%index.php%' OR page_title LIKE '%/wiki/%' OR page_title LIKE '%/w/%' OR page_title LIKE '%/';

Dump analysis archive


This is the archive of dump analysis.

  • commons:User:ReyBrujo/Dumps
  • ak:User:ReyBrujo/Dumps