A deduplication program written in VB that removes duplicate data from a TXT file; source code included.
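The original VB source is not reproduced here, so as an illustration only, here is a minimal Python sketch of the same technique: reading a TXT file line by line and keeping only the first occurrence of each line. The function name and the idea of preserving first-seen order are assumptions, not details taken from the VB program.

```python
def dedupe_lines(lines):
    """Return the input lines with duplicates removed,
    preserving the order in which lines first appear."""
    seen = set()   # lines encountered so far
    result = []
    for line in lines:
        if line not in seen:
            seen.add(line)
            result.append(line)
    return result

def dedupe_file(src_path, dst_path):
    """Read src_path, write its deduplicated lines to dst_path.
    (Hypothetical helper; paths and encoding are assumptions.)"""
    with open(src_path, encoding="utf-8") as f:
        unique = dedupe_lines(f.readlines())
    with open(dst_path, "w", encoding="utf-8") as f:
        f.writelines(unique)
```

A set gives O(1) membership tests, so the whole pass is linear in the number of lines; the separate `result` list is what keeps the original ordering, which a plain `set()` alone would lose.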
SHOW FULL COLUMNS FROM `jrk_downrecords` [ RunTime:0.001859s ]
SELECT `a`.`aid`,`a`.`title`,`a`.`create_time`,`m`.`username` FROM `jrk_downrecords` `a` INNER JOIN `jrk_member` `m` ON `a`.`uid`=`m`.`id` WHERE `a`.`status` = 1 GROUP BY `a`.`aid` ORDER BY `a`.`create_time` DESC LIMIT 10 [ RunTime:0.148968s ]
SHOW FULL COLUMNS FROM `jrk_tagrecords` [ RunTime:0.001871s ]
SELECT * FROM `jrk_tagrecords` WHERE `status` = 1 ORDER BY `num` DESC LIMIT 20 [ RunTime:0.003235s ]
SHOW FULL COLUMNS FROM `jrk_member` [ RunTime:0.001881s ]
SELECT `id`,`username`,`userhead`,`usertime` FROM `jrk_member` WHERE `status` = 1 ORDER BY `usertime` DESC LIMIT 10 [ RunTime:0.006236s ]
SHOW FULL COLUMNS FROM `jrk_searchrecords` [ RunTime:0.001735s ]
SELECT * FROM `jrk_searchrecords` WHERE `status` = 1 ORDER BY `num` DESC LIMIT 5 [ RunTime:0.004725s ]
SELECT aid,title,count(aid) as c FROM `jrk_downrecords` GROUP BY `aid` ORDER BY `c` DESC LIMIT 10 [ RunTime:0.025712s ]
SHOW FULL COLUMNS FROM `jrk_articles` [ RunTime:0.002183s ]
UPDATE `jrk_articles` SET `hits` = 1 WHERE `id` = 213863 [ RunTime:0.001885s ]