Not working. By the way, does manga auto-update work if you use single-manga crawl?

Try setting it to local and doing 2-3 single crawls, then use a private proxy to start the auto-update.
One reason your auto crawler doesn't work is that your IP got banned by Fanfox. So it's better to sign up for a good private proxy, or try the free trial on ScraperAPI.
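ScraperAPI is just an HTTP endpoint you route page requests through, so your server's IP never hits Fanfox directly. A minimal Python sketch (the API key is a placeholder; note the crawler plugin itself may only accept a plain proxy host/port, in which case this applies to your own scripts rather than the plugin):

```python
from urllib.parse import urlencode

def scraperapi_url(api_key: str, target_url: str) -> str:
    """Build a ScraperAPI request URL that proxies the target page."""
    # ScraperAPI fetches the target through its own proxy pool,
    # so fanfox.net sees their IPs, not yours.
    return "http://api.scraperapi.com/?" + urlencode(
        {"api_key": api_key, "url": target_url}
    )

# Placeholder key -- the free trial gives you a real one.
url = scraperapi_url("YOUR_API_KEY", "http://fanfox.net/manga/")
# html = urllib.request.urlopen(url).read()  # uncomment to actually fetch
print(url)
```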
Thanks for sharing your steps. Wish I could have followed them from the beginning.
Auto-update is working on my site, and I use BunnyCDN as storage. For a single crawl I use local to get some chapters, then wait for them to show in the queue list on the crawler progress tab. When you grab a single chapter, you have to stop the auto crawler. These are the steps I use:

1. I use single crawl to get the 70 manga I want, crawling at least 10 chapters for each manga (private proxy used; I use a Webshare proxy).
2. Try disabling and re-enabling the plugin (I got my auto crawler working after doing this).

The important step is having a lot of manga titles in the queue list. That means the auto crawl works, even though I can't delete it.
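Since the steps above route the single crawls through a private proxy, here is a minimal Python sketch of fetching a page via an HTTP proxy. The Webshare-style host, port, and credentials are placeholders; substitute the ones from your proxy dashboard:

```python
import urllib.request

# Placeholder proxy URL in the usual user:pass@host:port form.
PROXY = "http://username:password@p.webshare.io:80"

# An opener whose requests all go out via the proxy, so the
# target site sees the proxy's IP instead of your server's.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
)

# html = opener.open("http://fanfox.net/manga/").read()  # uncomment to fetch
```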
I cannot delete the queue list. I tried reinstalling the plugin, but it still exists. I don't know where the data is stored in the database.

So the crawler doesn't work for me, and the debug log shows this: Extract folder does not exist.

I don't know how to fix this.
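When the plugin reports a missing extract folder, recreating it manually sometimes clears the error. A hedged sketch, assuming the folder lives under the WordPress uploads directory; the exact path below is a guess, so check the debug log for the real one:

```python
from pathlib import Path

# Hypothetical location -- the real path should appear in the debug log.
extract_dir = Path("wp-content/uploads/manga-crawler/extract")

# parents=True creates missing intermediate folders;
# exist_ok=True makes the call safe to run repeatedly.
extract_dir.mkdir(parents=True, exist_ok=True)

print(extract_dir.is_dir())  # → True
```

Make sure the web-server user owns the folder afterwards, or the plugin still won't be able to write into it.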
Glad to know the details. It explains a lot of my confusion.
My trick is to use manual crawl ("single crawl manga") to build the list of manga that I want auto-crawl to grab. Even now, out of the 80 titles I listed by single crawl, only 35 manga are listed in auto crawl. I use auto crawl when there are a lot of updates available in the queue list; usually 1-3 manga get added to the queue list every day from the 80 manga I crawled manually. So from the 16 manga initially added to auto-update, in a couple of days it got to 35. I don't know about the licensed version, since the nulled one usually doesn't work 100%.
What version of the Fanfox crawler do you use? I got that result when using the old version.

I used the same version from that topic.
One more question: what if we just want to crawl manga from the daily ranking, for example http://fanfox.net/ranking/ ? We don't want to crawl all the data. Is that impossible?

Don't know the answer yet; maybe it's possible. I use manual crawl to get the manga list I want, then manually add it to the queue list. You can check @shemmy's post in this thread, but I add my list in upload.json.
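Adding entries to upload.json by hand can be scripted. A hedged sketch, assuming the file holds a flat JSON list of manga URLs; the plugin's real schema may differ, so check @shemmy's post for the actual format:

```python
import json
from pathlib import Path

queue_file = Path("upload.json")  # hypothetical name and location

# Load the existing queue, or start a fresh list.
queue = json.loads(queue_file.read_text()) if queue_file.exists() else []

# Hypothetical entry shape -- adjust to the plugin's real schema.
entry = "http://fanfox.net/manga/some_title/"
if entry not in queue:
    queue.append(entry)

queue_file.write_text(json.dumps(queue, indent=2))
print(len(queue))
```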
Can we change local storage to something else, like Google Drive or Google Photos? I tried crawling 1000 manga and they occupy 200 GB of space @@

Sure, you can use Bunny CDN.
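Bunny's storage service exposes a simple HTTP API: you PUT each file to your storage zone with an AccessKey header. A minimal Python sketch; the zone name and key are placeholders, and the plugin may handle this for you once BunnyCDN is selected as storage:

```python
import urllib.request

STORAGE_ZONE = "my-manga-zone"       # placeholder zone name
ACCESS_KEY = "YOUR_STORAGE_API_KEY"  # placeholder storage password

def bunny_put_request(path: str, data: bytes) -> urllib.request.Request:
    """Build a PUT request for Bunny Storage's HTTP API."""
    return urllib.request.Request(
        url=f"https://storage.bunnycdn.com/{STORAGE_ZONE}/{path}",
        data=data,
        method="PUT",
        headers={"AccessKey": ACCESS_KEY},
    )

req = bunny_put_request("manga/some_title/ch1/01.jpg", b"<image bytes>")
# urllib.request.urlopen(req)  # uncomment to actually upload
print(req.get_method(), req.full_url)
```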
Did you crawl on your localhost PC?
No bro, I save the manga images on my cPanel hosting storage, but it only has 480 GB.
Unless you have plenty of storage, local storage won't cut it if you're trying to add a lot of manga.