dramaticcat@sh.itjust.works to Lemmy Shitpost@lemmy.world · 1 year ago
Chad scraper (image post, 98 comments)

bill_1992@lemmy.world · 1 year ago
Everyone loves the idea of scraping; no one likes maintaining scrapers that break once a week because the CSS or HTML changed.

[deleted by creator]

Anonymousllama@lemmy.world · 1 year ago
This one. One of the best motivators. There's a real sense of satisfaction when you get it working and feel unstoppable (until the next subtle change happens, anyway).

I feel this

camr_on@lemmy.world · 1 year ago
I loved scraping until my IP was blocked for botting lol. I know there are ways around it, it's just work though

Pennomi@lemmy.world · 1 year ago
I successfully scraped millions of Amazon product listings simply by routing through Tor and cycling the exit node every 10 seconds.

camr_on@lemmy.world · 1 year ago
That's a good idea right there, I like that

This guy scrapes

ferret@sh.itjust.works · 1 year ago
lmao, yeah, get all the exit nodes banned from Amazon.

Pennomi@lemmy.world · 1 year ago
That's the neat thing, it wouldn't, because traffic only spikes for 10s on any particular node. It blends perfectly into the background noise.

nilloc@discuss.tchncs.de · 1 year ago
Cue Office Space-style error and scrape for 10 hours on each node.

Touching_Grass@lemmy.world · 1 year ago (edited)
You guys use IPs?

synae[he/him]@lemmy.sdf.org · 1 year ago
Token ring for me baybeee

camr_on@lemmy.world · 1 year ago
I'm coding baby's first bot over here lol, I could probably do better

dangblingus@lemmy.world · 1 year ago
Or in the case of Wikipedia, every table on successive pages of sequential data is formatted differently.

Matriks404@lemmy.world · 1 year ago
Just use AI to make changes ¯_(ツ)_/¯

Here, take these: \ \

¯\_(ツ)_/¯ Thanks
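For the curious: the exit-node cycling Pennomi describes can be sketched with just the standard library, assuming a local Tor daemon with its control port on 9051 (the port, password, and 10-second interval here are illustrative, not anything from the thread). The rotation itself is a `SIGNAL NEWNYM` on Tor's control port; the actual page fetches would then go out through Tor's SOCKS proxy, typically 127.0.0.1:9050.

```python
import socket


def rotate_exit_node(control_port=9051, password=""):
    """Ask a local Tor daemon for a fresh circuit (and thus, usually,
    a new exit node) by speaking the Tor control protocol directly."""
    with socket.create_connection(("127.0.0.1", control_port)) as s:
        s.sendall(f'AUTHENTICATE "{password}"\r\n'.encode())
        if not s.recv(1024).startswith(b"250"):
            raise RuntimeError("Tor control authentication failed")
        s.sendall(b"SIGNAL NEWNYM\r\n")
        if not s.recv(1024).startswith(b"250"):
            raise RuntimeError("Tor refused NEWNYM")


def due_for_rotation(last_rotate, now, interval=10.0):
    """Pure helper: rotate once `interval` seconds have elapsed, so the
    scrape loop only pays the circuit-rebuild cost every N seconds."""
    return now - last_rotate >= interval
```

A scrape loop would check `due_for_rotation(last, time.monotonic())` before each request and call `rotate_exit_node()` when it fires — whether that actually blends into the background noise is, as ferret points out, a separate question.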