Downloading (chunks of) the database
I am working on an emulator front end for arcade cabinets and use the games DB for scraping images and metadata (obviously). Thanks for this service, it's quite awesome. However, I expect each user will go through the same steps: basically collecting data for the same games and querying the thegamesdb.net API for the same things. So I am wondering whether it would be clever to ship the front end with the most common data right away.
Now I am wondering what you guys (especially the makers of the games DB) think about this. On the plus side, this would take a lot of stress off your servers. But a few things I'm unsure about:
- The data is yours; is it "cool" to just download it and ship it with my (open-source) front end?
- It might be a lot of data, maybe even all data for common retro gaming systems like the NES, SNES, Sega Master System / Genesis, N64, CPS systems, Neo Geo, etc. Is there even a way to download it?
- Images and other binary data should be excluded, since otherwise the file size of the front end would explode.
What do you think?
You can definitely write something to scrape whatever info you need for every system you care about and keep it offline. This adds the complication of how you will handle updates (what if a cover was wrong? what if a release date is added?). You're at the beginning of a long road, buddy, but you can do it!
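A simple way to start down that road is to cache each API response locally, so repeat lookups (the "every user scrapes the same games" problem) never hit the server twice. This is just a sketch: the endpoint path and response shape are placeholders and not guaranteed to match the real thegamesdb.net API, and the `fetch` callback stands in for whatever HTTP client you use.

```python
import hashlib
import json
import urllib.parse
from pathlib import Path

# Local cache directory; a front end could ship a pre-filled copy of this.
CACHE_DIR = Path("gamesdb_cache")

def cache_key(endpoint: str, params: dict) -> Path:
    """Derive a stable cache filename from the endpoint and sorted query params."""
    query = urllib.parse.urlencode(sorted(params.items()))
    digest = hashlib.sha1(f"{endpoint}?{query}".encode()).hexdigest()
    return CACHE_DIR / f"{digest}.json"

def cached_get(endpoint: str, params: dict, fetch) -> dict:
    """Return the cached JSON if present; otherwise call fetch() once and store it.

    `fetch(endpoint, params)` is a placeholder for an HTTP GET against the
    games DB API and must return a JSON-serializable dict.
    """
    path = cache_key(endpoint, params)
    if path.exists():
        return json.loads(path.read_text())
    data = fetch(endpoint, params)
    CACHE_DIR.mkdir(exist_ok=True)
    path.write_text(json.dumps(data))
    return data
```

Keying the cache on the full query means a later "refresh" pass for updates can be as simple as deleting files older than some age and re-fetching on demand.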
Ok, thanks. So I would need to scrape it all one by one; there is no downloadable DB, right?
@Luke Nope. You'll need to decide what you actually care about anyway. If it were one single file (all the images, all the metadata), it'd be many, many gigs. For your project, for example, you'll probably just want the thumbnails of the front covers plus the metadata.