I am developing a plugin which relies on external data that has to be fetched from time to time. To speed things up, I'd like to cache the received data and refresh it at intervals of 10 or 15 minutes.
This shouldn't be a big problem, because the external server only refreshes its data at roughly the same interval anyway.
On the one hand it would take quite some load off the external server, and it would also improve the speed of the local site, because the data is already there.
At the beginning I was pretty sure that WP Transients would be the best way to go, but I'm not sure about that anymore. I've seen that lots of plugins use their own tables for storing whatever data, and I'm wondering whether I should push my external data through the Transients API into the
_options table, or whether a dedicated table structure would be better suited for this case.
Is there some objective, measurable argument for using a custom table over transients, or vice versa?
Update 1: The data is XML-like structured data with a regional context. So users could theoretically need to fetch dozens of datasets, but that probably won't happen for many of them. There are also “combined” datasets available right on the server side, like “city datasets” and “area datasets”.
(Ask me anything if it's unclear.)
If the data you're fetching amounts to only one or at most a few instances, then the Transients API should do the job; keep in mind it also handles auto-expiration for you.
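For the few-instances case, the usual pattern is "fetch or cache": try the transient first and only hit the remote server on a miss. A minimal sketch, assuming a hypothetical feed URL and function name (`my_plugin_get_regional_data` and the `example.com` endpoint are placeholders, not part of the question):

```php
<?php
// Hypothetical sketch: fetch-or-cache via the Transients API.
// The expired transient returns false, so the remote fetch only
// happens once per expiry window.
function my_plugin_get_regional_data( $region ) {
    $key  = 'my_plugin_data_' . $region; // transient names must stay short
    $data = get_transient( $key );

    if ( false === $data ) {
        $response = wp_remote_get( 'https://example.com/feed/' . $region );
        if ( is_wp_error( $response ) ) {
            return null; // fetch failed and nothing is cached
        }
        $data = wp_remote_retrieve_body( $response );
        set_transient( $key, $data, 15 * MINUTE_IN_SECONDS ); // auto-expires
    }

    return $data;
}
```

Note that `get_transient()` returns `false` both for a missing and for an expired entry, so the branch above covers both cases.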
If the data could potentially be an unlimited (or at least medium-large) number of instances, especially if there may be even slightly complex ways of finding the correct entry to retrieve (besides just an ID), then a custom table would be advisable.
Having said that, there’s a third potential option: the file system. You could store the data serialized/jsonified in a file in a subfolder of wp-content. This is of course provided the data in question is effectively public and doesn’t need to be protected in the slightest.
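A rough sketch of that file-based approach, under the assumption that the data is public; the folder name, region parameter, and function name are all placeholders:

```php
<?php
// Hypothetical sketch: file-system cache in a wp-content subfolder.
// Only appropriate when the cached data needs no protection at all.
function my_plugin_get_cached_file( $region, $max_age = 900 ) {
    $dir  = WP_CONTENT_DIR . '/my-plugin-cache';
    $file = $dir . '/' . sanitize_file_name( $region ) . '.json';

    // Serve from disk while the file is younger than $max_age seconds.
    if ( file_exists( $file ) && ( time() - filemtime( $file ) ) < $max_age ) {
        return json_decode( file_get_contents( $file ), true );
    }

    $response = wp_remote_get( 'https://example.com/feed/' . $region );
    if ( is_wp_error( $response ) ) {
        return null;
    }
    $data = json_decode( wp_remote_retrieve_body( $response ), true );

    wp_mkdir_p( $dir ); // create the cache folder on first use
    file_put_contents( $file, wp_json_encode( $data ) );
    return $data;
}
```

The file's modification time doubles as the expiry marker, so no extra bookkeeping is needed.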
“refresh it in intervals of 10 or 15 minutes”
Transients are expensive to create, so creating and flushing them every 10 to 15 minutes is a really bad idea. Transients are only useful for holding small pieces of data over a longish period of time. My rule of thumb is that a transient should not be flushed and recreated more than once a day. I tend to set my transients to 30 days and flush them only when something relevant happens, like publishing a new post or updating a term.
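That event-driven invalidation can be wired up with ordinary hooks. A sketch, where the transient key is a placeholder:

```php
<?php
// Hypothetical sketch: a long-lived transient flushed by content events
// rather than a short timer.
function my_plugin_flush_cache() {
    delete_transient( 'my_plugin_sorted_ids' );
}
add_action( 'save_post',   'my_plugin_flush_cache' ); // post published/updated
add_action( 'edited_term', 'my_plugin_flush_cache' ); // term updated

// Elsewhere, the transient is set with a long expiry as a safety net:
// set_transient( 'my_plugin_sorted_ids', $ids, 30 * DAY_IN_SECONDS );
```

The 30-day expiration only acts as a fallback; in practice the hooks decide when the cache is rebuilt.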
Transients also aren't meant to hold tons of data, like 100 complete post objects. You can check my answer here to see how I used my transient: I stored just an array of sorted post IDs instead of my complete
wp_posts table. Although the transient costs me one extra db call per page, I save a lot of memory that the sorting process would otherwise consume.
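The sorted-IDs idea looks roughly like this; the key name and the `orderby` choice are illustrative assumptions, not taken from the linked answer:

```php
<?php
// Hypothetical sketch: cache only an array of sorted post IDs,
// not full post objects, to keep the transient small.
function my_plugin_get_sorted_ids() {
    $ids = get_transient( 'my_plugin_sorted_ids' );

    if ( false === $ids ) {
        $ids = get_posts( array(
            'numberposts' => -1,
            'fields'      => 'ids',   // return IDs only, not whole objects
            'orderby'     => 'title',
            'order'       => 'ASC',
        ) );
        set_transient( 'my_plugin_sorted_ids', $ids, 30 * DAY_IN_SECONDS );
    }

    return $ids; // one extra db call per page, but no repeated sorting
}
```

Individual posts can then be loaded on demand from the ID list, which keeps both the option row and the page's memory footprint small.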
In your situation I would seriously consider a new table to handle this external data. That would, however, require you to write your own CRUD logic to store and manage the data. You could also build your own cache layer on top of the external data, but that is entirely up to you.
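To give an idea of the CRUD work involved, here is a sketch of a dedicated table keyed by region, created on plugin activation via `dbDelta()`. The table and column names are placeholders:

```php
<?php
// Hypothetical sketch: a dedicated table for the external datasets.
function my_plugin_create_table() {
    global $wpdb;
    $table   = $wpdb->prefix . 'my_plugin_datasets';
    $charset = $wpdb->get_charset_collate();

    // dbDelta is picky about formatting: two spaces after PRIMARY KEY.
    $sql = "CREATE TABLE {$table} (
        id bigint(20) unsigned NOT NULL AUTO_INCREMENT,
        region varchar(100) NOT NULL,
        payload longtext NOT NULL,
        fetched_at datetime NOT NULL,
        PRIMARY KEY  (id),
        UNIQUE KEY region (region)
    ) {$charset};";

    require_once ABSPATH . 'wp-admin/includes/upgrade.php';
    dbDelta( $sql );
}
register_activation_hook( __FILE__, 'my_plugin_create_table' );

// Upsert a dataset; REPLACE works because `region` is a unique key.
function my_plugin_save_dataset( $region, $payload ) {
    global $wpdb;
    $wpdb->replace( $wpdb->prefix . 'my_plugin_datasets', array(
        'region'     => $region,
        'payload'    => $payload,
        'fetched_at' => current_time( 'mysql' ),
    ) );
}

// Look up one dataset by region.
function my_plugin_get_dataset( $region ) {
    global $wpdb;
    return $wpdb->get_row( $wpdb->prepare(
        "SELECT * FROM {$wpdb->prefix}my_plugin_datasets WHERE region = %s",
        $region
    ) );
}
```

The `fetched_at` column lets you decide per row whether a dataset is stale, which is exactly the kind of lookup flexibility a transient cannot give you.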
Before you jump in and create your own table, you should read @tosho’s answer to the following question