Each RSS feed has a maximum of 20 links. I'm fetching them and storing them in a MySQL database with the URL as the primary key, so each URL is logged only once. Since I query all the RSS sources every 30 seconds, I end up trying to insert around 100 links each time, while a genuinely new URL only shows up once every 3-4 hours.
What would be a better solution?
Inserting the duplicate URLs and handling the resulting error,
or
checking the database to see whether the URL already exists before inserting it,
or
doing an INSERT IGNORE?
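For reference, a minimal sketch of the INSERT IGNORE option in MySQL, assuming a hypothetical rss_links table with url as the primary key (table and column names are illustrative):

    -- Silently skips the row if the url primary key already exists
    INSERT IGNORE INTO rss_links (url, title, fetched_at)
    VALUES ('https://example.com/article', 'Example article', NOW());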
You can (depending on the DB) make the URL a primary key and upsert on it.
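A minimal upsert sketch along those lines, again using the hypothetical rss_links table; in MySQL, ON DUPLICATE KEY UPDATE refreshes the existing row instead of raising a duplicate-key error:

    -- Inserts a new row, or updates the existing one when the url is already present
    INSERT INTO rss_links (url, title, fetched_at)
    VALUES ('https://example.com/article', 'Example article', NOW())
    ON DUPLICATE KEY UPDATE
        title = VALUES(title),
        fetched_at = VALUES(fetched_at);

Note that INSERT IGNORE also downgrades other errors (e.g. data truncation) to warnings, so the upsert is usually the safer default.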