running at the same time: one receives queries and writes their data to some queue file;
the other one reads the data, processes the oldest query in the queue, which takes about a minute (new queries may arrive during that time), then deletes the entry it has processed and starts the cycle again until the queue is empty.
the question is - what should the queue file be? i need some kind of indexed database that can be operated on by 2 python scripts simultaneously? like a list, but living outside the python files. the database doesn't even have to be key:value, but preferably index:value
i thought about using simple txt files, but from what i've read about deleting lines and editing txt files it would be a total mess, since every approach caches the lines in a list using readlines(), and that would be a problem for keeping reads and edits up to date
If you need a tiny database why not use something like sqlite?
i tried using redis for this, writing queries with indices by keeping an iterator variable in the database that i read to assign the keys😅 obviously, it didn't turn out that well, so now i'm looking for a different db. from what i've read so far i'm not sure how sqlite would be different. can i access its values just by index?
sure, you can - there is a rowid. sqlite is a database that saves into a single file too, so you would essentially have a file that could also be version controlled (git or whatever)
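a minimal sketch of what that could look like with python's built-in sqlite3 module - the file name `queue.db`, the table name `queue`, and the `push`/`pop` helpers are all made up for illustration, not anything from the thread. the producer calls `push`, the consumer calls `pop`, and the oldest entry is found by ordering on the implicit rowid:

```python
import sqlite3

# Hypothetical queue file and table; both names are assumptions.
conn = sqlite3.connect("queue.db")
conn.execute("CREATE TABLE IF NOT EXISTS queue (data TEXT)")
conn.commit()

def push(conn, data):
    """Producer side: append a query to the end of the queue."""
    conn.execute("INSERT INTO queue (data) VALUES (?)", (data,))
    conn.commit()

def pop(conn):
    """Consumer side: take the oldest entry (smallest rowid) and delete it."""
    row = conn.execute(
        "SELECT rowid, data FROM queue ORDER BY rowid LIMIT 1"
    ).fetchone()
    if row is None:
        return None  # queue is empty
    rowid, data = row
    conn.execute("DELETE FROM queue WHERE rowid = ?", (rowid,))
    conn.commit()
    return data

push(conn, "query 1")
push(conn, "query 2")
first = pop(conn)  # oldest entry, i.e. "query 1"
```

since two separate processes open the same file, it may be worth enabling WAL mode (`conn.execute("PRAGMA journal_mode=WAL")`) so the reader isn't blocked while the writer commits - sqlite handles the file locking itself either way.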
awesome! thank you❤️