Now if item_id or unique_name changes, that counts as the record
being deleted and a new one inserted. If neither of those changed
but any other field did, the existing record is updated with the
new values.
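Roughly, the rule looks like the sketch below; the extra Item fields
are placeholders for whatever the real record actually carries:

```cpp
#include <string>

// Hypothetical shape of an upstream item record; only item_id and
// unique_name come from the notes, the rest is illustrative.
struct Item {
    long long item_id;
    std::string unique_name;
    std::string display_name;   // stands in for "any other field"
    int price;
};

enum class SyncAction { None, Update, DeleteAndInsert };

// Decide what to do with a stored record given the freshly received one:
// a changed item_id or unique_name means delete + insert, any other
// changed field means update in place.
SyncAction classify(const Item& stored, const Item& incoming) {
    if (stored.item_id != incoming.item_id ||
        stored.unique_name != incoming.unique_name)
        return SyncAction::DeleteAndInsert;
    if (stored.display_name != incoming.display_name ||
        stored.price != incoming.price)
        return SyncAction::Update;
    return SyncAction::None;
}
```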
The program receives the full items list periodically. I don't think
they will ever remove records from there, but if they do, the
program should not corrupt the db. So far the simple approach of
drop table items / create table items ensured the items list is
always fresh, but it prevented foreign key constraints on shop
items. Also, if an item was removed from items, any entry in
shop would become orphaned or, worse, point to the wrong item if
they recycled the id.
The solution is to not use item_id at all for
the relationship between shop items and the items table, and instead
have my own internal id that is valid forever. Records removed
upstream now just get marked as deleted (by adding a removal
timestamp). The same item_id can be reused since the unique key
is now (item_id, removal_date); in other words, there is only
one non-deleted row per item_id, but there can be several deleted ones.
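A minimal schema sketch, assuming SQLite and illustrative column names.
One caveat: if a NULL removal_date marks a live row, UNIQUE(item_id,
removal_date) alone won't enforce the "single non-deleted row" invariant
in SQLite (NULLs count as distinct there), so a partial unique index is
one way to express it:

```cpp
#include <sqlite3.h>
#include <cstdio>

// Illustrative schema only; real column names and types will differ.
static const char* kSchema =
    "PRAGMA foreign_keys = ON;"                // enforce FKs on this connection
    "CREATE TABLE IF NOT EXISTS items ("
    "  id           INTEGER PRIMARY KEY,"      // internal id, stable forever
    "  item_id      INTEGER NOT NULL,"         // upstream id, may be recycled
    "  unique_name  TEXT NOT NULL,"
    "  removal_date TEXT"                      // NULL while the item is live
    ");"
    // Only one live (non-deleted) row per upstream item_id,
    // any number of deleted ones.
    "CREATE UNIQUE INDEX IF NOT EXISTS items_live "
    "  ON items(item_id) WHERE removal_date IS NULL;"
    // shop_items references the internal id, so upstream deletions or
    // recycled item_ids can never orphan or repoint a shop entry.
    "CREATE TABLE IF NOT EXISTS shop_items ("
    "  id      INTEGER PRIMARY KEY,"
    "  item_fk INTEGER NOT NULL REFERENCES items(id)"
    ");";

int main() {
    sqlite3* db = nullptr;
    if (sqlite3_open("items.db", &db) != SQLITE_OK) return 1;
    char* err = nullptr;
    if (sqlite3_exec(db, kSchema, nullptr, nullptr, &err) != SQLITE_OK) {
        std::fprintf(stderr, "schema error: %s\n", err);
        sqlite3_free(err);
    }
    sqlite3_close(db);
    return 0;
}
```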
The code stores the full items set into items_staging, marks records
that are in items but not in items_staging as deleted, then adds any
records missing from items using items_staging, and finally drops items_staging.
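Under the same assumptions as the schema sketch above, the refresh could
look roughly like this (loading the upstream snapshot into items_staging
is left out):

```cpp
#include <sqlite3.h>
#include <cstdio>

// Refresh the items table from a full upstream snapshot already loaded
// into items_staging: soft-delete rows missing upstream, add rows that
// are new, then throw the staging table away.
bool refresh_items(sqlite3* db) {
    static const char* kSync =
        "BEGIN;"
        // 1. Mark live rows whose item_id no longer appears upstream as deleted.
        "UPDATE items SET removal_date = datetime('now') "
        " WHERE removal_date IS NULL "
        "   AND item_id NOT IN (SELECT item_id FROM items_staging);"
        // 2. Insert upstream rows that have no live counterpart yet
        //    (this also covers a recycled item_id: the old row stays,
        //    deleted, and a fresh live row is created).
        "INSERT INTO items (item_id, unique_name) "
        " SELECT s.item_id, s.unique_name FROM items_staging s "
        " WHERE NOT EXISTS (SELECT 1 FROM items i "
        "                    WHERE i.item_id = s.item_id "
        "                      AND i.removal_date IS NULL);"
        // 3. Staging data is no longer needed.
        "DROP TABLE items_staging;"
        "COMMIT;";
    char* err = nullptr;
    if (sqlite3_exec(db, kSync, nullptr, nullptr, &err) != SQLITE_OK) {
        std::fprintf(stderr, "refresh error: %s\n", err);
        sqlite3_free(err);
        sqlite3_exec(db, "ROLLBACK;", nullptr, nullptr, nullptr);
        return false;
    }
    return true;
}
```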
It contains the list of cards inserted into the slots of the items
in the shop_items table. I can't enable the foreign key on card_id
because the items table gets razed periodically; I need to find a
solution to this.
Dates in the JSON reply seem to come in two different formats:
one for the reply timestamp, the other for the actual content
(for example, shop opening dates).
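The exact formats aren't noted above, but either way the handling boils
down to two small parsers keyed on where the field comes from. A sketch,
assuming (purely for illustration) a Unix epoch for the reply timestamp
and an ISO-8601-like string for content dates:

```cpp
#include <chrono>
#include <ctime>
#include <iomanip>
#include <sstream>
#include <string>

// Both formats below are assumptions, not necessarily what the API sends:
// reply timestamp as Unix epoch seconds, content dates (e.g. shop opening
// dates) as "YYYY-MM-DD HH:MM:SS" strings.
using TimePoint = std::chrono::system_clock::time_point;

TimePoint parse_reply_timestamp(long long epoch_seconds) {
    return std::chrono::system_clock::from_time_t(
        static_cast<std::time_t>(epoch_seconds));
}

TimePoint parse_content_date(const std::string& text) {
    std::tm tm{};
    std::istringstream in(text);
    in >> std::get_time(&tm, "%Y-%m-%d %H:%M:%S");
    // mktime interprets the struct in local time; timegm would be the
    // choice if the strings turn out to be UTC.
    return std::chrono::system_clock::from_time_t(std::mktime(&tm));
}
```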
This splits the thread pool code from the event code, so the
class names are now wrong (I will update them in the next
commit). Also, this disregards the roar11 vs taskflow library
question; taskflow was not working anyway, and I will probably
remove all the conditional compilation stuff around it for
now.