This fixes the build. For the moment the make function just
takes a string, since that's all that has ever been needed.
This is expected to change, since the PostgreSQL backend will
need a URL, a port, a password and so on. Connection strings
may still cover all of that, but if not I'll probably need to
find a way to accept arbitrary parameters in OriginsDB::make().
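As a very rough sketch (every name below is hypothetical, not the
current API, which still takes a single string), an options struct
could carry per-backend parameters:

    #include <cstdint>
    #include <string>

    // Hypothetical parameter bag for OriginsDB::make(); field names
    // are assumptions.
    struct DBConnectionParams {
        std::string   backend  = "sqlite";   // or "postgresql" later
        std::string   database;              // file path (sqlite) or db name (postgres)
        std::string   host;                  // unused by sqlite
        std::uint16_t port     = 5432;       // postgres default
        std::string   user;
        std::string   password;
    };

    // e.g. static std::unique_ptr<OriginsDB> OriginsDB::make(const DBConnectionParams&);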
Currently it can't be disabled, since SQLite is the only
supported backend and at least one backend must be enabled
and available on the system. This will be needed later.
Previously the code would just throw an exception and quit on
a foreign key constraint violation. Instead, add an empty row
with the missing id so the insert succeeds; the next time the
item list returns that item, the empty row gets filled in.
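Roughly what the fallback amounts to, sketched against the sqlite3
C API (table and column names are assumptions about the schema):

    #include <sqlite3.h>
    #include <cstdint>

    // Insert a placeholder row so the foreign key constraint is
    // satisfied; it gets filled in when the item list next returns
    // this item.
    bool insert_placeholder_item(sqlite3* db, std::int64_t missing_id)
    {
        sqlite3_stmt* stmt = nullptr;
        // INSERT OR IGNORE is a no-op if the row already exists.
        if (sqlite3_prepare_v2(db,
                "INSERT OR IGNORE INTO items (id) VALUES (?1);",
                -1, &stmt, nullptr) != SQLITE_OK)
            return false;
        sqlite3_bind_int64(stmt, 1, missing_id);
        const bool ok = (sqlite3_step(stmt) == SQLITE_DONE);
        sqlite3_finalize(stmt);
        return ok;
    }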
This is quite a big update. The new feature is that on startup
the program loads the next update time from the db, if present,
and resumes the timers from there. The maximum wait time is
currently capped at 24h.
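As a sketch, the resume computation could look something like this
(assuming the stored value is a unix-epoch time point; the actual
storage format may differ):

    #include <algorithm>
    #include <chrono>

    // How long to wait before the next update: zero if the stored
    // time has already passed, and never more than the 24h cap.
    std::chrono::seconds wait_for_next_update(std::chrono::system_clock::time_point next_update)
    {
        using namespace std::chrono;
        constexpr auto max_wait  = duration_cast<seconds>(hours{24});
        const auto     remaining = duration_cast<seconds>(next_update - system_clock::now());
        return std::clamp(remaining, seconds{0}, max_wait);
    }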
TimerOroApi is no longer templated on methods but on the new
DBOperation enum, so that it has an enum value it can pass on
to other functions. This could've been a separate commit, but
wth...
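The shape of the change, sketched (the enum values shown are
assumptions):

    // TimerOroApi now carries a DBOperation value it can hand to
    // other code.
    enum class DBOperation { items, icons, shop };   // assumed values

    template <DBOperation Op>
    class TimerOroApi {
    public:
        static constexpr DBOperation operation = Op;  // forwarded to other functions
        // ... timer / API plumbing as before ...
    };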
Magic Enum allows me to iterate over an enum's values at build
time; that's useful in OriginsDB for building the SELECT query
over the access table (see make_select_last_access_str()).
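For example, the column list can be generated from the enum like
this (a sketch: the access table layout implied here is an
assumption, and the DBOperation values are the ones assumed above):

    #include <magic_enum.hpp>
    #include <string>

    enum class DBOperation { items, icons, shop };   // assumed values, as above

    // magic_enum::enum_values() yields all enumerators and
    // enum_name() turns each into its identifier, giving e.g.
    // "SELECT items, icons, shop FROM access;".
    std::string make_select_last_access_str()
    {
        std::string sql = "SELECT ";
        bool first = true;
        for (auto op : magic_enum::enum_values<DBOperation>()) {
            if (!first)
                sql += ", ";
            first = false;
            sql += magic_enum::enum_name(op);
        }
        sql += " FROM access;";
        return sql;
    }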
This makes ThreadPool::join() private again.
Eventia events should now inherit from Event, which takes care
of setting up the callbacks required for events to be notified
when the main event loop stops. Inheriting from Event is
currently not enforced.
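A very rough sketch of the intent; EventLoop below is a stand-in,
not the actual Eventia loop type, and the real interfaces differ:

    #include <functional>
    #include <utility>
    #include <vector>

    // Stand-in for the main loop: collects stop callbacks and fires
    // them when the loop shuts down.
    class EventLoop {
    public:
        void add_stop_callback(std::function<void()> cb) { stop_callbacks_.push_back(std::move(cb)); }
        void stop() { for (auto& cb : stop_callbacks_) cb(); }
    private:
        std::vector<std::function<void()>> stop_callbacks_;
    };

    // Base class for events: registers the shutdown notification so
    // derived events only need to override on_loop_stopping().
    class Event {
    public:
        explicit Event(EventLoop& loop)
        {
            loop.add_stop_callback([this] { on_loop_stopping(); });
        }
        virtual ~Event() = default;

    protected:
        virtual void on_loop_stopping() {}
    };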
This works in a much simpler way than the items table: here we
just upsert new values, with the unique constraint on the
internal item_id (not the upstream one, which is never entered
into this table).
This means that if an item from the items table with e.g.
item_id=501, id=9 gets marked as deleted and re-entered as
item_id=501, id=5000, the icon with item_id=9 remains in the
table (there is never a delete) and a new one with item_id=5000
gets added. Both originally referred to item 501 and they still
do, but because of how the items table now works, icons benefit
as well: historical data survives upstream changes.
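The whole thing boils down to an upsert along these lines (the
icon data column name is an assumption):

    // The unique constraint is on the internal item_id, so re-sending
    // the same internal id just refreshes the stored icon; rows are
    // never deleted.
    constexpr const char* upsert_icon_sql =
        "INSERT INTO icons (item_id, icon) VALUES (?1, ?2) "
        "ON CONFLICT(item_id) DO UPDATE SET icon = excluded.icon;";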
Now, if item_id or unique_name changes, that counts as the
record being deleted and a new one inserted. If neither of
those changed but any other field did, the old record gets
updated with the new values.
The program receives the full items list periodically. I don't
think they will ever remove records from it, but if they do,
the program should not corrupt the db. So far the simple
drop-table/create-table approach ensured the items list was
always fresh, but it prevented foreign key constraints on shop
items. Also, if an item was removed from items, any entry in
shop would become orphaned, or worse, point to the wrong item
if the id got recycled. The solution is to not use item_id at
all for the relationship between shop items and the items
table, and instead use my own internal id that is valid
forever. Records removed upstream now just get marked as
deleted (by setting a removal timestamp). The same item_id can
be reused, since the unique key is now (item_id, removal_date);
in other words, there is only one non-deleted row per item_id,
but there can be several deleted ones.
The code stores the full items set into items_staging, marks
records that are in items but not in items_staging as deleted,
then adds any missing records from items_staging to items, and
finally drops items_staging.
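The refresh sequence, sketched as SQL (columns other than item_id,
unique_name and removal_date aren't known here, so they're omitted):

    constexpr const char* sync_items_sql = R"sql(
        -- 1. Live rows whose (item_id, unique_name) no longer appear
        --    upstream are marked as deleted, never physically removed.
        UPDATE items
           SET removal_date = strftime('%s', 'now')
         WHERE removal_date IS NULL
           AND NOT EXISTS (SELECT 1 FROM items_staging AS s
                            WHERE s.item_id     = items.item_id
                              AND s.unique_name = items.unique_name);

        -- 2. Upstream rows with no live counterpart become new records.
        INSERT INTO items (item_id, unique_name)
        SELECT s.item_id, s.unique_name
          FROM items_staging AS s
         WHERE NOT EXISTS (SELECT 1 FROM items AS i
                            WHERE i.item_id     = s.item_id
                              AND i.unique_name = s.unique_name
                              AND i.removal_date IS NULL);

        -- 3. Rows matching on both keys would get their remaining
        --    columns refreshed here (omitted: those column names
        --    aren't listed in this message).

        DROP TABLE items_staging;
    )sql";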
It contains the list of cards inserted into the slots of the
items in the shop_items table. I can't enable the foreign key
on card_id because the items table gets razed periodically;
I need to find a solution for this.
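Roughly the shape of the table (every name below is an assumption;
the actual schema may differ):

    constexpr const char* create_cards_sql = R"sql(
        CREATE TABLE IF NOT EXISTS cards (
            shop_item_id INTEGER NOT NULL REFERENCES shop_items(id),
            slot         INTEGER NOT NULL,  -- which card slot of the item
            card_id      INTEGER NOT NULL
            -- FOREIGN KEY (card_id) REFERENCES items(item_id)
            --   left out for now: items is still dropped and recreated
            --   periodically, which would break the constraint.
        );
    )sql";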