251 Commits

Author SHA1 Message Date
Dniel97
d75f62bcb4 Merge pull request 'importer fix + extras' (#241) from Keeboy/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/241
Reviewed-by: Dniel97 <dniel97@noreply.gitea.tendokyu.moe>
2026-01-29 05:00:47 +00:00
Keeboy99
04019da9ac importer fix + extras 2026-01-29 11:43:10 +13:00
Hay1tsme
d91c21d047 Merge pull request '[chuni] Bad profile subtrophy id defaults fix (issue #235)' (#239) from daydensteve/artemis_chuni_webui_improvements:chuni_subtrophy_db_fix into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/239
2026-01-09 02:16:20 +00:00
daydensteve
b3824f038f Fixed bad default subtrophy values (issue #235) 2026-01-08 20:34:28 -05:00
Dniel97
f2afb3cff5 Merge branch 'feature/chunithm/xverse' into develop 2026-01-01 20:47:29 +01:00
Dniel97
8408d30dc5 chuni: add stage import and frontend config 2026-01-01 20:40:27 +01:00
beerpsi
2cbf34dc28 CHUNITHM X-VERSE support (#238)
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/238
Co-authored-by: beerpsi <beerpsi@duck.com>
Co-committed-by: beerpsi <beerpsi@duck.com>
2026-01-01 21:35:23 +00:00
beerpsi
29a52d2712 chuni: fix map area/unlock challenge conditions (#237)
- Document all map area/unlock challenge condition IDs
- Add conditions for missing secret maps in LUMINOUS PLUS/VERSE

Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/237
Reviewed-by: Dniel97 <dniel97@noreply.gitea.tendokyu.moe>
Co-authored-by: beerpsi <beerpsi@duck.com>
Co-committed-by: beerpsi <beerpsi@duck.com>
2025-12-31 14:37:46 +00:00
Hay1tsme
5ba0c8b04c properly encode allnet response for some older titles 2025-11-24 09:31:41 -05:00
daydensteve
b1e629b3d7 [chuni] misc frontend improvements/fixes (i.e. webp instead of png; css error; hide subtrophies on old version) (#234)
TL;DR: the avatar and userbox frontend pages can get hella slow on first load when a ton of stuff is unlocked. It's driven primarily by all the images the server has to push to the client. To reduce the burden, these changes switch from png to webp for all scraped images during import, reducing image sizes to roughly 20% of their png equivalents.

The filelist is long so here's a summary list of changes:
- Replaced png assets with webp versions
- Updated read.py to save assets as webp instead of png
- Updated frontend.py and jinja to use webp instead of png
- Added a conversion function run by both the importer and the frontend on launch that looks for previously imported png files and converts them to webp (a minimal sketch of this kind of conversion is shown below). Only included for the sake of anyone who already did imports since the frontend improvements were introduced.
- [bugfix] Fixed a css bug in the avatar jinja that affected Save/Reset button use on super narrow screens
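For illustration, a minimal sketch of the kind of png-to-webp migration described above (the actual helper in read.py/frontend.py may be named and structured differently; Pillow is assumed to be available):

```python
from pathlib import Path

from PIL import Image  # Pillow, assumed to be available to the importer/frontend


def convert_legacy_pngs(asset_dir: str) -> int:
    """Convert previously imported .png assets to .webp and remove the originals."""
    converted = 0
    for png_path in Path(asset_dir).rglob("*.png"):
        webp_path = png_path.with_suffix(".webp")
        if webp_path.exists():
            continue  # already migrated on a previous launch
        with Image.open(png_path) as img:
            img.save(webp_path, "WEBP")
        png_path.unlink()
        converted += 1
    return converted
```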

Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/234
Co-authored-by: daydensteve <daydensteve@gmail.com>
Co-committed-by: daydensteve <daydensteve@gmail.com>
2025-10-18 15:25:23 +00:00
Hay1tsme
e11db14292 Merge pull request '[mai2] Prism Plus support' (#232) from SoulGateKey/artemis:prism_plus_support into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/232
2025-10-07 17:59:48 +00:00
SoulGateKey
77152bf25c change down_revision when final merge 2025-10-07 17:59:02 +00:00
Hay1tsme
f346d8572d chuni: fix typo in upgrade script 2025-10-07 13:26:54 -04:00
Hay1tsme
ce621065a4 Merge pull request 'CHUNITHM VERSE support' (#224) from feature/chuni_verse_support into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/224
2025-09-27 20:20:21 +00:00
Hay1tsme
2d84865155 add ota update channels 2025-09-27 16:17:44 -04:00
Hay1tsme
10d38e14ae config: fix typo preventing ssl_cert from working correctly 2025-09-26 13:59:12 -04:00
Hay1tsme
8194520cca Merge pull request 'Remove duplicate get_opts in Ongeki static' (#228) from Kayori/artemis:smallcleanup into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/228
2025-09-24 00:24:21 +00:00
Hay1tsme
2612fc984c tui: more work 2025-09-21 00:33:35 -04:00
Hay1tsme
dd546dcce2 idz: pretty up no key message, fix double handshake 2025-09-20 16:07:30 -04:00
SoulGateKey
21415de775 add lut 2025-09-19 23:13:57 +08:00
Dniel97
44168d6e71 chuni: fix card maker reqs 2025-09-18 21:38:48 +02:00
SoulGateKey
d16ebe27d9 Merge pull request 'develop' (#16) from develop into prism_plus_support
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/16
2025-09-18 13:05:46 +00:00
Hay1tsme
5ec6cc0398 tui: arcade management view 2025-09-17 18:08:34 -04:00
Hay1tsme
c92ede9e55 tui: add edit user view 2025-09-17 12:38:14 -04:00
Hay1tsme
92422684ef sao: fix error on unknown response type 2025-09-16 22:50:06 -04:00
Hay1tsme
3df0f3fb06 idac: fix incorrect game_cfg variable name 2025-09-16 21:34:17 -04:00
Hay1tsme
0c800759bb diva: fix binary requests not being awaited 2025-09-16 21:31:27 -04:00
Hay1tsme
3ad56306bf remove unused funcs from TitleServlet 2025-09-16 20:59:35 -04:00
Hay1tsme
d5c68a624f billing: bomb out early if we have unsent logs to avoid duplicating work that's never used 2025-09-16 18:11:52 -04:00
SoulGateKey
fa18b4c6a2 Merge pull request 'develop' (#15) from develop into prism_plus_support
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/15
2025-09-16 17:54:52 +00:00
Hay1tsme
7e254a0281 Merge pull request '[Mai2] enhance music score path handling and improve error logging' (#226) from SoulGateKey/artemis:mai2_enhance_musicscoreapi into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/226
2025-09-16 17:37:52 +00:00
Dniel97
41dbf4fb78 chuni: add int datatype to user foreign key 2025-09-01 23:03:18 +02:00
Dniel97
415bbc92b3 chuni: add userbox sub trophy, fix unlock challenge 2025-08-31 19:49:06 +02:00
ThatzOkay
abe480d007 remove duplicate method 2025-08-23 20:35:58 +02:00
SoulGateKey
cccb5ce1a7 Merge branch 'develop' into mai2_enhance_musicscoreapi 2025-08-20 18:28:49 +00:00
SoulGateKey
064f2b6b54 enhance music score path handling and improve error logging 2025-08-21 02:27:29 +08:00
Kayori
b62c89b749 merge upstream 2025-08-20 17:45:22 +00:00
Dniel97
3c7ac3ac58 chuni: fix encrypted hash, update unlock challenge req 2025-08-19 19:26:09 +02:00
Dniel97
91f06ccfd2 chuni: initial verse support 2025-08-19 19:26:08 +02:00
Hay1tsme
9f916a6302 mai2: fix card importer 2025-08-19 10:54:54 -04:00
Hay1tsme
628cb89436 Merge pull request '[Mai2] rival support' (#223) from SoulGateKey/artemis:mai2_rival_support into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/223
2025-08-09 06:02:01 +00:00
Hay1tsme
3172aa0838 Merge pull request 'mai2: support unfavoriting music' (#225) from ppc/artemis:mai2_favorites into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/225
2025-08-09 06:00:47 +00:00
ppc
2d61646307 support removing favourite music entries 2025-08-06 18:04:40 +01:00
SoulGateKey
46d79d156b Merge branch 'mai2_rival_support' into prism_plus_support 2025-08-01 00:55:12 +08:00
SoulGateKey
de8294820a UserFriendRegist Bugfix 2025-08-01 00:54:48 +08:00
Kayori
fb4e10c2ae merge upstream 2025-07-29 12:18:03 +00:00
SoulGateKey
15e8eb535b Merge remote-tracking branch 'origin/prism_plus_support' into prism_plus_support 2025-07-28 20:06:24 +08:00
SoulGateKey
392fdb3783 Merge branch 'mai2_rival_support' into prism_plus_support 2025-07-28 01:04:00 +08:00
SoulGateKey
1833e8abde add mai2 rival management features to frontend and templates 2025-07-27 03:06:30 +08:00
SoulGateKey
91545bb974 Merge branch 'mai2_rival_support' into prism_plus_support 2025-07-26 18:43:37 +08:00
SoulGateKey
2a12f84dd9 add implement of GetUserFriendCheckApi and UserFriendRegistApi 2025-07-26 18:42:48 +08:00
SoulGateKey
e52362d87f Merge pull request 'develop' (#14) from develop into prism_plus_support
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/14
2025-07-25 18:16:21 +00:00
SoulGateKey
e3ec58b238 Merge branch 'mai2_MusicScoreApi_support' into prism_plus_support 2025-07-26 01:38:45 +08:00
SoulGateKey
fbfe3c2adb merge upstream 2025-07-25 11:08:27 +00:00
SoulGateKey
621113f1d5 Merge pull request 'add support for GetGameMusicScoreApi' (#13) from mai2_MusicScoreApi_support into develop
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/13
2025-07-25 11:07:54 +00:00
Hay1tsme
84db07a3c9 Merge pull request 'add support for GetGameMusicScoreApi' (#220) from SoulGateKey/artemis:mai2_MusicScoreApi_support into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/220
2025-07-25 00:09:07 +00:00
SoulGateKey
18701576e3 add support for GetGameMusicScoreApi
The API is used to deliver charts from the server for mai2 Prism and later.
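Purely as an illustration of a server-side chart delivery handler (the request/response field names, file layout, and base64 packaging below are assumptions, not the fields the real handler uses):

```python
import base64
from pathlib import Path

CHART_ROOT = Path("data/mai2/music_score")  # assumed storage location for chart payloads


async def handle_get_game_music_score_api_request(data: dict) -> dict:
    # Field names here are placeholders; the actual API payload may differ.
    music_id = int(data.get("musicId", 0))
    level = int(data.get("level", 0))
    chart_path = CHART_ROOT / str(music_id) / f"{level}.bin"

    if not chart_path.is_file():
        return {"returnCode": 0}  # nothing to deliver for this chart

    payload = base64.b64encode(chart_path.read_bytes()).decode()
    return {"returnCode": 1, "gameMusicScore": payload}
```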
2025-07-24 20:20:08 +08:00
SoulGateKey
0a4dc8dbb0 add Gate 7,8,9,10 support 2025-07-22 19:55:33 +08:00
SoulGateKey
180c027575 Sync Develop branch's update 2025-07-22 03:37:39 +08:00
Kayori
a86b8eeddb merge upstream 2025-07-13 21:35:02 +00:00
Hay1tsme
c59ce54eb0 Merge pull request 'docker: add nginx in docker compose file' (#218) from zaphkito/artemis:nginx_docker into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/218
2025-07-13 05:22:28 +00:00
Hay1tsme
07c08a7070 Merge pull request 'chu&mai2: add option to use https' (#217) from zaphkito/artemis:chumai_https into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/217
2025-07-13 05:22:18 +00:00
Hay1tsme
d77a509c71 Merge pull request 'mai2: fix Exp and Chn encryption' (#216) from zaphkito/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/216
2025-07-13 05:21:19 +00:00
Kayori
d72603d101 merge upstream 2025-07-01 19:44:59 +00:00
zaphkito
6e5a83f3d7 mai2: correct game code check when not encrypted and enable encrypted only 2025-06-14 12:24:07 +00:00
zaphkito
cf6666b866 mai2: fix Exp and Chn encryption 2025-06-14 15:07:55 +08:00
zaphkito
592e6961e8 chu&mai2: add option to use https 2025-06-14 14:51:21 +08:00
zaphkito
4a417f869a docker: a little change 2025-06-14 05:33:37 +08:00
zaphkito
d8bd26d16e nginx: add WAHLAP billing in example config file 2025-06-14 05:29:25 +08:00
zaphkito
faafee3cbe docker: add nginx in docker compose file 2025-06-14 05:18:11 +08:00
Hay1tsme
3a8a67cee2 Merge pull request 'feature/SDHJ' (#215) from Keeboy/artemis:feature/SDHJ into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/215
2025-06-03 16:17:59 +00:00
Keeboy99
02bfc7dba2 SDGB support + extras 2025-06-02 16:31:57 +12:00
Keeboy99
33b7db0e98 Added support for multiple Allnet Lite keys + extras 2025-05-31 20:04:28 +12:00
Keeboy99
4875caab93 ChimeDB qr code lookup/userid complete 2025-05-31 12:23:40 +12:00
Keeboy99
3e848d684f SDHJ title server support added + encryption 2025-05-31 09:36:53 +12:00
Keeboy99
1e4cb0b380 Simple Download Order implementation for Allnet Lite 2025-05-31 08:44:51 +12:00
Keeboy99
5e5365d22b Allnet Lite Power On Support 2025-05-31 07:15:35 +12:00
SoulGateKey
611806828a add Gate 5,6 judgement 2025-05-15 08:19:42 +08:00
SoulGateKey
8b050e89eb fix conflict mark 2025-05-15 03:53:42 +08:00
SoulGateKey
394ec74fb7 add key unlock condition define 2025-05-15 03:14:24 +08:00
SoulGateKey
a2a333b13f Merge remote-tracking branch 'origin/prism_plus_support' into prism_plus_support
# Conflicts:
#	titles/mai2/index.py
2025-05-15 02:45:56 +08:00
SoulGateKey
933d8bea21 add import 2025-05-15 02:44:37 +08:00
SoulGateKey
f70af35343 Merge branch 'refs/heads/develop' into prism_plus_support
# Conflicts:
#	core/data/alembic/versions/16f34bf7b968_mai2_kaleidx_scope_support.py
#	core/data/alembic/versions/5cf98cfe52ad_mai2_prism_support.py
#	core/data/alembic/versions/5d7b38996e67_mai2_prism_support.py
#	core/data/alembic/versions/bdf710616ba4_mai2_add_prism_playlog_support.py
#	titles/mai2/index.py
#	titles/mai2/prism.py
#	titles/mai2/read.py
#	titles/mai2/schema/static.py
2025-05-15 02:41:55 +08:00
Hay1tsme
8e83527e82 Merge pull request 'fix_mai2_internal_ver' (#211) from SoulGateKey/artemis:fix_mai2_internal_ver into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/211
2025-05-05 04:42:52 +00:00
Hay1tsme
e6d7888655 billing: fix infinite loop 2025-05-03 15:51:50 -04:00
SoulGateKey
f1a0557f94 fix mai2 internal ver 2025-05-01 17:30:06 +00:00
Hay1tsme
a74ca85300 allnet: fix playhistory 2025-04-24 23:56:19 -04:00
Hay1tsme
eea9ca21ca allnet: basic playhistory 2025-04-24 23:04:01 -04:00
Hay1tsme
ce475e801b allnet: save billing traces 2025-04-17 20:11:33 -04:00
Hay1tsme
ada9377c06 ongeki: remove BM card duplicate check 2025-04-09 18:09:41 -04:00
Hay1tsme
2640f23a00 ongeki: fix opt reader 2025-04-09 00:10:54 -04:00
Hay1tsme
c955c1ae37 mai2: fix opt reader 2025-04-08 23:45:15 -04:00
Hay1tsme
9a14e54328 ongeki: add opts to reader 2025-04-08 17:59:19 -04:00
Hay1tsme
47affd898f mai2: add opts to reader 2025-04-08 17:42:17 -04:00
Hay1tsme
e16bfc713a chuni: add opt to reader 2025-04-08 00:41:49 -04:00
SoulGateKey
1e7f367d0f Merge pull request 'develop' (#9) from develop into prism_plus_support
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/9
2025-04-08 04:37:17 +00:00
SoulGateKey
34fbfff06e Merge branch 'develop' into Hay1tsme-develop 2025-04-08 12:31:58 +08:00
Hay1tsme
a7077fb41c mai2: add prism to version lut 2025-04-07 23:16:06 -04:00
Hay1tsme
311b439cc7 Merge pull request 'chuni: userbox ui and read cleanup' (#206) from daydensteve/artemis-develop-chuni-userbox-ui-fix:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/206
2025-04-08 03:14:57 +00:00
Hay1tsme
9dd30773b2 Merge pull request 'mai2:prism_support' (#205) from SoulGateKey/artemis:prism_support into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/205
2025-04-08 03:14:36 +00:00
SoulGateKey
ecd4cc205e Add new KaleidxScope Condition handle method 2025-04-08 08:34:21 +08:00
SoulGateKey
926cb9e3bd Merge remote-tracking branch 'origin/prism_support' into prism_support 2025-04-08 08:33:40 +08:00
SoulGateKey
e757990682 Merge pull request 'develop' (#5) from Hay1tsme/artemis:develop into prism_support
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/5
2025-04-08 00:32:33 +00:00
SoulGateKey
e3f13ee871 Merge branch 'prism_support' into develop 2025-04-08 00:31:31 +00:00
SoulGateKey
703068e965 delete unused alembic file
create new alembic file
2025-04-08 08:11:01 +08:00
SoulGateKey
73fa77368d Merge branch 'refs/heads/develop' into prism_support 2025-04-08 11:32:49 +08:00
SoulGateKey
aa34ba3a2d Merge pull request 'develop' (#3) from Hay1tsme/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/3
2025-04-08 00:03:00 +00:00
Hay1tsme
bd3ccef770 Merge pull request 'mai2_item_present fixed' (#208) from SoulGateKey/artemis:fix_mai2_present into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/208
2025-04-07 23:41:34 +00:00
SoulGateKey
4cb120dbb6 Merge pull request 'add opt static tables' (#2) from Hay1tsme/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/2
2025-04-07 23:30:18 +00:00
SoulGateKey
ceee8e3477 Merge branch 'develop' into fix_mai2_present 2025-04-07 23:25:42 +00:00
SoulGateKey
ed2b6044ff mai2_item_present fixed 2025-04-08 11:18:40 +08:00
Hay1tsme
1cab68006d add opt static tables 2025-04-07 18:31:11 -04:00
SoulGateKey
134af15ed7 update readme.md and game_specific_info.md 2025-04-07 09:15:29 +08:00
SoulGateKey
a4bcca9171 add clientplaytimeapi support 2025-04-07 09:15:29 +08:00
SoulGateKey
d598c8fba0 add prism+ playlog support 2025-04-07 09:15:29 +08:00
SoulGateKey
3b7a577ea2 add prism+ consts and cm support 2025-04-07 09:15:29 +08:00
daydensteve
eb601e3293 fixed read failures in older chuni versions where sortname doesn't exist. Also noticed some character import errors associated with & 2025-04-05 20:10:07 -04:00
daydensteve
5906bc3486 fixed inappropriate use of character illustration id instead of base character id. The Userbox jinja would break if the profile was using an alternate character illustration 2025-04-05 18:05:00 -04:00
SoulGateKey
dd10508e68 unused database deleted 2025-04-04 09:12:08 +08:00
SoulGateKey
c0df7cd084 database support for prism
kaleidxScope Key Condition store
2025-04-04 09:10:41 +08:00
SoulGateKey
9a7fc007bc standardize KaleidxScope variable names 2025-04-04 05:42:53 +08:00
SoulGateKey
756c7ce951 update readme.md and game_specific_info.md 2025-04-02 13:56:56 +08:00
SoulGateKey
4ceac7db35 add clientplaytimeapi support 2025-04-02 13:49:46 +08:00
SoulGateKey
f8888c2392 add prism+ playlog support 2025-04-02 12:42:40 +08:00
SoulGateKey
9bc18f179d add prism+ consts and cm support 2025-04-02 12:37:43 +08:00
SoulGateKey
cc7afa6b67 database add kaleidx scope support 2025-04-02 11:57:08 +08:00
SoulGateKey
94b3c47c3c update readme.md and game_specific_info.md 2025-04-02 09:42:40 +08:00
SoulGateKey
3d84e32892 add Kaleidx Scope Support 2025-04-02 09:42:08 +08:00
SoulGateKey
d77d02c2dd database add Mai2Prism support 2025-04-02 06:42:28 +08:00
SoulGateKey
36354ae109 add GetUserKaleidxScopeApi handler 2025-04-02 05:46:44 +08:00
SoulGateKey
814c4fd284 add GetGameKaleidxScopeApi handler 2025-04-02 04:54:01 +08:00
SoulGateKey
6821ab6f46 add UploadUserPlaylogListApi handler for Exp version 2025-04-02 04:35:39 +08:00
SoulGateKey
6b1b607db0 add GetGameMusicScoreApi handler 2025-04-02 04:35:14 +08:00
SoulGateKey
f4b9f48ed6 add cardmaker support for Prism 2025-04-02 03:07:59 +08:00
SoulGateKey
fa94c029ca add GetUserNewItemListApi handler 2025-04-02 03:07:35 +08:00
SoulGateKey
1d545b2bd2 add const.py of prism 2025-04-02 02:32:34 +08:00
Hay1tsme
96a252cbf3 ongeki: fix act 3 database upgrade script 2025-03-29 11:25:07 -04:00
Hay1tsme
62e61ec975 Merge pull request 'O.N.G.E.K.I. bright MEMORY Act.3 support added' (#204) from feature/ongeki_act3 into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/204
2025-03-29 15:23:45 +00:00
Hay1tsme
f939d4976e chuni: make total columns BigInt, for #203 2025-03-29 11:22:12 -04:00
Dniel97
fbcc53aeae ongeki: update ongeki_static_tech_music_uk 2025-03-26 21:20:22 +01:00
Dniel97
a2f71dc553 ongeki: bright MEMORY Act.3 support added 2025-03-26 15:25:34 +01:00
Hay1tsme
60002a466f diva: put full name in frontend header 2025-03-25 11:32:12 -04:00
Hay1tsme
c1fa528e45 chuni: fix frontend 500 if no profile is available 2025-03-25 11:30:44 -04:00
Hay1tsme
017cdecbaa db: fix missing param in add_arcade_owner 2025-03-25 11:22:37 -04:00
Hay1tsme
b6d22ef41d frontend: arcade management rework 2025-03-25 10:43:26 -04:00
Hay1tsme
20d9a2da9c sao: fix frontend 2025-03-22 00:58:56 -04:00
Hay1tsme
afdcd9a731 mai2: remove print statements from frontend 2025-03-22 00:58:49 -04:00
Hay1tsme
882560a790 adb: fix semantics with FelicaLookupEx 2025-03-21 09:47:44 -04:00
Hay1tsme
376b77be29 frontend: serial rollover after 9999 generated serials 2025-03-20 14:53:28 -04:00
Dniel97
cdd46d51b7 chuni: fix favorite music list 2025-03-02 18:34:06 +01:00
Hay1tsme
399f983bea allnet: fix download order response, billing logging 2025-03-02 04:10:06 -05:00
Hay1tsme
360dfdfdc1 Merge pull request 'ongeki: use the latest applicable version' (#200) from akanyan/artemis:fix/ongeki/versions into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/200
2025-02-22 15:24:55 +00:00
Hay1tsme
0f52b89033 remove deprecated warn 2025-02-21 23:51:59 -05:00
Hay1tsme
d4394788b7 Merge pull request 'Fix Crossbeats Read Script' (#201) from Galexion/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/201
2025-02-15 20:29:03 +00:00
Galexion
f3f0569755 Fixes Capitalization on CrossBeats Read.py 2025-01-27 19:50:49 +00:00
akanyan
59a3c28134 ongeki: use the latest applicable version 2025-01-20 22:34:05 +00:00
Midorica
b62e9beb67 Merge pull request 'ongeki: proper handling of music ranking list' (#195) from akanyan/artemis:feat/ongeki/music-ranking into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/195
2025-01-14 02:31:51 +00:00
Midorica
6ea27b1632 Merge pull request 'ongeki: read music version from the xml' (#194) from akanyan/artemis:fix/ongeki/read-music into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/194
2025-01-14 01:56:02 +00:00
akanyan
fa667d15f2 ongeki: proper handling of music ranking list 2025-01-06 18:39:49 +00:00
akanyan
ab64eea5c9 ongeki: read music version from the xml 2024-12-30 18:31:22 +00:00
Hay1tsme
0cf41ff389 TUI: add card management screen 2024-12-20 17:40:55 -05:00
Hay1tsme
e93fcfd706 Merge pull request 'Fix: AimeDB Felica LookupEx rename package parameter' (#188) from SoulGateKey/artemis:lookup_fix into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/188
2024-12-19 06:41:02 +00:00
Kevin Trocolli
b81d5c9cc5 adb: fix minor logging typo 2024-12-19 01:37:50 -05:00
Hay1tsme
6a305d2514 Merge pull request '[database] fix invalid transaction being left open' (#187) from beerpsi/artemis:fix/invalid-transaction into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/187
2024-12-19 06:14:45 +00:00
Hay1tsme
e8c90634b6 Merge pull request '[chunithm] fix rival music not showing up in game' (#190) from beerpsi/artemis:fix/chunithm/rivals into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/190
2024-12-19 06:14:10 +00:00
Hay1tsme
5d2f0eaae6 Merge pull request '[chunithm] support luminous+' (#193) from beerpsi/artemis:feat/chunithm/luminousplus into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/193
2024-12-19 06:13:49 +00:00
beerpsi
5475b52336 [chunithm] support luminous+ 2024-12-19 13:03:37 +07:00
Kevin Trocolli
f830764990 tui: fix minor alignment issue 2024-12-19 00:21:39 -05:00
Kevin Trocolli
e8cd6e9596 tui: add user lookup 2024-12-19 00:17:00 -05:00
Hay1tsme
326b5988af add half-working TUI 2024-12-18 16:35:28 -05:00
Kevin Trocolli
e8ea328e77 mai2: add add_consec_login call if get_consec_login returns None #189 2024-12-15 20:21:03 -05:00
Kevin Trocolli
1dceff456d cxb: added missing r which fixes an issue on ubuntu 24.04.1 2024-12-15 20:16:18 -05:00
beerpsi
fe8f365d8a [chunithm] fix rival music not showing up in game 2024-12-12 20:49:39 +07:00
beerpsi
d6d98d20cb fix: typing shenanigans 2024-12-12 20:47:34 +07:00
SoulGateKey
5ecc7984c7 Fix: AimeDB Felica LookupEx rename package parameter 2024-12-08 08:44:32 +08:00
SoulGateKey
d797e5f6b7 Merge pull request 'update develop' (#4) from Hay1tsme/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/4
2024-12-08 00:12:03 +00:00
Kevin Trocolli
a8f5ef1550 allnet: properly dfi encode downloadorder responses 2024-12-01 14:19:55 -05:00
Kevin Trocolli
383859388e chuni: fix 'NoneType' object has no attribute 'split' in score.py 2024-11-29 22:20:55 -05:00
beerpsi
476a911df9 [database] fix invalid transaction being left open 2024-11-25 20:13:51 +07:00
SoulGateKey
fe9a04ef8e Merge pull request 'develop' (#3) from Hay1tsme/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/3
2024-11-20 19:25:36 +00:00
beerpsi
58a5177a30 use SQL's limit/offset pagination for nextIndex/maxCount requests (#185)
Instead of retrieving the entire list of items/characters/scores/etc. at once (and even storing them in memory), use SQL's `LIMIT ... OFFSET ...` pagination so we only take what we need.

Currently only CHUNITHM uses this, but this will also affect maimai DX and O.N.G.E.K.I. once the PR is ready.

Also snuck in a fix for CHUNITHM/maimai DX's `GetUserRivalMusicApi` to respect the `userRivalMusicLevelList` sent by the client.

### How this works

Say we have a `GetUserCharacterApi` request:

```json
{
    "userId": 10000,
    "maxCount": 700,
    "nextIndex": 0
}
```

Instead of getting the entire character list from the database (which can be very large if the user force unlocked everything), add limit/offset to the query:

```python
select(character)
.where(character.c.user == user_id)
.order_by(character.c.id.asc())
.limit(max_count + 1)
.offset(next_index)
```

The query takes `maxCount + 1` items from the database to determine whether there are more items than can be returned:

```python
rows = ...

if len(rows) > max_count:
    # return only max_count rows
    next_index += max_count
else:
    # return everything left
    next_index = -1
```

This has the benefit of not needing to load everything into memory (and also not having to store server state, as seen in the [`SCORE_BUFFER` list](2274b42358/titles/chuni/base.py (L13)).)
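Putting the two pieces together, a minimal sketch of the handler-side pattern (the table definition is an illustrative stand-in; the real table and session objects live in the schema classes):

```python
from sqlalchemy import Column, Integer, MetaData, Table, select

metadata = MetaData()
# Illustrative stand-in for the character table used in the snippets above.
character = Table(
    "chuni_item_character", metadata,
    Column("id", Integer, primary_key=True),
    Column("user", Integer),
)


async def get_characters_paged(session, user_id: int, next_index: int, max_count: int):
    stmt = (
        select(character)
        .where(character.c.user == user_id)
        .order_by(character.c.id.asc())
        .limit(max_count + 1)  # fetch one extra row to detect "more remaining"
        .offset(next_index)
    )
    rows = (await session.execute(stmt)).fetchall()

    if len(rows) > max_count:
        rows = rows[:max_count]  # return only max_count rows
        next_index += max_count  # the client resumes from here on the next request
    else:
        next_index = -1  # nothing left to page through

    return rows, next_index
```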

Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/185
Co-authored-by: beerpsi <beerpsi@duck.com>
Co-committed-by: beerpsi <beerpsi@duck.com>
2024-11-16 19:10:29 +00:00
Hay1tsme
cb009f6e23 wacca: tiny cleanup 2024-11-14 12:39:21 -05:00
Hay1tsme
2274b42358 Merge pull request '[database] make async' (#184) from beerpsi/artemis:fix/async-database into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/184
2024-11-14 06:15:49 +00:00
beerpsi
789d50c406 use AsyncSession directly
see the warnings in https://docs.sqlalchemy.org/en/14/orm/extensions/asyncio.html#using-asyncio-scoped-session
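For reference, a minimal sketch of the "use AsyncSession directly" approach (connection URL and query are assumptions; the actual wiring lives in the data layer):

```python
from sqlalchemy import text
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine

engine = create_async_engine("mysql+aiomysql://user:pass@localhost/aime")  # assumed URL


async def ping_database() -> int:
    # One short-lived AsyncSession per unit of work, instead of an
    # asyncio scoped session shared between tasks.
    async with AsyncSession(engine) as session:
        result = await session.execute(text("SELECT 1"))
        return result.scalar()
```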
2024-11-14 13:10:14 +07:00
beerpsi
4c33f4282a oops forgot a dependency on aiomysql 2024-11-14 12:38:00 +07:00
beerpsi
bc7524c8fc fix: make database async 2024-11-14 12:36:22 +07:00
Hay1tsme
1331d473c9 Merge pull request '[mai2] Implement GetGameRankingAPI . Fix photo merge , Add UserScoreRankingAPI handler' (#181) from SoulGateKey/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/181
2024-11-13 05:37:00 +00:00
Midorica
b7a006f7ee core: pushing changes regarding MySQL ssl toggle that is now mandatory 2024-11-12 10:53:02 -05:00
Hay1tsme
65100920e3 Merge pull request '[chuni] web ui - customization support (user box, avatar, map icon, system voice)' (#182) from daydensteve/artemis-develop:chuni_ui_overhaul into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/182
2024-11-12 12:03:14 +00:00
SoulGateKey
7a307b4d69 Merge pull request 'Fix mai2 photo merge problem and Add UserScoreRankingAPI handler' (#2) from mai2_tournament_support into develop
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/2
2024-11-12 05:42:20 +00:00
SoulGateKey
f4dff9b4c1 fix: mai2 photos cant be merged 2024-11-11 21:16:19 +08:00
SoulGateKey
8a6250bebd Formatted log print
Change log level
2024-11-11 21:11:33 +08:00
daydensteve
eb18ad22b8 hardened ui against the db not being upgraded or the importer not being run 2024-11-08 09:17:12 -05:00
daydensteve
954bd565d3 reduced db access with new chuni webui customizations 2024-11-07 20:28:28 -05:00
SoulGateKey
f272e97eae Formatted log print
Change log level
2024-11-06 02:44:07 +08:00
SoulGateKey
aa7ae6cb51 Formatted log print 2024-11-06 02:38:18 +08:00
daydensteve
3a44b18d91 fixed erroneously wide trophy select 2024-11-03 19:27:20 -05:00
daydensteve
c8186ccef0 fixed doc typo 2024-11-03 19:20:36 -05:00
daydensteve
4a701a5755 chuni doc updates 2024-11-03 19:19:05 -05:00
daydensteve
f5205801a8 Added customization unlock overrides 2024-11-03 19:12:49 -05:00
daydensteve
626ce6bd96 userbox, avatar, mapicon, and voice ui configuration 2024-11-03 18:37:09 -05:00
daydensteve
e49c70b738 more enums! 2024-11-03 16:37:27 -05:00
daydensteve
c2d4abcc26 db and import updates for userbox, avatar, voice, and map icon 2024-11-03 16:37:05 -05:00
daydensteve
2f6974cab6 new chuni ui images/directories 2024-11-03 08:48:13 -05:00
daydensteve
9b89cef51c ignore visual studio pro files 2024-11-03 08:46:12 -05:00
SoulGateKey
221517e310 TODO: GetUserScoreRankingApi 2024-10-30 12:37:18 +08:00
SoulGateKey
52b397f31f Merge remote-tracking branch 'origin/develop' into sgkdev
# Conflicts:
#	titles/mai2/schema/profile.py
2024-10-30 12:28:26 +08:00
SoulGateKey
b84e17a66b Merge pull request 'mai2_handle_get_game_ranking' (#1) from mai2_handle_get_game_ranking into develop
Reviewed-on: https://gitea.tendokyu.moe/SoulGateKey/artemis/pulls/1
2024-10-30 04:18:53 +00:00
SoulGateKey
b6e7e0973b Delete unused dependency 2024-10-11 16:19:07 +00:00
SoulGateKey
598e4aad76 Update mai2/schema/score.py to support new handle_get_game_ranking 2024-10-11 16:16:40 +00:00
SoulGateKey
a673d9dabd Delete unused dependency 2024-10-11 16:12:53 +00:00
SoulGateKey
398fa9059d Update mai2/base.py using the ORM 2024-10-11 16:09:53 +00:00
SoulGateKey
29f4a6a696 revert 033c1aa776
revert Update 卖
2024-10-11 16:08:15 +00:00
SoulGateKey
033c1aa776 Update 卖 2024-10-11 16:06:17 +00:00
SoulGateKey
bbf41ac83f Merge branch 'develop' into mai2_handle_get_game_ranking 2024-10-11 15:56:05 +00:00
Kevin Trocolli
451754cf3c sao: fix my store 2024-10-06 16:09:09 -04:00
Kevin Trocolli
0cef797a8a mai2: rework photo uploads, relates to #67 2024-10-06 03:47:10 -04:00
SoulGateKey
58ae491a8c add pymysql to requirements.txt 2024-10-03 19:47:36 +00:00
SoulGateKey
3843ac6eb1 mai2: calc GetGameRanking result 2024-10-03 19:32:17 +00:00
daydensteve
ed5e7dc561 [chuni] Added truncation to long Title and Artist Name values on import (#178)
I noticed the importer failing to import music 523 (Niji-iro no Flügel) from an omni pack due to the artist name being crazy long.

To address this, I added truncation to max column value length for both the Title and Artist Name values. Considered doing this for the other 3 string fields as well but I can't imagine those ever being problematic.

Import now succeeds, with a warning generated about the truncation occurring.
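A minimal sketch of the truncation idea (table and column names are illustrative, not the exact code added to read.py):

```python
from sqlalchemy import Column, Integer, MetaData, String, Table

metadata = MetaData()
music = Table(
    "chuni_static_music", metadata,  # illustrative stand-in for the real table
    Column("id", Integer, primary_key=True),
    Column("title", String(255)),
    Column("artist", String(255)),
)


def truncate_to_column(value: str, column) -> str:
    """Clamp a string to the column's declared length, warning when data is lost."""
    max_len = getattr(column.type, "length", None)
    if max_len is not None and len(value) > max_len:
        print(f"WARNING: truncating {column.name} from {len(value)} to {max_len} chars")
        return value[:max_len]
    return value


artist = truncate_to_column("a" * 300, music.c.artist)  # over-long value clamped to 255 chars
```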

Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/178
Co-authored-by: daydensteve <daydensteve@gmail.com>
Co-committed-by: daydensteve <daydensteve@gmail.com>
2024-09-25 15:21:30 +00:00
daydensteve
b04840f3dd [chuni] Frontend favorites support (#176)
I had been itching for the favorites feature since I'm bad with Japanese, so I figured I'd go ahead and add it. I've included a few pics to help visualize the changes.

### Summary of user-facing changes:
- New Favorites frontend page that itemizes favorites by genre for the current version (as selected on the Profile page). Favorites can be removed from this page via the Remove button
- Updated the Records page so that it only shows the playlog for the currently selected version and includes a "star" to the left of each title that can be clicked to add/remove favorites. When the star is yellow, it's a favorite; when it's a grey outline, it's not. I figure it's pretty straightforward
- The Records and new Favorites pages show the jacket image of each song now (The Importer was updated to convert the DDS files to PNGs on import)

### Behind-the-scenes changes:
- Fixed a bug in the chuni get_song method - it was inappropriately comparing the row id instead of the musicid (note this method was not used prior to adding favorites support)
- Overhauled the score schema file to stop with all the hacky romVersion determination that was going on in various methods. To do this, I created a new ChuniRomVersion class that is populated with all base rom versions, then used to derive the internal integer version number from the string stored in the DB. As written, this functionality can infer recorded rom versions when the playlog was entered using an update to the base version (e.g. 2.16 vs 2.15 for sunplus or 2.22 vs 2.20 for luminous); see the sketch after this list.
- Made the chuni config version class safer as it would previously throw an exception if you gave it a version not present in the config file. This was done in support of the score overhaul to build up the initial ChuniRomVersion dict
- Added necessary methods to query/update the favorites table.
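A hedged sketch of the version-inference idea behind ChuniRomVersion (the mapping values and function name are assumptions, not the actual class in the score schema):

```python
# Map each base rom version ("major.minor") to its internal integer version.
# The values here are assumed for illustration.
BASE_ROM_VERSIONS = {
    "2.15": 14,  # SUN PLUS
    "2.20": 15,  # LUMINOUS
}


def to_internal_version(rom_version: str) -> int:
    """Resolve a stored rom version string (e.g. "2.16.00") to a base internal version.

    Updates built on a base release (2.16 on SUN PLUS, 2.22 on LUMINOUS) resolve
    to the same internal version as that base.
    """
    major, minor = (int(x) for x in rom_version.split(".")[:2])
    candidates = [
        (tuple(int(x) for x in base.split(".")), internal)
        for base, internal in BASE_ROM_VERSIONS.items()
        if tuple(int(x) for x in base.split(".")) <= (major, minor)
    ]
    if not candidates:
        raise ValueError(f"Unknown rom version: {rom_version}")
    return max(candidates)[1]  # newest base version not newer than the stored version
```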

### Testing
- Frontend testing was performed with playlog data for both sunplus (2.16) and luminous (2.22) present. All add/remove permutations and image behavior worked as expected
- Game testing was performed only with Luminous (2.22) and worked fine

Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/176
Co-authored-by: daydensteve <daydensteve@gmail.com>
Co-committed-by: daydensteve <daydensteve@gmail.com>
2024-09-25 14:53:43 +00:00
Hay1tsme
1d8e31d4ab docs: add missing games 2024-09-23 14:46:48 -04:00
Hay1tsme
045465ed4e idz: disabled by default to silence warnings for people who don't feel like configuring games they don't intend to use 2024-09-23 14:46:41 -04:00
Hay1tsme
aa8e33a13e docs: add pokken to game specific info 2024-09-23 14:20:25 -04:00
ppc
f47175a144 [mai2] add buddies plus support (#177)
Adds favorite music support (there's an option in the results screen to star a song), handlers for new methods and fixes upsert failures for `userFavoriteList`.
The `UserIntimateApi` has been added but didn't seem to add any data during testing, and `CreateTokenApi`/`RemoveTokenApi` have also been added but I think they're only used during guest play.

---
Tested on 1.45 with no errors/game crashes (see logs). Card Maker hasn't been tested as I don't have a setup to play with.

Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/177
Co-authored-by: ppc <albie@ppc.moe>
Co-committed-by: ppc <albie@ppc.moe>
2024-09-23 17:21:29 +00:00
Hay1tsme
e85728f33c chuni/mai2: remove upsert from put_playlog 2024-09-20 17:10:48 -04:00
ppc
5c60cde14f update docs 2024-09-19 23:10:54 +01:00
ppc
8d04d74f52 migration consistency 2024-09-18 17:59:24 +01:00
ppc
d85c575c61 add nullcheck 2024-09-18 11:29:14 +01:00
ppc
196aa601f3 handle GetUserMissionDataApi 2024-09-16 18:01:06 +01:00
ppc
d8169e37cc add mai2 UserIntimateApi 2024-09-16 17:56:22 +01:00
ppc
77aa1afaa0 add mai2 favorite music support 2024-09-16 16:55:09 +01:00
ppc
e128631e8f fix upsert failures 2024-09-16 12:39:15 +01:00
ppc
b01ac24799 add shop stock/friend bonus handlers 2024-09-16 09:54:39 +00:00
ppc
01dad267b9 add mai2 database upgrade 2024-09-15 20:48:24 +00:00
ppc
cc302b6e56 update cm reader 2024-09-15 19:22:47 +00:00
ppc
ee4eddd639 add buddies plus support 2024-09-15 19:22:39 +00:00
Hay1tsme
6f6a300879 Merge pull request '[BUGFIX] Fixed Chusan Map Overload' (#175) from EmmyHeart/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/175
2024-09-14 04:34:44 +00:00
EmmyHeart
82004cb743 Fix map overload in Chusan 2024-09-14 01:30:29 +00:00
EmmyHeart
8f4c08f825 Fix map overload in Chusan 2024-09-14 01:28:35 +00:00
Hay1tsme
7ebd9bfb8a Merge pull request '[chuni] Auto stock tickets at login' (#170) from daydensteve/artemis:chuni_ticket_stock into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/170
2024-09-13 18:12:32 +00:00
Hay1tsme
19c9740f4b Merge pull request '[aime] fix ADB header parsing error when using generated keychips' (#173) from ppc/artemis:aime-regex-fix into develop
Reviewed-on: https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/173
2024-09-13 18:11:37 +00:00
ppc
1db020b5fc fix generated keychip validation failures 2024-09-09 16:53:14 +00:00
Hay1tsme
d1048694d4 Fix --config option not being respected, fixes #172 2024-09-06 10:36:57 -04:00
Hay1tsme
944b80129b chuni: fix ultimate/worlds end chart reading, closes #63 2024-09-05 11:45:22 -04:00
Hay1tsme
73dda06413 mai2: add warning about portrait uploading not being supported. #67 2024-09-05 11:37:52 -04:00
daydensteve
eacd4a2f43 Adding stock_tickets and stock_count chuni mods. Enables specified tickets to be auto-stocked on login 2024-09-02 20:00:59 -04:00
Kayori
eb66c9159f Merge pull request 'develop' (#12) from Hay1tsme/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/Kayori/artemis/pulls/12
2024-08-21 09:40:35 +00:00
Midorica
dd33144040 adding luminous to readme 2024-08-16 09:26:12 -04:00
ThatzOkay
5c45091cec Merge pull request 'develop' (#11) from Hay1tsme/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/ThatzOkay/artemis/pulls/11
2024-07-19 12:22:22 +00:00
ThatzOkay
00d3b6e69a Merge pull request 'develop' (#10) from Hay1tsme/artemis:develop into develop
Reviewed-on: https://gitea.tendokyu.moe/ThatzOkay/artemis/pulls/10
2024-07-09 22:48:20 +00:00
160 changed files with 13023 additions and 2017 deletions

.gitignore vendored
View File

@@ -145,6 +145,7 @@ dmypy.json
cython_debug/
.vscode/*
.vs/*
# Local History for Visual Studio Code
.history/

View File

@@ -1,6 +1,13 @@
# Changelog
Documenting updates to ARTEMiS, to be updated every time the master branch is pushed to.
## 20250803
+ CHUNITHM VERSE support added
## 20250327
+ O.N.G.E.K.I. bright MEMORY Act.3 support added
+ CardMaker support updated
## 20240811
### System
+ Change backend from Twisted to Starlette

View File

@@ -1,7 +1,5 @@
from core.config import CoreConfig
from core.allnet import AllnetServlet, BillingServlet
from core.aimedb import AimedbServlette
from core.title import TitleServlet
from core.utils import Utils
from core.mucha import MuchaServlet
from core.frontend import FrontendServlet

View File

@@ -120,7 +120,7 @@ class ADBHeader:
if self.store_id == 0:
raise ADBHeaderException(f"Store ID cannot be 0!")
if re.fullmatch(r"^A[0-9]{2}[E|X][0-9]{2}[A-HJ-NP-Z][0-9]{4}$", self.keychip_id) is None:
if re.fullmatch(r"^A[0-9]{2}[A-Z][0-9]{2}[A-HJ-NP-Z][0-9]{4}$", self.keychip_id) is None:
raise ADBHeaderException(f"Keychip ID {self.keychip_id} is invalid!")
return True

View File

@@ -39,10 +39,10 @@ class ADBFelicaLookupExRequest(ADBBaseRequest):
def __init__(self, data: bytes) -> None:
super().__init__(data)
self.random = struct.unpack_from("<16s", data, 0x20)[0]
idm, pmm = struct.unpack_from(">QQ", data, 0x30)
idm, dfc, self.arbitrary = struct.unpack_from(">QH6s", data, 0x30)
self.card_key_ver, self.write_ct, self.maca, company, fw_ver, self.dfc = struct.unpack_from("<16s16sQccH", data, 0x40)
self.idm = hex(idm)[2:].upper()
self.pmm = hex(pmm)[2:].upper()
self.dfc = hex(dfc)[2:].upper()
self.company = CompanyCodes(int.from_bytes(company, 'little'))
self.fw_ver = ReaderFwVer.from_byte(fw_ver)

View File

@@ -137,7 +137,7 @@ class AimedbServlette():
resp_bytes = resp
elif resp is None: # Nothing to send, probably a goodbye
self.logger.warn(f"None return by handler for {name}")
self.logger.warning(f"None return by handler for {name}")
return
else:
@@ -177,7 +177,7 @@ class AimedbServlette():
async def handle_lookup(self, data: bytes, resp_code: int) -> ADBBaseResponse:
req = ADBLookupRequest(data)
if req.access_code == "00000000000000000000":
self.logger.warn(f"All-zero access code from {req.head.keychip_id}")
self.logger.warning(f"All-zero access code from {req.head.keychip_id}")
ret = ADBLookupResponse.from_req(req.head, -1)
ret.head.status = ADBStatus.BAN_SYS
return ret
@@ -208,7 +208,7 @@ class AimedbServlette():
async def handle_lookup_ex(self, data: bytes, resp_code: int) -> ADBBaseResponse:
req = ADBLookupRequest(data)
if req.access_code == "00000000000000000000":
self.logger.warn(f"All-zero access code from {req.head.keychip_id}")
self.logger.warning(f"All-zero access code from {req.head.keychip_id}")
ret = ADBLookupExResponse.from_req(req.head, -1)
ret.head.status = ADBStatus.BAN_SYS
return ret
@@ -254,7 +254,7 @@ class AimedbServlette():
req = ADBFelicaLookupRequest(data)
idm = req.idm.zfill(16)
if idm == "0000000000000000":
self.logger.warn(f"All-zero IDm from {req.head.keychip_id}")
self.logger.warning(f"All-zero IDm from {req.head.keychip_id}")
ret = ADBFelicaLookupResponse.from_req(req.head, "00000000000000000000")
ret.head.status = ADBStatus.BAN_SYS
return ret
@@ -270,7 +270,7 @@ class AimedbServlette():
ac = card['access_code']
self.logger.info(
f"idm {idm} ipm {req.pmm.zfill(16)} -> access_code {ac}"
f"idm {idm} pmm {req.pmm.zfill(16)} -> access_code {ac}"
)
return ADBFelicaLookupResponse.from_req(req.head, ac)
@@ -283,7 +283,7 @@ class AimedbServlette():
idm = req.idm.zfill(16)
if idm == "0000000000000000":
self.logger.warn(f"All-zero IDm from {req.head.keychip_id}")
self.logger.warning(f"All-zero IDm from {req.head.keychip_id}")
ret = ADBFelicaLookupResponse.from_req(req.head, "00000000000000000000")
ret.head.status = ADBStatus.BAN_SYS
return ret
@@ -323,7 +323,7 @@ class AimedbServlette():
idm = req.idm.zfill(16)
if idm == "0000000000000000":
self.logger.warn(f"All-zero IDm from {req.head.keychip_id}")
self.logger.warning(f"All-zero IDm from {req.head.keychip_id}")
ret = ADBFelicaLookupExResponse.from_req(req.head, -1, "00000000000000000000")
ret.head.status = ADBStatus.BAN_SYS
return ret
@@ -344,7 +344,7 @@ class AimedbServlette():
user_id = -1
self.logger.info(
f"idm {idm} ipm {req.pmm} -> access_code {access_code} user_id {user_id}"
f"idm {idm} dfc {req.dfc} -> access_code {access_code} user_id {user_id}"
)
resp = ADBFelicaLookupExResponse.from_req(req.head, user_id, access_code)
@@ -382,7 +382,7 @@ class AimedbServlette():
user_id = -1
if req.access_code == "00000000000000000000":
self.logger.warn(f"All-zero access code from {req.head.keychip_id}")
self.logger.warning(f"All-zero access code from {req.head.keychip_id}")
ret = ADBLookupResponse.from_req(req.head, -1)
ret.head.status = ADBStatus.BAN_SYS
return ret

View File

@@ -7,6 +7,7 @@ import logging
import coloredlogs
import urllib.parse
import math
import random
from typing import Dict, List, Any, Optional, Union, Final
from logging.handlers import TimedRotatingFileHandler
from starlette.requests import Request
@@ -17,7 +18,10 @@ from datetime import datetime
from enum import Enum
from Crypto.PublicKey import RSA
from Crypto.Hash import SHA
from Crypto.Cipher import AES
from Crypto.Util.Padding import pad
from Crypto.Signature import PKCS1_v1_5
import os
from os import path, environ, mkdir, access, W_OK
from .config import CoreConfig
@@ -132,12 +136,29 @@ class AllnetServlet:
async def handle_poweron(self, request: Request):
request_ip = Utils.get_ip_addr(request)
pragma_header = request.headers.get('Pragma', "")
useragent_header = request.headers.get('User-Agent', "")
is_dfi = pragma_header == "DFI"
is_lite = useragent_header[5:] == "Windows/Lite"
lite_id = useragent_header[:4]
data = await request.body()
if not self.config.allnet.allnet_lite_keys and is_lite:
self.logger.error("!!!LITE KEYS NOT SET!!!")
raise AllnetRequestException()
elif is_lite:
for gameids, key in self.config.allnet.allnet_lite_keys.items():
if gameids == lite_id:
litekey = key
if is_lite and "litekey" not in locals():
self.logger.error("!!!UNIQUE LITE KEY NOT FOUND!!!")
raise AllnetRequestException()
try:
if is_dfi:
req_urlencode = self.from_dfi(data)
elif is_lite:
req_urlencode = self.dec_lite(litekey, data[:16], data)
else:
req_urlencode = data
@@ -145,20 +166,30 @@ class AllnetServlet:
if req_dict is None:
raise AllnetRequestException()
req = AllnetPowerOnRequest(req_dict[0])
if is_lite:
req = AllnetPowerOnRequestLite(req_dict[0])
else:
req = AllnetPowerOnRequest(req_dict[0])
# Validate the request. Currently we only validate the fields we plan on using
if not req.game_id or not req.ver or not req.serial or not req.ip or not req.firm_ver or not req.boot_ver:
if not req.game_id or not req.ver or not req.serial or not req.token and is_lite:
raise AllnetRequestException(
f"Bad auth request params from {request_ip} - {vars(req)}"
)
elif not is_lite:
if not req.game_id or not req.ver or not req.serial or not req.ip or not req.firm_ver or not req.boot_ver:
raise AllnetRequestException(
f"Bad auth request params from {request_ip} - {vars(req)}"
)
except AllnetRequestException as e:
if e.message != "":
self.logger.error(e)
return PlainTextResponse()
if req.format_ver == 3:
if is_lite:
resp = AllnetPowerOnResponseLite(req.token)
elif req.format_ver == 3:
resp = AllnetPowerOnResponse3(req.token)
elif req.format_ver == 2:
resp = AllnetPowerOnResponse2()
@@ -175,11 +206,14 @@ class AllnetServlet:
)
self.logger.warning(msg)
resp.stat = ALLNET_STAT.bad_machine.value
if is_lite:
resp.result = ALLNET_STAT.bad_machine.value
else:
resp.stat = ALLNET_STAT.bad_machine.value
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n")
if machine is not None:
if machine is not None and not is_lite:
arcade = await self.data.arcade.get_arcade(machine["arcade"])
if self.config.server.check_arcade_ip:
if arcade["ip"] and arcade["ip"] is not None and arcade["ip"] != req.ip:
@@ -257,7 +291,10 @@ class AllnetServlet:
)
self.logger.warning(msg)
resp.stat = ALLNET_STAT.bad_game.value
if is_lite:
resp.result = ALLNET_STAT.bad_game.value
else:
resp.stat = ALLNET_STAT.bad_game.value
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n")
@@ -265,8 +302,12 @@ class AllnetServlet:
self.logger.info(
f"Allowed unknown game {req.game_id} v{req.ver} to authenticate from {request_ip} due to 'is_develop' being enabled. S/N: {req.serial}"
)
resp.uri = f"http://{self.config.server.hostname}:{self.config.server.port}/{req.game_id}/{req.ver.replace('.', '')}/"
resp.host = f"{self.config.server.hostname}:{self.config.server.port}"
if is_lite:
resp.uri1 = f"http://{self.config.server.hostname}:{self.config.server.port}/{req.game_id}/{req.ver.replace('.', '')}/"
resp.uri2 = f"{self.config.server.hostname}:{self.config.server.port}"
else:
resp.uri = f"http://{self.config.server.hostname}:{self.config.server.port}/{req.game_id}/{req.ver.replace('.', '')}/"
resp.host = f"{self.config.server.hostname}:{self.config.server.port}"
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
resp_str = urllib.parse.unquote(urllib.parse.urlencode(resp_dict))
@@ -277,10 +318,16 @@ class AllnetServlet:
int_ver = req.ver.replace(".", "")
try:
resp.uri, resp.host = TitleServlet.title_registry[req.game_id].get_allnet_info(req.game_id, int(int_ver), req.serial)
if is_lite:
resp.uri1, resp.uri2 = TitleServlet.title_registry[req.game_id].get_allnet_info(req.game_id, int(int_ver), req.serial)
else:
resp.uri, resp.host = TitleServlet.title_registry[req.game_id].get_allnet_info(req.game_id, int(int_ver), req.serial)
except Exception as e:
self.logger.error(f"Error running get_allnet_info for {req.game_id} - {e}")
resp.stat = ALLNET_STAT.bad_game.value
if is_lite:
resp.result = ALLNET_STAT.bad_game.value
else:
resp.stat = ALLNET_STAT.bad_game.value
resp_dict = {k: v for k, v in vars(resp).items() if v is not None}
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(resp_dict)) + "\n")
@@ -308,18 +355,38 @@ class AllnetServlet:
"Pragma": "DFI",
},
)
elif is_lite:
iv = bytes([random.randint(2, 255) for _ in range(16)])
return PlainTextResponse(content=self.enc_lite(litekey, iv, resp_str))
return PlainTextResponse(resp_str)
return PlainTextResponse(resp_str.encode(req.encode))
async def handle_dlorder(self, request: Request):
request_ip = Utils.get_ip_addr(request)
pragma_header = request.headers.get('Pragma', "")
useragent_header = request.headers.get('User-Agent', "")
is_dfi = pragma_header == "DFI"
is_lite = useragent_header[5:] == "Windows/Lite"
lite_id = useragent_header[:4]
data = await request.body()
if not self.config.allnet.allnet_lite_keys and is_lite:
self.logger.error("!!!LITE KEYS NOT SET!!!")
raise AllnetRequestException()
elif is_lite:
for gameids, key in self.config.allnet.allnet_lite_keys.items():
if gameids == lite_id:
litekey = key
if is_lite and "litekey" not in locals():
self.logger.error("!!!UNIQUE LITE KEY NOT FOUND!!!")
raise AllnetRequestException()
try:
if is_dfi:
req_urlencode = self.from_dfi(data)
elif is_lite:
req_urlencode = self.dec_lite(litekey, data[:16], data)
else:
req_urlencode = data.decode()
@@ -327,7 +394,10 @@ class AllnetServlet:
if req_dict is None:
raise AllnetRequestException()
req = AllnetDownloadOrderRequest(req_dict[0])
if is_lite:
req = AllnetDownloadOrderRequestLite(req_dict[0])
else:
req = AllnetDownloadOrderRequest(req_dict[0])
# Validate the request. Currently we only validate the fields we plan on using
if not req.game_id or not req.ver or not req.serial:
@@ -343,28 +413,46 @@ class AllnetServlet:
self.logger.info(
f"DownloadOrder from {request_ip} -> {req.game_id} v{req.ver} serial {req.serial}"
)
resp = AllnetDownloadOrderResponse(serial=req.serial)
if is_lite:
resp = AllnetDownloadOrderResponseLite()
else:
resp = AllnetDownloadOrderResponse(serial=req.serial)
if (
not self.config.allnet.allow_online_updates
or not self.config.allnet.update_cfg_folder
):
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\n")
resp = urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\n"
if is_dfi:
return PlainTextResponse(
self.to_dfi(resp) + b"\r\n", headers={ "Pragma": "DFI" }
)
elif is_lite:
iv = bytes([random.randint(2, 255) for _ in range(16)])
return PlainTextResponse(content=self.enc_lite(litekey, iv, resp))
return PlainTextResponse(resp)
else:
machine = await self.data.arcade.get_machine(req.serial)
if not machine or not machine['ota_enable'] or not machine['is_cab'] or machine['is_blacklisted']:
return PlainTextResponse(urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\n")
if not machine or not machine['ota_channel'] or not machine['is_cab']:
resp = urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\n"
if is_dfi:
return PlainTextResponse(
self.to_dfi(resp) + b"\r\n", headers={ "Pragma": "DFI" }
)
elif is_lite:
iv = bytes([random.randint(2, 255) for _ in range(16)])
return PlainTextResponse(content=self.enc_lite(litekey, iv, resp))
return PlainTextResponse(resp)
update = await self.data.arcade.get_ota_update(req.game_id, req.ver, machine['ota_channel'])
if update:
if update['app_ini'] and path.exists(f"{self.config.allnet.update_cfg_folder}/{update['app_ini']}"):
resp.uri = f"http://{self.config.server.hostname}:{self.config.server.port}/dl/ini/{update['app_ini']}"
if path.exists(
f"{self.config.allnet.update_cfg_folder}/{req.game_id}-{req.ver.replace('.', '')}-app.ini"
):
resp.uri = f"http://{self.config.server.hostname}:{self.config.server.port}/dl/ini/{req.game_id}-{req.ver.replace('.', '')}-app.ini"
if path.exists(
f"{self.config.allnet.update_cfg_folder}/{req.game_id}-{req.ver.replace('.', '')}-opt.ini"
):
resp.uri += f"|http://{self.config.server.hostname}:{self.config.server.port}/dl/ini/{req.game_id}-{req.ver.replace('.', '')}-opt.ini"
if update['opt_ini'] and path.exists(f"{self.config.allnet.update_cfg_folder}/{update['opt_ini']}"):
resp.uri += f"|http://{self.config.server.hostname}:{self.config.server.port}/dl/ini/{update['opt_ini']}"
if resp.uri:
self.logger.info(f"Sending download uri {resp.uri}")
@@ -383,6 +471,9 @@ class AllnetServlet:
"Pragma": "DFI",
},
)
elif is_lite:
iv = bytes([random.randint(2, 255) for _ in range(16)])
return PlainTextResponse(content=self.enc_lite(litekey, iv, res_str))
return PlainTextResponse(res_str)
@@ -403,7 +494,7 @@ class AllnetServlet:
f"{self.config.allnet.update_cfg_folder}/{req_file}", "r", encoding="utf-8"
).read())
self.logger.info(f"DL INI File {req_file} not found")
self.logger.warning(f"DL INI File {req_file} not found")
return PlainTextResponse()
async def handle_dlorder_report(self, request: Request) -> bytes:
@@ -507,6 +598,17 @@ class AllnetServlet:
zipped = zlib.compress(unzipped)
return base64.b64encode(zipped)
def dec_lite(self, key, iv, data):
cipher = AES.new(bytes(key), AES.MODE_CBC, iv)
decrypted = cipher.decrypt(data)
return decrypted[16:].decode("utf-8")
def enc_lite(self, key, iv, data):
unencrypted = pad(bytes([0] * 16) + data.encode('utf-8'), 16)
cipher = AES.new(bytes(key), AES.MODE_CBC, iv)
encrypted = cipher.encrypt(unencrypted)
return encrypted
class BillingServlet:
def __init__(self, core_cfg: CoreConfig, cfg_folder: str) -> None:
self.config = core_cfg
@@ -576,39 +678,19 @@ class BillingServlet:
rsa = RSA.import_key(open(self.config.billing.signing_key, "rb").read())
signer = PKCS1_v1_5.new(rsa)
digest = SHA.new()
traces: List[TraceData] = []
try:
req = BillingInfo(req_dict[0])
except KeyError as e:
self.logger.error(f"Billing request failed to parse: {e}")
return PlainTextResponse("result=5&linelimit=&message=field is missing or formatting is incorrect\r\n")
for x in range(1, len(req_dict)):
if not req_dict[x]:
continue
try:
tmp = TraceData(req_dict[x])
if tmp.trace_type == TraceDataType.CHARGE:
tmp = TraceDataCharge(req_dict[x])
elif tmp.trace_type == TraceDataType.EVENT:
tmp = TraceDataEvent(req_dict[x])
elif tmp.trace_type == TraceDataType.CREDIT:
tmp = TraceDataCredit(req_dict[x])
traces.append(tmp)
except KeyError as e:
self.logger.warn(f"Tracelog failed to parse: {e}")
kc_serial_bytes = req.keychipid.encode()
machine = await self.data.arcade.get_machine(req.keychipid)
if machine is None and not self.config.server.allow_unregistered_serials:
msg = f"Unrecognised serial {req.keychipid} attempted billing checkin from {request_ip} for {req.gameid} v{req.gamever}."
await self.data.base.log_event(
"allnet", "BILLING_CHECKIN_NG_SERIAL", logging.WARN, msg, ip=request_ip, game=req.gameid, version=req.gamever
"allnet", "BILLING_CHECKIN_NG_SERIAL", logging.WARN, msg, ip=request_ip, game=req.gameid, version=str(req.gamever)
)
self.logger.warning(msg)
@@ -619,18 +701,101 @@ class BillingServlet:
"billing_type": req.billingtype.name,
"nearfull": req.nearfull,
"playlimit": req.playlimit,
"messages": []
}
playhist = "000000/0:000000/0:000000/0"
if machine is not None:
await self.data.base.log_event("billing", "BILLING_CHECKIN_OK", logging.INFO, "", log_details, None, machine['arcade'], machine['id'], request_ip, req.gameid, req.gamever)
if self.config.allnet.save_billing:
lastcredit = await self.data.arcade.billing_get_last_playcount(machine['id'], req.gameid)
if lastcredit is not None:
last_playct = lastcredit['playct']
else:
last_playct = 0
# Technically if a cab resets it's playcount and then does more plays then the previous
# playcount before a billing checkin occours, we will lose plays equal to the current playcount.
if req.playcnt < last_playct: await self.data.arcade.billing_add_playcount(machine['id'], req.gameid, req.playcnt)
elif req.playcnt == last_playct: pass # No plays since last checkin, skip update
else: await self.data.arcade.billing_add_playcount(machine['id'], req.gameid, req.playcnt - last_playct)
plays = await self.data.arcade.billing_get_playcount_3mo(machine['id'], req.gameid)
if plays is not None and len(plays) > 0:
playhist = ""
for x in range(len(plays) - 1, -1, -1): playhist += f"{plays[x]['year']:04d}{plays[x]['month']:02d}/{plays[x]['playct']}:"
playhist = playhist[:-1]
for x in range(1, len(req_dict)):
if not req_dict[x]:
continue
try:
tmp = TraceData(req_dict[x])
if tmp.trace_type == TraceDataType.CHARGE:
tmp = TraceDataCharge(req_dict[x])
if self.config.allnet.save_billing:
await self.data.arcade.billing_add_charge(
machine['id'],
tmp.game_id,
float(tmp.game_version),
tmp.play_count,
tmp.play_limit,
tmp.product_code,
tmp.product_count,
tmp.func_type,
tmp.player_number
)
self.logger.info(
f"Charge Trace from {req.keychipid}: {tmp.game_id} v{tmp.game_version} - player {tmp.player_number} got {tmp.product_count} of {tmp.product_code} func {tmp.func_type}"
)
elif tmp.trace_type == TraceDataType.EVENT:
tmp = TraceDataEvent(req_dict[x])
log_details['messages'].append(tmp.message)
self.logger.info(f"Event Trace from {req.keychipid}: {tmp.message}")
elif tmp.trace_type == TraceDataType.CREDIT:
tmp = TraceDataCredit(req_dict[x])
if self.config.allnet.save_billing:
await self.data.arcade.billing_set_credit(
machine['id'],
req.gameid,
tmp.chute_type.value,
tmp.service_type.value,
tmp.operation_type.value,
tmp.coin_rate0,
tmp.coin_rate1,
tmp.bonus_addition,
tmp.credit_rate,
tmp.credit0,
tmp.credit1,
tmp.credit2,
tmp.credit3,
tmp.credit4,
tmp.credit5,
tmp.credit6,
tmp.credit7
)
self.logger.info(
f"Credit Trace from {req.keychipid}: {tmp.operation_type} mode, {tmp.credit_rate} coins per credit, breakdown: {tmp.credit0} | {tmp.credit1} | {tmp.credit2} | {tmp.credit3} | {tmp.credit4} | {tmp.credit5} | {tmp.credit6} | {tmp.credit7} | "
)
except KeyError as e:
self.logger.warning(f"Tracelog failed to parse: {e}")
await self.data.base.log_event("billing", "BILLING_CHECKIN_OK", logging.INFO, "", log_details, None, machine['arcade'], machine['id'], request_ip, req.gameid, str(req.gamever))
self.logger.info(
f"Unregistered Billing checkin from {request_ip}: game {req.gameid} ver {req.gamever} keychip {req.keychipid} playcount "
f"Billing checkin from {request_ip}: game {req.gameid} ver {req.gamever} keychip {req.keychipid} playcount "
f"{req.playcnt} billing_type {req.billingtype.name} nearfull {req.nearfull} playlimit {req.playlimit}"
)
else:
log_details['serial'] = req.keychipid
await self.data.base.log_event("billing", "BILLING_CHECKIN_OK_UNREG", logging.INFO, "", log_details, None, None, None, request_ip, req.gameid, req.gamever)
await self.data.base.log_event("billing", "BILLING_CHECKIN_OK_UNREG", logging.INFO, "", log_details, None, None, None, request_ip, req.gameid, str(req.gamever))
self.logger.info(
f"Unregistered Billing checkin from {request_ip}: game {req.gameid} ver {req.gamever} keychip {req.keychipid} playcount "
@@ -638,16 +803,14 @@ class BillingServlet:
)
if req.traceleft > 0:
self.logger.warn(f"{req.traceleft} unsent tracelogs")
kc_playlimit = req.playlimit
kc_nearfull = req.nearfull
self.logger.info(f"Requesting 20 more of {req.traceleft} unsent tracelogs")
return PlainTextResponse("result=6&waittime=0&linelimit=20\r\n")
playlimit = req.playlimit
while req.playcnt > playlimit:
playlimit += 1024
while req.playcnt > req.playlimit:
kc_playlimit += 1024
kc_nearfull += 1024
playlimit = kc_playlimit
nearfull = kc_nearfull + (req.billingtype.value * 0x00010000)
nearfull = req.nearfull + (req.billingtype.value * 0x00010000)
digest.update(playlimit.to_bytes(4, "little") + kc_serial_bytes)
playlimit_sig = signer.sign(digest).hex()
@@ -656,16 +819,11 @@ class BillingServlet:
digest.update(nearfull.to_bytes(4, "little") + kc_serial_bytes)
nearfull_sig = signer.sign(digest).hex()
# TODO: playhistory
resp = BillingResponse(playlimit, playlimit_sig, nearfull, nearfull_sig, req.requestno, req.protocolver)
resp = BillingResponse(playlimit, playlimit_sig, nearfull, nearfull_sig, req.requestno, req.protocolver, playhist)
resp_str = urllib.parse.unquote(urllib.parse.urlencode(vars(resp))) + "\r\n"
self.logger.debug(f"response {vars(resp)}")
if req.traceleft > 0:
self.logger.info(f"Requesting 20 more of {req.traceleft} unsent tracelogs")
return PlainTextResponse("result=6&waittime=0&linelimit=20\r\n")
return PlainTextResponse(resp_str)
@@ -705,6 +863,15 @@ class AllnetPowerOnResponse:
self.minute = datetime.now().minute
self.second = datetime.now().second
class AllnetPowerOnRequestLite:
def __init__(self, req: Dict) -> None:
if req is None:
raise AllnetRequestException("Request processing failed")
self.game_id: str = req.get("title_id", None)
self.ver: str = req.get("title_ver", None)
self.serial: str = req.get("client_id", None)
self.token: str = req.get("token", None)
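# Illustrative request dict for the Lite power-on parser above (all values hypothetical;
# only the key names come from the .get() calls):
#   {"title_id": "SXXX", "title_ver": "1.00", "client_id": "A00X00X0000", "token": "1234567890"}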
class AllnetPowerOnResponse3(AllnetPowerOnResponse):
def __init__(self, token) -> None:
super().__init__()
@@ -736,6 +903,30 @@ class AllnetPowerOnResponse2(AllnetPowerOnResponse):
self.timezone = "+09:00"
self.res_class = "PowerOnResponseV2"
class AllnetPowerOnResponseLite:
def __init__(self, token) -> None:
# Custom Allnet Lite response
self.result = 1
self.place_id = "0123"
self.uri1 = ""
self.uri2 = ""
self.name = "ARTEMiS"
self.nickname = "ARTEMiS"
self.setting = "1"
self.region0 = "1"
self.region_name0 = "W"
self.region_name1 = ""
self.region_name2 = ""
self.region_name3 = ""
self.country = "CHN"
self.location_type = "1"
self.utc_time = datetime.now(tz=pytz.timezone("UTC")).strftime(
"%Y-%m-%dT%H:%M:%SZ"
)
self.client_timezone = "+0800"
self.res_ver = "3"
self.token = token
class AllnetDownloadOrderRequest:
def __init__(self, req: Dict) -> None:
self.game_id = req.get("game_id", "")
@@ -743,12 +934,23 @@ class AllnetDownloadOrderRequest:
self.serial = req.get("serial", "")
self.encode = req.get("encode", "")
class AllnetDownloadOrderRequestLite:
def __init__(self, req: Dict) -> None:
self.game_id = req.get("title_id", "")
self.ver = req.get("title_ver", "")
self.serial = req.get("client_id", "")
class AllnetDownloadOrderResponse:
def __init__(self, stat: int = 1, serial: str = "", uri: str = "") -> None:
def __init__(self, stat: int = 1, serial: str = "", uri: str = "null") -> None:
self.stat = stat
self.serial = serial
self.uri = uri
class AllnetDownloadOrderResponseLite:
def __init__(self, result: int = 1, uri: str = "null") -> None:
self.result = result
self.uri = uri
class TraceDataType(Enum):
CHARGE = 0
EVENT = 1
@@ -758,14 +960,27 @@ class BillingType(Enum):
A = 1
B = 0
class TraceDataCreditChuteType(Enum):
COMMON = 0
INDIVIDUAL = 1
class TraceDataCreditOperationType(Enum):
COIN = 0
FREEPLAY = 1
class float5:
def __init__(self, n: str = "0") -> None:
def __init__(self, n: str = "0"):
nf = float(n)
if nf > 999.9 or nf < 0:
raise ValueError('float5 must be between 0.000 and 999.9 inclusive')
return nf
self.val = nf
def __float__(self) -> float:
return self.val
def __str__(self) -> str:
return f"%.{2 - int(math.log10(self.val))+1}f" % self.val
@classmethod
def to_str(cls, f: float):
return f"%.{2 - int(math.log10(f))+1}f" % f
@@ -776,13 +991,13 @@ class BillingInfo:
self.keychipid = str(data.get("keychipid", None))
self.functype = int(data.get("functype", None))
self.gameid = str(data.get("gameid", None))
self.gamever = float(data.get("gamever", None))
self.gamever = float5(data.get("gamever", None))
self.boardid = str(data.get("boardid", None))
self.tenpoip = str(data.get("tenpoip", None))
self.libalibver = float(data.get("libalibver", None))
self.libalibver = float5(data.get("libalibver", None))
self.datamax = int(data.get("datamax", None))
self.billingtype = BillingType(int(data.get("billingtype", None)))
self.protocolver = float(data.get("protocolver", None))
self.protocolver = float5(data.get("protocolver", None))
self.operatingfix = bool(data.get("operatingfix", None))
self.traceleft = int(data.get("traceleft", None))
self.requestno = int(data.get("requestno", None))
@@ -815,7 +1030,7 @@ class TraceData:
self.date = datetime.strptime(data.get("dt", None), BILLING_DT_FORMAT)
self.keychip = str(data.get("kn", None))
self.lib_ver = float(data.get("alib", 0))
self.lib_ver = float5(data.get("alib", 0))
except Exception as e:
raise KeyError(e)
@@ -824,7 +1039,7 @@ class TraceDataCharge(TraceData):
super().__init__(data)
try:
self.game_id = str(data.get("gi", None)) # these seem optional...?
self.game_version = float(data.get("gv", 0))
self.game_version = float5(data.get("gv", 0))
self.board_serial = str(data.get("bn", None))
self.shop_ip = str(data.get("ti", None))
self.play_count = int(data.get("pc", None))
@@ -848,9 +1063,9 @@ class TraceDataCredit(TraceData):
def __init__(self, data: Dict) -> None:
super().__init__(data)
try:
self.chute_type = int(data.get("cct", None))
self.service_type = int(data.get("cst", None))
self.operation_type = int(data.get("cop", None))
self.chute_type = TraceDataCreditChuteType(int(data.get("cct", None)))
self.service_type = TraceDataCreditChuteType(int(data.get("cst", None)))
self.operation_type = TraceDataCreditOperationType(int(data.get("cop", None)))
self.coin_rate0 = int(data.get("cr0", None))
self.coin_rate1 = int(data.get("cr1", None))
self.bonus_addition = int(data.get("cba", None))
@@ -874,7 +1089,7 @@ class BillingResponse:
nearfull: str = "",
nearfull_sig: str = "",
request_num: int = 1,
protocol_ver: float = 1.000,
protocol_ver: float5 = float5("1.000"),
playhistory: str = "000000/0:000000/0:000000/0",
) -> None:
self.result = 0
@@ -888,7 +1103,7 @@ class BillingResponse:
self.nearfull = nearfull
self.nearfullsig = nearfull_sig
self.linelimit = 100
self.protocolver = float5.to_str(protocol_ver)
self.protocolver = str(protocol_ver)
# playhistory -> YYYYMM/C:...
# YYYY -> 4 digit year, MM -> 2 digit month, C -> Playcount during that period
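# Hypothetical example: "202502/41:202503/37:202504/12" reports 41 plays in Feb 2025,
# 37 in Mar 2025 and 12 in Apr 2025; the default above is three empty periods.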
@@ -987,7 +1202,9 @@ app_billing = Starlette(
allnet = AllnetServlet(cfg, cfg_dir)
route_lst = [
Route("/sys/servlet/PowerOn", allnet.handle_poweron, methods=["GET", "POST"]),
Route("/net/initialize", allnet.handle_poweron, methods=["GET", "POST"]),
Route("/sys/servlet/DownloadOrder", allnet.handle_dlorder, methods=["GET", "POST"]),
Route("/net/delivery/instruction", allnet.handle_dlorder, methods=["GET", "POST"]),
Route("/sys/servlet/LoaderStateRecorder", allnet.handle_loaderstaterecorder, methods=["GET", "POST"]),
Route("/sys/servlet/Alive", allnet.handle_alive, methods=["GET", "POST"]),
Route("/naomitest.html", allnet.handle_naomitest),

View File

@@ -9,7 +9,9 @@ from starlette.responses import PlainTextResponse
from os import environ, path, mkdir, W_OK, access
from typing import List
from core import CoreConfig, TitleServlet, MuchaServlet, AllnetServlet, BillingServlet, AimedbServlette
from core import CoreConfig, TitleServlet, MuchaServlet
from core.allnet import AllnetServlet, BillingServlet
from core.chimedb import ChimeServlet
from core.frontend import FrontendServlet
async def dummy_rt(request: Request):
@@ -74,7 +76,9 @@ if not cfg.allnet.standalone:
allnet = AllnetServlet(cfg, cfg_dir)
route_lst += [
Route("/sys/servlet/PowerOn", allnet.handle_poweron, methods=["GET", "POST"]),
Route("/net/initialize", allnet.handle_poweron, methods=["GET", "POST"]),
Route("/sys/servlet/DownloadOrder", allnet.handle_dlorder, methods=["GET", "POST"]),
Route("/net/delivery/instruction", allnet.handle_dlorder, methods=["GET", "POST"]),
Route("/sys/servlet/LoaderStateRecorder", allnet.handle_loaderstaterecorder, methods=["GET", "POST"]),
Route("/sys/servlet/Alive", allnet.handle_alive, methods=["GET", "POST"]),
Route("/naomitest.html", allnet.handle_naomitest),
@@ -86,6 +90,14 @@ if not cfg.allnet.standalone:
Route("/dl/ini/{file:str}", allnet.handle_dlorder_ini),
]
if cfg.chimedb.enable:
chimedb = ChimeServlet(cfg, cfg_dir)
route_lst += [
Route("/wc_aime/api/alive_check", chimedb.handle_qr_alive, methods=["POST"]),
Route("/qrcode/api/alive_check", chimedb.handle_qr_alive, methods=["POST"]),
Route("/wc_aime/api/get_data", chimedb.handle_qr_lookup, methods=["POST"])
]
for code, game in title.title_registry.items():
route_lst += game.get_routes()

core/chimedb.py Normal file
View File

@@ -0,0 +1,139 @@
import hashlib
import json
import logging
from enum import Enum
from logging.handlers import TimedRotatingFileHandler
import coloredlogs
from starlette.responses import PlainTextResponse
from starlette.requests import Request
from core.config import CoreConfig
from core.data import Data
class ChimeDBStatus(Enum):
NONE = 0
READER_SETUP_FAIL = 1
READER_ACCESS_FAIL = 2
READER_INCOMPATIBLE = 3
DB_RESOLVE_FAIL = 4
DB_ACCESS_TIMEOUT = 5
DB_ACCESS_FAIL = 6
AIME_ID_INVALID = 7
NO_BOARD_INFO = 8
LOCK_BAN_SYSTEM_USER = 9
LOCK_BAN_SYSTEM = 10
LOCK_BAN_USER = 11
LOCK_BAN = 12
LOCK_SYSTEM_USER = 13
LOCK_SYSTEM = 14
LOCK_USER = 15
class ChimeServlet:
def __init__(self, core_cfg: CoreConfig, cfg_folder: str) -> None:
self.config = core_cfg
self.config_folder = cfg_folder
self.data = Data(core_cfg)
self.logger = logging.getLogger("chimedb")
if not hasattr(self.logger, "initted"):
log_fmt_str = "[%(asctime)s] Chimedb | %(levelname)s | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(self.config.server.log_dir, "chimedb"),
when="d",
backupCount=10,
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
self.logger.addHandler(fileHandler)
self.logger.addHandler(consoleHandler)
self.logger.setLevel(self.config.aimedb.loglevel)
coloredlogs.install(
level=core_cfg.aimedb.loglevel, logger=self.logger, fmt=log_fmt_str
)
self.logger.initted = True
if not core_cfg.chimedb.key:
self.logger.error("!!!KEY NOT SET!!!")
exit(1)
self.logger.info("Serving")
async def handle_qr_alive(self, request: Request):
return PlainTextResponse("alive")
async def handle_qr_lookup(self, request: Request) -> PlainTextResponse:
req = json.loads(await request.body())
access_code = req["qrCode"][-20:]
timestamp = req["timestamp"]
try:
userId = await self._lookup(access_code)
data = json.dumps({
"userID": userId,
"errorID": 0,
"timestamp": timestamp,
"key": self._hash_key(userId, timestamp)
})
except Exception as e:
self.logger.error(e.with_traceback(None))
data = json.dumps({
"userID": -1,
"errorID": ChimeDBStatus.DB_ACCESS_FAIL,
"timestamp": timestamp,
"key": self._hash_key(-1, timestamp)
})
return PlainTextResponse(data)
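# Shape of the exchange handled above (values are illustrative only):
#   request : {"qrCode": "<QR payload ending in the 20-digit access code>", "timestamp": 1700000000}
#   response: {"userID": 42, "errorID": 0, "timestamp": 1700000000, "key": "<64-char uppercase SHA-256 hex>"}
# On a lookup failure the response instead carries userID -1 and errorID 6 (DB_ACCESS_FAIL).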
def _hash_key(self, chip_id, timestamp):
input_string = f"{chip_id}{timestamp}{self.config.chimedb.key}"
hash_object = hashlib.sha256(input_string.encode('utf-8'))
hex_dig = hash_object.hexdigest()
formatted_hex = format(int(hex_dig, 16), '064x').upper()
return formatted_hex
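# Example of the hashing above (hypothetical inputs): chip_id=42, timestamp=1700000000 and a
# configured key of "secret" are concatenated to "421700000000secret", hashed with SHA-256,
# and returned as a 64-character uppercase hex digest that is echoed back in the response.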
async def _lookup(self, access_code):
user_id = await self.data.card.get_user_id_from_card(access_code)
self.logger.info(f"access_code {access_code} -> user_id {user_id}")
if not user_id or user_id <= 0:
user_id = await self._register(access_code)
return user_id
async def _register(self, access_code):
user_id = -1
if self.config.server.allow_user_registration:
user_id = await self.data.user.create_user()
if user_id is None:
self.logger.error("Failed to register user!")
user_id = -1
else:
card_id = await self.data.card.create_card(user_id, access_code)
if card_id is None:
self.logger.error("Failed to register card!")
user_id = -1
self.logger.info(
f"Register access code {access_code} -> user_id {user_id}"
)
else:
self.logger.info(f"Registration blocked!: access code {access_code}")
return user_id

View File

@@ -1,5 +1,9 @@
import logging, os
from typing import Any
import logging
import os
import ssl
from typing import Any, Union, Dict
from typing_extensions import Optional
class ServerConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
@@ -41,7 +45,7 @@ class ServerConfig:
@property
def ssl_cert(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "title", "ssl_cert", default="cert/title.pem"
self.__config, "core", "server", "ssl_cert", default="cert/title.pem"
)
@property
@@ -176,6 +180,60 @@ class DatabaseConfig:
self.__config, "core", "database", "protocol", default="mysql"
)
@property
def ssl_enabled(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "database", "ssl_enabled", default=False
)
@property
def ssl_cafile(self) -> Optional[str]:
return CoreConfig.get_config_field(
self.__config, "core", "database", "ssl_cafile", default=None
)
@property
def ssl_capath(self) -> Optional[str]:
return CoreConfig.get_config_field(
self.__config, "core", "database", "ssl_capath", default=None
)
@property
def ssl_cert(self) -> Optional[str]:
return CoreConfig.get_config_field(
self.__config, "core", "database", "ssl_cert", default=None
)
@property
def ssl_key(self) -> Optional[str]:
return CoreConfig.get_config_field(
self.__config, "core", "database", "ssl_key", default=None
)
@property
def ssl_key_password(self) -> Optional[str]:
return CoreConfig.get_config_field(
self.__config, "core", "database", "ssl_key_password", default=None
)
@property
def ssl_verify_identity(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "database", "ssl_verify_identity", default=True
)
@property
def ssl_verify_cert(self) -> Optional[Union[str, bool]]:
return CoreConfig.get_config_field(
self.__config, "core", "database", "ssl_verify_cert", default=None
)
@property
def ssl_ciphers(self) -> Optional[str]:
return CoreConfig.get_config_field(
self.__config, "core", "database", "ssl_ciphers", default=None
)
@property
def sha2_password(self) -> bool:
return CoreConfig.get_config_field(
@@ -202,6 +260,53 @@ class DatabaseConfig:
self.__config, "core", "database", "memcached_host", default="localhost"
)
def create_ssl_context_if_enabled(self):
if not self.ssl_enabled:
return
no_ca = (
self.ssl_cafile is None
and self.ssl_capath is None
)
ctx = ssl.create_default_context(
cafile=self.ssl_cafile,
capath=self.ssl_capath,
)
ctx.check_hostname = not no_ca and self.ssl_verify_identity
if self.ssl_verify_cert is None:
ctx.verify_mode = ssl.CERT_NONE if no_ca else ssl.CERT_REQUIRED
elif isinstance(self.ssl_verify_cert, bool):
ctx.verify_mode = (
ssl.CERT_REQUIRED
if self.ssl_verify_cert
else ssl.CERT_NONE
)
elif isinstance(self.ssl_verify_cert, str):
value = self.ssl_verify_cert.lower()
if value in ("none", "0", "false", "no"):
ctx.verify_mode = ssl.CERT_NONE
elif value == "optional":
ctx.verify_mode = ssl.CERT_OPTIONAL
elif value in ("required", "1", "true", "yes"):
ctx.verify_mode = ssl.CERT_REQUIRED
else:
ctx.verify_mode = ssl.CERT_NONE if no_ca else ssl.CERT_REQUIRED
if self.ssl_cert:
ctx.load_cert_chain(
self.ssl_cert,
self.ssl_key,
self.ssl_key_password,
)
if self.ssl_ciphers:
ctx.set_ciphers(self.ssl_ciphers)
return ctx
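# Behaviour sketch for the helper above (hypothetical core.yaml values): with ssl_enabled
# true and only ssl_cafile set, the returned context verifies the server certificate
# against that CA and checks the hostname; with no CA material and no explicit
# ssl_verify_cert, verification falls back to CERT_NONE. String values of ssl_verify_cert
# ("none", "optional", "required", etc.) map onto the ssl module constants as handled
# above, and ssl_cert/ssl_key/ssl_key_password load a client certificate chain.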
class FrontendConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@@ -257,7 +362,7 @@ class AllnetConfig:
)
@property
def allow_online_updates(self) -> int:
def allow_online_updates(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "allnet", "allow_online_updates", default=False
)
@@ -268,6 +373,17 @@ class AllnetConfig:
self.__config, "core", "allnet", "update_cfg_folder", default=""
)
@property
def save_billing(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "allnet", "save_billing", default=False
)
@property
def allnet_lite_keys(self) -> Dict:
return CoreConfig.get_config_field(
self.__config, "core", "allnet", "allnet_lite_keys", default={}
)
class BillingConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@@ -358,6 +474,28 @@ class AimedbConfig:
self.__config, "core", "aimedb", "id_lifetime_seconds", default=86400
)
class ChimedbConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@property
def enable(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "chimedb", "enable", default=True
)
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(
CoreConfig.get_config_field(
self.__config, "core", "chimedb", "loglevel", default="info"
)
)
@property
def key(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "chimedb", "key", default=""
)
class MuchaConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@@ -379,6 +517,7 @@ class CoreConfig(dict):
self.allnet = AllnetConfig(self)
self.billing = BillingConfig(self)
self.aimedb = AimedbConfig(self)
self.chimedb = ChimedbConfig(self)
self.mucha = MuchaConfig(self)
@classmethod

View File

@@ -45,6 +45,14 @@ class AllnetCountryCode(Enum):
SOUTH_KOREA = "KOR"
TAIWAN = "TWN"
CHINA = "CHN"
AUSTRALIA = "AUS"
INDONESIA = "IDN"
MYANMAR = "MMR"
MALAYSIA = "MYS"
NEW_ZEALAND = "NZL"
PHILIPPINES = "PHL"
THAILAND = "THA"
VIETNAM = "VNM"
class AllnetJapanRegionId(Enum):

View File

@@ -1,8 +1,18 @@
from __future__ import with_statement
from alembic import context
from sqlalchemy import engine_from_config, pool
import asyncio
import os
from pathlib import Path
import threading
from logging.config import fileConfig
import yaml
from alembic import context
from sqlalchemy import pool
from sqlalchemy.engine import Connection
from sqlalchemy.ext.asyncio import async_engine_from_config
from core.config import CoreConfig
from core.data.schema.base import metadata
# this is the Alembic Config object, which provides
@@ -37,20 +47,29 @@ def run_migrations_offline():
script output.
"""
raise Exception('Not implemented or configured!')
raise Exception("Not implemented or configured!")
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url, target_metadata=target_metadata, literal_binds=True)
context.configure(url=url, target_metadata=target_metadata, literal_binds=True)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online():
"""Run migrations in 'online' mode.
def do_run_migrations(connection: Connection) -> None:
context.configure(
connection=connection,
target_metadata=target_metadata,
compare_type=True,
compare_server_default=True,
)
In this scenario we need to create an Engine
with context.begin_transaction():
context.run_migrations()
async def run_async_migrations() -> None:
"""In this scenario we need to create an Engine
and associate a connection with the context.
"""
@@ -59,21 +78,42 @@ def run_migrations_online():
for override in overrides:
ini_section[override] = overrides[override]
connectable = engine_from_config(
core_config = CoreConfig()
with (Path("../../..") / os.environ["ARTEMIS_CFG_DIR"] / "core.yaml").open(encoding="utf-8") as f:
core_config.update(yaml.safe_load(f))
connectable = async_engine_from_config(
ini_section,
prefix='sqlalchemy.',
poolclass=pool.NullPool)
poolclass=pool.NullPool,
connect_args={
"charset": "utf8mb4",
"ssl": core_config.database.create_ssl_context_if_enabled(),
}
)
with connectable.connect() as connection:
context.configure(
connection=connection,
target_metadata=target_metadata,
compare_type=True,
compare_server_default=True,
)
async with connectable.connect() as connection:
await connection.run_sync(do_run_migrations)
await connectable.dispose()
def run_migrations_online():
try:
loop = asyncio.get_running_loop()
except RuntimeError:
# there's no event loop
asyncio.run(run_async_migrations())
else:
# there's currently an event loop and trying to wait for a coroutine
# to finish without using `await` is pretty wormy. nested event loops
# are explicitly forbidden by asyncio.
#
# take the easy way out, spawn it in another thread.
thread = threading.Thread(target=asyncio.run, args=(run_async_migrations(),))
thread.start()
thread.join()
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()

View File

@@ -0,0 +1,30 @@
"""ONGEKI update ongeki_static_tech_music_uk
Revision ID: 1d0014d35220
Revises: 91c682918b67
Create Date: 2025-03-26 20:44:55.590992
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '1d0014d35220'
down_revision = '91c682918b67'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_constraint('ongeki_static_tech_music_uk', 'ongeki_static_tech_music', type_='unique')
op.create_unique_constraint('ongeki_static_tech_music_uk', 'ongeki_static_tech_music', ['version', 'eventId', 'musicId', 'level'])
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_constraint('ongeki_static_tech_music_uk', 'ongeki_static_tech_music', type_='unique')
op.create_unique_constraint('ongeki_static_tech_music_uk', 'ongeki_static_tech_music', ['version', 'musicId'])
# ### end Alembic commands ###

View File

@@ -0,0 +1,164 @@
"""acc_opt_tables
Revision ID: 263884e774cc
Revises: 1d0014d35220
Create Date: 2025-04-07 18:05:53.349320
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '263884e774cc'
down_revision = '1d0014d35220'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('chuni_static_opt',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('version', sa.INTEGER(), nullable=False),
sa.Column('name', sa.VARCHAR(length=4), nullable=False),
sa.Column('sequence', sa.INTEGER(), nullable=False),
sa.Column('whenRead', sa.TIMESTAMP(), server_default=sa.text('now()'), nullable=False),
sa.Column('isEnable', sa.BOOLEAN(), server_default='1', nullable=False),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('version', 'name', name='chuni_static_opt_uk'),
mysql_charset='utf8mb4'
)
op.create_table('cm_static_opts',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('version', sa.INTEGER(), nullable=False),
sa.Column('name', sa.VARCHAR(length=4), nullable=False),
sa.Column('sequence', sa.INTEGER(), nullable=True),
sa.Column('gekiVersion', sa.INTEGER(), nullable=True),
sa.Column('gekiReleaseVer', sa.INTEGER(), nullable=True),
sa.Column('maiVersion', sa.INTEGER(), nullable=True),
sa.Column('maiReleaseVer', sa.INTEGER(), nullable=True),
sa.Column('whenRead', sa.TIMESTAMP(), server_default=sa.text('now()'), nullable=False),
sa.Column('isEnable', sa.BOOLEAN(), server_default='1', nullable=False),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('version', 'name', name='cm_static_opts_uk'),
mysql_charset='utf8mb4'
)
op.create_table('mai2_static_opt',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('version', sa.INTEGER(), nullable=False),
sa.Column('name', sa.VARCHAR(length=4), nullable=False),
sa.Column('sequence', sa.INTEGER(), nullable=False),
sa.Column('cmReleaseVer', sa.INTEGER(), nullable=False),
sa.Column('whenRead', sa.TIMESTAMP(), server_default=sa.text('now()'), nullable=False),
sa.Column('isEnable', sa.BOOLEAN(), server_default='1', nullable=False),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('version', 'name', name='mai2_static_opt_uk'),
mysql_charset='utf8mb4'
)
op.create_table('ongeki_static_opt',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('version', sa.INTEGER(), nullable=False),
sa.Column('name', sa.VARCHAR(length=4), nullable=False),
sa.Column('sequence', sa.INTEGER(), nullable=False),
sa.Column('cmReleaseVer', sa.INTEGER(), nullable=False),
sa.Column('whenRead', sa.TIMESTAMP(), server_default=sa.text('now()'), nullable=False),
sa.Column('isEnable', sa.BOOLEAN(), server_default='1', nullable=False),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('version', 'name', name='ongeki_static_opt_uk'),
mysql_charset='utf8mb4'
)
op.add_column('chuni_static_avatar', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_avatar', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_cards', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_cards', 'cm_static_opts', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_character', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_character', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_charge', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_charge', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_events', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_events', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_gachas', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_gachas', 'cm_static_opts', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_login_bonus', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_login_bonus', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_login_bonus_preset', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_login_bonus_preset', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_map_icon', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_map_icon', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_music', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_music', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_system_voice', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_system_voice', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('chuni_static_trophy', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_trophy', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('mai2_static_cards', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'mai2_static_cards', 'cm_static_opts', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('mai2_static_event', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'mai2_static_event', 'mai2_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('mai2_static_music', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'mai2_static_music', 'mai2_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('mai2_static_ticket', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'mai2_static_ticket', 'mai2_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('ongeki_static_cards', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'ongeki_static_cards', 'ongeki_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('ongeki_static_events', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'ongeki_static_events', 'ongeki_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('ongeki_static_gachas', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'ongeki_static_gachas', 'cm_static_opts', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('ongeki_static_music', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'ongeki_static_music', 'ongeki_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
op.add_column('ongeki_static_rewards', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'ongeki_static_rewards', 'ongeki_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_constraint("ongeki_static_rewards_ibfk_1", 'ongeki_static_rewards', type_='foreignkey')
op.drop_column('ongeki_static_rewards', 'opt')
op.drop_constraint("ongeki_static_music_ibfk_1", 'ongeki_static_music', type_='foreignkey')
op.drop_column('ongeki_static_music', 'opt')
op.drop_constraint("ongeki_static_gachas_ibfk_1", 'ongeki_static_gachas', type_='foreignkey')
op.drop_column('ongeki_static_gachas', 'opt')
op.drop_constraint("ongeki_static_events_ibfk_1", "ongeki_static_events", type_='foreignkey')
op.drop_column('ongeki_static_events', 'opt')
op.drop_constraint("ongeki_static_cards_ibfk_1", "ongeki_static_cards", type_='foreignkey')
op.drop_column('ongeki_static_cards', 'opt')
op.drop_constraint("mai2_static_ticket_ibfk_1", "mai2_static_ticket", type_='foreignkey')
op.drop_column('mai2_static_ticket', 'opt')
op.drop_constraint("mai2_static_music_ibfk_1", "mai2_static_music", type_='foreignkey')
op.drop_column('mai2_static_music', 'opt')
op.drop_constraint("mai2_static_event_ibfk_1", "mai2_static_event", type_='foreignkey')
op.drop_column('mai2_static_event', 'opt')
op.drop_constraint("mai2_static_cards_ibfk_1", "mai2_static_cards", type_='foreignkey')
op.drop_column('mai2_static_cards', 'opt')
op.drop_constraint("chuni_static_trophy_ibfk_1", "chuni_static_trophy", type_='foreignkey')
op.drop_column('chuni_static_trophy', 'opt')
op.drop_constraint("chuni_static_system_voice_ibfk_1", "chuni_static_system_voice", type_='foreignkey')
op.drop_column('chuni_static_system_voice', 'opt')
op.drop_constraint("chuni_static_music_ibfk_1", "chuni_static_music", type_='foreignkey')
op.drop_column('chuni_static_music', 'opt')
op.drop_constraint("chuni_static_map_icon_ibfk_1", "chuni_static_map_icon", type_='foreignkey')
op.drop_column('chuni_static_map_icon', 'opt')
op.drop_constraint("chuni_static_login_bonus_preset_ibfk_1", "chuni_static_login_bonus_preset", type_='foreignkey')
op.drop_column('chuni_static_login_bonus_preset', 'opt')
op.drop_constraint("chuni_static_login_bonus_ibfk_2", "chuni_static_login_bonus", type_='foreignkey')
op.drop_column('chuni_static_login_bonus', 'opt')
op.drop_constraint("chuni_static_gachas_ibfk_1", "chuni_static_gachas", type_='foreignkey')
op.drop_column('chuni_static_gachas', 'opt')
op.drop_constraint("chuni_static_events_ibfk_1", "chuni_static_events", type_='foreignkey')
op.drop_column('chuni_static_events', 'opt')
op.drop_constraint("chuni_static_charge_ibfk_1", "chuni_static_charge", type_='foreignkey')
op.drop_column('chuni_static_charge', 'opt')
op.drop_constraint("chuni_static_character_ibfk_1", "chuni_static_character", type_='foreignkey')
op.drop_column('chuni_static_character', 'opt')
op.drop_constraint("chuni_static_cards_ibfk_1", "chuni_static_cards", type_='foreignkey')
op.drop_column('chuni_static_cards', 'opt')
op.drop_constraint("chuni_static_avatar_ibfk_1", "chuni_static_avatar", type_='foreignkey')
op.drop_column('chuni_static_avatar', 'opt')
op.drop_table('ongeki_static_opt')
op.drop_table('mai2_static_opt')
op.drop_table('cm_static_opts')
op.drop_table('chuni_static_opt')
# ### end Alembic commands ###

View File

@@ -0,0 +1,66 @@
"""add_billing_tables
Revision ID: 27e3434740df
Revises: ae364c078429
Create Date: 2025-04-17 18:32:06.008601
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '27e3434740df'
down_revision = 'ae364c078429'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('machine_billing_charge',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('machine', sa.Integer(), nullable=False),
sa.Column('game_id', sa.CHAR(length=5), nullable=False),
sa.Column('game_ver', sa.FLOAT(), nullable=False),
sa.Column('play_count', sa.INTEGER(), nullable=False),
sa.Column('play_limit', sa.INTEGER(), nullable=False),
sa.Column('product_code', sa.INTEGER(), nullable=False),
sa.Column('product_count', sa.INTEGER(), nullable=False),
sa.Column('func_type', sa.INTEGER(), nullable=False),
sa.Column('player_number', sa.INTEGER(), nullable=False),
sa.ForeignKeyConstraint(['machine'], ['machine.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
mysql_charset='utf8mb4'
)
op.create_table('machine_billing_credit',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('machine', sa.Integer(), nullable=False),
sa.Column('chute_type', sa.INTEGER(), nullable=False),
sa.Column('service_type', sa.INTEGER(), nullable=False),
sa.Column('operation_type', sa.INTEGER(), nullable=False),
sa.Column('coin_rate0', sa.INTEGER(), nullable=False),
sa.Column('coin_rate1', sa.INTEGER(), nullable=False),
sa.Column('coin_bonus', sa.INTEGER(), nullable=False),
sa.Column('credit_rate', sa.INTEGER(), nullable=False),
sa.Column('coin_count_slot0', sa.INTEGER(), nullable=False),
sa.Column('coin_count_slot1', sa.INTEGER(), nullable=False),
sa.Column('coin_count_slot2', sa.INTEGER(), nullable=False),
sa.Column('coin_count_slot3', sa.INTEGER(), nullable=False),
sa.Column('coin_count_slot4', sa.INTEGER(), nullable=False),
sa.Column('coin_count_slot5', sa.INTEGER(), nullable=False),
sa.Column('coin_count_slot6', sa.INTEGER(), nullable=False),
sa.Column('coin_count_slot7', sa.INTEGER(), nullable=False),
sa.ForeignKeyConstraint(['machine'], ['machine.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('machine'),
mysql_charset='utf8mb4'
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('machine_billing_credit')
op.drop_table('machine_billing_charge')
# ### end Alembic commands ###

View File

@@ -0,0 +1,28 @@
"""mai2_buddies_plus
Revision ID: 28443e2da5b8
Revises: 5ea73f89d982
Create Date: 2024-09-15 20:44:02.351819
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '28443e2da5b8'
down_revision = '5ea73f89d982'
branch_labels = None
depends_on = None
def upgrade():
op.add_column('mai2_profile_detail', sa.Column('point', sa.Integer()))
op.add_column('mai2_profile_detail', sa.Column('totalPoint', sa.Integer()))
op.add_column('mai2_profile_detail', sa.Column('friendRegistSkip', sa.SmallInteger()))
def downgrade():
op.drop_column('mai2_profile_detail', 'point')
op.drop_column('mai2_profile_detail', 'totalPoint')
op.drop_column('mai2_profile_detail', 'friendRegistSkip')

View File

@@ -0,0 +1,31 @@
"""chuni_subtrophy_db_fix
Revision ID: 318d52559e83
Revises: 8b57e9646449
Create Date: 2026-01-08 19:13:29.803912
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '318d52559e83'
down_revision = '8b57e9646449'
branch_labels = None
depends_on = None
def upgrade():
op.alter_column('chuni_profile_data', 'trophyIdSub1', existing_type=mysql.INTEGER(), server_default='-1')
op.alter_column('chuni_profile_data', 'trophyIdSub2', existing_type=mysql.INTEGER(), server_default='-1')
# fix any current profiles where the bad defaults were used
op.execute("UPDATE chuni_profile_data SET trophyIdSub1=-1 WHERE trophyIdSub1 IS NULL")
op.execute("UPDATE chuni_profile_data SET trophyIdSub2=-1 WHERE trophyIdSub2 IS NULL")
def downgrade():
# don't bother "unfixing" the table
pass

View File

@@ -0,0 +1,122 @@
"""chuni_ui_overhaul
Revision ID: 41f77ef50588
Revises: d8cd1fa04c2a
Create Date: 2024-11-02 13:27:45.839787
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '41f77ef50588'
down_revision = 'd8cd1fa04c2a'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('chuni_static_avatar', sa.Column('sortName', mysql.VARCHAR(length=255), nullable=True))
op.add_column('chuni_static_avatar', sa.Column('isEnabled', mysql.TINYINT(display_width=1), server_default=sa.text('1'), autoincrement=False, nullable=True))
op.add_column('chuni_static_avatar', sa.Column('defaultHave', mysql.TINYINT(display_width=1), server_default=sa.text('0'), autoincrement=False, nullable=True))
op.create_table('chuni_static_character',
sa.Column('id', mysql.INTEGER(display_width=11), autoincrement=True, nullable=False),
sa.Column('version', mysql.INTEGER(display_width=11), autoincrement=False, nullable=False),
sa.Column('characterId', mysql.INTEGER(display_width=11), autoincrement=False, nullable=True),
sa.Column('name', mysql.VARCHAR(length=255), nullable=True),
sa.Column('sortName', mysql.VARCHAR(length=255), nullable=True),
sa.Column('worksName', mysql.VARCHAR(length=255), nullable=True),
sa.Column('rareType', mysql.INTEGER(display_width=11), server_default=sa.text('0'), autoincrement=False, nullable=True),
sa.Column('imagePath1', mysql.VARCHAR(length=255), nullable=True),
sa.Column('imagePath2', mysql.VARCHAR(length=255), nullable=True),
sa.Column('imagePath3', mysql.VARCHAR(length=255), nullable=True),
sa.Column('isEnabled', mysql.TINYINT(display_width=1), server_default=sa.text('1'), autoincrement=False, nullable=True),
sa.Column('defaultHave', mysql.TINYINT(display_width=1), server_default=sa.text('0'), autoincrement=False, nullable=True),
sa.PrimaryKeyConstraint('id'),
mysql_collate='utf8mb4_general_ci',
mysql_default_charset='utf8mb4',
mysql_engine='InnoDB'
)
op.create_index('chuni_static_character_uk', 'chuni_static_character', ['version', 'characterId'], unique=True)
op.create_table('chuni_static_map_icon',
sa.Column('id', mysql.INTEGER(display_width=11), autoincrement=True, nullable=False),
sa.Column('version', mysql.INTEGER(display_width=11), autoincrement=False, nullable=False),
sa.Column('mapIconId', mysql.INTEGER(display_width=11), autoincrement=False, nullable=True),
sa.Column('name', mysql.VARCHAR(length=255), nullable=True),
sa.Column('sortName', mysql.VARCHAR(length=255), nullable=True),
sa.Column('iconPath', mysql.VARCHAR(length=255), nullable=True),
sa.Column('isEnabled', mysql.TINYINT(display_width=1), server_default=sa.text('1'), autoincrement=False, nullable=True),
sa.Column('defaultHave', mysql.TINYINT(display_width=1), server_default=sa.text('0'), autoincrement=False, nullable=True),
sa.PrimaryKeyConstraint('id'),
mysql_collate='utf8mb4_general_ci',
mysql_default_charset='utf8mb4',
mysql_engine='InnoDB'
)
op.create_index('chuni_static_mapicon_uk', 'chuni_static_map_icon', ['version', 'mapIconId'], unique=True)
op.create_table('chuni_static_nameplate',
sa.Column('id', mysql.INTEGER(display_width=11), autoincrement=True, nullable=False),
sa.Column('version', mysql.INTEGER(display_width=11), autoincrement=False, nullable=False),
sa.Column('nameplateId', mysql.INTEGER(display_width=11), autoincrement=False, nullable=True),
sa.Column('name', mysql.VARCHAR(length=255), nullable=True),
sa.Column('texturePath', mysql.VARCHAR(length=255), nullable=True),
sa.Column('isEnabled', mysql.TINYINT(display_width=1), server_default=sa.text('1'), autoincrement=False, nullable=True),
sa.Column('defaultHave', mysql.TINYINT(display_width=1), server_default=sa.text('0'), autoincrement=False, nullable=True),
sa.Column('sortName', mysql.VARCHAR(length=255), nullable=True),
sa.PrimaryKeyConstraint('id'),
mysql_collate='utf8mb4_general_ci',
mysql_default_charset='utf8mb4',
mysql_engine='InnoDB'
)
op.create_index('chuni_static_nameplate_uk', 'chuni_static_nameplate', ['version', 'nameplateId'], unique=True)
op.create_table('chuni_static_trophy',
sa.Column('id', mysql.INTEGER(display_width=11), autoincrement=True, nullable=False),
sa.Column('version', mysql.INTEGER(display_width=11), autoincrement=False, nullable=False),
sa.Column('trophyId', mysql.INTEGER(display_width=11), autoincrement=False, nullable=True),
sa.Column('name', mysql.VARCHAR(length=255), nullable=True),
sa.Column('rareType', mysql.TINYINT(display_width=11), server_default=sa.text('0'), autoincrement=False, nullable=True),
sa.Column('isEnabled', mysql.TINYINT(display_width=1), server_default=sa.text('1'), autoincrement=False, nullable=True),
sa.Column('defaultHave', mysql.TINYINT(display_width=1), server_default=sa.text('0'), autoincrement=False, nullable=True),
sa.PrimaryKeyConstraint('id'),
mysql_collate='utf8mb4_general_ci',
mysql_default_charset='utf8mb4',
mysql_engine='InnoDB'
)
op.create_index('chuni_static_trophy_uk', 'chuni_static_trophy', ['version', 'trophyId'], unique=True)
op.create_table('chuni_static_system_voice',
sa.Column('id', mysql.INTEGER(display_width=11), autoincrement=True, nullable=False),
sa.Column('version', mysql.INTEGER(display_width=11), autoincrement=False, nullable=False),
sa.Column('voiceId', mysql.INTEGER(display_width=11), autoincrement=False, nullable=True),
sa.Column('name', mysql.VARCHAR(length=255), nullable=True),
sa.Column('sortName', mysql.VARCHAR(length=255), nullable=True),
sa.Column('imagePath', mysql.VARCHAR(length=255), nullable=True),
sa.Column('isEnabled', mysql.TINYINT(display_width=1), server_default=sa.text('1'), autoincrement=False, nullable=True),
sa.Column('defaultHave', mysql.TINYINT(display_width=1), server_default=sa.text('0'), autoincrement=False, nullable=True),
sa.PrimaryKeyConstraint('id'),
mysql_collate='utf8mb4_general_ci',
mysql_default_charset='utf8mb4',
mysql_engine='InnoDB'
)
op.create_index('chuni_static_systemvoice_uk', 'chuni_static_system_voice', ['version', 'voiceId'], unique=True)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index('chuni_static_systemvoice_uk', table_name='chuni_static_system_voice')
op.drop_table('chuni_static_system_voice')
op.drop_index('chuni_static_trophy_uk', table_name='chuni_static_trophy')
op.drop_table('chuni_static_trophy')
op.drop_index('chuni_static_nameplate_uk', table_name='chuni_static_nameplate')
op.drop_table('chuni_static_nameplate')
op.drop_index('chuni_static_mapicon_uk', table_name='chuni_static_map_icon')
op.drop_table('chuni_static_map_icon')
op.drop_index('chuni_static_character_uk', table_name='chuni_static_character')
op.drop_table('chuni_static_character')
op.drop_column('chuni_static_avatar', 'defaultHave')
op.drop_column('chuni_static_avatar', 'isEnabled')
op.drop_column('chuni_static_avatar', 'sortName')
# ### end Alembic commands ###

View File

@@ -0,0 +1,85 @@
"""CHUNITHM VERSE support
Revision ID: 49c295e89cd4
Revises: 7070a6fa8cdc
Create Date: 2025-03-09 14:10:03.067328
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
from sqlalchemy.sql import func
# revision identifiers, used by Alembic.
revision = "49c295e89cd4"
down_revision = "7070a6fa8cdc"
branch_labels = None
depends_on = None
def upgrade():
### commands auto generated by Alembic - please adjust! ###
op.add_column("chuni_profile_data", sa.Column("trophyIdSub1", sa.Integer()))
op.add_column("chuni_profile_data", sa.Column("trophyIdSub2", sa.Integer()))
op.add_column("chuni_score_playlog", sa.Column("monthPoint", sa.Integer()))
op.add_column("chuni_score_playlog", sa.Column("eventPoint", sa.Integer()))
op.create_table(
"chuni_static_unlock_challenge",
sa.Column("id", sa.Integer(), primary_key=True, nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("unlockChallengeId", sa.Integer(), nullable=False),
sa.Column("name", sa.String(length=255)),
sa.Column("isEnabled", sa.Boolean(), server_default="1"),
sa.Column("startDate", sa.TIMESTAMP(), server_default=func.now()),
sa.Column("courseId1", sa.Integer()),
sa.Column("courseId2", sa.Integer()),
sa.Column("courseId3", sa.Integer()),
sa.Column("courseId4", sa.Integer()),
sa.Column("courseId5", sa.Integer()),
sa.UniqueConstraint(
"version", "unlockChallengeId", name="chuni_static_unlock_challenge_uk"
),
mysql_charset="utf8mb4",
)
op.create_table(
"chuni_item_unlock_challenge",
sa.Column("id", sa.Integer(), primary_key=True, nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column(
"user",
sa.Integer(),
sa.ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
sa.Column("unlockChallengeId", sa.Integer(), nullable=False),
sa.Column("status", sa.Integer()),
sa.Column("clearCourseId", sa.Integer()),
sa.Column("conditionType", sa.Integer()),
sa.Column("score", sa.Integer()),
sa.Column("life", sa.Integer()),
sa.Column("clearDate", sa.TIMESTAMP(), server_default=func.now()),
sa.UniqueConstraint(
"version",
"user",
"unlockChallengeId",
name="chuni_item_unlock_challenge_uk",
),
mysql_charset="utf8mb4",
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column("chuni_score_playlog", "eventPoint")
op.drop_column("chuni_score_playlog", "monthPoint")
op.drop_column("chuni_profile_data", "trophyIdSub2")
op.drop_column("chuni_profile_data", "trophyIdSub1")
op.drop_table("chuni_static_unlock_challenge")
op.drop_table("chuni_item_unlock_challenge")
# ### end Alembic commands ###

View File

@@ -0,0 +1,43 @@
"""mai2_intimacy
Revision ID: 54a84103b84e
Revises: bc91c1206dca
Create Date: 2024-09-16 17:47:49.164546
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy import Column, Integer, UniqueConstraint
# revision identifiers, used by Alembic.
revision = '54a84103b84e'
down_revision = 'bc91c1206dca'
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"mai2_user_intimate",
Column("id", Integer, primary_key=True, nullable=False),
Column("user", Integer, nullable=False),
Column("partnerId", Integer, nullable=False),
Column("intimateLevel", Integer, nullable=False),
Column("intimateCountRewarded", Integer, nullable=False),
UniqueConstraint("user", "partnerId", name="mai2_user_intimate_uk"),
mysql_charset="utf8mb4",
)
op.create_foreign_key(
None,
"mai2_user_intimate",
"aime_user",
["user"],
["id"],
ondelete="cascade",
onupdate="cascade",
)
def downgrade():
op.drop_table("mai2_user_intimate")

View File

@@ -0,0 +1,53 @@
"""Mai2 PRiSM support
Revision ID: 5cf98cfe52ad
Revises: 263884e774cc
Create Date: 2025-04-08 08:00:51.243089
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '5cf98cfe52ad'
down_revision = '263884e774cc'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('mai2_score_kaleidxscope',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user', sa.Integer(), nullable=False),
sa.Column('gateId', sa.Integer(), nullable=True),
sa.Column('isGateFound', sa.Boolean(), nullable=True),
sa.Column('isKeyFound', sa.Boolean(), nullable=True),
sa.Column('isClear', sa.Boolean(), nullable=True),
sa.Column('totalRestLife', sa.Integer(), nullable=True),
sa.Column('totalAchievement', sa.Integer(), nullable=True),
sa.Column('totalDeluxscore', sa.Integer(), nullable=True),
sa.Column('bestAchievement', sa.Integer(), nullable=True),
sa.Column('bestDeluxscore', sa.Integer(), nullable=True),
sa.Column('bestAchievementDate', sa.String(length=25), nullable=True),
sa.Column('bestDeluxscoreDate', sa.String(length=25), nullable=True),
sa.Column('playCount', sa.Integer(), nullable=True),
sa.Column('clearDate', sa.String(length=25), nullable=True),
sa.Column('lastPlayDate', sa.String(length=25), nullable=True),
sa.Column('isInfoWatched', sa.Boolean(), nullable=True),
sa.ForeignKeyConstraint(['user'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('user', 'gateId', name='mai2_score_best_uk'),
mysql_charset='utf8mb4'
)
op.add_column('mai2_playlog', sa.Column('extBool2', sa.Boolean(), nullable=True, server_default=sa.text("NULL")))
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('mai2_playlog', 'extBool2')
op.drop_table('mai2_score_kaleidxscope')
# ### end Alembic commands ###

View File

@@ -0,0 +1,42 @@
"""update_channels
Revision ID: 7070a6fa8cdc
Revises: f6007bbf057d
Create Date: 2025-09-27 16:09:55.853051
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '7070a6fa8cdc'
down_revision = 'f6007bbf057d'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('machine_update',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('game', sa.CHAR(length=4), nullable=False),
sa.Column('version', sa.VARCHAR(length=15), nullable=False),
sa.Column('channel', sa.VARCHAR(length=260), nullable=False),
sa.Column('app_ini', sa.VARCHAR(length=260), nullable=True),
sa.Column('opt_ini', sa.VARCHAR(length=260), nullable=True),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('game', 'version', 'channel', name='machine_update_uk'),
mysql_charset='utf8mb4'
)
op.add_column('machine', sa.Column('ota_channel', sa.VARCHAR(length=260), nullable=True))
op.drop_column('machine', 'ota_enable')
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('machine', sa.Column('ota_enable', mysql.TINYINT(display_width=1), autoincrement=False, nullable=True))
op.drop_column('machine', 'ota_channel')
op.drop_table('machine_update')
# ### end Alembic commands ###

View File

@@ -0,0 +1,98 @@
"""CHUNITHM X-VERSE
Revision ID: 8b57e9646449
Revises: bdf710616ba4
Create Date: 2025-12-12 16:09:07.530809
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "8b57e9646449"
down_revision = "bdf710616ba4"
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column(
"chuni_profile_data",
sa.Column("stageId", sa.Integer(), nullable=False, server_default="99999"),
)
op.create_table(
"chuni_static_linked_verse",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("linkedVerseId", sa.Integer(), nullable=False),
sa.Column("name", sa.String(length=255), nullable=True),
sa.Column("isEnabled", sa.Boolean(), server_default="1", nullable=False),
sa.Column(
"startDate", sa.TIMESTAMP(), server_default=sa.text("now()"), nullable=True
),
sa.Column("courseId1", sa.Integer(), nullable=True),
sa.Column("courseId2", sa.Integer(), nullable=True),
sa.Column("courseId3", sa.Integer(), nullable=True),
sa.Column("courseId4", sa.Integer(), nullable=True),
sa.Column("courseId5", sa.Integer(), nullable=True),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint(
"version", "linkedVerseId", name="chuni_static_linked_verse_pk"
),
mysql_charset="utf8mb4",
)
op.create_table(
"chuni_item_linked_verse",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("user", sa.Integer(), nullable=False),
sa.Column("linkedVerseId", sa.Integer(), nullable=False),
sa.Column("progress", sa.String(length=255), nullable=True),
sa.Column("statusOpen", sa.Integer(), nullable=True),
sa.Column("statusUnlock", sa.Integer(), nullable=True),
sa.Column("isFirstClear", sa.Integer(), nullable=True),
sa.Column("numClear", sa.Integer(), nullable=True),
sa.Column("clearCourseId", sa.Integer(), nullable=True),
sa.Column("clearCourseLevel", sa.Integer(), nullable=True),
sa.Column("clearScore", sa.Integer(), nullable=True),
sa.Column("clearDate", sa.String(length=25), nullable=True),
sa.Column("clearUserId1", sa.Integer(), nullable=True),
sa.Column("clearUserId2", sa.Integer(), nullable=True),
sa.Column("clearUserId3", sa.Integer(), nullable=True),
sa.Column("clearUserName0", sa.String(length=20), nullable=True),
sa.Column("clearUserName1", sa.String(length=20), nullable=True),
sa.Column("clearUserName2", sa.String(length=20), nullable=True),
sa.Column("clearUserName3", sa.String(length=20), nullable=True),
sa.ForeignKeyConstraint(
["user"], ["aime_user.id"], onupdate="cascade", ondelete="cascade"
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("user", "linkedVerseId", name="chuni_item_linked_verse_uk"),
mysql_charset="utf8mb4",
)
op.create_table(
"chuni_static_stage",
sa.Column("id", sa.Integer(), primary_key=True, nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("stageId", sa.Integer(), nullable=False),
sa.Column("name", sa.String(length=255)),
sa.Column("imagePath", sa.String(length=255)),
sa.Column("isEnabled", sa.Boolean(), server_default="1"),
sa.Column("defaultHave", sa.Boolean(), server_default="0"),
sa.Column("opt", sa.BIGINT(), sa.ForeignKey("chuni_static_opt.id", ondelete="SET NULL", onupdate="cascade")),
sa.UniqueConstraint(
"version", "stageId", name="chuni_static_stage_uk"
),
mysql_charset="utf8mb4",
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column("chuni_profile_data", "stageId")
op.drop_table("chuni_item_linked_verse")
op.drop_table("chuni_static_linked_verse")
op.drop_table("chuni_static_stage")
# ### end Alembic commands ###

View File

@@ -0,0 +1,92 @@
"""chuni_fix_total_scores
Revision ID: 91c682918b67
Revises: 9c42e54a27fe
Create Date: 2025-03-29 11:19:46.063173
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '91c682918b67'
down_revision = '9c42e54a27fe'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.alter_column('chuni_profile_data', 'totalMapNum',
existing_type=mysql.INTEGER(display_width=11),
type_=sa.BigInteger(),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalHiScore',
existing_type=mysql.INTEGER(display_width=11),
type_=sa.BigInteger(),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalBasicHighScore',
existing_type=mysql.INTEGER(display_width=11),
type_=sa.BigInteger(),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalExpertHighScore',
existing_type=mysql.INTEGER(display_width=11),
type_=sa.BigInteger(),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalMasterHighScore',
existing_type=mysql.INTEGER(display_width=11),
type_=sa.BigInteger(),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalRepertoireCount',
existing_type=mysql.INTEGER(display_width=11),
type_=sa.BigInteger(),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalAdvancedHighScore',
existing_type=mysql.INTEGER(display_width=11),
type_=sa.BigInteger(),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalUltimaHighScore',
existing_type=mysql.INTEGER(display_width=11),
type_=sa.BigInteger(),
existing_nullable=True,
existing_server_default=sa.text("'0'"))
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.alter_column('chuni_profile_data', 'totalUltimaHighScore',
existing_type=sa.BigInteger(),
type_=mysql.INTEGER(display_width=11),
existing_nullable=True,
existing_server_default=sa.text("'0'"))
op.alter_column('chuni_profile_data', 'totalAdvancedHighScore',
existing_type=sa.BigInteger(),
type_=mysql.INTEGER(display_width=11),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalRepertoireCount',
existing_type=sa.BigInteger(),
type_=mysql.INTEGER(display_width=11),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalMasterHighScore',
existing_type=sa.BigInteger(),
type_=mysql.INTEGER(display_width=11),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalExpertHighScore',
existing_type=sa.BigInteger(),
type_=mysql.INTEGER(display_width=11),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalBasicHighScore',
existing_type=sa.BigInteger(),
type_=mysql.INTEGER(display_width=11),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalHiScore',
existing_type=sa.BigInteger(),
type_=mysql.INTEGER(display_width=11),
existing_nullable=True)
op.alter_column('chuni_profile_data', 'totalMapNum',
existing_type=sa.BigInteger(),
type_=mysql.INTEGER(display_width=11),
existing_nullable=True)
# ### end Alembic commands ###

View File

@@ -0,0 +1,40 @@
"""remove ongeki_static_music_ranking_list
Revision ID: 9c42e54a27fe
Revises: 41f77ef50588
Create Date: 2025-01-06 18:24:16.306748
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '9c42e54a27fe'
down_revision = '41f77ef50588'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index('ongeki_static_music_ranking_uk', table_name='ongeki_static_music_ranking_list')
op.drop_table('ongeki_static_music_ranking_list')
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('ongeki_static_music_ranking_list',
sa.Column('id', mysql.INTEGER(), autoincrement=True, nullable=False),
sa.Column('version', mysql.INTEGER(), autoincrement=False, nullable=False),
sa.Column('musicId', mysql.INTEGER(), autoincrement=False, nullable=False),
sa.Column('point', mysql.INTEGER(), autoincrement=False, nullable=False),
sa.Column('userName', mysql.VARCHAR(length=255), nullable=True),
sa.PrimaryKeyConstraint('id'),
mysql_collate='utf8mb4_0900_ai_ci',
mysql_default_charset='utf8mb4',
mysql_engine='InnoDB'
)
op.create_index('ongeki_static_music_ranking_uk', 'ongeki_static_music_ranking_list', ['version', 'musicId'], unique=True)
# ### end Alembic commands ###

View File

@@ -0,0 +1,30 @@
"""chuni_nameplate_add_opt
Revision ID: ae364c078429
Revises: 5cf98cfe52ad
Create Date: 2025-04-08 00:22:22.370660
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = 'ae364c078429'
down_revision = '5cf98cfe52ad'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('chuni_static_nameplate', sa.Column('opt', sa.BIGINT(), nullable=True))
op.create_foreign_key(None, 'chuni_static_nameplate', 'chuni_static_opt', ['opt'], ['id'], onupdate='cascade', ondelete='SET NULL')
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_constraint("chuni_static_nameplate_ibfk_1", 'chuni_static_nameplate', type_='foreignkey')
op.drop_column('chuni_static_nameplate', 'opt')
# ### end Alembic commands ###

View File

@@ -0,0 +1,24 @@
"""mai2_favorite_song_ordering
Revision ID: bc91c1206dca
Revises: 28443e2da5b8
Create Date: 2024-09-16 14:24:56.714066
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'bc91c1206dca'
down_revision = '28443e2da5b8'
branch_labels = None
depends_on = None
def upgrade():
op.add_column('mai2_item_favorite_music', sa.Column('orderId', sa.Integer(), nullable=True))
def downgrade():
op.drop_column('mai2_item_favorite_music', 'orderId')

View File

@@ -0,0 +1,29 @@
"""Mai2 add PRiSM+ playlog support
Revision ID: bdf710616ba4
Revises: 49c295e89cd4
Create Date: 2025-04-02 12:42:08.981516
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'bdf710616ba4'
down_revision = '49c295e89cd4'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('mai2_playlog', sa.Column('extBool3', sa.Boolean(), nullable=True, server_default=sa.text("NULL")))
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('mai2_playlog', 'extBool3')
# ### end Alembic commands ###

View File

@@ -0,0 +1,38 @@
"""mai2_add_photos
Revision ID: d8cd1fa04c2a
Revises: 54a84103b84e
Create Date: 2024-10-06 03:09:15.959817
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = 'd8cd1fa04c2a'
down_revision = '54a84103b84e'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('mai2_user_photo',
sa.Column('id', sa.VARCHAR(length=36), nullable=False),
sa.Column('user', sa.Integer(), nullable=False),
sa.Column('playlog_num', sa.INTEGER(), nullable=False),
sa.Column('track_num', sa.INTEGER(), nullable=False),
sa.Column('when_upload', sa.TIMESTAMP(), server_default=sa.text('now()'), nullable=False),
sa.ForeignKeyConstraint(['user'], ['aime_user.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('user', 'playlog_num', 'track_num', name='mai2_user_photo_uk'),
mysql_charset='utf8mb4'
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('mai2_user_photo')
# ### end Alembic commands ###

View File

@@ -0,0 +1,50 @@
"""add_billing_playcount
Revision ID: f6007bbf057d
Revises: 27e3434740df
Create Date: 2025-04-19 18:20:35.554137
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = 'f6007bbf057d'
down_revision = '27e3434740df'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('machine_billing_playcount',
sa.Column('id', sa.BIGINT(), nullable=False),
sa.Column('machine', sa.Integer(), nullable=False),
sa.Column('game_id', sa.CHAR(length=5), nullable=False),
sa.Column('year', sa.INTEGER(), nullable=False),
sa.Column('month', sa.INTEGER(), nullable=False),
sa.Column('playct', sa.BIGINT(), server_default='1', nullable=False),
sa.ForeignKeyConstraint(['machine'], ['machine.id'], onupdate='cascade', ondelete='cascade'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('machine'),
sa.UniqueConstraint('machine', 'game_id', 'year', 'month', name='machine_billing_playcount_uk'),
mysql_charset='utf8mb4'
)
op.add_column('machine_billing_credit', sa.Column('game_id', sa.CHAR(length=5), nullable=False))
op.drop_constraint("machine_billing_credit_ibfk_1", "machine_billing_credit", "foreignkey")
op.drop_index('machine', table_name='machine_billing_credit')
op.create_unique_constraint('machine_billing_credit_uk', 'machine_billing_credit', ['machine', 'game_id'])
op.create_foreign_key("machine_billing_credit_ibfk_1", "machine_billing_credit", "machine", ["machine"], ["id"], onupdate='cascade', ondelete='cascade')
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_constraint("machine_billing_credit_ibfk_1", "machine_billing_credit", "foreignkey")
op.drop_constraint('machine_billing_credit_uk', 'machine_billing_credit', type_='unique')
op.create_index('machine', 'machine_billing_credit', ['machine'], unique=True)
op.create_foreign_key("machine_billing_credit_ibfk_1", "machine_billing_credit", "machine", ["machine"], ["id"], onupdate='cascade', ondelete='cascade')
op.drop_column('machine_billing_credit', 'game_id')
op.drop_table('machine_billing_playcount')
# ### end Alembic commands ###

View File

@@ -1,54 +1,70 @@
import logging, coloredlogs
from typing import Optional
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy import create_engine
from logging.handlers import TimedRotatingFileHandler
import logging
import os
import secrets, string
import bcrypt
import secrets
import string
import warnings
from hashlib import sha256
from logging.handlers import TimedRotatingFileHandler
from typing import ClassVar, Optional
import alembic.config
import glob
import bcrypt
import coloredlogs
import pymysql.err
from sqlalchemy.ext.asyncio import (
AsyncEngine,
AsyncSession,
create_async_engine,
)
from sqlalchemy.orm import sessionmaker
from core.config import CoreConfig
from core.data.schema import *
from core.utils import Utils
from core.data.schema import ArcadeData, BaseData, CardData, UserData, metadata
from core.utils import MISSING, Utils
class Data:
engine = None
session = None
user = None
arcade = None
card = None
base = None
engine: ClassVar[AsyncEngine] = MISSING
session: ClassVar["sessionmaker[AsyncSession]"] = MISSING
user: ClassVar[UserData] = MISSING
arcade: ClassVar[ArcadeData] = MISSING
card: ClassVar[CardData] = MISSING
base: ClassVar[BaseData] = MISSING
def __init__(self, cfg: CoreConfig) -> None:
self.config = cfg
if self.config.database.sha2_password:
passwd = sha256(self.config.database.password.encode()).digest()
self.__url = f"{self.config.database.protocol}://{self.config.database.username}:{passwd.hex()}@{self.config.database.host}:{self.config.database.port}/{self.config.database.name}?charset=utf8mb4"
self.__url = f"{self.config.database.protocol}+aiomysql://{self.config.database.username}:{passwd.hex()}@{self.config.database.host}:{self.config.database.port}/{self.config.database.name}"
else:
self.__url = f"{self.config.database.protocol}://{self.config.database.username}:{self.config.database.password}@{self.config.database.host}:{self.config.database.port}/{self.config.database.name}?charset=utf8mb4"
self.__url = f"{self.config.database.protocol}+aiomysql://{self.config.database.username}:{self.config.database.password}@{self.config.database.host}:{self.config.database.port}/{self.config.database.name}"
if Data.engine is None:
Data.engine = create_engine(self.__url, pool_recycle=3600)
if Data.engine is MISSING:
Data.engine = create_async_engine(
self.__url,
pool_recycle=3600,
isolation_level="AUTOCOMMIT",
connect_args={
"charset": "utf8mb4",
"ssl": self.config.database.create_ssl_context_if_enabled(),
},
)
self.__engine = Data.engine
if Data.session is None:
s = sessionmaker(bind=Data.engine, autoflush=True, autocommit=True)
Data.session = scoped_session(s)
if Data.session is MISSING:
Data.session = sessionmaker(Data.engine, expire_on_commit=False, class_=AsyncSession)
if Data.user is None:
if Data.user is MISSING:
Data.user = UserData(self.config, self.session)
if Data.arcade is None:
if Data.arcade is MISSING:
Data.arcade = ArcadeData(self.config, self.session)
if Data.card is None:
if Data.card is MISSING:
Data.card = CardData(self.config, self.session)
if Data.base is None:
if Data.base is MISSING:
Data.base = BaseData(self.config, self.session)
self.logger = logging.getLogger("database")
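# A minimal usage sketch, assuming a loaded CoreConfig named `cfg`: engine, session and the
# schema handlers are ClassVar singletons, so constructing Data again reuses the AsyncEngine
# and AsyncSession factory created on first construction.
#   data = Data(cfg)
#   async with Data.session() as sess:          # sess is an AsyncSession
#       await sess.execute(text("SELECT 1"))    # text imported from sqlalchemy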
@@ -94,40 +110,73 @@ class Data:
alembic.config.main(argv=alembicArgs)
os.chdir(old_dir)
def create_database(self):
async def create_database(self):
self.logger.info("Creating databases...")
metadata.create_all(
self.engine,
checkfirst=True,
)
for _, mod in Utils.get_all_titles().items():
if hasattr(mod, "database"):
mod.database(self.config)
metadata.create_all(
self.engine,
checkfirst=True,
)
with warnings.catch_warnings():
# SQLAlchemy will generate a nice primary key constraint name, but in
# MySQL/MariaDB the constraint name is always PRIMARY. Every time a
# custom primary key name is generated, a warning is emitted from pymysql,
# which we don't care about. Other warnings may be helpful though, don't
# suppress everything.
warnings.filterwarnings(
action="ignore",
message=r"Name '(.+)' ignored for PRIMARY key\.",
category=pymysql.err.Warning,
)
# Stamp the end revision as if alembic had created it, so it can take off after this.
self.__alembic_cmd(
"stamp",
"head",
)
async with self.engine.begin() as conn:
await conn.run_sync(metadata.create_all, checkfirst=True)
def schema_upgrade(self, ver: str = None):
self.__alembic_cmd(
"upgrade",
"head" if not ver else ver,
)
for _, mod in Utils.get_all_titles().items():
if hasattr(mod, "database"):
mod.database(self.config)
await conn.run_sync(metadata.create_all, checkfirst=True)
# Stamp the end revision as if alembic had created it, so it can take off after this.
self.__alembic_cmd(
"stamp",
"head",
)
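# The stamp above is roughly equivalent to running `alembic stamp head` by hand from the
# directory holding alembic.ini (which __alembic_cmd changes into), so a later `upgrade`
# starts from the freshly created schema instead of replaying every revision.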
def schema_upgrade(self, ver: Optional[str] = None):
with warnings.catch_warnings():
# SQLAlchemy will generate a nice primary key constraint name, but in
# MySQL/MariaDB the constraint name is always PRIMARY. Every time a
# custom primary key name is generated, a warning is emitted from pymysql,
# which we don't care about. Other warnings may be helpful though, don't
# suppress everything.
warnings.filterwarnings(
action="ignore",
message=r"Name '(.+)' ignored for PRIMARY key\.",
category=pymysql.err.Warning,
)
self.__alembic_cmd(
"upgrade",
"head" if not ver else ver,
)
def schema_downgrade(self, ver: str):
self.__alembic_cmd(
"downgrade",
ver,
)
with warnings.catch_warnings():
# SQLAlchemy will generate a nice primary key constraint name, but in
# MySQL/MariaDB the constraint name is always PRIMARY. Every time a
# custom primary key name is generated, a warning is emitted from pymysql,
# which we don't care about. Other warnings may be helpful though, don't
# suppress everything.
warnings.filterwarnings(
action="ignore",
message=r"Name '(.+)' ignored for PRIMARY key\.",
category=pymysql.err.Warning,
)
async def create_owner(self, email: Optional[str] = None, code: Optional[str] = "00000000000000000000") -> None:
self.__alembic_cmd(
"downgrade",
ver,
)
async def create_owner(self, email: Optional[str] = None, code: str = "00000000000000000000") -> None:
pw = "".join(
secrets.choice(string.ascii_letters + string.digits) for i in range(20)
)
@@ -150,12 +199,12 @@ class Data:
async def migrate(self) -> None:
exist = await self.base.execute("SELECT * FROM alembic_version")
if exist is not None:
self.logger.warn("No need to migrate as you have already migrated to alembic. If you are trying to upgrade the schema, use `upgrade` instead!")
self.logger.warning("No need to migrate as you have already migrated to alembic. If you are trying to upgrade the schema, use `upgrade` instead!")
return
self.logger.info("Upgrading to latest with legacy system")
if not await self.legacy_upgrade():
self.logger.warn("No need to migrate as you have already deleted the old schema_versions system. If you are trying to upgrade the schema, use `upgrade` instead!")
self.logger.warning("No need to migrate as you have already deleted the old schema_versions system. If you are trying to upgrade the schema, use `upgrade` instead!")
return
self.logger.info("Done")
@@ -174,7 +223,7 @@ class Data:
async def legacy_upgrade(self) -> bool:
vers = await self.base.execute("SELECT * FROM schema_versions")
if vers is None:
self.logger.warn("Cannot legacy upgrade, schema_versions table unavailable!")
self.logger.warning("Cannot legacy upgrade, schema_versions table unavailable!")
return False
db_vers = {}
@@ -203,7 +252,7 @@ class Data:
game_codes = getattr(mod, "game_codes", [])
for game in game_codes:
if game not in db_vers:
self.logger.warn(f"{game} does not have an antry in schema_versions, skipping")
self.logger.warning(f"{game} does not have an antry in schema_versions, skipping")
continue
now_ver = int(db_vers[game]) + 1

View File

@@ -1,16 +1,17 @@
from typing import Optional, Dict, List
from sqlalchemy import Table, Column, and_, or_
from sqlalchemy.sql.schema import ForeignKey, PrimaryKeyConstraint
from sqlalchemy.types import Integer, String, Boolean, JSON
from sqlalchemy.sql import func, select
import re
from typing import List, Optional
from datetime import datetime
from sqlalchemy import Column, Table, and_, or_, UniqueConstraint
from sqlalchemy.dialects.mysql import insert
from sqlalchemy.engine import Row
import re
from sqlalchemy.sql import func, select
from sqlalchemy.sql.schema import ForeignKey, PrimaryKeyConstraint
from sqlalchemy.types import JSON, Boolean, Integer, String, BIGINT, INTEGER, CHAR, FLOAT, VARCHAR
from core.data.schema.base import BaseData, metadata
from core.const import *
arcade = Table(
arcade: Table = Table(
"arcade",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
@@ -26,7 +27,7 @@ arcade = Table(
mysql_charset="utf8mb4",
)
machine = Table(
machine: Table = Table(
"machine",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
@@ -40,14 +41,27 @@ machine = Table(
Column("game", String(4)),
Column("country", String(3)), # overwrites if not null
Column("timezone", String(255)),
Column("ota_enable", Boolean),
Column("memo", String(255)),
Column("is_cab", Boolean),
Column("ota_channel", VARCHAR(260)),
Column("data", JSON),
mysql_charset="utf8mb4",
)
arcade_owner = Table(
update: Table = Table(
"machine_update",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column("game", CHAR(4), nullable=False),
Column("version", VARCHAR(15), nullable=False),
Column("channel", VARCHAR(260), nullable=False),
Column("app_ini", VARCHAR(260)),
Column("opt_ini", VARCHAR(260)),
UniqueConstraint("game", "version", "channel", name="machine_update_uk"),
mysql_charset="utf8mb4",
)
arcade_owner: Table = Table(
"arcade_owner",
metadata,
Column(
@@ -67,9 +81,79 @@ arcade_owner = Table(
mysql_charset="utf8mb4",
)
billing_charge: Table = Table(
"machine_billing_charge",
metadata,
Column("id", BIGINT, primary_key=True, nullable=False),
Column(
"machine",
Integer,
ForeignKey("machine.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column("game_id", CHAR(5), nullable=False),
Column("game_ver", FLOAT, nullable=False),
Column("play_count", INTEGER, nullable=False),
Column("play_limit", INTEGER, nullable=False),
Column("product_code", INTEGER, nullable=False),
Column("product_count", INTEGER, nullable=False),
Column("func_type", INTEGER, nullable=False),
Column("player_number", INTEGER, nullable=False),
mysql_charset="utf8mb4",
)
# These settings are only really of interest
# for real cabinets operating as pay-to-play
billing_credit: Table = Table(
"machine_billing_credit",
metadata,
Column("id", BIGINT, primary_key=True, nullable=False),
Column(
"machine",
Integer,
ForeignKey("machine.id", ondelete="cascade", onupdate="cascade"),
nullable=False
),
Column("game_id", CHAR(5), nullable=False),
Column("chute_type", INTEGER, nullable=False),
Column("service_type", INTEGER, nullable=False),
Column("operation_type", INTEGER, nullable=False),
Column("coin_rate0", INTEGER, nullable=False),
Column("coin_rate1", INTEGER, nullable=False),
Column("coin_bonus", INTEGER, nullable=False),
Column("credit_rate", INTEGER, nullable=False),
Column("coin_count_slot0", INTEGER, nullable=False),
Column("coin_count_slot1", INTEGER, nullable=False),
Column("coin_count_slot2", INTEGER, nullable=False),
Column("coin_count_slot3", INTEGER, nullable=False),
Column("coin_count_slot4", INTEGER, nullable=False),
Column("coin_count_slot5", INTEGER, nullable=False),
Column("coin_count_slot6", INTEGER, nullable=False),
Column("coin_count_slot7", INTEGER, nullable=False),
UniqueConstraint("machine", "game_id", name="machine_billing_credit_uk"),
mysql_charset="utf8mb4",
)
billing_playct: Table = Table(
"machine_billing_playcount",
metadata,
Column("id", BIGINT, primary_key=True, nullable=False),
Column(
"machine",
Integer,
ForeignKey("machine.id", ondelete="cascade", onupdate="cascade"),
nullable=False, unique=True
),
Column("game_id", CHAR(5), nullable=False),
Column("year", INTEGER, nullable=False),
Column("month", INTEGER, nullable=False),
Column("playct", BIGINT, nullable=False, server_default="1"),
UniqueConstraint("machine", "game_id", "year", "month", name="machine_billing_playcount_uk"),
mysql_charset="utf8mb4",
)
class ArcadeData(BaseData):
async def get_machine(self, serial: str = None, id: int = None) -> Optional[Row]:
async def get_machine(self, serial: Optional[str] = None, id: Optional[int] = None) -> Optional[Row]:
if serial is not None:
serial = serial.replace("-", "")
if len(serial) == 11:
@@ -98,8 +182,8 @@ class ArcadeData(BaseData):
self,
arcade_id: int,
serial: str = "",
board: str = None,
game: str = None,
board: Optional[str] = None,
game: Optional[str] = None,
is_cab: bool = False,
) -> Optional[int]:
if not arcade_id:
@@ -115,6 +199,15 @@ class ArcadeData(BaseData):
return None
return result.lastrowid
async def set_machine_arcade(self, machine_id: int, new_arcade: int) -> bool:
sql = machine.update(machine.c.id == machine_id).values(arcade = new_arcade)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update machine {machine_id} arcade to {new_arcade}")
return False
return True
async def set_machine_serial(self, machine_id: int, serial: str) -> None:
result = await self.execute(
machine.update(machine.c.id == machine_id).values(keychip=serial)
@@ -134,6 +227,60 @@ class ArcadeData(BaseData):
f"Failed to update board id for machine {machine_id} -> {boardid}"
)
async def set_machine_game(self, machine_id: int, new_game: Optional[str]) -> bool:
sql = machine.update(machine.c.id == machine_id).values(game = new_game)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update machine {machine_id} game to {new_game}")
return False
return True
async def set_machine_country(self, machine_id: int, new_country: Optional[str]) -> bool:
sql = machine.update(machine.c.id == machine_id).values(country = new_country)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update machine {machine_id} country to {new_country}")
return False
return True
async def set_machine_timezone(self, machine_id: int, new_timezone: Optional[str]) -> bool:
sql = machine.update(machine.c.id == machine_id).values(timezone = new_timezone)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update machine {machine_id} timezone to {new_timezone}")
return False
return True
async def set_machine_real_cabinet(self, machine_id: int, is_real: bool = False) -> bool:
sql = machine.update(machine.c.id == machine_id).values(is_cab = is_real)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update machine {machine_id} is_cab to {is_real}")
return False
return True
async def set_machine_ota_channel(self, machine_id: int, channel_name: Optional[str] = None) -> bool:
sql = machine.update(machine.c.id == machine_id).values(ota_channel = channel_name)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update machine {machine_id} ota channel to {channel_name}")
return False
return True
async def set_machine_memo(self, machine_id: int, new_memo: Optional[str]) -> bool:
sql = machine.update(machine.c.id == machine_id).values(memo = new_memo)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update machine {machine_id} memo")
return False
return True
async def get_arcade(self, id: int) -> Optional[Row]:
sql = arcade.select(arcade.c.id == id)
result = await self.execute(sql)
@@ -150,8 +297,8 @@ class ArcadeData(BaseData):
async def create_arcade(
self,
name: str = None,
nickname: str = None,
name: Optional[str] = None,
nickname: Optional[str] = None,
country: str = "JPN",
country_id: int = 1,
state: str = "",
@@ -187,8 +334,11 @@ class ArcadeData(BaseData):
sql = select(arcade_owner.c.permissions).where(and_(arcade_owner.c.user == user_id, arcade_owner.c.arcade == arcade_id))
result = await self.execute(sql)
if result is None:
return False
return result.fetchone()
return None
row = result.fetchone()
if row:
return row['permissions']
return None
async def get_arcade_owners(self, arcade_id: int) -> Optional[Row]:
sql = select(arcade_owner).where(arcade_owner.c.arcade == arcade_id)
@@ -198,14 +348,25 @@ class ArcadeData(BaseData):
return None
return result.fetchall()
async def add_arcade_owner(self, arcade_id: int, user_id: int) -> None:
sql = insert(arcade_owner).values(arcade=arcade_id, user=user_id)
async def add_arcade_owner(self, arcade_id: int, user_id: int, permissions: int = 1) -> Optional[int]:
sql = insert(arcade_owner).values(arcade=arcade_id, user=user_id, permissions=permissions)
result = await self.execute(sql)
if result is None:
return None
return result.lastrowid
async def set_arcade_owner_permissions(self, arcade_id: int, user_id: int, new_permissions: int = 1) -> bool:
sql = arcade_owner.update(
and_(arcade_owner.c.arcade == arcade_id, arcade_owner.c.user == user_id)
).values(permissions = new_permissions)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update arcade owner permissions to {new_permissions} for user {user_id} arcade {arcade_id}")
return False
return True
async def get_arcade_by_name(self, name: str) -> Optional[List[Row]]:
sql = arcade.select(or_(arcade.c.name.like(f"%{name}%"), arcade.c.nickname.like(f"%{name}%")))
result = await self.execute(sql)
@@ -219,20 +380,199 @@ class ArcadeData(BaseData):
if result is None:
return None
return result.fetchall()
async def set_arcade_name_nickname(self, arcade_id: int, new_name: Optional[str], new_nickname: Optional[str]) -> bool:
sql = arcade.update(arcade.c.id == arcade_id).values(name = new_name, nickname = new_nickname)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update arcade {arcade_id} name to {new_name}/{new_nickname}")
return False
return True
async def set_arcade_region_info(self, arcade_id: int, new_country: Optional[str], new_state: Optional[str], new_city: Optional[str], new_region_id: Optional[int], new_country_id: Optional[int]) -> bool:
sql = arcade.update(arcade.c.id == arcade_id).values(
country = new_country,
state = new_state,
city = new_city,
region_id = new_region_id,
country_id = new_country_id
)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update arcade {arcade_id} regional info to {new_country}/{new_state}/{new_city}/{new_region_id}/{new_country_id}")
return False
return True
async def set_arcade_timezone(self, arcade_id: int, new_timezone: Optional[str]) -> bool:
sql = arcade.update(arcade.c.id == arcade_id).values(timezone = new_timezone)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update arcade {arcade_id} timezone to {new_timezone}")
return False
return True
async def set_arcade_vpn_ip(self, arcade_id: int, new_ip: Optional[str]) -> bool:
sql = arcade.update(arcade.c.id == arcade_id).values(ip = new_ip)
result = await self.execute(sql)
if result is None:
self.logger.error(f"Failed to update arcade {arcade_id} VPN address to {new_ip}")
return False
return True
async def get_num_generated_keychips(self) -> Optional[int]:
# Count only serials generated by the server itself (keychips starting with A69A).
result = await self.execute(select(func.count()).select_from(machine).where(machine.c.keychip.like("A69A%")))
if result:
return result.fetchone()['count_1']
self.logger.error("Failed to count machine serials that start with A69A!")
async def billing_add_charge(self, machine_id: int, game_id: str, game_ver: float, playcount: int, playlimit, product_code: int, product_count: int, func_type: int, player_num: int) -> Optional[int]:
result = await self.execute(billing_charge.insert().values(
machine=machine_id,
game_id=game_id,
game_ver=game_ver,
play_count=playcount,
play_limit=playlimit,
product_code=product_code,
product_count=product_count,
func_type=func_type,
player_number=player_num
))
if result is None:
self.logger.error(f"Failed to add billing charge for machine {machine_id}!")
return None
return result.lastrowid
async def billing_get_last_charge(self, machine_id: int, game_id: str) -> Optional[Row]:
result = await self.execute(billing_charge.select(
and_(billing_charge.c.machine == machine_id, billing_charge.c.game_id == game_id)
).order_by(billing_charge.c.id.desc()).limit(3))
if result:
return result.fetchone()
async def billing_set_credit(self, machine_id: int, game_id: str, chute_type: int, service_type: int, op_mode: int, coin_rate0: int, coin_rate1: int,
bonus_adder: int, coin_to_credit_rate: int, coin_count_slot0: int, coin_count_slot1: int, coin_count_slot2: int, coin_count_slot3: int,
coin_count_slot4: int, coin_count_slot5: int, coin_count_slot6: int, coin_count_slot7: int) -> Optional[int]:
sql = insert(billing_credit).values(
machine=machine_id,
game_id=game_id,
chute_type=chute_type,
service_type=service_type,
operation_type=op_mode,
coin_rate0=coin_rate0,
coin_rate1=coin_rate1,
coin_bonus=bonus_adder,
credit_rate=coin_to_credit_rate,
coin_count_slot0=coin_count_slot0,
coin_count_slot1=coin_count_slot1,
coin_count_slot2=coin_count_slot2,
coin_count_slot3=coin_count_slot3,
coin_count_slot4=coin_count_slot4,
coin_count_slot5=coin_count_slot5,
coin_count_slot6=coin_count_slot6,
coin_count_slot7=coin_count_slot7,
)
conflict = sql.on_duplicate_key_update(
chute_type=chute_type,
service_type=service_type,
operation_type=op_mode,
coin_rate0=coin_rate0,
coin_rate1=coin_rate1,
coin_bonus=bonus_adder,
credit_rate=coin_to_credit_rate,
coin_count_slot0=coin_count_slot0,
coin_count_slot1=coin_count_slot1,
coin_count_slot2=coin_count_slot2,
coin_count_slot3=coin_count_slot3,
coin_count_slot4=coin_count_slot4,
coin_count_slot5=coin_count_slot5,
coin_count_slot6=coin_count_slot6,
coin_count_slot7=coin_count_slot7,
)
result = await self.execute(conflict)
if result is None:
self.logger.error(f"Failed to set billing credit settings for machine {machine_id}!")
return None
return result.lastrowid
async def billing_get_credit(self, machine_id: int, game_id: str) -> Optional[Row]:
result = await self.execute(billing_credit.select(
and_(billing_credit.c.machine == machine_id, billing_credit.c.game_id == game_id)
))
if result:
return result.fetchone()
async def billing_add_playcount(self, machine_id: int, game_id: str, playct: int = 1) -> None:
now = datetime.now()
sql = insert(billing_playct).values(
machine=machine_id,
game_id=game_id,
year=now.year,
month=now.month,
playct=playct
)
conflict = sql.on_duplicate_key_update(playct=billing_playct.c.playct + playct)
result = await self.execute(conflict)
if result is None:
self.logger.error(f"Failed to add playcount for machine {machine_id} running {game_id}")
async def billing_get_playcount_3mo(self, machine_id: int, game_id: str) -> Optional[List[Row]]:
result = await self.execute(billing_playct.select(and_(
billing_playct.c.machine == machine_id,
billing_playct.c.game_id == game_id
)).order_by(billing_playct.c.year.desc(), billing_playct.c.month.desc()).limit(3))
if result is not None:
return result.fetchall()
async def billing_get_last_playcount(self, machine_id: int, game_id: str) -> Optional[Row]:
result = await self.execute(billing_playct.select(and_(
billing_playct.c.machine == machine_id,
billing_playct.c.game_id == game_id
)).order_by(billing_playct.c.year.desc(), billing_playct.c.month.desc()).limit(1))
if result is not None:
return result.fetchone()
async def create_ota_update(self, game_id: str, ver: str, channel: str, app: Optional[str], opt: Optional[str] = None) -> Optional[int]:
result = await self.execute(insert(update).values(
game = game_id,
version = ver,
channel = channel,
app_ini = app,
opt_ini = opt
))
if result is None:
self.logger.error(f"Failed to create {game_id} v{ver} update on channel {channel}")
return None
return result.lastrowid
async def get_ota_update(self, game_id: str, ver: str, channel: str) -> Optional[Row]:
result = await self.execute(update.select(and_(
and_(update.c.game == game_id, update.c.version == ver),
update.c.channel == channel
)))
if result is None:
return None
return result.fetchone()
def format_serial(
self, platform_code: str, platform_rev: int, serial_letter: str, serial_num: int, append: int, dash: bool = False
) -> str:
return f"{platform_code}{'-' if dash else ''}{platform_rev:02d}{serial_letter}{serial_num:04d}{append:04d}"
def validate_keychip_format(self, serial: str) -> bool:
# For the 2nd letter, E and X are the only "real" values that have been observed
# For the 2nd letter, E and X are the only "real" values that have been observed (A is used for generated keychips)
if re.fullmatch(r"^A[0-9]{2}[A-Z][-]?[0-9]{2}[A-HJ-NP-Z][0-9]{4}([0-9]{4})?$", serial) is None:
return False
@@ -252,7 +592,6 @@ class ArcadeData(BaseData):
month = ((month - 1) + 9) % 12 # Offset so April=0
return f"{year:02}{month // 6:01}{month % 6 + 1:01}"
def parse_keychip_suffix(self, suffix: str) -> tuple[int, int]:
year = int(suffix[0:2])
half = int(suffix[2])

View File

@@ -1,22 +1,24 @@
import asyncio
import json
import logging
from random import randrange
from typing import Any, Optional, Dict, List
from typing import Any, Dict, List, Optional
from sqlalchemy import Column, MetaData, Table
from sqlalchemy.engine import Row
from sqlalchemy.engine.cursor import CursorResult
from sqlalchemy.engine.base import Connection
from sqlalchemy.sql import text, func, select
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy import MetaData, Table, Column
from sqlalchemy.types import Integer, String, TIMESTAMP, JSON, INTEGER, TEXT
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import sessionmaker
from sqlalchemy.schema import ForeignKey
from sqlalchemy.dialects.mysql import insert
from sqlalchemy.sql import func, text
from sqlalchemy.types import INTEGER, JSON, TEXT, TIMESTAMP, Integer, String
from core.config import CoreConfig
metadata = MetaData()
event_log = Table(
event_log: Table = Table(
"event_log",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
@@ -37,7 +39,7 @@ event_log = Table(
class BaseData:
def __init__(self, cfg: CoreConfig, conn: Connection) -> None:
def __init__(self, cfg: CoreConfig, conn: "sessionmaker[AsyncSession]") -> None:
self.config = cfg
self.conn = conn
self.logger = logging.getLogger("database")
@@ -45,21 +47,10 @@ class BaseData:
async def execute(self, sql: str, opts: Dict[str, Any] = {}) -> Optional[CursorResult]:
res = None
try:
self.logger.debug(f"SQL Execute: {''.join(str(sql).splitlines())}")
res = self.conn.execute(text(sql), opts)
except SQLAlchemyError as e:
self.logger.error(f"SQLAlchemy error {e}")
return None
except UnicodeEncodeError as e:
self.logger.error(f"UnicodeEncodeError error {e}")
return None
except Exception:
async with self.conn() as session:
try:
res = self.conn.execute(sql, opts)
self.logger.debug(f"SQL Execute: {''.join(str(sql).splitlines())}")
res = await session.execute(text(sql), opts)
except SQLAlchemyError as e:
self.logger.error(f"SQLAlchemy error {e}")
@@ -70,8 +61,20 @@ class BaseData:
return None
except Exception:
self.logger.error(f"Unknown error")
raise
try:
res = await session.execute(sql, opts)
except SQLAlchemyError as e:
self.logger.error(f"SQLAlchemy error {e}")
return None
except UnicodeEncodeError as e:
self.logger.error(f"UnicodeEncodeError error {e}")
return None
except Exception:
self.logger.error(f"Unknown error")
raise
return res
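# The outer try executes the argument as textual SQL via text(); the nested try inside its
# bare `except Exception` retries the argument as a plain SQLAlchemy construct, so
# SQLAlchemy/Unicode errors on either path are logged and turned into a None return.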
@@ -83,7 +86,7 @@ class BaseData:
async def log_event(
self, system: str, type: str, severity: int, message: str, details: Dict = {}, user: int = None,
arcade: int = None, machine: int = None, ip: str = None, game: str = None, version: str = None
arcade: int = None, machine: int = None, ip: Optional[str] = None, game: Optional[str] = None, version: Optional[str] = None
) -> Optional[int]:
sql = event_log.insert().values(
system=system,

View File

@@ -1,13 +1,14 @@
from typing import Dict, List, Optional
from sqlalchemy import Table, Column, UniqueConstraint
from sqlalchemy.types import Integer, String, Boolean, TIMESTAMP, BIGINT, VARCHAR
from sqlalchemy.sql.schema import ForeignKey
from sqlalchemy.sql import func
from sqlalchemy import Column, Table, UniqueConstraint
from sqlalchemy.engine import Row
from sqlalchemy.sql import func
from sqlalchemy.sql.schema import ForeignKey
from sqlalchemy.types import BIGINT, TIMESTAMP, VARCHAR, Boolean, Integer, String
from core.data.schema.base import BaseData, metadata
aime_card = Table(
aime_card: Table = Table(
"aime_card",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
@@ -122,7 +123,7 @@ class CardData(BaseData):
result = await self.execute(sql)
if result is None:
self.logger.warn(f"Failed to update last login time for {access_code}")
self.logger.warning(f"Failed to update last login time for {access_code}")
async def get_card_by_idm(self, idm: str) -> Optional[Row]:
result = await self.execute(aime_card.select(aime_card.c.idm == idm))

View File

@@ -1,15 +1,15 @@
from typing import Optional, List
from sqlalchemy import Table, Column
from sqlalchemy.types import Integer, String, TIMESTAMP
from sqlalchemy.sql import func
from sqlalchemy.dialects.mysql import insert
from sqlalchemy.sql import func, select
from sqlalchemy.engine import Row
from typing import List, Optional
import bcrypt
from sqlalchemy import Column, Table
from sqlalchemy.dialects.mysql import insert
from sqlalchemy.engine import Row
from sqlalchemy.sql import func, select
from sqlalchemy.types import TIMESTAMP, Integer, String
from core.data.schema.base import BaseData, metadata
aime_user = Table(
aime_user: Table = Table(
"aime_user",
metadata,
Column("id", Integer, nullable=False, primary_key=True, autoincrement=True),
@@ -26,10 +26,10 @@ aime_user = Table(
class UserData(BaseData):
async def create_user(
self,
id: int = None,
username: str = None,
email: str = None,
password: str = None,
id: Optional[int] = None,
username: Optional[str] = None,
email: Optional[str] = None,
password: Optional[str] = None,
permission: int = 1,
) -> Optional[int]:
if id is None:
@@ -124,3 +124,15 @@ class UserData(BaseData):
async def get_user_by_username(self, username: str) -> Optional[Row]:
result = await self.execute(aime_user.select(aime_user.c.username == username))
if result: return result.fetchone()
async def change_permission(self, user_id: int, new_perms: int) -> Optional[bool]:
sql = aime_user.update(aime_user.c.id == user_id).values(permissions = new_perms)
result = await self.execute(sql)
return result is not None
async def change_email(self, user_id: int, new_email: str) -> Optional[bool]:
sql = aime_user.update(aime_user.c.id == user_id).values(email = new_email)
result = await self.execute(sql)
return result is not None

View File

@@ -12,7 +12,6 @@ import jwt
import yaml
import secrets
import string
import random
from base64 import b64decode
from enum import Enum
from datetime import datetime, timezone
@@ -20,6 +19,11 @@ from os import path, environ, mkdir, W_OK, access
from core import CoreConfig, Utils
from core.data import Data
from core.const import AllnetCountryCode
# A-HJ-NP-Z
SERIAL_LETTERS = ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'J', 'K', 'L', 'M', 'N', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z']
ARTEMIS_SERIAL_PREFIX = "A69A"
class PermissionOffset(Enum):
USER = 0 # Regular user
@@ -33,8 +37,7 @@ class ShopPermissionOffset(Enum):
VIEW = 0 # View info and cabs
BOOKKEEP = 1 # View bookkeeping info
EDITOR = 2 # Can edit name, settings
REGISTRAR = 3 # Can add cabs
# 4 - 6 reserved for future use
# 3 - 6 reserved for future use
OWNER = 7 # Can do anything
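# Shop permissions are stored as a bitmask of these offsets; for example a manager with
# VIEW and BOOKKEEP has permissions == (1 << 0) | (1 << 1) == 3, and a single bit is tested
# with `perm & (1 << ShopPermissionOffset.BOOKKEEP.value)`.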
class ShopOwner():
@@ -145,10 +148,13 @@ class FrontendServlet():
Mount("/shop", routes=[
Route("/", self.arcade.render_GET, methods=['GET']),
Route("/{shop_id:int}", self.arcade.render_GET, methods=['GET']),
Route("/{shop_id:int}/info.update", self.arcade.update_shop, methods=['POST']),
]),
Mount("/cab", routes=[
Route("/", self.machine.render_GET, methods=['GET']),
Route("/{machine_id:int}", self.machine.render_GET, methods=['GET']),
Route("/{machine_id:int}/info.update", self.machine.update_cab, methods=['POST']),
Route("/{machine_id:int}/reassign", self.machine.reassign_cab, methods=['POST']),
]),
Mount("/game", routes=g_routes),
Route("/robots.txt", self.robots)
@@ -435,7 +441,7 @@ class FE_User(FE_Base):
if user_id:
if not self.test_perm(usr_sesh.permissions, PermissionOffset.USERMOD) and user_id != usr_sesh.user_id:
self.logger.warn(f"User {usr_sesh.user_id} does not have permission to view user {user_id}")
self.logger.warning(f"User {usr_sesh.user_id} does not have permission to view user {user_id}")
return RedirectResponse("/user/", 303)
else:
@@ -451,6 +457,16 @@ class FE_User(FE_Base):
card_data = []
arcade_data = []
managed_arcades = await self.data.arcade.get_arcades_managed_by_user(user_id)
if managed_arcades:
for arcade in managed_arcades:
ac = await self.data.arcade.get_arcade(arcade['id'])
if ac:
arcade_data.append({
"id": ac['id'],
"name": ac['name'],
})
for c in cards:
if c['is_locked']:
status = 'Locked'
@@ -857,14 +873,16 @@ class FE_System(FE_Base):
name = frm.get("shopName", None)
country = frm.get("shopCountry", "JPN")
ip = frm.get("shopIp", None)
owner = frm.get("shopOwner", None)
acid = await self.data.arcade.create_arcade(name if name else None, name if name else None, country)
if not acid:
return RedirectResponse("/sys/?e=99", 303)
if ip:
# TODO: set IP
pass
await self.data.arcade.set_arcade_vpn_ip(acid, ip if ip else None)
if owner:
await self.data.arcade.add_arcade_owner(acid, int(owner), 255)
return Response(template.render(
title=f"{self.core_config.server.name} | System",
@@ -892,10 +910,17 @@ class FE_System(FE_Base):
generated = await self.data.arcade.get_num_generated_keychips()
if not generated:
generated = 0
serial = self.data.arcade.format_serial("A69A", 1, "A", generated + 1, int(append))
serial_dash = self.data.arcade.format_serial("A69A", 1, "A", generated + 1, int(append), True)
rollover = generated // 9999
serial_num = (generated % 9999) + 1
serial_letter = SERIAL_LETTERS[rollover]
serial_dash = self.data.arcade.format_serial(ARTEMIS_SERIAL_PREFIX, 1, serial_letter, serial_num, int(append), True)
serial = serial_dash.replace("-", "")
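# Worked example of the rollover math, assuming 10002 keychips have been generated so far:
# rollover = 10002 // 9999 = 1 and serial_num = (10002 % 9999) + 1 = 4, so the letter
# advances to SERIAL_LETTERS[1] == 'B' and, with append "0001", the serial comes out as
# "A69A-01B00040001".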
cab_id = await self.data.arcade.create_machine(int(shopid), serial, None, game_code if game_code else None)
if cab_id is None:
return RedirectResponse("/sys/?e=4", 303)
return Response(template.render(
title=f"{self.core_config.server.name} | System",
@@ -938,15 +963,20 @@ class FE_Arcade(FE_Base):
shop_id = request.path_params.get('shop_id', None)
usr_sesh = self.validate_session(request)
if not usr_sesh or not self.test_perm(usr_sesh.permissions, PermissionOffset.ACMOD):
self.logger.warn(f"User {usr_sesh.user_id} does not have permission to view shops!")
if not usr_sesh:
return RedirectResponse("/gate/", 303)
if not shop_id:
return Response(template.render(
title=f"{self.core_config.server.name} | Arcade",
sesh=vars(usr_sesh),
), media_type="text/html; charset=utf-8")
return Response('Not Found', status_code=404)
is_acmod = self.test_perm(usr_sesh.permissions, PermissionOffset.ACMOD)
if not is_acmod:
usr_shop_perm = await self.data.arcade.get_manager_permissions(usr_sesh.user_id, shop_id)
if usr_shop_perm is None or usr_shop_perm == 0:
self.logger.warning(f"User {usr_sesh.user_id} does not have permission to view shop {shop_id}!")
return RedirectResponse("/", 303)
else:
usr_shop_perm = 15 # view, bookkeep, edit
sinfo = await self.data.arcade.get_arcade(shop_id)
if not sinfo:
@@ -965,38 +995,204 @@ class FE_Arcade(FE_Base):
"game": x['game'],
})
managers = []
if (usr_shop_perm & 1 << ShopPermissionOffset.OWNER.value) or is_acmod:
mgrs = await self.data.arcade.get_arcade_owners(sinfo['id'])
if mgrs:
for mgr in mgrs:
usr = await self.data.user.get_user(mgr['user'])
managers.append({
'user': mgr['user'],
'name': usr['username'] if usr['username'] else 'No Name Set',
'is_view': bool(mgr['permissions'] & 1 << ShopPermissionOffset.VIEW.value),
'is_bookkeep': bool(mgr['permissions'] & 1 << ShopPermissionOffset.BOOKKEEP.value),
'is_edit': bool(mgr['permissions'] & 1 << ShopPermissionOffset.EDITOR.value),
'is_owner': bool(mgr['permissions'] & 1 << ShopPermissionOffset.OWNER.value),
})
if request.query_params.get("e", None):
err = int(request.query_params.get("e"))
else:
err = 0
if request.query_params.get("s", None):
suc = int(request.query_params.get("s"))
else:
suc = 0
return Response(template.render(
title=f"{self.core_config.server.name} | Arcade",
sesh=vars(usr_sesh),
arcade={
"name": sinfo['name'],
"id": sinfo['id'],
"cabs": cablst
}
cablst=cablst,
arcade=sinfo._asdict(),
can_bookkeep=bool(usr_shop_perm & 1 << ShopPermissionOffset.BOOKKEEP.value) or is_acmod,
can_edit=bool(usr_shop_perm & 1 << ShopPermissionOffset.EDITOR.value) or is_acmod,
is_owner=usr_shop_perm & 1 << ShopPermissionOffset.OWNER.value,
is_acmod=is_acmod,
managers=managers,
error=err,
success=suc
), media_type="text/html; charset=utf-8")
async def update_shop(self, request: Request):
shop_id = request.path_params.get('shop_id', None)
usr_sesh = self.validate_session(request)
if not usr_sesh:
return RedirectResponse("/gate/", 303)
sinfo = await self.data.arcade.get_arcade(shop_id)
if not shop_id or not sinfo:
return RedirectResponse("/", 303)
if not self.test_perm(usr_sesh.permissions, PermissionOffset.ACMOD):
usr_shop_perm = await self.data.arcade.get_manager_permissions(usr_sesh.user_id, sinfo['id'])
if usr_shop_perm is None or usr_shop_perm == 0:
self.logger.warning(f"User {usr_sesh.user_id} does not have permission to view shop {sinfo['id']}!")
return RedirectResponse("/", 303)
frm = await request.form()
new_name = frm.get('name', None)
new_nickname = frm.get('nickname', None)
new_country = frm.get('country', None)
new_region1 = frm.get('region1', None)
new_region2 = frm.get('region2', None)
new_tz = frm.get('tz', None)
new_ip = frm.get('ip', None)
try:
AllnetCountryCode(new_country)
except ValueError:
new_country = 'JPN'
did_name = await self.data.arcade.set_arcade_name_nickname(sinfo['id'], new_name if new_name else f'Arcade{sinfo["id"]}', new_nickname if new_nickname else None)
did_region = await self.data.arcade.set_arcade_region_info(sinfo['id'], new_country, new_region1 if new_region1 else None, new_region2 if new_region2 else None, None, None)
did_timezone = await self.data.arcade.set_arcade_timezone(sinfo['id'], new_tz if new_tz else None)
did_vpn = await self.data.arcade.set_arcade_vpn_ip(sinfo['id'], new_ip if new_ip else None)
if not did_name or not did_region or not did_timezone or not did_vpn:
self.logger.error(f"Failed to update some shop into: Name: {did_name} Region: {did_region} TZ: {did_timezone} VPN: {did_vpn}")
return RedirectResponse(f"/shop/{shop_id}?e=15", 303)
return RedirectResponse(f"/shop/{shop_id}?s=1", 303)
class FE_Machine(FE_Base):
async def render_GET(self, request: Request):
template = self.environment.get_template("core/templates/machine/index.jinja")
cab_id = request.path_params.get('cab_id', None)
cab_id = request.path_params.get('machine_id', None)
usr_sesh = self.validate_session(request)
if not usr_sesh or not self.test_perm(usr_sesh.permissions, PermissionOffset.ACMOD):
self.logger.warn(f"User {usr_sesh.user_id} does not have permission to view shops!")
if not usr_sesh:
return RedirectResponse("/gate/", 303)
cab = await self.data.arcade.get_machine(id=cab_id)
if not cab_id:
return Response(template.render(
title=f"{self.core_config.server.name} | Machine",
sesh=vars(usr_sesh),
), media_type="text/html; charset=utf-8")
if not cab_id or not cab:
return Response('Not Found', status_code=404)
shop = await self.data.arcade.get_arcade(cab['arcade'])
is_acmod = self.test_perm(usr_sesh.permissions, PermissionOffset.ACMOD)
if not is_acmod:
usr_shop_perm = await self.data.arcade.get_manager_permissions(usr_sesh.user_id, shop['id'])
if usr_shop_perm is None or usr_shop_perm == 0:
self.logger.warning(f"User {usr_sesh.user_id} does not have permission to view shop {shop['id']}!")
return RedirectResponse("/", 303)
else:
usr_shop_perm = 15 # view, bookkeep, edit
if request.query_params.get("e", None):
err = int(request.query_params.get("e"))
else:
err = 0
if request.query_params.get("s", None):
suc = int(request.query_params.get("s"))
else:
suc = 0
return Response(template.render(
title=f"{self.core_config.server.name} | Machine",
sesh=vars(usr_sesh),
arcade={}
arcade=shop._asdict(),
machine=cab._asdict(),
can_bookkeep=bool(usr_shop_perm & 1 << ShopPermissionOffset.BOOKKEEP.value) or is_acmod,
can_edit=bool(usr_shop_perm & 1 << ShopPermissionOffset.EDITOR.value) or is_acmod,
is_owner=usr_shop_perm & 1 << ShopPermissionOffset.OWNER.value,
is_acmod=is_acmod,
error=err,
success=suc
), media_type="text/html; charset=utf-8")
async def update_cab(self, request: Request):
cab_id = request.path_params.get('machine_id', None)
usr_sesh = self.validate_session(request)
if not usr_sesh:
return RedirectResponse("/gate/", 303)
cab = await self.data.arcade.get_machine(id=cab_id)
if not cab_id or not cab:
return RedirectResponse("/", 303)
if not self.test_perm(usr_sesh.permissions, PermissionOffset.ACMOD):
usr_shop_perm = await self.data.arcade.get_manager_permissions(usr_sesh.user_id, cab['arcade'])
if usr_shop_perm is None or usr_shop_perm == 0:
self.logger.warning(f"User {usr_sesh.user_id} does not have permission to view shop {cab['arcade']}!")
return RedirectResponse("/", 303)
frm = await request.form()
new_game = frm.get('game', None)
new_country = frm.get('country', None)
new_tz = frm.get('tz', None)
new_is_cab = frm.get('is_cab', False) == 'on'
new_ota_channel = frm.get('ota_channel', None)
new_memo = frm.get('memo', None)
try:
AllnetCountryCode(new_country)
except ValueError:
new_country = None
did_game = await self.data.arcade.set_machine_game(cab['id'], new_game if new_game else None)
did_country = await self.data.arcade.set_machine_country(cab['id'], new_country if new_country else None)
did_timezone = await self.data.arcade.set_machine_timezone(cab['id'], new_tz if new_tz else None)
did_real_cab = await self.data.arcade.set_machine_real_cabinet(cab['id'], new_is_cab)
did_ota = await self.data.arcade.set_machine_ota_channel(cab['id'], new_ota_channel if new_is_cab else None)
did_memo = await self.data.arcade.set_machine_memo(cab['id'], new_memo if new_memo else None)
if not did_game or not did_country or not did_timezone or not did_real_cab or not did_ota or not did_memo:
self.logger.error(f"Failed to update some shop into: Game: {did_game} Country: {did_country} TZ: {did_timezone} Real: {did_real_cab} OTA: {did_ota} Memo: {did_memo}")
return RedirectResponse(f"/cab/{cab['id']}?e=15", 303)
return RedirectResponse(f"/cab/{cab_id}?s=1", 303)
async def reassign_cab(self, request: Request):
cab_id = request.path_params.get('machine_id', None)
usr_sesh = self.validate_session(request)
if not usr_sesh:
return RedirectResponse("/gate/", 303)
cab = await self.data.arcade.get_machine(id=cab_id)
if not cab_id or not cab:
return RedirectResponse("/", 303)
frm = await request.form()
new_shop = frm.get('new_arcade', None)
if not self.test_perm(usr_sesh.permissions, PermissionOffset.ACMOD):
self.logger.warning(f"User {usr_sesh.user_id} does not have permission to reassign cab {cab['id']} to arcade !")
return RedirectResponse(f"/cab/{cab_id}?e=11", 303)
new_sinfo = await self.data.arcade.get_arcade(new_shop)
if not new_sinfo:
return RedirectResponse(f"/cab/{cab_id}?e=14", 303)
if not await self.data.arcade.set_machine_arcade(cab['id'], new_sinfo['id']):
return RedirectResponse(f"/cab/{cab_id}?e=99", 303)
return RedirectResponse(f"/cab/{cab_id}?s=2", 303)
cfg_dir = environ.get("ARTEMIS_CFG_DIR", "config")
cfg: CoreConfig = CoreConfig()

View File

@@ -64,7 +64,7 @@ class MuchaServlet:
self.logger.debug(f"Mucha request {vars(req)}")
if not req.gameCd or not req.gameVer or not req.sendDate or not req.countryCd or not req.serialNum:
self.logger.warn(f"Missing required fields - {vars(req)}")
self.logger.warning(f"Missing required fields - {vars(req)}")
return PlainTextResponse("RESULTS=000")
minfo = self.mucha_registry.get(req.gameCd, {})
@@ -133,7 +133,7 @@ class MuchaServlet:
self.logger.info(f"Allow unknown serial {netid} ({sn_decrypt}) to auth")
else:
self.logger.warn(f'Auth failed for NetID {netid}')
self.logger.warning(f'Auth failed for NetID {netid}')
return PlainTextResponse("RESULTS=000")
self.logger.debug(f"Mucha response {vars(resp)}")

View File

@@ -2,18 +2,104 @@
{% block content %}
{% if arcade is defined %}
<h1>{{ arcade.name }}</h1>
<h2>PCBs assigned to this arcade <button class="btn btn-success" id="btn_add_cab" onclick="toggle_add_cab_form()">Add</button></h2>
<h2>Assigned Machines</h2>
{% if success is defined and success == 3 %}
<div style="background-color: #00AA00; padding: 20px; margin-bottom: 10px; width: 15%;">
Cab added successfully
</div>
{% endif %}
{% if success is defined and success == 1 %}
<div style="background-color: #00AA00; padding: 20px; margin-bottom: 10px; width: 15%;">
Info Updated
</div>
{% endif %}
{% include "core/templates/widgets/err_banner.jinja" %}
<ul style="font-size: 20px;">
{% for c in arcade.cabs %}
<li><a href="/cab/{{ c.id }}">{{ c.serial }}</a> ({{ c.game if c.game else "Any" }})&nbsp;<button class="btn btn-secondary" onclick="prep_edit_form()">Edit</button>&nbsp;<button class="btn-danger btn">Delete</button></li>
{% for c in cablst %}
<li><a href="/cab/{{ c.id }}">{{ c.serial }}</a> ({{ c.game if c.game else "Any" }})</li>
{% endfor %}
</ul>
Info
<form style="max-width: 50%;" action="/shop/{{ arcade.id }}/info.update" method="post" id="shop_info">
<div class="row">
<div class="col mb-3">
<label for="name" class="form-label">Name</label>
<input type="text" class="form-control" id="name" name="name" maxlength="255" value="{{ arcade.name if arcade.name is not none else "" }}">
</div>
<div class="col mb-3">
<label for="nickname" class="form-label">Nickname</label>
<input type="text" class="form-control" id="nickname" name="nickname" maxlength="255" value="{{ arcade.nickname if arcade.nickname is not none else "" }}">
</div>
</div>
<div class="row">
<div class="col mb-3">
<label for="country" class="form-label">Country</label>
<select id="country" name="country" class="form-select bg-dark text-white">
<option value="JPN" {{ 'selected' if arcade.country == 'JPN' else ''}}>Japan</option>
<option value="USA" {{ 'selected' if arcade.country == 'USA' else ''}}>USA</option>
<option value="HKG" {{ 'selected' if arcade.country == 'HKG' else ''}}>Hong Kong</option>
<option value="SGP" {{ 'selected' if arcade.country == 'SGP' else ''}}>Singapore</option>
<option value="KOR" {{ 'selected' if arcade.country == 'KOR' else ''}}>South Korea</option>
<option value="TWN" {{ 'selected' if arcade.country == 'TWN' else ''}}>Taiwan</option>
<option value="CHN" {{ 'selected' if arcade.country == 'CHN' else ''}}>China</option>
<option value="AUS" {{ 'selected' if arcade.country == 'AUS' else ''}}>Australia</option>
<option value="IDN" {{ 'selected' if arcade.country == 'IDN' else ''}}>Indonesia</option>
<option value="MMR" {{ 'selected' if arcade.country == 'MMR' else ''}}>Myanmar</option>
<option value="MYS" {{ 'selected' if arcade.country == 'MYS' else ''}}>Malaysia</option>
<option value="NZL" {{ 'selected' if arcade.country == 'NZL' else ''}}>New Zealand</option>
<option value="PHL" {{ 'selected' if arcade.country == 'PHL' else ''}}>Philippines</option>
<option value="THA" {{ 'selected' if arcade.country == 'THA' else ''}}>Thailand</option>
<option value="VNM" {{ 'selected' if arcade.country == 'VNM' else ''}}>Vietnam</option>
</select>
</div>
<div class="col mb-3">
<label for="region1" class="form-label">Region 1</label>
<input type="text" class="form-control" id="region1" name="region1" maxlength="255" value="{{ arcade.state if arcade.state is not none else "" }}">
</div>
<div class="col mb-3">
<label for="region2" class="form-label">Region 2</label>
<input type="text" class="form-control" id="region2" name="region2" maxlength="255" value="{{ arcade.city if arcade.city is not none else "" }}">
</div>
<div class="col mb-3">
<label for="tz" class="form-label">Timezone</label>
<input type="text" class="form-control" id="tz" name="tz" placeholder="+09:00" maxlength="255" value="{{ arcade.timezone if arcade.timezone is not none else "" }}">
</div>
</div>
<div class="row">
<div class="col mb-3">
<label for="ip" class="form-label">VPN IP</label>
<input type="text" class="form-control" id="ip" name="ip" maxlength="39" value="{{ arcade.ip if arcade.ip is not none else "" }}">
</div>
</div>
{% if can_edit %}
<div class="row">
<div class="col mb-3">
<input type="submit" value="Update" class="btn btn-primary">
</div>
</div>
{% endif %}
</form>
{% if is_owner or is_acmod %}
<br>
<h2>Arcade Managers&nbsp;<button type="button" class="btn btn-success">Add</button></h2>
<ul style="font-size: 20px;">
{% for u in managers %}
<li>{{ u.name }}:
<label for="is_view_{{ u.user }}" class="form-label">View Arcade</label>
<input type="checkbox" class="form-control-check" id="is_view_{{ u.user }}" name="is_view" {{ 'checked' if u.is_view else ''}}>&nbsp;|
<label for="is_bookkeep_{{ u.user }}" class="form-label">View Bookkeeping</label>
<input type="checkbox" class="form-control-check" id="is_bookkeep_{{ u.user }}" name="is_bookkeep" {{ 'checked' if u.is_bookkeep else ''}}>&nbsp;|
<label for="is_edit_{{ u.user }}" class="form-label">Edit Arcade</label>
<input type="checkbox" class="form-control-check" id="is_edit_{{ u.user }}" name="is_edit" {{ 'checked' if u.is_edit else ''}}>&nbsp;|
<label for="is_owner_{{ u.user }}" class="form-label">Owner</label>
<input type="checkbox" class="form-control-check" id="is_owner_{{ u.user }}" name="is_owner" {{ 'checked' if u.is_owner else ''}}>&nbsp;|
<button type="submit" class="btn btn-primary">Update</button>
<button type="button" class="btn btn-danger">Delete</button>
</li>
{% endfor %}
</ul>
{% endif %}
{% else %}
<h3>Arcade Not Found</h3>
{% endif %}
{% endblock content %}
{% endblock content %}

View File

@@ -1,4 +1,109 @@
{% extends "core/templates/index.jinja" %}
{% block content %}
<h1>Machine Management</h1>
{% endblock content %}
<script type="text/javascript">
function swap_ota() {
let is_cab = document.getElementById("is_cab").checked;
let txt_ota = document.getElementById("ota_channel");
txt_ota.disabled = !is_cab; // enable the OTA channel field only for real cabinets
}
</script>
<h1>Machine: {{machine.serial}}</h1>
<h3>Arcade: <a href=/shop/{{ arcade.id }}>{{ arcade.name }}</a>{% if is_acmod %}&nbsp;<button class="btn btn-danger" data-bs-toggle="modal" data-bs-target="#reassign_modal">Reassign</button>{% endif %}</h3>
{% include "core/templates/widgets/err_banner.jinja" %}
{% if success is defined and success == 1 %}
<div style="background-color: #00AA00; padding: 20px; margin-bottom: 10px; width: 15%;">
Info Updated
</div>
{% endif %}
{% if success is defined and success == 2 %}
<div style="background-color: #00AA00; padding: 20px; margin-bottom: 10px; width: 15%;">
Machine successfully reassigned
</div>
{% endif %}
Info
<form style="max-width: 50%;" action="/cab/{{ machine.id }}/info.update" method="post" id="mech_info">
<div class="row">
<div class="col mb-3">
<label for="game" class="form-label">Game</label>
<input type="text" class="form-control" id="game" name="game" placeholder="SXXX" maxlength="4" value="{{ machine.game if machine.game is not none else "" }}">
</div>
<div class="col mb-3">
<label for="country" class="form-label">Country Override</label>
<select id="country" name="country" class="form-select bg-dark text-white">
<option value="" {{ 'selected' if machine.country is none else ''}}>Same As Arcade</option>
<option value="JPN" {{ 'selected' if machine.country == 'JPN' else ''}}>Japan</option>
<option value="USA" {{ 'selected' if machine.country == 'USA' else ''}}>USA</option>
<option value="HKG" {{ 'selected' if machine.country == 'HKG' else ''}}>Hong Kong</option>
<option value="SGP" {{ 'selected' if machine.country == 'SGP' else ''}}>Singapore</option>
<option value="KOR" {{ 'selected' if machine.country == 'KOR' else ''}}>South Korea</option>
<option value="TWN" {{ 'selected' if machine.country == 'TWN' else ''}}>Taiwan</option>
<option value="CHN" {{ 'selected' if machine.country == 'CHN' else ''}}>China</option>
<option value="AUS" {{ 'selected' if machine.country == 'AUS' else ''}}>Australia</option>
<option value="IDN" {{ 'selected' if machine.country == 'IDN' else ''}}>Indonesia</option>
<option value="MMR" {{ 'selected' if machine.country == 'MMR' else ''}}>Myanmar</option>
<option value="MYS" {{ 'selected' if machine.country == 'MYS' else ''}}>Malaysia</option>
<option value="NZL" {{ 'selected' if machine.country == 'NZL' else ''}}>New Zealand</option>
<option value="PHL" {{ 'selected' if machine.country == 'PHL' else ''}}>Philippines</option>
<option value="THA" {{ 'selected' if machine.country == 'THA' else ''}}>Thailand</option>
<option value="VNM" {{ 'selected' if machine.country == 'VNM' else ''}}>Vietnam</option>
</select>
</div>
<div class="col mb-3">
<label for="tz" class="form-label">Timezone Override</label>
<input type="text" class="form-control" id="tz" name="tz" placeholder="+09:00" maxlength="6" value="{{ machine.timezone if machine.timezone is not none else "" }}">
</div>
</div>
<div class="row">
<div class="col mb-3">
<input type="checkbox" class="form-control-check" id="is_cab" name="is_cab" {{ 'checked' if machine.is_cab else ''}} onchange="swap_ota()">
<label for="is_cab" class="form-label">Real Cabinet</label>
</div>
<div class="col mb-3">
<input type="text" class="form-control-check" id="ota_channel" name="ota_channel" value={{ machine.ota_channel }} {{ 'disabled' if not machine.is_cab else '' }}>
<label for="ota_channel" class="form-label">OTA Update Channel</label>
</div>
<div class="col mb-3">
</div>
</div>
<div class="row">
<div class="col mb-3">
<label for="memo" class="form-label">Memo</label>
<input type="text" class="form-control" id="memo" name="memo" maxlength="255" value="{{ machine.memo if machine.memo is not none else "" }}">
</div>
</div>
{% if can_edit %}
<div class="row">
<div class="col mb-3">
<input type="submit" value="Update" class="btn btn-primary">
</div>
</div>
{% endif %}
</form>
{% if is_acmod %}
<form id="frm_reassign" method="post" action="/cab/{{ machine.id }}/reassign" style="outline: 0px;">
<div class="modal" tabindex="-1" id="reassign_modal">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Reassign {{ machine.serial }}</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<div class="modal-body">
<p>This will reassign this cabinet from the current arcade "{{ arcade.name }}" to the arcade whose ID you enter below.</p>
<label for="new_arcade" class="form-label">New Arcade</label>
<input type="text" class="form-control" id="new_arcade" name="new_arcade" value="{{ arcade.id }}">
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
<button type="submit" class="btn btn-primary">Save changes</button>
</div>
</div>
</div>
</div>
</form>
{% endif %}
<script type="text/javascript">
swap_ota();
</script>
{% endblock content %}

View File

@@ -137,6 +137,11 @@
<input type="text" class="form-control" id="shopIp" name="shopIp">
</div>
<br />
<div class="form-group">
<label for="shopOwner">Owner User ID</label>
<input type="text" class="form-control" id="shopOwner" name="shopOwner">
</div>
<br />
<button type="submit" class="btn btn-primary">Add</button>
</form>
</div>

View File

@@ -159,19 +159,10 @@ Update successful
</form>
{% if arcades is defined and arcades|length > 0 %}
<h2>Arcades</h2>
<h2>Arcades you manage</h2>
<ul>
{% for a in arcades %}
<li><h3>{{ a.name }}</h3>
{% if a.machines|length > 0 %}
<table>
<tr><th>Serial</th><th>Game</th><th>Last Seen</th></tr>
{% for m in a.machines %}
<tr><td>{{ m.serial }}</td><td>{{ m.game }}</td><td>{{ m.last_seen }}</td></tr>
{% endfor %}
</table>
{% endif %}
</li>
<li><h3><a href=/shop/{{a.id}}>{{ a.name }}</a></h3></li>
{% endfor %}
</ul>
{% endif %}

View File

@@ -27,6 +27,10 @@ Access Denied
Card already registered
{% elif error == 13 %}
AmusementIC Access Codes beginning with 5 must have IDm
{% elif error == 14 %}
Arcade does not exist
{% elif error == 15 %}
Some info failed to update
{% else %}
An unknown error occurred
{% endif %}

View File

@@ -86,11 +86,11 @@ class BaseServlet:
return (False, [], [])
async def render_POST(self, request: Request) -> bytes:
self.logger.warn(f"Game Does not dispatch POST")
self.logger.warning(f"Game Does not dispatch POST")
return Response()
async def render_GET(self, request: Request) -> bytes:
self.logger.warn(f"Game Does not dispatch GET")
self.logger.warning(f"Game Does not dispatch GET")
return Response()
class TitleServlet:
@@ -149,41 +149,3 @@ class TitleServlet:
self.logger.info(
f"Serving {len(self.title_registry)} game codes {'on port ' + str(core_cfg.server.port) if core_cfg.server.port > 0 else ''}"
)
def render_GET(self, request: Request, endpoints: dict) -> bytes:
code = endpoints["title"]
subaction = endpoints['subaction']
if code not in self.title_registry:
self.logger.warning(f"Unknown game code {code}")
request.setResponseCode(404)
return b""
index = self.title_registry[code]
handler = getattr(index, f"{subaction}", None)
if handler is None:
self.logger.error(f"{code} does not have handler for GET subaction {subaction}")
request.setResponseCode(500)
return b""
return handler(request, code, endpoints)
def render_POST(self, request: Request, endpoints: dict) -> bytes:
code = endpoints["title"]
subaction = endpoints['subaction']
if code not in self.title_registry:
self.logger.warning(f"Unknown game code {code}")
request.setResponseCode(404)
return b""
index = self.title_registry[code]
handler = getattr(index, f"{subaction}", None)
if handler is None:
self.logger.error(f"{code} does not have handler for POST subaction {subaction}")
request.setResponseCode(500)
return b""
endpoints.pop("title")
endpoints.pop("subaction")
return handler(request, code, endpoints)

View File

@@ -1,18 +1,48 @@
from typing import Dict, Any, Optional
from types import ModuleType
from starlette.requests import Request
import logging
import importlib
from os import walk
import jwt
import logging
from base64 import b64decode
from datetime import datetime, timezone
from os import walk
from types import ModuleType
from typing import Any, Dict, Optional
import math
import jwt
from starlette.requests import Request
from .config import CoreConfig
class _MissingSentinel:
__slots__: tuple[str, ...] = ()
def __eq__(self, other) -> bool:
return False
def __bool__(self) -> bool:
return False
def __hash__(self) -> int:
return 0
def __repr__(self):
return "..."
MISSING: Any = _MissingSentinel()
"""This is different from `None` in that its type is `Any`, and so it can be used
as a placeholder for values that are *definitely* going to be initialized,
so they don't have to be typed as `T | None`, which makes type checkers
angry when an attribute is accessed.
This can also be used for when `None` has actual meaning as a value, and so a
separate value is needed to mean "unset"."""
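# Illustrative only, not part of this module: a hypothetical attribute declared as
#     some_setting: Any = MISSING
# keeps its type as `Any` rather than `T | None`, and "unset" stays distinct from an
# explicit None. Because __eq__ always returns False, check for it with `is`:
#     if some_setting is MISSING: ...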
class Utils:
real_title_port = None
real_title_port_ssl = None
@classmethod
def get_all_titles(cls) -> Dict[str, ModuleType]:
ret: Dict[str, Any] = {}
@@ -36,27 +66,58 @@ class Utils:
def get_ip_addr(cls, req: Request) -> str:
ip = req.headers.get("x-forwarded-for", req.client.host)
return ip.split(", ")[0]
@classmethod
def get_title_port(cls, cfg: CoreConfig):
if cls.real_title_port is not None: return cls.real_title_port
if cls.real_title_port is not None:
return cls.real_title_port
cls.real_title_port = (
cfg.server.proxy_port
if cfg.server.is_using_proxy and cfg.server.proxy_port
else cfg.server.port
)
cls.real_title_port = cfg.server.proxy_port if cfg.server.is_using_proxy and cfg.server.proxy_port else cfg.server.port
return cls.real_title_port
@classmethod
def get_title_port_ssl(cls, cfg: CoreConfig):
if cls.real_title_port_ssl is not None: return cls.real_title_port_ssl
if cls.real_title_port_ssl is not None:
return cls.real_title_port_ssl
cls.real_title_port_ssl = (
cfg.server.proxy_port_ssl
if cfg.server.is_using_proxy and cfg.server.proxy_port_ssl
else 443
)
cls.real_title_port_ssl = cfg.server.proxy_port_ssl if cfg.server.is_using_proxy and cfg.server.proxy_port_ssl else 443
return cls.real_title_port_ssl
def create_sega_auth_key(aime_id: int, game: str, place_id: int, keychip_id: str, b64_secret: str, exp_seconds: int = 86400, err_logger: str = 'aimedb') -> Optional[str]:
def floor_to_nearest_005(version: int) -> int:
return (version // 5) * 5
def create_sega_auth_key(
aime_id: int,
game: str,
place_id: int,
keychip_id: str,
b64_secret: str,
exp_seconds: int = 86400,
err_logger: str = "aimedb",
) -> Optional[str]:
logger = logging.getLogger(err_logger)
try:
return jwt.encode({ "aime_id": aime_id, "game": game, "place_id": place_id, "keychip_id": keychip_id, "exp": int(datetime.now(tz=timezone.utc).timestamp()) + exp_seconds }, b64decode(b64_secret), algorithm="HS256")
return jwt.encode(
{
"aime_id": aime_id,
"game": game,
"place_id": place_id,
"keychip_id": keychip_id,
"exp": int(datetime.now(tz=timezone.utc).timestamp()) + exp_seconds,
},
b64decode(b64_secret),
algorithm="HS256",
)
except jwt.InvalidKeyError:
logger.error("Failed to encode Sega Auth Key because the secret is invalid!")
return None
@@ -64,10 +125,19 @@ def create_sega_auth_key(aime_id: int, game: str, place_id: int, keychip_id: str
logger.error(f"Unknown exception occoured when encoding Sega Auth Key! {e}")
return None
def decode_sega_auth_key(token: str, b64_secret: str, err_logger: str = 'aimedb') -> Optional[Dict]:
def decode_sega_auth_key(
token: str, b64_secret: str, err_logger: str = "aimedb"
) -> Optional[Dict]:
logger = logging.getLogger(err_logger)
try:
return jwt.decode(token, "secret", b64decode(b64_secret), algorithms=["HS256"], options={"verify_signature": True})
return jwt.decode(
token,
"secret",
b64decode(b64_secret),
algorithms=["HS256"],
options={"verify_signature": True},
)
except jwt.ExpiredSignatureError:
logger.error("Sega Auth Key failed to validate due to an expired signature!")
return None
@@ -83,4 +153,3 @@ def decode_sega_auth_key(token: str, b64_secret: str, err_logger: str = 'aimedb'
except Exception as e:
logger.error(f"Unknown exception occoured when decoding Sega Auth Key! {e}")
return None

View File

@@ -1,14 +1,15 @@
#!/usr/bin/env python3
import argparse
import logging
from os import mkdir, path, access, W_OK
import yaml
import asyncio
import logging
from os import W_OK, access, environ, mkdir, path
import yaml
from core.data import Data
from core.config import CoreConfig
from core.data import Data
if __name__ == "__main__":
async def main():
parser = argparse.ArgumentParser(description="Database utilities")
parser.add_argument(
"--config", "-c", type=str, help="Config folder to use", default="config"
@@ -25,10 +26,11 @@ if __name__ == "__main__":
parser.add_argument("action", type=str, help="create, upgrade, downgrade, create-owner, migrate, create-revision, create-autorevision")
args = parser.parse_args()
environ["ARTEMIS_CFG_DIR"] = args.config
cfg = CoreConfig()
if path.exists(f"{args.config}/core.yaml"):
cfg_dict = yaml.safe_load(open(f"{args.config}/core.yaml"))
cfg_dict.get("database", {})["loglevel"] = "info"
cfg.update(cfg_dict)
if not path.exists(cfg.server.log_dir):
@@ -42,10 +44,8 @@ if __name__ == "__main__":
data = Data(cfg)
loop = asyncio.get_event_loop()
if args.action == "create":
data.create_database()
await data.create_database()
elif args.action == "upgrade":
data.schema_upgrade(args.version)
@@ -57,16 +57,20 @@ if __name__ == "__main__":
data.schema_downgrade(args.version)
elif args.action == "create-owner":
loop.run_until_complete(data.create_owner(args.email, args.access_code))
await data.create_owner(args.email, args.access_code)
elif args.action == "migrate":
loop.run_until_complete(data.migrate())
await data.migrate()
elif args.action == "create-revision":
loop.run_until_complete(data.create_revision(args.message))
await data.create_revision(args.message)
elif args.action == "create-autorevision":
loop.run_until_complete(data.create_revision_auto(args.message))
await data.create_revision_auto(args.message)
else:
logging.getLogger("database").info(f"Unknown action {args.action}")
if __name__ == "__main__":
asyncio.run(main())

View File

@@ -6,6 +6,7 @@ services:
volumes:
- ./aime:/app/aime
- ./configs/config:/app/config
- ./cert:/app/cert
environment:
CFG_DEV: 1
@@ -14,7 +15,8 @@ services:
CFG_CORE_MEMCACHED_HOSTNAME: ma.memcached
CFG_CORE_AIMEDB_KEY: <INSERT AIMEDB KEY HERE>
CFG_CHUNI_SERVER_LOGLEVEL: debug
##Note: comment out ports 80 and 8443 when you plan to use nginx
ports:
- "80:80"
- "8443:8443"
@@ -64,3 +66,18 @@ services:
ports:
- "9090:8080"
##Note: uncomment to use nginx with artemis; don't forget to comment out ports 80 and 8443 on the artemis service
#nginx:
# hostname: ma.nginx
# image: nginx:latest
# ports:
# - "80:80"
# - "443:443"
# - "8443:8443"
# volumes:
##Note: copy example_config/example_nginx.conf to the configs/nginx folder, edit it, and rename it to nginx.conf
# - ./configs/nginx:/etc/nginx/conf.d
# - ./cert:/etc/nginx/cert
# - ./logs/nginx:/var/log/nginx
# depends_on:
# - app

View File

@@ -26,6 +26,7 @@
- `name`: Name of the database the server should expect. Default `aime`
- `port`: Port the database server is listening on. Default `3306`
- `protocol`: Protocol used in the connection string, e.g. `mysql` would result in `mysql://...`. Default `mysql`
- `ssl_enabled`: Enforce SSL to be used in the connection string. Default `False`
- `sha2_password`: Whether or not the password in the connection string should be hashed via SHA2. Default `False`
- `loglevel`: Logging level for the database. Default `info`
- `memcached_host`: Host of the memcached server. Default `localhost`
@@ -40,6 +41,13 @@
- `loglevel`: Logging level for the allnet server. Default `info`
- `allow_online_updates`: Allow allnet to distribute online updates via DownloadOrders. This system is currently non-functional, so leave it disabled. Default `False`
- `update_cfg_folder`: Folder where delivery INI files will be checked for. Ignored if `allow_online_updates` is `False`. Default `""`
- `allnet_lite_keys`: Allnet Lite (Chinese Allnet) PowerOn/DownloadOrder unique keys. Default `[]`
```yaml
allnet_lite_keys:
"SDJJ": [ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 ]
"SDHJ": [ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 ]
"SDGB": [ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 ]
```
## Billing
- `standalone`: Whether the billing server should launch its own servlet on its own port, or be part of the main servlet on the default port. Setting this to `True` requires that you have `ssl_key` and `ssl_cert` set. Default `False`
- `loglevel`: Logging level for the billing server. Default `info`
@@ -55,3 +63,8 @@
- `key`: Key to encrypt/decrypt aimedb requests and responses. MUST be set or the server will not start. If set incorrectly, your server will not properly handle aimedb requests. Default `""`
- `id_secret`: Base64-encoded JWT secret for Sega Auth IDs. Leaving this blank disables this feature. Default `""`
- `id_lifetime_seconds`: Number of seconds a generated JWT should be valid for. Default `86400` (1 day)
## Chimedb
- `enable`: Whether or not chimedb should run. Default `False`
- `loglevel`: Logging level for the chimedb server. Default `info`
- `key`: Key to hash chimedb requests and responses. MUST be set or the server will not start. If set incorrectly, your server will not properly handle chimedb requests. Default `""`

View File

@@ -26,11 +26,14 @@ python dbutils.py migrate
- [CHUNITHM](#chunithm)
- [crossbeats REV.](#crossbeats-rev)
- [maimai DX](#maimai-dx)
- [Project Diva](#hatsune-miku-project-diva)
- [O.N.G.E.K.I.](#o-n-g-e-k-i)
- [Card Maker](#card-maker)
- [WACCA](#wacca)
- [Sword Art Online Arcade](#sao)
- [Initial D Zero](#initial-d-zero)
- [Initial D THE ARCADE](#initial-d-the-arcade)
- [Pokken Tournament](#pokken)
# Supported Games
@@ -57,13 +60,16 @@ Games listed below have been tested and confirmed working.
### SDHD/SDBT
| Version ID | Version Name |
| ---------- | ------------------- |
| 11 | CHUNITHM NEW!! |
| 12 | CHUNITHM NEW PLUS!! |
| 13 | CHUNITHM SUN |
| 14 | CHUNITHM SUN PLUS |
| 15 | CHUNITHM LUMINOUS |
| Version ID | Version Name |
| ---------- | ---------------------- |
| 11 | CHUNITHM NEW!! |
| 12 | CHUNITHM NEW PLUS!! |
| 13 | CHUNITHM SUN |
| 14 | CHUNITHM SUN PLUS |
| 15 | CHUNITHM LUMINOUS |
| 16 | CHUNITHM LUMINOUS PLUS |
| 17 | CHUNITHM VERSE |
| 18 | CHUNITHM X-VERSE |
### Importer
@@ -74,18 +80,21 @@ In order to use the importer locate your game installation folder and execute:
python read.py --game SDBT --version <version ID> --binfolder /path/to/game/folder --optfolder /path/to/game/option/folder
```
The importer for Chunithm will import: Events, Music, Charge Items and Avatar Accessories.
The importer for Chunithm will import: Events, Music, Charge Items, Avatar Accessories, Nameplates, Characters, Trophies, Map Icons, and System Voices.
### Config
Config file is located in `config/chuni.yaml`.
| Option | Info |
|------------------|----------------------------------------------------------------------------------------------------------------|
| `news_msg` | If this is set, the news at the top of the main screen will be displayed (up to Chunithm Paradise Lost) |
| `name` | If this is set, all players that are not on a team will use this one by default. |
| `use_login_bonus`| This is used to enable the login bonuses |
| `crypto` | This option is used to enable the TLS Encryption |
| Option | Info |
|-----------------------|-------------------------------------------------------------------------------------------------------------------------------------------|
| `news_msg` | If this is set, the news at the top of the main screen will be displayed (up to Chunithm Paradise Lost) |
| `name` | If this is set, all players that are not on a team will use this one by default. |
| `use_login_bonus` | This is used to enable the login bonuses |
| `stock_tickets` | If this is set, specifies tickets to auto-stock at login. Format is a comma-delimited list of IDs. Defaults to None |
| `stock_count` | Ignored if stock_tickets is not specified. Number to stock of each ticket. Defaults to 99 |
| `forced_item_unlocks` | Frontend UI customization overrides that allow all items of given types to be used (instead of just those unlocked/purchased by the user) |
| `crypto` | This option is used to enable the TLS Encryption |
If you would like to use network encryption, add the keys to the `keys` section under `crypto`, where the key
@@ -101,6 +110,7 @@ crypto:
keys:
13: ["0000000000000000000000000000000000000000000000000000000000000000", "00000000000000000000000000000000", "0000000000000000"]
"13_int": ["0000000000000000000000000000000000000000000000000000000000000000", "00000000000000000000000000000000", "0000000000000000", 42]
"13_chn": ["0000000000000000000000000000000000000000000000000000000000000000", "00000000000000000000000000000000", "0000000000000000", 8]
```
### Database upgrade
@@ -148,12 +158,15 @@ INSERT INTO aime.chuni_profile_team (teamName) VALUES (<teamName>);
Team names can be regular ASCII, and they will be displayed ingame.
### Favorite songs
You can set the songs that will be in a user's Favorite Songs category using the following SQL entries:
Favorites can be set through the Frontend Web UI for songs previously played. Alternatively, you can set the songs that will be in a user's Favorite Songs category using the following SQL entries:
```sql
INSERT INTO aime.chuni_item_favorite (user, version, favId, favKind) VALUES (<user>, <version>, <songId>, 1);
```
The songId is based on the actual ID within your version of Chunithm.
### Profile Customization
The Frontend Web UI supports configuration of the userbox, avatar (NEW!! and newer), map icon (AMAZON and newer), and system voice (AMAZON and newer).
## crossbeats REV.
@@ -185,37 +198,40 @@ Config file is located in `config/cxb.yaml`.
### Presents
Presents are items given to the user when they log in, with a little animation (for example, the KOP song was given to the finalists as a present). To add a present, insert it into the `mai2_item_present` table. In that table, a NULL version means any version, a NULL user means any user, a NULL start date means always open, and a NULL end date means it never expires. Below is a list of presents one might wish to add; a rough SQL sketch follows the table:
| Game Version | Item ID | Item Kind | Item Description | Present Description |
|--------------|---------|-----------|-------------------------------------------------|------------------------------------------------|
| BUDDiES (21) | 409505 | Icon (3) | 旅行スタンプ(月面基地) (Travel Stamp - Moon Base) | Officially obtained on the webui with a serial |
| | | | | number, for project raputa |
| Game Version | Item ID | Item Kind | Item Description | Present Description |
|--------------|---------|----------------------|--------------------------------------------|----------------------------------------------------------------------------|
| BUDDiES (21) | 409505 | Icon (3) | 旅行スタンプ(月面基地) (Travel Stamp - Moon Base) | Officially obtained on the webui with a serial number, for project raputa |
| PRiSM (23) | 3 | KaleidxScopeKey (15) | 紫の鍵 (Purple Key) | Officially obtained on the webui with a serial number, for KaleidxScope |
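As a rough illustration, granting the PRiSM Purple Key present from the table above might look like the insert below. The column names are assumptions inferred from the description (nullable version, user, and start/end dates plus an item kind and ID), so verify them against the actual `mai2_item_present` schema before using this:
```sql
-- assumed column names; NULL version/user/dates mean "any version", "any user", "always open", "never expires"
INSERT INTO aime.mai2_item_present (version, user, itemKind, itemId, startDate, endDate)
VALUES (NULL, NULL, 15, 3, NULL, NULL);
```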
### Versions
| Game Code | Version ID | Version Name |
| --------- | ---------- | ----------------------- |
| SBXL | 0 | maimai |
| SBXL | 1 | maimai PLUS |
| SBZF | 2 | maimai GreeN |
| SBZF | 3 | maimai GreeN PLUS |
| SDBM | 4 | maimai ORANGE |
| SDBM | 5 | maimai ORANGE PLUS |
| SDCQ | 6 | maimai PiNK |
| SDCQ | 7 | maimai PiNK PLUS |
| SDDK | 8 | maimai MURASAKi |
| SDDK | 9 | maimai MURASAKi PLUS |
| SDDZ | 10 | maimai MiLK |
| SDDZ | 11 | maimai MiLK PLUS |
| SDEY | 12 | maimai FiNALE |
| SDEZ | 13 | maimai DX |
| SDEZ | 14 | maimai DX PLUS |
| SDEZ | 15 | maimai DX Splash |
| SDEZ | 16 | maimai DX Splash PLUS |
| SDEZ | 17 | maimai DX UNiVERSE |
| SDEZ | 18 | maimai DX UNiVERSE PLUS |
| SDEZ | 19 | maimai DX FESTiVAL |
| SDEZ | 20 | maimai DX FESTiVAL PLUS |
| SDEZ | 21 | maimai DX BUDDiES |
|----------|------------|-------------------------|
| SBXL | 0 | maimai |
| SBXL | 1 | maimai PLUS |
| SBZF | 2 | maimai GreeN |
| SBZF | 3 | maimai GreeN PLUS |
| SDBM | 4 | maimai ORANGE |
| SDBM | 5 | maimai ORANGE PLUS |
| SDCQ | 6 | maimai PiNK |
| SDCQ | 7 | maimai PiNK PLUS |
| SDDK | 8 | maimai MURASAKi |
| SDDK | 9 | maimai MURASAKi PLUS |
| SDDZ | 10 | maimai MiLK |
| SDDZ | 11 | maimai MiLK PLUS |
| SDEY | 12 | maimai FiNALE |
| SDEZ | 13 | maimai DX |
| SDEZ | 14 | maimai DX PLUS |
| SDEZ | 15 | maimai DX Splash |
| SDEZ | 16 | maimai DX Splash PLUS |
| SDEZ | 17 | maimai DX UNiVERSE |
| SDEZ | 18 | maimai DX UNiVERSE PLUS |
| SDEZ | 19 | maimai DX FESTiVAL |
| SDEZ | 20 | maimai DX FESTiVAL PLUS |
| SDEZ | 21 | maimai DX BUDDiES |
| SDEZ | 22 | maimai DX BUDDiES PLUS |
| SDEZ | 23 | maimai DX PRiSM |
| SDEZ | 24 | maimai DX PRiSM PLUS |
### Importer
@@ -244,6 +260,43 @@ python dbutils.py upgrade
Pre-Dx uses the same database as DX, so only upgrade using the SDEZ game code!
### Config
Config file is located in `config/mai2.yaml`.
| Option | Info |
|-----------------------|-------------------------------------------------------------------------------------------------------------------------------------------|
| `crypto` | This option is used to enable the TLS Encryption |
If you would like to use network encryption, add the keys to the `keys` section under `crypto`, where the key
is the version ID for Japanese (SDEZ) versions and `"{versionID}_int"` for Export (SDGA) versions, and the value
is an array containing `[key, iv, salt]` in order.
Just copy your salt in here, no need to convert anything.
```yaml
crypto:
encrypted_only: False
keys:
23: ["0000000000000000000000000000000000000000000000000000000000000000", "00000000000000000000000000000000", "0000000000000000"]
"23_int": ["0000000000000000000000000000000000000000000000000000000000000000", "00000000000000000000000000000000", "0000000000000000"]
"23_chn": ["0000000000000000000000000000000000000000000000000000000000000000", "00000000000000000000000000000000", "0000000000000000"]
```
| Option | Info |
|-----------------|------------------------------------------------------|
| `chart_deliver` | This option is used to deliver charts to the client |
If you would like to use chart delivery, set this option to `True` and configure the directory to read from. Then put charts in your chart folder like this:
```
chart_folder/23/music001736/001736_00.ma2
chart_folder/23/music001736/001736_01.ma2 # PRiSM
chart_folder/24/music001901/001901_00.ma2
chart_folder/24/music001901/001901_01.ma2 # PRiSM PLUS
```
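For reference, a minimal `mai2.yaml` sketch matching the layout above. The exact nesting of `chart_deliver` (top-level versus under `deliver`) is hard to tell from the flattened config diff elsewhere in this changeset, so treat the placement as an assumption:
```yaml
chart_deliver:                   # PRiSM and later
  enable: True
  chart_folder: "chart_folder"   # holds e.g. 23/music001736/001736_00.ma2
```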
## Hatsune Miku Project Diva
### SBZV
@@ -290,6 +343,23 @@ Always make sure your database (tables) are up-to-date:
python dbutils.py upgrade
```
### Using NGINX
Diva's netcode does not send a `Host` header with its network requests. This renders it incompatible with NGINX as configured in the example config, because nginx relies on the header to determine how to proxy the request. If you'd still like to use NGINX with Diva, please see the sample config below.
```conf
server {
listen 80 default_server;
server_name _;
location / {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass_request_headers on;
proxy_pass http://127.0.0.1:8080/;
}
}
```
## O.N.G.E.K.I.
### SDDT
@@ -435,7 +505,10 @@ After that, on next login the present should be received (or whenever it suppose
* FESTiVAL: Yes (added in A031)
* FESTiVAL PLUS: Yes (added in A035)
* BUDDiES: Yes (added in A039)
* O.N.G.E.K.I. bright MEMORY: Yes
* BUDDiES PLUS: Yes (added in A047)
* O.N.G.E.K.I.:
* bright MEMORY: Yes
* bright MEMORY Act.3: Yes (added in A046)
### Importer
@@ -648,21 +721,32 @@ python dbutils.py upgrade
```
### Notes
- Defrag Match will crash at loading
- Co-Op Online is not supported
- Shop is displayed but cannot purchase heroes or items
- Defrag Match and online coop requires a cloud instance of Photon and a working application ID
- Player title is currently static and cannot be changed in-game
- QR Card Scanning currently only loads a static hero
- Ex-quests progression not supported yet
- QR Card Scanning of existing cards requires them to be registered on the webui
- Daily Missions not implemented
- EX TOWER 1,2 & 3 are not yet supported
- Daily Yui coin not yet fixed
- Terminal functionality is almost entirely untested
### Credits for SAO support:
- Midorica - Network Support
- Dniel97 - Helping with network base
- tungnotpunk - Source
- Hay1tsme - fixing many issues with the original implementation
## Initial D Zero
### SDDF
| Version ID | Version Name |
| ---------- | -------------------- |
| 0 | Initial D Zero v1.10 |
| 1 | Initial D Zero v1.30 |
| 2 | Initial D Zero v2.10 |
| 3 | Initial D Zero v2.30 |
### Info
TODO, probably just leave disabled unless you're doing development things for it.
## Initial D THE ARCADE
@@ -794,3 +878,82 @@ python dbutils.py upgrade
A huge thanks to all people who helped shaping this project to what it is now and don't want to be mentioned here.
## Pokken
### SDAK
| Version ID | Version Name |
| ---------- | ------------ |
| 0 | Pokken |
### Config
Config file is `pokken.yaml`
#### server
| Option | Info | Default |
| ------ | ---- | ------- |
| `hostname` | Hostname override for allnet to tell the game where to connect. Useful for local setups that need to use a different hostname for pokken's proxy. Otherwise, it should match `server`->`hostname` in `core.yaml`. | `localhost` |
| `enabled` | `True` if the pokken service should be enabled. `False` otherwise. | `True` |
| `loglevel` | String indicating how verbose pokken logs should be. Acceptable values are `debug`, `info`, `warn`, and `error`. | `info` |
| `auto_register` | For games that don't use aimedb, this controls whether connecting cards that aren't registered should automatically be registered when making a profile. Set to `False` to require cards to be already registered before being usable with Pokken. | `True` |
| `enable_matching` | If `True`, allow non-local matching. This doesn't currently work because BIWA, the matching protocol the game uses, is not understood, so this should be set to `False`. | `False` |
| `stun_server_host` | Hostname of the STUN server the game will use for matching. | `stunserver.stunprotocol.org` (might not work anymore? recommend changing) |
| `stun_server_port` | Port for the external STUN server. Will probably be moved to the `ports` section in the future. | `3478` |
#### ports
| Option | Info | Default |
| ------ | ---- | ------- |
| `game` | Override for the title server port sent by allnet. Useful for local setups utilizing NGINX. | `9000` |
| `admission` | Port for the admission server used in global matching. May be obsoleted later. | `9001` |
### Connecting to Artemis
Pokken is a bit tricky to get working because it hard-requires the connection to be HTTPS. This is simplified somewhat by Pokken not validating the certificate in any way, shape or form (it can be self-signed, expired, for a different domain, etc.), but it does have to be there. The workaround is to spin up a local NGINX (or other proxy) instance and point traffic back to artemis. See below for a sample nginx config:
`nginx.conf`
```conf
# This example assumes your artemis instance is configured to listen on port 8080, and your certs exist at /path/to/cert and are called title.crt and title.key.
server {
listen 443 ssl;
server_name your.hostname.here;
ssl_certificate /path/to/cert/title.crt;
ssl_certificate_key /path/to/cert/title.key;
ssl_session_timeout 1d;
ssl_session_cache shared:MozSSL:10m;
ssl_session_tickets off;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2 TLSv1.3;
ssl_ciphers "ALL:@SECLEVEL=0";
ssl_prefer_server_ciphers off;
location / {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass_request_headers on;
proxy_pass http://127.0.0.1:8080/;
}
}
```
`pokken.yaml`
```yaml
server:
hostname: "your.hostname.here"
enable: True
loglevel: "info"
auto_register: True
enable_matching: False
stun_server_host: "stunserver.stunprotocol.org"
stun_server_port: 3478
ports:
game: 443
admission: 9001
```
### Info
The arcade release is missing a few fighters and supports compared to the Switch version. It may be possible to mod these in later, but not much headway has been made on this as far as I know. Mercifully, the game uses the pokedex number (illustration_book_no) wherever possible when referring to both fighters and supports. Customization is entirely done on the webui. Artemis currently only supports changing your name, gender, and support teams, but more is planned for the future.
### Credits
Special thanks to Pocky for pointing me in the right direction in terms of getting this game to function at all, and Lightning and other pokken cab owners for doing testing and reporting bugs/issues.

View File

@@ -8,6 +8,6 @@ version:
chuni: 2.00.00
maimai: 1.20.00
1:
ongeki: 1.35.03
chuni: 2.10.00
maimai: 1.30.00
ongeki: 1.45.01
chuni: 2.25.00
maimai: 1.45.00

View File

@@ -2,12 +2,29 @@ server:
enable: True
loglevel: "info"
news_msg: ""
use_https: False # for CRYSTAL PLUS and later or SUPERSTAR and later
team:
name: ARTEMiS # If this is set, all players that are not on a team will use this one by default.
mods:
use_login_bonus: True
# stock_tickets allows specified ticket IDs to be auto-stocked at login. Format is a comma-delimited string of ticket IDs
# note: quantity is not refreshed on "continue" after set - only on subsequent login
stock_tickets:
stock_count: 99
# Allow use of all available customization items in frontend web ui
# note: This effectively makes every available item appear to be in the user's inventory. It does _not_ override the "disableFlag" setting on individual items
# warning: This can result in pushing a lot of data, especially the userbox items. Recommended for local network use only.
forced_item_unlocks:
map_icons: False
system_voices: False
avatar_accessories: False
nameplates: False
trophies: False
character_icons: False
stages: False
version:
11:
@@ -25,6 +42,15 @@ version:
15:
rom: 2.20.00
data: 2.20.00
16:
rom: 2.25.00
data: 2.25.00
17:
rom: 2.30.00
data: 2.30.00
18:
rom: 2.40.00
data: 2.40.00
crypto:
encrypted_only: False

View File

@@ -27,6 +27,7 @@ database:
name: "aime"
port: 3306
protocol: "mysql"
ssl_enabled: False
sha2_password: False
loglevel: "info"
enable_memcached: True
@@ -44,6 +45,8 @@ allnet:
loglevel: "info"
allow_online_updates: False
update_cfg_folder: ""
save_billing: True
allnet_lite_keys: []
billing:
standalone: True
@@ -62,5 +65,10 @@ aimedb:
id_secret: ""
id_lifetime_seconds: 86400
chimedb:
enable: False
loglevel: "info"
key: ""
mucha:
loglevel: "info"

View File

@@ -1,5 +1,5 @@
server:
enable: True
enable: False
loglevel: "info"
hostname: ""
news: ""

View File

@@ -1,12 +1,17 @@
server:
enable: True
loglevel: "info"
use_https: False # for DX and later
deliver:
enable: False
udbdl_enable: False
content_folder: ""
chart_deliver: #for Prism and later
enable: False
chart_folder: ""
uploads:
photos: False
photos_dir: ""

View File

@@ -66,6 +66,52 @@ server {
}
}
# WAHLAP billing; it uses port 443
# comment this out if running billing standalone
# this still does not work for some reason; for now, set
# billing=127.0.0.1 in segatools.ini while a fix is found
server {
listen 443 ssl;
server_name bl.sys-all.cn;
ssl_certificate /path/to/cert/server.pem;
ssl_certificate_key /path/to/cert/server.key;
ssl_session_timeout 1d;
ssl_session_cache shared:MozSSL:10m;
ssl_session_tickets off;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2 TLSv1.3;
ssl_ciphers "ALL:@SECLEVEL=0";
ssl_prefer_server_ciphers off;
location / {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass_request_headers on;
proxy_pass http://127.0.0.1:8080/;
}
}
server {
listen 443 ssl;
server_name bl.sys-allnet.cn;
ssl_certificate /path/to/cert/server.pem;
ssl_certificate_key /path/to/cert/server.key;
ssl_session_timeout 1d;
ssl_session_cache shared:MozSSL:10m;
ssl_session_tickets off;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2 TLSv1.3;
ssl_ciphers "ALL:@SECLEVEL=0";
ssl_prefer_server_ciphers off;
location / {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass_request_headers on;
proxy_pass http://127.0.0.1:8080/;
}
}
# Frontend, set to redirect to HTTPS. Comment out if you don't intend to use the frontend
server {
listen 80;

View File

@@ -7,12 +7,12 @@ gachas:
enabled_gachas:
- 1011
- 1012
- 1043
- 1067
- 1068
- 1069
- 1070
- 1071
# - 1043
# - 1067
# - 1068
# - 1069
# - 1070
# - 1071
- 1072
- 1073
- 1074
@@ -30,12 +30,22 @@ gachas:
- 1156
- 1163
- 1164
# 5th anniversary gacha
- 1165
# 2024 gacha
- 1166
# 6th anniversary gacha
- 1167
# 2025 gacha
- 1168
version:
6:
card_maker: 1.30.01
7:
card_maker: 1.35.03
8:
card_maker: 1.45.01
crypto:
encrypted_only: False

View File

@@ -6,7 +6,8 @@ import uvicorn
import logging
import asyncio
from core import CoreConfig, AimedbServlette
from core.config import CoreConfig
from core.aimedb import AimedbServlette
async def launch_main(cfg: CoreConfig, ssl: bool) -> None:
if ssl:

read.py
View File

@@ -1,16 +1,16 @@
#!/usr/bin/env python3
import argparse
import re
import os
import yaml
from os import path
import logging
import coloredlogs
import asyncio
import logging
import os
import re
from logging.handlers import TimedRotatingFileHandler
from os import path
from typing import List, Optional
import coloredlogs
import yaml
from core import CoreConfig, Utils
@@ -44,7 +44,7 @@ class BaseReader:
pass
if __name__ == "__main__":
async def main():
parser = argparse.ArgumentParser(description="Import Game Information")
parser.add_argument(
"--game",
@@ -140,8 +140,12 @@ if __name__ == "__main__":
for dir, mod in titles.items():
if args.game in mod.game_codes:
handler = mod.reader(config, args.version, bin_arg, opt_arg, args.extra)
loop = asyncio.get_event_loop()
loop.run_until_complete(handler.read())
await handler.read()
logger.info("Done")
if __name__ == "__main__":
asyncio.run(main())

View File

@@ -8,6 +8,11 @@ Games listed below have been tested and confirmed working. Only game versions ol
+ 1.30
+ 1.35
+ CHUNITHM CHINA
+ NEW
+ 2024 (NEW)
+ 2024 (LUMINOUS)
+ CHUNITHM INTL
+ SUPERSTAR
+ SUPERSTAR PLUS
@@ -15,6 +20,8 @@ Games listed below have been tested and confirmed working. Only game versions ol
+ NEW PLUS
+ SUN
+ SUN PLUS
+ LUMINOUS
+ LUMINOUS PLUS
+ CHUNITHM JP
+ AIR
@@ -30,6 +37,9 @@ Games listed below have been tested and confirmed working. Only game versions ol
+ SUN
+ SUN PLUS
+ LUMINOUS
+ LUMINOUS PLUS
+ VERSE
+ X-VERSE
+ crossbeats REV.
+ Crossbeats REV.
@@ -42,7 +52,16 @@ Games listed below have been tested and confirmed working. Only game versions ol
+ Initial D THE ARCADE
+ Season 2
+ maimai DX
+ maimai DX CHINA
+ DX (Muji)
+ 2021 (Muji)
+ 2022 (Muji)
+ 2023 (FESTiVAL)
+ 2024 (BUDDiES)
+ maimai DX INTL
+ DX
+ DX Plus
+ Splash
+ Splash Plus
+ UNiVERSE
@@ -50,6 +69,22 @@ Games listed below have been tested and confirmed working. Only game versions ol
+ FESTiVAL
+ FESTiVAL PLUS
+ BUDDiES
+ BUDDiES PLUS
+ PRiSM
+ maimai DX
+ DX
+ DX Plus
+ Splash
+ Splash Plus
+ UNiVERSE
+ UNiVERSE PLUS
+ FESTiVAL
+ FESTiVAL PLUS
+ BUDDiES
+ BUDDiES PLUS
+ PRiSM
+ PRiSM PLUS
+ O.N.G.E.K.I.
+ SUMMER
@@ -58,6 +93,7 @@ Games listed below have been tested and confirmed working. Only game versions ol
+ R.E.D. PLUS
+ bright
+ bright MEMORY
+ bright MEMORY Act.3
+ POKKÉN TOURNAMENT
+ Final Online
@@ -83,3 +119,6 @@ Read [Games specific info](docs/game_specific_info.md) for all supported games,
## Production guide
See the [production guide](docs/prod.md) for running a production server.
## Text User Interface
Invoke `tui.py` (with optional `-c <command dir>` parameter) for an interactive TUI to perform management actions (add, edit or delete users, cards, arcades and machines) without needing to spin up the frontend. Requires installing asciimatics via `pip install asciimatics`
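For example, from the repository root:
```
pip install asciimatics
python tui.py
```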

View File

@@ -3,7 +3,7 @@ wheel
pytz
pyyaml
sqlalchemy==1.4.46
mysqlclient
aiomysql
pyopenssl
service_identity
PyCryptodome
@@ -21,4 +21,4 @@ starlette
asyncio
uvicorn
alembic
python-multipart
python-multipart

View File

@@ -8,4 +8,4 @@ index = ChuniServlet
database = ChuniData
reader = ChuniReader
frontend = ChuniFrontend
game_codes = [ChuniConstants.GAME_CODE, ChuniConstants.GAME_CODE_NEW, ChuniConstants.GAME_CODE_INT]
game_codes = [ChuniConstants.GAME_CODE, ChuniConstants.GAME_CODE_NEW, ChuniConstants.GAME_CODE_INT, ChuniConstants.GAME_CODE_CHN]

View File

@@ -1,16 +1,16 @@
import logging
import itertools
import json
import logging
from datetime import datetime, timedelta
from time import strftime
from typing import Any, Dict, List
import pytz
from typing import Dict, Any, List
from core.config import CoreConfig
from titles.chuni.const import ChuniConstants
from titles.chuni.database import ChuniData
from titles.chuni.config import ChuniConfig
SCORE_BUFFER = {}
from titles.chuni.const import ChuniConstants, FavoriteItemKind, ItemKind
from titles.chuni.database import ChuniData
class ChuniBase:
def __init__(self, core_cfg: CoreConfig, game_cfg: ChuniConfig) -> None:
@@ -24,21 +24,38 @@ class ChuniBase:
async def handle_game_login_api_request(self, data: Dict) -> Dict:
"""
Handles the login bonus logic, required for the game because
getUserLoginBonus gets called after getUserItem and therefore the
Handles the login bonus and ticket stock logic, required for the game
because getUserLoginBonus gets called after getUserItem; therefore the
items need to be inserted in the database before they get requested.
Adds a bonusCount after a user logged in after 24 hours, makes sure
loginBonus 30 gets looped, only show the login banner every 24 hours,
adds the bonus to items (itemKind 6)
- Adds a stock for each specified ticket (itemKind 5)
- Adds a bonusCount after a user logged in after 24 hours, makes sure
loginBonus 30 gets looped, only show the login banner every 24 hours,
adds the bonus to items (itemKind 6)
"""
user_id = data["userId"]
# If we want to make certain tickets always available, stock them now
if self.game_cfg.mods.stock_tickets:
for ticket in self.game_cfg.mods.stock_tickets.split(","):
await self.data.item.put_item(
user_id,
{
"itemId": ticket.strip(),
"itemKind": ItemKind.TICKET.value,
"stock": self.game_cfg.mods.stock_count,
"isValid": True,
},
)
# ignore the login bonus if disabled in config
if not self.game_cfg.mods.use_login_bonus:
return {"returnCode": 1}
user_id = data["userId"]
login_bonus_presets = await self.data.static.get_login_bonus_presets(self.version)
login_bonus_presets = await self.data.static.get_login_bonus_presets(
self.version
)
for preset in login_bonus_presets:
# check if a user already has some progress and if not add the
@@ -101,7 +118,7 @@ class ChuniBase:
user_id,
{
"itemId": login_item["presentId"],
"itemKind": 6,
"itemKind": ItemKind.PRESENT.value,
"stock": login_item["itemNum"],
"isValid": True,
},
@@ -182,15 +199,21 @@ class ChuniBase:
async def handle_get_game_message_api_request(self, data: Dict) -> Dict:
return {
"type": data["type"],
"length": 1,
"gameMessageList": [{
"id": 1,
"type": 1,
"message": f"Welcome to {self.core_cfg.server.name} network!" if not self.game_cfg.server.news_msg else self.game_cfg.server.news_msg,
"startDate": "2017-12-05 07:00:00.0",
"endDate": "2099-12-31 00:00:00.0"
}]
"type": data["type"],
"length": 1,
"gameMessageList": [
{
"id": 1,
"type": 1,
"message": (
f"Welcome to {self.core_cfg.server.name} network!"
if not self.game_cfg.server.news_msg
else self.game_cfg.server.news_msg
),
"startDate": "2017-12-05 07:00:00.0",
"endDate": "2099-12-31 00:00:00.0",
}
],
}
async def handle_get_game_ranking_api_request(self, data: Dict) -> Dict:
@@ -202,7 +225,10 @@ class ChuniBase:
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
# if reboot start/end time is not defined use the default behavior of being a few hours ago
if self.core_cfg.title.reboot_start_time == "" or self.core_cfg.title.reboot_end_time == "":
if (
self.core_cfg.title.reboot_start_time == ""
or self.core_cfg.title.reboot_end_time == ""
):
reboot_start = datetime.strftime(
datetime.utcnow() + timedelta(hours=6), self.date_time_format
)
@@ -211,15 +237,29 @@ class ChuniBase:
)
else:
# get current datetime in JST
current_jst = datetime.now(pytz.timezone('Asia/Tokyo')).date()
current_jst = datetime.now(pytz.timezone("Asia/Tokyo")).date()
# parse config start/end times into datetime
reboot_start_time = datetime.strptime(self.core_cfg.title.reboot_start_time, "%H:%M")
reboot_end_time = datetime.strptime(self.core_cfg.title.reboot_end_time, "%H:%M")
reboot_start_time = datetime.strptime(
self.core_cfg.title.reboot_start_time, "%H:%M"
)
reboot_end_time = datetime.strptime(
self.core_cfg.title.reboot_end_time, "%H:%M"
)
# offset datetimes with current date/time
reboot_start_time = reboot_start_time.replace(year=current_jst.year, month=current_jst.month, day=current_jst.day, tzinfo=pytz.timezone('Asia/Tokyo'))
reboot_end_time = reboot_end_time.replace(year=current_jst.year, month=current_jst.month, day=current_jst.day, tzinfo=pytz.timezone('Asia/Tokyo'))
reboot_start_time = reboot_start_time.replace(
year=current_jst.year,
month=current_jst.month,
day=current_jst.day,
tzinfo=pytz.timezone("Asia/Tokyo"),
)
reboot_end_time = reboot_end_time.replace(
year=current_jst.year,
month=current_jst.month,
day=current_jst.day,
tzinfo=pytz.timezone("Asia/Tokyo"),
)
# create strings for use in gameSetting
reboot_start = reboot_start_time.strftime(self.date_time_format)
@@ -240,6 +280,7 @@ class ChuniBase:
"isDumpUpload": "false",
"isAou": "false",
}
async def handle_get_user_activity_api_request(self, data: Dict) -> Dict:
user_activity_list = await self.data.profile.get_profile_activity(
data["userId"], data["kind"]
@@ -262,35 +303,39 @@ class ChuniBase:
}
async def handle_get_user_character_api_request(self, data: Dict) -> Dict:
characters = await self.data.item.get_characters(data["userId"])
if characters is None:
user_id = int(data["userId"])
next_idx = int(data["nextIndex"])
max_ct = int(data["maxCount"])
# add one to the limit so we know if there's a next page of items
rows = await self.data.item.get_characters(
user_id, limit=max_ct + 1, offset=next_idx
)
if rows is None or len(rows) == 0:
return {
"userId": data["userId"],
"userId": user_id,
"length": 0,
"nextIndex": -1,
"userCharacterList": [],
}
character_list = []
next_idx = int(data["nextIndex"])
max_ct = int(data["maxCount"])
for x in range(next_idx, len(characters)):
tmp = characters[x]._asdict()
tmp.pop("user")
for row in rows[:max_ct]:
tmp = row._asdict()
tmp.pop("id")
tmp.pop("user")
character_list.append(tmp)
if len(character_list) >= max_ct:
break
if len(characters) >= next_idx + max_ct:
if len(rows) > max_ct:
next_idx += max_ct
else:
next_idx = -1
return {
"userId": data["userId"],
"userId": user_id,
"length": len(character_list),
"nextIndex": next_idx,
"userCharacterList": character_list,
@@ -316,33 +361,35 @@ class ChuniBase:
return {
"userId": data["userId"],
"length": 0,
"userRecentPlayerList": [], # playUserId, playUserName, playDate, friendPoint
"userRecentPlayerList": [], # playUserId, playUserName, playDate, friendPoint
}
async def handle_get_user_course_api_request(self, data: Dict) -> Dict:
user_course_list = await self.data.score.get_courses(data["userId"])
if user_course_list is None:
user_id = int(data["userId"])
next_idx = int(data["nextIndex"])
max_ct = int(data["maxCount"])
rows = await self.data.score.get_courses(
user_id, limit=max_ct + 1, offset=next_idx
)
if rows is None or len(rows) == 0:
return {
"userId": data["userId"],
"userId": user_id,
"length": 0,
"nextIndex": -1,
"userCourseList": [],
}
course_list = []
next_idx = int(data.get("nextIndex", 0))
max_ct = int(data.get("maxCount", 300))
for x in range(next_idx, len(user_course_list)):
tmp = user_course_list[x]._asdict()
for row in rows[:max_ct]:
tmp = row._asdict()
tmp.pop("user")
tmp.pop("id")
course_list.append(tmp)
if len(user_course_list) >= max_ct:
break
if len(user_course_list) >= next_idx + max_ct:
if len(rows) > max_ct:
next_idx += max_ct
else:
next_idx = -1
@@ -400,85 +447,105 @@ class ChuniBase:
p = await self.data.profile.get_rival(data["rivalId"])
if p is None:
return {}
userRivalData = {
"rivalId": p.user,
"rivalName": p.userName
}
return {
"userId": data["userId"],
"userRivalData": userRivalData
}
userRivalData = {"rivalId": p.user, "rivalName": p.userName}
return {"userId": data["userId"], "userRivalData": userRivalData}
async def handle_get_user_rival_music_api_request(self, data: Dict) -> Dict:
rival_id = data["rivalId"]
next_index = int(data["nextIndex"])
max_count = int(data["maxCount"])
user_rival_music_list = []
user_id = int(data["userId"])
rival_id = int(data["rivalId"])
next_idx = int(data["nextIndex"])
max_ct = int(data["maxCount"])
rival_levels = [int(x["level"]) for x in data["userRivalMusicLevelList"]]
# Fetch all the rival music entries for the user
all_entries = await self.data.score.get_rival_music(rival_id)
rows = await self.data.score.get_scores(
rival_id,
levels=rival_levels,
limit=max_ct + 1,
offset=next_idx,
)
# Process the entries based on max_count and nextIndex
for music in all_entries:
music_id = music["musicId"]
level = music["level"]
score = music["scoreMax"]
rank = music["scoreRank"]
if rows is None or len(rows) == 0:
return {
"userId": user_id,
"rivalId": rival_id,
"nextIndex": -1,
"userRivalMusicList": [],
}
# Create a music entry for the current music_id if it's unique
music_entry = next((entry for entry in user_rival_music_list if entry["musicId"] == music_id), None)
if music_entry is None:
music_entry = {
music_details = [x._asdict() for x in rows]
returned_music_details_count = 0
music_list = []
# note that itertools.groupby will only work on sorted keys, which is already sorted by
# the query in get_scores
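# (with unsorted input, groupby would emit the same musicId as several separate groups)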
for music_id, details_iter in itertools.groupby(
music_details, key=lambda x: x["musicId"]
):
details: list[dict[Any, Any]] = [
{"level": d["level"], "scoreMax": d["scoreMax"]} for d in details_iter
]
music_list.append(
{
"musicId": music_id,
"length": 0,
"userRivalMusicDetailList": []
"length": len(details),
"userRivalMusicDetailList": details,
}
user_rival_music_list.append(music_entry)
)
returned_music_details_count += len(details)
# Create a level entry for the current level if it's unique or has a higher score
level_entry = next((entry for entry in music_entry["userRivalMusicDetailList"] if entry["level"] == level), None)
if level_entry is None:
level_entry = {
"level": level,
"scoreMax": score,
"scoreRank": rank
}
music_entry["userRivalMusicDetailList"].append(level_entry)
elif score > level_entry["scoreMax"]:
level_entry["scoreMax"] = score
level_entry["scoreRank"] = rank
if len(music_list) >= max_ct:
break
# Calculate the length for each "musicId" by counting the unique levels
for music_entry in user_rival_music_list:
music_entry["length"] = len(music_entry["userRivalMusicDetailList"])
# if we returned fewer PBs than we originally asked for from the database, that means
# we queried for the PBs of max_ct + 1 songs.
if returned_music_details_count < len(rows):
next_idx += max_ct
else:
next_idx = -1
# Prepare the result dictionary with user rival music data
result = {
"userId": data["userId"],
"rivalId": data["rivalId"],
"nextIndex": str(next_index + len(user_rival_music_list[next_index: next_index + max_count]) if max_count <= len(user_rival_music_list[next_index: next_index + max_count]) else -1),
"userRivalMusicList": user_rival_music_list[next_index: next_index + max_count]
return {
"userId": user_id,
"rivalId": rival_id,
"length": len(music_list),
"nextIndex": next_idx,
"userRivalMusicList": music_list,
}
return result
async def handle_get_user_favorite_item_api_request(self, data: Dict) -> Dict:
user_id = int(data["userId"])
next_idx = int(data["nextIndex"])
max_ct = int(data["maxCount"])
kind = int(data["kind"])
is_all_favorite_item = str(data["isAllFavoriteItem"]) == "true"
user_fav_item_list = []
# still needs to be implemented on WebUI
# 1: Music, 2: User, 3: Character
fav_list = await self.data.item.get_all_favorites(
data["userId"], self.version, fav_kind=int(data["kind"])
rows = await self.data.item.get_all_favorites(
user_id,
self.version,
fav_kind=kind,
limit=max_ct + 1,
offset=next_idx,
)
if fav_list is not None:
for fav in fav_list:
if rows is not None:
for fav in rows[:max_ct]:
user_fav_item_list.append({"id": fav["favId"]})
if rows is None or len(rows) <= max_ct:
next_idx = -1
else:
next_idx += max_ct
return {
"userId": data["userId"],
"userId": user_id,
"length": len(user_fav_item_list),
"kind": data["kind"],
"nextIndex": -1,
"kind": kind,
"nextIndex": next_idx,
"userFavoriteItemList": user_fav_item_list,
}
@@ -490,36 +557,39 @@ class ChuniBase:
return {"userId": data["userId"], "length": 0, "userFavoriteMusicList": []}
async def handle_get_user_item_api_request(self, data: Dict) -> Dict:
kind = int(int(data["nextIndex"]) / 10000000000)
next_idx = int(int(data["nextIndex"]) % 10000000000)
user_item_list = await self.data.item.get_items(data["userId"], kind)
user_id = int(data["userId"])
next_idx = int(data["nextIndex"])
max_ct = int(data["maxCount"])
if user_item_list is None or len(user_item_list) == 0:
kind = next_idx // 10000000000
next_idx = next_idx % 10000000000
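# e.g. a client nextIndex of 50000000123 means itemKind 5 at offset 123 within that kind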
rows = await self.data.item.get_items(
user_id, kind, limit=max_ct + 1, offset=next_idx
)
if rows is None or len(rows) == 0:
return {
"userId": data["userId"],
"userId": user_id,
"nextIndex": -1,
"itemKind": kind,
"userItemList": [],
}
items: List[Dict[str, Any]] = []
for i in range(next_idx, len(user_item_list)):
tmp = user_item_list[i]._asdict()
for row in rows[:max_ct]:
tmp = row._asdict()
tmp.pop("user")
tmp.pop("id")
items.append(tmp)
if len(items) >= int(data["maxCount"]):
break
xout = kind * 10000000000 + next_idx + len(items)
if len(items) < int(data["maxCount"]):
next_idx = 0
if len(rows) > max_ct:
next_idx = kind * 10000000000 + next_idx + max_ct
else:
next_idx = xout
next_idx = -1
return {
"userId": data["userId"],
"userId": user_id,
"nextIndex": next_idx,
"itemKind": kind,
"length": len(items),
@@ -528,7 +598,9 @@ class ChuniBase:
async def handle_get_user_login_bonus_api_request(self, data: Dict) -> Dict:
user_id = data["userId"]
user_login_bonus = await self.data.item.get_all_login_bonus(user_id, self.version)
user_login_bonus = await self.data.item.get_all_login_bonus(
user_id, self.version
)
# ignore the loginBonus request if its disabled in config
if user_login_bonus is None or not self.game_cfg.mods.use_login_bonus:
return {"userId": user_id, "length": 0, "userLoginBonusList": []}
@@ -571,62 +643,57 @@ class ChuniBase:
}
async def handle_get_user_music_api_request(self, data: Dict) -> Dict:
music_detail = await self.data.score.get_scores(data["userId"])
if music_detail is None:
user_id = int(data["userId"])
next_idx = int(data["nextIndex"])
max_ct = int(data["maxCount"])
rows = await self.data.score.get_scores(
user_id, limit=max_ct + 1, offset=next_idx
)
if rows is None or len(rows) == 0:
return {
"userId": data["userId"],
"userId": user_id,
"length": 0,
"nextIndex": -1,
"userMusicList": [], # 240
}
song_list = []
next_idx = int(data["nextIndex"])
max_ct = int(data["maxCount"])
music_details = [x._asdict() for x in rows]
returned_music_details_count = 0
music_list = []
for x in range(next_idx, len(music_detail)):
found = False
tmp = music_detail[x]._asdict()
tmp.pop("user")
tmp.pop("id")
# note that itertools.groupby will only work on sorted keys, which is already sorted by
# the query in get_scores
for _music_id, details_iter in itertools.groupby(
music_details, key=lambda x: x["musicId"]
):
details: list[dict[Any, Any]] = []
for song in song_list:
score_buf = SCORE_BUFFER.get(str(data["userId"])) or []
if song["userMusicDetailList"][0]["musicId"] == tmp["musicId"]:
found = True
song["userMusicDetailList"].append(tmp)
song["length"] = len(song["userMusicDetailList"])
score_buf.append(tmp["musicId"])
SCORE_BUFFER[str(data["userId"])] = score_buf
for d in details_iter:
d.pop("id")
d.pop("user")
score_buf = SCORE_BUFFER.get(str(data["userId"])) or []
if not found and tmp["musicId"] not in score_buf:
song_list.append({"length": 1, "userMusicDetailList": [tmp]})
score_buf.append(tmp["musicId"])
SCORE_BUFFER[str(data["userId"])] = score_buf
details.append(d)
if len(song_list) >= max_ct:
music_list.append({"length": len(details), "userMusicDetailList": details})
returned_music_details_count += len(details)
if len(music_list) >= max_ct:
break
for songIdx in range(len(song_list)):
for recordIdx in range(x+1, len(music_detail)):
if song_list[songIdx]["userMusicDetailList"][0]["musicId"] == music_detail[recordIdx]["musicId"]:
music = music_detail[recordIdx]._asdict()
music.pop("user")
music.pop("id")
song_list[songIdx]["userMusicDetailList"].append(music)
song_list[songIdx]["length"] += 1
if len(song_list) >= max_ct:
next_idx += len(song_list)
        # we fetched max_ct + 1 rows from the database; if grouping handed back fewer detail
        # rows than were fetched, the leftover row(s) belong to the next page of songs
if returned_music_details_count < len(rows):
next_idx += max_ct
else:
next_idx = -1
SCORE_BUFFER[str(data["userId"])] = []
return {
"userId": data["userId"],
"length": len(song_list),
"userId": user_id,
"length": len(music_list),
"nextIndex": next_idx,
"userMusicList": song_list, # 240
"userMusicList": music_list,
}
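
The rewritten handler above leans on itertools.groupby to fold the per-chart score rows into one entry per song, which only works because get_scores returns rows ordered by musicId. A small self-contained sketch of that grouping, using plain dicts in place of database rows (values are illustrative):

import itertools

# Rows must already be sorted by musicId; groupby only merges *adjacent* equal keys.
rows = [
    {"musicId": 100, "level": 2, "scoreMax": 990000},
    {"musicId": 100, "level": 3, "scoreMax": 1001000},
    {"musicId": 204, "level": 3, "scoreMax": 1007500},
]

music_list = []
for music_id, details_iter in itertools.groupby(rows, key=lambda r: r["musicId"]):
    details = list(details_iter)
    music_list.append({"length": len(details), "userMusicDetailList": details})

# Two songs: one with two charts, one with a single chart.
assert [m["length"] for m in music_list] == [2, 1]
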
async def handle_get_user_option_api_request(self, data: Dict) -> Dict:
@@ -651,7 +718,9 @@ class ChuniBase:
return bytes([ord(c) for c in src]).decode("utf-8")
async def handle_get_user_preview_api_request(self, data: Dict) -> Dict:
profile = await self.data.profile.get_profile_preview(data["userId"], self.version)
profile = await self.data.profile.get_profile_preview(
data["userId"], self.version
)
if profile is None:
return None
profile_character = await self.data.item.get_character(
@@ -693,7 +762,9 @@ class ChuniBase:
}
async def handle_get_user_recent_rating_api_request(self, data: Dict) -> Dict:
recent_rating_list = await self.data.profile.get_profile_recent_rating(data["userId"])
recent_rating_list = await self.data.profile.get_profile_recent_rating(
data["userId"]
)
if recent_rating_list is None:
return {
"userId": data["userId"],
@@ -726,7 +797,7 @@ class ChuniBase:
profile = await self.data.profile.get_profile_data(data["userId"], self.version)
if profile is None:
return {"userId": data["userId"], "teamId": 0}
return {"userId": data["userId"], "teamId": 0}
if profile and profile["teamId"]:
# Get team by id
@@ -751,7 +822,7 @@ class ChuniBase:
"teamId": team_id,
"teamRank": team_rank,
"teamName": team_name,
"assaultTimeRate": 1, # TODO: Figure out assaultTime, which might be team point boost?
"assaultTimeRate": 1, # TODO: Figure out assaultTime, which might be team point boost?
"userTeamPoint": {
"userId": data["userId"],
"teamId": team_id,
@@ -760,7 +831,7 @@ class ChuniBase:
"aggrDate": data["playDate"],
},
}
async def handle_get_team_course_setting_api_request(self, data: Dict) -> Dict:
return {
"userId": data["userId"],
@@ -769,7 +840,9 @@ class ChuniBase:
"teamCourseSettingList": [],
}
async def handle_get_team_course_setting_api_request_proto(self, data: Dict) -> Dict:
async def handle_get_team_course_setting_api_request_proto(
self, data: Dict
) -> Dict:
return {
"userId": data["userId"],
"length": 1,
@@ -784,11 +857,11 @@ class ChuniBase:
"teamCourseMusicList": [
{"track": 184, "type": 1, "level": 3, "selectLevel": -1},
{"track": 184, "type": 1, "level": 3, "selectLevel": -1},
{"track": 184, "type": 1, "level": 3, "selectLevel": -1}
{"track": 184, "type": 1, "level": 3, "selectLevel": -1},
],
"teamCourseRankingInfoList": [],
"recodeDate": "2099-12-31 11:59:99.0",
"isPlayed": False
"isPlayed": False,
}
],
}
@@ -798,7 +871,7 @@ class ChuniBase:
"userId": data["userId"],
"length": 0,
"nextIndex": -1,
"teamCourseRuleList": []
"teamCourseRuleList": [],
}
async def handle_get_team_course_rule_api_request_proto(self, data: Dict) -> Dict:
@@ -813,7 +886,7 @@ class ChuniBase:
"damageMiss": 1,
"damageAttack": 1,
"damageJustice": 1,
"damageJusticeC": 1
"damageJusticeC": 1,
}
],
}
@@ -824,7 +897,7 @@ class ChuniBase:
if int(user_id) & 0x1000000000001 == 0x1000000000001:
place_id = int(user_id) & 0xFFFC00000000
self.logger.info("Guest play from place ID %d, ignoring.", place_id)
return {"returnCode": "1"}
@@ -846,7 +919,9 @@ class ChuniBase:
)
if "userGameOption" in upsert:
await self.data.profile.put_profile_option(user_id, upsert["userGameOption"][0])
await self.data.profile.put_profile_option(
user_id, upsert["userGameOption"][0]
)
if "userGameOptionEx" in upsert:
await self.data.profile.put_profile_option_ex(
@@ -893,33 +968,41 @@ class ChuniBase:
for playlog in upsert["userPlaylogList"]:
# convert the player names to utf-8
if playlog["playedUserName1"] is not None:
playlog["playedUserName1"] = self.read_wtf8(playlog["playedUserName1"])
playlog["playedUserName1"] = self.read_wtf8(
playlog["playedUserName1"]
)
if playlog["playedUserName2"] is not None:
playlog["playedUserName2"] = self.read_wtf8(playlog["playedUserName2"])
playlog["playedUserName2"] = self.read_wtf8(
playlog["playedUserName2"]
)
if playlog["playedUserName3"] is not None:
playlog["playedUserName3"] = self.read_wtf8(playlog["playedUserName3"])
playlog["playedUserName3"] = self.read_wtf8(
playlog["playedUserName3"]
)
await self.data.score.put_playlog(user_id, playlog, self.version)
if "userTeamPoint" in upsert:
team_points = upsert["userTeamPoint"]
try:
for tp in team_points:
if tp["teamId"] != '65535':
if tp["teamId"] != "65535":
# Fetch the current team data
current_team = await self.data.profile.get_team_by_id(tp["teamId"])
current_team = await self.data.profile.get_team_by_id(
tp["teamId"]
)
# Calculate the new teamPoint
new_team_point = int(tp["teamPoint"]) + current_team["teamPoint"]
new_team_point = (
int(tp["teamPoint"]) + current_team["teamPoint"]
)
# Prepare the data to update
team_data = {
"teamPoint": new_team_point
}
team_data = {"teamPoint": new_team_point}
# Update the team data
await self.data.profile.update_team(tp["teamId"], team_data)
except:
pass # Probably a better way to catch if the team is not set yet (new profiles), but let's just pass
if "userMapAreaList" in upsert:
for map_area in upsert["userMapAreaList"]:
await self.data.item.put_map_area(user_id, map_area)
@@ -937,22 +1020,28 @@ class ChuniBase:
await self.data.item.put_login_bonus(
user_id, self.version, login["presetId"], isWatched=True
)
if "userRecentPlayerList" in upsert: # TODO: Seen in Air, maybe implement sometime
if (
"userRecentPlayerList" in upsert
): # TODO: Seen in Air, maybe implement sometime
for rp in upsert["userRecentPlayerList"]:
pass
for rating_type in {"userRatingBaseList", "userRatingBaseHotList", "userRatingBaseNextList"}:
for rating_type in {
"userRatingBaseList",
"userRatingBaseHotList",
"userRatingBaseNextList",
}:
if rating_type not in upsert:
continue
await self.data.profile.put_profile_rating(
user_id,
self.version,
rating_type,
upsert[rating_type],
)
# added in LUMINOUS
if "userCMissionList" in upsert:
for cmission in upsert["userCMissionList"]:
@@ -967,7 +1056,9 @@ class ChuniBase:
)
for progress in cmission["userCMissionProgressList"]:
await self.data.item.put_cmission_progress(user_id, mission_id, progress)
await self.data.item.put_cmission_progress(
user_id, mission_id, progress
)
if "userNetBattleData" in upsert:
net_battle = upsert["userNetBattleData"][0]
@@ -978,6 +1069,46 @@ class ChuniBase:
)
await self.data.profile.put_net_battle(user_id, net_battle)
# New in LUMINOUS PLUS
if "userFavoriteMusicList" in upsert:
# musicId, orderId
music_ids = set(
int(m["musicId"])
for m in upsert["userFavoriteMusicList"]
if m["musicId"] != "-1"
)
current_favorites = await self.data.item.get_all_favorites(
user_id, self.version, fav_kind=FavoriteItemKind.MUSIC.value
)
if current_favorites is None:
current_favorites = []
current_favorite_ids = set(x.favId for x in current_favorites)
keep_ids = current_favorite_ids.intersection(music_ids)
deleted_ids = current_favorite_ids - keep_ids
added_ids = music_ids - keep_ids
for fav_id in deleted_ids:
await self.data.item.delete_favorite_music(
user_id, self.version, fav_id
)
for fav_id in added_ids:
await self.data.item.put_favorite_music(user_id, self.version, fav_id)
# added in CHUNITHM VERSE
if "userUnlockChallengeList" in upsert:
for unlock_challenge in upsert["userUnlockChallengeList"]:
await self.data.item.put_unlock_challenge(
user_id, self.version, unlock_challenge
)
# added in CHUNITHM X-VERSE
if "userLinkedVerseList" in upsert:
for linked_verse in upsert["userLinkedVerseList"]:
await self.data.item.put_linked_verse(user_id, linked_verse)
return {"returnCode": "1"}
async def handle_upsert_user_chargelog_api_request(self, data: Dict) -> Dict:


@@ -25,6 +25,12 @@ class ChuniServerConfig:
return CoreConfig.get_config_field(
self.__config, "chuni", "server", "news_msg", default=""
)
@property
def use_https(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "chuni", "server", "use_https", default=False
)
class ChuniTeamConfig:
@@ -53,6 +59,29 @@ class ChuniModsConfig:
self.__config, "chuni", "mods", "use_login_bonus", default=True
)
@property
def stock_tickets(self) -> str:
return CoreConfig.get_config_field(
self.__config, "chuni", "mods", "stock_tickets", default=None
)
@property
def stock_count(self) -> int:
return CoreConfig.get_config_field(
self.__config, "chuni", "mods", "stock_count", default=99
)
def forced_item_unlocks(self, item: str) -> bool:
forced_item_unlocks = CoreConfig.get_config_field(
self.__config, "chuni", "mods", "forced_item_unlocks", default={}
)
if item not in forced_item_unlocks.keys():
# default to no forced unlocks
return False
return forced_item_unlocks[item]
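
As a sketch of how these mods options could look once loaded from chuni.yaml: the key names (use_login_bonus, stock_tickets, stock_count, forced_item_unlocks) come from the properties above, but the values shown here are made up for illustration, not shipped defaults.

import yaml

example_yaml = """
mods:
  use_login_bonus: True
  stock_tickets: "6000,6001"
  stock_count: 99
  forced_item_unlocks:
    map_icons: True
    system_voices: False
"""

cfg = yaml.safe_load(example_yaml)["mods"]
# Mirrors ChuniModsConfig.forced_item_unlocks(): missing keys default to False.
assert cfg["forced_item_unlocks"].get("map_icons", False) is True
assert cfg["forced_item_unlocks"].get("avatar_accessories", False) is False
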
class ChuniVersionConfig:
def __init__(self, parent_config: "ChuniConfig") -> None:
@@ -63,9 +92,14 @@ class ChuniVersionConfig:
in the form of:
11: {"rom": 2.00.00, "data": 2.00.00}
"""
return CoreConfig.get_config_field(
versions = CoreConfig.get_config_field(
self.__config, "chuni", "version", default={}
)[version]
)
if version not in versions.keys():
return None
return versions[version]
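
The change above makes ChuniVersionConfig.version() return None for versions that are not configured instead of raising a KeyError. A tiny sketch of the lookup against a dict shaped like the docstring example (keys and values are illustrative):

versions = {11: {"rom": "2.00.00", "data": "2.00.00"}}

def lookup(version: int):
    # Mirrors the new behaviour: missing versions yield None rather than KeyError.
    if version not in versions:
        return None
    return versions[version]

assert lookup(11) == {"rom": "2.00.00", "data": "2.00.00"}
assert lookup(17) is None
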
class ChuniCryptoConfig:


@@ -1,10 +1,12 @@
from enum import Enum
from enum import Enum, IntEnum
from typing import Optional
from core.utils import floor_to_nearest_005
class ChuniConstants:
GAME_CODE = "SDBT"
GAME_CODE_NEW = "SDHD"
GAME_CODE_INT = "SDGS"
GAME_CODE_CHN = "SDHJ"
CONFIG_NAME = "chuni.yaml"
@@ -25,6 +27,9 @@ class ChuniConstants:
VER_CHUNITHM_SUN = 13
VER_CHUNITHM_SUN_PLUS = 14
VER_CHUNITHM_LUMINOUS = 15
VER_CHUNITHM_LUMINOUS_PLUS = 16
VER_CHUNITHM_VERSE = 17
VER_CHUNITHM_X_VERSE = 18
VERSION_NAMES = [
"CHUNITHM",
@@ -43,6 +48,9 @@ class ChuniConstants:
"CHUNITHM SUN",
"CHUNITHM SUN PLUS",
"CHUNITHM LUMINOUS",
"CHUNITHM LUMINOUS PLUS",
"CHUNITHM VERSE",
"CHUNITHM X-VERSE",
]
SCORE_RANK_INTERVALS_OLD = [
@@ -76,18 +84,256 @@ class ChuniConstants:
( 0, "D"),
]
VERSION_LUT = {
"100": VER_CHUNITHM,
"105": VER_CHUNITHM_PLUS,
"110": VER_CHUNITHM_AIR,
"115": VER_CHUNITHM_AIR_PLUS,
"120": VER_CHUNITHM_STAR,
"125": VER_CHUNITHM_STAR_PLUS,
"130": VER_CHUNITHM_AMAZON,
"135": VER_CHUNITHM_AMAZON_PLUS,
"140": VER_CHUNITHM_CRYSTAL,
"145": VER_CHUNITHM_CRYSTAL_PLUS,
"150": VER_CHUNITHM_PARADISE,
"200": VER_CHUNITHM_NEW,
"205": VER_CHUNITHM_NEW_PLUS,
"210": VER_CHUNITHM_SUN,
"215": VER_CHUNITHM_SUN_PLUS,
"220": VER_CHUNITHM_LUMINOUS,
"225": VER_CHUNITHM_LUMINOUS_PLUS,
"230": VER_CHUNITHM_VERSE,
"240": VER_CHUNITHM_X_VERSE,
}
@classmethod
def game_ver_to_string(cls, ver: int):
return cls.VERSION_NAMES[ver]
@classmethod
def int_ver_to_game_ver(cls, ver: int) -> Optional[int]:
""" Takes an int ver (ex 100 for 1.00) and returns an internal game version """
return cls.VERSION_LUT.get(str(floor_to_nearest_005(ver)), None)
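
int_ver_to_game_ver rounds the raw client version down to a .05 step before looking it up in VERSION_LUT, so a 2.32.xx client still resolves to VERSE. A hedged standalone sketch of the same lookup; floor_to_nearest_005 is assumed to floor an integer version like 232 down to 230, and the stand-in below is not the core.utils implementation:

def floor_to_nearest_005_sketch(ver: int) -> int:
    # assumption: floors an integer version (e.g. 232 for 2.32) to a multiple of 5
    return ver - (ver % 5)

VERSION_LUT_EXCERPT = {"225": 16, "230": 17, "240": 18}  # LUMINOUS PLUS / VERSE / X-VERSE

def int_ver_to_game_ver_sketch(ver: int):
    return VERSION_LUT_EXCERPT.get(str(floor_to_nearest_005_sketch(ver)))

assert int_ver_to_game_ver_sketch(232) == 17   # a 2.32 client maps to VERSE
assert int_ver_to_game_ver_sketch(300) is None  # unknown versions yield None
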
class MapAreaConditionType(IntEnum):
"""
Condition IDs for the `GetGameMapAreaConditionApi` and `GetGameUCConditionApi` requests.
- "Item" or "locked item" refers to the map area, unlock challenge or
Linked VERSE locked using this system.
- "Chart ID" refers to musicID \\* 100 + difficulty, where difficulty is 0 for BASIC
up to 6 for WORLD'S END. For example, Halcyon ULTIMA is 17305.
"""
INVALID = 0
"""
Invalid condition type. Should cause the hidden item to be automatically unlocked,
but seemingly only works with map areas.
"""
class MapAreaConditionType(Enum):
UNLOCKED = 0
MAP_CLEARED = 1
MAP_AREA_CLEARED = 2
TROPHY_OBTAINED = 3
"""Finish the map with ID `conditionId`."""
MAP_AREA_CLEARED = 2
"""Finish the map area with ID `conditionId`."""
TROPHY_OBTAINED = 3
"""Unlock the trophy with ID `conditionId`."""
TROPHY_EQUIPPED = 4
"""
Equip the trophy with ID `conditionId`. The item is locked again when the trophy is
unequipped.
"""
NAMEPLATE_OBTAINED = 5
"""Unlock the nameplate with ID `conditionId`."""
NAMEPLATE_EQUIPPED = 6
"""
Equip the nameplate with ID `conditionId`. The item is locked again when the nameplate
is unequipped.
"""
CHARACTER_OBTAINED = 7
"""Unlock the character with ID `conditionId`."""
CHARACTER_EQUIPPED = 8
"""
Equip the character with ID `conditionId`. The item is locked again when the character
is unequipped.
"""
CHARACTER_TRANSFORM_EQUIPPED = 9
"""
Equip the character, with the character transform ID `conditionId`. The item is locked again
if the incorrect character is equipped, or the correct character is equipped with the wrong
transform.
"""
MUSIC_OBTAINED = 10
"""Unlock the music with ID `conditionId`."""
AVATAR_ACCESSORY_OBTAINED = 11
"""Unlock the avatar accessory with ID `conditionId`."""
AVATAR_ACCESSORY_EQUIPPED = 12
"""
Equip the avatar accessory with ID `conditionId`. The item is locked again when the avatar
accessory is unequipped.
"""
MAP_ICON_OBTAINED = 13
"""Unlock the map icon with ID `conditionId`."""
MAP_ICON_EQUIPPED = 14
"""
Equip the map icon with ID `conditionId`. The item is locked again when the map icon is
unequipped.
"""
SYSTEM_VOICE_OBTAINED = 15
"""Unlock the system voice with ID `conditionId`."""
SYSTEM_VOICE_EQUIPPED = 16
"""
Equip the system voice with ID `conditionId`. The item is locked again when the system voice
is unequipped.
"""
ALL_JUSTICE_CRITICAL = 17
"""Obtain ALL JUSTICE CRITICAL on the chart given by `conditionId`."""
RANK_SSSP = 18
"""Obtain rank SSS+ on the chart given by `conditionId`."""
RANK_SSS = 19
"""Obtain rank SSS on the chart given by `conditionId`."""
RANK_SSP = 20
"""Obtain rank SS+ on the chart given by `conditionId`."""
RANK_SS = 21
"""Obtain rank SS on the chart given by `conditionId`."""
RANK_SP = 22
"""Obtain rank S+ on the chart given by `conditionId`."""
RANK_S = 23
"""Obtain rank S on the chart given by `conditionId`."""
RANK_AAA = 24
"""Obtain rank AAA on the chart given by `conditionId`."""
RANK_AA = 25
"""Obtain rank AA on the chart given by `conditionId`."""
RANK_A = 26
"""Obtain rank A on the chart given by `conditionId`."""
MINIMUM_BEST_30_AVERAGE = 27
"""Obtain a best 30 average of at least `conditionId / 100`."""
ALL_JUSTICE = 28
"""Obtain ALL JUSTICE on the chart given by `conditionId`."""
FULL_COMBO = 29
"""Obtain FULL COMBO on the chart given by `conditionId`."""
UNLOCK_CHALLENGE_DISCOVERED = 30
"""Discover/unlock the unlock challenge with ID `conditionId`."""
UNLOCK_CHALLENGE_CLEARED = 31
"""Clear the unlock challenge with ID `conditionId`."""
MINIMUM_RATING = 32
"""Obtain a rating of at least `conditionId / 100`."""
class LinkedVerseUnlockConditionType(IntEnum):
"""
`conditionList` is a semicolon-delimited list of numbers, where the number's meaning
is defined by the specific `conditionId`. Additionally, each element of the list
can be further separated by underscores. For example `1;2_3;4` means that the player
must achieve 1 AND (2 OR 3) AND 4.
"""
PLAY_SONGS = 33
"""
Play songs given by `conditionList`, where `conditionList` is a
list of song IDs.
"""
COURSE_CLEAR_AND_CLASS_EMBLEM = 34
"""
Obtain a class emblem (by clearing all courses of a given class) on **any**
of the classes given by `conditionList`, where `conditionList` is an
underscore-separated list of class IDs (1 for CLASS I to 6 for CLASS ∞).
"""
TROPHY_OBTAINED = 35
"""
Obtain trophies given by `conditionList`, where `conditionList` is a
list of trophy IDs.
"""
PLAY_SONGS_IN_FAVORITE = 36
"""
Play songs given by `conditionList` **from the favorites folder**, where
`conditionList` is a list of song IDs.
"""
CLEAR_TEAM_COURSE_WITH_CHARACTER_OF_MINIMUM_RANK = 37
"""
Clear a team course while equipping a character of minimum rank.
"""
class MapAreaConditionLogicalOperator(Enum):
AND = 1
OR = 2
class AvatarCategory(Enum):
WEAR = 1
HEAD = 2
FACE = 3
SKIN = 4
ITEM = 5
FRONT = 6
BACK = 7
class ItemKind(IntEnum):
NAMEPLATE = 1
FRAME = 2
"""
"Frame" is the background for the gauge/score/max combo display
shown during gameplay. This item cannot be equipped (as of LUMINOUS PLUS)
and is hardcoded to the current game's version.
"""
TROPHY = 3
SKILL = 4
TICKET = 5
"""A statue is also a ticket."""
PRESENT = 6
MUSIC_UNLOCK = 7
MAP_ICON = 8
SYSTEM_VOICE = 9
SYMBOL_CHAT = 10
AVATAR_ACCESSORY = 11
ULTIMA_UNLOCK = 12
"""This only applies to ULTIMA difficulties that are *not* unlocked by
reaching S rank on EXPERT difficulty or above.
"""
STAGE = 13
class FavoriteItemKind(IntEnum):
MUSIC = 1
RIVAL = 2
CHARACTER = 3


@@ -1,13 +1,17 @@
from core.data import Data
from core.config import CoreConfig
from titles.chuni.schema import *
from .config import ChuniConfig
class ChuniData(Data):
def __init__(self, cfg: CoreConfig) -> None:
def __init__(self, cfg: CoreConfig, chuni_cfg: ChuniConfig = None) -> None:
super().__init__(cfg)
self.item = ChuniItemData(cfg, self.session)
self.profile = ChuniProfileData(cfg, self.session)
self.score = ChuniScoreData(cfg, self.session)
self.static = ChuniStaticData(cfg, self.session)
# init rom versioning for use with score playlog data
if chuni_cfg:
ChuniRomVersion.init_versions(chuni_cfg)


@@ -1,7 +1,9 @@
from typing import List
from typing import List, Tuple, Dict
from starlette.routing import Route, Mount
from starlette.requests import Request
from starlette.responses import Response, RedirectResponse
from starlette.staticfiles import StaticFiles
from sqlalchemy.engine import Row
from os import path
import yaml
import jinja2
@@ -10,7 +12,8 @@ from core.frontend import FE_Base, UserSession
from core.config import CoreConfig
from .database import ChuniData
from .config import ChuniConfig
from .const import ChuniConstants
from .const import ChuniConstants, AvatarCategory, ItemKind
from .read import ChuniReader
def pairwise(iterable):
@@ -81,14 +84,17 @@ class ChuniFrontend(FE_Base):
self, cfg: CoreConfig, environment: jinja2.Environment, cfg_dir: str
) -> None:
super().__init__(cfg, environment)
self.data = ChuniData(cfg)
self.game_cfg = ChuniConfig()
if path.exists(f"{cfg_dir}/{ChuniConstants.CONFIG_NAME}"):
self.game_cfg.update(
yaml.safe_load(open(f"{cfg_dir}/{ChuniConstants.CONFIG_NAME}"))
)
self.data = ChuniData(cfg, self.game_cfg)
self.nav_name = "Chunithm"
# Convert any old assets created with a previous version of the importer
ChuniReader.ConvertOldAssets(self.logger)
def get_routes(self) -> List[Route]:
return [
Route("/", self.render_GET, methods=['GET']),
@@ -97,8 +103,19 @@ class ChuniFrontend(FE_Base):
Route("/", self.render_GET_playlog, methods=['GET']),
Route("/{index}", self.render_GET_playlog, methods=['GET']),
]),
Route("/favorites", self.render_GET_favorites, methods=['GET']),
Route("/userbox", self.render_GET_userbox, methods=['GET']),
Route("/avatar", self.render_GET_avatar, methods=['GET']),
Route("/update.map-icon", self.update_map_icon, methods=['POST']),
Route("/update.system-voice", self.update_system_voice, methods=['POST']),
Route("/update.stage", self.update_stage, methods=['POST']),
Route("/update.userbox", self.update_userbox, methods=['POST']),
Route("/update.avatar", self.update_avatar, methods=['POST']),
Route("/update.name", self.update_name, methods=['POST']),
Route("/update.favorite_music_playlog", self.update_favorite_music_playlog, methods=['POST']),
Route("/update.favorite_music_favorites", self.update_favorite_music_favorites, methods=['POST']),
Route("/version.change", self.version_change, methods=['POST']),
Mount('/img', app=StaticFiles(directory='titles/chuni/img'), name="img")
]
async def render_GET(self, request: Request) -> bytes:
@@ -111,22 +128,38 @@ class ChuniFrontend(FE_Base):
if usr_sesh.user_id > 0:
versions = await self.data.profile.get_all_profile_versions(usr_sesh.user_id)
profile = []
profile = None
if versions:
                # a chunithm_version of -1 means it has not been initialized yet; select a default from the existing versions
if usr_sesh.chunithm_version < 0:
usr_sesh.chunithm_version = versions[0]
profile = await self.data.profile.get_profile_data(usr_sesh.user_id, usr_sesh.chunithm_version)
user_id = usr_sesh.user_id
version = usr_sesh.chunithm_version
# While map icons and system voices weren't present prior to AMAZON, we don't need to bother checking
# version here - it'll just end up being empty sets and the jinja will ignore the variables anyway.
map_icons, total_map_icons = await self.get_available_map_icons(version, profile)
system_voices, total_system_voices = await self.get_available_system_voices(version, profile)
stages, total_stages = await self.get_available_stages(version, profile)
resp = Response(template.render(
title=f"{self.core_config.server.name} | {self.nav_name}",
game_list=self.environment.globals["game_list"],
sesh=vars(usr_sesh),
user_id=usr_sesh.user_id,
user_id=user_id,
profile=profile,
version_list=ChuniConstants.VERSION_NAMES,
versions=versions,
cur_version=usr_sesh.chunithm_version
cur_version=version,
cur_version_name=ChuniConstants.game_ver_to_string(version),
map_icons=map_icons,
system_voices=system_voices,
total_map_icons=total_map_icons,
total_system_voices=total_system_voices,
stages=stages,
total_stages=total_stages
), media_type="text/html; charset=utf-8")
if usr_sesh.chunithm_version >= 0:
@@ -184,6 +217,8 @@ class ChuniFrontend(FE_Base):
profile=profile,
hot_list=hot_list,
base_list=base_list,
cur_version=usr_sesh.chunithm_version,
cur_version_name=ChuniConstants.game_ver_to_string(usr_sesh.chunithm_version)
), media_type="text/html; charset=utf-8")
else:
return RedirectResponse("/gate/", 303)
@@ -205,43 +240,515 @@ class ChuniFrontend(FE_Base):
else:
index = int(path_index) - 1 # 0 and 1 are 1st page
user_id = usr_sesh.user_id
playlog_count = await self.data.score.get_user_playlogs_count(user_id)
version = usr_sesh.chunithm_version
playlog_count = await self.data.score.get_user_playlogs_count(user_id, version)
if playlog_count < index * 20 :
return Response(template.render(
title=f"{self.core_config.server.name} | {self.nav_name}",
game_list=self.environment.globals["game_list"],
sesh=vars(usr_sesh),
playlog_count=0
playlog_count=0,
cur_version=version,
cur_version_name=ChuniConstants.game_ver_to_string(version)
), media_type="text/html; charset=utf-8")
playlog = await self.data.score.get_playlogs_limited(user_id, index, 20)
playlog = await self.data.score.get_playlogs_limited(user_id, version, index, 20)
playlog_with_title = []
for record in playlog:
music_chart = await self.data.static.get_music_chart(usr_sesh.chunithm_version, record.musicId, record.level)
for idx,record in enumerate(playlog):
music_chart = await self.data.static.get_music_chart(version, record.musicId, record.level)
if music_chart:
difficultyNum=music_chart.level
artist=music_chart.artist
title=music_chart.title
(jacket, ext) = path.splitext(music_chart.jacketPath)
jacket += ".webp"
else:
difficultyNum=0
artist="unknown"
title="musicid: " + str(record.musicId)
jacket = "unknown.webp"
# Check if this song is a favorite so we can populate the add/remove button
is_favorite = await self.data.item.is_favorite(user_id, version, record.musicId)
playlog_with_title.append({
# Values for the actual readable results
"raw": record,
"title": title,
"difficultyNum": difficultyNum,
"artist": artist,
"jacket": jacket,
# Values used solely for favorite updates
"idx": idx,
"musicId": record.musicId,
"isFav": is_favorite
})
return Response(template.render(
title=f"{self.core_config.server.name} | {self.nav_name}",
game_list=self.environment.globals["game_list"],
sesh=vars(usr_sesh),
user_id=usr_sesh.user_id,
user_id=user_id,
playlog=playlog_with_title,
playlog_count=playlog_count
playlog_count=playlog_count,
cur_version=version,
cur_version_name=ChuniConstants.game_ver_to_string(version)
), media_type="text/html; charset=utf-8")
else:
return RedirectResponse("/gate/", 303)
async def render_GET_favorites(self, request: Request) -> bytes:
template = self.environment.get_template(
"titles/chuni/templates/chuni_favorites.jinja"
)
usr_sesh = self.validate_session(request)
if not usr_sesh:
usr_sesh = UserSession()
if usr_sesh.user_id > 0:
if usr_sesh.chunithm_version < 0:
return RedirectResponse("/game/chuni/", 303)
user_id = usr_sesh.user_id
version = usr_sesh.chunithm_version
favorites = await self.data.item.get_all_favorites(user_id, version, 1)
favorites_count = len(favorites)
favorites_with_title = []
favorites_by_genre = dict()
for idx,favorite in enumerate(favorites):
song = await self.data.static.get_song(favorite.favId)
if song:
# we likely got multiple results - one for each chart. Just use the first
artist=song.artist
title=song.title
genre=song.genre
(jacket, ext) = path.splitext(song.jacketPath)
jacket += ".webp"
else:
artist="unknown"
title="musicid: " + str(favorite.favId)
genre="unknown"
jacket = "unknown.webp"
# add a new collection for the genre if this is our first time seeing it
if genre not in favorites_by_genre:
favorites_by_genre[genre] = []
# add the song to the appropriate genre collection
favorites_by_genre[genre].append({
"idx": idx,
"title": title,
"artist": artist,
"jacket": jacket,
"favId": favorite.favId
})
# Sort favorites by title before rendering the page
for g in favorites_by_genre:
favorites_by_genre[g].sort(key=lambda x: x["title"].lower())
return Response(template.render(
title=f"{self.core_config.server.name} | {self.nav_name}",
game_list=self.environment.globals["game_list"],
sesh=vars(usr_sesh),
user_id=user_id,
favorites_by_genre=favorites_by_genre,
favorites_count=favorites_count,
cur_version=version,
cur_version_name=ChuniConstants.game_ver_to_string(version)
), media_type="text/html; charset=utf-8")
else:
return RedirectResponse("/gate/", 303)
async def get_available_map_icons(self, version: int, profile: Row) -> Tuple[List[Dict], int]:
if profile is None:
return ([], 0)
items = dict()
rows = await self.data.static.get_map_icons(version)
if rows is None:
return (items, 0) # can only happen with old db
force_unlocked = self.game_cfg.mods.forced_item_unlocks("map_icons")
user_map_icons = []
if not force_unlocked:
user_map_icons = await self.data.item.get_items(profile.user, ItemKind.MAP_ICON.value)
user_map_icons = [icon["itemId"] for icon in user_map_icons] + [profile.mapIconId]
for row in rows:
if force_unlocked or row["defaultHave"] or row["mapIconId"] in user_map_icons:
item = dict()
item["id"] = row["mapIconId"]
item["name"] = row["name"]
item["iconPath"] = path.splitext(row["iconPath"])[0] + ".webp"
items[row["mapIconId"]] = item
return (items, len(rows))
async def get_available_system_voices(self, version: int, profile: Row) -> Tuple[List[Dict], int]:
if profile is None:
return ([], 0)
items = dict()
rows = await self.data.static.get_system_voices(version)
if rows is None:
return (items, 0) # can only happen with old db
force_unlocked = self.game_cfg.mods.forced_item_unlocks("system_voices")
user_system_voices = []
if not force_unlocked:
user_system_voices = await self.data.item.get_items(profile.user, ItemKind.SYSTEM_VOICE.value)
user_system_voices = [icon["itemId"] for icon in user_system_voices] + [profile.voiceId]
for row in rows:
if force_unlocked or row["defaultHave"] or row["voiceId"] in user_system_voices:
item = dict()
item["id"] = row["voiceId"]
item["name"] = row["name"]
item["imagePath"] = path.splitext(row["imagePath"])[0] + ".webp"
items[row["voiceId"]] = item
return (items, len(rows))
async def get_available_stages(self, version: int, profile: Row) -> Tuple[List[Dict], int]:
if profile is None:
return ([], 0)
items = dict()
rows = await self.data.static.get_stages(version)
if rows is None:
return (items, 0) # can only happen with old db
force_unlocked = self.game_cfg.mods.forced_item_unlocks("stages")
user_stages = []
if not force_unlocked:
user_stages = await self.data.item.get_items(profile.user, ItemKind.STAGE.value)
user_stages = [icon["itemId"] for icon in user_stages] + [profile.stageId]
for row in rows:
if force_unlocked or row["defaultHave"] or row["stageId"] in user_stages:
item = dict()
item["id"] = row["stageId"]
item["name"] = row["name"]
item["imagePath"] = path.splitext(row["imagePath"])[0] + ".webp"
items[row["stageId"]] = item
return (items, len(rows))
async def get_available_nameplates(self, version: int, profile: Row) -> Tuple[List[Dict], int]:
items = dict()
rows = await self.data.static.get_nameplates(version)
if rows is None:
return (items, 0) # can only happen with old db
force_unlocked = self.game_cfg.mods.forced_item_unlocks("nameplates")
user_nameplates = []
if not force_unlocked:
user_nameplates = await self.data.item.get_items(profile.user, ItemKind.NAMEPLATE.value)
user_nameplates = [item["itemId"] for item in user_nameplates] + [profile.nameplateId]
for row in rows:
if force_unlocked or row["defaultHave"] or row["nameplateId"] in user_nameplates:
item = dict()
item["id"] = row["nameplateId"]
item["name"] = row["name"]
item["texturePath"] = path.splitext(row["texturePath"])[0] + ".webp"
items[row["nameplateId"]] = item
return (items, len(rows))
async def get_available_trophies(self, version: int, profile: Row) -> Tuple[List[Dict], int]:
items = dict()
rows = await self.data.static.get_trophies(version)
if rows is None:
return (items, 0) # can only happen with old db
force_unlocked = self.game_cfg.mods.forced_item_unlocks("trophies")
user_trophies = []
if not force_unlocked:
user_trophies = await self.data.item.get_items(profile.user, ItemKind.TROPHY.value)
user_trophies = [item["itemId"] for item in user_trophies] + [profile.trophyId]
for row in rows:
if force_unlocked or row["defaultHave"] or row["trophyId"] in user_trophies:
item = dict()
item["id"] = row["trophyId"]
item["name"] = row["name"]
item["rarity"] = row["rareType"]
items[row["trophyId"]] = item
return (items, len(rows))
async def get_available_characters(self, version: int, profile: Row) -> Tuple[List[Dict], int]:
items = dict()
rows = await self.data.static.get_characters(version)
if rows is None:
return (items, 0) # can only happen with old db
force_unlocked = self.game_cfg.mods.forced_item_unlocks("character_icons")
user_characters = []
if not force_unlocked:
user_characters = await self.data.item.get_characters(profile.user)
user_characters = [chara["characterId"] for chara in user_characters] + [profile.characterId]
for row in rows:
if force_unlocked or row["defaultHave"] or row["characterId"] in user_characters:
item = dict()
item["id"] = row["characterId"]
item["name"] = row["name"]
item["iconPath"] = path.splitext(row["imagePath3"])[0] + ".webp"
items[row["characterId"]] = item
return (items, len(rows))
async def get_available_avatar_items(self, version: int, category: AvatarCategory, user_unlocked_items: List[int]) -> Tuple[List[Dict], int]:
items = dict()
rows = await self.data.static.get_avatar_items(version, category.value)
if rows is None:
return (items, 0) # can only happen with old db
force_unlocked = self.game_cfg.mods.forced_item_unlocks("avatar_accessories")
for row in rows:
if force_unlocked or row["defaultHave"] or row["avatarAccessoryId"] in user_unlocked_items:
item = dict()
item["id"] = row["avatarAccessoryId"]
item["name"] = row["name"]
item["iconPath"] = path.splitext(row["iconPath"])[0] + ".webp"
item["texturePath"] = path.splitext(row["texturePath"])[0] + ".webp"
items[row["avatarAccessoryId"]] = item
return (items, len(rows))
async def render_GET_userbox(self, request: Request) -> bytes:
template = self.environment.get_template(
"titles/chuni/templates/chuni_userbox.jinja"
)
usr_sesh = self.validate_session(request)
if not usr_sesh:
usr_sesh = UserSession()
if usr_sesh.user_id > 0:
if usr_sesh.chunithm_version < 0:
return RedirectResponse("/game/chuni/", 303)
user_id = usr_sesh.user_id
version = usr_sesh.chunithm_version
# Get the user profile so we know how the userbox is currently configured
profile = await self.data.profile.get_profile_data(user_id, version)
# Build up lists of available userbox components
nameplates, total_nameplates = await self.get_available_nameplates(version, profile)
trophies, total_trophies = await self.get_available_trophies(version, profile)
characters, total_characters = await self.get_available_characters(version, profile)
# Get the user's team
team_name = "ARTEMiS"
if profile["teamId"]:
team = await self.data.profile.get_team_by_id(profile["teamId"])
team_name = team["teamName"]
# Figure out the rating color we should use (rank maps to the stylesheet)
rating = profile.playerRating / 100;
rating_rank = 0
if rating >= 16:
rating_rank = 8
elif rating >= 15.25:
rating_rank = 7
elif rating >= 14.5:
rating_rank = 6
elif rating >= 13.25:
rating_rank = 5
elif rating >= 12:
rating_rank = 4
elif rating >= 10:
rating_rank = 3
elif rating >= 7:
rating_rank = 2
elif rating >= 4:
rating_rank = 1
return Response(template.render(
title=f"{self.core_config.server.name} | {self.nav_name}",
game_list=self.environment.globals["game_list"],
sesh=vars(usr_sesh),
user_id=user_id,
cur_version=version,
cur_version_name=ChuniConstants.game_ver_to_string(version),
profile=profile,
team_name=team_name,
rating_rank=rating_rank,
nameplates=nameplates,
trophies=trophies,
characters=characters,
total_nameplates=total_nameplates,
total_trophies=total_trophies,
total_characters=total_characters
), media_type="text/html; charset=utf-8")
else:
return RedirectResponse("/gate/", 303)
async def render_GET_avatar(self, request: Request) -> bytes:
template = self.environment.get_template(
"titles/chuni/templates/chuni_avatar.jinja"
)
usr_sesh = self.validate_session(request)
if not usr_sesh:
usr_sesh = UserSession()
if usr_sesh.user_id > 0:
if usr_sesh.chunithm_version < 11:
# Avatar configuration only for NEW!! and newer
return RedirectResponse("/game/chuni/", 303)
user_id = usr_sesh.user_id
version = usr_sesh.chunithm_version
# Get the user profile so we know what avatar items are currently in use
profile = await self.data.profile.get_profile_data(user_id, version)
# Get all the user avatar accessories so we know what to populate
user_accessories = await self.data.item.get_items(user_id, ItemKind.AVATAR_ACCESSORY.value)
user_accessories = [item["itemId"] for item in user_accessories] + \
[profile.avatarBack, profile.avatarItem, profile.avatarWear, \
profile.avatarFront, profile.avatarSkin, profile.avatarHead, profile.avatarFace]
# Build up available list of items for each avatar category
wears, total_wears = await self.get_available_avatar_items(version, AvatarCategory.WEAR, user_accessories)
faces, total_faces = await self.get_available_avatar_items(version, AvatarCategory.FACE, user_accessories)
heads, total_heads = await self.get_available_avatar_items(version, AvatarCategory.HEAD, user_accessories)
skins, total_skins = await self.get_available_avatar_items(version, AvatarCategory.SKIN, user_accessories)
items, total_items = await self.get_available_avatar_items(version, AvatarCategory.ITEM, user_accessories)
fronts, total_fronts = await self.get_available_avatar_items(version, AvatarCategory.FRONT, user_accessories)
backs, total_backs = await self.get_available_avatar_items(version, AvatarCategory.BACK, user_accessories)
return Response(template.render(
title=f"{self.core_config.server.name} | {self.nav_name}",
game_list=self.environment.globals["game_list"],
sesh=vars(usr_sesh),
user_id=user_id,
cur_version=version,
cur_version_name=ChuniConstants.game_ver_to_string(version),
profile=profile,
wears=wears,
faces=faces,
heads=heads,
skins=skins,
items=items,
fronts=fronts,
backs=backs,
total_wears=total_wears,
total_faces=total_faces,
total_heads=total_heads,
total_skins=total_skins,
total_items=total_items,
total_fronts=total_fronts,
total_backs=total_backs
), media_type="text/html; charset=utf-8")
else:
return RedirectResponse("/gate/", 303)
async def update_map_icon(self, request: Request) -> bytes:
usr_sesh = self.validate_session(request)
if not usr_sesh:
return RedirectResponse("/gate/", 303)
form_data = await request.form()
new_map_icon: str = form_data.get("id")
if not new_map_icon:
return RedirectResponse("/gate/?e=4", 303)
if not await self.data.profile.update_map_icon(usr_sesh.user_id, usr_sesh.chunithm_version, new_map_icon):
return RedirectResponse("/gate/?e=999", 303)
return RedirectResponse("/game/chuni/", 303)
async def update_system_voice(self, request: Request) -> bytes:
usr_sesh = self.validate_session(request)
if not usr_sesh:
return RedirectResponse("/gate/", 303)
form_data = await request.form()
new_system_voice: str = form_data.get("id")
if not new_system_voice:
return RedirectResponse("/gate/?e=4", 303)
if not await self.data.profile.update_system_voice(usr_sesh.user_id, usr_sesh.chunithm_version, new_system_voice):
return RedirectResponse("/gate/?e=999", 303)
return RedirectResponse("/game/chuni/", 303)
async def update_stage(self, request: Request) -> bytes:
usr_sesh = self.validate_session(request)
if not usr_sesh:
return RedirectResponse("/gate/", 303)
form_data = await request.form()
new_system_voice: str = form_data.get("id")
if not new_system_voice:
return RedirectResponse("/gate/?e=4", 303)
if not await self.data.profile.update_stage(usr_sesh.user_id, usr_sesh.chunithm_version, new_system_voice):
return RedirectResponse("/gate/?e=999", 303)
return RedirectResponse("/game/chuni/", 303)
async def update_userbox(self, request: Request) -> bytes:
usr_sesh = self.validate_session(request)
if not usr_sesh:
return RedirectResponse("/gate/", 303)
form_data = await request.form()
new_nameplate: str = form_data.get("nameplate")
new_trophy: str = form_data.get("trophy")
new_trophy_sub_1: str = form_data.get("trophySub1")
new_trophy_sub_2: str = form_data.get("trophySub2")
new_character: str = form_data.get("character")
if not new_nameplate or \
not new_trophy or \
not new_trophy_sub_1 or \
not new_trophy_sub_2 or \
not new_character:
return RedirectResponse("/game/chuni/userbox?e=4", 303)
if not await self.data.profile.update_userbox(usr_sesh.user_id, usr_sesh.chunithm_version, new_nameplate, new_trophy, new_trophy_sub_1, new_trophy_sub_2, new_character):
return RedirectResponse("/gate/?e=999", 303)
return RedirectResponse("/game/chuni/userbox", 303)
async def update_avatar(self, request: Request) -> bytes:
usr_sesh = self.validate_session(request)
if not usr_sesh:
return RedirectResponse("/gate/", 303)
form_data = await request.form()
new_wear: str = form_data.get("wear")
new_face: str = form_data.get("face")
new_head: str = form_data.get("head")
new_skin: str = form_data.get("skin")
new_item: str = form_data.get("item")
new_front: str = form_data.get("front")
new_back: str = form_data.get("back")
if not new_wear or \
not new_face or \
not new_head or \
not new_skin or \
not new_item or \
not new_front or \
not new_back:
return RedirectResponse("/game/chuni/avatar?e=4", 303)
if not await self.data.profile.update_avatar(usr_sesh.user_id, usr_sesh.chunithm_version, new_wear, new_face, new_head, new_skin, new_item, new_front, new_back):
return RedirectResponse("/gate/?e=999", 303)
return RedirectResponse("/game/chuni/avatar", 303)
async def update_name(self, request: Request) -> bytes:
usr_sesh = self.validate_session(request)
if not usr_sesh:
@@ -265,7 +772,7 @@ class ChuniFrontend(FE_Base):
elif o < 0x7F and o > 0x20:
new_name_full += chr(o + 0xFEE0)
elif o <= 0x7F:
self.logger.warn(f"Invalid ascii character {o:02X}")
self.logger.warning(f"Invalid ascii character {o:02X}")
return RedirectResponse("/gate/?e=4", 303)
else:
new_name_full += x
@@ -279,6 +786,32 @@ class ChuniFrontend(FE_Base):
return RedirectResponse("/game/chuni/?s=1", 303)
async def update_favorite_music(self, request: Request, retPage: str):
usr_sesh = self.validate_session(request)
if not usr_sesh:
return RedirectResponse(retPage, 303)
user_id = usr_sesh.user_id
version = usr_sesh.chunithm_version
form_data = await request.form()
music_id: str = form_data.get("musicId")
isAdd: int = int(form_data.get("isAdd"))
if isAdd:
if await self.data.item.put_favorite_music(user_id, version, music_id) == None:
return RedirectResponse("/gate/?e=999", 303)
else:
if await self.data.item.delete_favorite_music(user_id, version, music_id) == None:
return RedirectResponse("/gate/?e=999", 303)
return RedirectResponse(retPage, 303)
async def update_favorite_music_playlog(self, request: Request):
return await self.update_favorite_music(request, "/game/chuni/playlog")
async def update_favorite_music_favorites(self, request: Request):
return await self.update_favorite_music(request, "/game/chuni/favorites")
async def version_change(self, request: Request):
usr_sesh = self.validate_session(request)
if not usr_sesh:

titles/chuni/img/avatar/.gitignore

@@ -0,0 +1,4 @@
# Ignore everything in this directory
*
# Except this file
!.gitignore

titles/chuni/img/character/.gitignore

@@ -0,0 +1,4 @@
# Ignore everything in this directory
*
# Except this file
!.gitignore

titles/chuni/img/jacket/.gitignore

@@ -0,0 +1,5 @@
# Ignore everything in this directory
*
# Except this file and default unknown
!.gitignore
!unknown.webp

titles/chuni/img/mapIcon/.gitignore

@@ -0,0 +1,4 @@
# Ignore everything in this directory
*
# Except this file
!.gitignore

titles/chuni/img/nameplate/.gitignore

@@ -0,0 +1,4 @@
# Ignore everything in this directory
*
# Except this file
!.gitignore

titles/chuni/img/stage/.gitignore

@@ -0,0 +1,4 @@
# Ignore everything in this directory
*
# Except this file
!.gitignore


@@ -0,0 +1,4 @@
# Ignore everything in this directory
*
# Except this file
!.gitignore


@@ -1,20 +1,22 @@
from starlette.requests import Request
from starlette.routing import Route
from starlette.responses import Response
import asyncio
import re
import logging
import coloredlogs
from logging.handlers import TimedRotatingFileHandler
import zlib
import yaml
import json
import inflection
import string
from os import path
from typing import Tuple, Dict, List
from logging.handlers import TimedRotatingFileHandler
from starlette.requests import Request
from starlette.routing import Route
from starlette.responses import Response
from Crypto.Cipher import AES
from Crypto.Util.Padding import pad
from Crypto.Protocol.KDF import PBKDF2
from Crypto.Hash import SHA1
from os import path
from typing import Tuple, Dict, List
from core import CoreConfig, Utils
from core.title import BaseServlet
@@ -36,6 +38,10 @@ from .newplus import ChuniNewPlus
from .sun import ChuniSun
from .sunplus import ChuniSunPlus
from .luminous import ChuniLuminous
from .luminousplus import ChuniLuminousPlus
from .verse import ChuniVerse
from .xverse import ChuniXVerse
class ChuniServlet(BaseServlet):
def __init__(self, core_cfg: CoreConfig, cfg_dir: str) -> None:
@@ -64,6 +70,9 @@ class ChuniServlet(BaseServlet):
ChuniSun,
ChuniSunPlus,
ChuniLuminous,
ChuniLuminousPlus,
ChuniVerse,
ChuniXVerse,
]
self.logger = logging.getLogger("chuni")
@@ -94,19 +103,27 @@ class ChuniServlet(BaseServlet):
known_iter_counts = {
ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS: 67,
f"{ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS}_int": 25, # SUPERSTAR
f"{ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS}_int": 25, # SUPERSTAR
ChuniConstants.VER_CHUNITHM_PARADISE: 44,
f"{ChuniConstants.VER_CHUNITHM_PARADISE}_int": 51, # SUPERSTAR PLUS
f"{ChuniConstants.VER_CHUNITHM_PARADISE}_int": 51, # SUPERSTAR PLUS
ChuniConstants.VER_CHUNITHM_NEW: 54,
f"{ChuniConstants.VER_CHUNITHM_NEW}_int": 49,
f"{ChuniConstants.VER_CHUNITHM_NEW}_chn": 37,
ChuniConstants.VER_CHUNITHM_NEW_PLUS: 25,
f"{ChuniConstants.VER_CHUNITHM_NEW_PLUS}_int": 31,
f"{ChuniConstants.VER_CHUNITHM_NEW_PLUS}_chn": 35, # NEW
ChuniConstants.VER_CHUNITHM_SUN: 70,
f"{ChuniConstants.VER_CHUNITHM_SUN}_int": 35,
ChuniConstants.VER_CHUNITHM_SUN_PLUS: 36,
f"{ChuniConstants.VER_CHUNITHM_SUN_PLUS}_int": 36,
ChuniConstants.VER_CHUNITHM_LUMINOUS: 8,
f"{ChuniConstants.VER_CHUNITHM_LUMINOUS}_int": 8,
f"{ChuniConstants.VER_CHUNITHM_LUMINOUS}_chn": 8,
ChuniConstants.VER_CHUNITHM_LUMINOUS_PLUS: 56,
ChuniConstants.VER_CHUNITHM_VERSE: 42,
f"{ChuniConstants.VER_CHUNITHM_VERSE}_chn": 37,
ChuniConstants.VER_CHUNITHM_X_VERSE: 14,
f"{ChuniConstants.VER_CHUNITHM_X_VERSE}_int": 96,
}
for version, keys in self.game_cfg.crypto.keys.items():
@@ -117,7 +134,7 @@ class ChuniServlet(BaseServlet):
version_idx = version
else:
version_idx = int(version.split("_")[0])
salt = bytes.fromhex(keys[2])
if len(keys) >= 4:
@@ -147,7 +164,9 @@ class ChuniServlet(BaseServlet):
and version_idx >= ChuniConstants.VER_CHUNITHM_NEW
):
method_fixed += "C3Exp"
elif isinstance(version, str) and version.endswith("_chn"):
method_fixed += "Chn"
hash = PBKDF2(
method_fixed,
salt,
@@ -156,7 +175,8 @@ class ChuniServlet(BaseServlet):
hmac_hash_module=SHA1,
)
hashed_name = hash.hex()[:32] # truncate unused bytes like the game does
# truncate unused bytes like the game does
hashed_name = hash.hex()[:32]
self.hash_table[version][hashed_name] = method_fixed
self.logger.debug(
@@ -178,22 +198,48 @@ class ChuniServlet(BaseServlet):
return True
def get_allnet_info(self, game_code: str, game_ver: int, keychip: str) -> Tuple[str, str]:
if not self.core_cfg.server.is_using_proxy and Utils.get_title_port(self.core_cfg) != 80:
return (f"http://{self.core_cfg.server.hostname}:{Utils.get_title_port(self.core_cfg)}/{game_code}/{game_ver}/", self.core_cfg.server.hostname)
def get_allnet_info(
self, game_code: str, game_ver: int, keychip: str
) -> Tuple[str, str]:
title_port_int = Utils.get_title_port(self.core_cfg)
title_port_ssl_int = Utils.get_title_port_ssl(self.core_cfg)
return (f"http://{self.core_cfg.server.hostname}/{game_code}/{game_ver}/", self.core_cfg.server.hostname)
if self.game_cfg.server.use_https and (
(game_code == "SDBT" and game_ver >= 145) or # JP use TLS from CRYSTAL PLUS
game_code != "SDBT" # SDGS and SDHJ all version can use TLS
):
proto = "https"
else:
proto = "http"
if proto == "https":
t_port = f":{title_port_ssl_int}" if title_port_ssl_int != 443 else ""
else:
t_port = f":{title_port_int}" if title_port_int != 80 else ""
return (
f"{proto}://{self.core_cfg.server.hostname}{t_port}/{game_code}/{game_ver}/",
f"{self.core_cfg.server.hostname}",
)
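
A worked sketch of the scheme/port selection above, with hypothetical hostname and ports, and omitting the per-title version gate shown in the real method:

def title_uri(use_https: bool, hostname: str, port: int, ssl_port: int,
              game_code: str, game_ver: int) -> str:
    proto = "https" if use_https else "http"
    default = 443 if proto == "https" else 80
    chosen = ssl_port if proto == "https" else port
    suffix = f":{chosen}" if chosen != default else ""
    return f"{proto}://{hostname}{suffix}/{game_code}/{game_ver}/"

assert title_uri(True, "example.com", 8080, 443, "SDHD", 230) == "https://example.com/SDHD/230/"
assert title_uri(False, "example.com", 8080, 443, "SDHD", 230) == "http://example.com:8080/SDHD/230/"
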
def get_routes(self) -> List[Route]:
return [
Route("/{game:str}/{version:int}/ChuniServlet/{endpoint:str}", self.render_POST, methods=['POST']),
Route("/{game:str}/{version:int}/ChuniServlet/MatchingServer/{endpoint:str}", self.render_POST, methods=['POST']),
Route(
"/{game:str}/{version:int}/ChuniServlet/{endpoint:str}",
self.render_POST,
methods=["POST"],
),
Route(
"/{game:str}/{version:int}/ChuniServlet/MatchingServer/{endpoint:str}",
self.render_POST,
methods=["POST"],
),
]
async def render_POST(self, request: Request) -> bytes:
endpoint: str = request.path_params.get('endpoint')
version: int = request.path_params.get('version')
game_code: str = request.path_params.get('game')
endpoint: str = request.path_params.get("endpoint")
version: int = request.path_params.get("version")
game_code: str = request.path_params.get("game")
if endpoint.lower() == "ping":
return Response(zlib.compress(b'{"returnCode": "1"}'))
@@ -204,54 +250,79 @@ class ChuniServlet(BaseServlet):
internal_ver = 0
client_ip = Utils.get_ip_addr(request)
if game_code == "SDHD" or game_code == "SDBT": # JP
if version < 105: # 1.0
internal_ver = ChuniConstants.VER_CHUNITHM
elif version >= 105 and version < 110: # PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_PLUS
elif version >= 110 and version < 115: # AIR
internal_ver = ChuniConstants.VER_CHUNITHM_AIR
elif version >= 115 and version < 120: # AIR PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_AIR_PLUS
elif version >= 120 and version < 125: # STAR
internal_ver = ChuniConstants.VER_CHUNITHM_STAR
elif version >= 125 and version < 130: # STAR PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_STAR_PLUS
elif version >= 130 and version < 135: # AMAZON
internal_ver = ChuniConstants.VER_CHUNITHM_AMAZON
elif version >= 135 and version < 140: # AMAZON PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_AMAZON_PLUS
elif version >= 140 and version < 145: # CRYSTAL
internal_ver = ChuniConstants.VER_CHUNITHM_CRYSTAL
elif version >= 145 and version < 150: # CRYSTAL PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS
elif version >= 150 and version < 200: # PARADISE
internal_ver = ChuniConstants.VER_CHUNITHM_PARADISE
elif version >= 200 and version < 205: # NEW!!
internal_ver = ChuniConstants.VER_CHUNITHM_NEW
elif version >= 205 and version < 210: # NEW PLUS!!
internal_ver = ChuniConstants.VER_CHUNITHM_NEW_PLUS
elif version >= 210 and version < 215: # SUN
internal_ver = ChuniConstants.VER_CHUNITHM_SUN
elif version >= 215 and version < 220: # SUN PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_SUN_PLUS
elif version >= 220: # LUMINOUS
internal_ver = ChuniConstants.VER_CHUNITHM_LUMINOUS
elif game_code == "SDGS": # Int
if version < 105: # SUPERSTAR
internal_ver = ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS
elif version >= 105 and version < 110: # SUPERSTAR PLUS *Cursed but needed due to different encryption key
internal_ver = ChuniConstants.VER_CHUNITHM_PARADISE
elif version >= 110 and version < 115: # NEW
internal_ver = ChuniConstants.VER_CHUNITHM_NEW
elif version >= 115 and version < 120: # NEW PLUS!!
internal_ver = ChuniConstants.VER_CHUNITHM_NEW_PLUS
elif version >= 120 and version < 125: # SUN
internal_ver = ChuniConstants.VER_CHUNITHM_SUN
elif version >= 125 and version < 130: # SUN PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_SUN_PLUS
elif version >= 130: # LUMINOUS
internal_ver = ChuniConstants.VER_CHUNITHM_LUMINOUS
if game_code == "SDHD" or game_code == "SDBT": # JP
if version < 105: # 1.0
internal_ver = ChuniConstants.VER_CHUNITHM
elif version >= 105 and version < 110: # PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_PLUS
elif version >= 110 and version < 115: # AIR
internal_ver = ChuniConstants.VER_CHUNITHM_AIR
elif version >= 115 and version < 120: # AIR PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_AIR_PLUS
elif version >= 120 and version < 125: # STAR
internal_ver = ChuniConstants.VER_CHUNITHM_STAR
elif version >= 125 and version < 130: # STAR PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_STAR_PLUS
elif version >= 130 and version < 135: # AMAZON
internal_ver = ChuniConstants.VER_CHUNITHM_AMAZON
elif version >= 135 and version < 140: # AMAZON PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_AMAZON_PLUS
elif version >= 140 and version < 145: # CRYSTAL
internal_ver = ChuniConstants.VER_CHUNITHM_CRYSTAL
elif version >= 145 and version < 150: # CRYSTAL PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS
elif version >= 150 and version < 200: # PARADISE
internal_ver = ChuniConstants.VER_CHUNITHM_PARADISE
elif version >= 200 and version < 205: # NEW!!
internal_ver = ChuniConstants.VER_CHUNITHM_NEW
elif version >= 205 and version < 210: # NEW PLUS!!
internal_ver = ChuniConstants.VER_CHUNITHM_NEW_PLUS
elif version >= 210 and version < 215: # SUN
internal_ver = ChuniConstants.VER_CHUNITHM_SUN
elif version >= 215 and version < 220: # SUN PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_SUN_PLUS
elif version >= 220 and version < 225: # LUMINOUS
internal_ver = ChuniConstants.VER_CHUNITHM_LUMINOUS
elif version >= 225 and version < 230: # LUMINOUS PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_LUMINOUS_PLUS
elif version >= 230 and version < 240: # VERSE
internal_ver = ChuniConstants.VER_CHUNITHM_VERSE
elif version >= 240: # X-VERSE
internal_ver = ChuniConstants.VER_CHUNITHM_X_VERSE
elif game_code == "SDGS": # Int
if version < 105: # SUPERSTAR
internal_ver = ChuniConstants.VER_CHUNITHM_CRYSTAL_PLUS
elif (
version >= 105 and version < 110
): # SUPERSTAR PLUS *Cursed but needed due to different encryption key
internal_ver = ChuniConstants.VER_CHUNITHM_PARADISE
elif version >= 110 and version < 115: # NEW
internal_ver = ChuniConstants.VER_CHUNITHM_NEW
elif version >= 115 and version < 120: # NEW PLUS!!
internal_ver = ChuniConstants.VER_CHUNITHM_NEW_PLUS
elif version >= 120 and version < 125: # SUN
internal_ver = ChuniConstants.VER_CHUNITHM_SUN
elif version >= 125 and version < 130: # SUN PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_SUN_PLUS
elif version >= 130 and version < 135: # LUMINOUS
internal_ver = ChuniConstants.VER_CHUNITHM_LUMINOUS
elif version >= 135 and version < 140: # LUMINOUS PLUS
internal_ver = ChuniConstants.VER_CHUNITHM_LUMINOUS_PLUS
elif version >= 140 and version < 150: # VERSE
internal_ver = ChuniConstants.VER_CHUNITHM_VERSE
elif version >= 150: # X-VERSE
internal_ver = ChuniConstants.VER_CHUNITHM_X_VERSE
elif game_code == "SDHJ": # Chn
if version < 110: # NEW
internal_ver = ChuniConstants.VER_CHUNITHM_NEW
elif (
version >= 110 and version < 120
): # NEW *Cursed but needed due to different encryption key
internal_ver = ChuniConstants.VER_CHUNITHM_NEW_PLUS
elif version >= 120 and version < 130: # LUMINOUS
internal_ver = ChuniConstants.VER_CHUNITHM_LUMINOUS
elif version >= 130: # VERSE
internal_ver = ChuniConstants.VER_CHUNITHM_VERSE
if all(c in string.hexdigits for c in endpoint) and len(endpoint) == 32:
# If we get a 32 character long hex string, it's a hash and we're
@@ -261,6 +332,9 @@ class ChuniServlet(BaseServlet):
if game_code == "SDGS":
crypto_cfg_key = f"{internal_ver}_int"
hash_table_key = f"{internal_ver}_int"
elif game_code == "SDHJ":
crypto_cfg_key = f"{internal_ver}_chn"
hash_table_key = f"{internal_ver}_chn"
else:
crypto_cfg_key = internal_ver
hash_table_key = internal_ver
@@ -311,8 +385,10 @@ class ChuniServlet(BaseServlet):
return Response(zlib.compress(b'{"stat": "0"}'))
try:
unzip = zlib.decompress(req_raw)
if request.headers.get("x-debug") is not None:
unzip = req_raw
else:
unzip = zlib.decompress(req_raw)
except zlib.error as e:
self.logger.error(
f"Failed to decompress v{version} {endpoint} request -> {e}"
@@ -328,14 +404,16 @@ class ChuniServlet(BaseServlet):
endpoint = endpoint.replace("C3Exp", "")
elif game_code == "SDGS" and version < 110:
endpoint = endpoint.replace("Exp", "")
elif game_code == "SDHJ":
endpoint = endpoint.replace("Chn", "")
else:
endpoint = endpoint
func_to_find = "handle_" + inflection.underscore(endpoint) + "_request"
func_to_find = "handle_" + self.strict_underscore(endpoint) + "_request"
handler_cls = self.versions[internal_ver](self.core_cfg, self.game_cfg)
if not hasattr(handler_cls, func_to_find):
self.logger.warning(f"Unhandled v{version} request {endpoint}")
self.logger.warning(f"Unhandled v{version} request {func_to_find}")
resp = {"returnCode": 1}
else:
@@ -352,6 +430,9 @@ class ChuniServlet(BaseServlet):
self.logger.debug(f"Response {resp}")
if request.headers.get("x-debug") is not None:
return Response(json.dumps(resp, ensure_ascii=False).encode("utf-8"))
zipped = zlib.compress(json.dumps(resp, ensure_ascii=False).encode("utf-8"))
if not encrtped:
@@ -366,3 +447,9 @@ class ChuniServlet(BaseServlet):
)
return Response(crypt.encrypt(padded))
def strict_underscore(self, name: str) -> str:
# Insert underscores between *all* capital letters
name = re.sub(r"([A-Z])([A-Z])", r"\1_\2", name)
return inflection.underscore(name)
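# A minimal sketch of the renaming strict_underscore enables. The endpoint name is taken from the
# CMUpsertUserPrintApi handler further down; the expected outputs are my reading of inflection's
# rules, not captured from a running title server:
#
#   self.strict_underscore("CMUpsertUserPrintApi")   -> "c_m_upsert_user_print_api"
#   inflection.underscore("CMUpsertUserPrintApi")    -> "cm_upsert_user_print_api"  (old behaviour)
#
# Only the first form matches the handle_c_m_*_request handler names introduced in this change.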

View File

@@ -1,10 +1,83 @@
from datetime import timedelta
from typing import Dict
from sqlalchemy.engine import Row
from core.config import CoreConfig
from titles.chuni.sunplus import ChuniSunPlus
from titles.chuni.const import ChuniConstants, MapAreaConditionLogicalOperator, MapAreaConditionType
from titles.chuni.config import ChuniConfig
from titles.chuni.const import (
ChuniConstants,
MapAreaConditionLogicalOperator,
MapAreaConditionType,
)
from titles.chuni.sunplus import ChuniSunPlus
class MysticAreaConditions:
"""The "Mystic Rainbow of <VERSION>" map is a special reward map for obtaining
rainbow statues. There's one gold statue area that's unlocked when at least one
original map is finished, and additional rainbow statue areas are added as new
original maps are added.
"""
def __init__(
self, events_by_id: dict[int, Row], map_area_1_id: int, date_time_format: str
):
self.events_by_id = events_by_id
self.date_time_format = date_time_format
self._map_area_1_conditions = {
"mapAreaId": map_area_1_id,
"length": 0,
"mapAreaConditionList": [],
}
self._map_area_1_added = False
self._conditions = []
@property
def conditions(self):
return self._conditions
def add_condition(
self, map_flag_event_id: int, condition_map_id: int, mystic_map_area_id: int
):
if (event := self.events_by_id.get(map_flag_event_id)) is None:
return
start_date = event["startDate"].strftime(self.date_time_format)
self._map_area_1_conditions["mapAreaConditionList"].append(
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": condition_map_id,
"logicalOpe": MapAreaConditionLogicalOperator.OR.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00",
}
)
self._map_area_1_conditions["length"] = len(
self._map_area_1_conditions["mapAreaConditionList"]
)
if not self._map_area_1_added:
self._conditions.append(self._map_area_1_conditions)
self._map_area_1_added = True
self._conditions.append(
{
"mapAreaId": mystic_map_area_id,
"length": 1,
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": condition_map_id,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00",
}
],
}
)
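# Rough usage sketch (IDs reused from the LUMINOUS handler below; the resulting dicts are my
# reading of add_condition, not captured server output), assuming self.date_time_format is the
# usual "%Y-%m-%d %H:%M:%S":
#
#   mystic = MysticAreaConditions(event_by_id, 3229301, self.date_time_format)
#   mystic.add_condition(14005, 3020701, 3229302)   # LUMINOUS ep. I
#   conditions += mystic.conditions
#
# conditions then ends with two entries: map area 3229301 (the gold statue area) with an OR
# condition on map 3020701 being cleared, and map area 3229302 (the rainbow statue area) with an
# AND condition on the same map.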
class ChuniLuminous(ChuniSunPlus):
@@ -12,13 +85,13 @@ class ChuniLuminous(ChuniSunPlus):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_LUMINOUS
async def handle_cm_get_user_preview_api_request(self, data: Dict) -> Dict:
user_data = await super().handle_cm_get_user_preview_api_request(data)
async def handle_c_m_get_user_preview_api_request(self, data: Dict) -> Dict:
user_data = await super().handle_c_m_get_user_preview_api_request(data)
# Does CARD MAKER 1.35 work this far up?
user_data["lastDataVersion"] = "2.20.00"
return user_data
async def handle_get_user_c_mission_api_request(self, data: Dict) -> Dict:
user_id = data["userId"]
mission_id = data["missionId"]
@@ -28,7 +101,7 @@ class ChuniLuminous(ChuniSunPlus):
mission_data = await self.data.item.get_cmission(user_id, mission_id)
progress_data = await self.data.item.get_cmission_progress(user_id, mission_id)
if mission_data and progress_data:
point = mission_data["point"]
@@ -48,12 +121,14 @@ class ChuniLuminous(ChuniSunPlus):
"userCMissionProgressList": progress_list,
}
async def handle_get_user_net_battle_ranking_info_api_request(self, data: Dict) -> Dict:
async def handle_get_user_net_battle_ranking_info_api_request(
self, data: Dict
) -> Dict:
user_id = data["userId"]
net_battle = {}
net_battle_data = await self.data.profile.get_net_battle(user_id)
if net_battle_data:
net_battle = {
"isRankUpChallengeFailed": net_battle_data["isRankUpChallengeFailed"],
@@ -71,18 +146,18 @@ class ChuniLuminous(ChuniSunPlus):
async def handle_get_game_map_area_condition_api_request(self, data: Dict) -> Dict:
# There is no game data for this, everything is server side.
# However, we can selectively show/hide events as data is imported into the server.
events = await self.data.static.get_enabled_events(self.version)
events = await self.data.static.get_enabled_events(self.version) or []
event_by_id = {evt["eventId"]: evt for evt in events}
conditions = []
# The Mystic Rainbow of LUMINOUS map unlocks when any mainline LUMINOUS area
# (ep. I, ep. II, ep. III) are completed.
mystic_area_1_conditions = {
"mapAreaId": 3229301, # Mystic Rainbow of LUMINOUS Area 1
"length": 0,
"mapAreaConditionList": [],
}
mystic_area_1_added = False
mystic_conditions = MysticAreaConditions(
event_by_id, 3229301, self.date_time_format
)
mystic_conditions.add_condition(14005, 3020701, 3229302)
mystic_conditions.add_condition(14251, 3020702, 3229303)
mystic_conditions.add_condition(14481, 3020703, 3229304)
conditions += mystic_conditions.conditions
# Secret AREA: MUSIC GAME
if 14029 in event_by_id:
@@ -94,203 +169,296 @@ class ChuniLuminous(ChuniSunPlus):
# (event ID 14214) was imported into ARTEMiS, we disable the requirement
# for this trophy.
if 14214 in event_by_id:
mission_in_progress_end_date = (event_by_id[14214]["startDate"] - timedelta(hours=2)).strftime(self.date_time_format)
conditions.extend([
{
"mapAreaId": 2206201, # BlythE ULTIMA
"length": 1,
# Obtain the trophy "MISSION in progress".
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6832,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": mission_in_progress_end_date,
}
],
},
{
"mapAreaId": 2206202, # PRIVATE SERVICE ULTIMA
"length": 1,
# Obtain the trophy "MISSION in progress".
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6832,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": mission_in_progress_end_date,
}
],
},
{
"mapAreaId": 2206203, # New York Back Raise
"length": 1,
# SS NightTheater's EXPERT chart and get the title
# "今宵、劇場に映し出される景色とは――――。"
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6833,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
},
{
"mapAreaId": 2206204, # Spasmodic
"length": 2,
# - Get 1 miss on Random (any difficulty) and get the title "当たり待ち"
# - Get 1 miss on 花たちに希望を (any difficulty) and get the title "花たちに希望を"
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6834,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6835,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
},
{
"mapAreaId": 2206205, # ΩΩPARTS
"length": 2,
# - S Sage EXPERT to get the title "マターリ進行キボンヌ"
# - Equip this title and play cab-to-cab with another person with this title
# to get "マターリしようよ". Disabled because it is difficult to play cab2cab
# on data setups. A network operator may consider re-enabling it by uncommenting
# the second condition.
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6836,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
# {
# "type": MapAreaConditionType.TROPHY_OBTAINED.value,
# "conditionId": 6837,
# "logicalOpe": MapAreaConditionLogicalOperator.AND.value,
# "startDate": start_date,
# "endDate": "2099-12-31 00:00:00.0",
# },
],
},
{
"mapAreaId": 2206206, # Blow My Mind
"length": 1,
# SS on CHAOS EXPERT, Hydra EXPERT, Surive EXPERT and Jakarta PROGRESSION EXPERT
# to get the title "Can you hear me?"
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6838,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
},
{
"mapAreaId": 2206207, # VALLIS-NERIA
"length": 6,
# Finish the 6 other areas
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_AREA_CLEARED.value,
"conditionId": x,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
}
for x in range(2206201, 2206207)
],
},
])
# LUMINOUS ep. I
if 14005 in event_by_id:
start_date = event_by_id[14005]["startDate"].strftime(self.date_time_format)
mission_in_progress_end_date = (
event_by_id[14214]["startDate"] - timedelta(hours=2)
).strftime(self.date_time_format)
if not mystic_area_1_added:
conditions.append(mystic_area_1_conditions)
mystic_area_1_added = True
mystic_area_1_conditions["length"] += 1
mystic_area_1_conditions["mapAreaConditionList"].append(
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020701,
"logicalOpe": MapAreaConditionLogicalOperator.OR.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
}
conditions.extend(
[
{
"mapAreaId": 2206201, # BlythE ULTIMA
"length": 1,
# Obtain the trophy "MISSION in progress".
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6832,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": mission_in_progress_end_date,
}
],
},
{
"mapAreaId": 2206202, # PRIVATE SERVICE ULTIMA
"length": 1,
# Obtain the trophy "MISSION in progress".
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6832,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": mission_in_progress_end_date,
}
],
},
{
"mapAreaId": 2206203, # New York Back Raise
"length": 1,
# SS NightTheater's EXPERT chart and get the title
# "今宵、劇場に映し出される景色とは――――。"
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6833,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
},
{
"mapAreaId": 2206204, # Spasmodic
"length": 2,
# - Get 1 miss on Random (any difficulty) and get the title "当たり待ち"
# - Get 1 miss on 花たちに希望を (any difficulty) and get the title "花たちに希望を"
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6834,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6835,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
},
{
"mapAreaId": 2206205, # ΩΩPARTS
"length": 2,
# - S Sage EXPERT to get the title "マターリ進行キボンヌ"
# - Equip this title and play cab-to-cab with another person with this title
# to get "マターリしようよ". Disabled because it is difficult to play cab2cab
# on data setups. A network operator may consider re-enabling it by uncommenting
# the second condition.
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6836,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
# {
# "type": MapAreaConditionType.TROPHY_OBTAINED.value,
# "conditionId": 6837,
# "logicalOpe": MapAreaConditionLogicalOperator.AND.value,
# "startDate": start_date,
# "endDate": "2099-12-31 00:00:00.0",
# },
],
},
{
"mapAreaId": 2206206, # Blow My Mind
"length": 1,
# SS on CHAOS EXPERT, Hydra EXPERT, Surive EXPERT and Jakarta PROGRESSION EXPERT
# to get the title "Can you hear me?"
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 6838,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
},
{
"mapAreaId": 2206207, # VALLIS-NERIA
"length": 6,
# Finish the 6 other areas
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_AREA_CLEARED.value,
"conditionId": x,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
}
for x in range(2206201, 2206207)
],
},
]
)
# 1UM1N0U5 ep. 111
if 14483 in event_by_id:
start_date = event_by_id[14483]["startDate"].replace(
hour=0, minute=0, second=0
)
# conditions to unlock the 6 "Key of ..." areas in the map
# for the first 14 days: Defandour MASTER AJ, crazy (about you) MASTER AJ, Halcyon ULTIMA SSS
title_conditions = [
{
"type": MapAreaConditionType.ALL_JUSTICE.value,
"conditionId": 258103, # Defandour MASTER
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date.strftime(self.date_time_format),
"endDate": (
start_date + timedelta(days=14) - timedelta(seconds=1)
).strftime(self.date_time_format),
},
{
"type": MapAreaConditionType.ALL_JUSTICE.value,
"conditionId": 258003, # crazy (about you) MASTER
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date.strftime(self.date_time_format),
"endDate": (
start_date + timedelta(days=14) - timedelta(seconds=1)
).strftime(self.date_time_format),
},
{
"type": MapAreaConditionType.RANK_SSS.value,
"conditionId": 17304, # Halcyon ULTIMA
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date.strftime(self.date_time_format),
"endDate": (
start_date + timedelta(days=14) - timedelta(seconds=1)
).strftime(self.date_time_format),
},
]
# For each next 14 days, the conditions are lowered to SS+, S+, S, and then always unlocked
for i, typ in enumerate(
[
MapAreaConditionType.RANK_SSP.value,
MapAreaConditionType.RANK_SP.value,
MapAreaConditionType.RANK_S.value,
MapAreaConditionType.INVALID.value,
]
):
start = (start_date + timedelta(days=14 * (i + 1))).strftime(
self.date_time_format
)
if typ != MapAreaConditionType.INVALID.value:
end = (
start_date + timedelta(days=14 * (i + 2)) - timedelta(seconds=1)
).strftime(self.date_time_format)
title_conditions.extend(
[
{
"type": typ,
"conditionId": condition_id,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start,
"endDate": end,
}
for condition_id in {17304, 258003, 258103}
]
)
else:
end = "2099-12-31 00:00:00"
title_conditions.append(
{
"type": typ,
"conditionId": 0,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start,
"endDate": end,
}
)
# actually add all the conditions
for map_area_id in range(3229201, 3229207):
conditions.append(
{
"mapAreaId": map_area_id,
"length": len(title_conditions),
"mapAreaConditionList": title_conditions,
}
)
# Ultimate Force
# For the first 14 days, the condition is to obtain all 9 "Key of ..." titles
# Afterwards, the condition is the 6 "Key of ..." titles that you can obtain
# by playing the 6 areas, as well as obtaining specific ranks on
# [CRYSTAL_ACCESS] / Strange Love / βlαnoir
ultimate_force_conditions = []
# Trophies obtained by playing the 6 areas
for trophy_id in {6851, 6853, 6855, 6857, 6858, 6860}:
ultimate_force_conditions.append(
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": trophy_id,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date.strftime(self.date_time_format),
"endDate": "2099-12-31 00:00:00",
}
)
# βlαnoir MASTER SSS+ / Strange Love MASTER SSS+ / [CRYSTAL_ACCESS] MASTER SSS+
for trophy_id in {6852, 6854, 6856}:
ultimate_force_conditions.append(
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": trophy_id,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date.strftime(self.date_time_format),
"endDate": (
start_date + timedelta(days=14) - timedelta(seconds=1)
).strftime(self.date_time_format),
}
)
# For each next 14 days, the rank conditions for the 3 songs lowers
# Finally, the Ultimate Force area is unlocked as soon as you finish the 6 other areas.
for i, typ in enumerate(
[
MapAreaConditionType.RANK_SSS.value,
MapAreaConditionType.RANK_SS.value,
MapAreaConditionType.RANK_S.value,
]
):
start = (start_date + timedelta(days=14 * (i + 1))).strftime(
self.date_time_format
)
end = (
start_date + timedelta(days=14 * (i + 2)) - timedelta(seconds=1)
).strftime(self.date_time_format)
ultimate_force_conditions.extend(
[
{
"type": typ,
"conditionId": condition_id,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start,
"endDate": end,
}
for condition_id in {109403, 212103, 244203}
]
)
conditions.append(
{
"mapAreaId": 3229302, # Mystic Rainbow of LUMINOUS Area 2,
"length": 1,
# Unlocks when LUMINOUS ep. I is completed.
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020701,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
"mapAreaId": 3229207,
"length": len(ultimate_force_conditions),
"mapAreaConditionList": ultimate_force_conditions,
}
)
# LUMINOUS ep. II
if 14251 in event_by_id:
start_date = event_by_id[14251]["startDate"].strftime(self.date_time_format)
if not mystic_area_1_added:
conditions.append(mystic_area_1_conditions)
mystic_area_1_added = True
mystic_area_1_conditions["length"] += 1
mystic_area_1_conditions["mapAreaConditionList"].append(
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020702,
"logicalOpe": MapAreaConditionLogicalOperator.OR.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
}
)
conditions.append(
{
"mapAreaId": 3229303, # Mystic Rainbow of LUMINOUS Area 3,
"length": 1,
# Unlocks when LUMINOUS ep. II is completed.
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020702,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00.0",
},
],
}
)
return {
"length": len(conditions),

View File

@@ -0,0 +1,322 @@
from datetime import timedelta
from typing import Dict
from core.config import CoreConfig
from titles.chuni.config import ChuniConfig
from titles.chuni.const import ChuniConstants, MapAreaConditionLogicalOperator, MapAreaConditionType
from titles.chuni.luminous import ChuniLuminous, MysticAreaConditions
class ChuniLuminousPlus(ChuniLuminous):
def __init__(self, core_cfg: CoreConfig, game_cfg: ChuniConfig) -> None:
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_LUMINOUS_PLUS
async def handle_c_m_get_user_preview_api_request(self, data: Dict) -> Dict:
user_data = await super().handle_c_m_get_user_preview_api_request(data)
# Does CARD MAKER 1.35 work this far up?
user_data["lastDataVersion"] = "2.25.00"
return user_data
async def handle_get_user_c_mission_list_api_request(self, data: Dict) -> Dict:
user_id = int(data["userId"])
user_mission_list_request = data["userCMissionList"]
user_mission_list = []
for request in user_mission_list_request:
user_id = int(request["userId"])
mission_id = int(request["missionId"])
point = int(request["point"])
mission_data = await self.data.item.get_cmission(user_id, mission_id)
progress_data = await self.data.item.get_cmission_progress(user_id, mission_id)
if mission_data is None or progress_data is None:
continue
point = mission_data.point
user_mission_progress_list = [
{
"order": progress.order,
"stage": progress.stage,
"progress": progress.progress,
}
for progress in progress_data
]
user_mission_list.append(
{
"userId": user_id,
"missionId": mission_id,
"point": point,
"userCMissionProgressList": user_mission_progress_list,
},
)
return {
"userId": user_id,
"userCMissionList": user_mission_list,
}
async def handle_get_game_map_area_condition_api_request(self, data: Dict) -> Dict:
# There is no game data for this, everything is server side.
# However, we can selectively show/hide events as data is imported into the server.
events = await self.data.static.get_enabled_events(self.version) or []
event_by_id = {evt["eventId"]: evt for evt in events}
conditions = []
mystic_conditions = MysticAreaConditions(
event_by_id,
3229601,
self.date_time_format,
)
# Mystic Rainbow of LUMINOUS PLUS - LUMINOUS ep. IV
mystic_conditions.add_condition(15005, 3020704, 3229602)
# Mystic Rainbow of LUMINOUS PLUS - LUMINOUS ep. V
mystic_conditions.add_condition(15306, 3020705, 3229603)
# Mystic Rainbow of LUMINOUS PLUS - LUMINOUS ep. VI
mystic_conditions.add_condition(15451, 3020706, 3229604)
# Mystic Rainbow of LUMINOUS PLUS - LUMINOUS ep. VII
mystic_conditions.add_condition(15506, 3020707, 3229605)
conditions += mystic_conditions.conditions
# 1UM1N0U5 ep. 111 continues. The map is automatically unlocked after finishing
# LUMINOUS ep. III in LUMINOUS PLUS.
if ep_111 := event_by_id.get(15009):
start_date = ep_111["startDate"].strftime(self.date_time_format)
conditions.append({
"mapAreaId": 3229207,
"length": 1,
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020703,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00",
},
],
})
# ■・■■■■■■・■
# Finish LUMINOUS ep. IV and obtain the title 「ここは…何処なんだ…?」.
if re_fiction_o := event_by_id.get(15032):
start_date = re_fiction_o["startDate"].strftime(self.date_time_format)
conditions.append({
"mapAreaId": 3229501,
"length": 2,
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020704,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00",
},
{
"type": MapAreaConditionType.TROPHY_OBTAINED.value,
"conditionId": 7105,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00",
}
]
})
# The Conductor's Path
# ALL JUSTICE CRITICAL 其のエメラルドを見よ MASTER.
if the_conductors_path := event_by_id.get(15033):
start_date = the_conductors_path["startDate"].strftime(self.date_time_format)
conditions.append({
"mapAreaId": 3229701,
"length": 1,
"mapAreaConditionList": [
{
"type": MapAreaConditionType.ALL_JUSTICE_CRITICAL.value,
"conditionId": 260003,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00",
}
]
})
# Cave of RVESE
if episode__x__ := event_by_id.get(15254):
start_date = episode__x__["startDate"].strftime(self.date_time_format)
conditions.extend([
# Episode. _ _ X _ _ map area 1
# Finish the HARDCORE TANO*C collaboration map.
{
"mapAreaId": 2208801,
"length": 1,
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 2006533,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00",
},
],
},
# Episode. _ _ X _ _ map area 2
# Equip the title 「第壱の石版【V】」 to access the map area.
{
"mapAreaId": 2208802,
"length": 1,
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_EQUIPPED.value,
"conditionId": 7107,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00",
},
],
},
# Episode. _ _ X _ _ map area 3
# Equip the title 「第弐の石版【Λ】」 to access the map area.
{
"mapAreaId": 2208803,
"length": 1,
"mapAreaConditionList": [
{
"type": MapAreaConditionType.TROPHY_EQUIPPED.value,
"conditionId": 7104,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00",
},
],
},
# Episode. _ _ X _ _ map area 4
# Complete the 3 other map areas.
{
"mapAreaId": 2208804,
"length": 3,
"mapAreaConditionList": [
{
"type": MapAreaConditionType.MAP_AREA_CLEARED.value,
"conditionId": area_id,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date,
"endDate": "2099-12-31 00:00:00",
}
for area_id in range(2208801, 2208804)
],
},
])
# LUMINOUS ep. Ascension
if ep_ascension := event_by_id.get(15512):
start_date = ep_ascension["startDate"].replace(hour=0, minute=0, second=0)
# Finish LUMINOUS ep. VII to unlock LUMINOUS ep. Ascension.
task_track_map_conditions = [
{
"type": MapAreaConditionType.MAP_CLEARED.value,
"conditionId": 3020707,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start_date.strftime(self.date_time_format),
"endDate": "2099-12-31 00:00:00",
}
]
# You also need to reach a specific rank on Acid God MASTER.
# This condition lowers every 7 days.
# After the first 4 weeks, you only need to finish ep. VII.
for i, typ in enumerate([
MapAreaConditionType.RANK_SSSP.value,
MapAreaConditionType.RANK_SSS.value,
MapAreaConditionType.RANK_SS.value,
MapAreaConditionType.RANK_S.value,
]):
start = start_date + timedelta(days=7 * i)
end = start_date + timedelta(days=7 * (i + 1)) - timedelta(seconds=1)
task_track_map_conditions.append(
{
"type": typ,
"conditionId": 265103,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start.strftime(self.date_time_format),
"endDate": end.strftime(self.date_time_format),
}
)
conditions.extend(
[
{
"mapAreaId": map_area_id,
"length": len(task_track_map_conditions),
"mapAreaConditionList": task_track_map_conditions,
}
for map_area_id in {3220801, 3220802, 3220803, 3220804}
]
)
# To unlock the final map area (Forsaken Tale), achieve a specific rank
# on the 4 task tracks in the previous map areas. This condition also lowers
# every 7 days, similar to Acid God.
# After 28 days, you only need to finish the other 4 areas in ep. Ascension.
forsaken_tale_conditions = []
for i, typ in enumerate([
MapAreaConditionType.RANK_SSSP.value,
MapAreaConditionType.RANK_SSS.value,
MapAreaConditionType.RANK_SS.value,
MapAreaConditionType.RANK_S.value,
]):
start = start_date + timedelta(days=7 * i)
end = start_date + timedelta(days=7 * (i + 1)) - timedelta(seconds=1)
forsaken_tale_conditions.extend(
[
{
"type": typ,
"conditionId": condition_id,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": start.strftime(self.date_time_format),
"endDate": end.strftime(self.date_time_format),
}
for condition_id in {98203, 108603, 247503, 233903}
]
)
forsaken_tale_conditions.extend(
[
{
"type": MapAreaConditionType.MAP_AREA_CLEARED.value,
"conditionId": map_area_id,
"logicalOpe": MapAreaConditionLogicalOperator.AND.value,
"startDate": (start_date + timedelta(days=28)).strftime(self.date_time_format),
"endDate": "2099-12-31 00:00:00",
}
for map_area_id in {3220801, 3220802, 3220803, 3220804}
]
)
conditions.append(
{
"mapAreaId": 3220805,
"length": len(forsaken_tale_conditions),
"mapAreaConditionList": forsaken_tale_conditions,
}
)
return {
"length": len(conditions),
"gameMapAreaConditionList": conditions,
}

View File

@@ -4,12 +4,14 @@ from random import randint
from typing import Dict
import pytz
from core.config import CoreConfig
from core.utils import Utils
from titles.chuni.const import ChuniConstants
from titles.chuni.database import ChuniData
from titles.chuni.base import ChuniBase
from titles.chuni.config import ChuniConfig
from titles.chuni.const import ChuniConstants
from titles.chuni.database import ChuniData
class ChuniNew(ChuniBase):
ITEM_TYPE = {"character": 20, "story": 21, "card": 22}
@@ -26,14 +28,20 @@ class ChuniNew(ChuniBase):
def _interal_ver_to_intver(self) -> str:
if self.version == ChuniConstants.VER_CHUNITHM_NEW:
return "200"
if self.version == ChuniConstants.VER_CHUNITHM_NEW_PLUS:
elif self.version == ChuniConstants.VER_CHUNITHM_NEW_PLUS:
return "205"
if self.version == ChuniConstants.VER_CHUNITHM_SUN:
elif self.version == ChuniConstants.VER_CHUNITHM_SUN:
return "210"
if self.version == ChuniConstants.VER_CHUNITHM_SUN_PLUS:
elif self.version == ChuniConstants.VER_CHUNITHM_SUN_PLUS:
return "215"
if self.version == ChuniConstants.VER_CHUNITHM_LUMINOUS:
elif self.version == ChuniConstants.VER_CHUNITHM_LUMINOUS:
return "220"
elif self.version == ChuniConstants.VER_CHUNITHM_LUMINOUS_PLUS:
return "225"
elif self.version == ChuniConstants.VER_CHUNITHM_VERSE:
return "230"
elif self.version == ChuniConstants.VER_CHUNITHM_X_VERSE:
return "240"
async def handle_get_game_setting_api_request(self, data: Dict) -> Dict:
# use UTC time and convert it to JST time by adding +9
@@ -104,7 +112,8 @@ class ChuniNew(ChuniBase):
return {"returnCode": "1"}
async def handle_get_user_map_area_api_request(self, data: Dict) -> Dict:
user_map_areas = await self.data.item.get_map_areas(data["userId"])
map_area_ids = [int(area["mapAreaId"]) for area in data["mapAreaIdList"]]
user_map_areas = await self.data.item.get_map_areas(data["userId"], map_area_ids)
map_areas = []
for map_area in user_map_areas:
@@ -166,7 +175,7 @@ class ChuniNew(ChuniBase):
}
return data1
async def handle_cm_get_user_preview_api_request(self, data: Dict) -> Dict:
async def handle_c_m_get_user_preview_api_request(self, data: Dict) -> Dict:
p = await self.data.profile.get_profile_data(data["userId"], self.version)
if p is None:
return {}
@@ -239,7 +248,7 @@ class ChuniNew(ChuniBase):
"ssrBookCalcList": [],
}
async def handle_cm_get_user_data_api_request(self, data: Dict) -> Dict:
async def handle_c_m_get_user_data_api_request(self, data: Dict) -> Dict:
p = await self.data.profile.get_profile_data(data["userId"], self.version)
if p is None:
return {}
@@ -284,35 +293,37 @@ class ChuniNew(ChuniBase):
}
async def handle_get_user_printed_card_api_request(self, data: Dict) -> Dict:
user_print_list = await self.data.item.get_user_print_states(
data["userId"], has_completed=True
user_id = int(data["userId"])
next_idx = int(data["nextIndex"])
max_ct = int(data["maxCount"])
rows = await self.data.item.get_user_print_states(
user_id,
has_completed=True,
limit=max_ct + 1,
offset=next_idx,
)
if user_print_list is None:
if rows is None or len(rows) == 0:
return {
"userId": data["userId"],
"userId": user_id,
"length": 0,
"nextIndex": -1,
"userPrintedCardList": [],
}
print_list = []
next_idx = int(data["nextIndex"])
max_ct = int(data["maxCount"])
for x in range(next_idx, len(user_print_list)):
tmp = user_print_list[x]._asdict()
for row in rows[:max_ct]:
tmp = row._asdict()
print_list.append(tmp["cardId"])
if len(print_list) >= max_ct:
break
if len(print_list) >= max_ct:
next_idx = next_idx + max_ct
if len(rows) > max_ct:
next_idx += max_ct
else:
next_idx = -1
return {
"userId": data["userId"],
"userId": user_id,
"length": len(print_list),
"nextIndex": next_idx,
"userPrintedCardList": print_list,
@@ -340,10 +351,10 @@ class ChuniNew(ChuniBase):
"userCardPrintStateList": card_print_state_list,
}
async def handle_cm_get_user_character_api_request(self, data: Dict) -> Dict:
async def handle_c_m_get_user_character_api_request(self, data: Dict) -> Dict:
return await super().handle_get_user_character_api_request(data)
async def handle_cm_get_user_item_api_request(self, data: Dict) -> Dict:
async def handle_c_m_get_user_item_api_request(self, data: Dict) -> Dict:
return await super().handle_get_user_item_api_request(data)
async def handle_roll_gacha_api_request(self, data: Dict) -> Dict:
@@ -388,7 +399,7 @@ class ChuniNew(ChuniBase):
return {"length": len(rolled_cards), "gameGachaCardList": rolled_cards}
async def handle_cm_upsert_user_gacha_api_request(self, data: Dict) -> Dict:
async def handle_c_m_upsert_user_gacha_api_request(self, data: Dict) -> Dict:
upsert = data["cmUpsertUserGacha"]
user_id = data["userId"]
place_id = data["placeId"]
@@ -443,7 +454,7 @@ class ChuniNew(ChuniBase):
"userCardPrintStateList": card_print_state_list,
}
async def handle_cm_upsert_user_printlog_api_request(self, data: Dict) -> Dict:
async def handle_c_m_upsert_user_printlog_api_request(self, data: Dict) -> Dict:
return {
"returnCode": 1,
"orderId": 0,
@@ -451,7 +462,7 @@ class ChuniNew(ChuniBase):
"apiName": "CMUpsertUserPrintlogApi",
}
async def handle_cm_upsert_user_print_api_request(self, data: Dict) -> Dict:
async def handle_c_m_upsert_user_print_api_request(self, data: Dict) -> Dict:
user_print_detail = data["userPrintDetail"]
user_id = data["userId"]
@@ -476,7 +487,7 @@ class ChuniNew(ChuniBase):
"apiName": "CMUpsertUserPrintApi",
}
async def handle_cm_upsert_user_print_subtract_api_request(self, data: Dict) -> Dict:
async def handle_c_m_upsert_user_print_subtract_api_request(self, data: Dict) -> Dict:
upsert = data["userCardPrintState"]
user_id = data["userId"]
place_id = data["placeId"]
@@ -493,7 +504,7 @@ class ChuniNew(ChuniBase):
return {"returnCode": "1", "apiName": "CMUpsertUserPrintSubtractApi"}
async def handle_cm_upsert_user_print_cancel_api_request(self, data: Dict) -> Dict:
async def handle_c_m_upsert_user_print_cancel_api_request(self, data: Dict) -> Dict:
order_ids = data["orderIdList"]
user_id = data["userId"]

View File

@@ -11,8 +11,8 @@ class ChuniNewPlus(ChuniNew):
super().__init__(core_cfg, game_cfg)
self.version = ChuniConstants.VER_CHUNITHM_NEW_PLUS
async def handle_cm_get_user_preview_api_request(self, data: Dict) -> Dict:
user_data = await super().handle_cm_get_user_preview_api_request(data)
async def handle_c_m_get_user_preview_api_request(self, data: Dict) -> Dict:
user_data = await super().handle_c_m_get_user_preview_api_request(data)
# hardcode lastDataVersion for CardMaker 1.35 A028
user_data["lastDataVersion"] = "2.05.00"

View File

@@ -1,11 +1,16 @@
from logging import Logger
from typing import Optional
from os import walk, path
from os import walk, path, remove
import xml.etree.ElementTree as ET
from read import BaseReader
from PIL import Image
import configparser
import glob
from core.config import CoreConfig
from titles.chuni.database import ChuniData
from titles.chuni.const import ChuniConstants
from titles.chuni.schema.static import music as MusicTable
class ChuniReader(BaseReader):
@@ -35,16 +40,39 @@ class ChuniReader(BaseReader):
if self.opt_dir is not None:
data_dirs += self.get_data_directories(self.opt_dir)
we_diff = "4"
if self.version >= ChuniConstants.VER_CHUNITHM_NEW:
we_diff = "5"
# Convert any old assets created with a previous version of the importer
ChuniReader.ConvertOldAssets(self.logger)
# character images could be stored anywhere across all the data dirs. Map them first
self.logger.info(f"Mapping DDS image files...")
dds_images = dict()
for dir in data_dirs:
self.map_dds_images(dds_images, f"{dir}/ddsImage")
for dir in data_dirs:
self.logger.info(f"Read from {dir}")
await self.read_events(f"{dir}/event")
await self.read_music(f"{dir}/music")
await self.read_charges(f"{dir}/chargeItem")
await self.read_avatar(f"{dir}/avatarAccessory")
await self.read_login_bonus(f"{dir}/")
this_opt_id = await self.read_opt_info(dir) # this also treats A000 as an opt, which is intended
await self.read_events(f"{dir}/event", this_opt_id)
await self.read_music(f"{dir}/music", we_diff, this_opt_id)
await self.read_charges(f"{dir}/chargeItem", this_opt_id)
await self.read_avatar(f"{dir}/avatarAccessory", this_opt_id)
await self.read_login_bonus(f"{dir}/", this_opt_id)
await self.read_nameplate(f"{dir}/namePlate", this_opt_id)
await self.read_trophy(f"{dir}/trophy", this_opt_id)
await self.read_character(f"{dir}/chara", dds_images, this_opt_id)
await self.read_map_icon(f"{dir}/mapIcon", this_opt_id)
await self.read_system_voice(f"{dir}/systemVoice", this_opt_id)
await self.read_unlock_challenge(f"{dir}/unlockChallenge")
await self.read_linked_verse(f"{dir}/linkedVerse")
if self.version >= ChuniConstants.VER_CHUNITHM_X_VERSE:
await self.read_stage(f"{dir}/stage", this_opt_id)
async def read_login_bonus(self, root_dir: str) -> None:
async def read_login_bonus(self, root_dir: str, opt_id: Optional[int] = None) -> None:
for root, dirs, files in walk(f"{root_dir}loginBonusPreset"):
for dir in dirs:
if path.exists(f"{root}/{dir}/LoginBonusPreset.xml"):
@@ -55,12 +83,11 @@ class ChuniReader(BaseReader):
for name in xml_root.findall("name"):
id = name.find("id").text
name = name.find("str").text
is_enabled = (
True if xml_root.find("disableFlag").text == "false" else False
)
disableFlag = xml_root.find("disableFlag") # may not exist in older data
is_enabled = True if (disableFlag is None or disableFlag.text == "false") else False
result = await self.data.static.put_login_bonus_preset(
self.version, id, name, is_enabled
self.version, id, name, is_enabled, opt_id
)
if result is not None:
@@ -107,6 +134,7 @@ class ChuniReader(BaseReader):
item_num,
need_login_day_count,
login_bonus_category_type,
opt_id
)
if result is not None:
@@ -116,7 +144,7 @@ class ChuniReader(BaseReader):
f"Failed to insert login bonus {bonus_id}"
)
async def read_events(self, evt_dir: str) -> None:
async def read_events(self, evt_dir: str, opt_id: Optional[int] = None) -> None:
for root, dirs, files in walk(evt_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/Event.xml"):
@@ -131,14 +159,17 @@ class ChuniReader(BaseReader):
event_type = substances.find("type").text
result = await self.data.static.put_event(
self.version, id, event_type, name
self.version, id, event_type, name, opt_id
)
if result is not None:
self.logger.info(f"Inserted event {id}")
else:
self.logger.warning(f"Failed to insert event {id}")
async def read_music(self, music_dir: str) -> None:
async def read_music(self, music_dir: str, we_diff: str = "4", opt_id: Optional[int] = None) -> None:
max_title_len = MusicTable.columns["title"].type.length
max_artist_len = MusicTable.columns["artist"].type.length
for root, dirs, files in walk(music_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/Music.xml"):
@@ -149,9 +180,15 @@ class ChuniReader(BaseReader):
for name in xml_root.findall("name"):
song_id = name.find("id").text
title = name.find("str").text
if len(title) > max_title_len:
self.logger.warning(f"Truncating music {song_id} song title")
title = title[:max_title_len]
for artistName in xml_root.findall("artistName"):
artist = artistName.find("str").text
if len(artist) > max_artist_len:
self.logger.warning(f"Truncating music {song_id} artist name")
artist = artist[:max_artist_len]
for genreNames in xml_root.findall("genreNames"):
for list_ in genreNames.findall("list"):
@@ -160,6 +197,8 @@ class ChuniReader(BaseReader):
for jaketFile in xml_root.findall("jaketFile"): # nice typo, SEGA
jacket_path = jaketFile.find("path").text
# Save off image for use in frontend
self.copy_image(jacket_path, f"{root}/{dir}", "titles/chuni/img/jacket/")
for fumens in xml_root.findall("fumens"):
for MusicFumenData in fumens.findall("MusicFumenData"):
@@ -169,7 +208,7 @@ class ChuniReader(BaseReader):
chart_type = MusicFumenData.find("type")
chart_id = chart_type.find("id").text
chart_diff = chart_type.find("str").text
if chart_diff == "WorldsEnd" and (chart_id == "4" or chart_id == "5"): # 4 in SDBT, 5 in SDHD
if chart_diff == "WorldsEnd" and chart_id == we_diff: # 4 in SDBT, 5 in SDHD
level = float(xml_root.find("starDifType").text)
we_chara = (
xml_root.find("worldsEndTagName")
@@ -192,6 +231,7 @@ class ChuniReader(BaseReader):
genre,
jacket_path,
we_chara,
opt_id
)
if result is not None:
@@ -203,7 +243,7 @@ class ChuniReader(BaseReader):
f"Failed to insert music {song_id} chart {chart_id}"
)
async def read_charges(self, charge_dir: str) -> None:
async def read_charges(self, charge_dir: str, opt_id: Optional[int] = None) -> None:
for root, dirs, files in walk(charge_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/ChargeItem.xml"):
@@ -225,6 +265,7 @@ class ChuniReader(BaseReader):
expirationDays,
consumeType,
sellingAppeal,
opt_id
)
if result is not None:
@@ -232,7 +273,7 @@ class ChuniReader(BaseReader):
else:
self.logger.warning(f"Failed to insert charge {id}")
async def read_avatar(self, avatar_dir: str) -> None:
async def read_avatar(self, avatar_dir: str, opt_id: Optional[int] = None) -> None:
for root, dirs, files in walk(avatar_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/AvatarAccessory.xml"):
@@ -243,17 +284,374 @@ class ChuniReader(BaseReader):
for name in xml_root.findall("name"):
id = name.find("id").text
name = name.find("str").text
sortName = xml_root.find("sortName").text
category = xml_root.find("category").text
defaultHave = xml_root.find("defaultHave").text == 'true'
disableFlag = xml_root.find("disableFlag") # may not exist in older data
is_enabled = True if (disableFlag is None or disableFlag.text == "false") else False
for image in xml_root.findall("image"):
iconPath = image.find("path").text
self.copy_image(iconPath, f"{root}/{dir}", "titles/chuni/img/avatar/")
for texture in xml_root.findall("texture"):
texturePath = texture.find("path").text
self.copy_image(texturePath, f"{root}/{dir}", "titles/chuni/img/avatar/")
result = await self.data.static.put_avatar(
self.version, id, name, category, iconPath, texturePath
self.version, id, name, category, iconPath, texturePath, is_enabled, defaultHave, sortName, opt_id
)
if result is not None:
self.logger.info(f"Inserted avatarAccessory {id}")
else:
self.logger.warning(f"Failed to insert avatarAccessory {id}")
async def read_nameplate(self, nameplate_dir: str, opt_id: Optional[int] = None) -> None:
for root, dirs, files in walk(nameplate_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/NamePlate.xml"):
with open(f"{root}/{dir}/NamePlate.xml", "r", encoding='utf-8') as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
id = name.find("id").text
name = name.find("str").text
sortName = name if xml_root.find("sortName") is None else xml_root.find("sortName").text
defaultHave = xml_root.find("defaultHave").text == 'true'
disableFlag = xml_root.find("disableFlag") # may not exist in older data
is_enabled = True if (disableFlag is None or disableFlag.text == "false") else False
for image in xml_root.findall("image"):
texturePath = image.find("path").text
self.copy_image(texturePath, f"{root}/{dir}", "titles/chuni/img/nameplate/")
result = await self.data.static.put_nameplate(
self.version, id, name, texturePath, is_enabled, defaultHave, sortName, opt_id
)
if result is not None:
self.logger.info(f"Inserted nameplate {id}")
else:
self.logger.warning(f"Failed to insert nameplate {id}")
async def read_trophy(self, trophy_dir: str, opt_id: Optional[int] = None) -> None:
for root, dirs, files in walk(trophy_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/Trophy.xml"):
with open(f"{root}/{dir}/Trophy.xml", "r", encoding='utf-8') as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
id = name.find("id").text
name = name.find("str").text
rareType = xml_root.find("rareType").text
disableFlag = xml_root.find("disableFlag") # may not exist in older data
is_enabled = True if (disableFlag is None or disableFlag.text == "false") else False
defaultHave = xml_root.find("defaultHave").text == 'true'
result = await self.data.static.put_trophy(
self.version, id, name, rareType, is_enabled, defaultHave, opt_id
)
if result is not None:
self.logger.info(f"Inserted trophy {id}")
else:
self.logger.warning(f"Failed to insert trophy {id}")
async def read_character(self, chara_dir: str, dds_images: dict, opt_id: Optional[int] = None) -> None:
for root, dirs, files in walk(chara_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/Chara.xml"):
with open(f"{root}/{dir}/Chara.xml", "r", encoding='utf-8') as fp:
strdata = fp.read()
# ET may choke if there is a & symbol (which is present in some character xml)
if "&" in strdata:
strdata = strdata.replace("&", "&#38;")
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
id = name.find("id").text
name = name.find("str").text
sortName = name if xml_root.find("sortName") is None else xml_root.find("sortName").text
for work in xml_root.findall("works"):
worksName = work.find("str").text
rareType = xml_root.find("rareType").text
defaultHave = xml_root.find("defaultHave").text == 'true'
disableFlag = xml_root.find("disableFlag") # may not exist in older data
is_enabled = True if (disableFlag is None or disableFlag.text == "false") else False
# character images are not stored alongside the character xml; look them up in the dds image map
for image in xml_root.findall("defaultImages"):
imageKey = image.find("str").text
if imageKey in dds_images.keys():
(imageDir, imagePaths) = dds_images[imageKey]
imagePath1 = imagePaths[0] if len(imagePaths) > 0 else ""
imagePath2 = imagePaths[1] if len(imagePaths) > 1 else ""
imagePath3 = imagePaths[2] if len(imagePaths) > 2 else ""
# @note the third image is the image needed for the user box ui
if imagePath3:
self.copy_image(imagePath3, imageDir, "titles/chuni/img/character/")
else:
self.logger.warning(f"Character {id} only has {len(imagePaths)} images. Expected 3")
else:
self.logger.warning(f"Unable to locate character {id} images")
result = await self.data.static.put_character(
self.version, id, name, sortName, worksName, rareType, imagePath1, imagePath2, imagePath3, is_enabled, defaultHave, opt_id
)
if result is not None:
self.logger.info(f"Inserted character {id}")
else:
self.logger.warning(f"Failed to insert character {id}")
async def read_map_icon(self, mapicon_dir: str, opt_id: Optional[int] = None) -> None:
for root, dirs, files in walk(mapicon_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/MapIcon.xml"):
with open(f"{root}/{dir}/MapIcon.xml", "r", encoding='utf-8') as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
id = name.find("id").text
name = name.find("str").text
sortName = name if xml_root.find("sortName") is None else xml_root.find("sortName").text
for image in xml_root.findall("image"):
iconPath = image.find("path").text
self.copy_image(iconPath, f"{root}/{dir}", "titles/chuni/img/mapIcon/")
defaultHave = xml_root.find("defaultHave").text == 'true'
disableFlag = xml_root.find("disableFlag") # may not exist in older data
is_enabled = True if (disableFlag is None or disableFlag.text == "false") else False
result = await self.data.static.put_map_icon(
self.version, id, name, sortName, iconPath, is_enabled, defaultHave, opt_id
)
if result is not None:
self.logger.info(f"Inserted map icon {id}")
else:
self.logger.warning(f"Failed to insert map icon {id}")
async def read_system_voice(self, voice_dir: str, opt_id: Optional[int] = None) -> None:
for root, dirs, files in walk(voice_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/SystemVoice.xml"):
with open(f"{root}/{dir}/SystemVoice.xml", "r", encoding='utf-8') as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
id = name.find("id").text
name = name.find("str").text
sortName = name if xml_root.find("sortName") is None else xml_root.find("sortName").text
for image in xml_root.findall("image"):
imagePath = image.find("path").text
self.copy_image(imagePath, f"{root}/{dir}", "titles/chuni/img/systemVoice/")
defaultHave = xml_root.find("defaultHave").text == 'true'
disableFlag = xml_root.find("disableFlag") # may not exist in older data
is_enabled = True if (disableFlag is None or disableFlag.text == "false") else False
result = await self.data.static.put_system_voice(
self.version, id, name, sortName, imagePath, is_enabled, defaultHave, opt_id
)
if result is not None:
self.logger.info(f"Inserted system voice {id}")
else:
self.logger.warning(f"Failed to insert system voice {id}")
async def read_opt_info(self, directory: str) -> Optional[int]:
if not path.exists(f"{directory}/data.conf"):
self.logger.warning(f"{directory} does not contain data.conf, opt info will not be read")
return None
data_config = configparser.ConfigParser()
if not data_config.read(f"{directory}/data.conf", 'utf-8'):
self.logger.warning(f"{directory}/data.conf failed to read or parse, opt info will not be read")
return None
if 'Version' not in data_config:
self.logger.warning(f"{directory}/data.conf contains no Version section, opt info will not be read")
return None
if 'Name' not in data_config['Version']: # Probably not worth checking that the other sections exist
self.logger.warning(f"{directory}/data.conf contains no Name item in the Version section, opt info will not be read")
return None
if 'VerMajor' not in data_config['Version']: # Probably not worth checking that the other sections exist
self.logger.warning(f"{directory}/data.conf contains no VerMajor item in the Version section, opt info will not be read")
return None
if 'VerMinor' not in data_config['Version']: # Probably not worth checking that the other sections exist
self.logger.warning(f"{directory}/data.conf contains no VerMinor item in the Version section, opt info will not be read")
return None
if 'VerRelease' not in data_config['Version']: # Probably not worth checking that the other sections exist
self.logger.warning(f"{directory}/data.conf contains no VerRelease item in the Version section, opt info will not be read")
return None
opt_seq = data_config['Version']['VerRelease']
opt_folder = path.basename(path.normpath(directory))
opt_id = await self.data.static.get_opt_by_version_folder(self.version, opt_folder)
if not opt_id:
opt_id = await self.data.static.put_opt(self.version, opt_folder, opt_seq)
if not opt_id:
self.logger.error(f"Failed to put opt folder info for {opt_folder}")
return None
else:
opt_id = opt_id['id']
self.logger.info(f"Opt folder {opt_folder} (Database ID {opt_id}) contains {data_config['Version']['Name']} v{data_config['Version']['VerMajor']}.{data_config['Version']['VerMinor']}.{opt_seq}")
return opt_id
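# For reference, a minimal data.conf that passes the checks above looks roughly like this
# (values are illustrative placeholders, not taken from any real option package):
#
#   [Version]
#   Name=CHUNITHM
#   VerMajor=2
#   VerMinor=30
#   VerRelease=1
#
# read_opt_info("<data_dir>/A001") would then log something like
# "Opt folder A001 (Database ID 1) contains CHUNITHM v2.30.1" and return that database id.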
async def read_unlock_challenge(self, uc_dir: str) -> None:
for root, dirs, files in walk(uc_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/UnlockChallenge.xml"):
with open(f"{root}/{dir}/UnlockChallenge.xml", "r", encoding="utf-8") as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
id = name.find("id").text
name = name.find("str").text
course_ids = []
for course in xml_root.find("musicList/list/UnlockChallengeMusicListSubData/unlockChallengeMusicData/courseList/list").findall("UnlockChallengeCourseListSubData"):
course_id = course.find("unlockChallengeCourseData/courseName").find("id").text
course_ids.append(course_id)
# Build keyword arguments dynamically for up to 5 course IDs
course_kwargs = {
f"course_id{i+1}": course_ids[i]
for i in range(min(5, len(course_ids)))
}
result = await self.data.static.put_unlock_challenge(
self.version, id, name,
**course_kwargs
)
if result is not None:
self.logger.info(f"Inserted unlock challenge {id}")
else:
self.logger.warning(f"Failed to insert unlock challenge {id}")
async def read_linked_verse(self, lv_dir: str) -> None:
for root, dirs, files in walk(lv_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/LinkedVerse.xml"):
with open(f"{root}/{dir}/LinkedVerse.xml", "r", encoding="utf-8") as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
id = name.find("id").text
name = name.find("str").text
course_ids = []
for course in xml_root.find("musicList/list/LinkedVerseMusicListSubData/linkedVerseMusicData/courseList/list").findall("LinkedVerseCourseListSubData"):
course_id = course.find("linkedVerseCourseData/courseName").find("id").text
course_ids.append(course_id)
# Build keyword arguments dynamically for up to 5 course IDs
course_kwargs = {
f"course_id{i+1}": course_ids[i]
for i in range(min(5, len(course_ids)))
}
result = await self.data.static.put_linked_verse(
self.version, id, name,
**course_kwargs
)
if result is not None:
self.logger.info(f"Inserted Linked VERSE {id}")
else:
self.logger.warning(f"Failed to insert Linked VERSE {id}")
async def read_stage(self, stage_dir: str, opt_id: Optional[int] = None) -> None:
for root, dirs, files in walk(stage_dir):
for dir in dirs:
if path.exists(f"{root}/{dir}/Stage.xml"):
with open(f"{root}/{dir}/Stage.xml", "r", encoding='utf-8') as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
id = name.find("id").text
name = name.find("str").text
for image in xml_root.findall("image"):
image_path = image.find("path").text
self.copy_image(image_path, f"{root}/{dir}", "titles/chuni/img/stage/")
default_have = xml_root.find("defaultHave").text == 'true'
disable_flag = xml_root.find("disableFlag") # may not exist in older data
is_enabled = True if (disable_flag is None or disable_flag.text == "false") else False
result = await self.data.static.put_stage(
self.version, id, name, image_path, is_enabled, default_have, opt_id
)
if result is not None:
self.logger.info(f"Inserted stage {id}")
else:
self.logger.warning(f"Failed to insert stage {id}")
def copy_image(self, filename: str, src_dir: str, dst_dir: str) -> None:
# Convert the image to webp so we can easily display it in the frontend
file_src = path.join(src_dir, filename)
(basename, ext) = path.splitext(filename)
file_dst = path.join(dst_dir, basename) + ".webp"
if path.exists(file_src) and not path.exists(file_dst):
try:
im = Image.open(file_src)
im.save(file_dst)
except Exception:
self.logger.warning(f"Failed to convert {filename} to webp")
def ConvertOldAssets(logger: Logger):
"""
Converts any previously-imported png files to webp.
In the initial version of the userbox/avatar frontend support, png images were used, scraped via read.py.
The amount of data pushed once many items were unlocked was noticeable, so the frontend now uses webp
for these assets. If any png files are present, convert them to webp now.
"""
# Find all pngs under the /img directory
png_files = glob.glob(f'titles/chuni/img/**/*.png', recursive=True)
if len(png_files) > 0:
logger.info(f'Found {len(png_files)} old assets. Converting to webp... (may take a few minutes)')
for img_png in png_files:
img_webp = path.splitext(img_png)[0] + '.webp'
try:
# convert to webp
im = Image.open(img_png)
im.save(img_webp)
# delete the original file
remove(img_png)
except Exception as e:
logger.warning(f'Failed to convert {img_png} to webp')
logger.info(f'Conversion complete')
def map_dds_images(self, image_dict: dict, dds_dir: str) -> None:
for root, dirs, files in walk(dds_dir):
for dir in dirs:
directory = f"{root}/{dir}"
if path.exists(f"{directory}/DDSImage.xml"):
with open(f"{directory}/DDSImage.xml", "r", encoding='utf-8') as fp:
strdata = fp.read()
xml_root = ET.fromstring(strdata)
for name in xml_root.findall("name"):
name = name.find("str").text
images = []
i = 0
while xml_root.findall(f"ddsFile{i}"):
for ddsFile in xml_root.findall(f"ddsFile{i}"):
images += [ddsFile.find("path").text]
i += 1
image_dict[name] = (directory, images)
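# A hedged sketch of the structure map_dds_images builds (the key and file names below are
# hypothetical placeholders): a DDSImage.xml naming "CHU_UI_Character_0001_00" with three
# ddsFileN entries would yield
#
#   dds_images["CHU_UI_Character_0001_00"] = (
#       "<data_dir>/ddsImage/ddsImage000123",
#       ["..._00.dds", "..._01.dds", "..._02.dds"],
#   )
#
# read_character() looks the key up via each chara's defaultImages/str value and copies the
# third path (the userbox image) into titles/chuni/img/character/.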

View File

@@ -1,6 +1,6 @@
from titles.chuni.schema.profile import ChuniProfileData
from titles.chuni.schema.score import ChuniScoreData
from titles.chuni.schema.score import ChuniScoreData, ChuniRomVersion
from titles.chuni.schema.item import ChuniItemData
from titles.chuni.schema.static import ChuniStaticData
__all__ = ["ChuniProfileData", "ChuniScoreData", "ChuniItemData", "ChuniStaticData"]
__all__ = ["ChuniProfileData", "ChuniScoreData", "ChuniRomVersion", "ChuniItemData", "ChuniStaticData"]

View File

@@ -1,27 +1,28 @@
from typing import Dict, List, Optional
from sqlalchemy import (
Table,
Column,
UniqueConstraint,
PrimaryKeyConstraint,
Table,
UniqueConstraint,
and_,
delete,
)
from sqlalchemy.types import Integer, String, TIMESTAMP, Boolean, JSON
from sqlalchemy.engine.base import Connection
from sqlalchemy.schema import ForeignKey
from sqlalchemy.sql import func, select
from sqlalchemy.dialects.mysql import insert
from sqlalchemy.engine import Row
from sqlalchemy.schema import ForeignKey
from sqlalchemy.sql import func, select
from sqlalchemy.types import JSON, TIMESTAMP, Boolean, Integer, String
from core.data.schema import BaseData, metadata
character = Table(
character: Table = Table(
"chuni_item_character",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -40,12 +41,13 @@ character = Table(
mysql_charset="utf8mb4",
)
item = Table(
item: Table = Table(
"chuni_item_item",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -63,6 +65,7 @@ duel = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -85,6 +88,7 @@ map = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -107,6 +111,7 @@ map_area = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -127,6 +132,7 @@ gacha = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -141,12 +147,13 @@ gacha = Table(
mysql_charset="utf8mb4",
)
print_state = Table(
print_state: Table = Table(
"chuni_item_print_state",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -167,6 +174,7 @@ print_detail = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -197,6 +205,7 @@ login_bonus = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -210,12 +219,13 @@ login_bonus = Table(
mysql_charset="utf8mb4",
)
favorite = Table(
favorite: Table = Table(
"chuni_item_favorite",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -232,6 +242,7 @@ matching = Table(
Column("roomId", Integer, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -249,6 +260,7 @@ cmission = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -262,7 +274,12 @@ cmission_progress = Table(
"chuni_item_cmission_progress",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column("user", ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"), nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column("missionId", Integer, nullable=False),
Column("order", Integer),
Column("stage", Integer),
@@ -273,14 +290,66 @@ cmission_progress = Table(
mysql_charset="utf8mb4",
)
unlock_challenge = Table(
"chuni_item_unlock_challenge",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column("version", Integer, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column("unlockChallengeId", Integer, nullable=False),
Column("status", Integer),
Column("clearCourseId", Integer),
Column("conditionType", Integer),
Column("score", Integer),
Column("life", Integer),
Column("clearDate", TIMESTAMP, server_default=func.now()),
UniqueConstraint(
"version", "user", "unlockChallengeId", name="chuni_item_unlock_challenge_uk"
),
mysql_charset="utf8mb4",
)
linked_verse: Table = Table(
"chuni_item_linked_verse",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column("linkedVerseId", Integer, nullable=False),
Column("progress", String(255)),
Column("statusOpen", Integer),
Column("statusUnlock", Integer),
Column("isFirstClear", Integer),
Column("numClear", Integer),
Column("clearCourseId", Integer),
Column("clearCourseLevel", Integer),
Column("clearScore", Integer),
Column("clearDate", String(25)),
Column("clearUserId1", Integer),
Column("clearUserId2", Integer),
Column("clearUserId3", Integer),
Column("clearUserName0", String(20)),
Column("clearUserName1", String(20)),
Column("clearUserName2", String(20)),
Column("clearUserName3", String(20)),
UniqueConstraint("user", "linkedVerseId", name="chuni_item_linked_verse_uk"),
mysql_charset="utf8mb4",
)
class ChuniItemData(BaseData):
async def get_oldest_free_matching(self, version: int) -> Optional[Row]:
sql = matching.select(
and_(
matching.c.version == version,
matching.c.isFull == False
)
and_(matching.c.version == version, matching.c.isFull == False)
).order_by(matching.c.roomId.asc())
result = await self.execute(sql)
@@ -289,11 +358,9 @@ class ChuniItemData(BaseData):
return result.fetchone()
async def get_newest_matching(self, version: int) -> Optional[Row]:
sql = matching.select(
and_(
matching.c.version == version
)
).order_by(matching.c.roomId.desc())
sql = matching.select(and_(matching.c.version == version)).order_by(
matching.c.roomId.desc()
)
result = await self.execute(sql)
if result is None:
@@ -301,11 +368,7 @@ class ChuniItemData(BaseData):
return result.fetchone()
async def get_all_matchings(self, version: int) -> Optional[List[Row]]:
sql = matching.select(
and_(
matching.c.version == version
)
)
sql = matching.select(and_(matching.c.version == version))
result = await self.execute(sql)
if result is None:
@@ -329,7 +392,7 @@ class ChuniItemData(BaseData):
matching_member_info_list: List,
user_id: int = None,
rest_sec: int = 60,
is_full: bool = False
is_full: bool = False,
) -> Optional[int]:
sql = insert(matching).values(
roomId=room_id,
@@ -359,10 +422,33 @@ class ChuniItemData(BaseData):
return None
return result.lastrowid
async def get_all_favorites(
self, user_id: int, version: int, fav_kind: int = 1
) -> Optional[List[Row]]:
async def is_favorite(
self, user_id: int, version: int, fav_id: int, fav_kind: int = 1
) -> bool:
sql = favorite.select(
and_(
favorite.c.version == version,
favorite.c.user == user_id,
favorite.c.favId == fav_id,
favorite.c.favKind == fav_kind,
)
)
result = await self.execute(sql)
if result is None:
return False
return True if len(result.all()) else False
async def get_all_favorites(
self,
user_id: int,
version: int,
fav_kind: int = 1,
limit: Optional[int] = None,
offset: Optional[int] = None,
) -> Optional[List[Row]]:
sql = select(favorite).where(
and_(
favorite.c.version == version,
favorite.c.user == user_id,
@@ -370,6 +456,13 @@ class ChuniItemData(BaseData):
)
)
if limit is not None or offset is not None:
sql = sql.order_by(favorite.c.id)
if limit is not None:
sql = sql.limit(limit)
if offset is not None:
sql = sql.offset(offset)
result = await self.execute(sql)
if result is None:
return None
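The same paging pattern (order by the surrogate id, then apply limit/offset only when the caller asked for paging) repeats across get_all_favorites, get_characters, get_items, get_user_print_states and get_courses in this diff. A hedged generic helper, not part of the codebase, that captures the pattern:
from typing import Optional
from sqlalchemy import Table
from sqlalchemy.sql.expression import Select
def apply_paging(sql: Select, table: Table, limit: Optional[int] = None, offset: Optional[int] = None) -> Select:
    # Impose a deterministic order only when paging is requested, exactly as the
    # getters above do; an un-paged query is returned untouched.
    if limit is not None or offset is not None:
        sql = sql.order_by(table.c.id)
    if limit is not None:
        sql = sql.limit(limit)
    if offset is not None:
        sql = sql.offset(offset)
    return sql
Usage would look like apply_paging(select(favorite).where(favorite.c.user == user_id), favorite, limit, offset) before handing the statement to self.execute().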
@@ -421,6 +514,39 @@ class ChuniItemData(BaseData):
return None
return result.fetchone()
async def put_favorite_music(
self, user_id: int, version: int, music_id: int
) -> Optional[int]:
sql = insert(favorite).values(
user=user_id, version=version, favId=music_id, favKind=1
)
conflict = sql.on_duplicate_key_update(
user=user_id, version=version, favId=music_id, favKind=1
)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
async def delete_favorite_music(
self, user_id: int, version: int, music_id: int
) -> Optional[int]:
sql = delete(favorite).where(
and_(
favorite.c.user == user_id,
favorite.c.version == version,
favorite.c.favId == music_id,
favorite.c.favKind == 1,
)
)
result = await self.execute(sql)
if result is None:
return None
return result.lastrowid
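A hedged sketch of how a caller (the frontend favourites page, for instance) might combine the three helpers above into a toggle; the data.item handle and the argument values are assumptions for illustration only:
async def toggle_favorite_music(data, user_id: int, version: int, music_id: int) -> None:
    # data.item is assumed to be the ChuniItemData instance wired up elsewhere in ARTEMiS.
    if await data.item.is_favorite(user_id, version, music_id):
        await data.item.delete_favorite_music(user_id, version, music_id)
    else:
        await data.item.put_favorite_music(user_id, version, music_id)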
async def put_character(self, user_id: int, character_data: Dict) -> Optional[int]:
character_data["user"] = user_id
@@ -444,9 +570,18 @@ class ChuniItemData(BaseData):
return None
return result.fetchone()
async def get_characters(self, user_id: int) -> Optional[List[Row]]:
async def get_characters(
self, user_id: int, limit: Optional[int] = None, offset: Optional[int] = None
) -> Optional[List[Row]]:
sql = select(character).where(character.c.user == user_id)
if limit is not None or offset is not None:
sql = sql.order_by(character.c.id)
if limit is not None:
sql = sql.limit(limit)
if offset is not None:
sql = sql.offset(offset)
result = await self.execute(sql)
if result is None:
return None
@@ -465,13 +600,26 @@ class ChuniItemData(BaseData):
return None
return result.lastrowid
async def get_items(self, user_id: int, kind: int = None) -> Optional[List[Row]]:
if kind is None:
sql = select(item).where(item.c.user == user_id)
else:
sql = select(item).where(
and_(item.c.user == user_id, item.c.itemKind == kind)
)
async def get_items(
self,
user_id: int,
kind: Optional[int] = None,
limit: Optional[int] = None,
offset: Optional[int] = None,
) -> Optional[List[Row]]:
cond = item.c.user == user_id
if kind is not None:
cond &= item.c.itemKind == kind
sql = select(item).where(cond)
if limit is not None or offset is not None:
sql = sql.order_by(item.c.id)
if limit is not None:
sql = sql.limit(limit)
if offset is not None:
sql = sql.offset(offset)
result = await self.execute(sql)
if result is None:
@@ -533,8 +681,12 @@ class ChuniItemData(BaseData):
return None
return result.lastrowid
async def get_map_areas(self, user_id: int) -> Optional[List[Row]]:
sql = select(map_area).where(map_area.c.user == user_id)
async def get_map_areas(
self, user_id: int, map_area_ids: List[int]
) -> Optional[List[Row]]:
sql = select(map_area).where(
map_area.c.user == user_id, map_area.c.mapAreaId.in_(map_area_ids)
)
result = await self.execute(sql)
if result is None:
@@ -565,15 +717,26 @@ class ChuniItemData(BaseData):
return result.lastrowid
async def get_user_print_states(
self, aime_id: int, has_completed: bool = False
self,
aime_id: int,
has_completed: bool = False,
limit: Optional[int] = None,
offset: Optional[int] = None,
) -> Optional[List[Row]]:
sql = print_state.select(
sql = select(print_state).where(
and_(
print_state.c.user == aime_id,
print_state.c.hasCompleted == has_completed,
)
)
if limit is not None or offset is not None:
sql = sql.order_by(print_state.c.id)
if limit is not None:
sql = sql.limit(limit)
if offset is not None:
sql = sql.offset(offset)
result = await self.execute(sql)
if result is None:
return None
@@ -624,7 +787,7 @@ class ChuniItemData(BaseData):
)
return None
return result.lastrowid
async def put_cmission_progress(
self, user_id: int, mission_id: int, progress_data: Dict
) -> Optional[int]:
@@ -634,10 +797,10 @@ class ChuniItemData(BaseData):
sql = insert(cmission_progress).values(**progress_data)
conflict = sql.on_duplicate_key_update(**progress_data)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
async def get_cmission_progress(
@@ -650,21 +813,21 @@ class ChuniItemData(BaseData):
)
).order_by(cmission_progress.c.order.asc())
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
async def get_cmission(self, user_id: int, mission_id: int) -> Optional[Row]:
sql = cmission.select(
and_(cmission.c.user == user_id, cmission.c.missionId == mission_id)
)
result = await self.execute(sql)
if result is None:
return None
return result.fetchone()
async def put_cmission(self, user_id: int, mission_data: Dict) -> Optional[int]:
@@ -673,17 +836,65 @@ class ChuniItemData(BaseData):
sql = insert(cmission).values(**mission_data)
conflict = sql.on_duplicate_key_update(**mission_data)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
async def get_cmissions(self, user_id: int) -> Optional[List[Row]]:
sql = cmission.select(cmission.c.user == user_id)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
async def put_unlock_challenge(
self, user_id: int, version: int, unlock_challenge_data: Dict
) -> Optional[int]:
unlock_challenge_data["user"] = user_id
unlock_challenge_data["version"] = version
sql = insert(unlock_challenge).values(**unlock_challenge_data)
conflict = sql.on_duplicate_key_update(**unlock_challenge_data)
result = await self.execute(conflict)
if result is None:
return None
return result.lastrowid
async def get_unlock_challenges(
self, user_id: int, version: int
) -> Optional[List[Row]]:
sql = unlock_challenge.select(
and_(
unlock_challenge.c.user == user_id,
unlock_challenge.c.version == version,
)
)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
async def get_linked_verse(self, aime_id: int) -> Optional[List[Row]]:
result = await self.execute(
linked_verse.select().where(linked_verse.c.user == aime_id)
)
if result:
return result.fetchall()
async def put_linked_verse(self, aime_id: int, linked_verse_data: Dict):
linked_verse_data = self.fix_bools(linked_verse_data)
sql = insert(linked_verse).values(user=aime_id, **linked_verse_data)
conflict = sql.on_duplicate_key_update(**linked_verse_data)
result = await self.execute(conflict)
if result:
return result.inserted_primary_key["id"]
self.logger.error("Failed to put Linked Verse data for user %s", aime_id)
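A hedged sketch of writing and reading back a LINKED VERSE record with the new helpers; the payload keys mirror the chuni_item_linked_verse columns above, and every value (including the progress string format) is a placeholder, item_db standing in for a ChuniItemData instance:
payload = {
    "linkedVerseId": 1,       # placeholder event id
    "progress": "0,0,0,0,0",  # placeholder; real format comes from the game upsert
    "statusOpen": 1,
    "statusUnlock": 0,
    "isFirstClear": 0,
    "numClear": 0,
}
await item_db.put_linked_verse(aime_id=10000, linked_verse_data=payload)
rows = await item_db.get_linked_verse(10000)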

View File

@@ -15,6 +15,7 @@ profile = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -25,6 +26,8 @@ profile = Table(
Column("frameId", Integer),
Column("isMaimai", Boolean),
Column("trophyId", Integer),
Column("trophyIdSub1", Integer, server_default="-1"),
Column("trophyIdSub2", Integer, server_default="-1"),
Column("userName", String(25)),
Column("isWebJoin", Boolean),
Column("playCount", Integer),
@@ -35,13 +38,13 @@ profile = Table(
Column("friendCount", Integer),
Column("lastPlaceId", Integer),
Column("nameplateId", Integer),
Column("totalMapNum", Integer),
Column("totalMapNum", BigInteger),
Column("lastAllNetId", Integer),
Column("lastClientId", String(25)),
Column("lastPlayDate", String(25)),
Column("lastRegionId", Integer),
Column("playerRating", Integer),
Column("totalHiScore", Integer),
Column("totalHiScore", BigInteger),
Column("webLimitDate", String(25)),
Column("firstPlayDate", String(25)),
Column("highestRating", Integer),
@@ -59,12 +62,12 @@ profile = Table(
Column("firstDataVersion", String(25)),
Column("reincarnationNum", Integer),
Column("playedTutorialBit", Integer),
Column("totalBasicHighScore", Integer),
Column("totalExpertHighScore", Integer),
Column("totalMasterHighScore", Integer),
Column("totalRepertoireCount", Integer),
Column("totalBasicHighScore", BigInteger),
Column("totalExpertHighScore", BigInteger),
Column("totalMasterHighScore", BigInteger),
Column("totalRepertoireCount", BigInteger),
Column("firstTutorialCancelNum", Integer),
Column("totalAdvancedHighScore", Integer),
Column("totalAdvancedHighScore", BigInteger),
Column("masterTutorialCancelNum", Integer),
Column("ext1", Integer), # Added in chunew
Column("ext2", Integer),
@@ -111,7 +114,7 @@ profile = Table(
Column("classEmblemBase", Integer, server_default="0"),
Column("battleRankPoint", Integer, server_default="0"),
Column("netBattle2ndCount", Integer, server_default="0"),
Column("totalUltimaHighScore", Integer, server_default="0"),
Column("totalUltimaHighScore", BigInteger, server_default="0"),
Column("skillId", Integer, server_default="0"),
Column("lastCountryCode", String(5), server_default="JPN"),
Column("isNetBattleHost", Boolean, server_default="0"),
@@ -129,6 +132,9 @@ profile = Table(
Column("avatarFront", Integer, server_default="0"),
Column("avatarSkin", Integer, server_default="0"),
Column("avatarHead", Integer, server_default="0"),
Column(
"stageId", Integer, server_default="99999", nullable=False
), # 99999 is the pseudo stage ID for unset stage
UniqueConstraint("user", "version", name="chuni_profile_profile_uk"),
mysql_charset="utf8mb4",
)
@@ -139,6 +145,7 @@ profile_ex = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -187,6 +194,7 @@ option = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -250,6 +258,7 @@ option_ex = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -283,6 +292,7 @@ recent_rating = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -297,6 +307,7 @@ region = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -312,6 +323,7 @@ activity = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -334,6 +346,7 @@ charge = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -354,6 +367,7 @@ emoney = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -373,6 +387,7 @@ overpower = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -400,6 +415,7 @@ rating = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -439,6 +455,71 @@ class ChuniProfileData(BaseData):
return False
return True
async def update_map_icon(self, user_id: int, version: int, new_map_icon: int) -> bool:
sql = profile.update((profile.c.user == user_id) & (profile.c.version == version)).values(
mapIconId=new_map_icon
)
result = await self.execute(sql)
if result is None:
self.logger.warning(f"Failed to set user {user_id} map icon")
return False
return True
async def update_system_voice(self, user_id: int, version: int, new_system_voice: int) -> bool:
sql = profile.update((profile.c.user == user_id) & (profile.c.version == version)).values(
voiceId=new_system_voice
)
result = await self.execute(sql)
if result is None:
self.logger.warning(f"Failed to set user {user_id} system voice")
return False
return True
async def update_stage(self, user_id: int, version: int, new_stage: int) -> bool:
sql = profile.update((profile.c.user == user_id) & (profile.c.version == version)).values(
stageId=new_stage
)
result = await self.execute(sql)
if result is None:
self.logger.warning(f"Failed to set user {user_id} stage")
return False
return True
async def update_userbox(self, user_id: int, version: int, new_nameplate: int, new_trophy: int, new_trophy_sub_1: int, new_trophy_sub_2: int, new_character: int) -> bool:
sql = profile.update((profile.c.user == user_id) & (profile.c.version == version)).values(
nameplateId=new_nameplate,
trophyId=new_trophy,
trophyIdSub1=new_trophy_sub_1,
trophyIdSub2=new_trophy_sub_2,
charaIllustId=new_character
)
result = await self.execute(sql)
if result is None:
self.logger.warning(f"Failed to set user {user_id} userbox")
return False
return True
async def update_avatar(self, user_id: int, version: int, new_wear: int, new_face: int, new_head: int, new_skin: int, new_item: int, new_front: int, new_back: int) -> bool:
sql = profile.update((profile.c.user == user_id) & (profile.c.version == version)).values(
avatarWear=new_wear,
avatarFace=new_face,
avatarHead=new_head,
avatarSkin=new_skin,
avatarItem=new_item,
avatarFront=new_front,
avatarBack=new_back
)
result = await self.execute(sql)
if result is None:
self.logger.warning(f"Failed to set user {user_id} avatar")
return False
return True
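Each of these update_* helpers issues a single UPDATE keyed on (user, version) and reports success as a bool; a hedged example call from the web UI side, with profile_db standing in for a ChuniProfileData instance and placeholder IDs throughout:
ok = await profile_db.update_userbox(
    user_id=10000,          # placeholder aime user id
    version=17,             # placeholder internal game version
    new_nameplate=1,
    new_trophy=0,
    new_trophy_sub_1=-1,    # -1 matches the schema default for the sub trophy slots
    new_trophy_sub_2=-1,
    new_character=0,
)
if not ok:
    raise RuntimeError("userbox update failed")  # surface the failure however the caller prefers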
async def put_profile_data(
self, aime_id: int, version: int, profile_data: Dict
) -> Optional[int]:
@@ -713,7 +794,7 @@ class ChuniProfileData(BaseData):
existing_team = self.get_team_by_id(team_id)
if existing_team is None or "userTeamPoint" not in existing_team:
self.logger.warn(
self.logger.warning(
f"update_team: Failed to update team! team id: {team_id}. Existing team data not found."
)
return False
@@ -743,7 +824,7 @@ class ChuniProfileData(BaseData):
result = await self.execute(conflict)
if result is None:
self.logger.warn(
self.logger.warning(
f"update_team: Failed to update team! team id: {team_id}"
)
return False
@@ -756,12 +837,13 @@ class ChuniProfileData(BaseData):
if result is None:
return None
return result.fetchone()
async def get_overview(self) -> Dict:
# Fetch and add up all the playcounts
playcount_sql = await self.execute(select(profile.c.playCount))
if playcount_sql is None:
self.logger.warn(
self.logger.warning(
f"get_overview: Couldn't pull playcounts"
)
return 0
@@ -790,7 +872,7 @@ class ChuniProfileData(BaseData):
result = await self.execute(sql)
if result is None:
self.logger.warn(
self.logger.warning(
f"put_profile_rating: Could not insert {rating_type}, aime_id: {aime_id}",
)
return
@@ -846,4 +928,4 @@ class ChuniProfileData(BaseData):
async def get_net_battle(self, aime_id: int) -> Optional[Row]:
result = await self.execute(net_battle.select(net_battle.c.user == aime_id))
if result:
return result.fetchone()
return result.fetchone()

View File

@@ -1,20 +1,23 @@
from typing import Dict, List, Optional
from sqlalchemy import Table, Column, UniqueConstraint, PrimaryKeyConstraint, and_
from sqlalchemy.types import Integer, String, TIMESTAMP, Boolean, JSON, BigInteger
from sqlalchemy.engine.base import Connection
from sqlalchemy.schema import ForeignKey
from sqlalchemy.engine import Row
from sqlalchemy.sql import func, select
from sqlalchemy import Column, Table, UniqueConstraint
from sqlalchemy.dialects.mysql import insert
from sqlalchemy.sql.expression import exists
from sqlalchemy.engine import Row
from sqlalchemy.schema import ForeignKey
from sqlalchemy.sql import func, select
from sqlalchemy.types import Boolean, Integer, String
from core.data.schema import BaseData, metadata
course = Table(
from ..config import ChuniConfig
course: Table = Table(
"chuni_score_course",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -40,12 +43,13 @@ course = Table(
mysql_charset="utf8mb4",
)
best_score = Table(
best_score: Table = Table(
"chuni_score_best",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -77,6 +81,7 @@ playlog = Table(
Column("id", Integer, primary_key=True, nullable=False),
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
@@ -137,14 +142,121 @@ playlog = Table(
Column("regionId", Integer),
Column("machineType", Integer),
Column("ticketId", Integer),
Column("monthPoint", Integer),
Column("eventPoint", Integer),
mysql_charset="utf8mb4"
)
class ChuniRomVersion():
"""
Class used to easily compare rom version strings and map back to the internal integer version.
Used with methods that touch the playlog table.
"""
Versions = {}
def init_versions(cfg: ChuniConfig):
if len(ChuniRomVersion.Versions) > 0:
# don't bother with reinit
return
# Build up an easily comparable list of versions. Used when deriving romVersion from the playlog
all_versions = {
10: ChuniRomVersion("1.50.0"),
9: ChuniRomVersion("1.45.0"),
8: ChuniRomVersion("1.40.0"),
7: ChuniRomVersion("1.35.0"),
6: ChuniRomVersion("1.30.0"),
5: ChuniRomVersion("1.25.0"),
4: ChuniRomVersion("1.20.0"),
3: ChuniRomVersion("1.15.0"),
2: ChuniRomVersion("1.10.0"),
1: ChuniRomVersion("1.05.0"),
0: ChuniRomVersion("1.00.0")
}
# add the versions from the config
for ver in range(11,999):
cfg_ver = cfg.version.version(ver)
if cfg_ver:
all_versions[ver] = ChuniRomVersion(cfg_ver["rom"])
else:
break
# sort it by version number for easy iteration
ChuniRomVersion.Versions = dict(sorted(all_versions.items()))
def __init__(self, rom_version: Optional[str] = None) -> None:
if rom_version is None:
self.major = 0
self.minor = 0
self.maint = 0
self.version = "0.00.00"
return
(major, minor, maint) = rom_version.split('.')
self.major = int(major)
self.minor = int(minor)
self.maint = int(maint)
self.version = rom_version
def __str__(self) -> str:
return self.version
def __eq__(self, other) -> bool:
return (self.major == other.major and
self.minor == other.minor and
self.maint == other.maint)
def __lt__(self, other) -> bool:
return (self.major < other.major) or \
(self.major == other.major and self.minor < other.minor) or \
(self.major == other.major and self.minor == other.minor and self.maint < other.maint)
def __gt__(self, other) -> bool:
return (self.major > other.major) or \
(self.major == other.major and self.minor > other.minor) or \
(self.major == other.major and self.minor == other.minor and self.maint > other.maint)
def get_int_version(self) -> int:
"""
Used when displaying the playlog to walk backwards from the recorded romVersion to our internal version number.
This is effectively a workaround to avoid recording our internal version number along with the romVersion in the db at insert time.
"""
for ver,rom in ChuniRomVersion.Versions.items():
# if the version matches exactly, great!
if self == rom:
return ver
# If this isn't the last version, use the next as an upper bound
if ver + 1 < len(ChuniRomVersion.Versions):
if self > rom and self < ChuniRomVersion.Versions[ver + 1]:
# this version fits in the middle! It must be a revision of the version
# e.g. 2.15.00 vs 2.16.00
return ver
else:
# this is the last version in the list.
# If it's greater than this one and still the same major, call it a match
if self.major == rom.major and self > rom:
return ver
# The only way we get here is if it was a version that started with "0.", which is definitely invalid
return -1
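A minimal usage sketch for ChuniRomVersion, assuming init_versions() has already been called at startup with a loaded ChuniConfig; the version strings are illustrative:
recorded = ChuniRomVersion("2.16.00")           # e.g. a value read back from playlog.romVersion
newer = recorded > ChuniRomVersion("2.15.00")   # True - plain major/minor/maint comparison
internal = recorded.get_int_version()           # internal version whose rom range contains 2.16.00
Since 2.16.00 would not be an exact entry in the Versions table, get_int_version() falls back to the range check and reports the version whose configured rom number it sits just above.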
class ChuniScoreData(BaseData):
async def get_courses(self, aime_id: int) -> Optional[Row]:
async def get_courses(
self,
aime_id: int,
limit: Optional[int] = None,
offset: Optional[int] = None,
) -> Optional[List[Row]]:
sql = select(course).where(course.c.user == aime_id)
if limit is not None or offset is not None:
sql = sql.order_by(course.c.id)
if limit is not None:
sql = sql.limit(limit)
if offset is not None:
sql = sql.offset(offset)
result = await self.execute(sql)
if result is None:
return None
@@ -162,8 +274,45 @@ class ChuniScoreData(BaseData):
return None
return result.lastrowid
async def get_scores(self, aime_id: int) -> Optional[Row]:
sql = select(best_score).where(best_score.c.user == aime_id)
async def get_scores(
self,
aime_id: int,
levels: Optional[list[int]] = None,
limit: Optional[int] = None,
offset: Optional[int] = None,
) -> Optional[List[Row]]:
condition = best_score.c.user == aime_id
if levels is not None:
condition &= best_score.c.level.in_(levels)
if limit is None and offset is None:
sql = (
select(best_score)
.where(condition)
.order_by(best_score.c.musicId.asc(), best_score.c.level.asc())
)
else:
subq = (
select(best_score.c.musicId)
.distinct()
.where(condition)
.order_by(best_score.c.musicId)
)
if limit is not None:
subq = subq.limit(limit)
if offset is not None:
subq = subq.offset(offset)
subq = subq.subquery()
sql = (
select(best_score)
.join(subq, best_score.c.musicId == subq.c.musicId)
.where(condition)
.order_by(best_score.c.musicId, best_score.c.level)
)
result = await self.execute(sql)
if result is None:
@@ -190,88 +339,90 @@ class ChuniScoreData(BaseData):
return None
return result.fetchall()
async def get_playlogs_limited(self, aime_id: int, index: int, count: int) -> Optional[Row]:
sql = select(playlog).where(playlog.c.user == aime_id).order_by(playlog.c.id.desc()).limit(count).offset(index * count)
async def get_playlog_rom_versions_by_int_version(self, version: int, aime_id: int = -1) -> Optional[List[str]]:
# Get a set of all romVersion values present
sql = select([playlog.c.romVersion])
if aime_id != -1:
# limit results to a specific user
sql = sql.where(playlog.c.user == aime_id)
sql = sql.distinct()
result = await self.execute(sql)
if result is None:
self.logger.warning(f" aime_id {aime_id} has no playlog ")
return None
record_versions = result.fetchall()
# For each romVersion recorded, check if it maps back to the current version we are operating on
matching_rom_versions = []
for v in record_versions:
# Do this to prevent null romVersion from causing an error in ChuniRomVersion.__init__()
if v[0] is None:
continue
if ChuniRomVersion(v[0]).get_int_version() == version:
matching_rom_versions += [v[0]]
self.logger.debug(f"romVersions {matching_rom_versions} map to version {version}")
return matching_rom_versions
async def get_playlogs_limited(self, aime_id: int, version: int, index: int, count: int) -> Optional[List[Row]]:
# Get a list of all the recorded romVersions in the playlog
# for this user that map to the given version.
rom_versions = await self.get_playlog_rom_versions_by_int_version(version, aime_id)
if rom_versions is None:
return None
# Query results that have the matching romVersions
sql = select(playlog).where((playlog.c.user == aime_id) & (playlog.c.romVersion.in_(rom_versions))).order_by(playlog.c.id.desc()).limit(count).offset(index * count)
result = await self.execute(sql)
if result is None:
self.logger.info(f" aime_id {aime_id} has no playlog for version {version}")
return None
return result.fetchall()
async def get_user_playlogs_count(self, aime_id: int) -> Optional[Row]:
sql = select(func.count()).where(playlog.c.user == aime_id)
async def get_user_playlogs_count(self, aime_id: int, version: int) -> Optional[int]:
# Get a list of all the recorded romVersions in the playlog
# for this user that map to the given version.
rom_versions = await self.get_playlog_rom_versions_by_int_version(version, aime_id)
if rom_versions is None:
return None
# Query results that have the matching romVersions
sql = select(func.count()).where((playlog.c.user == aime_id) & (playlog.c.romVersion.in_(rom_versions)))
result = await self.execute(sql)
if result is None:
self.logger.warning(f" aime_id {aime_id} has no playlog ")
return None
self.logger.info(f" aime_id {aime_id} has no playlog for version {version}")
return 0
return result.scalar()
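Putting the two helpers together, a hedged caller-side sketch for paging through one user's plays on a single version; score_db stands in for a ChuniScoreData instance and the page size is arbitrary:
async def dump_version_playlog(score_db, aime_id: int, version: int, page_size: int = 20) -> None:
    total = await score_db.get_user_playlogs_count(aime_id, version)
    if not total:
        return
    pages = (total + page_size - 1) // page_size
    for index in range(pages):
        rows = await score_db.get_playlogs_limited(aime_id, version, index, page_size)
        for row in rows or []:
            # only columns referenced elsewhere in this diff are used here
            print(row.musicId, row.level, row.romVersion)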
async def put_playlog(self, aime_id: int, playlog_data: Dict, version: int) -> Optional[int]:
# Calculate the ROM version that should be inserted into the DB, based on the version of the game being inserted
# We only need this for Version 10 (Plost) and earlier, as newer versions include romVersion in their upsert
# This matters both for gameRankings and for a future DB update to keep version data separate
romVer = {
10: "1.50.0",
9: "1.45.0",
8: "1.40.0",
7: "1.35.0",
6: "1.30.0",
5: "1.25.0",
4: "1.20.0",
3: "1.15.0",
2: "1.10.0",
1: "1.05.0",
0: "1.00.0"
}
playlog_data["user"] = aime_id
playlog_data = self.fix_bools(playlog_data)
# If the romVersion is not in the data (Version 10 and earlier), look it up from our internal mapping
if "romVersion" not in playlog_data:
playlog_data["romVersion"] = romVer.get(version, "1.00.0")
playlog_data["romVersion"] = ChuniRomVersion.Versions[version]
sql = insert(playlog).values(**playlog_data)
conflict = sql.on_duplicate_key_update(**playlog_data)
result = await self.execute(conflict)
result = await self.execute(sql)
if result is None:
return None
return result.lastrowid
async def get_rankings(self, version: int) -> Optional[List[Dict]]:
# Calculates the ROM version that should be fetched for rankings, based on the game version being retrieved
# This prevents tracks that are not accessible in your version from counting towards the 10 results
romVer = {
15: "2.20%",
14: "2.15%",
13: "2.10%",
12: "2.05%",
11: "2.00%",
10: "1.50%",
9: "1.45%",
8: "1.40%",
7: "1.35%",
6: "1.30%",
5: "1.25%",
4: "1.20%",
3: "1.15%",
2: "1.10%",
1: "1.05%",
0: "1.00%"
}
sql = select([playlog.c.musicId.label('id'), func.count(playlog.c.musicId).label('point')]).where((playlog.c.level != 4) & (playlog.c.romVersion.like(romVer.get(version, "%")))).group_by(playlog.c.musicId).order_by(func.count(playlog.c.musicId).desc()).limit(10)
# Get a list of all the recorded romVersions in the playlog for the given version
rom_versions = await self.get_playlog_rom_versions_by_int_version(version)
if rom_versions is None:
return None
# Query results that have the matching romVersions
sql = select([playlog.c.musicId.label('id'), func.count(playlog.c.musicId).label('point')]).where((playlog.c.level != 4) & (playlog.c.romVersion.in_(rom_versions))).group_by(playlog.c.musicId).order_by(func.count(playlog.c.musicId).desc()).limit(10)
result = await self.execute(sql)
if result is None:
return None
rows = result.fetchall()
return [dict(row) for row in rows]
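get_rankings() returns at most ten entries shaped like {"id": musicId, "point": play count}, already restricted to romVersions that map to the requested internal version; a hedged consumer sketch, with score_db standing in for a ChuniScoreData instance and an illustrative version number:
rankings = await score_db.get_rankings(version=17)
for rank, entry in enumerate(rankings or [], start=1):
    print(f"#{rank}: musicId={entry['id']} plays={entry['point']}")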
async def get_rival_music(self, rival_id: int) -> Optional[List[Dict]]:
sql = select(best_score).where(best_score.c.user == rival_id)
result = await self.execute(sql)
if result is None:
return None
return result.fetchall()
return [dict(row) for row in rows]

Some files were not shown because too many files have changed in this diff.