Author Topic: Glacie 0.1.0: a library for manipulating game files  (Read 423 times)


Offline lixiss

  • Member
  • *
  • Topic Author
  • Posts: 41
  • Country: ru
  • Karma: +1/-0
  • Gender: Male
    • View Profile
    • Awards
  • Time Zone: +3
Glacie 0.1.0: a library for manipulating game files
« on: 18 August 2020, 14:36:23 »
Hello.

I've just published the library sources on GitHub: glacie and glacie-tools. The library is also available as packages on nuget.org. It currently lacks any readmes (it was a long night...), so I'll describe it briefly here.

So, what is this about? It is a helper library for "procedural" mod development. I got into a mess with these .dbr files in the past: I wanted to modify a few fields in some record, but had to import the full record instead, which forces you to keep your changes aside. Another case: applying the same expression over different equation files is stupid work (it needs templating, or you apply it manually). Yet another: you can't simply write something experimental, then enable/disable the feature and rebuild the mod (what if the feature affects all monsters?). Someone may remember my very dirty tool glacie-checkdb (which I don't think actually did the job right); it worked very slowly, because working over flat .dbr files is not fine, and it raised a lot of issues. So I had held the general idea for this library for a long time, but only got to it recently.

The library should be considered a preview, and some things are not completed. However, it can already handle some work (unfortunately a lot less than I want...).

It is split into several components (and packages):

- Glacie.Data.Arz - exposes ArzDatabase, ArzRecord and ArzField, as well as ArzReader and ArzWriter.
   ArzDatabase represents the content of a single ARZ file. The database can generally be treated as a collection of named records, while a record can be treated as a collection of fields. This component focuses on the exact ARZ representation: it doesn't support templates (and never will), it doesn't care about case-insensitive record lookups and such, and it doesn't validate your edits. It just provides access to the data in the file as-is, and of course allows modifying that data (add/remove records, add/remove/set field values). The API is not final and can change in the future.

  So, code which opens a database and turns all items into rares (so they can be enchanted) could look something like this:
Code: [Select]
            using var database = ArzDatabase.Open("path/to/database.arz");
            foreach (var record in database.GetAll())
            {
                if (record.TryGet("itemClassification", out var field))
                {
                    var value = field.Get<string>();

                    if (value == "Epic" || value == "Legendary")
                    {
                        record["itemClassification"] = "Rare";
                        record["--gx-itemClassification"] = value; // keep original value for later use
                    }
                }
            }
            ArzWriter.Write("out.arz", database);

   I should add more examples later, but I hope the idea is clear.

   ArzDatabase has direct support for TQ/IT, TQ/AE and GD files (yes, there are 3 incompatible file formats with minor variations). The file format is inferred from the file, so you don't need to specify it (but you can).

   There is limited multithreading support: if the reader reads data in "full" mode (decompressing immediately), this can be done concurrently, making that step a bit faster. The reader also implements a "raw" mode, where data is read completely but not decompressed, and a "lazy" mode, where no data is read at all except the "headers"; in this mode records are read as you access them. Different workloads favor different modes (if you plan to modify 90% of the records, "full" will be best; but if you only touch a few records directly AND want to save only the changes (e.g. a custom map mod), "lazy" mode will be better).

   The writer also has multithreading support: compression is done concurrently, so full recompression even at the highest levels is not fearsome. (Multithreading is optional; when it is enabled, the writer's output is not reproducible, because the final order of records in the file is effectively a race, and this also affects internal encoder state, so you may get a different output file size when writing the same file.) The writer is able to write only the changes, or write everything (so you can take the game's database.arz and write only your changes for a custom map, or write back the full content for a "database replacement mod").
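As a sketch of that "write only changes" workflow (note: the way to request lazy reading or changes-only writing is my assumption for illustration; the real API may spell these differently, only Open/indexer/Write are shown in this post):

```csharp
using Glacie.Data.Arz;

// Open the game's database; in "lazy" mode only touched records would be read.
// NOTE: how to select lazy mode is hypothetical here.
using var database = ArzDatabase.Open("game/database/database.arz");

// Touch only one record...
var record = database[@"records\game\gameengine.dbr"];
record["potionStackLimit"] = 100;

// ...and write the result as a custom-map style database.
// NOTE: a "changes only" switch is implied by the post; the exact API is not shown.
ArzWriter.Write("custom-map/database.arz", database);
```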

- Glacie.Data.Compression - compression... huh. TQ/IT and TQ/AE use ZLIB compression, while GD uses LZ4. This package offers two ZLIB implementations: one based on .NET's DeflateStream (which is sucky for ARZ), and, when available, libdeflate, which offers better performance and a higher compression ratio with more compression levels. However, libdeflate's home repository offers binaries only for Windows, so it is available only on Windows. (I don't build it myself, and while the whole library should work on other OSes too, I don't care about that and currently don't see a reason to.)

- Glacie.All - just a metapackage which references everything.

- Glacie - the soul of the project, but currently not usable (you can't do many things and can't save results).

This gives another view over the same things ArzDatabase does, but will eventually provide more services. At least it already provides this:

Code: [Select]
            using var context = Context.Create(c =>
            {
                c.Source(s => s.Path("./test-data/gd-1.1.7.1/database/database.arz"));
                c.Source(s => s.Path("./test-data/gd-1.1.7.1/gdx1/database/gdx1.arz"));
                c.Source(s => s.Path("./test-data/gd-1.1.7.1/gdx2/database/gdx2.arz"));
                c.Target(t => t.Path("./out"));
            });

            // do something with context.Database
            var record = context.Database[@"records\xpack\game\gameengine.dbr"]; // Will throw an exception, because GD has no such record.

GDX1 and GDX2 act like custom maps in TQ: they effectively shadow records. Unlike IT/AE, GD ships its expansions as internal mods rather than one big single database. This already works (record shadowing and record importing).

Generally, that's all that is done so far.


Future features:
- Support ARC files.
- Support Templates.
- Support Text resources.
- Support reading/writing from/to DBR sets.
- Typed access to records.

I will hold off on simple samples, because... well, they won't look as they should without the completed features.

Right now you can already do something like:

Code: [Select]
            var rGameEngine = database[@"records\xpack\game\gameengine.dbr"];
            rGameEngine["potionStackLimit"] = 100;
            rGameEngine["scrollStackLimit"] = 100;

And that is not what I want! This is weak code. You can assign strings, float (real) numbers, or booleans to these fields; nothing stops you. And here is the problem: no one knows how the game engine will react, so this is not desirable. This is why I'm greedy for the future features, and want typed access.
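To illustrate what "typed access" could look like, here is a purely hypothetical sketch (this wrapper class is my invention, not a Glacie API; it only uses the TryGet / Get&lt;T&gt; / indexer calls shown earlier in this post):

```csharp
using Glacie.Data.Arz;

// Hypothetical typed wrapper over a gameengine.dbr record.
// With something like this, assigning a string to a stack limit
// becomes a compile-time error instead of silent bad data.
sealed class GameEngineRecord
{
    private readonly ArzRecord _record;

    public GameEngineRecord(ArzRecord record) => _record = record;

    // Only int is accepted, matching the field's actual type.
    public int PotionStackLimit
    {
        get => _record.TryGet("potionStackLimit", out var field) ? field.Get<int>() : 0;
        set => _record["potionStackLimit"] = value;
    }
}
```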

However, the glacie-tools releases include one tool (you need to install the latest .NET Core 3.1 to run it; I don't use self-contained deployment, to avoid overly big sizes, and it is easy to install and try). So, this repository now holds a simple tool which at this moment acts as a showcase. :)

gx-arz-optimizer

Modders have probably already noticed that record names are stored in lower case, but references to .dbr files (in field values) preserve the original casing. And the ARZ file stores them separately, of course, because these strings are naturally different. As a result, an excess number of strings is stored, making the file bigger. This tool remaps every such string to the respective (lower) casing and writes a new ARZ file, rebuilding the string table and re-encoding every record, so every unused string disappears.

Let's look at the results over some files:

Code: [Select]

> gx-arz-optimizer.exe --target=gx-opt-tqae-2.9.arz .\glacie-test-suite\arz\tqae-2.9\database\database.arz

[ INFO ] Reading: .\glacie-test-suite\arz\tqae-2.9\database\database.arz
[ INFO ] Done In: 424ms
[ INFO ] Optimizing...
[ INFO ] Optimization Result:
  Completed In: 1,560ms
  # of Remapped Strings: 29192
  Estimated Size Reduction: 1875697 bytes
  Estimated File Size: 52650726 (96.6%)
[ INFO ] Writing To: gx-opt-tqae-2.9.arz
[ INFO ] Compression Level: 12
[ INFO ] Written In: 5,740ms
[ INFO ] Source File Length: 54526423
[ INFO ] Target File Length: 51245482 (94.0%)


> gx-arz-optimizer.exe --target=gx-opt-sv-aera-1.7.arz .\glacie-test-suite\arz\sv-aera-1.7\database\database.arz

[ INFO ] Reading: .\glacie-test-suite\arz\sv-aera-1.7\database\database.arz
[ INFO ] Done In: 539ms
[ INFO ] Optimizing...
[ INFO ] Optimization Result:
  Completed In: 2,198ms
  # of Remapped Strings: 40451
  Estimated Size Reduction: 2721844 bytes
  Estimated File Size: 74895993 (96.5%)
[ INFO ] Writing To: gx-opt-sv-aera-1.7.arz
[ INFO ] Compression Level: 12
[ INFO ] Written In: 8,064ms
[ INFO ] Source File Length: 77617837
[ INFO ] Target File Length: 73906007 (95.2%)


> gx-arz-optimizer.exe --compression-level=9 --target=gx-opt-gd.arz .\glacie-test-suite\arz\gd-1.1.7.1\database\database.arz

[ INFO ] Reading: .\glacie-test-suite\arz\gd-1.1.7.1\database\database.arz
[ INFO ] Done In: 282ms
[ INFO ] Optimizing...
[ INFO ] Optimization Result:
  Completed In: 876ms
  # of Remapped Strings: 0
  Estimated Size Reduction: 0 bytes
  Estimated File Size: 56402016 (100.0%)
[ INFO ] Writing To: gx-opt-gd.arz
[ INFO ] Compression Level: 9
[ INFO ] Written In: 2,193ms
[ INFO ] Source File Length: 56402016
[ INFO ] Target File Length: 47038301 (83.4%)


You can see that GD already has this kind of optimization, while TQ wins a few MiBs. :) Funny that GD is compressed so loosely, because even at relatively small compression levels it is easy to win 10-15%. :) You can also see that just re-encoding (recompressing) the file gives a smaller file.

I even quick-tested the optimized version in SV-AERA; it seems to run fine (however, I didn't play too much). (It's funny that in an over-five-minutes test I met a hero and he dropped a soul... RNG never drops when you look for it, and gives for free when you don't care. :) )


That's all. I will be glad of any feedback.

PS: I put this post into the wrong section. Sorry.
« Last Edit: 18 August 2020, 16:26:41 by lixiss »

Glacie 0.2.0
« Reply #1 on: 02 September 2020, 15:14:11 »
Here is a small update, a bit unrelated to the main goal. glacie-tools-0.2.0 now also includes the gx-arc tool, which generally does the same job as archiveTool. The library counterpart was of course added as well (the Glacie.Data.Arc package and the ArcArchive class). Like with ARZ, it works with both TQ and GD .arc files.

So, for demonstration I picked some files from SV-AERA 1.7b:
Code: [Select]
                7,109,494 A Few Bug Fixes.arc
              459,030,819 Levels.arc
               57,895,262 N66_Mods.arc
                  162,300 Quests.arc
                  528,620 Text_CH.arc
                  570,685 Text_EN.arc
                  606,713 Text_RU.arc
                3,565,171 _DRX_Effects.arc
               42,066,683 _DRX_Meshes.arc
               26,064,655 _DRX_Textures.arc
10 File(s)    597,600,402 bytes

Let's recompress them:
Code: [Select]
> gx-arc optimize --repack --compression-level=maximum "./test/_DRX_Effects.arc"
[done] Processed: ./test/_DRX_Effects.arc
    -157,257 bytes, 95.6%

> gx-arc optimize --repack --compression-level=maximum "./test/_DRX_Meshes.arc"
[done] Processed: ./test/_DRX_Meshes.arc
    -1,796,375 bytes, 95.7%

> gx-arc optimize --repack --compression-level=maximum "./test/_DRX_Textures.arc"
[done] Processed: ./test/_DRX_Textures.arc
    -1,636,480 bytes, 93.7%

> gx-arc optimize --repack --compression-level=maximum "./test/A Few Bug Fixes.arc"
[done] Processed: ./test/A Few Bug Fixes.arc
    -102,456 bytes, 98.6%

> gx-arc optimize --repack --compression-level=maximum "./test/Levels.arc"
[done] Processed: ./test/Levels.arc
    -16,466,817 bytes, 96.4%

> gx-arc optimize --repack --compression-level=maximum "./test/N66_Mods.arc"
[done] Processed: ./test/N66_Mods.arc
    -2,648,977 bytes, 95.4%

> gx-arc optimize --repack --compression-level=maximum "./test/Quests.arc"
[done] Processed: ./test/Quests.arc
    -5,723 bytes, 96.5%

> gx-arc optimize --repack --compression-level=maximum "./test/Text_CH.arc"
[done] Processed: ./test/Text_CH.arc
    -17,766 bytes, 96.6%

> gx-arc optimize --repack --compression-level=maximum "./test/Text_EN.arc"
[done] Processed: ./test/Text_EN.arc
    -53,286 bytes, 90.7%

> gx-arc optimize --repack --compression-level=maximum "./test/Text_RU.arc"
[done] Processed: ./test/Text_RU.arc
    -49,759 bytes, 91.8%

So, the result is:
Code: [Select]
            7,007,038 A Few Bug Fixes.arc
          442,564,002 Levels.arc
           55,246,285 N66_Mods.arc
              156,577 Quests.arc
              510,854 Text_CH.arc
              517,399 Text_EN.arc
              556,954 Text_RU.arc
            3,407,914 _DRX_Effects.arc
           40,270,308 _DRX_Meshes.arc
           24,428,175 _DRX_Textures.arc
10 File(s)    574,665,506 bytes

Addendum: the optimize command is similar to archiveTool's compact command, except that it can do everything in one pass. By default it reorders chunks (defragments) and compacts the archive. The --repack option is equivalent to compact, except that on top of this it also attempts to recompress chunks, writing the recompressed chunk if it is smaller. Because it works only with chunks, it never recompresses uncompressed files. The rebuild command just creates a new archive from scratch from the source archive by recompressing entries, but again it preserves uncompressed entries by default; this behavior can be altered by an option (--preserve-store=false). The reason it is this complex is that (in TQAE) Dialog.arc holds mp3 files in uncompressed form. The savings there are pathetic, and the engine is probably optimized to play music without needing to decompress the stream.
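For example, the stored-entry behavior described above would look like this (the archive path is illustrative; the command and option are the ones named in this post):

```shell
# Rebuild with defaults: recompress compressed chunks,
# but keep stored (uncompressed) entries, e.g. mp3s, as-is.
gx-arc rebuild "./test/Dialog.arc"

# Force recompression of stored entries too
# (probably not what you want for audio archives).
gx-arc rebuild --preserve-store=false "./test/Dialog.arc"
```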

What this gives: 597,600,402 bytes - 574,665,506 bytes => 22,934,896 bytes saved (the result is 96.1% of the original size), so redistributable archives can be made a bit smaller. Not a big relative difference, but still. :)


In terms of functionality, this tool is almost feature complete:
Code: [Select]
> gx-arc --help
gx-arc:
  Glacie Archive Tool

Usage:
  gx-arc [options] [command]

Options:
  --version         Show version information
  -?, -h, --help    Show help and usage information

Commands:
  list, ls <archive>                  Lists contents of archive.
  info <archive>                      Technical information about archive.
  test, verify <archive>              Test integrity of archive.
  extract <archive>                   Extract contents of archive.
  optimize <archive>                  Optimize archive.
  rebuild <archive>                   Rebuild archive.
  add <archive> <input>               Add a file or directory. If a file is already in the archive it will not be
                                      added.
  replace <archive> <input>           Replace a file or directory. If a file is already in the archive it will be
                                      overwritten.
  update <archive> <input>            Update a file or directory. Files will only be added if they are newer than
                                      those already in the archive.
  remove-missing <archive> <input>    Remove the files that are not in the specified inputs.
  remove <archive> <entry>            Remove a file from the archive.

You can see more options by invoking --help for a command:
Code: [Select]
> gx-arc add --help
add:
  Add a file or directory. If a file is already in the archive it will not be added.

Usage:
  gx-arc add [options] <archive> <input>...

Arguments:
  <archive>    Path to ARC file.
  <input>      Input files or directories.

Options:
  --relative-to <relative-to>                            Specifies base directory (entry names will be generated
                                                         relative to this path). [default: .]
  --format <1|3|auto|gd|tq|tqae|tqit>                    Archive file format. Non-automatic value required when you
                                                         create new archive. Valid values are 1 or 3 or use game type
                                                         tags. [default: auto]
  --compression-level <Fastest|Maximum|NoCompression>    Compression level. Valid values from 0 or 'no' (no
                                                         compression), 1..12 from 'fastest' to 'maximum'. [default:
                                                         Maximum]
  --safe-write                                           When enabled, avoid to perform destructive operations.
                                                         [default: True]
  --preserve-case                                        Entry names by default is case-insensitive. This option
                                                         enables creating archives with preserved case. [default:
                                                         False]
  --header-area-size <header-area-size>                  Size of header area. Default is 2048.
  --chunk-size <chunk-size>                              Chunk length. Default is 262144.
  -?, -h, --help                                         Show help and usage information

It behaves a bit differently from archiveTool: you don't need to smite your own brain to extract an archive or whatever, but be aware of the defaults.
By default everything is resolved against the current directory. This behavior can be altered by options (like --relative-to, or for extract you can specify --output-path).
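For instance, being explicit about directories instead of relying on the current-directory defaults (the paths here are illustrative; the options are the ones shown in the help output above):

```shell
# Extract into an explicit directory rather than the current one:
gx-arc extract "./test/Text_EN.arc" --output-path=./text_en

# Add files, generating entry names relative to the mod root
# rather than the current directory:
gx-arc add "./test/MyMod.arc" --relative-to=./mymod ./mymod/records
```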

Also, by default it uses --safe-write, which for some operations causes the whole file to be read into memory, modified, and then replaced on disk (this is true for the optimize command). This lets you break/cancel operations with almost no risk of corrupting the file. It may also help a bit in case of bugs. Normally you don't need to think about this option; it exists and is enabled by default to prevent accidental errors or bugs.
For add/replace/update there is another behavior behind the same option (also named --safe-write), which generally doesn't allow overwriting currently in-use chunks or the TOC; the result is that the archive may require compaction after such operations (it may actually require it after any modification). But this is a bit more risky. So... be aware: avoid breaking execution when it is not needed, or be prepared for a corrupted file. (I guess some kind of cancellation could be done... but.)

However, it is still not perfect, in the sense that:

1. Library: parallel compression is not implemented there.
2. The command line tool reports progress, but it is currently implemented weirdly (in how it writes to the console); subject to fixing.
3. The command line tool almost never shows what has been done.
4. Error handling is poor. You will see an exception stacktrace instead of an error message. This is a bit related to (2).
5. There is no globbing / it doesn't accept file masks; however, it is possible and also subject to implementation at some moment.
   E.g. it would be nice to invoke `gx-arc optimize --repack ./test/**/*.arc` instead of invoking it manually for every archive.
6. add/replace/update similarly don't accept globs, but their input can be a file or directory, so it is less problematic.
7. Think about better CTRL+C handling / cooperative cancellation.

The next thing should be gx-arz, which will have some options like optimizing dbr references and repacking (which was already done in 0.1). But I think it is time to add .dbr support, at least tiny helpers like updating an ARZ from a .dbr file and unpacking an archive. And then back to the main goal.

:)
« Last Edit: 02 September 2020, 17:05:03 by lixiss »

Glacie 0.3: gx-arc and gx-arz are now part of the main project
« Reply #2 on: 11 September 2020, 05:17:45 »
0.3 release updates:

glacie-tools has been made obsolete, and the tools' content has moved into the main project. So if you are interested in the command line tools, you should look in the releases section of the main repository ( https://github.com/lixiss/glacie/releases ).

gx-arc - got mostly cosmetic improvements (better progress meter, better error handling, but generally the same functionality).

gx-arz - is a new tool, but the old `gx-arz-optimizer` is now also part of it. I made it similar to `gx-arc`, although of course this tool does a different job.

Let me briefly show what it does:

Code: [Select]
>gx-arz --help
gx-arz:
  Glacie Database Tool

Usage:
  gx-arz [options] [command]

Options:
  --use-libdeflate                                                Use libdeflate for zlib compression
  --log-level <Critical|Debug|Error|Information|Trace|Warning>    Logging level [default: Information]
  --version                                                       Show version information
  -?, -h, --help                                                  Show help and usage information

Commands:
  list, ls <database>                  List contents of database
  info <database>                      Show information about database
  verify <database>                    Verify database integrity
  extract <database>                   Extract contents of database
  optimize <database>                  Optimize database
  build <database> <input>             Build database from dbr files. This command is equivalent to the update & remove-missing commands.
  add <database> <input>               Add records to database. If a file is already in the database it will not be added.
  update <database> <input>            Update records in database. Files will only be added if they are newer than those already in the database.
  replace <database> <input>           Replace records in database. If a file is already in the database it will be overwritten.
  remove-missing <database> <input>    Remove the records that are not in the specified inputs.

Commands:

list - lists all record names in the ARZ file.

info - shows some technical info; currently only the file format version (TQ, TQAE or GD) and the number of records.

verify - checks file integrity. It checks the checksums embedded in the file... and then just reads it as it usually happens (this is a bit dirty and not pedantic, but if the file is broken, it should show an error).

extract - extracts the content of an .arz database into .dbr files.

optimize - optimizes the database. It has some options, like `gx-arz-optimizer` did, but can also be used to just recompress the file. (The --repack option enables all possible optimizations.)

Updating:
add, update, replace, remove-missing - these commands behave identically to the `gx-arc` commands.

build - equivalent to update followed by remove-missing.

Let's look at a few commands more closely:
extract
Code: [Select]
>gx-arc extract --help
extract:
  Extract contents of archive.

Usage:
  gx-arc extract [options] <archive>

Arguments:
  <archive>    Path to ARC file.

Options:
  --output <output>                         Path to output directory. [default: .]
  --set-last-write-time, --set-timestamp    Restore last write time file attribute from archive. [default: True]
  --use-libdeflate                          Use libdeflate for zlib compression
  -?, -h, --help                            Show help and usage information

So, to extract a database you call: `gx-arz extract my_database.arz --output=./my_dbr_dir`.
I guess it is pretty simple.

add, update, replace, remove-missing, build:

Code: [Select]
>gx-arz build --help
build:
  Build database from dbr files. This command is equivalent to the update & remove-missing commands.

Usage:
  gx-arz build [options] <database> <input>...

Arguments:
  <database>    Path to database (.arz) file
  <input>       Input .dbr files or directories.

Options:
  --definitions, --templates <definitions>                          Path to record definitions (templates). This value might be: directory with .tpl files, path to .arc or .zip file with template files or path to .arz database which will be used to as source of ephemeral record definitions. When this option is not specified, then <database> argument is used as record definition source.
  --output <output>                                                 Path to output database file. If not specified, input database will be replaced
  --relative-to <relative-to>                                       Specifies base directory (record names will be generated relative to this path). [default: .]
  --format <auto|gd|tq|tqae|tqit>                                   Archive file format. Non-automatic value required when you create new archive. Valid values are game type tags. [default: Automatic]
  --compression-level <Fastest|Maximum|NoCompression>               Compression level. Valid values from 1..12 from 'fastest' to 'maximum'. [default: Maximum]
  --checksum                                                        Calculate checksums [default: True]
  -mp, --parallelize                                                Use parallel decompression/compression [default: True]
  -mdop, --max-degree-of-parallelism <max-degree-of-parallelism>    Max degree of parallelism. By default it is equal to number of logical processors. [default: -1]
  --safe-write                                                      When enabled, perform all operations over database in-memory, and write database content to disk only when done. This requires more memory, but database will not be corrupted if you break or cancel operation. When disabled - perform in-place database updates. By default it is enabled if you doesn't specify output path, and disabled if you specify it. [default: False]
  --preserve-case                                                   Record names by default is case-insensitive and stored in lower-case. This option enables creating records with preserved case. [default: False]
  --use-libdeflate                                                  Use libdeflate for zlib compression
  --log-level <Critical|Debug|Error|Information|Trace|Warning>      Logging level [default: Information]
  -?, -h, --help                                                    Show help and usage information

Well, so this command can update an existing .arz file from source .dbr files. It can also create a new database file, but in that case you should specify the format (by providing the option --format=tqae).

Assuming "./my_dbr_dir" holds your .dbr files in the "normal" directory structure (i.e. under this directory you should have /records):

`gx-arz update target.arz ./my_dbr_dir` - THIS IS INVALID: record names will be resolved relative to the current directory.

`gx-arz update target.arz --relative-to=./my_dbr_dir ./my_dbr_dir` - updates the target.arz database with the right directory layout.
`gx-arz update target.arz --relative-to=./my_dbr_dir ./my_dbr_dir/records` - updates the target.arz database with the right directory layout.

`gx-arz update new.arz --format=tqae --definitions=game/database/database.arz --relative-to=./my_dbr_dir ./my_dbr_dir/records` - creates or updates the new.arz file, using the game's database as the source of record definitions.

`gx-arz update target.arz --output=new.arz --relative-to=./my_dbr_dir ./my_dbr_dir/records` - updates the target.arz database with the right directory layout, but saves the resulting .arz file as new.arz. (Record definitions come from target.arz in this case.)

NOTE: Despite the --definitions/--templates description, currently you can only specify an .arz file as the source of record definitions.
This version still has no support for the original templates (but eventually should).
However, you don't actually need them to modify any existing ARZ file, because the tool creates ephemeral record definitions from the specified .arz file, which is sufficient for making modifications.
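Putting the commands above together, a typical round-trip might look like this (the paths are illustrative; the commands and options are the ones demonstrated above):

```shell
# 1. Unpack the database into .dbr files:
gx-arz extract my_database.arz --output=./my_dbr_dir

# 2. Edit some .dbr files under ./my_dbr_dir/records with your editor of choice.

# 3. Write the changes back, saving the result as a new file:
gx-arz update my_database.arz --output=new.arz --relative-to=./my_dbr_dir ./my_dbr_dir/records
```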



Offline Accolon

  • New Member
  • *
  • Posts: 1
  • Country: ru
  • Karma: +0/-0
  • Gender: Male
    • View Profile
    • Awards
  • Time Zone: ?
Re: Glacie 0.1.0: a library for manipulating game files
« Reply #3 on: 14 September 2020, 05:48:22 »
Thank you so much, Lixiss! It works! :D
Here is the command line from my gx-arz.cmd:
Quote
C:\Games\_wrk\glacie-cli-0.3.0\gx-arz.exe update "C:\Games\Titan Quest\Database\database.arz" --relative-to="C:\Games\Titan Quest\wrk\Soulvizier_AERA v1.7b" "C:\Games\Titan Quest\wrk\Soulvizier_AERA v1.7b\records\item\merchants\greece\01_market_startingtown_general.dbr"
pause
and the prices in "01_market_startingtown_general.dbr" have been changed, yes!
But first I had to download and install the ".NET Core 3.1 Desktop Runtime (v3.1.8)" for my Windows 7.

Note by Admin - Use link on own risk - https://dotnet.microsoft.com/download/dotnet-core/thank-you/runtime-desktop-3.1.8-windows-x64-installer
« Last Edit: 19 September 2020, 11:47:36 by efko »

Re: Glacie 0.1.0: a library for manipulating game files
« Reply #4 on: 19 September 2020, 07:16:14 »
On the road to the goal, here is an intermediate update... 0.3.1-alpha.2.

gx-arz now understands the --metadata option, which can be a path to a directory with .tpl files, a game directory, a path to a templates.arc file, or a path to some .zip file containing .tpl files. You can still specify an .arz file as an ephemeral metadata source, using the additional option --metadata-fallback. (The old options --definitions and --templates were removed.)

Some known minor typos and nonsense in .tpl files are fixed internally, so it is able to consume the standard TQAE or GD (or derived) templates without needing to fix them manually. However, I prefer to fix things at the source.

glacie-cli-0.3.1-alpha.2.7z - the "anycpu" package; requires the .NET Core 3.1 Runtime to be installed to run.

glacie-cli-0.3.1-alpha.2-win-x64.7z - a self-contained package; doesn't require a separate .NET Core Runtime installation.

It is not very well tested, but should work.

PS: Note that gx-arz currently doesn't sort fields in the ARZ file like ArtManager does. I don't believe it matters, so this is currently not implemented.

PPS: I'm almost ready to make the actual/useful API (gx-arz/gx-arc are just side tools utilizing a few components connected together). However, on the road to this I plan to create a gx validate command which will verify records. Initially it will be based only on data types, but eventually it should check the presence of referenced resources and some logical things. A few more components are needed for this, though. :)
« Last Edit: 19 September 2020, 15:55:27 by lixiss »
