Musings about tests

fronobulax

Developer
Staff member
1) There are disabled Commerce Ghost tests because we lack HTML that captures what we are testing for. Anyone with a ghost want to help?
2) There are three disabled tests that access the KoL Wiki. Should we try to capture the wiki data and refactor the code so that some of it can be exercised without using the network? Or should we just delete the tests instead, acknowledging that we have learned a lot about testing over the past several months and they aren't as useful as we hoped?
3) One of those tests fails anyway because the Wiki returns malformed JSON. (So the corresponding mafia code will not do anything useful.) Does anybody have contacts at Coldfront who might be willing and able to fix it? The Wiki page displays just fine. It is the JSON of that page as returned from an API that needs attention.
4) There are several debug commands that will load HTML from a local file and then use KoLmafia to parse it. The use cases are programmers tweaking their code, detecting new content, and validating data files. Is it worthwhile to collect some of this "test html" and then use it in a test that just loads it and runs the appropriate "debug" command?
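As a rough illustration of what such a test could look like (the class name, fixture path, and debug command below are made up, and I'm assuming KoLmafiaCLI.DEFAULT_SHELL.executeLine is a reasonable entry point for driving a CLI command from a test):

import static org.junit.jupiter.api.Assertions.assertTrue;

import java.nio.file.Files;
import java.nio.file.Path;
import net.sourceforge.kolmafia.KoLmafiaCLI;
import org.junit.jupiter.api.Test;

public class DebugCommandHtmlTest {
  @Test
  public void parsesCapturedHtmlViaDebugCommand() throws Exception {
    // Hypothetical captured page checked into the test tree.
    Path fixture = Path.of("test/html/some_captured_page.html");
    assertTrue(Files.exists(fixture), "fixture should be checked in");

    // Hypothetical debug command that loads the local file and parses it;
    // a real test would then assert on whatever state the parse updates.
    KoLmafiaCLI.DEFAULT_SHELL.executeLine("somedebugcommand " + fixture);
  }
}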

@gausie @MCroft @Ryo_Sangnoir ?
 

fronobulax

Developer
Staff member
P.S. Some of the language server tests have returned to the state where expected stack traces appear in the output.
 

fronobulax

Developer
Staff member
P.P.S. FamiliarDatatest.java is directly under test in the source tree. Perhaps it should be moved elsewhere, test\net\sourceforge\kolmafia being the place where tests that don't really seem to fit elsewhere reside?
 

Veracity

Developer
Staff member
P.P.S. FamiliarDatatest.java is directly under test in the source tree. Perhaps it should be moved elsewhere, test\net\sourceforge\kolmafia being the place where tests that don't really seem to fit elsewhere reside?
Considering that FamiliarData.java is under src/net/sourceforge/kolmafia, surely FamiliarDataTest.java belongs under test/net/sourceforge/kolmafia?
 

heeheehee

Developer
Staff member
Veracity's point is the only explicit guidance I'd like us to stick with as far as test placement in the tree goes, namely that the directory structure of test/ should mirror that of src/.

Arguably the general-purpose tests should be moved elsewhere, e.g. an integrationtests/ top-level directory.
 

gausie

D̰͕̝͚̤̥̙̐̇̑͗̒e͍͔͎͈͔ͥ̉̔̅́̈l̠̪̜͓̲ͧ̍̈́͛v̻̾ͤe͗̃ͥ̐̊ͬp̔͒ͪ
Staff member
Considering that FamiliarData.java is under src/net/sourceforge/kolmafia, surely FamiliarDataTest.java belongs under test/net/sourceforge/kolmafia?
This was just a mistake, sorry
 

Ryo_Sangnoir

Developer
Staff member
3) One of those tests fails anyway because the Wiki returns malformed JSON. (So the corresponding mafia code will not do anything useful.) Does anybody have contacts at Coldfront who might be willing and able to fix it? The Wiki page displays just fine. It is the JSON of that page as returned from an API that needs attention.
I find that Nightvol still tends to respond to email. What's the page, and what's the problem?
 

Ryo_Sangnoir

Developer
Staff member
Thanks.

That looks to be in the name "papier-mâchete". I'm guessing the right move is to escape the &.

Running the test I get a different error message: "White spaces are required between publicId and systemId."
 

fronobulax

Developer
Staff member
Thanks.

That looks to be in the name "papier-mâchete". I'm guessing the right move is to escape the &.

Running the test I get a different error message: "White spaces are required between publicId and systemId."

I got that from the test too, but the core issue for me was that Coldfront was returning what mafia (correctly) thinks is malformed JSON.
 

MCroft

Developer
Staff member
That looks to be in the name "papier-mâchete". I'm guessing the right move is to escape the &.

Running the test I get a different error message: "White spaces are required between publicId and systemId."

I'm hoping the right move is to resolve the entity. We have places where we don't (such as the Gear Changer Pane option selects) and places where we do (such as the inventory manager tables). Ideally our tools support entities or else we resolve them prior to handing them to the tools.
 

fronobulax

Developer
Staff member
From the gCLI:
> checkpulverization

Checking pulverization data...
Connecting to Well-Tempered Anvil...
Failed to parse XML document from "http://kol.coldfront.net/tools/anvil/export_data.php": White spaces are required between publicId and systemId.
Pulverization data checked.

gCLI, not test.

Since I get an error when I try to look at the same document fetched by a browser, my first response is to see if the provider is willing and able to fix the error.

If that fails then I look for some kind of workaround.

We use a DocumentBuilder to parse what we get. It might be possible to insert something so that we resolve the one entity. We could certainly call this a Bug Report against checkpulverization.
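A minimal sketch of that kind of workaround, assuming we keep fetching the export as text and clean it up before handing it to the DocumentBuilder (the class and helper names here are made up, not the actual checkpulverization code path):

import java.io.StringReader;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class AnvilExportCleaner {
  static Document parseCleanedExport(String rawXml) throws Exception {
    // Escape any "&" that is not already the start of a character entity,
    // so a stray ampersand in an item name no longer breaks parsing.
    String cleaned = rawXml.replaceAll("&(?![a-zA-Z#][a-zA-Z0-9]*;)", "&amp;");

    // Drop the DOCTYPE declaration entirely; a malformed one is what produces
    // "White spaces are required between publicId and systemId".
    cleaned = cleaned.replaceAll("<!DOCTYPE[^>]*>", "");

    DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
    return builder.parse(new InputSource(new StringReader(cleaned)));
  }
}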
 

Veracity

Developer
Staff member
I am now a firm believer in tests.

I whacked how StorageRequest generated multiple requests. Only 11 items per KoL request, sure - but also filter/munge based on how many are available, whether we're in Ronin, etc.

I wrote code and then wrote tests to enforce what I expected - and found at least 3 failures due to bugs in my code.

A pure “test first” methodology would have me write the tests first and then code until it passed the tests. Honestly, same result.

(I need one more test - for one path not tested, but which works.)

(I also tested via actual requests how KoL responded to various scenarios, and coded my request generation to actually work with KoL.)

THAT kind of testing is hard (impossible?) to codify: these are KoL requirements; do we fulfill them?

Something which got me is that the only way I know to test is “gradlew test”. That spends at least 40 seconds timing out in some FileMonitor test, complete with a stack trace.

How can I do

gradlew test StorageRequestTest

For example? Just run my test?
 

MCroft

Developer
Staff member
If there's just one test or test pattern you want to exclude, it can be done from the CLI:

$ gradlew test -PexcludeTests="**/bar*, **/foo.SomeTest*"
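If the goal is to run a single test class rather than exclude one, Gradle's built-in test filtering should also work (this is standard Gradle behavior; I haven't checked whether our build script interferes with it):

$ gradlew test --tests "*StorageRequestTest"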
 

fronobulax

Developer
Staff member
I use IntelliJ. I have finally gotten it configured so that I can right-click on a test method or on the class name. When I do so, I have the option of running the single test, or all of the tests in the class, or debugging the same. That lets me run just what I am interested in. I often trace into the KoLmafia code to craft my test, and am overly concerned about coverage as opposed to functionality. But then Test Driven Design was something that never worked for me as well as I wished it had.
 

gausie

D̰͕̝͚̤̥̙̐̇̑͗̒e͍͔͎͈͔ͥ̉̔̅́̈l̠̪̜͓̲ͧ̍̈́͛v̻̾ͤe͗̃ͥ̐̊ͬp̔͒ͪ
Staff member
We are at a stage of adding tests where coverage is a really useful metric. It'll get less useful as we add more tests, but for now it can show you what branches you've missed without expecting you to do silly things.
 

heeheehee

Developer
Staff member
But then Test Driven Design was something that never worked for me as well as I wished it had.
I've noted on many occasions in many different forums that I find TDD most useful for bugfixes. Write a reproducing test case, confirm it fails, write your fix, confirm it passes.
 