
[Feature Request] HTML output like pytest-cov #193

Open
MartinThoma opened this issue Dec 11, 2020 · 11 comments

@MartinThoma

I would like to have a colorized output which shows all lines of code. The lines which contain surviving mutants should be colored red, the ones which contain killed mutants green (with the option to remove the highlighting via JavaScript).

I can generate an example with pytest-cov if that helps to communicate the idea of what I'm looking for.

Would this be something you would also like?

@boxed
Owner

boxed commented Dec 11, 2020

It's been suggested before. The problem I have with it is that it sounds good, and people would like it, but once they have it... then what? Isn't it actually worthless? The only thing that actually holds value for killing mutants is the report we already have, I think.

Please tell me where my logic is flawed.

@MartinThoma
Author

I'm currently running mutmut over a non-public project. It checks over 10k mutants. The structure of that project ... let's say, it could be improved. There are files that have A LOT of lines.

The report generated by mutmut html lists many lines multiple times. The only thing I'm interested in per line is "did the test suite check this line properly?", i.e. whether any mutant on that line survived. In some cases I want to see which one it is, but most of the time the question is just "did I miss something here?". Seeing the same line multiple times doesn't help with that.

As there are so many things to be improved, I want to be able to prioritize the ones with the most impact. If I had just the whole code with highlighted lines, I could navigate the non-killed mutants and find issues more quickly.

Many mutants point to not-very-interesting flaws in the test suite, but some are very important. By finding the important ones quickly, I would also have an easier time showing my team the value of mutmut with a minimal pull request that addresses those.

@MartinThoma
Author

By the way: When I asked the question, I didn't know that mutmut html existed. When I enter mutmut --help it doesn't show up:

$ mutmut --help
Usage: mutmut [OPTIONS] [COMMAND] [ARGUMENT] [ARGUMENT2]

  commands:

      run [mutation id]

          Runs mutmut. You probably want to start with just trying this. If
          you supply a mutation ID mutmut will check just this mutant.

      results

          Print the results.

      apply [mutation id]

          Apply a mutation on disk.

      show [mutation id]

          Show a mutation diff.

      show [path to file]

          Show all mutation diffs for this file.

      junitxml

          Show a mutation diff with junitxml format.

Options:
  --paths-to-mutate TEXT
  --paths-to-exclude TEXT
  --backup / --no-backup
  --runner TEXT
  --use-coverage
  --use-patch-file TEXT           Only mutate lines added/changed in the given
                                  patch file

  --tests-dir TEXT
  -m, --test-time-multiplier FLOAT
  -b, --test-time-base FLOAT
  -s, --swallow-output            turn off output capture
  --dict-synonyms TEXT
  --cache-only
  --version
  --suspicious-policy [ignore|skipped|error|failure]
  --untested-policy [ignore|skipped|error|failure]
  --pre-mutation TEXT
  --post-mutation TEXT
  -h, --help                      Show this message and exit.

I've used a far worse method which wrote all tested mutants in a single HTML file. I had a hard time just opening that file in my browser. It also listed killed mutants.

@boxed
Owner

boxed commented Dec 13, 2020

I guess it would make sense in your scenario to have a coverage-like report with green (covered), yellow (some covered, some not) and red (not covered)?

@MartinThoma
Author

I would absolutely love that ❤️ 😄

@MartinThoma
Author

Do you know why the html command is not shown? Are there other hidden gems?

@boxed
Owner

boxed commented Dec 13, 2020

Because I misuse click in mutmut quite badly. It should be using click subcommands but I didn't get that to work when I tried many years ago.
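For what it's worth, here is a minimal sketch of the click group/subcommand layout, in case it helps anyone picking this up later. The command and option names below are made up for illustration, not mutmut's actual interface:

# cli_sketch.py - a minimal click group-with-subcommands layout (illustrative only;
# not mutmut's real CLI, just the pattern click recommends for subcommands)
import click


@click.group()
def cli():
    """Top-level entry point; every subcommand below is listed by --help."""


@cli.command()
@click.argument("mutation_id", required=False)
@click.option("--runner", default="python -m pytest -x", help="Test runner command.")
def run(mutation_id, runner):
    """Run mutation testing, optionally for a single mutant."""
    click.echo(f"running {mutation_id or 'all mutants'} with runner {runner!r}")


@cli.command()
def html():
    """Generate the HTML report (appears in --help automatically)."""
    click.echo("writing html report")


if __name__ == "__main__":
    cli()

With a layout like this, every registered subcommand shows up under "Commands:" in the generated --help output, which would have avoided the hidden mutmut html situation described above.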

@boxed
Owner

boxed commented Dec 13, 2020

I believe such an HTML report would be very simple to build, as you can just read the file/line numbers from the cache and use those.
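A minimal sketch of the rendering side, assuming you can already pull a per-line status mapping out of the cache (how to extract that mapping from mutmut's cache is left open here, and the status names and colors are assumptions):

# report_sketch.py - render one source file as a coverage-style HTML page.
# Assumes a {line_number: status} mapping with status in {"killed", "mixed",
# "survived"}; extracting that mapping from mutmut's cache is not shown.
import html

COLORS = {"killed": "#ccffcc", "mixed": "#ffffcc", "survived": "#ffcccc"}

def render_file(source_path, line_status, out_path):
    rows = []
    with open(source_path) as src:
        for lineno, line in enumerate(src, start=1):
            color = COLORS.get(line_status.get(lineno), "")
            style = f' style="background:{color}"' if color else ""
            rows.append(
                f"<tr{style}><td>{lineno}</td>"
                f"<td><pre>{html.escape(line.rstrip())}</pre></td></tr>"
            )
    with open(out_path, "w") as out:
        out.write("<table>\n" + "\n".join(rows) + "\n</table>\n")

# Example usage with made-up data: line 3 has a surviving mutant, line 7 is covered.
# render_file("mypkg/core.py", {3: "survived", 7: "killed"}, "core.html")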

@ejocharcole

ejocharcole commented Mar 30, 2021

I'm new to mutmut, so I was interested to see this issue. Have you used http://pitest.org at all? It produces HTML rather like what is described here. I've been using it for years and it's very helpful in terms of formatting.

Basically, for each class you get an HTML file of the entire class, and each line is coloured differently depending on whether its mutants were killed, survived, or the code wasn't covered at all. Each line also links to a footer, which shows you which mutations ran on that line.

So when, say, you're working on a new branch, you just compare against the master branch report to start with, and as long as there's no more red (ideally all green :)) you know you're in a good place.

Apologies if this has already been discussed elsewhere; I know this issue is a few months old. :)

I'm coming to Python from Java, so I'm very pleased to find another good mutation tool, and interested to see the differences in approach when it comes to reporting etc.

@boxed
Owner

boxed commented Mar 31, 2021

Personally, I don't find an HTML report actually useful. It's interesting and all, but ultimately pointless for the work, in my opinion. The HTML report in mutmut is very bare-bones and quite slow to generate. I will accept PRs to improve it if others feel differently than I do, though.

cleder pushed a commit to cleder/mutmut that referenced this issue May 30, 2021
@cleder

cleder commented May 30, 2021

@MartinThoma @ejocharcole have a look at #211
