What's going on here
There is very little activity here: no source code, and a beta that is over five months old. Is anyone working on anything?
Bob
Re: What's going on here
In the absence of Peter replying himself I'll chip in what I know.
The official NCover 1.5.7 release is pretty "imminent" - I know Peter had hoped to release it this week, but we've found a couple of things to sort through first. I've been helping Peter with testing it and teasing out some of the corner cases, making sure it behaves nicely with TestDriven.Net, NCoverExplorer etc. I'm hoping to continue to thrash it over the next few days, and as soon as Peter can free himself up from his other commitments to discuss and make the changes, no doubt a release shall be here at long last...
I'm just as keen to get my hands on the next version as anyone - the list of known issues with all the prior NCover versions hasn't made it the easiest product to get consistent results out of. Peter has done a lot of work with the core to assist with that, so once that is solid I imagine that more focus can go back on extending the features like configurable timeouts for IIS etc that people are asking for.
So yes - it is being worked on...
Re: results differ when testing full range of tests and subset o
Hugh,
I'm afraid I probably can't give you a useful reason why, or how to resolve this, but it does sound slightly similar to what I have seen others experience, as per the "merge issue" noted in these forums and my blog. In their cases we found that NCover 1.5.4 was omitting some methods from the xml output. My assumption at the time was that those methods were not called, but that was only a guess, not having experienced the issue personally. That is a known issue, but until someone can come up with a good repro case it's going to be difficult for Peter to fix, I would imagine, unless he has some suspicions already.
NCoverExplorer calculates the percentages based on the methods listed in the file - so if it's getting inconsistent coverage.xml inputs, you would expect the percentages to differ.
However I would like to be sure it is not an NCoverExplorer issue! It would help if either you can send me your coverage.xml files (I don't need any source code, and I keep anything sent confidential) or you can check it out for yourself. The files are located in C:\Documents and Settings\[username]\Application Data\Mutant Design\TestDriven.Net 2.0\Coverage\[Solution Name].
Run your test combinations again but copy out the coverage.xml files after each run and diff them for the "AppClass1" module sections and let me know what you find. My e-mail is under Help->Send Feedback in NCoverExplorer.
You could also try the coverage using the command line or NAnt/MSBuild tasks (on my site) to compare what that gives you, to see if there is some pattern to this. If you are feeling really keen you could try to narrow down what causes it - commenting out tests and/or code until you find some factor that makes a difference between consistent percentages and not. At least with TD.Net it's a pretty easy right-click exercise to get the results quickly! We don't have .Net 2.0 at work so I'm still using NCover 1.3.3, or else I would have tried to nut this out myself had I experienced it.
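If diffing the raw files by eye gets tedious, the comparison can be scripted. Below is a minimal sketch in Python that compares which methods two coverage.xml files report at all, which is the symptom suspected above (methods omitted from the output). It is illustrative only: it assumes a schema where <method> elements (with "class" and "name" attributes) nest inside <module> elements, roughly as in NCover 1.5.x output - the sample XML strings and the "Foo"/"Bar" names are made up, and element/attribute names may differ in your NCover version, so adjust accordingly.

```python
# Sketch: spot methods that one coverage run reports but the other omits.
# ASSUMPTION: coverage.xml nests <method class="..." name="..."> elements
# inside <module name="..."> elements (roughly NCover 1.5.x layout);
# adjust the element/attribute names for your NCover version.
import xml.etree.ElementTree as ET

def reported_methods(xml_text):
    """Return the set of 'Module::Class.Method' names present in the file."""
    root = ET.fromstring(xml_text)
    methods = set()
    for module in root.iter("module"):
        for method in module.iter("method"):
            methods.add("%s::%s.%s" % (module.get("name"),
                                       method.get("class"),
                                       method.get("name")))
    return methods

def diff_runs(full_xml, subset_xml):
    """Return (missing_from_subset, extra_in_subset) method-name sets."""
    full = reported_methods(full_xml)
    subset = reported_methods(subset_xml)
    return full - subset, subset - full

# Two hypothetical runs over the same "AppClass1" module: the second
# (subset) run is missing Foo.Baz from its output entirely.
RUN_FULL = """<coverage>
  <module name="AppClass1">
    <method class="Foo" name="Bar"/>
    <method class="Foo" name="Baz"/>
  </module>
</coverage>"""
RUN_SUBSET = """<coverage>
  <module name="AppClass1">
    <method class="Foo" name="Bar"/>
  </module>
</coverage>"""

missing_in_subset, extra_in_subset = diff_runs(RUN_FULL, RUN_SUBSET)
print(sorted(missing_in_subset))  # -> ['AppClass1::Foo.Baz']
```

If the method sets match between runs but the percentages still differ, the discrepancy would instead be in the per-method visit counts, which you could compare the same way by also reading the seqpnt/visit data.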
Good luck,
Grant.
http://www.kiwidude.com/blog/
Re: results differ when testing full range of tests and subset o
Hi Grant!
Thanks for the reply. Time (and managerial influence) prevents me from doing much investigation - they are happy if it works as well as it does.
I'm zipping up the coverage files to send to you via your email address - they should arrive shortly...
Hugh
Re: NCover.org and DLL's?
I'm new to NCover, and haven't even got it working yet, but think about it: your code has to execute for NCover to "do its thing", and DLLs do not execute on their own. The most logical thing, I believe, is to write NUnit tests and then execute them. Otherwise you would have to write some type of test harness program.
Neal Walters
http://VBScript-Training.com
http://Biztalk-Training.com
http://Sharepoint-Training.com