We have a solution that contains a series of projects used for our tests. One of those projects contains a set of files that are needed for the tests. Those files are schemas that will get validated against every time an API route is called. Tests, naturally, call one or more API routes.
The solution has a .testsettings file. This file has deployment enabled, and it specifies that these schemas need to be deployed. In addition, every schema file is set to Copy Always. Also, the .testsettings file is in the solution, under Solution Items.
The problem is that the .testsettings file is only occasionally respected. Sometimes the files are copied; sometimes they are not. When they don't copy, we can do the following to fix it:
Go to the Test -> Test Settings menu and choose Select Test Settings
Select the .testsettings file in the solution
Rebuild the solution
Rerun the tests
This usually works at least once. But inevitably, it stops working and the files aren't deployed again.
Note that when you go to the Test -> Test Settings menu, our current .testsettings file is always already checked. So choosing a new .testsettings file just means choosing the one that the UI already says is chosen.
We thought of going the DeploymentItem route, but this is impractical for two reasons, both surrounding code maintenance.
From what I can tell, DeploymentItem can only be placed on individual tests. With literally hundreds of tests, we'd be sprinkling it everywhere. It'd become a code maintenance nightmare. I thought of placing it on the global TestInitialize method, but that would just re-copy the files every time a test is run, which seems unnecessary. Not to mention that I'd have to put literally dozens of DeploymentItem attributes on the method and we'd need to keep that up-to-date every time a new schema is added.
Related to that, adding new schemas means altering existing tests where necessary. Again, we have hundreds of tests.
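For illustration, this is roughly what the per-test DeploymentItem approach would look like (the schema names and folder below are placeholders, not our actual files):

[TestClass]
public class OrderApiTests
{
    // One DeploymentItem per schema, repeated on every test that needs them.
    [TestMethod]
    [DeploymentItem(@"Schemas\OrderSchema.xsd")]
    [DeploymentItem(@"Schemas\CustomerSchema.xsd")]
    public void GetOrder_ReturnsValidResponse()
    {
        // The test calls the API route; schema validation happens behind the scenes.
    }
}

Multiply that by hundreds of tests and dozens of schemas and the maintenance problem is obvious.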
A far better solution would be to have the files copied over once, and then have the code look in the communal pool of schemas whenever it needs one.
I also looked at replacing .testsettings with .runsettings, but it doesn't seem to have a DeploymentEnabled node in the XML, and the deployment options I could find are again tied to DeploymentItem.
Does anyone have a solution to this, or does anyone know if it's a known bug? Schema validation happens behind the scenes -- individual test authors don't have to explicitly call it -- and it doesn't fail the test if it doesn't occur (we don't always have schemas available yet for every API call, so we don't want to fail the test if that's the case). As a result, it's often difficult to immediately ascertain whether or not validation happened. Because of that, we sometimes get false passes on tests where the schema actually broke, all because the .testsettings file didn't actually deploy our files like it's set to.
So I found out the problem: apparently this issue was fixed in Visual Studio 2015 Update 3. We were using Update 2. Once we got the new update, this problem went away.
Related
I'm new on a project and the build is quite slow.
Now I see the following post-build event on a lot of projects:
<PostBuildEvent>rd "$(ProjectDir)obj" /S /Q</PostBuildEvent>
I've read that the obj folder keeps track of the builds so incremental builds can be faster, so I thought maybe this has something to do with it.
However, nobody on my team knows why this folder is being removed, so I'm a bit hesitant to just delete the build event.
What can be a reason to perform this action?
A couple of things come to mind (all rather questionable by themselves):
Custom build steps in the same project, or (God forbid) in another project, that require it for the next build to succeed.
A (misguided) attempt to preserve disk space (since everything "precious" is in "bin" after the build, you technically don't need "obj").
A (misguided) attempt to implement "clean, clobber, etc." semantics.
One would need more information about the complete build system, other projects, etc. you have in place to find more or better reasons - if there are any ;-)
The only plausible reason to perform this kind of action is a lack of knowledge about the power of MSBuild.
I believe the underlying requirement (if it exists) could be achieved another way that doesn't sacrifice the incremental build feature.
Try to find the author of that line in the VCS you are using; if the author is unavailable or can't answer the question, warn your colleagues, remove it, and see what happens.
There is a bug in Visual Studio where, if you move the obj directory by setting IntermediateOutputPath in the project file, the compiler still creates an empty obj directory anyway. I do both myself, but with VS2010. If VS2015 has this fixed you may be able to remove it.
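For reference, relocating the intermediate output is done with MSBuild properties along these lines (the path below is just an example, not a recommendation, and exact placement in the project file matters for older project formats):

<PropertyGroup>
  <!-- Example only: move intermediate output out of the default obj\ folder. -->
  <BaseIntermediateOutputPath>..\build\obj\$(MSBuildProjectName)\</BaseIntermediateOutputPath>
  <IntermediateOutputPath>$(BaseIntermediateOutputPath)$(Configuration)\</IntermediateOutputPath>
</PropertyGroup>

Done that way, incremental builds keep working, whereas the rd "$(ProjectDir)obj" post-build event throws that tracking away on every build.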
Having upgraded a large project from VS2008 to VS2013, a large number of unit tests are now failing because the associated data file cannot be found. The original DataSource attributes were created by the test connection string wizard that VS2008 provided, but this is no longer available in VS2013 Pro. The data files are definitely there, in exactly the same place in the solution, and all have the properties set to Copy Always. I suspect that the required arguments to the DataSource attribute have subtly changed but the MSDN documentation offers little help in this respect.
The error is:
Result Message: The unit test adapter failed to connect to the data source or to read the data. For more information on troubleshooting this error, see "Troubleshooting Data-Driven Unit Tests" (http://go.microsoft.com/fwlink/?LinkId=62412) in the MSDN Library.
Error details: The Microsoft Jet database engine could not find the object 'MatrixSampleResultGrid_ExcludeHiddenResults.csv'. Make sure the object exists and that you spell its name and the path name correctly.
Previously this error has always been reported because the data file has been moved or renamed without the attribute being updated, but that is definitely not the case here.
This is a typical current DataSource attribute definition:
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV", "|DataDirectory|\\MatrixSampleResultGrid_ExcludeHiddenResults.csv", "MatrixSampleResultGrid_ExcludeHiddenResults#csv", DataAccessMethod.Sequential)]
In the VS solution (ie on disk) the actual path to the datafile is
[theProjectRootFolder]\TestData\MatrixSampleResultGrid_ExcludeHiddenResults.csv
The test results are published to
[theProjectRootFolder]\TestResults\[testrun_datetimestamp]\In and ...\Out
although I notice that none of the data files have been copied to the In or Out folders. Is that significant?
With VS2008 these attributes have worked unchanged every day for years, so I can only conclude that for VS2013 the data is no longer appropriate, but what has changed? Without that wizard I can't even reconstruct the attribute so I am at a loss.
Also, I don't know what location "|DataDirectory|" represents in the context of a test run.
Can anyone help?
TIA.
I have figured out the solution to why all our VS2008 data-driven tests fail to find their data files in VS2013. Having spent hours reading all the MSDN documentation I could find on unit testing and TDD in VS with absolutely no illumination (useless!), I am posting the solution here to save others in a similar situation from all the pain:
The VS2013 test framework seems to have different rules on where to look for the data file (different to VS2008, that is). Either we were inadvertently employing an ‘undocumented feature’ in VS2008 which no longer works, or MS have simply changed it. I don’t know which, but it has changed.
However the fix is simple once you have tumbled to the cause. Assuming the data file is in a subfolder of the test project folder (e.g. [projectfolder]\TestData), the old VS2008 test attributes:
(for example)
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV", "|DataDirectory|\\MyDataFile.csv", "MyDataFile#csv", DataAccessMethod.Sequential)][DeploymentItem("Test Projects\\Project1Tests\\TestData\\MyDataFile.csv"), TestMethod()]
need to be amended to
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV", "MyDataFile.csv", "MyDataFile#csv", DataAccessMethod.Sequential), DeploymentItem("TestData\\MyDataFile.csv"), TestMethod]
The changes are
The second DataSource argument is just the filename (not a relative path), and
The DeploymentItem argument is a path relative to the project folder that contains the tests.
Also (this has been documented elsewhere on this forum)
The datafile properties must be set to BuildAction=none (or the default blank), Copy To Output Directory=Copy Always.
and you have to have a TestSettings configuration in the solution with Deployment checked on.
Do all that and the old VS2008 data driven tests will magically all start finding their datafiles.
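For anyone hunting for the deployment switch itself: it lives in the .testsettings file. A minimal sketch (the name and id here are placeholders) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<TestSettings name="Local" id="00000000-0000-0000-0000-000000000000"
              xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <!-- Placeholder name/id; the important part is that deployment is enabled. -->
  <Deployment enabled="true" />
</TestSettings>

The file then has to be selected under Test -> Test Settings for the deployment to actually happen.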
I have created a number of (separate) CodedUI projects within Visual Studio 2013, in order to test the basic functions of a website.
Test-cases are created as separate projects, as I expect some (or all?) of them to change over time, and thus to ensure 'modularity' of capture for ease of subsequent maintenance.
Now, I see I can easily create an Ordered Test within each project, which will allow the same test-case to be run and re-run as many times as I wish; But, it's not obvious to me how I can create an Ordered Test whereby I can add different test-cases created as different projects. Certainly, not directly.
Is it possible?
Also, can I rename the Ordered Test list and save it to a separate folder where I can store differing Ordered Tests to test functionality, as I wish?
Ideally, I'd like to create an Ordered Test external to any specific project, so I can go into any project I wish and add whatever tests I wish, as the test-environment is always the same.
You should have created a single project for your application. To ensure 'modularity', Coded UI gives us the option of creating different UI Maps within the same project. Each UI Map then represents a module of your application. This approach gives you easier maintenance, and it also lets you create ordered tests that contain test cases from different UI Maps.
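As a rough sketch (the UI Map and method names below are hypothetical; in practice they'd be generated from your own recordings), a single test can then drive several modules:

[TestMethod]
public void LoginThenSearch()
{
    // Each UIMap class is generated from its own .uitest recording in the same project.
    var loginMap = new LoginUIMap();
    var searchMap = new SearchUIMap();
    loginMap.EnterCredentialsAndSubmit();
    searchMap.RunBasicSearch();
}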
For more details, please see this link:
https://msdn.microsoft.com/en-us/library/ff398056.aspx
Thanks
Yes, I sort of see that. And I guess it's easy enough to move the code to become separate 'solutions' within a 'project'.
However, I want to work with TFS server too, so will look at the MTM route as well.
But it may be that I need my captured CodedUI to be 'solutions' within a single project too - though I really want my modules to be 'stand-alone' projects for safe-keeping.
Will investigate further.
When I right-click my solution in the Solution Explorer and choose Properties I get a dialog where I can select the Startup Project.
I sometimes select Current selection (if it is an experimental solution with lots of projects I jump between), but most often it is a Single startup project selected, which would usually be the main WinForms application or Console application.
My problem is that whenever I do a treeclean with the tfpt command (Team Foundation Power Tools 2008) this setting is forgotten. So when I try to run my solution the next time, it has defaulted to some random project and I get an error stating that I cannot run a class library or something like that. Which is obvious of course. But where is this setting stored? Why is it forgotten when I do the treeclean? The solution file is still there, right? Isn't solution properties stored there?
Reference 1
Arian Kulp says:
I was struggling with trying to figure out why a certain solution of mine wasn’t starting right. It was in VB with four projects. Upon initial open it would set a certain project with a DLL output as startup. If I set the EXE as startup project, it was fine, but when I distribute code I always clean it by removing *.suo and *.user files, and bin/obj folders. Upon opening the “cleaned” version, it would always revert to the DLL project and fail to F5 nicely. The fix turned out to be simple, though I’m curious as to why I needed to do this at all. In the solution file, there is a list of pseudo-XML “Project” entries. It turns out that whatever is the first one ends up as the Startup Project, unless it’s overridden in the .suo file. Argh. I just rearranged the order in the file and it’s good. I’m guessing that C# is the same way but I didn’t test it. I hope that this helps someone!
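For context, those pseudo-XML “Project” entries look roughly like this in the .sln file (the project names and trailing GUIDs below are made up; the leading GUID is the project-type id, here the standard C# one). The first entry listed is the one Visual Studio falls back to as the startup project when no .suo file overrides it:

Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "MyApp", "MyApp\MyApp.csproj", "{11111111-1111-1111-1111-111111111111}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "MyLibrary", "MyLibrary\MyLibrary.csproj", "{22222222-2222-2222-2222-222222222222}"
EndProject

Moving the EXE project's entry above the library entries is the rearrangement he describes.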
Reference 2
Setting the StartUp Project
Which project is the "startup" project only has any relevance for debugging, which means it's user metadata from the point of the solution and the projects. Regardless of which project is the "startup" project, the compiled code is the same.
Because of this, the information is stored as a user setting in the Solution User Options file (solution.suo) which accompanies the Solution file (solution.sln). The .suo file "Records all of the options that you might associate with your solution so that each time you open it, it includes customizations that you have made" according to MSDN.
The .suo file is a binary file. If you want to read or change it programmatically, you have to use IVsPersistSolutionOpts.LoadUserOptions from the Microsoft.VisualStudio.Shell.Interop namespace.
I suspect that this setting is saved as part of the .suo file created whenever you edit a solution file. This file contains various user settings, such as breakpoints, watch data etc.
I cannot confirm this but that would be my guess.
Unfortunately it's not XML; it's a binary file and not easily edited.
I just wrote a little command line utility for windows called slnStartupProject to solve this. It sets the Startup Project automatically like this:
slnStartupProject slnFilename projectName
I personally use it to set the startup project after generating the solution with CMake, which always puts a dummy ALL_BUILD project first in the solution.
The source is on github:
https://github.com/michaKFromParis/slnStartupProject
Forks and feedback are welcome.
Hope this helps!
I have experienced an annoying issue with Visual Studio 2005... sometimes when I rebuild, and even if I do a Rebuild Solution, it will come back with no errors or warnings, but then when I later edit another code file, even without changing it, and rebuild, it will find an error or warning in that other file. Clearly, the earlier Rebuild Solution did not recompile that file! How can I force VS to completely recompile every file?
I've seen this happen before when you have multiple projects in your solution and the references get mixed up.
Say you have four projects in your solution, Common, Business, Data, and UI. Assume that Common is referenced by the other three projects.
What we want is for Common to be a "project reference" from the other three projects - they'll then pick up their copy from the build output directory of Common.
But, sometimes, one of the projects gets its reference mixed up. Say, in this case, that UI starts referencing the copy of Common in the build output directory of Data. Now, any change that compiles "UI" without also compiling "Data" will result in two, possibly incompatible, versions of "Common" being a dependency of UI.
Another scenario is where the reference is to a binary, such as from a "lib" directory. Then, one of the projects ends up referring to a build output location instead of lib.
I don't know what causes this - but I see it all the time, unfortunately.
The fix is to go through the references of each project and find the one (or more) that point to the wrong place.
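As a rough illustration (the project names and paths here are hypothetical), the difference shows up in the .csproj, inside an ItemGroup: the first form is the project reference you want, the second is the kind of stray file reference that points into another project's build output:

<!-- Good: a project reference; the dependency is tracked and built in the right order. -->
<ProjectReference Include="..\Common\Common.csproj">
  <Project>{33333333-3333-3333-3333-333333333333}</Project>
  <Name>Common</Name>
</ProjectReference>

<!-- Suspect: a plain assembly reference pointing at another project's build output. -->
<Reference Include="Common">
  <HintPath>..\Data\bin\Debug\Common.dll</HintPath>
</Reference>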
It might help to clean the solution prior to rebuilding -- right click on the solution in the Solution Explorer and choose "clean solution" -- this deletes temporary files and is supposed to clear out the bin and obj folders, so everything is rebuilt.
I'm with Guy Starbuck here, but would add that Rebuild Solution is supposed to do a Clean Solution followed by Build Solution, which should, then, have solved your issue to begin with. But VS 2005 can be terrible in this regard. Sometimes it just starts working after several rebuilds. If upgrading to 2008 isn't an option, consider manually clearing the bin folder.
Is this related to the Configuration Manager? There you can select which projects in your solution build. Not sure if this helps.
Depending on the type of warning, it may not be possible, if I recall correctly.
For example, warning messages for XHTML compliance are ONLY displayed when the file is open. You might check the tolerance settings inside VS to see if you can change it.
This sounds strange - Rebuild should build everything regardless of changes and Build should only build things that have changed.
The behaviour you've described should only happen if you have modified something that is referenced by the unchanged file so that it is now incorrect.