Even though I execute the same msbuild batch file against the same source code (a Visual Studio 2013 solution), I get different files as a result.
I'm using a batch file that runs msbuild to produce the deployment files. One PC has DevExpress installed and another PC doesn't, and those two PCs produce different build output. The PC without DevExpress includes extra DLL files that are not even referenced in the solution. After I removed DevExpress, the results became identical, but I don't understand why.
Does anyone know why this happened?
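(For context, a minimal sketch of the kind of batch invocation in question; the solution name, configuration, and output path are placeholders, not the actual script:)

    rem Placeholder sketch of the deploy batch file; names are hypothetical
    msbuild MySolution.sln /t:Rebuild /p:Configuration=Release /p:OutDir=C:\deploy\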
Related
I recently created a program in Visual Studio that I'd like to share with the world. Up until this point, all of the programming I did was only for personal use, so I have no experience preparing a program for release.
I found some information about building for release, so I tried it, but I ended up with 5 different files (3 of which are needed for successful execution):
the .exe file
a .dll file of the same name
a .pdb file of the same name (not needed for successful execution)
a .runtimeconfig.json file
a .deps.json file (not needed for successful execution)
I'd like to just distribute a single .exe file, like a portable version. Is this possible?
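(A sketch of one avenue worth exploring, assuming an SDK-style project on .NET Core 3.0 or later, which the .runtimeconfig.json/.deps.json output suggests; the win-x64 runtime identifier is just an example:)

    <!-- Single-file publish settings in the .csproj; win-x64 is an example RID -->
    <PropertyGroup>
      <PublishSingleFile>true</PublishSingleFile>
      <RuntimeIdentifier>win-x64</RuntimeIdentifier>
    </PropertyGroup>

With those properties set, dotnet publish -c Release should fold the .dll, .runtimeconfig.json, and .deps.json into the .exe (the .pdb can simply be left out of the distribution).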
I am trying to distribute a program in our small business environment. I therefore chose ClickOnce in Visual Studio 2019 with .NET 5. However, if I open the EXE from the published folder, it tells me that a DLL is missing. After adding that one, it tells me another DLL is missing, and so on. In contrast, if I publish to a local folder, everything works as expected.
What might I be doing wrong? It seems the missing DLLs have something to do with PowerShell automation (Microsoft.Management.Infrastructure), which as I understand it is only available as either x64 or x86. I tried restricting my program to x64, however, without any success.
If you know any other simple distribution method for a small business (all PCs connected locally), I would be very happy and thankful.
The solution was simple. Missing dependencies can be added in the ClickOnce publish settings, during the setup for the ClickOnce export creation (in the dialog that lists the application files to deploy).
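(For reference, a related approach that may help with the x64/x86 issue: pinning the platform and referencing the dependency explicitly in the .csproj. A sketch only; the package version is illustrative, not confirmed:)

    <!-- Pin the runtime/platform and reference the dependency explicitly;
         the package version here is illustrative -->
    <PropertyGroup>
      <PlatformTarget>x64</PlatformTarget>
      <RuntimeIdentifier>win-x64</RuntimeIdentifier>
    </PropertyGroup>
    <ItemGroup>
      <PackageReference Include="Microsoft.Management.Infrastructure" Version="2.0.0" />
    </ItemGroup>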
Hi guys, I'm new both to this site and to testing, and I'm having trouble finding a solution to this problem.
My current project produces a .DLL file as its build output, and I'm looking to use Visual Studio to automate testing on it every time a new build kicks off.
To run the program, a .exe must be triggered in the same directory as the newly created .dll. This isn't a problem and wouldn't need automating, except that I need to kick off 16 different variations of it, each using a different config file and a separate machine on a physical network.
Is there any way to do this using Visual Studio 2010 Ultimate and MTM?
I have looked into generic tests, but they run the .exe without moving the new .DLL to the working directory. Any ideas?
Thanks in advance.
I haven't used VS 2010, but I know that in 2008 you can specify post-build actions in the project properties, which you could use to copy the output where you need it to go. I would give you more details, but I'm not at work to look at the interface at the moment.
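(For example, a post-build event along these lines would copy the freshly built DLL next to the test .exe; the TestHarness destination path is a placeholder. In the .csproj it looks like this:)

    <!-- Post-build event: copy the new build output next to the test .exe;
         the TestHarness path is a placeholder -->
    <PropertyGroup>
      <PostBuildEvent>copy /Y "$(TargetPath)" "$(SolutionDir)TestHarness\"</PostBuildEvent>
    </PropertyGroup>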
We have some excluded files in a web application project.
One developer's machine builds the project fine.
Another developer's machine sees the excluded files and decides to compile them, thereby throwing compilation errors related to those files.
Both are using VS2010 SP1.
I am not aware of any compilation options that could cause this difference. Any ideas?
Do they both share the same .sln and .csproj files (assuming it's a C# project)?
There is a Compile/Content build action flag which can be set on files (right-click the file and select Properties). Possibly one dev has it set one way on one machine, and the other dev has it set differently on a different machine.
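(That flag is just the item group the file sits in within the .csproj, so the two machines' project files are easy to diff; the file names below are hypothetical:)

    <ItemGroup>
      <Compile Include="Normal.cs" />   <!-- compiled -->
      <Content Include="Excluded.cs" /> <!-- treated as content, not compiled -->
      <None Include="Legacy.cs" />      <!-- ignored by the build -->
    </ItemGroup>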
Another possibility (assuming it's not just standard .cs code) is that a custom tool is used for compilation. For example, DevArt Entity Designer uses the .edml extension. If the tool isn't installed, Visual Studio just treats these files as content; if the tool is installed, it treats them as compilable/generatable.
I have a (C++) project that I originally developed under Linux using make to build it. I would like to also have it run in Windows and am using Visual Studio 2005 to build it. The problem I'm running into is that Visual Studio places all objects into the same output directory. This doesn't work for me because I have source files with the same name in different sub-directories. This means that the last object (with the common name) overwrites all previous ones and thus I get link errors for the "missing" code.
Setting the output directory (even using the Visual Studio variables like $(InputDir)) at the project level doesn't work because Visual Studio passes all of the source files to cl.exe at once (i.e. $(InputDir) is evaluated once rather than for each input file). It appears that this problem can be solved by manually setting the output directory for each file (at a minimum, for the affected files), but this is less than optimal and error-prone.
Is there a better way to solve this problem? I would appreciate it even if someone is able to suggest a way to get Visual Studio to compile files one at a time, so that setting the output directory at the project level would have the desired effect.
You might consider using a different project for each directory, or something along those lines. Otherwise, using exactly the same filename more than once within a single project seems a bit strange. (A sort of hierarchy within the project structure; something I've never seen before, anyway.)
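(For reference, in later, MSBuild-based C++ projects (the .vcxproj format of VS2010 and onward), the usual fix is to make the object file path mirror the source tree, so same-named files in different sub-directories no longer collide. A sketch; VS2005's .vcproj would need the equivalent per-file "Object File Name" setting mentioned in the question:)

    <!-- Mirror the source tree under the intermediate directory so that
         sub1\foo.cpp and sub2\foo.cpp compile to different .obj files -->
    <ItemDefinitionGroup>
      <ClCompile>
        <ObjectFileName>$(IntDir)%(RelativeDir)</ObjectFileName>
      </ClCompile>
    </ItemDefinitionGroup>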