OctoPack not populating placeholders in nuspec file - octopus-deploy

I have a solution with many projects, about a dozen of which have OctoPack installed, and packages are produced correctly when TeamCity runs msbuild /p:RunOctoPack=true /p:OctoPackEnforceAddingFiles=true. As you can probably tell from the /p:OctoPackEnforceAddingFiles flag, each project with OctoPack installed also has a nuspec file.
The problem we're having is that OctoPack is not honouring the nuspec placeholders specified at https://learn.microsoft.com/en-gb/nuget/reference/nuspec#replacement-tokens. The one we want to use right now is $id$, which should equal the assembly name of the project being packaged. Instead, when we run OctoPack, that $id$ token is empty.
I can see at https://octopus.com/docs/packaging-applications/creating-packages/nuget-packages/using-octopack#UsingOctoPack-Replacementtokens that OctoPack allows one to override these tokens manually, but that doesn't help me: OctoPack is run on the whole solution, while I need the name of the specific project being packaged.
What can I do to get around this issue? At the moment we essentially have the project name hardcoded in the nuspec files, but this is becoming brittle and unwieldy and we'd like to fix it.

I have this working by adding the following to the csproj file
<PropertyGroup>
  <OctoPackNuGetProperties>id=$(AssemblyName)</OctoPackNuGetProperties>
</PropertyGroup>
This passes the assembly name through as id to Octo.exe, which will in turn pass it through to NuGet.exe via its -Properties argument.
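For reference, the nuspec can then consume the token in the usual way. A minimal sketch, assuming $id$ is the only replacement token you need (the authors/description values are illustrative, and the version is a placeholder since OctoPack normally supplies it):
<?xml version="1.0"?>
<package>
  <metadata>
    <!-- $id$ is filled from the id property passed via OctoPackNuGetProperties -->
    <id>$id$</id>
    <version>1.0.0</version>
    <authors>Example Team</authors>
    <description>Deployment package for $id$.</description>
  </metadata>
</package>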

Have you tried not providing the <id> element in the nuspec at all? OctoPack should try to generate it, since it knows which project it is being applied to.
An alternative could be to use a pre-build event with the $(ProjectName) macro to update the relevant nuspec files. In your case, depending on your build process, you could potentially use a solution-wide pre-build step to update all nuspecs.
P.S. I personally stepped away from using Octopack and currently employ Fake (F# Make).

Related

MSBuild.ExtensionPack.tasks inheritance of variables from MSBuild main project file

Case:
When I build with MSBuild (from VS Build Tools 2017) I don't get any value for the $(ExtensionTasksPath) variable from the main MSBuild file, but when I build with the MSBuild integrated into Visual Studio, the value is passed from the main file to MSBuild.ExtensionPack.tasks. Is that expected behavior, and why does it happen? I don't pass any properties, for the sake of testing that particular case.
I don't think this is expected behavior. When I tested on a VM, MSBuild from VS Build Tools 2017 got the same value as MSBuild run from the VS IDE.
To find the reason for this behavior, here are some suggestions to help with troubleshooting:
1. First of all, make sure both scenarios build successfully.
2. As you mentioned, you run them in separate VMs; make sure the files under test are the same and that you copy the entire solution folder (the packages folder under the solution directory matters here).
3. In the .xxproj file, check whether an import such as <Import Project="..\packages\MSBuild.Extension.Pack.1.9.1\build\net40\MSBuild.Extension.Pack.targets" ...> exists.
I've found that the $(ExtensionTasksPath) property is defined in the MSBuild.Extension.Pack.targets file, and this file is imported into the .xxproj file by that <Import> tag.
In my sample project, which installs MSBuild.Extension.Pack via NuGet, there is such an <Import> element in the .csproj file; opening the imported targets file shows where the value of $(ExtensionTasksPath) is defined. So I suspect the targets file or the <Import> element is missing, or something is overriding this property.
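To illustrate the shape of what to look for (this is only a sketch; the exact paths and property value depend on the package version, so treat both snippets as illustrative rather than the literal file contents):
<Import Project="..\packages\MSBuild.Extension.Pack.1.9.1\build\net40\MSBuild.Extension.Pack.targets" Condition="Exists('..\packages\MSBuild.Extension.Pack.1.9.1\build\net40\MSBuild.Extension.Pack.targets')" />
<!-- inside MSBuild.Extension.Pack.targets, a property group along these lines defines the path -->
<PropertyGroup>
  <ExtensionTasksPath Condition="'$(ExtensionTasksPath)' == ''">$(MSBuildThisFileDirectory)</ExtensionTasksPath>
</PropertyGroup>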
In summary:
1. Keeping the entire solution folder intact is the best suggestion.
2. If that does not work, adding the snippet below to your .xxproj file should work:
<PropertyGroup>
  <ExtensionTasksPath>Absolute path of your MSBuild.ExtensionPack.dll</ExtensionTasksPath>
</PropertyGroup>
This will override the value coming from the imported targets file, and no matter where you put the assembly, providing the absolute path will work.
If my answer is helpful, please give me feedback, and feel free to contact me with any updates.

How to add TypeScript Definitions to .NET Core Nuget Packages

We have an internal NuGet package that consists of some .NET code and a TypeScript definition file (*.d.ts).
After installing the package into a new .NET Core project, the "i18n" folder and the "Index.d.ts" file appear in Solution Explorer, but note the small arrow symbols on them. It looks like they're just links to the actual files. When I click on the .d.ts file the content seems correct, but Visual Studio fails to recognize the declarations in it, so I can't use them in my TypeScript code.
One idea was to include the path to the packages in tsconfig.json, but that can't be the right solution... Any other ideas on how to do this?
As far as I know, DefinitelyTyped packages are not compatible with .NET Core projects, because the script files would need to be included via the <contentFiles> element. You can refer to the "Including content files" documentation for more details.
Besides, just as Martin commented, npm is the recommended method of installing DefinitelyTyped packages:
https://github.com/DefinitelyTyped/DefinitelyTyped#how-do-i-get-them
So, after seeing the replies here and not giving up, I have to put in my approach to this.
As I'm working on a large solution with over 100 subprojects - many of them fast moving NuGets, I couldn't give up. I wanted to have my .NET object models including their interface/class representations in TS, being able to have both imported by including one NuGet (and thereby reduce dependency hell a little bit). I have to mention, I tested this only with my own object model, which has no external dependencies - and I tested only on VS2022.
But in this restricted scenario it works without any issues.
On the project containing the TS definitions
Set the build action of the TS definition files you need included in the NuGet package to "Content". This will include them in the package.
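If you're on an SDK-style project, the packing side might look roughly like this (the Types\ folder is an illustrative name; by default such Content items end up under contentFiles\any\<TFM>\ in the package, which matches the path used in the copy target further below):
<ItemGroup>
  <!-- include the definition files in the package as content -->
  <Content Include="Types\**\*.d.ts" Pack="true" />
</ItemGroup>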
On the consumer side
Adjust your package reference and add the following property/value:
<GeneratePathProperty>True</GeneratePathProperty>
This creates an MSBuild property/variable referencing the path of the locally restored NuGet package (important if you're building on multiple, different machines, such as CI pipelines and build servers) and allows you to avoid any hardcoded absolute paths.
The generated property has the following format
$(Pkg<PackageNameWithDotsBeingReplacedByUnderlines>)
So a package named "MyPackage.Hello" would result in the variable $(PkgMyPackage_Hello)
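Put together, the package reference might look like this (the package name and version are illustrative, reusing the hypothetical "MyPackage.Hello" from above):
<ItemGroup>
  <PackageReference Include="MyPackage.Hello" Version="1.0.0">
    <!-- generates the $(PkgMyPackage_Hello) property pointing at the restored package folder -->
    <GeneratePathProperty>true</GeneratePathProperty>
  </PackageReference>
</ItemGroup>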
Now we create a new build target to copy the files from the restored package's contentFiles folder (since the package is restored and we have the extracted path, we can finally target them).
<Target Name="CopyImportedTypes" BeforeTargets="Build">
<ItemGroup>
<TsTypesToCopy Include="$(PkgMyPackage_Hello)\contentFiles\any\net6.0-windows10.0.20348\*.ts" />
</ItemGroup>
<Copy SourceFiles="#(TsTypesToCopy)" DestinationFolder="$(MSBuildProjectDirectory)\AnyProjectSubFolderIfDesired" SkipUnchangedFiles="true" OverwriteReadOnlyFiles="true" />
</Target>
Make sure to adjust the "Include" path to your package (TFM, platform etc.). An easy way to get the relative path is to open Solution Explorer, expand your consuming project, expand Dependencies and then Packages, expand the package with your TS definitions and open the properties of the content files.
This target is executed before the actual build (via the BeforeTargets attribute), so we can use the imported types in the very build that is happening right now. The ItemGroup is nothing more than a definition of the items (in our case, source files) we want to use; they are stored in @(TsTypesToCopy), which is consumed by the Copy task.
Thankfully, VS automatically sets new files to the right build action (in most cases), so the newly copied TS files should be in the right mode automatically and we don't have to tweak anything manually.

How to emulate /p msbuild parameter in Visual Studio build?

This is the logical follow-up to my previous question: "How to check all projects in solution for some criteria?"
I was given quite a good answer: use CustomAfterMicrosoftCommonTargets and CustomBeforeMicrosoftCommonTargets. They do work, so I decided not to stop halfway.
The issue is that I don't want machine-wide targets. That's not a good idea for me (it would affect other builds; sure, this can be handled, but still), for my teammates (I don't want to make them put something in system folders...), or for the build server.
What is needed: the solution must build from scratch, straight out of source control, on a clean machine with either Visual Studio or MSBuild.
It appeared that Custom*MicrosoftCommonTargets are regular properties.
So, how do I specify this property? It works fine when set from the command line.
Strangely, there appears to be a bit of magic here: a property passed as a command-line parameter to one build is transitively passed to all nested builds!
That's fine for the build server, but it won't work for a Visual Studio build. And even declaring a solution-level property won't help: neither static nor dynamic properties are transferred to nested builds.
...I have a hacky idea to set an environment variable before the solution build and erase it afterwards, but I don't like it. Any better ideas?
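For reference, the command-line form that does work looks something like this (the solution name and targets path are illustrative):
msbuild MySolution.sln /p:CustomBeforeMicrosoftCommonTargets=C:\build\Custom.targets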
I use a slightly different technique than @Spider M9. I want all projects in the solution tree/all subdirectories of the current directory to use the extended build through Custom*MicrosoftCommonTargets. I don't want to be forced to change every new project to import custom targets/props.
I place a special file, let's say msbuild.include, in the root directory, and my custom targets loader for every project tries to find it in ., ..\, ..\..\, and so on. msbuild.include contains flags that trigger execution of custom actions. If the loader can't find this file, it disables loading of all custom targets and stops. This gives me the ability to use my build extensions with projects from work repositories and not use them with open-source projects.
If you are interested, I can publish the loader. It's a pretty simple and elegant solution.
For example, I can sign every assembly in all projects in all subfolders with my key.
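A minimal sketch of how such a loader could locate the flag file and gate the custom targets on it (msbuild.include is the name from the answer; MsBuildIncludeDir, the three-level search depth and custom.targets are illustrative):
<PropertyGroup>
  <MsBuildIncludeDir Condition="Exists('$(MSBuildProjectDirectory)\msbuild.include')">$(MSBuildProjectDirectory)</MsBuildIncludeDir>
  <MsBuildIncludeDir Condition="'$(MsBuildIncludeDir)' == '' And Exists('$(MSBuildProjectDirectory)\..\msbuild.include')">$(MSBuildProjectDirectory)\..</MsBuildIncludeDir>
  <MsBuildIncludeDir Condition="'$(MsBuildIncludeDir)' == '' And Exists('$(MSBuildProjectDirectory)\..\..\msbuild.include')">$(MSBuildProjectDirectory)\..\..</MsBuildIncludeDir>
</PropertyGroup>
<!-- custom targets are only loaded when the flag file was found somewhere above the project -->
<Import Project="$(MsBuildIncludeDir)\custom.targets" Condition="'$(MsBuildIncludeDir)' != ''" />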
I always set up every project to import a standard .props file. Use the GetDirectoryNameOfFileAbove property function (see MSDN) to find it. Do this as the first line of every project file. Once established, you can redirect from that file to other imports. Another trick is to have that standard import (which would obviously be under version control) conditionally import another .props file only if it exists. This optional file would not be in version control, but is available for any developer to create and modify with their own private/temporary properties or other behavior.
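A sketch of that pattern (Common.props and Developer.props are illustrative names, not from the answer):
<!-- first line of every project file: locate and import the shared props via GetDirectoryNameOfFileAbove -->
<Import Project="$([MSBuild]::GetDirectoryNameOfFileAbove($(MSBuildThisFileDirectory), 'Common.props'))\Common.props" />
<!-- inside Common.props: conditionally import a developer-local, unversioned props file only if it exists -->
<Import Project="$(MSBuildThisFileDirectory)Developer.props" Condition="Exists('$(MSBuildThisFileDirectory)Developer.props')" />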

Visual Studio msbuild

I have a question regarding the command-line options of msbuild. I am currently using msbuild to build projects using the existing solution files. These solution files have references to external DLLs which have different paths on each machine. I am currently writing a build script and passing the specific path to the project file via the /p: switch of msbuild.
My current build line is:
msbuild test.sln /p:ReferencePath="c:\abc" /p:ReferencePath="c:\rca"
What I have noticed is that ReferencePath now contains only c:\rca and not c:\abc. This is causing problems for me, since the external DLLs lie in two different directories. I am allowed to keep multiple reference paths via Visual Studio, but not via the command line.
Is there any known way by which I can do this?
I believe you can use this: /p:ReferencePath="c:\abc;c:\rca"
At least that is what that link is hinting at; they are using %3B to encode the ";" within the build file.
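So, on the command line, something like either of the following should work (paths taken from the question; the %3B encoding is mainly needed where a literal semicolon would otherwise be split, e.g. inside a build file):
msbuild test.sln /p:ReferencePath="c:\abc;c:\rca"
msbuild test.sln /p:ReferencePath="c:\abc%3Bc:\rca"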
Although the correct syntax for providing more than one reference path is listed above, I would suggest solving the root cause, which in my opinion is the differing locations of your referenced assemblies. I would suggest you put all third-party dependencies, apart from the framework assemblies, in your source code repository for the following reasons:
Relative paths are consistent across computers.
The source code is always in sync with the correct version of your third-party assembly (if, for instance, you need to build an old version of your software two years from now).
Upgrading your third-party assembly is as easy as upgrading on one machine and then committing your changes to the repository. (In a previous project we even went as far as checking in the entire Java runtime environment and were quite happy with the given setup.)
Try separating your paths with a semicolon (;)
Like this:
c:\abc;c:\rca
You may be better off by synchronizing your libraries across machines. I have found that Visual Studio makes this easy. Simply add a solution folder, and add your libraries there. Then, in each project, reference the libraries from this common place. This way, each developer has them in the same place.
This will remove one of the variables you have when trying to script out builds.
The command-line options for setting the reference path will work just fine (assuming you escape the semicolon; it seems both %3B and ; will work). However, when the argument was passed in from NAnt (and I needed multiple paths), creating a Visual Studio project user options file seemed to work better.
I just emit (echo) a file to the file system with the following format:
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <ReferencePath>C:\abc;c:\rca</ReferencePath>
  </PropertyGroup>
</Project>
I give the *.user file an appropriate name (given a project file MyProject.csproj, my user file would be MyProject.csproj.user)

How to associate external files with an assembly

Let's say you have a class library project that has any number of supplemental files that also need to be included with the compiled assembly (e.g. simple text files or even a legacy unmanaged DLL that's wrapped by the assembly as an interop layer). While embedding the supplemental files into the assembly itself is relatively straightforward, we have situations where this is not possible or just undesirable. We need to have them as "sidecar" files (i.e. files alongside the assembly, potentially in subdirectories relative to the assembly).
Adding those files to the project with an appropriate value for "Copy to Output Directory" specified appears to be sufficient for projects that are completely self-contained within a solution. But if a separate project in another solution adds a reference to the assembly, it does not automatically pick up its sidecar files. Is there a way in the project to somehow mark the resulting assembly such that anything referencing the assembly will also know it needs to include the associated sidecar files? How do you do this?
You can use al.exe, but there also appears to be a C# compiler option. You want to create a multifile assembly using the /linkresource C# compiler option. Instructions are here, but the command is similar to this:
csc /linkresource:N.dll /t:library A.cs
Where N.dll is a native DLL that will go wherever the managed assembly goes (including into the GAC.) There's a very clear description at the link I provided.
Have you tried creating a setup project for your solution? There's an option to include sidecar files targeting the application installation directory.
Another option would be to include the sidecar files in the assembly's resources and unpack them to disk the first time the application runs.
What if you create a merge module containing the library plus its dependencies? Your installer will then need to reference this module, but you will ensure all of the necessary files will be present.
Unfortunately there doesn't appear to be a lot of built-in support in Visual Studio for this, although I can definitely see the use case.
If you use Subversion for your source control, then you could link in an external reference as an externals definition. This would bring in the source code, and you'd be making a reference to the necessary assembly as a project reference instead of a DLL reference, and then the copy to output directory rules would come into play.
If that's not possible, another solution would be to include commands in the pre/post-build events of your in-solution project to copy the most up-to-date sidecar files from the remote assembly on a build. Of course this comes with the caveat that it doesn't set itself up automatically when you include the DLL in your project; you have to take manual steps to set it up.
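A sketch of such a post-build event command (the source path is purely illustrative and would point at wherever the remote assembly's sidecar files live):
xcopy "$(SolutionDir)..\ExternalLib\bin\sidecar\*" "$(TargetDir)sidecar\" /Y /I /D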
I dealt with this some time ago. It's a common problem.
You can create some post-build actions:
http://www.codingday.com/execute-batch-commands-before-or-after-compilation-using-pre-build-or-post-build-events/
Hope this helps... :)
It appears to me that you're using the wrong type of reference. There are two types of references: Reference and ProjectReference. Reference is an explicit reference to a specific assembly. ProjectReference is a reference to another project (say, a .csproj).
What you're looking for is ProjectReference. VS and the default MSBuild targets are set up to do CopyLocal. If you set CopyToOutputDirectory for your "sidecar" files, any ProjectReference to this project will now also pull in the same files.
I'm not sure if you can do ProjectReferences across solutions in the IDE. I work a lot with MSBuild, where sln files are not relevant, and this is how I deal with it.
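A sketch of the two sides of that setup (project and file names are illustrative):
<!-- consuming project: reference the library as a project, not as a compiled DLL -->
<ItemGroup>
  <ProjectReference Include="..\MyLibrary\MyLibrary.csproj" />
</ItemGroup>
<!-- library project: mark the sidecar file so it is copied to the output directory and flows to referencing projects -->
<ItemGroup>
  <Content Include="sidecar\legacy-native.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>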
What we did in our project was create a separate build file to do all of that.
In your build file you can have tasks to build your main solution, then add tasks to copy the files you need after the build.
NAnt is also an option, but right now I'm happy using Rake for my build/debug automation.
Since this cannot be integrated into Visual Studio, what I do is create a task (in MSBuild, NAnt or Rake) that executes vsjitdebugger.exe at the end to attach to my Visual Studio when debugging.
This is just my style for now; you can create your own.
